author | Joe Fernandez <joefernandez@google.com> | 2011-08-22 15:49:52 -0700
committer | Joe Fernandez <joefernandez@google.com> | 2011-10-12 14:27:53 -0700
commit | 99b70f3f5d051261229d1792c169a374fc23326b (patch)
tree | 0632b5b5ed4cb4d860b6d59e02dfa87a726a7721 /docs/html/guide/topics/media
parent | 8c7951afa28ffa08efe3c920db364788a0968f94 (diff)
download | frameworks_base-99b70f3f5d051261229d1792c169a374fc23326b.zip, frameworks_base-99b70f3f5d051261229d1792c169a374fc23326b.tar.gz, frameworks_base-99b70f3f5d051261229d1792c169a374fc23326b.tar.bz2
DO NOT MERGE
cherrypick from master Change-Id: I63bc055991405c56e9dcc83a54b106b870cf6b29
Change-Id: Ia5150288ce6fe57460159dd7555c6786023b1d9e
Diffstat (limited to 'docs/html/guide/topics/media')

-rw-r--r-- | docs/html/guide/topics/media/audio-capture.jd | 253
-rw-r--r-- | docs/html/guide/topics/media/camera.jd | 1055
-rw-r--r-- | docs/html/guide/topics/media/index.jd | 987
-rw-r--r-- | docs/html/guide/topics/media/jetplayer.jd | 70
-rw-r--r-- | docs/html/guide/topics/media/mediaplayer.jd | 747

5 files changed, 2164 insertions, 948 deletions
diff --git a/docs/html/guide/topics/media/audio-capture.jd b/docs/html/guide/topics/media/audio-capture.jd new file mode 100644 index 0000000..75d294b --- /dev/null +++ b/docs/html/guide/topics/media/audio-capture.jd @@ -0,0 +1,253 @@ +page.title=Audio Capture +parent.title=Multimedia and Camera +parent.link=index.html +@jd:body + + <div id="qv-wrapper"> + <div id="qv"> + +<h2>In this document</h2> +<ol> +<li><a href="#audiocapture">Performing Audio Capture</a> + <ol> + <li><a href='#example'>Code Example</a></li> + </ol> +</li> +</ol> + +<h2>Key classes</h2> +<ol> +<li>{@link android.media.MediaRecorder}</li> +</ol> + +<h2>See also</h2> +<ol> + <li><a href="{@docRoot}guide/appendix/media-formats.html">Android Supported Media Formats</a></li> + <li><a href="{@docRoot}guide/topics/data/data-storage.html">Data Storage</a></li> + <li><a href="{@docRoot}guide/topics/media/mediaplayer.html">MediaPlayer</a> +</ol> + +</div> +</div> + +<p>The Android multimedia framework includes support for capturing and encoding a variety of common +audio formats, so that you can easily integrate audio into your applications. You can record audio +using the {@link android.media.MediaRecorder} APIs if supported by the device hardware.</p> + +<p>This document shows you how to write an application that captures audio from a device +microphone, save the audio and play it back.</p> + +<p class="note"><strong>Note:</strong> The Android Emulator does not have the ability to capture +audio, but actual devices are likely to provide these capabilities.</p> + +<h2 id="audiocapture">Performing Audio Capture</h2> + +<p>Audio capture from the device is a bit more complicated than audio and video playback, but still +fairly simple:</p> +<ol> + <li>Create a new instance of {@link android.media.MediaRecorder android.media.MediaRecorder}.</li> + <li>Set the audio source using + {@link android.media.MediaRecorder#setAudioSource MediaRecorder.setAudioSource()}. You will +probably want to use + <code>MediaRecorder.AudioSource.MIC</code>.</li> + <li>Set output file format using + {@link android.media.MediaRecorder#setOutputFormat MediaRecorder.setOutputFormat()}. + </li> + <li>Set output file name using + {@link android.media.MediaRecorder#setOutputFile MediaRecorder.setOutputFile()}. + </li> + <li>Set the audio encoder using + {@link android.media.MediaRecorder#setAudioEncoder MediaRecorder.setAudioEncoder()}. + </li> + <li>Call {@link android.media.MediaRecorder#prepare MediaRecorder.prepare()} + on the MediaRecorder instance.</li> + <li>To start audio capture, call + {@link android.media.MediaRecorder#start MediaRecorder.start()}. </li> + <li>To stop audio capture, call {@link android.media.MediaRecorder#stop MediaRecorder.stop()}. + <li>When you are done with the MediaRecorder instance, call +{@link android.media.MediaRecorder#release MediaRecorder.release()} on it. Calling +{@link android.media.MediaRecorder#release MediaRecorder.release()} is always recommended to +free the resource immediately.</li> +</ol> + +<h3 id="example">Example: Record audio and play the recorded audio</h3> +<p>The example class below illustrates how to set up, start and stop audio capture, and to play the +recorded audio file.</p> +<pre> +/* + * The application needs to have the permission to write to external storage + * if the output file is written to the external storage, and also the + * permission to record audio. 
These permissions must be set in the + * application's AndroidManifest.xml file, with something like: + * + * <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" /> + * <uses-permission android:name="android.permission.RECORD_AUDIO" /> + * + */ +package com.android.audiorecordtest; + +import android.app.Activity; +import android.widget.LinearLayout; +import android.os.Bundle; +import android.os.Environment; +import android.view.ViewGroup; +import android.widget.Button; +import android.view.View; +import android.view.View.OnClickListener; +import android.content.Context; +import android.util.Log; +import android.media.MediaRecorder; +import android.media.MediaPlayer; + +import java.io.IOException; + + +public class AudioRecordTest extends Activity +{ + private static final String LOG_TAG = "AudioRecordTest"; + private static String mFileName = null; + + private RecordButton mRecordButton = null; + private MediaRecorder mRecorder = null; + + private PlayButton mPlayButton = null; + private MediaPlayer mPlayer = null; + + private void onRecord(boolean start) { + if (start) { + startRecording(); + } else { + stopRecording(); + } + } + + private void onPlay(boolean start) { + if (start) { + startPlaying(); + } else { + stopPlaying(); + } + } + + private void startPlaying() { + mPlayer = new MediaPlayer(); + try { + mPlayer.setDataSource(mFileName); + mPlayer.prepare(); + mPlayer.start(); + } catch (IOException e) { + Log.e(LOG_TAG, "prepare() failed"); + } + } + + private void stopPlaying() { + mPlayer.release(); + mPlayer = null; + } + + private void startRecording() { + mRecorder = new MediaRecorder(); + mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC); + mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); + mRecorder.setOutputFile(mFileName); + mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); + + try { + mRecorder.prepare(); + } catch (IOException e) { + Log.e(LOG_TAG, "prepare() failed"); + } + + mRecorder.start(); + } + + private void stopRecording() { + mRecorder.stop(); + mRecorder.release(); + mRecorder = null; + } + + class RecordButton extends Button { + boolean mStartRecording = true; + + OnClickListener clicker = new OnClickListener() { + public void onClick(View v) { + onRecord(mStartRecording); + if (mStartRecording) { + setText("Stop recording"); + } else { + setText("Start recording"); + } + mStartRecording = !mStartRecording; + } + }; + + public RecordButton(Context ctx) { + super(ctx); + setText("Start recording"); + setOnClickListener(clicker); + } + } + + class PlayButton extends Button { + boolean mStartPlaying = true; + + OnClickListener clicker = new OnClickListener() { + public void onClick(View v) { + onPlay(mStartPlaying); + if (mStartPlaying) { + setText("Stop playing"); + } else { + setText("Start playing"); + } + mStartPlaying = !mStartPlaying; + } + }; + + public PlayButton(Context ctx) { + super(ctx); + setText("Start playing"); + setOnClickListener(clicker); + } + } + + public AudioRecordTest() { + mFileName = Environment.getExternalStorageDirectory().getAbsolutePath(); + mFileName += "/audiorecordtest.3gp"; + } + + @Override + public void onCreate(Bundle icicle) { + super.onCreate(icicle); + + LinearLayout ll = new LinearLayout(this); + mRecordButton = new RecordButton(this); + ll.addView(mRecordButton, + new LinearLayout.LayoutParams( + ViewGroup.LayoutParams.WRAP_CONTENT, + ViewGroup.LayoutParams.WRAP_CONTENT, + 0)); + mPlayButton = new PlayButton(this); + ll.addView(mPlayButton, + new 
LinearLayout.LayoutParams( + ViewGroup.LayoutParams.WRAP_CONTENT, + ViewGroup.LayoutParams.WRAP_CONTENT, + 0)); + setContentView(ll); + } + + @Override + public void onPause() { + super.onPause(); + if (mRecorder != null) { + mRecorder.release(); + mRecorder = null; + } + + if (mPlayer != null) { + mPlayer.release(); + mPlayer = null; + } + } +} +</pre>
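<p>The example above writes its output file to external storage without first checking whether that storage is actually mounted and writable. A minimal sketch of such a check, reusing the {@code android.os.Environment} import already present in the example (the helper name {@code isExternalStorageWritable()} is illustrative, not part of the example class), might look like this:</p>

<pre>
private boolean isExternalStorageWritable() {
    // External storage can be removed or mounted read-only, so verify that it
    // is writable before calling startRecording() with a path on that storage.
    return Environment.MEDIA_MOUNTED.equals(Environment.getExternalStorageState());
}
</pre>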
\ No newline at end of file diff --git a/docs/html/guide/topics/media/camera.jd b/docs/html/guide/topics/media/camera.jd new file mode 100644 index 0000000..877bded --- /dev/null +++ b/docs/html/guide/topics/media/camera.jd @@ -0,0 +1,1055 @@ +page.title=Camera +parent.title=Multimedia and Camera +parent.link=index.html +@jd:body + +<div id="qv-wrapper"> + <div id="qv"> + <h2>In this document</h2> + <ol> + <li><a href="#considerations">Considerations</a></li> + <li><a href="#basics">The Basics</a> + <li><a href="#manifest">Manifest Declarations</a></li> + <li><a href="#intents">Using Existing Camera Apps</a> + <ol> + <li><a href="#intent-image">Image capture intent</a></li> + <li><a href="#intent-video">Video capture intent</a></li> + <li><a href="#intent-receive">Receiving camera intent result</a></li> + </ol> + <li><a href="#custom-camera">Building a Camera App</a> + <ol> + <li><a href="#detect-camera">Detecting camera hardware</a></li> + <li><a href="#access-camera">Accessing cameras</a></li> + <li><a href="#check-camera-features">Checking camera features</a></li> + <li><a href="#camera-preview">Creating a preview class</a></li> + <li><a href="#preview-layout">Placing preview in a layout</a></li> + <li><a href="#capture-picture">Capturing pictures</a></li> + <li><a href="#capture-video">Capturing videos</a></li> + <li><a href="#release-camera">Releasing the camera</a></li> + </ol> + </li> + <li><a href="#saving-media">Saving Media Files</a></li> + </ol> + <h2>Key Classes</h2> + <ol> + <li>{@link android.hardware.Camera}</li> + <li>{@link android.view.SurfaceView}</li> + <li>{@link android.media.MediaRecorder}</li> + <li>{@link android.content.Intent}</li> + </ol> + <h2>See also</h2> + <ol> + <li><a href="{@docRoot}reference/android/hardware/Camera.html">Camera</a></li> + <li><a href="{@docRoot}reference/android/media/MediaRecorder.html">MediaRecorder</a></li> + <li><a href="{@docRoot}guide/topics/data/data-storage.html">Data Storage</a></li> + </ol> + </div> +</div> + + +<p>The Android framework includes support for various cameras and camera features available on +devices, allowing you to capture pictures and videos in your applications. This document discusses a +quick, simple approach to image and video capture and outlines an advanced approach for creating +custom camera experiences for your users.</p> + +<h2 id="considerations">Considerations</h2> +<p>Before enabling your application to use cameras on Android devices, you should consider a few +questions about how your app intends to use this hardware feature.</p> + +<ul> + <li><strong>Camera Requirement</strong> - Is the use of a camera so important to your +application that you do not want your application installed on a device that does not have a +camera? If so, you should declare the <a href="#manifest">camera requirement in your +manifest</a>.</li> + + <li><strong>Quick Picture or Customized Camera</strong> - How will your application use the +camera? Are you just interested in snapping a quick picture or video clip, or will your application +provide a new way to use cameras? For a getting a quick snap or clip, consider +<a href="#intents">Using Existing Camera Apps</a>. For developing a customized camera feature, check +out the <a href="#custom-camera">Building a Camera App</a> section.</li> + + <li><strong>Storage</strong> - Are the images or videos your application generates intended to be +only visible to your application or shared so that other applications such as Gallery or other +media and social apps can use them? 
Do you want the pictures and videos to be available even if your +application is uninstalled? Check out the <a href="#saving-media">Saving Media Files</a> section to +see how to implement these options.</li> +</ul> + + + +<h2 id="basics">The Basics</h2> +<p>The Android framework supports capturing images and video through the +{@link android.hardware.Camera} API or camera {@link android.content.Intent}. Here are the relevant +classes:</p> + +<dl> + <dt>{@link android.hardware.Camera}</dt> + <dd>This class is the primary API for controlling device cameras. This class is used to take +pictures or videos when you are building a camera application.</a>.</dd> + + <dt>{@link android.view.SurfaceView}</dt> + <dd>This class is used to present a live camera preview to the user.</dd> + + <dt>{@link android.media.MediaRecorder}</dt> + <dd>This class is used to record video from the camera.</dd> + + <dt>{@link android.content.Intent}</dt> + <dd>An intent action type of {@link android.provider.MediaStore#ACTION_IMAGE_CAPTURE +MediaStore.ACTION_IMAGE_CAPTURE} or {@link android.provider.MediaStore#ACTION_VIDEO_CAPTURE +MediaStore.ACTION_VIDEO_CAPTURE} can be used to capture images or videos without directly +using the {@link android.hardware.Camera} object.</dd> +</dl> + + +<h2 id="manifest">Manifest Declarations</h2> +<p>Before starting development on your application with the Camera API, you should make sure +your manifest has the appropriate declarations to allow use of camera hardware and other +related features.</p> + +<ul> + <li><strong>Camera Permission</strong> - Your application must request permission to use a device +camera. +<pre> +<uses-permission android:name="android.permission.CAMERA" /> +</pre> + <p class="note"><strong>Note:</strong> If you are using the camera <a href="#intents">via an +intent</a>, your application does not need to request this permission.</p> + </li> + <li><strong>Camera Features</strong> - Your application must also declare use of camera features, +for example: +<pre> +<uses-feature android:name="android.hardware.camera" /> +</pre> + <p>For a list of camera features, see the manifest <a +href="{@docRoot}guide/topics/manifest/uses-feature-element.html#features-reference">Features +Reference</a>.</p> + <p>Adding camera features to your manifest causes Android Market to prevent your application from +being installed to devices that do not include a camera or do not support the camera features you +specify. For more information about using feature-based filtering with Android Market, see <a +href="{@docRoot}guide/topics/manifest/uses-feature-element.html#market-feature-filtering">Android +Market and Feature-Based Filtering</a>.</p> + <p>If your application <em>can use</em> a camera or camera feature for proper operation, but does +not <em>require</em> it, you should specify this in the manifest by including the {@code +android:required} attribute, and setting it to {@code false}:</p> +<pre> +<uses-feature android:name="android.hardware.camera" android:required="false" /> +</pre> + + </li> + <li><strong>Storage Permission</strong> - If your application saves images or videos to the +device's external storage (SD Card), you must also specify this in the manifest. +<pre> +<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" /> +</pre> + </li> + <li><strong>Audio Recording Permission</strong> - For recording audio with video capture, your +application must request the audio capture permission. 
+<pre> +<uses-permission android:name="android.permission.RECORD_AUDIO" /> +</pre> + </li> +</ul> + + +<h2 id="intents">Using Existing Camera Apps</h2> +<p>A quick way to enable taking pictures or videos in your application without a lot of extra code +is to use an {@link android.content.Intent} to invoke an existing Android camera application. A +camera intent makes a request to capture a picture or video clip through an existing camera app and +then returns control back to your application. This section shows you how to capture an image or +video using this technique.</p> + +<p>The procedure for invoking a camera intent follows these general steps:</p> + +<ol> + <li><strong>Compose a Camera Intent</strong> - Create an {@link android.content.Intent} that +requests an image or video, using one of these intent types: + <ul> + <li>{@link android.provider.MediaStore#ACTION_IMAGE_CAPTURE MediaStore.ACTION_IMAGE_CAPTURE} - +Intent action type for requesting an image from an existing camera application.</li> + <li>{@link android.provider.MediaStore#ACTION_VIDEO_CAPTURE MediaStore.ACTION_VIDEO_CAPTURE} - +Intent action type for requesting a video from an existing camera application. </li> + </ul> + </li> + <li><strong>Start the Camera Intent</strong> - Use the {@link +android.app.Activity#startActivityForResult(android.content.Intent, int) startActivityForResult()} +method to execute the camera intent. After you start the intent, the Camera application user +interface appears on the device screen and the user can take a picture or video.</li> + <li><strong>Receive the Intent Result</strong> - Set up an {@link +android.app.Activity#onActivityResult(int, int, android.content.Intent) onActivityResult()} method +in your application to receive the callback and data from the camera intent. When the user +finishes taking a picture or video (or cancels the operation), the system calls this method.</li> +</ol> + + +<h3 id="intent-image">Image capture intent</h3> +<p>Capturing images using a camera intent is quick way to enable your application to take pictures +with minimal coding. An image capture intent can include the following extra information:</p> + +<ul> + <li>{@link android.provider.MediaStore#EXTRA_OUTPUT MediaStore.EXTRA_OUTPUT} - This setting +requires a {@link android.net.Uri} object specifying a path and file name where you'd like to +save the picture. This setting is optional but strongly recommended. If you do not specify this +value, the camera application saves the requested picture in the default location with a default +name, specified in the returned intent's {@link android.content.Intent#getData() Intent.getData()} +field.</li> +</ul> + +<p>The following example demonstrates how to construct a image capture intent and execute it. 
+The {@code getOutputMediaFileUri()} method in this example refers to the sample code shown in <a +href= "#saving-media">Saving Media Files</a>.</p> + +<pre> +private static final int CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE = 100; +private Uri fileUri; + +@Override +public void onCreate(Bundle savedInstanceState) { + super.onCreate(savedInstanceState); + setContentView(R.layout.main); + + // create Intent to take a picture and return control to the calling application + Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE); + + fileUri = getOutputMediaFileUri(MEDIA_TYPE_IMAGE); // create a file to save the image + intent.putExtra(MediaStore.EXTRA_OUTPUT, fileUri); // set the image file name + + // start the image capture Intent + startActivityForResult(intent, CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE); +} +</pre> + +<p>When the {@link android.app.Activity#startActivityForResult(android.content.Intent, int) +startActivityForResult()} method is executed, users see a camera application interface. +After the user finishes taking a picture (or cancels the operation), the user interface returns to +your application, and you must intercept the {@link +android.app.Activity#onActivityResult(int, int, android.content.Intent) onActivityResult()} +method to receive the result of the intent and continue your application execution. For information +on how to receive the completed intent, see <a href="#intent-receive">Receiving Camera Intent +Result</a>.</p> + + +<h3 id="intent-video">Video capture intent</h3> +<p>Capturing video using a camera intent is a quick way to enable your application to take videos +with minimal coding. A video capture intent can include the following extra information:</p> + +<ul> + <li>{@link android.provider.MediaStore#EXTRA_OUTPUT MediaStore.EXTRA_OUTPUT} - This setting +requires a {@link android.net.Uri} specifying a path and file name where you'd like to save the +video. This setting is optional but strongly recommended. If you do not specify this value, the +Camera application saves the requested video in the default location with a default name, specified +in the returned intent's {@link android.content.Intent#getData() Intent.getData()} field.</li> + <li>{@link android.provider.MediaStore#EXTRA_VIDEO_QUALITY MediaStore.EXTRA_VIDEO_QUALITY} - +This value can be 0 for lowest quality and smallest file size or 1 for highest quality and +larger file size.</li> + <li>{@link android.provider.MediaStore#EXTRA_DURATION_LIMIT MediaStore.EXTRA_DURATION_LIMIT} - +Set this value to limit the length, in seconds, of the video being captured.</li> + <li>{@link android.provider.MediaStore#EXTRA_SIZE_LIMIT MediaStore.EXTRA_SIZE_LIMIT} - +Set this value to limit the file size, in bytes, of the video being captured. +</li> +</ul> + +<p>The following example demonstrates how to construct a video capture intent and execute it. 
+The {@code getOutputMediaFileUri()} method in this example refers to the sample code shown in <a +href= "#saving-media">Saving Media Files</a>.</p> + +<pre> +private static final int CAPTURE_VIDEO_ACTIVITY_REQUEST_CODE = 200; +private Uri fileUri; + +@Override +public void onCreate(Bundle savedInstanceState) { + super.onCreate(savedInstanceState); + setContentView(R.layout.main); + + //create new Intent + Intent intent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE); + + fileUri = getOutputMediaFileUri(MEDIA_TYPE_VIDEO); // create a file to save the video + intent.putExtra(MediaStore.EXTRA_OUTPUT, fileUri); // set the image file name + + intent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 1); // set the video image quality to high + + // start the Video Capture Intent + startActivityForResult(intent, CAPTURE_VIDEO_ACTIVITY_REQUEST_CODE); +} +</pre> + +<p>When the {@link +android.app.Activity#startActivityForResult(android.content.Intent, int) +startActivityForResult()} method is executed, users see a modified camera application interface. +After the user finishes taking a video (or cancels the operation), the user interface +returns to your application, and you must intercept the {@link +android.app.Activity#onActivityResult(int, int, android.content.Intent) onActivityResult()} +method to receive the result of the intent and continue your application execution. For information +on how to receive the completed intent, see the next section.</p> + +<h3 id="intent-receive">Receiving camera intent result</h3> +<p>Once you have constructed and executed an image or video camera intent, your application must be +configured to receive the result of the intent. This section shows you how to intercept the callback +from a camera intent so your application can do further processing of the captured image or +video.</p> + +<p>In order to receive the result of an intent, you must override the {@link +android.app.Activity#onActivityResult(int, int, android.content.Intent) onActivityResult()} in the +activity that started the intent. 
The following example demonstrates how to override {@link +android.app.Activity#onActivityResult(int, int, android.content.Intent) onActivityResult()} to +capture the result of the <a href="#intent-image">image camera intent</a> or <a +href="#intent-video">video camera intent</a> examples shown in the previous sections.</p> + +<pre> +private static final int CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE = 100; +private static final int CAPTURE_VIDEO_ACTIVITY_REQUEST_CODE = 200; + +@Override +protected void onActivityResult(int requestCode, int resultCode, Intent data) { + if (requestCode == CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE) { + if (resultCode == RESULT_OK) { + // Image captured and saved to fileUri specified in the Intent + Toast.makeText(this, "Image saved to:\n" + + data.getData(), Toast.LENGTH_LONG).show(); + } else if (resultCode == RESULT_CANCELED) { + // User cancelled the image capture + } else { + // Image capture failed, advise user + } + } + + if (requestCode == CAPTURE_VIDEO_ACTIVITY_REQUEST_CODE) { + if (resultCode == RESULT_OK) { + // Video captured and saved to fileUri specified in the Intent + Toast.makeText(this, "Video saved to:\n" + + data.getData(), Toast.LENGTH_LONG).show(); + } else if (resultCode == RESULT_CANCELED) { + // User cancelled the video capture + } else { + // Video capture failed, advise user + } + } +} +</pre> + +<p>Once your activity receives a successful result, the captured image or video is available in the +specified location for your application to access.</p> + + + +<h2 id="custom-camera">Building a Camera App</h2> +<p>Some developers may require a camera user interface that is customized to the look of their +application or provides special features. Creating a customized camera activity requires more +code than <a href="#intents">using an intent</a>, but it can provide a more compelling experience +for your users.</p> + +<p>The general steps for creating a custom camera interface for your application are as follows:</p> + +<ul> + <li><strong>Detect and Access Camera</strong> - Create code to check for the existence of +cameras and request access.</li> + <li><strong>Create a Preview Class</strong> - Create a camera preview class that extends {@link +android.view.SurfaceView} and implements the {@link android.view.SurfaceHolder} interface. This +class previews the live images from the camera.</li> + <li><strong>Build a Preview Layout</strong> - Once you have the camera preview class, create a +view layout that incorporates the preview and the user interface controls you want.</li> + <li><strong>Setup Listeners for Capture</strong> - Connect listeners for your interface +controls to start image or video capture in response to user actions, such as pressing a +button.</li> + <li><strong>Capture and Save Files</strong> - Setup the code for capturing pictures or +videos and saving the output.</li> + <li><strong>Release the Camera</strong> - After using the camera, your application must +properly release it for use by other applications.</li> +</ul> + +<p>Camera hardware is a shared resource that must be carefully managed so your application does +not collide with other applications that may also want to use it. 
The following sections discusses +how to detect camera hardware, how to request access to a camera and how to release it when your +application is done using it.</p> + +<p class="caution"><strong>Caution:</strong> Remember to release the {@link android.hardware.Camera} +object by calling the {@link android.hardware.Camera#release() Camera.release()} when your +application is done using it! If your application does not properly release the camera, all +subsequent attempts to access the camera, including those by your own application, will fail and may +cause your or other applications to be shut down.</p> + + +<h3 id="detect-camera">Detecting camera hardware</h3> +<p>If your application does not specifically require a camera using a manifest declaration, you +should check to see if a camera is available at runtime. To perform this check, use the {@link +android.content.pm.PackageManager#hasSystemFeature(java.lang.String) +PackageManager.hasSystemFeature()} method, as shown in the example code below:</p> + +<pre> +/** Check if this device has a camera */ +private boolean checkCameraHardware(Context context) { + if (context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA)){ + // this device has a camera + return true; + } else { + // no camera on this device + return false; + } +} +</pre> + +<p>Android devices can have multiple cameras, for example a back-facing camera for photography and a +front-facing camera for video calls. Android 2.3 (API Level 9) and later allows you to check the +number of cameras available on a device using the {@link +android.hardware.Camera#getNumberOfCameras() Camera.getNumberOfCameras()} method.</p> + +<h3 id="access-camera">Accessing cameras</h3> +<p>If you have determined that the device on which your application is running has a camera, you +must request to access it by getting an instance of {@link android.hardware.Camera} (unless you +are using an <a href="#intents">intent to access the camera</a>). </p> + +<p>To access the primary camera, use the {@link android.hardware.Camera#open() Camera.open()} method +and be sure to catch any exceptions, as shown in the code below:</p> + +<pre> +/** A safe way to get an instance of the Camera object. */ +public static Camera getCameraInstance(){ + Camera c = null; + try { + c = Camera.open(); // attempt to get a Camera instance + } + catch (Exception e){ + // Camera is not available (in use or does not exist) + } + return c; // returns null if camera is unavailable +} +</pre> + +<p class="caution"><strong>Caution:</strong> Always check for exceptions when using {@link +android.hardware.Camera#open() Camera.open()}. Failing to check for exceptions if the camera is in +use or does not exist will cause your application to be shut down by the system.</p> + +<p>On devices running Android 2.3 (API Level 9) or higher, you can access specific cameras using +{@link android.hardware.Camera#open(int) Camera.open(int)}. The example code above will access +the first, back-facing camera on a device with more than one camera.</p> + +<h3 id="check-camera-features">Checking camera features</h3> +<p>Once you obtain access to a camera, you can get further information about its capabilties using +the {@link android.hardware.Camera#getParameters() Camera.getParameters()} method and checking the +returned {@link android.hardware.Camera.Parameters} object for supported capabilities. 
When using +API Level 9 or higher, use the {@link android.hardware.Camera#getCameraInfo(int, +android.hardware.Camera.CameraInfo) Camera.getCameraInfo()} to determine if a camera is on the front +or back of the device, and the orientation of the image.</p> + + + +<h3 id="camera-preview">Creating a preview class</h3> +<p>For users to effectively take pictures or video, they must be able to see what the device camera +sees. A camera preview class is a {@link android.view.SurfaceView} that can display the live image +data coming from a camera, so users can frame and capture a picture or video.</p> + +<p>The following example code demonstrates how to create a basic camera preview class that can be +included in a {@link android.view.View} layout. This class implements {@link +android.view.SurfaceHolder.Callback SurfaceHolder.Callback} in order to capture the callback events +for creating and destroying the view, which are needed for assigning the camera preview input.</p> + +<pre> +/** A basic Camera preview class */ +public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback { + private SurfaceHolder mHolder; + private Camera mCamera; + + public CameraPreview(Context context, Camera camera) { + super(context); + mCamera = camera; + + // Install a SurfaceHolder.Callback so we get notified when the + // underlying surface is created and destroyed. + mHolder = getHolder(); + mHolder.addCallback(this); + // deprecated setting, but required on Android versions prior to 3.0 + mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); + } + + public void surfaceCreated(SurfaceHolder holder) { + // The Surface has been created, now tell the camera where to draw the preview. + try { + mCamera.setPreviewDisplay(holder); + mCamera.startPreview(); + } catch (IOException e) { + Log.d(TAG, "Error setting camera preview: " + e.getMessage()); + } + } + + public void surfaceDestroyed(SurfaceHolder holder) { + // empty. Take care of releasing the Camera preview in your activity. + } + + public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) { + // If your preview can change or rotate, take care of those events here. + // Make sure to stop the preview before resizing or reformatting it. + + if (mHolder.getSurface() == null){ + // preview surface does not exist + return; + } + + // stop preview before making changes + try { + mCamera.stopPreview(); + } catch (Exception e){ + // ignore: tried to stop a non-existent preview + } + + // make any resize, rotate or reformatting changes here + + // start preview with new settings + try { + mCamera.setPreviewDisplay(mHolder); + mCamera.startPreview(); + + } catch (Exception e){ + Log.d(TAG, "Error starting camera preview: " + e.getMessage()); + } + } +} +</pre> + + +<h3 id="preview-layout">Placing preview in a layout</h3> +<p>A camera preview class, such as the example shown in the previous section, must be placed in the +layout of an activity along with other user interface controls for taking a picture or video. This +section shows you how to build a basic layout and activity for the preview.</p> + +<p>The following layout code provides a very basic view that can be used to display a camera +preview. In this example, the {@link android.widget.FrameLayout} element is meant to be the +container for the camera preview class. 
This layout type is used so that additional picture +information or controls can be overlayed on the live camera preview images.</p> + +<pre> +<?xml version="1.0" encoding="utf-8"?> +<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android" + android:orientation="horizontal" + android:layout_width="fill_parent" + android:layout_height="fill_parent" + > + <FrameLayout + android:id="@+id/camera_preview" + android:layout_width="fill_parent" + android:layout_height="fill_parent" + android:layout_weight="1" + /> + + <Button + android:id="@+id/button_capture" + android:text="Capture" + android:layout_width="wrap_content" + android:layout_height="wrap_content" + android:layout_gravity="center" + /> +</LinearLayout> +</pre> + +<p>On most devices, the default orientation of the camera preview is landscape. This example layout +specifies a horizontal (landscape) layout and the code below fixes the orientation of the +application to landscape. For simplicity in rendering a camera preview, you should change your +application's preview activity orientation to landscape by adding the following to your +manifest.</p> + +<pre> +<activity android:name=".CameraActivity" + android:label="@string/app_name" + + android:screenOrientation="landscape"> + <!-- configure this activity to use landscape orientation --> + + <intent-filter> + <action android:name="android.intent.action.MAIN" /> + <category android:name="android.intent.category.LAUNCHER" /> + </intent-filter> +</activity> +</pre> + +<p class="note"><strong>Note:</strong> A camera preview does not have to be in landscape mode. +Starting in Android 2.2 (API Level 8), you can use the {@link +android.hardware.Camera#setDisplayOrientation(int) setDisplayOrientation()} method to set the +rotation of the preview image. In order to change preview orientation as the user re-orients the +phone, within the {@link +android.view.SurfaceHolder.Callback#surfaceChanged(android.view.SurfaceHolder, int, int, int) +surfaceChanged()} method of your preview class, first stop the preview with {@link +android.hardware.Camera#stopPreview() Camera.stopPreview()} change the orientation and then +start the preview again with {@link android.hardware.Camera#startPreview() +Camera.startPreview()}.</p> + +<p>In the activity for your camera view, add your preview class to the {@link +android.widget.FrameLayout} element shown in the example above. Your camera activity must also +ensure that it releases the camera when it is paused or shut down. The following example shows how +to modify a camera activity to attach the preview class shown in <a href="#camera-preview">Creating +a preview class</a>.</p> + +<pre> +public class CameraActivity extends Activity { + + private Camera mCamera; + private CameraPreview mPreview; + + @Override + public void onCreate(Bundle savedInstanceState) { + super.onCreate(savedInstanceState); + setContentView(R.layout.main); + + // Create an instance of Camera + mCamera = getCameraInstance(); + + // Create our Preview view and set it as the content of our activity. 
+ mPreview = new CameraPreview(this, mCamera); + FrameLayout preview = (FrameLayout) findViewById(id.camera_preview); + preview.addView(mPreview); + } +} +</pre> + +<p class="note"><strong>Note:</strong> The {@code getCameraInstance()} method in the example above +refers to the example method shown in <a href="#access-camera">Accessing cameras</a>.</p> + + +<h3 id="capture-picture">Capturing pictures</h3> +<p>Once you have built a preview class and a view layout in which to display it, you are ready to +start capturing images with your application. In your application code, you must set up listeners +for your user interface controls to respond to a user action by taking a picture.</p> + +<p>In order to retrieve a picture, use the {@link +android.hardware.Camera#takePicture(android.hardware.Camera.ShutterCallback, +android.hardware.Camera.PictureCallback, android.hardware.Camera.PictureCallback) +Camera.takePicture()} method. This method takes three parameters which receive data from the camera. +In order to receive data in a JPEG format, you must implement an {@link +android.hardware.Camera.PictureCallback} interface to receive the image data and +write it to a file. The following code shows a basic implementation of the {@link +android.hardware.Camera.PictureCallback} interface to save an image received from the camera.</p> + +<pre> +private PictureCallback mPicture = new PictureCallback() { + + @Override + public void onPictureTaken(byte[] data, Camera camera) { + + File pictureFile = getOutputMediaFile(MEDIA_TYPE_IMAGE); + if (pictureFile == null){ + Log.d(TAG, "Error creating media file, check storage permissions: " + + e.getMessage()); + return; + } + + try { + FileOutputStream fos = new FileOutputStream(pictureFile); + fos.write(data); + fos.close(); + } catch (FileNotFoundException e) { + Log.d(TAG, "File not found: " + e.getMessage()); + } catch (IOException e) { + Log.d(TAG, "Error accessing file: " + e.getMessage()); + } + } +}; +</pre> + +<p>Trigger capturing an image by calling the {@link +android.hardware.Camera#takePicture(android.hardware.Camera.ShutterCallback, +android.hardware.Camera.PictureCallback, android.hardware.Camera.PictureCallback) +Camera.takePicture()} method. The following example code shows how to call this method from a +button {@link android.view.View.OnClickListener}.</p> + +<pre> +// Add a listener to the Capture button +Button captureButton = (Button) findViewById(id.button_capture); + captureButton.setOnClickListener( + new View.OnClickListener() { + @Override + public void onClick(View v) { + // get an image from the camera + mCamera.takePicture(null, null, mPicture); + } + } +); +</pre> + +<p class="note"><strong>Note:</strong> The {@code mPicture} member in the following example refers +to the example code above.</p> + +<p class="caution"><strong>Caution:</strong> Remember to release the {@link android.hardware.Camera} +object by calling the {@link android.hardware.Camera#release() Camera.release()} when your +application is done using it! For information about how to release the camera, see <a +href="#release-camera">Releasing the camera</a>.</p> + + +<h3 id="capture-video">Capturing videos</h3> + +<p>Video capture using the Android framework requires careful management of the {@link +android.hardware.Camera} object and coordination with the {@link android.media.MediaRecorder} +class. 
When recording video with {@link android.hardware.Camera}, you must manage the {@link +android.hardware.Camera#lock() Camera.lock()} and {@link android.hardware.Camera#unlock() +Camera.unlock()} calls to allow {@link android.media.MediaRecorder} access to the camera hardware, +in addition to the {@link android.hardware.Camera#open() Camera.open()} and {@link +android.hardware.Camera#release() Camera.release()} calls.</p> + +<p class="note"><strong>Note:</strong> Starting with Android 4.0 (API level 14), the {@link +android.hardware.Camera#lock() Camera.lock()} and {@link android.hardware.Camera#unlock() +Camera.unlock()} calls are managed for you automatically.</p> + +<p>Unlike taking pictures with a device camera, capturing video requires a very particular call +order. You must follow a specific order of execution to successfully prepare for and capture video +with your application, as detailed below.</p> + +<ol> + <li><strong>Open Camera</strong> - Use the {@link android.hardware.Camera#open() Camera.open()} +to get an instance of the camera object.</li> + <li><strong>Connect Preview</strong> - Prepare a live camera image preview by connecting a {@link +android.view.SurfaceView} to the camera using {@link +android.hardware.Camera#setPreviewDisplay(android.view.SurfaceHolder) Camera.setPreviewDisplay()}. + </li> + <li><strong>Start Preview</strong> - Call {@link android.hardware.Camera#startPreview() +Camera.startPreview()} to begin displaying the live camera images.</li> + <li><strong>Start Recording Video</strong> - The following steps must be completed <em>in +order</em> to successfully record video: + <ol style="list-style-type: lower-alpha;"> + <li><strong>Unlock the Camera</strong> - Unlock the camera for use by {@link +android.media.MediaRecorder} by calling {@link android.hardware.Camera#unlock() +Camera.unlock()}.</li> + <li><strong>Configure MediaRecorder</strong> - Call in the following {@link +android.media.MediaRecorder} methods <em>in this order</em>. For more information, see the {@link +android.media.MediaRecorder} reference documentation. + <ol> + <li>{@link android.media.MediaRecorder#setCamera(android.hardware.Camera) +setCamera()} - Set the camera to be used for video capture, use your application's current instance +of {@link android.hardware.Camera}.</li> + <li>{@link android.media.MediaRecorder#setAudioSource(int) setAudioSource()} - Set the +audio source, use {@link android.media.MediaRecorder.AudioSource#CAMCORDER +MediaRecorder.AudioSource.CAMCORDER}. </li> + <li>{@link android.media.MediaRecorder#setVideoSource(int) setVideoSource()} - Set +the video source, use {@link android.media.MediaRecorder.VideoSource#CAMERA +MediaRecorder.VideoSource.CAMERA}.</li> + <li>Set the video output format and encoding. For Android 2.2 (API Level 8) and +higher, use the {@link android.media.MediaRecorder#setProfile(android.media.CamcorderProfile) +MediaRecorder.setProfile} method, and get a profile instance using {@link +android.media.CamcorderProfile#get(int) CamcorderProfile.get()}. 
For versions of Android prior to +2.2, you must set the video output format and encoding parameters: + <ol style="list-style-type: lower-roman;"> + <li>{@link android.media.MediaRecorder#setOutputFormat(int) setOutputFormat()} - Set +the output format, specify the default setting or {@link +android.media.MediaRecorder.OutputFormat#MPEG_4 MediaRecorder.OutputFormat.MPEG_4}.</li> + <li>{@link android.media.MediaRecorder#setAudioEncoder(int) setAudioEncoder()} - Set +the sound encoding type, specify the default setting or {@link +android.media.MediaRecorder.AudioEncoder#AMR_NB MediaRecorder.AudioEncoder.AMR_NB}.</li> + <li>{@link android.media.MediaRecorder#setVideoEncoder(int) setVideoEncoder()} - Set +the video encoding type, specify the default setting or {@link +android.media.MediaRecorder.VideoEncoder#MPEG_4_SP MediaRecorder.VideoEncoder.MPEG_4_SP}.</li> + </ol> + </li> + <li>{@link android.media.MediaRecorder#setOutputFile(java.lang.String) setOutputFile()} - +Set the output file, use {@code getOutputMediaFile(MEDIA_TYPE_VIDEO).toString()} from the example +method in the <a href="#saving-media">Saving Media Files</a> section.</li> + <li>{@link android.media.MediaRecorder#setPreviewDisplay(android.view.Surface) +setPreviewDisplay()} - Specify the {@link android.view.SurfaceView} preview layout element for +your application. Use the same object you specified for <strong>Connect Preview</strong>.</li> + </ol> + <p class="caution"><strong>Caution:</strong> You must call these {@link +android.media.MediaRecorder} configuration methods <em>in this order</em>, otherwise your +application will encounter errors and the recording will fail.</p> + </li> + <li><strong>Prepare MediaRecorder</strong> - Prepare the {@link android.media.MediaRecorder} +with provided configuration settings by calling {@link android.media.MediaRecorder#prepare() +MediaRecorder.prepare()}.</li> + <li><strong>Start MediaRecorder</strong> - Start recording video by calling {@link +android.media.MediaRecorder#start() MediaRecorder.start()}.</li> + </ol> + </li> + <li><strong>Stop Recording Video</strong> - Call the following methods <em>in order</em>, to +successfully complete a video recording: + <ol style="list-style-type: lower-alpha;"> + <li><strong>Stop MediaRecorder</strong> - Stop recording video by calling {@link +android.media.MediaRecorder#stop() MediaRecorder.stop()}.</li> + <li><strong>Reset MediaRecorder</strong> - Optionally, remove the configuration settings from +the recorder by calling {@link android.media.MediaRecorder#reset() MediaRecorder.reset()}.</li> + <li><strong>Release MediaRecorder</strong> - Release the {@link android.media.MediaRecorder} +by calling {@link android.media.MediaRecorder#release() MediaRecorder.release()}.</li> + <li><strong>Lock the Camera</strong> - Lock the camera so that future {@link +android.media.MediaRecorder} sessions can use it by calling {@link android.hardware.Camera#lock() +Camera.lock()}. 
Starting with Android 4.0 (API level 14), this call is not required unless the +{@link android.media.MediaRecorder#prepare() MediaRecorder.prepare()} call fails.</li> + </ol> + </li> + <li><strong>Stop the Preview</strong> - When your activity has finished using the camera, stop the +preview using {@link android.hardware.Camera#stopPreview() Camera.stopPreview()}.</li> + <li><strong>Release Camera</strong> - Release the camera so that other applications can use +it by calling {@link android.hardware.Camera#release() Camera.release()}.</li> +</ol> + +<p class="note"><strong>Note:</strong> It is possible to use {@link android.media.MediaRecorder} +without creating a camera preview first and skip the first few steps of this process. However, +since users typically prefer to see a preview before starting a recording, that process is not +discussed here.</p> + +<h4 id="configuring-mediarecorder">Configuring MediaRecorder</h4> +<p>When using the {@link android.media.MediaRecorder} class to record video, you must perform +configuration steps in a <em>specific order</em> and then call the {@link +android.media.MediaRecorder#prepare() MediaRecorder.prepare()} method to check and implement the +configuration. The following example code demonstrates how to properly configure and prepare the +{@link android.media.MediaRecorder} class for video recording.</p> + +<pre> +private boolean prepareVideoRecorder(){ + + mCamera = getCameraInstance(); + mMediaRecorder = new MediaRecorder(); + + // Step 1: Unlock and set camera to MediaRecorder + mCamera.unlock(); + mMediaRecorder.setCamera(mCamera); + + // Step 2: Set sources + mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER); + mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA); + + // Step 3: Set a CamcorderProfile (requires API Level 8 or higher) + mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH)); + + // Step 4: Set output file + mMediaRecorder.setOutputFile(getOutputMediaFile(MEDIA_TYPE_VIDEO).toString()); + + // Step 5: Set the preview output + mMediaRecorder.setPreviewDisplay(mPreview.getHolder().getSurface()); + + // Step 6: Prepare configured MediaRecorder + try { + mMediaRecorder.prepare(); + } catch (IllegalStateException e) { + Log.d(TAG, "IllegalStateException preparing MediaRecorder: " + e.getMessage()); + releaseMediaRecorder(); + return false; + } catch (IOException e) { + Log.d(TAG, "IOException preparing MediaRecorder: " + e.getMessage()); + releaseMediaRecorder(); + return false; + } + return true; +} +</pre> + +<p>Prior to Android 2.2 (API Level 8), you must set the output format and encoding formats +parameters directly, instead of using {@link android.media.CamcorderProfile}. 
This approach is +demonstrated in the following code:</p> + +<pre> + // Step 3: Set output format and encoding (for versions prior to API Level 8) + mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4); + mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT); + mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.DEFAULT); +</pre> + +<p>The following video recording parameters for {@link android.media.MediaRecorder} are given +default settings, however, you may want to adjust these settings for your application:</p> + +<ul> + <li>{@link android.media.MediaRecorder#setVideoEncodingBitRate(int) +setVideoEncodingBitRate()}</li> + <li>{@link android.media.MediaRecorder#setVideoSize(int, int) setVideoSize()}</li> + <li>{@link android.media.MediaRecorder#setVideoFrameRate(int) setVideoFrameRate()}</li> + <li>{@link android.media.MediaRecorder#setAudioEncodingBitRate(int) +setAudioEncodingBitRate()}</li> <li>{@link android.media.MediaRecorder#setAudioChannels(int) +setAudioChannels()}</li> + <li>{@link android.media.MediaRecorder#setAudioSamplingRate(int) setAudioSamplingRate()}</li> +</ul> + +<h4 id="start-stop-mediarecorder">Starting and Stopping MediaRecorder</h4> +<p>When starting and stopping video recording using the {@link android.media.MediaRecorder} class, +you must follow a specific order, as listed below.</p> + +<ol> + <li>Unlock the camera with {@link android.hardware.Camera#unlock() Camera.unlock()}</li> + <li>Configure {@link android.media.MediaRecorder} as shown in the code example above</li> + <li>Start recording using {@link android.media.MediaRecorder#start() +MediaRecorder.start()}</li> + <li>Record the video</li> + <li>Stop recording using {@link +android.media.MediaRecorder#stop() MediaRecorder.stop()}</li> + <li>Release the media recorder with {@link android.media.MediaRecorder#release() +MediaRecorder.release()}</li> + <li>Lock the camera using {@link android.hardware.Camera#lock() Camera.lock()}</li> +</ol> + +<p>The following example code demonstrates how to wire up a button to properly start and stop +video recording using the camera and the {@link android.media.MediaRecorder} class.</p> + +<p class="note"><strong>Note:</strong> When completing a video recording, do not release the camera +or else your preview will be stopped.</p> + +<pre> +private boolean isRecording = false; + +// Add a listener to the Capture button +Button captureButton = (Button) findViewById(id.button_capture); +captureButton.setOnClickListener( + new View.OnClickListener() { + @Override + public void onClick(View v) { + if (isRecording) { + // stop recording and release camera + mMediaRecorder.stop(); // stop the recording + releaseMediaRecorder(); // release the MediaRecorder object + mCamera.lock(); // take camera access back from MediaRecorder + + // inform the user that recording has stopped + setCaptureButtonText("Capture"); + isRecording = false; + } else { + // initialize video camera + if (prepareVideoRecorder()) { + // Camera is available and unlocked, MediaRecorder is prepared, + // now you can start recording + mMediaRecorder.start(); + + // inform the user that recording has started + setCaptureButtonText("Stop"); + isRecording = true; + } else { + // prepare didn't work, release the camera + releaseMediaRecorder(); + // inform user + } + } + } + } +); +</pre> + +<p class="note"><strong>Note:</strong> In the above example, the {@code prepareVideoRecorder()} +method refers to the example code shown in <a +href="#configuring-mediarecorder">Configuring 
MediaRecorder</a>. This method takes care of locking +the camera, configuring and preparing the {@link android.media.MediaRecorder} instance.</p> + + +<h3 id="release-camera">Releasing the camera</h3> +<p>Cameras are a resource that is shared by applications on a device. Your application can make +use of the camera after getting an instance of {@link android.hardware.Camera}, and you must be +particularly careful to release the camera object when your application stops using it, and as +soon as your application is paused ({@link android.app.Activity#onPause() Activity.onPause()}). If +your application does not properly release the camera, all subsequent attempts to access the camera, +including those by your own application, will fail and may cause your or other applications to be +shut down.</p> + +<p>To release an instance of the {@link android.hardware.Camera} object, use the {@link +android.hardware.Camera#release() Camera.release()} method, as shown in the example code below.</p> + +<pre> +public class CameraActivity extends Activity { + private Camera mCamera; + private SurfaceView mPreview; + private MediaRecorder mMediaRecorder; + + ... + + @Override + protected void onPause() { + super.onPause(); + releaseMediaRecorder(); // if you are using MediaRecorder, release it first + releaseCamera(); // release the camera immediately on pause event + } + + private void releaseMediaRecorder(){ + if (mMediaRecorder != null) { + mMediaRecorder.reset(); // clear recorder configuration + mMediaRecorder.release(); // release the recorder object + mMediaRecorder = null; + mCamera.lock(); // lock camera for later use + } + } + + private void releaseCamera(){ + if (mCamera != null){ + mCamera.release(); // release the camera for other applications + mCamera = null; + } + } +} +</pre> + +<p class="caution"><strong>Caution:</strong> If your application does not properly release the +camera, all subsequent attempts to access the camera, including those by your own application, will +fail and may cause your or other applications to be shut down.</p> + + +<h2 id="saving-media">Saving Media Files</h2> +<p>Media files created by users such as pictures and videos should be saved to a device's external +storage directory (SD Card) to conserve system space and to allow users to access these files +without their device. There are many possible directory locations to save media files on a device, +however there are only two standard locations you should consider as a developer:</p> + +<ul> + <li><strong>{@link android.os.Environment#getExternalStoragePublicDirectory(java.lang.String) +Environment.getExternalStoragePublicDirectory}({@link android.os.Environment#DIRECTORY_PICTURES +Environment.DIRECTORY_PICTURES})</strong> - This method returns the standard, shared and recommended +location for saving pictures and videos. This directory is shared (public), so other applications +can easily discover, read, change and delete files saved in this location. If your application is +uninstalled by the user, media files saved to this location will not be removed. To avoid +interfering with users existing pictures and videos, you should create a sub-directory for your +application's media files within this directory, as shown in the code sample below. 
This method is +available in Android 2.2 (API Level 8), for equivalent calls in earlier API versions, see <a +href="{@docRoot}guide/topics/data/data-storage.html#SavingSharedFiles">Saving Shared Files</a>.</li> + <li><strong>{@link android.content.Context#getExternalFilesDir(java.lang.String) +Context.getExternalFilesDir}({@link android.os.Environment#DIRECTORY_PICTURES +Environment.DIRECTORY_PICTURES})</strong> - This method returns a standard location for saving +pictures and videos which are associated with your application. If your application is uninstalled, +any files saved in this location are removed. Security is not enforced for files in this +location and other applications may read, change and delete them.</li> +</ul> + +<p>The following example code demonstrates how to create a {@link java.io.File} or {@link +android.net.Uri} location for a media file that can be used when invoking a device's camera with +an {@link android.content.Intent} or as part of a <a href="#custom-camera">Building a Camera +App</a>.</p> + +<pre> +public static final int MEDIA_TYPE_IMAGE = 1; +public static final int MEDIA_TYPE_VIDEO = 2; + +/** Create a file Uri for saving an image or video */ +private static Uri getOutputMediaFileUri(int type){ + return Uri.fromFile(getOutputMediaFile(type)); +} + +/** Create a File for saving an image or video */ +private static Uri getOutputMediaFile(int type){ + // To be safe, you should check that the SDCard is mounted + // using Environment.getExternalStorageState() before doing this. + + File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory( + Environment.DIRECTORY_PICTURES), "MyCameraApp"); + // This location works best if you want the created images to be shared + // between applications and persist after your app has been uninstalled. + + // Create the storage directory if it does not exist + if (! mediaStorageDir.exists()){ + if (! mediaStorageDir.mkdirs()){ + Log.d("MyCameraApp", "failed to create directory"); + return null; + } + } + + // Create a media file name + String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date()); + File mediaFile; + if (type == MEDIA_TYPE_IMAGE){ + mediaFile = new File(mediaStorageDir.getPath() + File.separator + + "IMG_"+ timeStamp + ".jpg"); + } else if(type == MEDIA_TYPE_VIDEO) { + mediaFile = new File(mediaStorageDir.getPath() + File.separator + + "VID_"+ timeStamp + ".mp4"); + } else { + return null; + } + + return mediaFile; +} +</pre> + +<p class="note"><strong>Note:</strong> {@link +android.os.Environment#getExternalStoragePublicDirectory(java.lang.String) +Environment.getExternalStoragePublicDirectory()} is available in Android 2.2 (API Level 8) or +higher. If you are targeting devices with earlier versions of Android, use {@link +android.os.Environment#getExternalStorageDirectory() Environment.getExternalStorageDirectory()} +instead. For more information, see <a +href="{@docRoot}guide/topics/data/data-storage.html#SavingSharedFiles">Saving Shared Files</a>.</p> + +<p>For more information about saving files on an Android device, see <a +href="{@docRoot}guide/topics/data/data-storage.html">Data Storage</a>.</p>
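<p>Files written to the shared pictures directory, as in the {@code getOutputMediaFile()} example above, may not appear in the Gallery or other media applications until the media scanner indexes them. A minimal sketch of one way to request a scan (the helper name {@code notifyMediaScanner()} is illustrative) is:</p>

<pre>
/** Ask the media scanner to index a newly saved file so other apps can see it. */
private void notifyMediaScanner(Context context, File mediaFile) {
    MediaScannerConnection.scanFile(
            context,
            new String[] { mediaFile.getAbsolutePath() },
            null,  // let the scanner infer the MIME type
            null); // no scan-completed callback needed for this sketch
}
</pre>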
\ No newline at end of file diff --git a/docs/html/guide/topics/media/index.jd b/docs/html/guide/topics/media/index.jd index 06e6208..7c1754f 100644 --- a/docs/html/guide/topics/media/index.jd +++ b/docs/html/guide/topics/media/index.jd @@ -1,971 +1,62 @@ -page.title=Media +page.title=Multimedia and Camera @jd:body <div id="qv-wrapper"> <div id="qv"> -<h2>Quickview</h2> -<ul> -<li>MediaPlayer APIs allow you to play and record media</li> -<li>You can handle data from raw resources, files, and streams</li> -<li>The platform supports a variety of media formats. See <a -href="{@docRoot}guide/appendix/media-formats.html">Android Supported Media Formats</a></li> -</ul> - -<h2>In this document</h2> +<h2>Topics</h2> <ol> -<li><a href="#mediaplayer">Using MediaPlayer</a> - <ol> - <li><a href='#preparingasync'>Asynchronous Preparation</a></li> - <li><a href='#managestate'>Managing State</a></li> - <li><a href='#releaseplayer'>Releasing the MediaPlayer</a></li> - </ol> -</li> -<li><a href="#mpandservices">Using a Service with MediaPlayer</a> - <ol> - <li><a href="#asyncprepare">Running asynchronously</a></li> - <li><a href="#asyncerror">Handling asynchronous errors</a></li> - <li><a href="#wakelocks">Using wake locks</a></li> - <li><a href="#foregroundserv">Running as a foreground service</a></li> - <li><a href="#audiofocus">Handling audio focus</a></li> - <li><a href="#cleanup">Performing cleanup</a></li> - </ol> -</li> -<li><a href="#noisyintent">Handling the AUDIO_BECOMING_NOISY Intent</a> -<li><a href="#viacontentresolver">Retrieving Media from a Content Resolver</a> -<li><a href="#jetcontent">Playing JET content</a> -<li><a href="#audiocapture">Performing Audio Capture</a> +<li><a href="{@docRoot}guide/topics/media/mediaplayer.html">MediaPlayer</a></li> +<li><a href="{@docRoot}guide/topics/media/jetplayer.html">JetPlayer</a></li> +<li><a href="{@docRoot}guide/topics/media/camera.html">Camera</a></li> +<li><a href="{@docRoot}guide/topics/media/audio-capture.html">Audio Capture</a></li> </ol> <h2>Key classes</h2> <ol> <li>{@link android.media.MediaPlayer}</li> +<li>{@link android.media.JetPlayer}</li> +<li>{@link android.hardware.Camera}</li> <li>{@link android.media.MediaRecorder}</li> <li>{@link android.media.AudioManager}</li> -<li>{@link android.media.JetPlayer}</li> <li>{@link android.media.SoundPool}</li> </ol> <h2>See also</h2> <ol> -<li><a href="{@docRoot}guide/topics/data/data-storage.html">Data Storage</a></li> -<li><a href="{@docRoot}guide/topics/media/jet/jetcreator_manual.html">JetCreator User Manual</a></li> +<li></li> +<li><a href="{@docRoot}guide/appendix/media-formats.html">Android Supported Media Formats</a></li> +<li><a href="{@docRoot}guide/topics/media/jet/jetcreator_manual.html">JetCreator User +Manual</a></li> </ol> </div> </div> -<p>The Android multimedia framework includes support for encoding and decoding a -variety of common media types, so that you can easily integrate audio, -video and images into your applications. You can play audio or video from media files stored in your -application's resources (raw resources), from standalone files in the filesystem, or from a data -stream arriving over a network connection, all using {@link android.media.MediaPlayer} APIs.</p> - -<p>You can also record audio and video using the {@link android.media.MediaRecorder} APIs if -supported by the device hardware. 
Note that the emulator doesn't have hardware to capture audio or -video, but actual mobile devices are likely to provide these capabilities.</p> - -<p>This document shows you how to write a media-playing application that interacts with the user and -the system in order to obtain good performance and a pleasant user experience.</p> - -<p class="note"><strong>Note:</strong> You can play back the audio data only to the standard output -device. Currently, that is the mobile device speaker or a Bluetooth headset. You cannot play sound -files in the conversation audio during a call.</p> - - -<h2 id="mediaplayer">Using MediaPlayer</h2> - -<p>One of the most important components of the media framework is the -{@link android.media.MediaPlayer MediaPlayer} -class. An object of this class can fetch, decode, and play both audio and video -with minimal setup. It supports several different media sources such as: -<ul> - <li>Local resources</li> - <li>Internal URIs, such as one you might obtain from a Content Resolver</li> - <li>External URLs (streaming)</li> -</ul> -</p> - -<p>For a list of media formats that Android supports, -see the <a href="{@docRoot}guide/appendix/media-formats.html">Android Supported Media -Formats</a> document. </p> - -<p>Here is an example -of how to play audio that's available as a local raw resource (saved in your application's -{@code res/raw/} directory):</p> - -<pre>MediaPlayer mediaPlayer = MediaPlayer.create(context, R.raw.sound_file_1); -mediaPlayer.start(); // no need to call prepare(); create() does that for you -</pre> - -<p>In this case, a "raw" resource is a file that the system does not -try to parse in any particular way. However, the content of this resource should not -be raw audio. It should be a properly encoded and formatted media file in one -of the supported formats.</p> - -<p>And here is how you might play from a URI available locally in the system -(that you obtained through a Content Resolver, for instance):</p> - -<pre>Uri myUri = ....; // initialize Uri here -MediaPlayer mediaPlayer = new MediaPlayer(); -mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); -mediaPlayer.setDataSource(getApplicationContext(), myUri); -mediaPlayer.prepare(); -mediaPlayer.start();</pre> - -<p>Playing from a remote URL via HTTP streaming looks like this:</p> - -<pre>String url = "http://........"; // your URL here -MediaPlayer mediaPlayer = new MediaPlayer(); -mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); -mediaPlayer.setDataSource(url); -mediaPlayer.prepare(); // might take long! (for buffering, etc) -mediaPlayer.start();</pre> - -<p class="note"><strong>Note:</strong> -If you're passing a URL to stream an online media file, the file must be capable of -progressive download.</p> - -<p class="caution"><strong>Caution:</strong> You must either catch or pass -{@link java.lang.IllegalArgumentException} and {@link java.io.IOException} when using -{@link android.media.MediaPlayer#setDataSource setDataSource()}, because -the file you are referencing might not exist.</p> - -<h3 id='#preparingasync'>Asynchronous Preparation</h3> - -<p>Using {@link android.media.MediaPlayer MediaPlayer} can be straightforward in -principle. However, it's important to keep in mind that a few more things are -necessary to integrate it correctly with a typical Android application. For -example, the call to {@link android.media.MediaPlayer#prepare prepare()} can -take a long time to execute, because -it might involve fetching and decoding media data. 
So, as is the case with any -method that may take long to execute, you should <strong>never call it from your -application's UI thread</strong>. Doing that will cause the UI to hang until the method returns, -which is a very bad user experience and can cause an ANR (Application Not Responding) error. Even if -you expect your resource to load quickly, remember that anything that takes more than a tenth -of a second to respond in the UI will cause a noticeable pause and will give -the user the impression that your application is slow.</p> - -<p>To avoid hanging your UI thread, spawn another thread to -prepare the {@link android.media.MediaPlayer} and notify the main thread when done. However, while -you could write the threading logic -yourself, this pattern is so common when using {@link android.media.MediaPlayer} that the framework -supplies a convenient way to accomplish this task by using the -{@link android.media.MediaPlayer#prepareAsync prepareAsync()} method. This method -starts preparing the media in the background and returns immediately. When the media -is done preparing, the {@link android.media.MediaPlayer.OnPreparedListener#onPrepared onPrepared()} -method of the {@link android.media.MediaPlayer.OnPreparedListener -MediaPlayer.OnPreparedListener}, configured through -{@link android.media.MediaPlayer#setOnPreparedListener setOnPreparedListener()} is called.</p> - -<h3 id='#managestate'>Managing State</h3> - -<p>Another aspect of a {@link android.media.MediaPlayer} that you should keep in mind is -that it's state-based. That is, the {@link android.media.MediaPlayer} has an internal state -that you must always be aware of when writing your code, because certain operations -are only valid when then player is in specific states. If you perform an operation while in the -wrong state, the system may throw an exception or cause other undesireable behaviors.</p> - -<p>The documentation in the -{@link android.media.MediaPlayer MediaPlayer} class shows a complete state diagram, -that clarifies which methods move the {@link android.media.MediaPlayer} from one state to another. -For example, when you create a new {@link android.media.MediaPlayer}, it is in the <em>Idle</em> -state. At that point, you should initialize it by calling -{@link android.media.MediaPlayer#setDataSource setDataSource()}, bringing it -to the <em>Initialized</em> state. After that, you have to prepare it using either the -{@link android.media.MediaPlayer#prepare prepare()} or -{@link android.media.MediaPlayer#prepareAsync prepareAsync()} method. When -the {@link android.media.MediaPlayer} is done preparing, it will then enter the <em>Prepared</em> -state, which means you can call {@link android.media.MediaPlayer#start start()} -to make it play the media. At that point, as the diagram illustrates, -you can move between the <em>Started</em>, <em>Paused</em> and <em>PlaybackCompleted</em> states by -calling such methods as -{@link android.media.MediaPlayer#start start()}, -{@link android.media.MediaPlayer#pause pause()}, and -{@link android.media.MediaPlayer#seekTo seekTo()}, -amongst others. 
When you -call {@link android.media.MediaPlayer#stop stop()}, however, notice that you -cannot call {@link android.media.MediaPlayer#start start()} again until you -prepare the {@link android.media.MediaPlayer} again.</p> - -<p>Always keep <a href='{@docRoot}images/mediaplayer_state_diagram.gif'>the state diagram</a> -in mind when writing code that interacts with a -{@link android.media.MediaPlayer} object, because calling its methods from the wrong state is a -common cause of bugs.</p> - -<h3 id='#releaseplayer'>Releasing the MediaPlayer</h3> - -<p>A {@link android.media.MediaPlayer MediaPlayer} can consume valuable -system resources. -Therefore, you should always take extra precautions to make sure you are not -hanging on to a {@link android.media.MediaPlayer} instance longer than necessary. When you -are done with it, you should always call -{@link android.media.MediaPlayer#release release()} to make sure any -system resources allocated to it are properly released. For example, if you are -using a {@link android.media.MediaPlayer} and your activity receives a call to {@link -android.app.Activity#onStop onStop()}, you must release the {@link android.media.MediaPlayer}, -because it -makes little sense to hold on to it while your activity is not interacting with -the user (unless you are playing media in the background, which is discussed in the next section). -When your activity is resumed or restarted, of course, you need to -create a new {@link android.media.MediaPlayer} and prepare it again before resuming playback.</p> - -<p>Here's how you should release and then nullify your {@link android.media.MediaPlayer}:</p> -<pre> -mediaPlayer.release(); -mediaPlayer = null; -</pre> - -<p>As an example, consider the problems that could happen if you -forgot to release the {@link android.media.MediaPlayer} when your activity is stopped, but create a -new one when the activity starts again. As you may know, when the user changes the -screen orientation (or changes the device configuration in another way), -the system handles that by restarting the activity (by default), so you might quickly -consume all of the system resources as the user -rotates the device back and forth between portrait and landscape, because at each -orientation change, you create a new {@link android.media.MediaPlayer} that you never -release. (For more information about runtime restarts, see <a -href="{@docRoot}guide/topics/resources/runtime-changes.html">Handling Runtime Changes</a>.)</p> - -<p>You may be wondering what happens if you want to continue playing -"background media" even when the user leaves your activity, much in the same -way that the built-in Music application behaves. In this case, what you need is -a {@link android.media.MediaPlayer MediaPlayer} controlled by a {@link android.app.Service}, as -discussed in <a href="mpandservices">Using a Service with MediaPlayer</a>.</p> - -<h2 id="mpandservices">Using a Service with MediaPlayer</h2> - -<p>If you want your media to play in the background even when your application -is not onscreen—that is, you want it to continue playing while the user is -interacting with other applications—then you must start a -{@link android.app.Service Service} and control the -{@link android.media.MediaPlayer MediaPlayer} instance from there. -You should be careful about this setup, because the user and the system have expectations -about how an application running a background service should interact with the rest of the -system. 
If your application does not fulfil those expectations, the user may -have a bad experience. This section describes the main issues that you should be -aware of and offers suggestions about how to approach them.</p> - - -<h3 id="asyncprepare">Running asynchronously</h3> - -<p>First of all, like an {@link android.app.Activity Activity}, all work in a -{@link android.app.Service Service} is done in a single thread by -default—in fact, if you're running an activity and a service from the same application, they -use the same thread (the "main thread") by default. Therefore, services need to -process incoming intents quickly -and never perform lengthy computations when responding to them. If any heavy -work or blocking calls are expected, you must do those tasks asynchronously: either from -another thread you implement yourself, or using the framework's many facilities -for asynchronous processing.</p> - -<p>For instance, when using a {@link android.media.MediaPlayer} from your main thread, -you should call {@link android.media.MediaPlayer#prepareAsync prepareAsync()} rather than -{@link android.media.MediaPlayer#prepare prepare()}, and implement -a {@link android.media.MediaPlayer.OnPreparedListener MediaPlayer.OnPreparedListener} -in order to be notified when the preparation is complete and you can start playing. -For example:</p> - -<pre> -public class MyService extends Service implements MediaPlayer.OnPreparedListener { - private static final ACTION_PLAY = "com.example.action.PLAY"; - MediaPlayer mMediaPlayer = null; - - public int onStartCommand(Intent intent, int flags, int startId) { - ... - if (intent.getAction().equals(ACTION_PLAY)) { - mMediaPlayer = ... // initialize it here - mMediaPlayer.setOnPreparedListener(this); - mMediaPlayer.prepareAsync(); // prepare async to not block main thread - } - } - - /** Called when MediaPlayer is ready */ - public void onPrepared(MediaPlayer player) { - player.start(); - } -} -</pre> - - -<h3 id="asyncerror">Handling asynchronous errors</h3> - -<p>On synchronous operations, errors would normally -be signaled with an exception or an error code, but whenever you use asynchronous -resources, you should make sure your application is notified -of errors appropriately. In the case of a {@link android.media.MediaPlayer MediaPlayer}, -you can accomplish this by implementing a -{@link android.media.MediaPlayer.OnErrorListener MediaPlayer.OnErrorListener} and -setting it in your {@link android.media.MediaPlayer} instance:</p> - -<pre> -public class MyService extends Service implements MediaPlayer.OnErrorListener { - MediaPlayer mMediaPlayer; - - public void initMediaPlayer() { - // ...initialize the MediaPlayer here... - - mMediaPlayer.setOnErrorListener(this); - } - - @Override - public boolean onError(MediaPlayer mp, int what, int extra) { - // ... react appropriately ... - // The MediaPlayer has moved to the Error state, must be reset! - } -} -</pre> - -<p>It's important to remember that when an error occurs, the {@link android.media.MediaPlayer} -moves to the <em>Error</em> state (see the documentation for the -{@link android.media.MediaPlayer MediaPlayer} class for the full state diagram) -and you must reset it before you can use it again. - - -<h3 id="wakelocks">Using wake locks</h3> - -<p>When designing applications that play media -in the background, the device may go to sleep -while your service is running. 
Because the Android system tries to conserve -battery while the device is sleeping, the system tries to shut off any -of the phone's features that are -not necessary, including the CPU and the WiFi hardware. -However, if your service is playing or streaming music, you want to prevent -the system from interfering with your playback.</p> - -<p>In order to ensure that your service continues to run under -those conditions, you have to use "wake locks." A wake lock is a way to signal to -the system that your application is using some feature that should -stay available even if the phone is idle.</p> - -<p class="caution"><strong>Notice:</strong> You should always use wake locks sparingly and hold them -only for as long as truly necessary, because they significantly reduce the battery life of the -device.</p> - -<p>To ensure that the CPU continues running while your {@link android.media.MediaPlayer} is -playing, call the {@link android.media.MediaPlayer#setWakeMode -setWakeMode()} method when initializing your {@link android.media.MediaPlayer}. Once you do, -the {@link android.media.MediaPlayer} holds the specified lock while playing and releases the lock -when paused or stopped:</p> - -<pre> -mMediaPlayer = new MediaPlayer(); -// ... other initialization here ... -mMediaPlayer.setWakeMode(getApplicationContext(), PowerManager.PARTIAL_WAKE_LOCK); -</pre> - -<p>However, the wake lock acquired in this example guarantees only that the CPU remains awake. If -you are streaming media over the -network and you are using Wi-Fi, you probably want to hold a -{@link android.net.wifi.WifiManager.WifiLock WifiLock} as -well, which you must acquire and release manually. So, when you start preparing the -{@link android.media.MediaPlayer} with the remote URL, you should create and acquire the Wi-Fi lock. -For example:</p> - -<pre> -WifiLock wifiLock = ((WifiManager) getSystemService(Context.WIFI_SERVICE)) - .createWifiLock(WifiManager.WIFI_MODE_FULL, "mylock"); - -wifiLock.acquire(); -</pre> - -<p>When you pause or stop your media, or when you no longer need the -network, you should release the lock:</p> - -<pre> -wifiLock.release(); -</pre> - - -<h3 id="foregroundserv">Running as a foreground service</h3> - -<p>Services are often used for performing background tasks, such as fetching emails, -synchronizing data, downloading content, amongst other possibilities. In these -cases, the user is not actively aware of the service's execution, and probably -wouldn't even notice if some of these services were interrupted and later restarted.</p> - -<p>But consider the case of a service that is playing music. Clearly this is a service that the user -is actively aware of and the experience would be severely affected by any interruptions. -Additionally, it's a service that the user will likely wish to interact with during its execution. -In this case, the service should run as a "foreground service." A -foreground service holds a higher level of importance within the system—the system will -almost never kill the service, because it is of immediate importance to the user. 
When running -in the foreground, the service also must provide a status bar notification to ensure that users are -aware of the running service and allow them to open an activity that can interact with the -service.</p> - -<p>In order to turn your service into a foreground service, you must create a -{@link android.app.Notification Notification} for the status bar and call -{@link android.app.Service#startForeground startForeground()} from the {@link -android.app.Service}. For example:</p> - -<pre>String songName; -// assign the song name to songName -PendingIntent pi = PendingIntent.getActivity(getApplicationContext(), 0, - new Intent(getApplicationContext(), MainActivity.class), - PendingIntent.FLAG_UPDATE_CURRENT); -Notification notification = new Notification(); -notification.tickerText = text; -notification.icon = R.drawable.play0; -notification.flags |= Notification.FLAG_ONGOING_EVENT; -notification.setLatestEventInfo(getApplicationContext(), "MusicPlayerSample", - "Playing: " + songName, pi); -startForeground(NOTIFICATION_ID, notification); -</pre> - -<p>While your service is running in the foreground, the notification you -configured is visible in the notification area of the device. If the user -selects the notification, the system invokes the {@link android.app.PendingIntent} you supplied. In -the example above, it opens an activity ({@code MainActivity}).</p> - -<p>Figure 1 shows how your notification appears to the user:</p> - -<img src='images/notification1.png' /> - -<img src='images/notification2.png' /> -<p class="img-caption"><strong>Figure 1.</strong> Screenshots of a foreground service's notification, showing the notification icon in the status bar (left) and the expanded view (right).</p> - -<p>You should only hold on to the "foreground service" status while your -service is actually performing something the user is actively aware of. Once -that is no longer true, you should release it by calling -{@link android.app.Service#stopForeground stopForeground()}:</p> - -<pre> -stopForeground(true); -</pre> - -<p>For more information, see the documentation about <a -href="{@docRoot}guide/topics/fundamentals/services.html#Foreground">Services</a> and -<a href="{@docRoot}guide/topics/ui/notifiers/notifications.html">Status Bar Notifications</a>.</p> - - -<h3 id="audiofocus">Handling audio focus</h3> - -<p>Even though only one activity can run at any given time, Android is a -multi-tasking environment. This poses a particular challenge to applications -that use audio, because there is only one audio output and there may be several -media services competing for its use. Before Android 2.2, there was no built-in -mechanism to address this issue, which could in some cases lead to a bad user -experience. For example, when a user is listening to -music and another application needs to notify the user of something very important, -the user might not hear the notification tone due to the loud music. Starting with -Android 2.2, the platform offers a way for applications to negotiate their -use of the device's audio output. This mechanism is called Audio Focus.</p> - -<p>When your application needs to output audio such as music or a notification, -you should always request audio focus. Once it has focus, it can use the sound output freely, but it should -always listen for focus changes. 
If it is notified that it has lost the audio -focus, it should immediately either kill the audio or lower it to a quiet level -(known as "ducking"—there is a flag that indicates which one is appropriate) and only resume -loud playback after it receives focus again.</p> - -<p>Audio Focus is cooperative in nature. That is, applications are expected -(and highly encouraged) to comply with the audio focus guidelines, but the -rules are not enforced by the system. If an application wants to play loud -music even after losing audio focus, nothing in the system will prevent that. -However, the user is more likely to have a bad experience and will be more -likely to uninstall the misbehaving application.</p> - -<p>To request audio focus, you must call -{@link android.media.AudioManager#requestAudioFocus requestAudioFocus()} from the {@link -android.media.AudioManager}, as the example below demonstrates:</p> - -<pre> -AudioManager audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE); -int result = audioManager.requestAudioFocus(this, AudioManager.STREAM_MUSIC, - AudioManager.AUDIOFOCUS_GAIN); - -if (result != AudioManager.AUDIOFOCUS_REQUEST_GRANTED) { - // could not get audio focus. -} -</pre> - -<p>The first parameter to {@link android.media.AudioManager#requestAudioFocus requestAudioFocus()} -is an {@link android.media.AudioManager.OnAudioFocusChangeListener -AudioManager.OnAudioFocusChangeListener}, -whose {@link android.media.AudioManager.OnAudioFocusChangeListener#onAudioFocusChange -onAudioFocusChange()} method is called whenever there is a change in audio focus. Therefore, you -should also implement this interface on your service and activities. For example:</p> - -<pre> -class MyService extends Service - implements AudioManager.OnAudioFocusChangeListener { - // .... - public void onAudioFocusChange(int focusChange) { - // Do something based on focus change... - } -} -</pre> - -<p>The <code>focusChange</code> parameter tells you how the audio focus has changed, and -can be one of the following values (they are all constants defined in -{@link android.media.AudioManager AudioManager}):</p> - -<ul> -<li>{@link android.media.AudioManager#AUDIOFOCUS_GAIN}: You have gained the audio focus.</li> - -<li>{@link android.media.AudioManager#AUDIOFOCUS_LOSS}: You have lost the audio focus for a -presumably long time. -You must stop all audio playback. Because you should expect not to have focus back -for a long time, this would be a good place to clean up your resources as much -as possible. For example, you should release the {@link android.media.MediaPlayer}.</li> - -<li>{@link android.media.AudioManager#AUDIOFOCUS_LOSS_TRANSIENT}: You have -temporarily lost audio focus, but should receive it back shortly. 
You must stop -all audio playback, but you can keep your resources because you will probably get -focus back shortly.</li> - -<li>{@link android.media.AudioManager#AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK}: You have temporarily -lost audio focus, -but you are allowed to continue to play audio quietly (at a low volume) instead -of killing audio completely.</li> -</ul> - -<p>Here is an example implementation:</p> - -<pre> -public void onAudioFocusChange(int focusChange) { - switch (focusChange) { - case AudioManager.AUDIOFOCUS_GAIN: - // resume playback - if (mMediaPlayer == null) initMediaPlayer(); - else if (!mMediaPlayer.isPlaying()) mMediaPlayer.start(); - mMediaPlayer.setVolume(1.0f, 1.0f); - break; - - case AudioManager.AUDIOFOCUS_LOSS: - // Lost focus for an unbounded amount of time: stop playback and release media player - if (mMediaPlayer.isPlaying()) mMediaPlayer.stop(); - mMediaPlayer.release(); - mMediaPlayer = null; - break; - - case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT: - // Lost focus for a short time, but we have to stop - // playback. We don't release the media player because playback - // is likely to resume - if (mMediaPlayer.isPlaying()) mMediaPlayer.pause(); - break; - - case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK: - // Lost focus for a short time, but it's ok to keep playing - // at an attenuated level - if (mMediaPlayer.isPlaying()) mMediaPlayer.setVolume(0.1f, 0.1f); - break; - } -} -</pre> - -<p>Keep in mind that the audio focus APIs are available only with API level 8 (Android 2.2) -and above, so if you want to support previous -versions of Android, you should adopt a backward compatibility strategy that -allows you to use this feature if available, and fall back seamlessly if not.</p> - -<p>You can achieve backward compatibility either by calling the audio focus methods by reflection -or by implementing all the audio focus features in a separate class (say, -<code>AudioFocusHelper</code>). Here is an example of such a class:</p> - -<pre> -public class AudioFocusHelper implements AudioManager.OnAudioFocusChangeListener { - AudioManager mAudioManager; - - // other fields here, you'll probably hold a reference to an interface - // that you can use to communicate the focus changes to your Service - - public AudioFocusHelper(Context ctx, /* other arguments here */) { - mAudioManager = (AudioManager) mContext.getSystemService(Context.AUDIO_SERVICE); - // ... - } - - public boolean requestFocus() { - return AudioManager.AUDIOFOCUS_REQUEST_GRANTED == - mAudioManager.requestAudioFocus(mContext, AudioManager.STREAM_MUSIC, - AudioManager.AUDIOFOCUS_GAIN); - } - - public boolean abandonFocus() { - return AudioManager.AUDIOFOCUS_REQUEST_GRANTED == - mAudioManager.abandonAudioFocus(this); - } - - @Override - public void onAudioFocusChange(int focusChange) { - // let your service know about the focus change - } -} -</pre> - - -<p>You can create an instance of <code>AudioFocusHelper</code> class only if you detect that -the system is running API level 8 or above. For example:</p> - -<pre> -if (android.os.Build.VERSION.SDK_INT >= 8) { - mAudioFocusHelper = new AudioFocusHelper(getApplicationContext(), this); -} else { - mAudioFocusHelper = null; -} -</pre> - - -<h3 id="cleanup">Performing cleanup</h3> - -<p>As mentioned earlier, a {@link android.media.MediaPlayer} object can consume a significant -amount of system resources, so you should keep it only for as long as you need and call -{@link android.media.MediaPlayer#release release()} when you are done with it. 
It's important -to call this cleanup method explicitly rather than rely on system garbage collection because -it might take some time before the garbage collector reclaims the {@link android.media.MediaPlayer}, -as it's only sensitive to memory needs and not to shortage of other media-related resources. -So, in the case when you're using a service, you should always override the -{@link android.app.Service#onDestroy onDestroy()} method to make sure you are releasing -the {@link android.media.MediaPlayer}:</p> - -<pre> -public class MyService extends Service { - MediaPlayer mMediaPlayer; - // ... - - @Override - public void onDestroy() { - if (mMediaPlayer != null) mMediaPlayer.release(); - } -} -</pre> - -<p>You should always look for other opportunities to release your {@link android.media.MediaPlayer} -as well, apart from releasing it when being shut down. For example, if you expect not -to be able to play media for an extended period of time (after losing audio focus, for example), -you should definitely release your existing {@link android.media.MediaPlayer} and create it again -later. On the -other hand, if you only expect to stop playback for a very short time, you should probably -hold on to your {@link android.media.MediaPlayer} to avoid the overhead of creating and preparing it -again.</p> - - - -<h2 id="noisyintent">Handling the AUDIO_BECOMING_NOISY Intent</h2> - -<p>Many well-written applications that play audio automatically stop playback when an event -occurs that causes the audio to become noisy (ouput through external speakers). For instance, -this might happen when a user is listening to music through headphones and accidentally -disconnects the headphones from the device. However, this behavior does not happen automatically. -If you don't implement this feature, audio plays out of the device's external speakers, which -might not be what the user wants.</p> - -<p>You can ensure your app stops playing music in these situations by handling -the {@link android.media.AudioManager#ACTION_AUDIO_BECOMING_NOISY} intent, for which you can register a receiver by -adding the following to your manifest:</p> - -<pre> -<receiver android:name=".MusicIntentReceiver"> - <intent-filter> - <action android:name="android.media.AUDIO_BECOMING_NOISY" /> - </intent-filter> -</receiver> -</pre> - -<p>This registers the <code>MusicIntentReceiver</code> class as a broadcast receiver for that -intent. You should then implement this class:</p> - -<pre> -public class MusicIntentReceiver implements android.content.BroadcastReceiver { - @Override - public void onReceive(Context ctx, Intent intent) { - if (intent.getAction().equals( - android.media.AudioManager.ACTION_AUDIO_BECOMING_NOISY)) { - // signal your service to stop playback - // (via an Intent, for instance) - } - } -} -</pre> - - - - -<h2 id="viacontentresolver">Retrieving Media from a Content Resolver</h2> - -<p>Another feature that may be useful in a media player application is the ability to -retrieve music that the user has on the device. You can do that by querying the {@link -android.content.ContentResolver} for external media:</p> - -<pre> -ContentResolver contentResolver = getContentResolver(); -Uri uri = android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI; -Cursor cursor = contentResolver.query(uri, null, null, null, null); -if (cursor == null) { - // query failed, handle error. 
-} else if (!cursor.moveToFirst()) { - // no media on the device -} else { - int titleColumn = cursor.getColumnIndex(android.provider.MediaStore.Audio.Media.TITLE); - int idColumn = cursor.getColumnIndex(android.provider.MediaStore.Audio.Media._ID); - do { - long thisId = cursor.getLong(idColumn); - String thisTitle = cursor.getString(titleColumn); - // ...process entry... - } while (cursor.moveToNext()); -} -</pre> - -<p>To use this with the {@link android.media.MediaPlayer}, you can do this:</p> - -<pre> -long id = /* retrieve it from somewhere */; -Uri contentUri = ContentUris.withAppendedId( - android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI, id); - -mMediaPlayer = new MediaPlayer(); -mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); -mMediaPlayer.setDataSource(getApplicationContext(), contentUri); - -// ...prepare and start... -</pre> - - - -<h2 id="jetcontent">Playing JET content</h2> - -<p>The Android platform includes a JET engine that lets you add interactive playback of JET audio -content in your applications. You can create JET content for interactive playback using the -JetCreator authoring application that ships with the SDK. To play and manage JET content from your -application, use the {@link android.media.JetPlayer JetPlayer} class.</p> - -<p>For a description of JET concepts and instructions on how to use the JetCreator authoring tool, -see the <a href="{@docRoot}guide/topics/media/jet/jetcreator_manual.html">JetCreator User -Manual</a>. The tool is available on Windows, OS X, and Linux platforms (Linux does not -support auditioning of imported assets like with the Windows and OS X versions). -</p> - -<p>Here's an example of how to set up JET playback from a <code>.jet</code> file stored on the SD card:</p> - -<pre> -JetPlayer jetPlayer = JetPlayer.getJetPlayer(); -jetPlayer.loadJetFile("/sdcard/level1.jet"); -byte segmentId = 0; - -// queue segment 5, repeat once, use General MIDI, transpose by -1 octave -jetPlayer.queueJetSegment(5, -1, 1, -1, 0, segmentId++); -// queue segment 2 -jetPlayer.queueJetSegment(2, -1, 0, 0, 0, segmentId++); - -jetPlayer.play(); -</pre> - -<p>The SDK includes an example application — JetBoy — that shows how to use {@link -android.media.JetPlayer JetPlayer} to create an interactive music soundtrack in your game. It also -illustrates how to use JET events to synchronize music and game logic. The application is located at -<code><sdk>/platforms/android-1.5/samples/JetBoy</code>.</p> - - -<h2 id="audiocapture">Performing Audio Capture</h2> - -<p>Audio capture from the device is a bit more complicated than audio and video playback, but still fairly simple:</p> -<ol> - <li>Create a new instance of {@link android.media.MediaRecorder android.media.MediaRecorder}.</li> - <li>Set the audio source using - {@link android.media.MediaRecorder#setAudioSource MediaRecorder.setAudioSource()}. You will probably want to use - <code>MediaRecorder.AudioSource.MIC</code>.</li> - <li>Set output file format using - {@link android.media.MediaRecorder#setOutputFormat MediaRecorder.setOutputFormat()}. - </li> - <li>Set output file name using - {@link android.media.MediaRecorder#setOutputFile MediaRecorder.setOutputFile()}. - </li> - <li>Set the audio encoder using - {@link android.media.MediaRecorder#setAudioEncoder MediaRecorder.setAudioEncoder()}. 
- </li> - <li>Call {@link android.media.MediaRecorder#prepare MediaRecorder.prepare()} - on the MediaRecorder instance.</li> - <li>To start audio capture, call - {@link android.media.MediaRecorder#start MediaRecorder.start()}. </li> - <li>To stop audio capture, call {@link android.media.MediaRecorder#stop MediaRecorder.stop()}. - <li>When you are done with the MediaRecorder instance, call -{@link android.media.MediaRecorder#release MediaRecorder.release()} on it. Calling -{@link android.media.MediaRecorder#release MediaRecorder.release()} is always recommended to -free the resource immediately.</li> -</ol> - -<h3>Example: Record audio and play the recorded audio</h3> -<p>The example class below illustrates how to set up, start and stop audio capture, and to play the recorded audio file.</p> -<pre> -/* - * The application needs to have the permission to write to external storage - * if the output file is written to the external storage, and also the - * permission to record audio. These permissions must be set in the - * application's AndroidManifest.xml file, with something like: - * - * <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" /> - * <uses-permission android:name="android.permission.RECORD_AUDIO" /> - * - */ -package com.android.audiorecordtest; - -import android.app.Activity; -import android.widget.LinearLayout; -import android.os.Bundle; -import android.os.Environment; -import android.view.ViewGroup; -import android.widget.Button; -import android.view.View; -import android.view.View.OnClickListener; -import android.content.Context; -import android.util.Log; -import android.media.MediaRecorder; -import android.media.MediaPlayer; - -import java.io.IOException; - - -public class AudioRecordTest extends Activity -{ - private static final String LOG_TAG = "AudioRecordTest"; - private static String mFileName = null; - - private RecordButton mRecordButton = null; - private MediaRecorder mRecorder = null; - - private PlayButton mPlayButton = null; - private MediaPlayer mPlayer = null; - - private void onRecord(boolean start) { - if (start) { - startRecording(); - } else { - stopRecording(); - } - } - - private void onPlay(boolean start) { - if (start) { - startPlaying(); - } else { - stopPlaying(); - } - } - - private void startPlaying() { - mPlayer = new MediaPlayer(); - try { - mPlayer.setDataSource(mFileName); - mPlayer.prepare(); - mPlayer.start(); - } catch (IOException e) { - Log.e(LOG_TAG, "prepare() failed"); - } - } - - private void stopPlaying() { - mPlayer.release(); - mPlayer = null; - } - - private void startRecording() { - mRecorder = new MediaRecorder(); - mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC); - mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP); - mRecorder.setOutputFile(mFileName); - mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); - - try { - mRecorder.prepare(); - } catch (IOException e) { - Log.e(LOG_TAG, "prepare() failed"); - } - - mRecorder.start(); - } - - private void stopRecording() { - mRecorder.stop(); - mRecorder.release(); - mRecorder = null; - } - - class RecordButton extends Button { - boolean mStartRecording = true; - - OnClickListener clicker = new OnClickListener() { - public void onClick(View v) { - onRecord(mStartRecording); - if (mStartRecording) { - setText("Stop recording"); - } else { - setText("Start recording"); - } - mStartRecording = !mStartRecording; - } - }; - - public RecordButton(Context ctx) { - super(ctx); - setText("Start recording"); - setOnClickListener(clicker); - 
} - } - - class PlayButton extends Button { - boolean mStartPlaying = true; - - OnClickListener clicker = new OnClickListener() { - public void onClick(View v) { - onPlay(mStartPlaying); - if (mStartPlaying) { - setText("Stop playing"); - } else { - setText("Start playing"); - } - mStartPlaying = !mStartPlaying; - } - }; - - public PlayButton(Context ctx) { - super(ctx); - setText("Start playing"); - setOnClickListener(clicker); - } - } - - public AudioRecordTest() { - mFileName = Environment.getExternalStorageDirectory().getAbsolutePath(); - mFileName += "/audiorecordtest.3gp"; - } - - @Override - public void onCreate(Bundle icicle) { - super.onCreate(icicle); - - LinearLayout ll = new LinearLayout(this); - mRecordButton = new RecordButton(this); - ll.addView(mRecordButton, - new LinearLayout.LayoutParams( - ViewGroup.LayoutParams.WRAP_CONTENT, - ViewGroup.LayoutParams.WRAP_CONTENT, - 0)); - mPlayButton = new PlayButton(this); - ll.addView(mPlayButton, - new LinearLayout.LayoutParams( - ViewGroup.LayoutParams.WRAP_CONTENT, - ViewGroup.LayoutParams.WRAP_CONTENT, - 0)); - setContentView(ll); - } - - @Override - public void onPause() { - super.onPause(); - if (mRecorder != null) { - mRecorder.release(); - mRecorder = null; - } - - if (mPlayer != null) { - mPlayer.release(); - mPlayer = null; - } - } -} -</pre> - - - +<p>The Android multimedia framework includes support for capturing and playing audio, video and +images in a variety of common media types, so that you can easily integrate them into your +applications. You can play audio or video from media files stored in your application's resources, +from standalone files in the file system, or from a data stream arriving over a +network connection, all using the {@link android.media.MediaPlayer} or {@link +android.media.JetPlayer} APIs. You can also record audio, video and take pictures using the {@link +android.media.MediaRecorder} and {@link android.hardware.Camera} APIs if supported by the device +hardware.</p> + +<p>The following topics show you how to use the Android framework to implement multimedia capture +and playback.</p> + +<dl> + <dt><strong><a href="{@docRoot}guide/topics/media/mediaplayer.html">MediaPlayer</a></strong></dt> + <dd>How to play audio and video in your application.</dd> + + <dt><strong><a href="{@docRoot}guide/topics/media/jetplayer.html">JetPlayer</a></strong></dt> + <dd>How to play interactive audio and video in your application using content created with +JetCreator.</dd> + + <dt><strong><a href="{@docRoot}guide/topics/media/camera.html">Camera</a></strong></dt> + <dd>How to use a device camera to take pictures or video in your application.</dd> + + <dt><strong><a href="{@docRoot}guide/topics/media/audio-capture.html">Audio +Capture</a></strong></dt> + <dd>How to record sound in your application.</dd> +</dl>
\ No newline at end of file
diff --git a/docs/html/guide/topics/media/jetplayer.jd b/docs/html/guide/topics/media/jetplayer.jd
new file mode 100644
index 0000000..f3d55f9
--- /dev/null
+++ b/docs/html/guide/topics/media/jetplayer.jd
@@ -0,0 +1,70 @@
+page.title=JetPlayer
+parent.title=Multimedia and Camera
+parent.link=index.html
+@jd:body
+
+  <div id="qv-wrapper">
+    <div id="qv">
+
+<h2>In this document</h2>
+<ol>
+<li><a href="#jetcontent">Playing JET content</a>
+</ol>
+
+<h2>Key classes</h2>
+<ol>
+<li>{@link android.media.JetPlayer}</li>
+</ol>
+
+<h2>Related Samples</h2>
+<ol>
+<li><a href="{@docRoot}resources/samples/JetBoy/index.html">JetBoy</a></li>
+</ol>
+
+<h2>See also</h2>
+<ol>
+<li><a href="{@docRoot}guide/topics/media/jet/jetcreator_manual.html">JetCreator User
+Manual</a></li>
+<li><a href="{@docRoot}guide/appendix/media-formats.html">Android Supported Media Formats</a></li>
+<li><a href="{@docRoot}guide/topics/data/data-storage.html">Data Storage</a></li>
+<li><a href="{@docRoot}guide/topics/media/mediaplayer.html">MediaPlayer</a></li>
+</ol>
+
+</div>
+</div>
+
+<p>The Android platform includes a JET engine that lets you add interactive playback of JET audio
+content in your applications. You can create JET content for interactive playback using the
+JetCreator authoring application that ships with the SDK. To play and manage JET content from your
+application, use the {@link android.media.JetPlayer JetPlayer} class.</p>
+
+
+<h2 id="jetcontent">Playing JET content</h2>
+
+<p>This section shows you how to write, set up and play JET content. For a description of JET
+concepts and instructions on how to use the JetCreator authoring tool, see the <a
+href="{@docRoot}guide/topics/media/jet/jetcreator_manual.html">JetCreator User
+Manual</a>. The tool is available on Windows, OS X, and Linux platforms (the Linux version does not
+support auditioning of imported assets as the Windows and OS X versions do).
+</p>
+
+<p>Here's an example of how to set up JET playback from a <code>.jet</code> file stored on the SD
+card:</p>
+
+<pre>
+JetPlayer jetPlayer = JetPlayer.getJetPlayer();
+jetPlayer.loadJetFile("/sdcard/level1.jet");
+byte segmentId = 0;
+
+// queue segment 5, repeat once, use General MIDI, transpose by -1 octave
+jetPlayer.queueJetSegment(5, -1, 1, -1, 0, segmentId++);
+// queue segment 2
+jetPlayer.queueJetSegment(2, -1, 0, 0, 0, segmentId++);
+
+jetPlayer.play();
+</pre>
+
+<p>The SDK includes an example application — JetBoy — that shows how to use {@link
+android.media.JetPlayer JetPlayer} to create an interactive music soundtrack in your game. It also
+illustrates how to use JET events to synchronize music and game logic. The application is available
+at <a href="{@docRoot}resources/samples/JetBoy/index.html">JetBoy</a>.</p>
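<p>If you want to drive application logic from JET events (for example, to keep gameplay in sync
with the music), you can register a {@link android.media.JetPlayer.OnJetEventListener} on the
player before starting playback. The following sketch illustrates the general shape of such a
listener; the class name and comments are illustrative, so check the {@link android.media.JetPlayer}
reference (and the JetBoy sample) for the exact callback signatures and threading behavior.</p>

<pre>
public class GameJetListener implements JetPlayer.OnJetEventListener {

    public void onJetEvent(JetPlayer player, short segment, byte track,
            byte channel, byte controller, byte value) {
        // A JET event authored in JetCreator was reached during playback;
        // for example, update the game state or trigger a visual effect here.
    }

    public void onJetNumQueuedSegmentUpdate(JetPlayer player, int nbSegments) {
        // The number of queued segments changed; queue more segments if needed.
    }

    public void onJetPauseUpdate(JetPlayer player, int paused) {
        // Playback was paused or resumed.
    }

    public void onJetUserIdUpdate(JetPlayer player, int userId, int repeatCount) {
        // A new segment, identified by the user ID passed to queueJetSegment(),
        // has started playing.
    }
}
</pre>

<p>Register the listener with {@code jetPlayer.setEventListener(new GameJetListener())} before
calling {@code play()}, and release the engine with {@link android.media.JetPlayer#release()} when
you are done with JET playback.</p>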
\ No newline at end of file diff --git a/docs/html/guide/topics/media/mediaplayer.jd b/docs/html/guide/topics/media/mediaplayer.jd new file mode 100644 index 0000000..b3ca7dd --- /dev/null +++ b/docs/html/guide/topics/media/mediaplayer.jd @@ -0,0 +1,747 @@ +page.title=Media Playback +parent.title=Multimedia and Camera +parent.link=index.html +@jd:body + + <div id="qv-wrapper"> + <div id="qv"> + +<h2>In this document</h2> +<ol> +<li><a href="#basics">The Basics</a> +<li><a href="#manifest">Manifest Declarations</a></li> +<li><a href="#mediaplayer">Using MediaPlayer</a> + <ol> + <li><a href='#preparingasync'>Asynchronous Preparation</a></li> + <li><a href='#managestate'>Managing State</a></li> + <li><a href='#releaseplayer'>Releasing the MediaPlayer</a></li> + </ol> +</li> +<li><a href="#mpandservices">Using a Service with MediaPlayer</a> + <ol> + <li><a href="#asyncprepare">Running asynchronously</a></li> + <li><a href="#asyncerror">Handling asynchronous errors</a></li> + <li><a href="#wakelocks">Using wake locks</a></li> + <li><a href="#foregroundserv">Running as a foreground service</a></li> + <li><a href="#audiofocus">Handling audio focus</a></li> + <li><a href="#cleanup">Performing cleanup</a></li> + </ol> +</li> +<li><a href="#noisyintent">Handling the AUDIO_BECOMING_NOISY Intent</a> +<li><a href="#viacontentresolver">Retrieving Media from a Content Resolver</a> +</ol> + +<h2>Key classes</h2> +<ol> +<li>{@link android.media.MediaPlayer}</li> +<li>{@link android.media.AudioManager}</li> +<li>{@link android.media.SoundPool}</li> +</ol> + +<h2>See also</h2> +<ol> +<li><a href="{@docRoot}guide/topics/media/jetplayer.html">JetPlayer</a></li> +<li><a href="{@docRoot}guide/topics/media/audio-capture.html">Audio Capture</a></li> +<li><a href="{@docRoot}guide/appendix/media-formats.html">Android Supported Media Formats</a></li> +<li><a href="{@docRoot}guide/topics/data/data-storage.html">Data Storage</a></li> +</ol> + +</div> +</div> + +<p>The Android multimedia framework includes support for playing variety of common media types, so +that you can easily integrate audio, video and images into your applications. You can play audio or +video from media files stored in your application's resources (raw resources), from standalone files +in the filesystem, or from a data stream arriving over a network connection, all using {@link +android.media.MediaPlayer} APIs.</p> + +<p>This document shows you how to write a media-playing application that interacts with the user and +the system in order to obtain good performance and a pleasant user experience.</p> + +<p class="note"><strong>Note:</strong> You can play back the audio data only to the standard output +device. Currently, that is the mobile device speaker or a Bluetooth headset. 
You cannot play sound +files in the conversation audio during a call.</p> + +<h2 id="basics">The Basics</h2> +<p>The following classes are used to play sound and video in the Android framework:</p> + +<dl> + <dt>{@link android.media.MediaPlayer}</dt> + <dd>This class is the primary API for playing sound and video.</dd> + <dt>{@link android.media.AudioManager}</dt> + <dd>This class manages audio sources and audio output on a device.</dd> +</dl> + +<h2 id="manifest">Manifest Declarations</h2> +<p>Before starting development on your application using MediaPlayer, make sure your manifest has +the appropriate declarations to allow use of related features.</p> + +<ul> + <li><strong>Internet Permission</strong> - If you are using MediaPlayer to stream network-based +content, your application must request network access. +<pre> +<uses-permission android:name="android.permission.INTERNET" /> +</pre> + </li> + <li><strong>Wake Lock Permission</strong> - If your player application needs to keep the screen +from dimming or the processor from sleeping, or uses the {@link +android.media.MediaPlayer#setScreenOnWhilePlaying(boolean) MediaPlayer.setScreenOnWhilePlaying()} or +{@link android.media.MediaPlayer#setWakeMode(android.content.Context, int) +MediaPlayer.setWakeMode()} methods, you must request this permission. +<pre> +<uses-permission android:name="android.permission.WAKE_LOCK" /> +</pre> + </li> +</ul> + +<h2 id="mediaplayer">Using MediaPlayer</h2> +<p>One of the most important components of the media framework is the +{@link android.media.MediaPlayer MediaPlayer} +class. An object of this class can fetch, decode, and play both audio and video +with minimal setup. It supports several different media sources such as: +<ul> + <li>Local resources</li> + <li>Internal URIs, such as one you might obtain from a Content Resolver</li> + <li>External URLs (streaming)</li> +</ul> +</p> + +<p>For a list of media formats that Android supports, +see the <a href="{@docRoot}guide/appendix/media-formats.html">Android Supported Media +Formats</a> document. </p> + +<p>Here is an example +of how to play audio that's available as a local raw resource (saved in your application's +{@code res/raw/} directory):</p> + +<pre>MediaPlayer mediaPlayer = MediaPlayer.create(context, R.raw.sound_file_1); +mediaPlayer.start(); // no need to call prepare(); create() does that for you +</pre> + +<p>In this case, a "raw" resource is a file that the system does not +try to parse in any particular way. However, the content of this resource should not +be raw audio. It should be a properly encoded and formatted media file in one +of the supported formats.</p> + +<p>And here is how you might play from a URI available locally in the system +(that you obtained through a Content Resolver, for instance):</p> + +<pre>Uri myUri = ....; // initialize Uri here +MediaPlayer mediaPlayer = new MediaPlayer(); +mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); +mediaPlayer.setDataSource(getApplicationContext(), myUri); +mediaPlayer.prepare(); +mediaPlayer.start();</pre> + +<p>Playing from a remote URL via HTTP streaming looks like this:</p> + +<pre>String url = "http://........"; // your URL here +MediaPlayer mediaPlayer = new MediaPlayer(); +mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); +mediaPlayer.setDataSource(url); +mediaPlayer.prepare(); // might take long! 
(for buffering, etc) +mediaPlayer.start();</pre> + +<p class="note"><strong>Note:</strong> +If you're passing a URL to stream an online media file, the file must be capable of +progressive download.</p> + +<p class="caution"><strong>Caution:</strong> You must either catch or pass +{@link java.lang.IllegalArgumentException} and {@link java.io.IOException} when using +{@link android.media.MediaPlayer#setDataSource setDataSource()}, because +the file you are referencing might not exist.</p> + +<h3 id='preparingasync'>Asynchronous Preparation</h3> + +<p>Using {@link android.media.MediaPlayer MediaPlayer} can be straightforward in +principle. However, it's important to keep in mind that a few more things are +necessary to integrate it correctly with a typical Android application. For +example, the call to {@link android.media.MediaPlayer#prepare prepare()} can +take a long time to execute, because +it might involve fetching and decoding media data. So, as is the case with any +method that may take long to execute, you should <strong>never call it from your +application's UI thread</strong>. Doing that will cause the UI to hang until the method returns, +which is a very bad user experience and can cause an ANR (Application Not Responding) error. Even if +you expect your resource to load quickly, remember that anything that takes more than a tenth +of a second to respond in the UI will cause a noticeable pause and will give +the user the impression that your application is slow.</p> + +<p>To avoid hanging your UI thread, spawn another thread to +prepare the {@link android.media.MediaPlayer} and notify the main thread when done. However, while +you could write the threading logic +yourself, this pattern is so common when using {@link android.media.MediaPlayer} that the framework +supplies a convenient way to accomplish this task by using the +{@link android.media.MediaPlayer#prepareAsync prepareAsync()} method. This method +starts preparing the media in the background and returns immediately. When the media +is done preparing, the {@link android.media.MediaPlayer.OnPreparedListener#onPrepared onPrepared()} +method of the {@link android.media.MediaPlayer.OnPreparedListener +MediaPlayer.OnPreparedListener}, configured through +{@link android.media.MediaPlayer#setOnPreparedListener setOnPreparedListener()} is called.</p> + +<h3 id='managestate'>Managing State</h3> + +<p>Another aspect of a {@link android.media.MediaPlayer} that you should keep in mind is +that it's state-based. That is, the {@link android.media.MediaPlayer} has an internal state +that you must always be aware of when writing your code, because certain operations +are only valid when then player is in specific states. If you perform an operation while in the +wrong state, the system may throw an exception or cause other undesireable behaviors.</p> + +<p>The documentation in the +{@link android.media.MediaPlayer MediaPlayer} class shows a complete state diagram, +that clarifies which methods move the {@link android.media.MediaPlayer} from one state to another. +For example, when you create a new {@link android.media.MediaPlayer}, it is in the <em>Idle</em> +state. At that point, you should initialize it by calling +{@link android.media.MediaPlayer#setDataSource setDataSource()}, bringing it +to the <em>Initialized</em> state. After that, you have to prepare it using either the +{@link android.media.MediaPlayer#prepare prepare()} or +{@link android.media.MediaPlayer#prepareAsync prepareAsync()} method. 
When +the {@link android.media.MediaPlayer} is done preparing, it will then enter the <em>Prepared</em> +state, which means you can call {@link android.media.MediaPlayer#start start()} +to make it play the media. At that point, as the diagram illustrates, +you can move between the <em>Started</em>, <em>Paused</em> and <em>PlaybackCompleted</em> states by +calling such methods as +{@link android.media.MediaPlayer#start start()}, +{@link android.media.MediaPlayer#pause pause()}, and +{@link android.media.MediaPlayer#seekTo seekTo()}, +amongst others. When you +call {@link android.media.MediaPlayer#stop stop()}, however, notice that you +cannot call {@link android.media.MediaPlayer#start start()} again until you +prepare the {@link android.media.MediaPlayer} again.</p> + +<p>Always keep <a href='{@docRoot}images/mediaplayer_state_diagram.gif'>the state diagram</a> +in mind when writing code that interacts with a +{@link android.media.MediaPlayer} object, because calling its methods from the wrong state is a +common cause of bugs.</p> + +<h3 id='releaseplayer'>Releasing the MediaPlayer</h3> + +<p>A {@link android.media.MediaPlayer MediaPlayer} can consume valuable +system resources. +Therefore, you should always take extra precautions to make sure you are not +hanging on to a {@link android.media.MediaPlayer} instance longer than necessary. When you +are done with it, you should always call +{@link android.media.MediaPlayer#release release()} to make sure any +system resources allocated to it are properly released. For example, if you are +using a {@link android.media.MediaPlayer} and your activity receives a call to {@link +android.app.Activity#onStop onStop()}, you must release the {@link android.media.MediaPlayer}, +because it +makes little sense to hold on to it while your activity is not interacting with +the user (unless you are playing media in the background, which is discussed in the next section). +When your activity is resumed or restarted, of course, you need to +create a new {@link android.media.MediaPlayer} and prepare it again before resuming playback.</p> + +<p>Here's how you should release and then nullify your {@link android.media.MediaPlayer}:</p> +<pre> +mediaPlayer.release(); +mediaPlayer = null; +</pre> + +<p>As an example, consider the problems that could happen if you +forgot to release the {@link android.media.MediaPlayer} when your activity is stopped, but create a +new one when the activity starts again. As you may know, when the user changes the +screen orientation (or changes the device configuration in another way), +the system handles that by restarting the activity (by default), so you might quickly +consume all of the system resources as the user +rotates the device back and forth between portrait and landscape, because at each +orientation change, you create a new {@link android.media.MediaPlayer} that you never +release. (For more information about runtime restarts, see <a +href="{@docRoot}guide/topics/resources/runtime-changes.html">Handling Runtime Changes</a>.)</p> + +<p>You may be wondering what happens if you want to continue playing +"background media" even when the user leaves your activity, much in the same +way that the built-in Music application behaves. 
+In this case, what you need is
+a {@link android.media.MediaPlayer MediaPlayer} controlled by a {@link android.app.Service}, as
+discussed in <a href="#mpandservices">Using a Service with MediaPlayer</a>.</p>
+
+<h2 id="mpandservices">Using a Service with MediaPlayer</h2>
+
+<p>If you want your media to play in the background even when your application
+is not onscreen—that is, you want it to continue playing while the user is
+interacting with other applications—then you must start a
+{@link android.app.Service Service} and control the
+{@link android.media.MediaPlayer MediaPlayer} instance from there.
+You should be careful about this setup, because the user and the system have expectations
+about how an application running a background service should interact with the rest of the
+system. If your application does not fulfill those expectations, the user may
+have a bad experience. This section describes the main issues that you should be
+aware of and offers suggestions about how to approach them.</p>
+
+
+<h3 id="asyncprepare">Running asynchronously</h3>
+
+<p>First of all, like an {@link android.app.Activity Activity}, all work in a
+{@link android.app.Service Service} is done in a single thread by
+default—in fact, if you're running an activity and a service from the same application, they
+use the same thread (the "main thread") by default. Therefore, services need to
+process incoming intents quickly
+and never perform lengthy computations when responding to them. If any heavy
+work or blocking calls are expected, you must do those tasks asynchronously: either from
+another thread you implement yourself, or using the framework's many facilities
+for asynchronous processing.</p>
+
+<p>For instance, when using a {@link android.media.MediaPlayer} from your main thread,
+you should call {@link android.media.MediaPlayer#prepareAsync prepareAsync()} rather than
+{@link android.media.MediaPlayer#prepare prepare()}, and implement
+a {@link android.media.MediaPlayer.OnPreparedListener MediaPlayer.OnPreparedListener}
+in order to be notified when the preparation is complete and you can start playing.
+For example:</p>
+
+<pre>
+public class MyService extends Service implements MediaPlayer.OnPreparedListener {
+    private static final String ACTION_PLAY = "com.example.action.PLAY";
+    MediaPlayer mMediaPlayer = null;
+
+    public int onStartCommand(Intent intent, int flags, int startId) {
+        ...
+        if (intent.getAction().equals(ACTION_PLAY)) {
+            mMediaPlayer = ... // initialize it here
+            mMediaPlayer.setOnPreparedListener(this);
+            mMediaPlayer.prepareAsync(); // prepare async to not block main thread
+        }
+    }
+
+    /** Called when MediaPlayer is ready */
+    public void onPrepared(MediaPlayer player) {
+        player.start();
+    }
+}
+</pre>
+
+
+<h3 id="asyncerror">Handling asynchronous errors</h3>
+
+<p>With synchronous operations, errors are normally
+signaled with an exception or an error code, but whenever you use asynchronous
+resources, you should make sure your application is notified
+of errors appropriately. In the case of a {@link android.media.MediaPlayer MediaPlayer},
+you can accomplish this by implementing a
+{@link android.media.MediaPlayer.OnErrorListener MediaPlayer.OnErrorListener} and
+setting it in your {@link android.media.MediaPlayer} instance:</p>
+
+<pre>
+public class MyService extends Service implements MediaPlayer.OnErrorListener {
+    MediaPlayer mMediaPlayer;
+
+    public void initMediaPlayer() {
+        // ...initialize the MediaPlayer here...
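+        // For instance, a minimal sketch of the initialization
+        // (myContentUri is a placeholder for whatever your app plays):
+        //     mMediaPlayer = new MediaPlayer();
+        //     mMediaPlayer.setDataSource(getApplicationContext(), myContentUri);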
+
+        mMediaPlayer.setOnErrorListener(this);
+    }
+
+    @Override
+    public boolean onError(MediaPlayer mp, int what, int extra) {
+        // ... react appropriately ...
+        // The MediaPlayer has moved to the Error state, must be reset!
+        return true; // true indicates the error was handled here
+    }
+}
+</pre>
+
+<p>It's important to remember that when an error occurs, the {@link android.media.MediaPlayer}
+moves to the <em>Error</em> state (see the documentation for the
+{@link android.media.MediaPlayer MediaPlayer} class for the full state diagram)
+and you must reset it before you can use it again.</p>
+
+
+<h3 id="wakelocks">Using wake locks</h3>
+
+<p>When designing applications that play media
+in the background, keep in mind that the device may go to sleep
+while your service is running. Because the Android system tries to conserve
+battery while the device is sleeping, the system tries to shut off any
+of the phone's features that are
+not necessary, including the CPU and the WiFi hardware.
+However, if your service is playing or streaming music, you want to prevent
+the system from interfering with your playback.</p>
+
+<p>In order to ensure that your service continues to run under
+those conditions, you have to use "wake locks." A wake lock is a way to signal to
+the system that your application is using some feature that should
+stay available even if the phone is idle.</p>
+
+<p class="caution"><strong>Caution:</strong> You should always use wake locks sparingly and hold them
+only for as long as truly necessary, because they significantly reduce the battery life of the
+device.</p>
+
+<p>To ensure that the CPU continues running while your {@link android.media.MediaPlayer} is
+playing, call the {@link android.media.MediaPlayer#setWakeMode
+setWakeMode()} method when initializing your {@link android.media.MediaPlayer}. Once you do,
+the {@link android.media.MediaPlayer} holds the specified lock while playing and releases the lock
+when paused or stopped:</p>
+
+<pre>
+mMediaPlayer = new MediaPlayer();
+// ... other initialization here ...
+mMediaPlayer.setWakeMode(getApplicationContext(), PowerManager.PARTIAL_WAKE_LOCK);
+</pre>
+
+<p>However, the wake lock acquired in this example guarantees only that the CPU remains awake. If
+you are streaming media over the
+network and you are using Wi-Fi, you probably want to hold a
+{@link android.net.wifi.WifiManager.WifiLock WifiLock} as
+well, which you must acquire and release manually. So, when you start preparing the
+{@link android.media.MediaPlayer} with the remote URL, you should create and acquire the Wi-Fi lock.
+For example:</p>
+
+<pre>
+WifiLock wifiLock = ((WifiManager) getSystemService(Context.WIFI_SERVICE))
+    .createWifiLock(WifiManager.WIFI_MODE_FULL, "mylock");
+
+wifiLock.acquire();
+</pre>
+
+<p>When you pause or stop your media, or when you no longer need the
+network, you should release the lock:</p>
+
+<pre>
+wifiLock.release();
+</pre>
+
+
+<h3 id="foregroundserv">Running as a foreground service</h3>
+
+<p>Services are often used for performing background tasks, such as fetching email,
+synchronizing data, and downloading content, amongst other possibilities. In these
+cases, the user is not actively aware of the service's execution, and probably
+wouldn't even notice if some of these services were interrupted and later restarted.</p>
+
+<p>But consider the case of a service that is playing music. Clearly this is a service that the user
+is actively aware of and the experience would be severely affected by any interruptions.
+Additionally, it's a service that the user will likely wish to interact with during its execution.
+In this case, the service should run as a "foreground service." A
+foreground service holds a higher level of importance within the system—the system will
+almost never kill the service, because it is of immediate importance to the user. When running
+in the foreground, the service also must provide a status bar notification to ensure that users are
+aware of the running service and allow them to open an activity that can interact with the
+service.</p>
+
+<p>In order to turn your service into a foreground service, you must create a
+{@link android.app.Notification Notification} for the status bar and call
+{@link android.app.Service#startForeground startForeground()} from the {@link
+android.app.Service}. For example:</p>
+
+<pre>String songName;
+// assign the song name to songName
+PendingIntent pi = PendingIntent.getActivity(getApplicationContext(), 0,
+    new Intent(getApplicationContext(), MainActivity.class),
+    PendingIntent.FLAG_UPDATE_CURRENT);
+Notification notification = new Notification();
+notification.tickerText = songName;
+notification.icon = R.drawable.play0;
+notification.flags |= Notification.FLAG_ONGOING_EVENT;
+notification.setLatestEventInfo(getApplicationContext(), "MusicPlayerSample",
+    "Playing: " + songName, pi);
+startForeground(NOTIFICATION_ID, notification); // NOTIFICATION_ID is an int constant you define
+</pre>
+
+<p>While your service is running in the foreground, the notification you
+configured is visible in the notification area of the device. If the user
+selects the notification, the system invokes the {@link android.app.PendingIntent} you supplied. In
+the example above, it opens an activity ({@code MainActivity}).</p>
+
+<p>Figure 1 shows how your notification appears to the user:</p>
+
+<img src='images/notification1.png' />
+
+<img src='images/notification2.png' />
+<p class="img-caption"><strong>Figure 1.</strong> Screenshots of a foreground service's
+notification, showing the notification icon in the status bar (left) and the expanded view
+(right).</p>
+
+<p>You should only hold on to the "foreground service" status while your
+service is actually performing something the user is actively aware of. Once
+that is no longer true, you should release it by calling
+{@link android.app.Service#stopForeground stopForeground()}:</p>
+
+<pre>
+stopForeground(true);
+</pre>
+
+<p>For more information, see the documentation about <a
+href="{@docRoot}guide/topics/fundamentals/services.html#Foreground">Services</a> and
+<a href="{@docRoot}guide/topics/ui/notifiers/notifications.html">Status Bar Notifications</a>.</p>
+
+
+<h3 id="audiofocus">Handling audio focus</h3>
+
+<p>Even though only one activity can be in the foreground at any given time, Android is a
+multi-tasking environment. This poses a particular challenge to applications
+that use audio, because there is only one audio output and there may be several
+media services competing for its use. Before Android 2.2, there was no built-in
+mechanism to address this issue, which could in some cases lead to a bad user
+experience. For example, when a user is listening to
+music and another application needs to notify the user of something very important,
+the user might not hear the notification tone due to the loud music. Starting with
+Android 2.2, the platform offers a way for applications to negotiate their
+use of the device's audio output.
This mechanism is called Audio Focus.</p> + +<p>When your application needs to output audio such as music or a notification, +you should always request audio focus. Once it has focus, it can use the sound output freely, but it +should +always listen for focus changes. If it is notified that it has lost the audio +focus, it should immediately either kill the audio or lower it to a quiet level +(known as "ducking"—there is a flag that indicates which one is appropriate) and only resume +loud playback after it receives focus again.</p> + +<p>Audio Focus is cooperative in nature. That is, applications are expected +(and highly encouraged) to comply with the audio focus guidelines, but the +rules are not enforced by the system. If an application wants to play loud +music even after losing audio focus, nothing in the system will prevent that. +However, the user is more likely to have a bad experience and will be more +likely to uninstall the misbehaving application.</p> + +<p>To request audio focus, you must call +{@link android.media.AudioManager#requestAudioFocus requestAudioFocus()} from the {@link +android.media.AudioManager}, as the example below demonstrates:</p> + +<pre> +AudioManager audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE); +int result = audioManager.requestAudioFocus(this, AudioManager.STREAM_MUSIC, + AudioManager.AUDIOFOCUS_GAIN); + +if (result != AudioManager.AUDIOFOCUS_REQUEST_GRANTED) { + // could not get audio focus. +} +</pre> + +<p>The first parameter to {@link android.media.AudioManager#requestAudioFocus requestAudioFocus()} +is an {@link android.media.AudioManager.OnAudioFocusChangeListener +AudioManager.OnAudioFocusChangeListener}, +whose {@link android.media.AudioManager.OnAudioFocusChangeListener#onAudioFocusChange +onAudioFocusChange()} method is called whenever there is a change in audio focus. Therefore, you +should also implement this interface on your service and activities. For example:</p> + +<pre> +class MyService extends Service + implements AudioManager.OnAudioFocusChangeListener { + // .... + public void onAudioFocusChange(int focusChange) { + // Do something based on focus change... + } +} +</pre> + +<p>The <code>focusChange</code> parameter tells you how the audio focus has changed, and +can be one of the following values (they are all constants defined in +{@link android.media.AudioManager AudioManager}):</p> + +<ul> +<li>{@link android.media.AudioManager#AUDIOFOCUS_GAIN}: You have gained the audio focus.</li> + +<li>{@link android.media.AudioManager#AUDIOFOCUS_LOSS}: You have lost the audio focus for a +presumably long time. +You must stop all audio playback. Because you should expect not to have focus back +for a long time, this would be a good place to clean up your resources as much +as possible. For example, you should release the {@link android.media.MediaPlayer}.</li> + +<li>{@link android.media.AudioManager#AUDIOFOCUS_LOSS_TRANSIENT}: You have +temporarily lost audio focus, but should receive it back shortly. 
You must stop
+all audio playback, but you can keep your resources because you will probably get
+focus back shortly.</li>
+
+<li>{@link android.media.AudioManager#AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK}: You have temporarily
+lost audio focus,
+but you are allowed to continue to play audio quietly (at a low volume) instead
+of killing audio completely.</li>
+</ul>
+
+<p>Here is an example implementation:</p>
+
+<pre>
+public void onAudioFocusChange(int focusChange) {
+    switch (focusChange) {
+        case AudioManager.AUDIOFOCUS_GAIN:
+            // resume playback
+            if (mMediaPlayer == null) initMediaPlayer();
+            else if (!mMediaPlayer.isPlaying()) mMediaPlayer.start();
+            mMediaPlayer.setVolume(1.0f, 1.0f);
+            break;
+
+        case AudioManager.AUDIOFOCUS_LOSS:
+            // Lost focus for an unbounded amount of time: stop playback and release media player
+            if (mMediaPlayer.isPlaying()) mMediaPlayer.stop();
+            mMediaPlayer.release();
+            mMediaPlayer = null;
+            break;
+
+        case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT:
+            // Lost focus for a short time, but we have to stop
+            // playback. We don't release the media player because playback
+            // is likely to resume
+            if (mMediaPlayer.isPlaying()) mMediaPlayer.pause();
+            break;
+
+        case AudioManager.AUDIOFOCUS_LOSS_TRANSIENT_CAN_DUCK:
+            // Lost focus for a short time, but it's ok to keep playing
+            // at an attenuated level
+            if (mMediaPlayer.isPlaying()) mMediaPlayer.setVolume(0.1f, 0.1f);
+            break;
+    }
+}
+</pre>
+
+<p>Keep in mind that the audio focus APIs are available only with API level 8 (Android 2.2)
+and above, so if you want to support previous
+versions of Android, you should adopt a backward compatibility strategy that
+allows you to use this feature if available, and fall back seamlessly if not.</p>
+
+<p>You can achieve backward compatibility either by calling the audio focus methods by reflection
+or by implementing all the audio focus features in a separate class (say,
+<code>AudioFocusHelper</code>). Here is an example of such a class:</p>
+
+<pre>
+public class AudioFocusHelper implements AudioManager.OnAudioFocusChangeListener {
+    AudioManager mAudioManager;
+
+    // other fields here, you'll probably hold a reference to an interface
+    // that you can use to communicate the focus changes to your Service
+
+    public AudioFocusHelper(Context ctx, /* other arguments here */) {
+        mAudioManager = (AudioManager) ctx.getSystemService(Context.AUDIO_SERVICE);
+        // ...
+    }
+
+    public boolean requestFocus() {
+        return AudioManager.AUDIOFOCUS_REQUEST_GRANTED ==
+            mAudioManager.requestAudioFocus(this, AudioManager.STREAM_MUSIC,
+            AudioManager.AUDIOFOCUS_GAIN);
+    }
+
+    public boolean abandonFocus() {
+        return AudioManager.AUDIOFOCUS_REQUEST_GRANTED ==
+            mAudioManager.abandonAudioFocus(this);
+    }
+
+    @Override
+    public void onAudioFocusChange(int focusChange) {
+        // let your service know about the focus change
+    }
+}
+</pre>
+
+
+<p>You can create an instance of the <code>AudioFocusHelper</code> class only if you detect that
+the system is running API level 8 or above. For example:</p>
+
+<pre>
+if (android.os.Build.VERSION.SDK_INT >= 8) {
+    mAudioFocusHelper = new AudioFocusHelper(getApplicationContext(), this);
+} else {
+    mAudioFocusHelper = null;
+}
+</pre>
+
+
+<h3 id="cleanup">Performing cleanup</h3>
+
+<p>As mentioned earlier, a {@link android.media.MediaPlayer} object can consume a significant
+amount of system resources, so you should keep it only for as long as you need and call
+{@link android.media.MediaPlayer#release release()} when you are done with it.
+It's important
+to call this cleanup method explicitly rather than rely on system garbage collection because
+it might take some time before the garbage collector reclaims the {@link android.media.MediaPlayer},
+as it's only sensitive to memory needs and not to a shortage of other media-related resources.
+So, if you're using a service, you should always override the
+{@link android.app.Service#onDestroy onDestroy()} method to make sure you are releasing
+the {@link android.media.MediaPlayer}:</p>
+
+<pre>
+public class MyService extends Service {
+    MediaPlayer mMediaPlayer;
+    // ...
+
+    @Override
+    public void onDestroy() {
+        super.onDestroy();
+        if (mMediaPlayer != null) mMediaPlayer.release();
+    }
+}
+</pre>
+
+<p>You should always look for other opportunities to release your {@link android.media.MediaPlayer}
+as well, apart from releasing it when your service is shut down. For example, if you expect not
+to be able to play media for an extended period of time (after losing audio focus, for example),
+you should definitely release your existing {@link android.media.MediaPlayer} and create it again
+later. On the
+other hand, if you only expect to stop playback for a very short time, you should probably
+hold on to your {@link android.media.MediaPlayer} to avoid the overhead of creating and preparing it
+again.</p>
+
+
+
+<h2 id="noisyintent">Handling the AUDIO_BECOMING_NOISY Intent</h2>
+
+<p>Many well-written applications that play audio automatically stop playback when an event
+occurs that causes the audio to become noisy (output through the device's external speakers). For
+instance, this might happen when a user is listening to music through headphones and accidentally
+disconnects the headphones from the device. However, this behavior does not happen automatically.
+If you don't implement this feature, audio plays out of the device's external speakers, which
+might not be what the user wants.</p>
+
+<p>You can ensure your app stops playing music in these situations by handling
+the {@link android.media.AudioManager#ACTION_AUDIO_BECOMING_NOISY} intent, for which you can
+register a receiver by
+adding the following to your manifest:</p>
+
+<pre>
+<receiver android:name=".MusicIntentReceiver">
+   <intent-filter>
+      <action android:name="android.media.AUDIO_BECOMING_NOISY" />
+   </intent-filter>
+</receiver>
+</pre>
+
+<p>This registers the <code>MusicIntentReceiver</code> class as a broadcast receiver for that
+intent. You should then implement this class:</p>
+
+<pre>
+public class MusicIntentReceiver extends android.content.BroadcastReceiver {
+    @Override
+    public void onReceive(Context ctx, Intent intent) {
+        if (intent.getAction().equals(
+                android.media.AudioManager.ACTION_AUDIO_BECOMING_NOISY)) {
+            // signal your service to stop playback
+            // (via an Intent, for instance)
+        }
+    }
+}
+</pre>
+
+
+
+
+<h2 id="viacontentresolver">Retrieving Media from a Content Resolver</h2>
+
+<p>Another feature that may be useful in a media player application is the ability to
+retrieve music that the user has on the device. You can do that by querying the {@link
+android.content.ContentResolver} for external media:</p>
+
+<pre>
+ContentResolver contentResolver = getContentResolver();
+Uri uri = android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
+Cursor cursor = contentResolver.query(uri, null, null, null, null);
+if (cursor == null) {
+    // query failed, handle error.
+} else if (!cursor.moveToFirst()) { + // no media on the device +} else { + int titleColumn = cursor.getColumnIndex(android.provider.MediaStore.Audio.Media.TITLE); + int idColumn = cursor.getColumnIndex(android.provider.MediaStore.Audio.Media._ID); + do { + long thisId = cursor.getLong(idColumn); + String thisTitle = cursor.getString(titleColumn); + // ...process entry... + } while (cursor.moveToNext()); +} +</pre> + +<p>To use this with the {@link android.media.MediaPlayer}, you can do this:</p> + +<pre> +long id = /* retrieve it from somewhere */; +Uri contentUri = ContentUris.withAppendedId( + android.provider.MediaStore.Audio.Media.EXTERNAL_CONTENT_URI, id); + +mMediaPlayer = new MediaPlayer(); +mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC); +mMediaPlayer.setDataSource(getApplicationContext(), contentUri); + +// ...prepare and start... +</pre>
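+
+<p>The remaining "prepare and start" step can follow the same asynchronous pattern described
+earlier. For example, assuming this code runs in a class that implements
+{@link android.media.MediaPlayer.OnPreparedListener MediaPlayer.OnPreparedListener} (such as the
+service shown earlier), a minimal sketch would be:</p>
+
+<pre>
+mMediaPlayer.setOnPreparedListener(this);
+mMediaPlayer.prepareAsync(); // playback is then started in onPrepared(), as shown earlier
+</pre>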
\ No newline at end of file