page.title=API Overview
page.keywords=preview,sdk,compatibility
page.tags=previewresources, androidm
sdk.platform.apiLevel=22-mnc
page.image=images/cards/card-api-overview_16-9_2x.png
@jd:body

In this document

  1. App Linking
  2. Auto Backup for Apps
  3. Authentication
    1. Fingerprint Authentication
    2. Confirm Credentials
  4. Direct Share
  5. Voice Interactions
  6. Assist API
  7. Notifications
  8. Bluetooth Stylus Support
  9. Improved Bluetooth Low Energy Scanning
  10. Hotspot 2.0 Release 1 Support
  11. 4K Display Mode
  12. Themeable ColorStateLists
  13. Audio Features
  14. Video Features
  15. Camera Features
    1. Flashlight API
    2. Camera Reprocessing
  16. Android for Work Features

API Differences

  1. API level 22 to M Preview »

The M Developer Preview gives you an advance look at the upcoming release for the Android platform, which offers new features for users and app developers. This document provides an introduction to the most notable APIs.

The M Developer Preview is intended for developer early adopters and testers. If you are interested in influencing the direction of the Android framework, give the M Developer Preview a try and send us your feedback!

Caution: Do not publish apps that use the M Developer Preview to the Google Play store.

Note: This document often refers to classes and methods that do not yet have reference material available on developer.android.com. These API elements are formatted in {@code code style} in this document (without hyperlinks). For the preliminary API documentation for these elements, download the preview reference.

Important behavior changes

If you have previously published an app for Android, be aware that your app might be affected by changes in the platform.

Please see Behavior Changes for complete information.

App Linking

This preview enhances Android’s intent system by providing more powerful app linking. This feature allows you to associate an app with a web domain you own. Based on this association, the platform can determine the default app to use to handle a particular web link and skip prompting users to select an app. To learn how to implement this feature, see App Linking.

Auto Backup for Apps

The system now performs automatic full data backup and restore for apps. This behavior is enabled by default for apps targeting M Preview; you do not need to add any additional code. If users delete their Google accounts, their backup data is deleted as well. To learn how this feature works and how to configure what to back up on the file system, see Auto Backup for Apps.

Authentication

This preview offers new APIs to let you authenticate users by using their fingerprint scans on supported devices, and check how recently the user was last authenticated using a device unlocking mechanism (such as a lockscreen password). Use these APIs in conjunction with the Android Keystore system.

Fingerprint Authentication

To authenticate users via fingerprint scan, get an instance of the new {@code android.hardware.fingerprint.FingerprintManager} class and call the {@code FingerprintManager.authenticate()} method. Your app must be running on a compatible device with a fingerprint sensor. You must implement the user interface for the fingerprint authentication flow in your app, and use the standard Android fingerprint icon in your UI. The Android fingerprint icon ({@code c_fp_40px.png}) is included in the sample app. If you are developing multiple apps that use fingerprint authentication, note that each app must authenticate the user’s fingerprint independently.

To use this feature in your app, first add the {@code USE_FINGERPRINT} permission in your manifest.

<uses-permission
        android:name="android.permission.USE_FINGERPRINT" />
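
With the permission declared, a minimal sketch of starting fingerprint authentication might look like the following. The {@code cryptoObject} variable is an assumption here: it is expected to wrap a cipher you have already initialized with a key from the Android Keystore. Exception handling and imports are omitted.

FingerprintManager fingerprintManager =
        (FingerprintManager) getSystemService(Context.FINGERPRINT_SERVICE);
CancellationSignal cancellationSignal = new CancellationSignal();

fingerprintManager.authenticate(cryptoObject, cancellationSignal, 0 /* flags */,
        new FingerprintManager.AuthenticationCallback() {
            @Override
            public void onAuthenticationSucceeded(
                    FingerprintManager.AuthenticationResult result) {
                // The scanned fingerprint matched an enrolled fingerprint.
            }

            @Override
            public void onAuthenticationFailed() {
                // The fingerprint did not match; allow the user to try again.
            }
        }, null /* handler */);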

To see an app implementation of fingerprint authentication, refer to the Fingerprint Dialog sample.

If you are testing this feature, follow these steps:

  1. Install Android SDK Tools Revision 24.3, if you have not done so.
  2. Enroll a new fingerprint in the emulator by going to Settings > Security > Fingerprint, then follow the enrollment instructions.
  3. Use an emulator to emulate fingerprint touch events with the following command. Use the same command to emulate fingerprint touch events on the lockscreen or in your app.
    adb -e emu finger touch <finger_id>
    

    On Windows, you may have to run {@code telnet 127.0.0.1 <emulator-id>} followed by {@code finger touch <finger_id>}.

Confirm Credentials

Your app can authenticate users based on how recently they last unlocked their device. This feature frees users from having to remember additional app-specific passwords, and avoids the need for you to implement your own authentication user interface. Your app should use this feature in conjunction with a public or secret key implementation for user authentication.

To set the timeout duration for which the same key can be re-used after a user is successfully authenticated, call the new {@code android.security.keystore.KeyGenParameterSpec.setUserAuthenticationValidityDurationSeconds()} method when you set up a {@link javax.crypto.KeyGenerator} or {@link java.security.KeyPairGenerator}. This feature currently works for symmetric cryptographic operations.
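
For example, the following hedged sketch generates a symmetric key that is usable only for 30 seconds after the user last authenticated with the device lock screen. The key alias and cipher parameters are placeholders, and exception handling is omitted.

KeyGenerator keyGenerator = KeyGenerator.getInstance(
        KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore");
keyGenerator.init(new KeyGenParameterSpec.Builder("my_key",
        KeyProperties.PURPOSE_ENCRYPT | KeyProperties.PURPOSE_DECRYPT)
        .setBlockModes(KeyProperties.BLOCK_MODE_CBC)
        .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_PKCS7)
        // Require that the user has unlocked the device within the last 30 seconds.
        .setUserAuthenticationRequired(true)
        .setUserAuthenticationValidityDurationSeconds(30)
        .build());
SecretKey key = keyGenerator.generateKey();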

Avoid showing the re-authentication dialog excessively -- your app should try using the cryptographic object first and, if the timeout has expired, use the {@link android.app.KeyguardManager#createConfirmDeviceCredentialIntent(java.lang.CharSequence, java.lang.CharSequence) createConfirmDeviceCredentialIntent()} method to re-authenticate the user within your app.
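
A hedged sketch of that re-authentication step follows; the prompt strings are placeholders, and {@code REQUEST_CONFIRM_CREDENTIALS} is an arbitrary request code defined by your app.

KeyguardManager keyguardManager =
        (KeyguardManager) getSystemService(Context.KEYGUARD_SERVICE);
Intent intent = keyguardManager.createConfirmDeviceCredentialIntent(
        "Confirm your credentials", "Unlock your device to continue");
if (intent != null) {
    // Returns null if the user has no lock-screen credential set up.
    startActivityForResult(intent, REQUEST_CONFIRM_CREDENTIALS);
}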

To see an app implementation of this feature, refer to the Confirm Device Credentials sample.

Direct Share

This preview provides you with APIs to make sharing intuitive and quick for users. You can now define direct share targets that launch a specific activity in your app. These direct share targets are exposed to users via the Share menu. This feature allows users to share content to targets, such as contacts, within other apps. For example, the direct share target might launch an activity in another social network app, which lets the user share content directly to a specific friend or community in that app.

To enable direct share targets, you must define a class that extends the {@code android.service.chooser.ChooserTargetService} class. Declare your {@code ChooserTargetService} in the manifest. Within that declaration, specify the {@code BIND_CHOOSER_TARGET_SERVICE} permission and an intent filter with the {@code SERVICE_INTERFACE} action.

The following example shows how you might declare the {@code ChooserTargetService} in your manifest.

<service android:name=".ChooserTargetService"
        android:label="@string/service_name"
        android:permission="android.permission.BIND_CHOOSER_TARGET_SERVICE">
    <intent-filter>
        <action android:name="android.service.chooser.ChooserTargetService" />
    </intent-filter>
</service>

For each activity that you want to expose to the {@code ChooserTargetService}, add a {@code <meta-data>} element with the name {@code "android.service.chooser.chooser_target_service"} in your app manifest.

<activity android:name=".MyShareActivity"
        android:label="@string/share_activity_label">
    <intent-filter>
        <action android:name="android.intent.action.SEND" />
    </intent-filter>
<meta-data
        android:name="android.service.chooser.chooser_target_service"
        android:value=".ChooserTargetService" />
</activity>
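
In the service itself, return a list of {@code ChooserTarget} objects from the {@code onGetChooserTargets()} callback. The following is a hedged sketch; the target title, icon resource, and extras are placeholders, and imports are omitted.

public class ChooserTargetService extends android.service.chooser.ChooserTargetService {
    @Override
    public List<ChooserTarget> onGetChooserTargets(ComponentName targetActivityName,
            IntentFilter matchedFilter) {
        List<ChooserTarget> targets = new ArrayList<>();

        // Extras are merged into the launch intent when the user picks this target.
        Bundle extras = new Bundle();
        extras.putString("contact_id", "42");

        targets.add(new ChooserTarget(
                "Alice",                                              // title shown in the share menu
                Icon.createWithResource(this, R.drawable.ic_contact), // placeholder icon resource
                1.0f,                                                 // relative ranking score
                new ComponentName(getPackageName(), MyShareActivity.class.getName()),
                extras));
        return targets;
    }
}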

Voice Interactions

This preview provides a new voice interaction API which, together with Voice Actions, allows you to build conversational voice experiences into your apps. Call the {@code android.app.Activity.isVoiceInteraction()} method to determine if your activity was started in response to a voice action. If so, your app can use the {@code android.app.VoiceInteractor} class to request a voice confirmation from the user, select from a list of options, and more. To learn more about implementing voice actions, see the Voice Actions developer site.
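
For example, a hedged sketch of requesting a voice confirmation when the activity was launched by a voice action; the prompt text is a placeholder.

if (isVoiceInteraction()) {
    VoiceInteractor interactor = getVoiceInteractor();
    interactor.submitRequest(new VoiceInteractor.ConfirmationRequest(
            "Are you sure you want to complete this action?", null /* extras */) {
        @Override
        public void onConfirmationResult(boolean confirmed, Bundle result) {
            if (confirmed) {
                // Complete the action requested by the voice command.
            }
            finish();
        }
    });
}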

Assist API

This preview offers a new way for users to engage with your apps through an assistant. To use this feature, the user must enable the assistant to use the current context. Once enabled, the user can summon the assistant within any app by long-pressing the Home button.

Your app can elect to not share the current context with the assistant by setting the {@link android.view.WindowManager.LayoutParams#FLAG_SECURE} flag. In addition to the standard set of information that the platform passes to the assistant, your app can share additional information by using the new {@code android.app.Activity.AssistContent} class.

To provide the assistant with additional context from your app, follow these steps:

  1. Implement the {@link android.app.Application.OnProvideAssistDataListener} interface.
  2. Register this listener by using {@link android.app.Application#registerOnProvideAssistDataListener(android.app.Application.OnProvideAssistDataListener) registerOnProvideAssistDataListener()}.
  3. To provide activity-specific contextual information, override the {@link android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()} callback and, optionally, the new {@code Activity.onProvideAssistContent()} callback, as in the sketch after this list.
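
The following hedged sketch shows the optional {@code onProvideAssistContent()} callback; the web URI and structured data values are placeholders, not part of the API.

@Override
public void onProvideAssistContent(AssistContent assistContent) {
    super.onProvideAssistContent(assistContent);
    // Point the assistant at the web equivalent of the current screen.
    assistContent.setWebUri(Uri.parse("https://example.com/items/42"));
    // Optionally describe the content with schema.org structured data (JSON-LD).
    assistContent.setStructuredData(
            "{ \"@type\": \"Product\", \"name\": \"Example item\" }");
}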

Notifications

This preview adds the following API changes for notifications:

Bluetooth Stylus Support

This preview provides improved support for user input using a Bluetooth stylus. Users can pair and connect a compatible Bluetooth stylus with their phone or tablet. While connected, position information from the touch screen is fused with pressure and button information from the stylus to provide a greater range of expression than with the touch screen alone. Your app can listen for stylus button presses and perform secondary actions by registering the new {@code View.onStylusButtonPressListener} and {@code GestureDetector.OnStylusButtonPressListener} callbacks in your activity.

Use the {@link android.view.MotionEvent} methods and constants to detect stylus button interactions:

Improved Bluetooth Low Energy Scanning

If your app performs Bluetooth Low Energy scans, you can use the new {@code android.bluetooth.le.ScanSettings.Builder.setCallbackType()} method to specify that you want callbacks only when an advertisement packet matching the set {@link android.bluetooth.le.ScanFilter} is first found, and when it has not been seen for a period of time. This approach to scanning is more power-efficient than what’s provided in the previous platform version.
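
A hedged sketch of such a scan follows; the service UUID is a placeholder, and the {@code bluetoothAdapter} and {@code scanCallback} objects are assumed to come from your app.

BluetoothLeScanner scanner = bluetoothAdapter.getBluetoothLeScanner();

ScanFilter filter = new ScanFilter.Builder()
        .setServiceUuid(ParcelUuid.fromString("0000180d-0000-1000-8000-00805f9b34fb"))
        .build();

// Report a device only when it first matches the filter and when it is lost.
ScanSettings settings = new ScanSettings.Builder()
        .setCallbackType(ScanSettings.CALLBACK_TYPE_FIRST_MATCH
                | ScanSettings.CALLBACK_TYPE_MATCH_LOST)
        .build();

scanner.startScan(Arrays.asList(filter), settings, scanCallback);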

Hotspot 2.0 Release 1 Support

This preview adds support for the Hotspot 2.0 Release 1 spec on Nexus 6 and Nexus 9 devices. To provision Hotspot 2.0 credentials in your app, use the new methods of the {@link android.net.wifi.WifiEnterpriseConfig} class, such as {@code setPlmn()} and {@code setRealm()}. In the {@link android.net.wifi.WifiConfiguration} object, you can set the {@link android.net.wifi.WifiConfiguration#FQDN} and the {@code providerFriendlyName} fields. The new {@code ScanResult.PasspointNetwork} property indicates if a detected network represents a Hotspot 2.0 access point.
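
A hedged sketch of provisioning a Passpoint network with these APIs; all values are placeholders.

WifiConfiguration config = new WifiConfiguration();
config.FQDN = "hotspot.example.com";
config.providerFriendlyName = "Example Provider";

WifiEnterpriseConfig enterpriseConfig = new WifiEnterpriseConfig();
enterpriseConfig.setEapMethod(WifiEnterpriseConfig.Eap.TTLS);
enterpriseConfig.setRealm("example.com");
enterpriseConfig.setPlmn("310026");
config.enterpriseConfig = enterpriseConfig;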

4K Display Mode

The platform now allows apps to request that the display resolution be upgraded to 4K rendering on compatible hardware. To query the current physical resolution, use the new {@code android.view.Display.Mode} APIs. If the UI is drawn at a lower logical resolution and is upscaled to a larger physical resolution, be aware that the physical resolution the {@code Display.Mode.getPhysicalWidth()} method returns may differ from the logical resolution reported by {@link android.view.Display#getSize(android.graphics.Point) getSize()}.

You can request that the system change the physical resolution while your app runs by setting the {@code WindowManager.LayoutParams.preferredDisplayModeId} property of your app’s window. This feature is useful if you want to switch to 4K display resolution. While in 4K display mode, the UI continues to be rendered at the original resolution (such as 1080p) and is upscaled to 4K, but {@link android.view.SurfaceView} objects may show content at the native resolution.
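
For example, a hedged sketch of requesting the supported mode with the largest physical width (such as a 4K mode) for your window:

Display display = getWindowManager().getDefaultDisplay();
Display.Mode[] modes = display.getSupportedModes();
Display.Mode best = modes[0];
for (Display.Mode mode : modes) {
    if (mode.getPhysicalWidth() > best.getPhysicalWidth()) {
        best = mode;
    }
}

// Ask the system to switch this window to the selected mode.
WindowManager.LayoutParams params = getWindow().getAttributes();
params.preferredDisplayModeId = best.getModeId();
getWindow().setAttributes(params);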

Themeable ColorStateLists

Theme attributes are now supported in {@link android.content.res.ColorStateList} for devices running the M Preview. The {@link android.content.res.Resources#getColorStateList(int) getColorStateList()} and {@link android.content.res.Resources#getColor(int) getColor()} methods have been deprecated. If you are calling these APIs, call the new {@code Context.getColorStateList()} or {@code Context.getColor()} methods instead. These methods are also available in the v4 appcompat library via {@link android.support.v4.content.ContextCompat}.
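
For example, inside an activity (the color resource names are placeholders):

// On a device running the M Preview, resolve colors against the current theme.
int accent = getColor(R.color.accent);
ColorStateList buttonColors = getColorStateList(R.color.button_text);

// For code that must also run on older releases, the support library wraps the same calls.
int accentCompat = ContextCompat.getColor(this, R.color.accent);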

Audio Features

This preview adds enhancements to audio processing on Android, including:

Video Features

This preview adds new capabilities to the video processing APIs, including:

Camera Features

This preview includes the following new APIs for accessing the camera’s flashlight and for camera reprocessing of images:

Flashlight API

If a camera device has a flash unit, you can call the {@code CameraManager.setTorchMode()} method to switch the flash unit’s torch mode on or off without opening the camera device. The app does not have exclusive ownership of the flash unit or the camera device. The torch mode is turned off and becomes unavailable whenever the camera device becomes unavailable, or when other camera resources keeping the torch on become unavailable. Other apps can also call {@code setTorchMode()} to turn off the torch mode. When the last app that turned on the torch mode is closed, the torch mode is turned off.
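
A hedged sketch of turning on the torch of the first camera that reports a flash unit, without opening the camera device:

CameraManager cameraManager =
        (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
    for (String cameraId : cameraManager.getCameraIdList()) {
        CameraCharacteristics characteristics =
                cameraManager.getCameraCharacteristics(cameraId);
        Boolean hasFlash =
                characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
        if (Boolean.TRUE.equals(hasFlash)) {
            cameraManager.setTorchMode(cameraId, true);
            break;
        }
    }
} catch (CameraAccessException e) {
    // The camera device is unavailable or in use.
}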

You can register a callback to be notified about torch mode status by calling the {@code CameraManager.registerTorchCallback()} method. The first time the callback is registered, it is immediately called with the torch mode status of all currently known camera devices with a flash unit. If the torch mode is turned on or off successfully, the {@code CameraManager.TorchCallback.onTorchModeChanged()} method is invoked.
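
For example, a minimal sketch of such a callback:

cameraManager.registerTorchCallback(new CameraManager.TorchCallback() {
    @Override
    public void onTorchModeChanged(String cameraId, boolean enabled) {
        // Update any torch toggle in your UI to reflect the new state.
    }
}, null /* handler */);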

Reprocessing API

The {@link android.hardware.camera2 Camera2} API is extended to support YUV and private opaque format image reprocessing. Your app can determine whether reprocessing capabilities are available via {@code CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES}. If a device supports reprocessing, you can create a reprocessable camera capture session by calling {@code CameraDevice.createReprocessableCaptureSession()}, and create requests for input buffer reprocessing.

Use the {@code ImageWriter} class to connect the input buffer flow to the camera reprocessing input. To get an empty buffer, follow this programming model (a hedged sketch follows these steps):

  1. Call the {@code ImageWriter.dequeueInputImage()} method.
  2. Fill the data into the input buffer.
  3. Send the buffer to the camera by calling the {@code ImageWriter.queueInputImage()} method.
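
The sketch below illustrates this flow; it assumes a reprocessable capture session named {@code reprocessableSession} created as described above, and omits error handling.

ImageWriter imageWriter = ImageWriter.newInstance(
        reprocessableSession.getInputSurface(), 2 /* maxImages */);

// 1. Get an empty input buffer.
Image inputImage = imageWriter.dequeueInputImage();

// 2. Fill the buffer with the frame you want to reprocess.
// ...

// 3. Send the buffer back to the camera for reprocessing.
imageWriter.queueInputImage(inputImage);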

If you are using an {@code ImageWriter} object together with an {@code android.graphics.ImageFormat.PRIVATE} image, your app cannot access the image data directly. Instead, pass the {@code ImageFormat.PRIVATE} image directly to the {@code ImageWriter} by calling the {@code ImageWriter.queueInputImage()} method without any buffer copy.

The {@code ImageReader} class now supports {@code android.graphics.ImageFormat.PRIVATE} format image streams. This support allows your app to maintain a circular image queue of {@code ImageReader} output images, select one or more images, and send them to the {@code ImageWriter} for camera reprocessing.

Android for Work Features

This preview includes the following new APIs for Android for Work:

For a detailed view of all API changes in the M Developer Preview, see the API Differences Report.