Diffstat (limited to 'docs/html/about')
-rw-r--r--  docs/html/about/versions/android-4.0.jd    2
-rw-r--r--  docs/html/about/versions/android-4.3.jd  706
-rw-r--r--  docs/html/about/versions/jelly-bean.jd   583
3 files changed, 932 insertions, 359 deletions
diff --git a/docs/html/about/versions/android-4.0.jd b/docs/html/about/versions/android-4.0.jd
index 1ce005d..2fa180c 100644
--- a/docs/html/about/versions/android-4.0.jd
+++ b/docs/html/about/versions/android-4.0.jd
@@ -822,7 +822,7 @@ the methods necessary for any given view without extending the {@link android.vi
<p>If you want to maintain compatibility with Android versions prior to 4.0, while also supporting
the new accessibility APIs, you can do so with the latest version of the <em>v4 support
-library</em> (in <a href="{@docRoot}tools/extras/support-library.html">Compatibility Package, r4</a>)
+library</em> (in <a href="{@docRoot}tools/support-library/index.html">Compatibility Package, r4</a>)
using a set of utility classes that provide the new accessibility APIs in a backward-compatible
design.</p>
diff --git a/docs/html/about/versions/android-4.3.jd b/docs/html/about/versions/android-4.3.jd
index 0ca3bc6..bccc9d5 100644
--- a/docs/html/about/versions/android-4.3.jd
+++ b/docs/html/about/versions/android-4.3.jd
@@ -7,7 +7,7 @@ sdk.platform.apiLevel=18
<div id="qv-wrapper">
<div id="qv">
-
+
<h2>In this document
<a href="#" onclick="hideNestedItems('#toc43',this);return false;" class="header-toggle">
<span class="more">show more</span>
@@ -36,7 +36,7 @@ sdk.platform.apiLevel=18
</li>
<li><a href="#Multimedia">Multimedia</a>
<ol>
- <li><a href="#DASH">MPEG DASH support</a></li>
+ <li><a href="#MediaExtractor">MediaExtractor and MediaCodec enhancements</a></li>
<li><a href="#DRM">Media DRM</a></li>
<li><a href="#EncodingSurface">Video encoding from a Surface</a></li>
<li><a href="#MediaMuxing">Media muxing</a></li>
@@ -62,7 +62,6 @@ sdk.platform.apiLevel=18
</li>
<li><a href="#UserInput">User Input</a>
<ol>
- <li><a href="#SignificantMotion">Detect significant motion</a></li>
<li><a href="#Sensors">New sensor types</a></li>
</ol>
</li>
@@ -110,7 +109,7 @@ sdk.platform.apiLevel=18
<li><a href="{@docRoot}sdk/api_diff/18/changes.html">API
Differences Report &raquo;</a> </li>
<li><a
-href="{@docRoot}tools/extras/support-library.html">Support Library</a></li>
+href="{@docRoot}tools/support-library/index.html">Support Library</a></li>
</ol>
</div>
@@ -133,7 +132,7 @@ image to test your app on the <a href="{@docRoot}tools/devices/emulator.html">An
Then build your apps against the Android {@sdkPlatformVersion} platform to begin using the
latest APIs.</p>
-
+
<h3 id="ApiLevel">Update your target API level</h3>
<p>To better optimize your app for devices running Android {@sdkPlatformVersion},
@@ -145,13 +144,13 @@ test it, then publish an update with this change.</p>
<p>You can use APIs in Android {@sdkPlatformVersion} while also supporting older versions by adding
conditions to your code that check for the system API level before executing
APIs not supported by your <a
-href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a>.
+href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a>.
To learn more about maintaining backward compatibility, read <a
href="{@docRoot}training/basics/supporting-devices/platforms.html">Supporting Different
Platform Versions</a>.</p>
<p>Various APIs are also available in the Android <a
-href="{@docRoot}tools/extras/support-library.html">Support Library</a> that allow you to implement
+href="{@docRoot}tools/support-library/index.html">Support Library</a> that allow you to implement
new features on older versions of the platform.</p>
<p>For more information about how API levels work, read <a
@@ -172,11 +171,11 @@ be affected by changes in Android {@sdkPlatformVersion}.</p>
<p>Your app might misbehave in a restricted profile environment.</p>
-<p>Users in a <a href="#RestrictedProfiles">restricted profile</a> environment might not
-have all the standard Android apps available. For example, a restricted profile might have the
-web browser and camera app disabled. So your app should not make assumptions about which apps are
-available, because if you call {@link android.app.Activity#startActivity startActivity()} without
-verifying whether an app is available to handle the {@link android.content.Intent},
+<p>Users in a <a href="#RestrictedProfiles">restricted profile</a> environment might not
+have all the standard Android apps available. For example, a restricted profile might have the
+web browser and camera app disabled. So your app should not make assumptions about which apps are
+available, because if you call {@link android.app.Activity#startActivity startActivity()} without
+verifying whether an app is available to handle the {@link android.content.Intent},
your app might crash in a restricted profile.</p>
<p>When using an implicit intent, you should always verify that an app is available to handle the intent by calling {@link android.content.Intent#resolveActivity resolveActivity()} or {@link android.content.pm.PackageManager#queryIntentActivities queryIntentActivities()}. For example:</p>
@@ -197,20 +196,20 @@ if (intent.resolveActivity(getPackageManager()) != null) {
<p>Your app might misbehave in a restricted profile environment.</p>
<p>Users within a restricted profile environment do not have access to user accounts by default.
-If your app depends on an {@link android.accounts.Account}, then your app might crash or behave
+If your app depends on an {@link android.accounts.Account}, then your app might crash or behave
unexpectedly when used in a restricted profile.</p>
<p>If you'd like to prevent restricted profiles from using your app entirely because your
-app depends on account information that's sensitive, specify the <a
-href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
-android:requiredAccountType}</a> attribute in your manifest's <a
-href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application>}</a>
+app depends on account information that's sensitive, specify the <a
+href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
+android:requiredAccountType}</a> attribute in your manifest's <a
+href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application>}</a>
element.</p>
-<p>If you’d like to allow restricted profiles to continue using your app even though they can’t
-create their own accounts, then you can either disable your app features that require an account
+<p>If you’d like to allow restricted profiles to continue using your app even though they can’t
+create their own accounts, then you can either disable your app features that require an account
or allow restricted profiles to access the accounts created by the primary user. For more
-information, see the section
+information, see the section
below about <a href="#AccountsInProfile">Supporting accounts in a restricted profile</a>.</p>
@@ -218,67 +217,67 @@ below about <a href="#AccountsInProfile">Supporting accounts in a restricted pro
<h2 id="RestrictedProfiles">Restricted Profiles</h2>
-<p>On Android tablets, users can now create restricted profiles based on the primary user.
+<p>On Android tablets, users can now create restricted profiles based on the primary user.
When users create a restricted profile, they can enable restrictions such as which apps are
available to the profile. A new set of APIs in Android 4.3 also allow you to build fine-grain
-restriction settings for the apps you develop. For example, by using the new APIs, you can
-allow users to control what type of content is available within your app when running in a
+restriction settings for the apps you develop. For example, by using the new APIs, you can
+allow users to control what type of content is available within your app when running in a
restricted profile environment.</p>
-<p>The UI for users to control the restrictions you've built is managed by the system's
+<p>The UI for users to control the restrictions you've built is managed by the system's
Settings application. To make your app's restriction settings appear to the user,
-you must declare the restrictions your app provides by creating a {@link
-android.content.BroadcastReceiver} that receives the {@link android.content.Intent#ACTION_GET_RESTRICTION_ENTRIES} intent. The system invokes this intent to query
-all apps for available restrictions, then builds the UI to allow the primary user to
+you must declare the restrictions your app provides by creating a {@link
+android.content.BroadcastReceiver} that receives the {@link android.content.Intent#ACTION_GET_RESTRICTION_ENTRIES} intent. The system invokes this intent to query
+all apps for available restrictions, then builds the UI to allow the primary user to
manage restrictions for each restricted profile. </p>
-<p>In the {@link android.content.BroadcastReceiver#onReceive onReceive()} method of
-your {@link android.content.BroadcastReceiver}, you must create a {@link
-android.content.RestrictionEntry} for each restriction your app provides. Each {@link
-android.content.RestrictionEntry} defines a restriction title, description, and one of the
+<p>In the {@link android.content.BroadcastReceiver#onReceive onReceive()} method of
+your {@link android.content.BroadcastReceiver}, you must create a {@link
+android.content.RestrictionEntry} for each restriction your app provides. Each {@link
+android.content.RestrictionEntry} defines a restriction title, description, and one of the
following data types:</p>
<ul>
- <li>{@link android.content.RestrictionEntry#TYPE_BOOLEAN} for a restriction that is
+ <li>{@link android.content.RestrictionEntry#TYPE_BOOLEAN} for a restriction that is
either true or false.
- <li>{@link android.content.RestrictionEntry#TYPE_CHOICE} for a restriction that has
+ <li>{@link android.content.RestrictionEntry#TYPE_CHOICE} for a restriction that has
multiple choices that are mutually exclusive (radio button choices).
- <li>{@link android.content.RestrictionEntry#TYPE_MULTI_SELECT} for a restriction that
+ <li>{@link android.content.RestrictionEntry#TYPE_MULTI_SELECT} for a restriction that
has multiple choices that are <em>not</em> mutually exclusive (checkbox choices).
</ul>
-<p>You then put all the {@link android.content.RestrictionEntry} objects into an {@link
-java.util.ArrayList} and put it into the broadcast receiver's result as the value for the
+<p>You then put all the {@link android.content.RestrictionEntry} objects into an {@link
+java.util.ArrayList} and put it into the broadcast receiver's result as the value for the
{@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} extra.</p>
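<p>Put together, the receiver might look like the following sketch (the
restriction key, title, and description shown here are hypothetical examples,
not values defined by the platform):</p>
<pre>
public class GetRestrictionsReceiver extends BroadcastReceiver {
    public void onReceive(Context context, Intent intent) {
        // Hypothetical boolean restriction defined by this app
        RestrictionEntry allowPurchases =
                new RestrictionEntry("allow_purchases", false);
        allowPurchases.setTitle("Allow purchases");
        allowPurchases.setDescription("Allow this profile to make purchases");

        ArrayList&lt;RestrictionEntry> entries = new ArrayList&lt;RestrictionEntry>();
        entries.add(allowPurchases);

        Bundle result = getResultExtras(true);
        result.putParcelableArrayList(Intent.EXTRA_RESTRICTIONS_LIST, entries);
        setResultExtras(result);
    }
}
</pre>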
-<p>The system creates the UI for your app's restrictions in the Settings app and saves each
-restriction with the unique key you provided for each {@link android.content.RestrictionEntry}
-object. When the user opens your app, you can query for any current restrictions by
-calling {@link android.os.UserManager#getApplicationRestrictions getApplicationRestrictions()}.
+<p>The system creates the UI for your app's restrictions in the Settings app and saves each
+restriction with the unique key you provided for each {@link android.content.RestrictionEntry}
+object. When the user opens your app, you can query for any current restrictions by
+calling {@link android.os.UserManager#getApplicationRestrictions getApplicationRestrictions()}.
This returns a {@link android.os.Bundle} containing the key-value pairs for each restriction
you defined with the {@link android.content.RestrictionEntry} objects.</p>
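<p>For example, a minimal sketch of reading your app's restrictions when it
starts (the {@code "allow_purchases"} key is a hypothetical example):</p>
<pre>
UserManager um = (UserManager) getSystemService(Context.USER_SERVICE);
Bundle restrictions = um.getApplicationRestrictions(getPackageName());
boolean allowPurchases = restrictions.getBoolean("allow_purchases", true);
</pre>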
-<p>If you want to provide more specific restrictions that can't be handled by boolean, single
-choice, and multi-choice values, then you can create an activity where the user can specify the
-restrictions and allow users to open that activity from the restriction settings. In your
-broadcast receiver, include the {@link android.content.Intent#EXTRA_RESTRICTIONS_INTENT} extra
+<p>If you want to provide more specific restrictions that can't be handled by boolean, single
+choice, and multi-choice values, then you can create an activity where the user can specify the
+restrictions and allow users to open that activity from the restriction settings. In your
+broadcast receiver, include the {@link android.content.Intent#EXTRA_RESTRICTIONS_INTENT} extra
in the result {@link android.os.Bundle}. This extra must specify an {@link android.content.Intent}
-indicating the {@link android.app.Activity} class to launch (use the
-{@link android.os.Bundle#putParcelable putParcelable()} method to pass {@link
+indicating the {@link android.app.Activity} class to launch (use the
+{@link android.os.Bundle#putParcelable putParcelable()} method to pass {@link
android.content.Intent#EXTRA_RESTRICTIONS_INTENT} with the intent).
-When the primary user enters your activity to set custom restrictions, your
-activity must then return a result containing the restriction values in an extra using either
-the {@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} or {@link
+When the primary user enters your activity to set custom restrictions, your
+activity must then return a result containing the restriction values in an extra using either
+the {@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} or {@link
android.content.Intent#EXTRA_RESTRICTIONS_BUNDLE} key, depending on whether you specify
{@link android.content.RestrictionEntry} objects or key-value pairs, respectively.</p>
<h3 id="AccountsInProfile">Supporting accounts in a restricted profile</h3>
-<p>Any accounts added to the primary user are available to a restricted profile, but the
-accounts are not accessible from the {@link android.accounts.AccountManager} APIs by default.
+<p>Any accounts added to the primary user are available to a restricted profile, but the
+accounts are not accessible from the {@link android.accounts.AccountManager} APIs by default.
If you attempt to add an account with {@link android.accounts.AccountManager} while in a restricted
-profile, you will get a failure result. Due to these restrictions, you have the following
+profile, you will get a failure result. Due to these restrictions, you have the following
three options:</p>
<li><strong>Allow access to the owner’s accounts from a restricted profile.</strong>
@@ -289,21 +288,25 @@ href="{@docRoot}guide/topics/manifest/application-element.html">&lt;application>
android:restrictedAccountType="com.example.account.type" >
</pre>
-<p class="caution"><strong>Caution:</strong> Enabling this attribute provides your
-app access to the primary user's accounts from restricted profiles. So you should allow this
+<p class="caution"><strong>Caution:</strong> Enabling this attribute provides your
+app access to the primary user's accounts from restricted profiles. So you should allow this
only if the information displayed by your app does not reveal personally identifiable
-information (PII) that’s considered sensitive.</p>
+information (PII) that’s considered sensitive. The system settings will inform the primary
+user that your app grants restricted profiles access to their accounts, so it should be clear to the user
+that account access is important for your app's functionality. If possible, you should also
+provide adequate restriction controls for the primary user that define how much account access
+is allowed in your app.</p>
</li>
<li><strong>Disable certain functionality when unable to modify accounts.</strong>
-<p>If you want to use accounts, but don’t actually require them for your app’s primary
-functionality, you can check for account availability and disable features when not available.
-You should first check if there is an existing account available. If not, then query whether
-it’s possible to create a new account by calling {@link
-android.os.UserManager#getUserRestrictions()} and check the {@link
-android.os.UserManager#DISALLOW_MODIFY_ACCOUNTS} extra in the result. If it is {@code true},
-then you should disable whatever functionality of your app requires access to accounts.
+<p>If you want to use accounts, but don’t actually require them for your app’s primary
+functionality, you can check for account availability and disable features when not available.
+You should first check if there is an existing account available. If not, then query whether
+it’s possible to create a new account by calling {@link
+android.os.UserManager#getUserRestrictions()} and check the {@link
+android.os.UserManager#DISALLOW_MODIFY_ACCOUNTS} extra in the result. If it is {@code true},
+then you should disable whatever functionality of your app requires access to accounts.
For example:</p>
<pre>
UserManager um = (UserManager) context.getSystemService(Context.USER_SERVICE);
@@ -312,15 +315,15 @@ if (restrictions.getBoolean(UserManager.DISALLOW_MODIFY_ACCOUNTS, false)) {
// cannot add accounts, disable some functionality
}
</pre>
-<p class="note"><strong>Note:</strong> In this scenario, you should <em>not</em> declare
+<p class="note"><strong>Note:</strong> In this scenario, you should <em>not</em> declare
any new attributes in your manifest file.</p>
</li>
<li><strong>Disable your app when unable to access private accounts.</strong>
-<p>If it’s instead important that your app not be available to restricted profiles because
-your app depends on sensitive personal information in an account (and because restricted profiles
+<p>If it’s instead important that your app not be available to restricted profiles because
+your app depends on sensitive personal information in an account (and because restricted profiles
currently cannot add new accounts), add
-the <a href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
+the <a href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
android:requiredAccountType}</a> attribute to the <a
href="{@docRoot}guide/topics/manifest/application-element.html">&lt;application></a> tag:</p>
<pre>
@@ -338,11 +341,11 @@ because the owner's personal email should not be available to restricted profile
<h3 id="BTLE">Bluetooth Low Energy (Smart Ready)</h3>
-<p>Android now supports Bluetooth Low Energy (LE) with new APIs in {@link android.bluetooth}.
-With the new APIs, you can build Android apps that communicate with Bluetooth Low Energy
+<p>Android now supports Bluetooth Low Energy (LE) with new APIs in {@link android.bluetooth}.
+With the new APIs, you can build Android apps that communicate with Bluetooth Low Energy
peripherals such as heart rate monitors and pedometers.</p>
-<p>Because Bluetooth LE is a hardware feature that is not available on all
+<p>Because Bluetooth LE is a hardware feature that is not available on all
Android-powered devices, you must declare in your manifest file a <a
href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature>}</a>
element for {@code "android.hardware.bluetooth_le"}:</p>
@@ -350,11 +353,11 @@ element for {@code "android.hardware.bluetooth_le"}:</p>
&lt;uses-feature android:name="android.hardware.bluetooth_le" android:required="true" />
</pre>
-<p>If you're already familiar with Android's Classic Bluetooth APIs, notice that using the
-Bluetooth LE APIs has some differences. Most importantly is that there's now a {@link
-android.bluetooth.BluetoothManager} class that you should use for some high level operations
-such as acquiring a {@link android.bluetooth.BluetoothAdapter}, getting a list of connected
-devices, and checking the state of a device. For example, here's how you should now get the
+<p>If you're already familiar with Android's Classic Bluetooth APIs, notice that using the
+Bluetooth LE APIs has some differences. Most importantly is that there's now a {@link
+android.bluetooth.BluetoothManager} class that you should use for some high level operations
+such as acquiring a {@link android.bluetooth.BluetoothAdapter}, getting a list of connected
+devices, and checking the state of a device. For example, here's how you should now get the
{@link android.bluetooth.BluetoothAdapter}:</p>
<pre>
final BluetoothManager bluetoothManager =
@@ -362,32 +365,32 @@ final BluetoothManager bluetoothManager =
mBluetoothAdapter = bluetoothManager.getAdapter();
</pre>
-<p>To discover Bluetooth LE peripherals, call {@link android.bluetooth.BluetoothAdapter#startLeScan
-startLeScan()} on the {@link android.bluetooth.BluetoothAdapter}, passing it an implementation
-of the {@link android.bluetooth.BluetoothAdapter.LeScanCallback} interface. When the Bluetooth
-adapter detects a Bluetooth LE peripheral, your {@link
-android.bluetooth.BluetoothAdapter.LeScanCallback} implementation receives a call to the
-{@link android.bluetooth.BluetoothAdapter.LeScanCallback#onLeScan onLeScan()} method. This
-method provides you with a {@link android.bluetooth.BluetoothDevice} object representing the
-detected device, the RSSI value for the device, and a byte array containing the device's
+<p>To discover Bluetooth LE peripherals, call {@link android.bluetooth.BluetoothAdapter#startLeScan
+startLeScan()} on the {@link android.bluetooth.BluetoothAdapter}, passing it an implementation
+of the {@link android.bluetooth.BluetoothAdapter.LeScanCallback} interface. When the Bluetooth
+adapter detects a Bluetooth LE peripheral, your {@link
+android.bluetooth.BluetoothAdapter.LeScanCallback} implementation receives a call to the
+{@link android.bluetooth.BluetoothAdapter.LeScanCallback#onLeScan onLeScan()} method. This
+method provides you with a {@link android.bluetooth.BluetoothDevice} object representing the
+detected device, the RSSI value for the device, and a byte array containing the device's
advertisement record.</p>
-<p>If you want to scan for only specific types of peripherals, you can instead call {@link
-android.bluetooth.BluetoothAdapter#startLeScan startLeScan()} and include an array of {@link
+<p>If you want to scan for only specific types of peripherals, you can instead call {@link
+android.bluetooth.BluetoothAdapter#startLeScan startLeScan()} and include an array of {@link
java.util.UUID} objects that specify the GATT services your app supports.</p>
-<p class="note"><strong>Note:</strong> You can only scan for Bluetooth LE devices <em>or</em>
-scan for Classic Bluetooth devices using previous APIs. You cannot scan for both LE and Classic
+<p class="note"><strong>Note:</strong> You can only scan for Bluetooth LE devices <em>or</em>
+scan for Classic Bluetooth devices using previous APIs. You cannot scan for both LE and Classic
Bluetooth devices at once.</p>
-<p>To then connect to a Bluetooth LE peripheral, call {@link
-android.bluetooth.BluetoothDevice#connectGatt connectGatt()} on the corresponding
-{@link android.bluetooth.BluetoothDevice} object, passing it an implementation of
-{@link android.bluetooth.BluetoothGattCallback}. Your implementation of {@link
-android.bluetooth.BluetoothGattCallback} receives callbacks regarding the connectivity
-state with the device and other events. It's during the {@link
-android.bluetooth.BluetoothGattCallback#onConnectionStateChange onConnectionStateChange()}
-callback that you can begin communicating with the device if the method passes {@link
+<p>To then connect to a Bluetooth LE peripheral, call {@link
+android.bluetooth.BluetoothDevice#connectGatt connectGatt()} on the corresponding
+{@link android.bluetooth.BluetoothDevice} object, passing it an implementation of
+{@link android.bluetooth.BluetoothGattCallback}. Your implementation of {@link
+android.bluetooth.BluetoothGattCallback} receives callbacks regarding the connectivity
+state with the device and other events. It's during the {@link
+android.bluetooth.BluetoothGattCallback#onConnectionStateChange onConnectionStateChange()}
+callback that you can begin communicating with the device if the method passes {@link
android.bluetooth.BluetoothProfile#STATE_CONNECTED} as the new state.</p>
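<p>The scan-and-connect flow described above can be sketched as follows,
assuming the code lives in an activity that already holds the {@link
android.bluetooth.BluetoothAdapter} (the {@code MyActivity} name is
hypothetical):</p>
<pre>
private BluetoothGatt mGatt;

private final BluetoothAdapter.LeScanCallback mLeScanCallback =
        new BluetoothAdapter.LeScanCallback() {
    public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
        // Connect to the first LE peripheral found, then stop scanning
        mBluetoothAdapter.stopLeScan(this);
        mGatt = device.connectGatt(MyActivity.this, false, mGattCallback);
    }
};

private final BluetoothGattCallback mGattCallback = new BluetoothGattCallback() {
    public void onConnectionStateChange(BluetoothGatt gatt, int status,
            int newState) {
        if (newState == BluetoothProfile.STATE_CONNECTED) {
            // The device is connected; communication can begin
            gatt.discoverServices();
        }
    }
};
</pre>
<p>Discovery then begins with a call to {@code
mBluetoothAdapter.startLeScan(mLeScanCallback)}.</p>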
<p>Accessing Bluetooth features on a device also requires that your app request certain
@@ -397,161 +400,162 @@ href="{@docRoot}guide/topics/connectivity/bluetooth-le.html">Bluetooth Low Energ
<h3 id="WiFiScan">Wi-Fi scan-only mode</h3>
-<p>When attempting to identify the user's location, Android may use Wi-Fi to help determine
-the location by scanning nearby access points. However, users often keep Wi-Fi turned off to
-conserve battery, resulting in location data that's less accurate. Android now includes a
-scan-only mode that allows the device Wi-Fi to scan access points to help obtain the location
+<p>When attempting to identify the user's location, Android may use Wi-Fi to help determine
+the location by scanning nearby access points. However, users often keep Wi-Fi turned off to
+conserve battery, resulting in location data that's less accurate. Android now includes a
+scan-only mode that allows the device Wi-Fi to scan access points to help obtain the location
without connecting to an access point, thus greatly reducing battery usage.</p>
-<p>If you want to acquire the user's location but Wi-Fi is currently off, you can request the
-user to enable Wi-Fi scan-only mode by calling {@link android.content.Context#startActivity
-startActivity()} with the action {@link
+<p>If you want to acquire the user's location but Wi-Fi is currently off, you can request the
+user to enable Wi-Fi scan-only mode by calling {@link android.content.Context#startActivity
+startActivity()} with the action {@link
android.net.wifi.WifiManager#ACTION_REQUEST_SCAN_ALWAYS_AVAILABLE}.</p>
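<p>For example, a minimal sketch from an activity, first checking whether
scan-only mode is already available:</p>
<pre>
WifiManager wifi = (WifiManager) getSystemService(Context.WIFI_SERVICE);
if (!wifi.isScanAlwaysAvailable()) {
    // Ask the user to enable Wi-Fi scan-only mode
    startActivity(new Intent(WifiManager.ACTION_REQUEST_SCAN_ALWAYS_AVAILABLE));
}
</pre>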
<h3 id="WiFiConfig">Wi-Fi configuration</h3>
-<p>New {@link android.net.wifi.WifiEnterpriseConfig} APIs allow enterprise-oriented services to
+<p>New {@link android.net.wifi.WifiEnterpriseConfig} APIs allow enterprise-oriented services to
automate Wi-Fi configuration for managed devices.</p>
<h3 id="QuickResponse">Quick response for incoming calls</h3>
-<p>Since Android 4.0, a feature called "Quick response" allows users to respond to incoming
-calls with an immediate text message without needing to pick up the call or unlock the device.
-Until now, these quick messages were always handled by the default Messaging app. Now any app
-can declare its capability to handle these messages by creating a {@link android.app.Service}
+<p>Since Android 4.0, a feature called "Quick response" allows users to respond to incoming
+calls with an immediate text message without needing to pick up the call or unlock the device.
+Until now, these quick messages were always handled by the default Messaging app. Now any app
+can declare its capability to handle these messages by creating a {@link android.app.Service}
with an intent filter for {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}.</p>
-<p>When the user responds to an incoming call with a quick response, the Phone app sends
-the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE} intent with a URI
-describing the recipient (the caller) and the {@link android.content.Intent#EXTRA_TEXT} extra
-with the message the user wants to send. When your service receives the intent, it should deliver
+<p>When the user responds to an incoming call with a quick response, the Phone app sends
+the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE} intent with a URI
+describing the recipient (the caller) and the {@link android.content.Intent#EXTRA_TEXT} extra
+with the message the user wants to send. When your service receives the intent, it should deliver
the message and immediately stop itself (your app should not show an activity).</p>
-<p>In order to receive this intent, you must declare the {@link
+<p>In order to receive this intent, you must declare the {@link
android.Manifest.permission#SEND_RESPOND_VIA_MESSAGE} permission.</p>
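<p>For example, the service declaration might look like the following manifest
sketch (the {@code .RespondService} class name is hypothetical). One common
pattern is to require the permission on the {@code &lt;service>} element so that
only apps holding it, such as the Phone app, can start your service:</p>
<pre>
&lt;service android:name=".RespondService"
    android:permission="android.permission.SEND_RESPOND_VIA_MESSAGE"
    android:exported="true" >
    &lt;intent-filter>
        &lt;action android:name="android.intent.action.RESPOND_VIA_MESSAGE" />
        &lt;category android:name="android.intent.category.DEFAULT" />
        &lt;data android:scheme="sms" />
        &lt;data android:scheme="smsto" />
    &lt;/intent-filter>
&lt;/service>
</pre>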
<h2 id="Multimedia">Multimedia</h2>
-<h3 id="DASH">MPEG DASH support</h3>
+<h3 id="MediaExtractor">MediaExtractor and MediaCodec enhancements</h3>
-<p>Android now supports Dynamic Adaptive Streaming over HTTP (DASH) in accordance with the
-ISO/IEC 23009-1 standard, using existing APIs in {@link android.media.MediaCodec} and {@link
-android.media.MediaExtractor}. The framework underlying these APIs has been updated to support
-parsing of fragmented MP4 files, but your app is still responsible for parsing the MPD metadata
+<p>Android now makes it easier for you to write your own Dynamic Adaptive
+Streaming over HTTP (DASH) players in accordance with the ISO/IEC 23009-1 standard,
+using existing APIs in {@link android.media.MediaCodec} and {@link
+android.media.MediaExtractor}. The framework underlying these APIs has been updated to support
+parsing of fragmented MP4 files, but your app is still responsible for parsing the MPD metadata
and passing the individual streams to {@link android.media.MediaExtractor}.</p>
-<p>If you want to use DASH with encrypted content, notice that the {@link android.media.MediaExtractor#getSampleCryptoInfo getSampleCryptoInfo()} method returns the {@link
-android.media.MediaCodec.CryptoInfo} metadata describing the structure of each encrypted media
-sample. Also, the {@link android.media.MediaExtractor#getPsshInfo()} method has been added to
-{@link android.media.MediaExtractor} so you can access the PSSH metadata for your DASH media.
-This method returns a map of {@link java.util.UUID} objects to bytes, with the
-{@link java.util.UUID} specifying the crypto scheme, and the bytes being the data specific
+<p>If you want to use DASH with encrypted content, notice that the {@link android.media.MediaExtractor#getSampleCryptoInfo getSampleCryptoInfo()} method returns the {@link
+android.media.MediaCodec.CryptoInfo} metadata describing the structure of each encrypted media
+sample. Also, the {@link android.media.MediaExtractor#getPsshInfo()} method has been added to
+{@link android.media.MediaExtractor} so you can access the PSSH metadata for your DASH media.
+This method returns a map of {@link java.util.UUID} objects to bytes, with the
+{@link java.util.UUID} specifying the crypto scheme, and the bytes being the data specific
to that scheme.</p>
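<p>For example, a sketch of pulling the PSSH metadata out of a stream (the
{@code streamUri} and {@code schemeUuid} values are hypothetical and depend on
your content):</p>
<pre>
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(streamUri);  // a fragmented MP4 stream

Map&lt;UUID, byte[]> psshInfo = extractor.getPsshInfo();
// Data specific to the DRM scheme identified by schemeUuid
byte[] schemeData = psshInfo.get(schemeUuid);
</pre>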
<h3 id="DRM">Media DRM</h3>
<p>The new {@link android.media.MediaDrm} class provides a modular solution for digital rights
-management (DRM) with your media content by separating DRM concerns from media playback. For
-instance, this API separation allows you to play back Widevine-encrypted content without having
-to use the Widevine media format. This DRM solution also supports DASH Common Encryption so you
+management (DRM) with your media content by separating DRM concerns from media playback. For
+instance, this API separation allows you to play back Widevine-encrypted content without having
+to use the Widevine media format. This DRM solution also supports DASH Common Encryption so you
can use a variety of DRM schemes with your streaming content.</p>
-<p>You can use {@link android.media.MediaDrm} to obtain opaque key-request messages and process
-key-response messages from the server for license acquisition and provisioning. Your app is
-responsible for handling the network communication with the servers; the {@link
+<p>You can use {@link android.media.MediaDrm} to obtain opaque key-request messages and process
+key-response messages from the server for license acquisition and provisioning. Your app is
+responsible for handling the network communication with the servers; the {@link
android.media.MediaDrm} class provides only the ability to generate and process the messages.</p>
-<p>The {@link android.media.MediaDrm} APIs are intended to be used in conjunction with the
-{@link android.media.MediaCodec} APIs that were introduced in Android 4.1 (API level 16),
-including {@link android.media.MediaCodec} for encoding and decoding your content, {@link
-android.media.MediaCrypto} for handling encrypted content, and {@link android.media.MediaExtractor}
+<p>The {@link android.media.MediaDrm} APIs are intended to be used in conjunction with the
+{@link android.media.MediaCodec} APIs that were introduced in Android 4.1 (API level 16),
+including {@link android.media.MediaCodec} for encoding and decoding your content, {@link
+android.media.MediaCrypto} for handling encrypted content, and {@link android.media.MediaExtractor}
for extracting and demuxing your content.</p>
-<p>You must first construct {@link android.media.MediaExtractor} and
-{@link android.media.MediaCodec} objects. You can then access the DRM-scheme-identifying
-{@link java.util.UUID}, typically from metadata in the content, and use it to construct an
+<p>You must first construct {@link android.media.MediaExtractor} and
+{@link android.media.MediaCodec} objects. You can then access the DRM-scheme-identifying
+{@link java.util.UUID}, typically from metadata in the content, and use it to construct a
{@link android.media.MediaDrm} instance.</p>
<h3 id="EncodingSurface">Video encoding from a Surface</h3>
-<p>Android 4.1 (API level 16) added the {@link android.media.MediaCodec} class for low-level
-encoding and decoding of media content. When encoding video, Android 4.1 required that you provide
-the media with a {@link java.nio.ByteBuffer} array, but Android 4.3 now allows you to use a {@link
-android.view.Surface} as the input to an encoder. For instance, this allows you to encode input
+<p>Android 4.1 (API level 16) added the {@link android.media.MediaCodec} class for low-level
+encoding and decoding of media content. When encoding video, Android 4.1 required that you provide
+the media with a {@link java.nio.ByteBuffer} array, but Android 4.3 now allows you to use a {@link
+android.view.Surface} as the input to an encoder. For instance, this allows you to encode input
from an existing video file or from frames generated in OpenGL ES.</p>
-<p>To use a {@link android.view.Surface} as the input to your encoder, first call {@link
-android.media.MediaCodec#configure configure()} for your {@link android.media.MediaCodec}.
-Then call {@link android.media.MediaCodec#createInputSurface()} to receive the {@link
+<p>To use a {@link android.view.Surface} as the input to your encoder, first call {@link
+android.media.MediaCodec#configure configure()} for your {@link android.media.MediaCodec}.
+Then call {@link android.media.MediaCodec#createInputSurface()} to receive the {@link
android.view.Surface} upon which you can stream your media.</p>
-<p>For example, you can use the given {@link android.view.Surface} as the window for an OpenGL
-context by passing it to {@link android.opengl.EGL14#eglCreateWindowSurface
-eglCreateWindowSurface()}. Then while rendering the surface, call {@link
-android.opengl.EGL14#eglSwapBuffers eglSwapBuffers()} to pass the frame to the {@link
+<p>For example, you can use the given {@link android.view.Surface} as the window for an OpenGL
+context by passing it to {@link android.opengl.EGL14#eglCreateWindowSurface
+eglCreateWindowSurface()}. Then while rendering the surface, call {@link
+android.opengl.EGL14#eglSwapBuffers eglSwapBuffers()} to pass the frame to the {@link
android.media.MediaCodec}.</p>
-<p>To begin encoding, call {@link android.media.MediaCodec#start()} on the {@link
-android.media.MediaCodec}. When done, call {@link android.media.MediaCodec#signalEndOfInputStream}
-to terminate encoding, and call {@link android.view.Surface#release()} on the
+<p>To begin encoding, call {@link android.media.MediaCodec#start()} on the {@link
+android.media.MediaCodec}. When done, call {@link android.media.MediaCodec#signalEndOfInputStream}
+to terminate encoding, and call {@link android.view.Surface#release()} on the
{@link android.view.Surface}.</p>
<h3 id="MediaMuxing">Media muxing</h3>
-<p>The new {@link android.media.MediaMuxer} class enables multiplexing between one audio stream
-and one video stream. These APIs serve as a counterpart to the {@link android.media.MediaExtractor}
+<p>The new {@link android.media.MediaMuxer} class enables multiplexing of one audio stream
+and one video stream. These APIs serve as a counterpart to the {@link android.media.MediaExtractor}
class added in Android 4.1 (API level 16) for de-multiplexing (demuxing) media.</p>
-<p>Supported output formats are defined in {@link android.media.MediaMuxer.OutputFormat}. Currently,
-MP4 is the only supported output format and {@link android.media.MediaMuxer} currently supports
+<p>Supported output formats are defined in {@link android.media.MediaMuxer.OutputFormat}. Currently,
+MP4 is the only supported output format, and {@link android.media.MediaMuxer} supports
only one audio stream and/or one video stream at a time.</p>
-<p>{@link android.media.MediaMuxer} is mostly designed to work with {@link android.media.MediaCodec}
-so you can perform video processing through {@link android.media.MediaCodec} then save the
-output to an MP4 file through {@link android.media.MediaMuxer}. You can also use {@link
-android.media.MediaMuxer} in combination with {@link android.media.MediaExtractor} to perform
+<p>{@link android.media.MediaMuxer} is mostly designed to work with {@link android.media.MediaCodec}
+so you can perform video processing through {@link android.media.MediaCodec} and then save the
+output to an MP4 file through {@link android.media.MediaMuxer}. You can also use {@link
+android.media.MediaMuxer} in combination with {@link android.media.MediaExtractor} to perform
media editing without the need to encode or decode.</p>
<h3 id="ProgressAndScrubbing">Playback progress and scrubbing for RemoteControlClient</h3>
-<p>In Android 4.0 (API level 14), the {@link android.media.RemoteControlClient} was added to
-enable media playback controls from remote control clients such as the controls available on the
-lock screen. Android 4.3 now provides the ability for such controllers to display the playback
-position and controls for scrubbing the playback. If you've enabled remote control for your
-media app with the {@link android.media.RemoteControlClient} APIs, then you can allow playback
+<p>In Android 4.0 (API level 14), the {@link android.media.RemoteControlClient} was added to
+enable media playback controls from remote control clients such as the controls available on the
+lock screen. Android 4.3 now provides the ability for such controllers to display the playback
+position and controls for scrubbing the playback. If you've enabled remote control for your
+media app with the {@link android.media.RemoteControlClient} APIs, then you can allow playback
scrubbing by implementing two new interfaces.</p>
-<p>First, you must enable the {@link
-android.media.RemoteControlClient#FLAG_KEY_MEDIA_POSITION_UPDATE} flag by passing it to
-{@link android.media.RemoteControlClient#setTransportControlFlags setTransportControlsFlags()}.</p>
+<p>First, you must enable the {@link
+android.media.RemoteControlClient#FLAG_KEY_MEDIA_POSITION_UPDATE} flag by passing it to
+{@link android.media.RemoteControlClient#setTransportControlFlags setTransportControlFlags()}.</p>
<p>Then implement the following two new interfaces:</p>
<dl>
<dt>{@link android.media.RemoteControlClient.OnGetPlaybackPositionListener}</dt>
- <dd>This includes the callback {@link android.media.RemoteControlClient.OnGetPlaybackPositionListener#onGetPlaybackPosition}, which requests the current position
+  <dd>This includes the callback {@link android.media.RemoteControlClient.OnGetPlaybackPositionListener#onGetPlaybackPosition onGetPlaybackPosition()}, which requests the current position
of your media when the remote control needs to update the progress in its UI.</dd>
<dt>{@link android.media.RemoteControlClient.OnPlaybackPositionUpdateListener}</dt>
- <dd>This includes the callback {@link android.media.RemoteControlClient.OnPlaybackPositionUpdateListener#onPlaybackPositionUpdate onPlaybackPositionUpdate()}, which
- tells your app the new time code for your media when the user scrubs the playback with the
+ <dd>This includes the callback {@link android.media.RemoteControlClient.OnPlaybackPositionUpdateListener#onPlaybackPositionUpdate onPlaybackPositionUpdate()}, which
+ tells your app the new time code for your media when the user scrubs the playback with the
remote control UI.
- <p>Once you update your playback with the new position, call {@link
- android.media.RemoteControlClient#setPlaybackState setPlaybackState()} to indicate the
+ <p>Once you update your playback with the new position, call {@link
+ android.media.RemoteControlClient#setPlaybackState setPlaybackState()} to indicate the
new playback state, position, and speed.</p>
</dd>
</dl>
-<p>With these interfaces defined, you can set them for your {@link
-android.media.RemoteControlClient} by calling {@link android.media.RemoteControlClient#setOnGetPlaybackPositionListener setOnGetPlaybackPositionListener()} and
-{@link android.media.RemoteControlClient#setPlaybackPositionUpdateListener
+<p>With these interfaces implemented, you can set them for your {@link
+android.media.RemoteControlClient} by calling {@link android.media.RemoteControlClient#setOnGetPlaybackPositionListener setOnGetPlaybackPositionListener()} and
+{@link android.media.RemoteControlClient#setPlaybackPositionUpdateListener
setPlaybackPositionUpdateListener()}, respectively.</p>
@@ -560,7 +564,7 @@ setPlaybackPositionUpdateListener()}, respectively.</p>
<h3 id="OpenGL">Support for OpenGL ES 3.0</h3>
-<p>Android 4.3 adds Java interfaces and native support for OpenGL ES 3.0. Key new functionality
+<p>Android 4.3 adds Java interfaces and native support for OpenGL ES 3.0. Key new functionality
provided in OpenGL ES 3.0 includes:</p>
<ul>
<li>Acceleration of advanced visual effects</li>
@@ -570,8 +574,8 @@ provided in OpenGL ES 3.0 includes:</p>
<li>Broader standardization of texture size and render-buffer formats</li>
</ul>
-<p>The Java interface for OpenGL ES 3.0 on Android is provided with {@link android.opengl.GLES30}.
-When using OpenGL ES 3.0, be sure that you declare it in your manifest file with the
+<p>The Java interface for OpenGL ES 3.0 on Android is provided with {@link android.opengl.GLES30}.
+When using OpenGL ES 3.0, be sure that you declare it in your manifest file with the
<a href="{@docRoot}guide/topics/manifest/uses-feature-element.html">&lt;uses-feature></a>
tag and the {@code android:glEsVersion} attribute. For example:</p>
<pre>
@@ -581,21 +585,27 @@ tag and the {@code android:glEsVersion} attribute. For example:</p>
&lt;/manifest>
</pre>
-<p>And remember to specify the OpenGL ES context by calling {@link android.opengl.GLSurfaceView#setEGLContextClientVersion setEGLContextClientVersion()}, passing {@code 3} as the version.</p>
+<p>Also remember to specify the OpenGL ES context by calling {@link
+android.opengl.GLSurfaceView#setEGLContextClientVersion setEGLContextClientVersion()},
+passing {@code 3} as the version.</p>
+
+<p>For more information about using OpenGL ES, including how to check the device's supported
+OpenGL ES version at runtime, see the <a href="{@docRoot}guide/topics/graphics/opengl.html"
+>OpenGL ES</a> API guide.</p>
<h3 id="MipMap">Mipmapping for drawables</h3>
-<p>Using a mipmap as the source for your bitmap or drawable is a simple way to provide a
-quality image and various image scales, which can be particularly useful if you expect your
+<p>Using a mipmap as the source for your bitmap or drawable is a simple way to provide a
+quality image at various scales, which can be particularly useful if you expect your
image to be scaled during an animation.</p>
-<p>Android 4.2 (API level 17) added support for mipmaps in the {@link android.graphics.Bitmap}
-class&mdash;Android swaps the mip images in your {@link android.graphics.Bitmap} when you've
-supplied a mipmap source and have enabled {@link android.graphics.Bitmap#setHasMipMap
-setHasMipMap()}. Now in Android 4.3, you can enable mipmaps for a {@link
-android.graphics.drawable.BitmapDrawable} object as well, by providing a mipmap asset and
-setting the {@code android:mipMap} attribute in a bitmap resource file or by calling {@link
+<p>Android 4.2 (API level 17) added support for mipmaps in the {@link android.graphics.Bitmap}
+class&mdash;Android swaps the mip images in your {@link android.graphics.Bitmap} when you've
+supplied a mipmap source and enabled mipmaps with {@link android.graphics.Bitmap#setHasMipMap
+setHasMipMap()}. Now in Android 4.3, you can enable mipmaps for a {@link
+android.graphics.drawable.BitmapDrawable} object as well, by providing a mipmap asset and
+setting the {@code android:mipMap} attribute in a bitmap resource file or by calling {@link
android.graphics.drawable.BitmapDrawable#setMipMap setMipMap()}.
</p>
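<p>A minimal bitmap resource using the {@code android:mipMap} attribute might look like this (the file and drawable names are hypothetical):</p>

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/drawable/photo_mipmap.xml; "photo" is a placeholder drawable -->
<bitmap xmlns:android="http://schemas.android.com/apk/res/android"
    android:src="@drawable/photo"
    android:mipMap="true" />
```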
@@ -605,36 +615,36 @@ android.graphics.drawable.BitmapDrawable#hasMipMap hasMipMap()}.
<h3 id="ViewOverlay">View overlays</h3>
-<p>The new {@link android.view.ViewOverlay} class provides a transparent layer on top of
-a {@link android.view.View} on which you can add visual content and which does not affect
-the layout hierarchy. You can get a {@link android.view.ViewOverlay} for any {@link
-android.view.View} by calling {@link android.view.View#getOverlay}. The overlay
-always has the same size and position as its host view (the view from which it was created),
-allowing you to add content that appears in front of the host view, but which cannot extend
+<p>The new {@link android.view.ViewOverlay} class provides a transparent layer on top of
+a {@link android.view.View} to which you can add visual content without affecting
+the layout hierarchy. You can get a {@link android.view.ViewOverlay} for any {@link
+android.view.View} by calling {@link android.view.View#getOverlay}. The overlay
+always has the same size and position as its host view (the view from which it was created),
+allowing you to add content that appears in front of the host view but cannot extend
beyond the bounds of that host view.
</p>
-<p>Using a {@link android.view.ViewOverlay} is particularly useful when you want to create
-animations such as sliding a view outside of its container or moving items around the screen
-without affecting the view hierarchy. However, because the usable area of an overlay is
-restricted to the same area as its host view, if you want to animate a view moving outside
-its position in the layout, you must use an overlay from a parent view that has the desired
+<p>Using a {@link android.view.ViewOverlay} is particularly useful when you want to create
+animations such as sliding a view outside of its container or moving items around the screen
+without affecting the view hierarchy. However, because the usable area of an overlay is
+restricted to the same area as its host view, if you want to animate a view moving outside
+its position in the layout, you must use an overlay from a parent view that has the desired
layout bounds.</p>
-<p>When you create an overlay for a widget view such as a {@link android.widget.Button}, you
-can add {@link android.graphics.drawable.Drawable} objects to the overlay by calling
-{@link android.view.ViewOverlay#add(Drawable)}. If you call {@link
+<p>When you create an overlay for a widget view such as a {@link android.widget.Button}, you
+can add {@link android.graphics.drawable.Drawable} objects to the overlay by calling
+{@link android.view.ViewOverlay#add(Drawable)}. If you call {@link
android.view.ViewGroup#getOverlay} for a layout view, such as {@link android.widget.RelativeLayout},
the object returned is a {@link android.view.ViewGroupOverlay}. The
-{@link android.view.ViewGroupOverlay} class is a subclass
-of {@link android.view.ViewOverlay} that also allows you to add {@link android.view.View}
+{@link android.view.ViewGroupOverlay} class is a subclass
+of {@link android.view.ViewOverlay} that also allows you to add {@link android.view.View}
objects by calling {@link android.view.ViewGroupOverlay#add(View)}.
</p>
-<p class="note"><strong>Note:</strong> All drawables and views that you add to an overlay
+<p class="note"><strong>Note:</strong> All drawables and views that you add to an overlay
are visual only. They cannot receive focus or input events.</p>
-<p>For example, the following code animates a view sliding to the right by placing the view
+<p>For example, the following code animates a view sliding to the right by placing the view
in the parent view's overlay, then performing a translation animation on that view:</p>
<pre>
View view = findViewById(R.id.view_to_remove);
@@ -647,17 +657,17 @@ anim.start();
<h3 id="OpticalBounds">Optical bounds layout</h3>
-<p>For views that contain nine-patch background images, you can now specify that they should
-be aligned with neighboring views based on the "optical" bounds of the background image rather
+<p>For views that contain nine-patch background images, you can now specify that they should
+be aligned with neighboring views based on the "optical" bounds of the background image rather
than the "clip" bounds of the view.</p>
-<p>For example, figures 1 and 2 each show the same layout, but the version in figure 1 is
-using clip bounds (the default behavior), while figure 2 is using optical bounds. Because the
-nine-patch images used for the button and the photo frame include padding around the edges,
+<p>For example, figures 1 and 2 each show the same layout, but the version in figure 1 is
+using clip bounds (the default behavior), while figure 2 is using optical bounds. Because the
+nine-patch images used for the button and the photo frame include padding around the edges,
they don’t appear to align with each other or the text when using clip bounds.</p>
-<p class="note"><strong>Note:</strong> The screenshot in figures 1 and 2 have the "Show
-layout bounds" developer setting enabled. For each view, red lines indicate the optical
+<p class="note"><strong>Note:</strong> The screenshots in figures 1 and 2 have the "Show
+layout bounds" developer setting enabled. For each view, red lines indicate the optical
bounds, blue lines indicate the clip bounds, and pink indicates margins.</p>
<script type="text/javascript">
@@ -725,30 +735,30 @@ optical bounds.
</p>
</div>
-<p>For this to work, the nine-patch images applied to the background of your views must specify
-the optical bounds using red lines along the bottom and right-side of the nine-patch file (as
-shown in figure 3). The red lines indicate the region that should be subtracted from
+<p>For this to work, the nine-patch images applied to the background of your views must specify
+the optical bounds using red lines along the bottom and right-side of the nine-patch file (as
+shown in figure 3). The red lines indicate the region that should be subtracted from
the clip bounds, leaving the optical bounds of the image.</p>
-<p>When you enable optical bounds for a {@link android.view.ViewGroup} in your layout, all
-descendant views inherit the optical bounds layout mode unless you override it for a group by
-setting {@code android:layoutMode} to {@code "clipBounds"}. All layout elements also honor the
-optical bounds of their child views, adapting their own bounds based on the optical bounds of
-the views within them. However, layout elements (subclasses of {@link android.view.ViewGroup})
+<p>When you enable optical bounds for a {@link android.view.ViewGroup} in your layout, all
+descendant views inherit the optical bounds layout mode unless you override it for a group by
+setting {@code android:layoutMode} to {@code "clipBounds"}. All layout elements also honor the
+optical bounds of their child views, adapting their own bounds based on the optical bounds of
+the views within them. However, layout elements (subclasses of {@link android.view.ViewGroup})
currently do not support optical bounds for nine-patch images applied to their own background.</p>
<p>If you create a custom view by subclassing {@link android.view.View}, {@link android.view.ViewGroup}, or any subclasses thereof, your view will inherit these optical bound behaviors.</p>
<p class="note"><strong>Note:</strong> All widgets supported by the Holo theme have been updated
-with optical bounds, including {@link android.widget.Button}, {@link android.widget.Spinner},
+with optical bounds, including {@link android.widget.Button}, {@link android.widget.Spinner},
{@link android.widget.EditText}, and others. So you can immediately benefit by setting the
-{@code android:layoutMode} attribute to {@code "opticalBounds"} if your app applies a Holo theme
-({@link android.R.style#Theme_Holo Theme.Holo}, {@link android.R.style#Theme_Holo_Light
+{@code android:layoutMode} attribute to {@code "opticalBounds"} if your app applies a Holo theme
+({@link android.R.style#Theme_Holo Theme.Holo}, {@link android.R.style#Theme_Holo_Light
Theme.Holo.Light}, etc.).
</p>
-<p>To specify optical bounds for your own nine-patch images with the <a
-href="{@docRoot}tools/help/draw9patch.html">Draw 9-patch</a> tool, hold CTRL when clicking on
+<p>To specify optical bounds for your own nine-patch images with the <a
+href="{@docRoot}tools/help/draw9patch.html">Draw 9-patch</a> tool, hold CTRL when clicking on
the border pixels.</p>
@@ -756,59 +766,59 @@ the border pixels.</p>
<h3 id="AnimationRect">Animation for Rect values</h3>
-<p>You can now animate between two {@link android.graphics.Rect} values with the new {@link
-android.animation.RectEvaluator}. This new class is an implementation of {@link
-android.animation.TypeEvaluator} that you can pass to {@link
+<p>You can now animate between two {@link android.graphics.Rect} values with the new {@link
+android.animation.RectEvaluator}. This new class is an implementation of {@link
+android.animation.TypeEvaluator} that you can pass to {@link
android.animation.ValueAnimator#setEvaluator ValueAnimator.setEvaluator()}.
</p>
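<p>The evaluator's behavior can be sketched without the Android framework; the minimal {@code Rect} stand-in below is hypothetical, but the per-edge linear interpolation mirrors what a type evaluator for rectangles computes between a start and end value:</p>

```java
public class RectLerp {
    // Minimal stand-in for android.graphics.Rect (left, top, right, bottom).
    static final class Rect {
        final int left, top, right, bottom;
        Rect(int l, int t, int r, int b) { left = l; top = t; right = r; bottom = b; }
    }

    // Sketch of a TypeEvaluator for Rect values: linearly interpolate
    // each edge between the start and end rectangles for a given fraction.
    static Rect evaluate(float fraction, Rect start, Rect end) {
        return new Rect(
                start.left   + (int) (fraction * (end.left   - start.left)),
                start.top    + (int) (fraction * (end.top    - start.top)),
                start.right  + (int) (fraction * (end.right  - start.right)),
                start.bottom + (int) (fraction * (end.bottom - start.bottom)));
    }

    public static void main(String[] args) {
        // Halfway between (0,0,100,100) and (100,50,200,300).
        Rect mid = evaluate(0.5f, new Rect(0, 0, 100, 100), new Rect(100, 50, 200, 300));
        System.out.println(mid.left + "," + mid.top + "," + mid.right + "," + mid.bottom);
    }
}
```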
<h3 id="AttachFocus">Window attach and focus listener</h3>
-<p>Previously, if you wanted to listen for when your view attached/detached to the window or
-when its focus changed, you needed to override the {@link android.view.View} class to
-implement {@link android.view.View#onAttachedToWindow onAttachedToWindow()} and {@link
-android.view.View#onDetachedFromWindow onDetachedFromWindow()}, or {@link
+<p>Previously, if you wanted to listen for when your view was attached to or detached from the window or
+when its focus changed, you needed to override the {@link android.view.View} class to
+implement {@link android.view.View#onAttachedToWindow onAttachedToWindow()} and {@link
+android.view.View#onDetachedFromWindow onDetachedFromWindow()}, or {@link
android.view.View#onWindowFocusChanged onWindowFocusChanged()}, respectively.
</p>
-<p>Now, to receive attach and detach events you can instead implement {@link
-android.view.ViewTreeObserver.OnWindowAttachListener} and set it on a view with
-{@link android.view.ViewTreeObserver#addOnWindowAttachListener addOnWindowAttachListener()}.
-And to receive focus events, you can implement {@link
-android.view.ViewTreeObserver.OnWindowFocusChangeListener} and set it on a view with
-{@link android.view.ViewTreeObserver#addOnWindowFocusChangeListener
+<p>Now, to receive attach and detach events you can instead implement {@link
+android.view.ViewTreeObserver.OnWindowAttachListener} and set it on a view with
+{@link android.view.ViewTreeObserver#addOnWindowAttachListener addOnWindowAttachListener()}.
+And to receive focus events, you can implement {@link
+android.view.ViewTreeObserver.OnWindowFocusChangeListener} and set it on a view with
+{@link android.view.ViewTreeObserver#addOnWindowFocusChangeListener
addOnWindowFocusChangeListener()}.
</p>
<h3 id="Overscan">TV overscan support</h3>
-<p>To be sure your app fills the entire screen on every television, you can now enable overscan
-for you app layout. Overscan mode is determined by the {@link android.view.WindowManager.LayoutParams#FLAG_LAYOUT_IN_OVERSCAN} flag, which you can enable with platform themes such as
-{@link android.R.style#Theme_DeviceDefault_NoActionBar_Overscan} or by enabling the
+<p>To be sure your app fills the entire screen on every television, you can now enable overscan
+for your app layout. Overscan mode is determined by the {@link android.view.WindowManager.LayoutParams#FLAG_LAYOUT_IN_OVERSCAN} flag, which you can enable with platform themes such as
+{@link android.R.style#Theme_DeviceDefault_NoActionBar_Overscan} or by enabling the
{@link android.R.attr#windowOverscan} style in a custom theme.</p>
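<p>A custom theme enabling the attribute might look like this (the theme name and parent are one possible choice):</p>

```xml
<!-- res/values/styles.xml; "OverscanTheme" is a placeholder name -->
<style name="OverscanTheme" parent="@android:style/Theme.DeviceDefault.NoActionBar">
    <item name="android:windowOverscan">true</item>
</style>
```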
<h3 id="Orientation">Screen orientation</h3>
-<p>The <a
+<p>The <a
href="{@docRoot}guide/topics/manifest/activity-element.html">{@code &lt;activity>}</a>
-tag's <a
+tag's <a
href="{@docRoot}guide/topics/manifest/activity-element.html#screen">{@code screenOrientation}</a>
attribute now supports additional values to honor the user's preference for auto-rotation:</p>
<dl>
<dt>{@code "userLandscape"}</dt>
-<dd>Behaves the same as {@code "sensorLandscape"}, except if the user disables auto-rotate
+<dd>Behaves the same as {@code "sensorLandscape"}, except that if the user disables auto-rotate,
it locks in the normal landscape orientation and will not flip.
</dd>
<dt>{@code "userPortrait"}</dt>
-<dd>Behaves the same as {@code "sensorPortrait"}, except if the user disables auto-rotate then
+<dd>Behaves the same as {@code "sensorPortrait"}, except that if the user disables auto-rotate,
it locks in the normal portrait orientation and will not flip.
</dd>
<dt>{@code "fullUser"}</dt>
-<dd>Behaves the same as {@code "fullSensor"} and allows rotation in all four directions, except
+<dd>Behaves the same as {@code "fullSensor"} and allows rotation in all four directions, except
that if the user disables auto-rotate, it locks in the user's preferred orientation.
</dd></dl>
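<p>For example, a hypothetical video activity that prefers landscape but honors the user's auto-rotate setting could declare:</p>

```xml
<!-- AndroidManifest.xml; the activity name is a placeholder -->
<activity android:name=".VideoPlayerActivity"
          android:screenOrientation="userLandscape" />
```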
@@ -818,8 +828,8 @@ the screen's current orientation.</p>
<h3 id="RotationAnimation">Rotation animations</h3>
-<p>The new {@link android.view.WindowManager.LayoutParams#rotationAnimation} field in
-{@link android.view.WindowManager} allows you to select between one of three animations you
+<p>The new {@link android.view.WindowManager.LayoutParams#rotationAnimation} field in
+{@link android.view.WindowManager} allows you to select between one of three animations you
want to use when the system switches screen orientations. The three animations are:</p>
<ul>
<li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_CROSSFADE}</li>
@@ -844,25 +854,17 @@ protected void onCreate(Bundle savedInstanceState) {
<h2 id="UserInput">User Input</h2>
-<h3 id="SignificantMotion">Detect significant motion</h3>
-
-<p>The {@link android.hardware.SensorManager} APIs now allow you to request a callback when the
-device sensors detect "significant motion." For instance, this event may be triggered by new
-motion such as when the user starts to walk.</p>
-
-<p>To register a listener for significant motion, extend the {@link android.hardware.TriggerEventListener} class and implement the {@link android.hardware.TriggerEventListener#onTrigger onTrigger()} callback method. Then register your event listener with the {@link android.hardware.SensorManager} by passing it to {@link android.hardware.SensorManager#requestTriggerSensor requestTriggerSensor()}, passing it your {@link android.hardware.TriggerEventListener} and {@link android.hardware.Sensor#TYPE_SIGNIFICANT_MOTION}.</p>
-
<h3 id="Sensors">New sensor types</h3>
<p>The new {@link android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} sensor allows you to detect the device's rotations without worrying about magnetic interference. Unlike the {@link android.hardware.Sensor#TYPE_ROTATION_VECTOR} sensor, the {@link android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} is not based on magnetic north.</p>
-<p>The new {@link android.hardware.Sensor#TYPE_GYROSCOPE_UNCALIBRATED} and {@link
-android.hardware.Sensor#TYPE_MAGNETIC_FIELD_UNCALIBRATED} sensors provide raw sensor data without
-consideration for bias estimations. That is, the existing {@link
-android.hardware.Sensor#TYPE_GYROSCOPE} and {@link android.hardware.Sensor#TYPE_MAGNETIC_FIELD}
-sensors provide sensor data that takes into account estimated bias from gyro-drift and hard iron
-in the device, respectively. Whereas the new "uncalibrated" versions of these sensors instead provide
-the raw sensor data and offer the estimated bias values separately. These sensors allow you to
-provide your own custom calibration for the sensor data by enhancing the estimated bias with
+<p>The new {@link android.hardware.Sensor#TYPE_GYROSCOPE_UNCALIBRATED} and {@link
+android.hardware.Sensor#TYPE_MAGNETIC_FIELD_UNCALIBRATED} sensors provide raw sensor data without
+consideration for bias estimations. That is, the existing {@link
+android.hardware.Sensor#TYPE_GYROSCOPE} and {@link android.hardware.Sensor#TYPE_MAGNETIC_FIELD}
+sensors provide sensor data that takes into account estimated bias from gyro-drift and hard iron
+in the device, respectively. The new "uncalibrated" versions of these sensors instead provide
+the raw sensor data and offer the estimated bias values separately. These sensors allow you to
+provide your own custom calibration for the sensor data by enhancing the estimated bias with
external data.</p>
@@ -891,13 +893,13 @@ external data.</p>
<p>To track which contacts have been deleted, the new table {@link android.provider.ContactsContract.DeletedContacts} provides a log of contacts that have been deleted (but each contact deleted is held in this table for a limited time). Similar to {@link android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP}, you can use the new selection parameter, {@link android.provider.ContactsContract.DeletedContacts#CONTACT_DELETED_TIMESTAMP} to check which contacts have been deleted since the last time you queried the provider. The table also contains the constant {@link android.provider.ContactsContract.DeletedContacts#DAYS_KEPT_MILLISECONDS} containing the number of days (in milliseconds) that the log will be kept.</p>
-<p>Additionally, the Contacts Provider now broadcasts the {@link
-android.provider.ContactsContract.Intents#CONTACTS_DATABASE_CREATED} action when the user
-clears the contacts storage through the system settings menu, effectively recreating the
-Contacts Provider database. It’s intended to signal apps that they need to drop all the contact
+<p>Additionally, the Contacts Provider now broadcasts the {@link
+android.provider.ContactsContract.Intents#CONTACTS_DATABASE_CREATED} action when the user
+clears the contacts storage through the system settings menu, effectively recreating the
+Contacts Provider database. It’s intended to signal apps that they need to drop all the contact
information they’ve stored and reload it with a new query.</p>
-<p>For sample code using these APIs to check for changes to the contacts, look in the ApiDemos
+<p>For sample code using these APIs to check for changes to the contacts, look in the ApiDemos
sample available in the <a href="{@docRoot}tools/samples/index.html">SDK Samples</a> download.</p>
@@ -905,13 +907,13 @@ sample available in the <a href="{@docRoot}tools/samples/index.html">SDK Samples
<h3 id="BiDi">Improved support for bi-directional text</h3>
-<p>Previous versions of Android support right-to-left (RTL) languages and layout,
-but sometimes don't properly handle mixed-direction text. So Android 4.3 adds the {@link
-android.text.BidiFormatter} APIs that help you properly format text with opposite-direction
+<p>Previous versions of Android support right-to-left (RTL) languages and layout,
+but sometimes don't properly handle mixed-direction text. Android 4.3 therefore adds the {@link
+android.text.BidiFormatter} APIs that help you properly format text with opposite-direction
content without garbling any parts of it.</p>
-<p>For example, when you want to create a sentence with a string variable, such as "Did you mean
-15 Bay Street, Laurel, CA?", you normally pass a localized string resource and the variable to
+<p>For example, when you want to create a sentence with a string variable, such as "Did you mean
+15 Bay Street, Laurel, CA?", you normally pass a localized string resource and the variable to
{@link java.lang.String#format String.format()}:</p>
<pre>
Resources res = getResources();
@@ -922,8 +924,8 @@ String suggestion = String.format(res.getString(R.string.did_you_mean), address)
<p dir="rtl">האם התכוונת ל 15 Bay Street, Laurel, CA?</p>
-<p>That's wrong because the "15" should be left of "Bay Street." The solution is to use {@link
-android.text.BidiFormatter} and its {@link android.text.BidiFormatter#unicodeWrap(String)
+<p>That's wrong because the "15" should be left of "Bay Street." The solution is to use {@link
+android.text.BidiFormatter} and its {@link android.text.BidiFormatter#unicodeWrap(String)
unicodeWrap()} method. For example, the code above becomes:</p>
<pre>
Resources res = getResources();
@@ -933,15 +935,15 @@ String suggestion = String.format(res.getString(R.string.did_you_mean),
</pre>
<p>
-By default, {@link android.text.BidiFormatter#unicodeWrap(String) unicodeWrap()} uses the
-first-strong directionality estimation heuristic, which can get things wrong if the first
-signal for text direction does not represent the appropriate direction for the content as a whole.
-If necessary, you can specify a different heuristic by passing one of the {@link
-android.text.TextDirectionHeuristic} constants from {@link android.text.TextDirectionHeuristics}
+By default, {@link android.text.BidiFormatter#unicodeWrap(String) unicodeWrap()} uses the
+first-strong directionality estimation heuristic, which can get things wrong if the first
+signal for text direction does not represent the appropriate direction for the content as a whole.
+If necessary, you can specify a different heuristic by passing one of the {@link
+android.text.TextDirectionHeuristic} constants from {@link android.text.TextDirectionHeuristics}
to {@link android.text.BidiFormatter#unicodeWrap(String,TextDirectionHeuristic) unicodeWrap()}.</p>
<p class="note"><strong>Note:</strong> These new APIs are also available for previous versions
-of Android through the Android <a href="{@docRoot}tools/extras/support-library.html">Support
+of Android through the Android <a href="{@docRoot}tools/support-library/index.html">Support
Library</a>, with the {@link android.support.v4.text.BidiFormatter} class and related APIs.</p>
@@ -950,36 +952,36 @@ Library</a>, with the {@link android.support.v4.text.BidiFormatter} class and re
<h3 id="A11yKeyEvents">Handle key events</h3>
-<p>An {@link android.accessibilityservice.AccessibilityService} can now receive a callback for
-key input events with the {@link android.accessibilityservice.AccessibilityService#onKeyEvent
-onKeyEvent()} callback method. This allows your accessibility service to handle input for
-key-based input devices such as a keyboard and translate those events to special actions that
+<p>An {@link android.accessibilityservice.AccessibilityService} can now receive a callback for
+key input events with the {@link android.accessibilityservice.AccessibilityService#onKeyEvent
+onKeyEvent()} callback method. This allows your accessibility service to handle input for
+key-based input devices such as a keyboard and translate those events to special actions that
previously may have been possible only with touch input or the device's directional pad.</p>
<h3 id="A11yText">Select text and copy/paste</h3>
-<p>The {@link android.view.accessibility.AccessibilityNodeInfo} now provides APIs that allow
-an {@link android.accessibilityservice.AccessibilityService} to select, cut, copy, and paste
+<p>The {@link android.view.accessibility.AccessibilityNodeInfo} now provides APIs that allow
+an {@link android.accessibilityservice.AccessibilityService} to select, cut, copy, and paste
text in a node.</p>
-<p>To specify the selection of text to cut or copy, your accessibility service can use the new
-action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_SET_SELECTION}, passing
-with it the selection start and end position with {@link
-android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_START_INT} and {@link
-android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_END_INT}.
-Alternatively you can select text by manipulating the cursor position using the existing
-action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_NEXT_AT_MOVEMENT_GRANULARITY}
-(previously only for moving the cursor position), and adding the argument {@link
+<p>To specify the selection of text to cut or copy, your accessibility service can use the new
+action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_SET_SELECTION}, passing
+with it the selection start and end position with {@link
+android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_START_INT} and {@link
+android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_END_INT}.
+Alternatively you can select text by manipulating the cursor position using the existing
+action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_NEXT_AT_MOVEMENT_GRANULARITY}
+(previously only for moving the cursor position), and adding the argument {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_EXTEND_SELECTION_BOOLEAN}.</p>
-<p>You can then cut or copy with {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_CUT},
-{@link android.view.accessibility.AccessibilityNodeInfo#ACTION_COPY}, then later paste with
+<p>You can then cut or copy with {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_CUT},
+{@link android.view.accessibility.AccessibilityNodeInfo#ACTION_COPY}, then later paste with
{@link android.view.accessibility.AccessibilityNodeInfo#ACTION_PASTE}.</p>
<p class="note"><strong>Note:</strong> These new APIs are also available for previous versions
-of Android through the Android <a href="{@docRoot}tools/extras/support-library.html">Support
+of Android through the Android <a href="{@docRoot}tools/support-library/index.html">Support
Library</a>, with the {@link android.support.v4.view.accessibility.AccessibilityNodeInfoCompat}
class.</p>
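A minimal sketch of the sequence described above, assuming `node` and `target` are `AccessibilityNodeInfo` instances your service has already retrieved:

```java
// Select characters 0..5 in a text node, copy them, then paste elsewhere.
Bundle args = new Bundle();
args.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_SELECTION_START_INT, 0);
args.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_SELECTION_END_INT, 5);
node.performAction(AccessibilityNodeInfo.ACTION_SET_SELECTION, args);
node.performAction(AccessibilityNodeInfo.ACTION_COPY);
// Later, paste into another editable node:
target.performAction(AccessibilityNodeInfo.ACTION_PASTE);
```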
@@ -987,14 +989,14 @@ class.</p>
<h3 id="A11yFeatures">Declare accessibility features</h3>
-<p>Beginning with Android 4.3, an accessibility service must declare accessibility capabilities
-in its metadata file in order to use certain accessibility features. If the capability is not
-requested in the metadata file, then the feature will be a no-op. To declare your service's
-accessibility capabilities, you must use XML attributes that correspond to the various
-"capability" constants in the {@link android.accessibilityservice.AccessibilityServiceInfo}
+<p>Beginning with Android 4.3, an accessibility service must declare accessibility capabilities
+in its metadata file in order to use certain accessibility features. If the capability is not
+requested in the metadata file, then the feature will be a no-op. To declare your service's
+accessibility capabilities, you must use XML attributes that correspond to the various
+"capability" constants in the {@link android.accessibilityservice.AccessibilityServiceInfo}
class.</p>
-<p>For example, if a service does not request the {@link android.R.styleable#AccessibilityService_canRequestFilterKeyEvents flagRequestFilterKeyEvents} capability,
+<p>For example, if a service does not request the {@link android.R.styleable#AccessibilityService_canRequestFilterKeyEvents flagRequestFilterKeyEvents} capability,
then it will not receive key events.</p>
@@ -1002,36 +1004,36 @@ then it will not receive key events.</p>
<h3 id="UiAutomation">Automated UI testing</h3>
-<p>The new {@link android.app.UiAutomation} class provides APIs that allow you to simulate user
-actions for test automation. By using the platform's {@link
-android.accessibilityservice.AccessibilityService} APIs, the {@link android.app.UiAutomation}
+<p>The new {@link android.app.UiAutomation} class provides APIs that allow you to simulate user
+actions for test automation. By using the platform's {@link
+android.accessibilityservice.AccessibilityService} APIs, the {@link android.app.UiAutomation}
APIs allow you to inspect the screen content and inject arbitrary keyboard and touch events.</p>
-<p>To get an instance of {@link android.app.UiAutomation}, call {@link
-android.app.Instrumentation#getUiAutomation Instrumentation.getUiAutomation()}. In order
-for this to work, you must supply the {@code -w} option with the {@code instrument} command
-when running your {@link android.test.InstrumentationTestCase} from <a
+<p>To get an instance of {@link android.app.UiAutomation}, call {@link
+android.app.Instrumentation#getUiAutomation Instrumentation.getUiAutomation()}. In order
+for this to work, you must supply the {@code -w} option with the {@code instrument} command
+when running your {@link android.test.InstrumentationTestCase} from <a
href="{@docRoot}tools/help/adb.html#am">{@code adb shell}</a>.</p>
-<p>With the {@link android.app.UiAutomation} instance, you can execute arbitrary events to test
-your app by calling {@link android.app.UiAutomation#executeAndWaitForEvent
-executeAndWaitForEvent()}, passing it a {@link java.lang.Runnable} to perform, a timeout
-period for the operation, and an implementation of the {@link
-android.app.UiAutomation.AccessibilityEventFilter} interface. It's within your {@link
-android.app.UiAutomation.AccessibilityEventFilter} implementation that you'll receive a call
-that allows you to filter the events that you're interested in and determine the success or
+<p>With the {@link android.app.UiAutomation} instance, you can execute arbitrary events to test
+your app by calling {@link android.app.UiAutomation#executeAndWaitForEvent
+executeAndWaitForEvent()}, passing it a {@link java.lang.Runnable} to perform, a timeout
+period for the operation, and an implementation of the {@link
+android.app.UiAutomation.AccessibilityEventFilter} interface. It's within your {@link
+android.app.UiAutomation.AccessibilityEventFilter} implementation that you'll receive a call
+that allows you to filter the events that you're interested in and determine the success or
failure of a given test case.</p>
-<p>To observe all the events during a test, create an implementation of {@link
-android.app.UiAutomation.OnAccessibilityEventListener} and pass it to {@link
-android.app.UiAutomation#setOnAccessibilityEventListener setOnAccessibilityEventListener()}.
-Your listener interface then receives a call to {@link
-android.app.UiAutomation.OnAccessibilityEventListener#onAccessibilityEvent onAccessibilityEvent()}
-each time an event occurs, receiving an {@link android.view.accessibility.AccessibilityEvent} object
+<p>To observe all the events during a test, create an implementation of {@link
+android.app.UiAutomation.OnAccessibilityEventListener} and pass it to {@link
+android.app.UiAutomation#setOnAccessibilityEventListener setOnAccessibilityEventListener()}.
+Your listener interface then receives a call to {@link
+android.app.UiAutomation.OnAccessibilityEventListener#onAccessibilityEvent onAccessibilityEvent()}
+each time an event occurs, receiving an {@link android.view.accessibility.AccessibilityEvent} object
that describes the event.</p>
-<p>There is a variety of other operations that the {@link android.app.UiAutomation} APIs expose
-at a very low level to encourage the development of UI test tools such as <a href="{@docRoot}tools/help/uiautomator/index.html">uiautomator</a>. For instance,
+<p>There is a variety of other operations that the {@link android.app.UiAutomation} APIs expose
+at a very low level to encourage the development of UI test tools such as <a href="{@docRoot}tools/help/uiautomator/index.html">uiautomator</a>. For instance,
{@link android.app.UiAutomation} can also:</p>
<ul>
<li>Inject input events
@@ -1039,16 +1041,16 @@ at a very low level to encourage the development of UI test tools such as <a hre
<li>Take screenshots
</ul>
-<p>And most importantly for UI test tools, the {@link android.app.UiAutomation} APIs work
+<p>And most importantly for UI test tools, the {@link android.app.UiAutomation} APIs work
across application boundaries, unlike those in {@link android.app.Instrumentation}.</p>
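For example, a sketch from inside an `InstrumentationTestCase` test method (the runnable body and the event type you wait for depend on your test):

```java
UiAutomation automation = getInstrumentation().getUiAutomation();
AccessibilityEvent event = automation.executeAndWaitForEvent(
        new Runnable() {
            @Override public void run() {
                // Perform the UI action under test here.
            }
        },
        new UiAutomation.AccessibilityEventFilter() {
            @Override public boolean accept(AccessibilityEvent e) {
                return e.getEventType()
                        == AccessibilityEvent.TYPE_WINDOW_CONTENT_CHANGED;
            }
        },
        5000 /* timeout in milliseconds */);
```

Note that `executeAndWaitForEvent()` throws a `TimeoutException` if no matching event arrives within the timeout, which you can treat as a test failure.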
<h3 id="Systrace">Systrace events for apps</h3>
-<p>Android 4.3 adds the {@link android.os.Trace} class with two static methods,
-{@link android.os.Trace#beginSection beginSection()} and {@link android.os.Trace#endSection()},
-which allow you to define blocks of code to include with the systrace report. By creating
-sections of traceable code in your app, the systrace logs provide you a much more detailed
+<p>Android 4.3 adds the {@link android.os.Trace} class with two static methods,
+{@link android.os.Trace#beginSection beginSection()} and {@link android.os.Trace#endSection()},
+which allow you to define blocks of code to include with the systrace report. By creating
+sections of traceable code in your app, the systrace logs provide you a much more detailed
analysis of where slowdown occurs within your app.</p>
<p>For information about using the Systrace tool, read <a href="{@docRoot}tools/debugging/systrace.html">Analyzing Display and Performance with Systrace</a>.</p>
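For example, a sketch that wraps an expensive operation (the section name is arbitrary):

```java
Trace.beginSection("decodeThumbnails");
try {
    // Expensive work to profile; appears as a named slice in the trace.
} finally {
    Trace.endSection(); // Always pair with beginSection(), even on exceptions.
}
```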
@@ -1058,31 +1060,31 @@ analysis of where slowdown occurs within your app.</p>
<h3 id="KeyStore">Android key store for app-private keys</h3>
-<p>Android now offers a custom Java Security Provider in the {@link java.security.KeyStore}
-facility, called Android Key Store, which allows you to generate and save private keys that
-may be seen and used by only your app. To load the Android Key Store, pass
-{@code "AndroidKeyStore"} to {@link java.security.KeyStore#getInstance(String)
+<p>Android now offers a custom Java Security Provider in the {@link java.security.KeyStore}
+facility, called Android Key Store, which allows you to generate and save private keys that
+may be seen and used by only your app. To load the Android Key Store, pass
+{@code "AndroidKeyStore"} to {@link java.security.KeyStore#getInstance(String)
KeyStore.getInstance()}.</p>
-<p>To manage your app's private credentials in the Android Key Store, generate a new key with
-{@link java.security.KeyPairGenerator} with {@link android.security.KeyPairGeneratorSpec}. First
-get an instance of {@link java.security.KeyPairGenerator} by calling {@link
-java.security.KeyPairGenerator#getInstance getInstance()}. Then call
-{@link java.security.KeyPairGenerator#initialize initialize()}, passing it an instance of
-{@link android.security.KeyPairGeneratorSpec}, which you can get using
-{@link android.security.KeyPairGeneratorSpec.Builder KeyPairGeneratorSpec.Builder}.
-Finally, get your {@link java.security.KeyPair} by calling {@link
+<p>To manage your app's private credentials in the Android Key Store, generate a new key with
+{@link java.security.KeyPairGenerator} with {@link android.security.KeyPairGeneratorSpec}. First
+get an instance of {@link java.security.KeyPairGenerator} by calling {@link
+java.security.KeyPairGenerator#getInstance getInstance()}. Then call
+{@link java.security.KeyPairGenerator#initialize initialize()}, passing it an instance of
+{@link android.security.KeyPairGeneratorSpec}, which you can get using
+{@link android.security.KeyPairGeneratorSpec.Builder KeyPairGeneratorSpec.Builder}.
+Finally, get your {@link java.security.KeyPair} by calling {@link
java.security.KeyPairGenerator#generateKeyPair generateKeyPair()}.</p>
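Putting those steps together (a sketch; the alias, subject, and validity window are placeholder values, and `context` is your app's `Context`):

```java
Calendar start = Calendar.getInstance();
Calendar end = Calendar.getInstance();
end.add(Calendar.YEAR, 1);

KeyPairGeneratorSpec spec = new KeyPairGeneratorSpec.Builder(context)
        .setAlias("myKey")
        .setSubject(new X500Principal("CN=myKey"))
        .setSerialNumber(BigInteger.ONE)
        .setStartDate(start.getTime())
        .setEndDate(end.getTime())
        .build();

KeyPairGenerator generator =
        KeyPairGenerator.getInstance("RSA", "AndroidKeyStore");
generator.initialize(spec);
KeyPair pair = generator.generateKeyPair(); // Private key never leaves the store.
```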
<h3 id="HardwareKeyChain">Hardware credential storage</h3>
-<p>Android also now supports hardware-backed storage for your {@link android.security.KeyChain}
-credentials, providing more security by making the keys unavailable for extraction. That is, once
-keys are in a hardware-backed key store (Secure Element, TPM, or TrustZone), they can be used for
-cryptographic operations but the private key material cannot be exported. Even the OS kernel
-cannot access this key material. While not all Android-powered devices support storage on
-hardware, you can check at runtime if hardware-backed storage is available by calling
+<p>Android also now supports hardware-backed storage for your {@link android.security.KeyChain}
+credentials, providing more security by making the keys unavailable for extraction. That is, once
+keys are in a hardware-backed key store (Secure Element, TPM, or TrustZone), they can be used for
+cryptographic operations but the private key material cannot be exported. Even the OS kernel
+cannot access this key material. While not all Android-powered devices support storage on
+hardware, you can check at runtime if hardware-backed storage is available by calling
{@link android.security.KeyChain#isBoundKeyAlgorithm KeyChain.isBoundKeyAlgorithm()}.</p>
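For example:

```java
// Returns true only when keys of this algorithm are bound to secure hardware.
if (KeyChain.isBoundKeyAlgorithm("RSA")) {
    // Private key material cannot be extracted, even by the OS kernel.
}
```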
@@ -1091,9 +1093,9 @@ hardware, you can check at runtime if hardware-backed storage is available by ca
<h3 id="ManifestFeatures">Declarable required features</h3>
-<p>The following values are now supported in the <a
+<p>The following values are now supported in the <a
href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature>}</a>
-element so you can ensure that your app is installed only on devices that provide the features
+element so you can ensure that your app is installed only on devices that provide the features
your app needs.</p>
<dl>
@@ -1137,8 +1139,8 @@ Example:
<h3 id="ManifestPermissions">User permissions</h3>
-<p>The following values are now supported in the <a
-href="{@docRoot}guide/topics/manifest/uses-permission-element.html">{@code &lt;uses-permission>}</a>
+<p>The following values are now supported in the <a
+href="{@docRoot}guide/topics/manifest/uses-permission-element.html">{@code &lt;uses-permission>}</a>
element to declare the
permissions your app requires in order to access certain APIs.</p>
diff --git a/docs/html/about/versions/jelly-bean.jd b/docs/html/about/versions/jelly-bean.jd
index 5812f3d..503a95b 100644
--- a/docs/html/about/versions/jelly-bean.jd
+++ b/docs/html/about/versions/jelly-bean.jd
@@ -1,8 +1,10 @@
page.title=Jelly Bean
-tab1=Android 4.2
-tab1.link=#android-42
-tab2=Android 4.1
-tab2.link=#android-41
+tab1=Android 4.3
+tab1.link=#android-43
+tab2=Android 4.2
+tab2.link=#android-42
+tab3=Android 4.1
+tab3.link=#android-41
@jd:body
<div id="butterbar-wrapper" >
@@ -16,6 +18,7 @@ tab2.link=#android-41
<style>
#android-41 {display:none;}
+#android-42 {display:none;}
</style>
<script>
@@ -60,6 +63,574 @@ window.onhashchange = function () {
</script>
+<!-- BEGIN ANDROID 4.3 -->
+<div id="android-43" class="version-section">
+
+<div style="float:right;padding:0px 0px 10px 28px;width:480px;">
+<div>
+<a href="{@docRoot}images/jb-android-43@2x.png"><img src="{@docRoot}images/jb-android-43.jpg" alt="Android 4.3 on phone and tablet" width="472"></a>
+
+</div>
+</div>
+<p>Welcome to Android 4.3, a sweeter version of <span
+style="white-space:nowrap;">Jelly Bean!</span></p>
+
+<p>Android 4.3 includes performance optimizations and great
+new features for users and developers. This document provides a glimpse of what's new for
+developers.</p>
+
+<p>See the <a href="{@docRoot}about/versions/android-4.3.html">Android 4.3 APIs</a>
+document for a detailed look at the new developer APIs.</p>
+
+<p>Find out more about the new Jelly Bean features for users at <a
+href="http://www.android.com/whatsnew">www.android.com</a>.</p>
+
+
+<h2 id="43-performance" style="line-height:1.25em;">Faster, Smoother, More
+Responsive</h2>
+
+<p>Android 4.3 builds on the performance improvements already included in Jelly
+Bean &mdash; <strong>vsync timing</strong>, <strong>triple buffering</strong>,
+<strong>reduced touch latency</strong>, <strong>CPU input boost</strong>, and
+<strong>hardware-accelerated 2D rendering</strong> &mdash; and adds new
+optimizations that make Android even faster.</p>
+
+<p>For a graphics performance boost, the hardware-accelerated 2D renderer now
+<strong>optimizes the stream of drawing commands</strong>, transforming it into
+a more efficient GPU format by rearranging and merging draw operations. The
+renderer can also now use <strong>multithreading across multiple CPU
+cores</strong> to perform certain tasks.</p>
+
+<p>Android 4.3 also improves <strong>rendering for shapes and text</strong>.
+Shapes such as circles and rounded rectangles are now rendered at higher quality
+in a more efficient manner. Optimizations for text include increased performance
+when using multiple fonts or complex glyph sets (CJK), higher rendering quality
+when scaling text, and faster rendering of drop shadows.</p>
+
+<p><strong>Improved window buffer allocation</strong> results in a faster image
+buffer allocation for your apps, reducing the time taken to start rendering when
+you create a window.</p>
+
+<p>For highest-performance graphics, Android 4.3 introduces support for
+<strong>OpenGL ES 3.0</strong> and makes it accessible to apps through both
+framework and native APIs. On supported devices, the hardware-accelerated 2D
+rendering engine takes advantage of OpenGL ES 3.0 to optimize <strong>texture
+management</strong> and increase <strong>gradient rendering
+fidelity</strong>.</p>
+
+
+<h2 id="43-graphics">OpenGL ES 3.0 for High-Performance Graphics</h2>
+
+<p>Android 4.3 introduces platform support for <a class="external-link"
+href="http://www.khronos.org/opengles/3_X/" target="_android">Khronos OpenGL ES 3.0</a>,
+providing games and other apps with highest-performance 2D and 3D graphics
+capabilities on supported devices. You can take advantage of OpenGL ES 3.0
+and related EGL extensions using either <strong>framework APIs</strong>
+or <strong>native API bindings</strong> through the Android Native Development
+Kit (NDK).</p>
+
+<p>Key new functionality provided in OpenGL ES 3.0 includes acceleration of
+advanced visual effects, high quality ETC2/EAC texture compression as a standard
+feature, a new version of the GLSL ES shading language with integer and 32-bit
+floating point support, advanced texture rendering, and standardized texture
+size and render-buffer formats.</p>
+
+<p>You can use the OpenGL ES 3.0 APIs to create highly complex, highly efficient
+graphics that run across a range of compatible Android devices, and you can
+support a single, standard texture-compression format across those devices.</p>
+
+<p>OpenGL ES 3.0 is an optional feature that depends on underlying graphics
+hardware. Support is already available on Nexus 7 (2013), Nexus 4, and
+Nexus 10 devices.</p>
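Because support is optional, a common pattern (sketched here, assuming the code runs inside an Activity) is to check the device's reported GLES version at runtime and fall back to ES 2.0:

```java
ActivityManager am =
        (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
boolean supportsEs3 =
        am.getDeviceConfigurationInfo().reqGlEsVersion >= 0x30000;

GLSurfaceView view = new GLSurfaceView(this);
view.setEGLContextClientVersion(supportsEs3 ? 3 : 2);
```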
+
+
+<h2 id="43-bluetooth" style="clear:both;">Enhanced Bluetooth Connectivity</h2>
+
+<h4 id="43-bt-le">Connectivity with Bluetooth Smart devices and sensors</h4>
+
+<p>Now you can design and build apps that interact with the latest generation
+of small, low-power devices and sensors that use <a
+href="http://www.bluetooth.com/Pages/Bluetooth-Smart-Devices.aspx"
+class="external-link" target="_android">Bluetooth Smart technology</a>. </p>
+
+<div style="float:right;margin:0px 0px 32px 0px;width:460px;">
+<img src="{@docRoot}images/jb-btle.png" alt="" width="450" style="padding-left:1.5em;margin-bottom:0">
+<p class="img-caption" style="padding-top:1.5em;line-height:1.25em;margin-bottom:0;padding-left:1.5em;">Android 4.3 gives you a single, standard API for interacting with Bluetooth Smart devices. </p>
+</div>
+
+<p>Android 4.3 introduces built-in platform support for <strong>Bluetooth Smart
+Ready</strong> in the central role and provides a standard set of APIs that
+apps can use to discover nearby devices, query for GATT services, and read/write
+characteristics.</p>
+
+<p>With the new APIs, your apps can efficiently scan for devices and services of
+interest. For each device, you can check for supported GATT services by UUID and
+manage connections by device ID and signal strength. You can connect to a GATT
+server hosted on the device and read or write characteristics, or register a
+listener to receive notifications whenever those characteristics change.</p>
+
+<p>You can implement support for any GATT profile. You can read or write
+standard characteristics or add support for custom characteristics as needed.
+Your app can function as either client or server and can transmit and receive
+data in either mode. The APIs are generic, so you’ll be able to support
+interactions with a variety of devices such as proximity tags, watches, fitness
+meters, game controllers, remote controls, health devices, and more.
+</p>
+
+<p>Support for Bluetooth Smart Ready is already available on Nexus 7 (2013)
+and Nexus 4 devices and will be supported in a growing number of
+Android-compatible devices in the months ahead.</p>
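A sketch of the client flow (assumes `device` is a `BluetoothDevice` found during a scan, and `context` is your app's `Context`):

```java
BluetoothGatt gatt = device.connectGatt(context, false,
        new BluetoothGattCallback() {
            @Override
            public void onConnectionStateChange(BluetoothGatt g, int status,
                    int newState) {
                if (newState == BluetoothProfile.STATE_CONNECTED) {
                    g.discoverServices(); // Query the device's GATT services.
                }
            }
            @Override
            public void onServicesDiscovered(BluetoothGatt g, int status) {
                for (BluetoothGattService s : g.getServices()) {
                    Log.d("BLE", "Found service: " + s.getUuid());
                }
            }
        });
```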
+
+<h4 id="43-bt-avrcp">AVRCP 1.3 Profile</h4>
+
+<p>Android 4.3 adds built-in support for <strong>Bluetooth AVRCP 1.3</strong>,
+so your apps can support richer interactions with remote streaming media
+devices. Apps such as media players can take advantage of AVRCP 1.3 through the
+<strong>remote control client APIs</strong> introduced in Android 4.0. In
+addition to exposing playback controls on the remote devices connected over
+Bluetooth, apps can now transmit metadata such as track name, composer, and
+other types of media metadata. </p>
+
+<p>Platform support for AVRCP 1.3 is built on the Bluedroid Bluetooth stack
+introduced by Google and Broadcom in Android 4.2. Support is available right
+away on Nexus devices and other Android-compatible devices that offer A2DP/AVRCP
+capability. </p>
+
+
+<h2 id="43-profiles">Support for Restricted Profiles</h2>
+
+<div style="float:right;margin:22px 0px 0px 24px;width:340px;">
+<img src="{@docRoot}images/jb-profiles-create-n713.png" alt="Setting up a Restricted Profile" width="340" style="margin-bottom:0">
+<p class="img-caption" style="padding-top:1.5em;line-height:1.25em;margin-bottom:0;">A tablet owner can set up one or more restricted profiles in Settings and manage them independently. </p>
+<img src="{@docRoot}images/jb-profiles-restrictions-n713.png" alt="Setting Restrictions in a Profile" width="340" style="margin-bottom:0;padding-top:1em;">
+<p class="img-caption" style="padding-top:1.5em;line-height:1.25em;">Your app can offer restrictions to let owners manage your app content when it's running in a profile. </p>
+</div>
+
+<p>Android 4.3 extends the multiuser feature for tablets with <strong>restricted
+profiles</strong>, a new way to manage users and their capabilities on a single
+device. With restricted profiles, tablet owners can quickly set up
+<strong>separate environments</strong> for each user, with the ability to
+manage <strong>finer-grained restrictions</strong> in the apps that are
+available in those environments. Restricted profiles are ideal for friends and
+family, guest users, kiosks, point-of-sale devices, and more. </p>
+
+<p>Each restricted profile offers an isolated and secure space with its own
+local storage, home screens, widgets, and settings. Unlike with
+users, profiles are created from the tablet owner’s environment, based on the
+owner’s installed apps and system accounts. The owner controls which installed
+apps are enabled in the new profile, and access to the owner’s accounts is
+disabled by default. </p>
+
+<p>Apps that need to access the owner’s accounts &mdash; for sign-in,
+preferences, or other uses &mdash; can opt in by declaring a manifest attribute,
+and the owner can review and manage those apps from the profile configuration
+settings.</p>
+
+<p>For developers, restricted profiles offer a new way to deliver more value and
+control to your users. You can implement <strong>app restrictions</strong>
+&mdash; content or capabilities controls that are supported by your app &mdash;
+and advertise them to tablet owners in the profile configuration settings.
+</p>
+
+<p>You can add app restrictions directly to the profile configuration settings
+using predefined boolean, select, and multi-select types. If you want more
+flexibility, you can even launch your own UI from profile configuration settings
+to offer any type of restriction you want. </p>
+
+<p>When your app runs in a profile, it can check for any restrictions configured
+by the owner and enforce them appropriately. For example, a media app
+might offer a restriction to let the owner set a maturity level for the profile.
+At run time, the app could check for the maturity setting and then manage
+content according to the preferred maturity level. </p>
+
+<p>If your app is not designed for use in restricted profiles, you can opt
+out altogether, so that your app can't be enabled in any restricted profile.</p>
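For example, a sketch of the maturity-level scenario above (the restriction key "maturity_level" is hypothetical and would be defined by your app):

```java
UserManager um = (UserManager) getSystemService(Context.USER_SERVICE);
// Returns an empty Bundle when not running in a restricted profile.
Bundle restrictions = um.getApplicationRestrictions(getPackageName());
int maturity = restrictions.getInt("maturity_level", Integer.MAX_VALUE);
// Filter the content you display against the configured maturity level.
```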
+
+
+<h2 id="43-optimized-location">Optimized Location and Sensor Capabilities</h2>
+
+<p><a href="{@docRoot}google/play-services/index.html">Google Play services</a>
+offers advanced location APIs that you can use in your apps. Android 4.3
+<strong>optimizes these APIs</strong> on supported devices with new hardware and
+software capabilities that minimize use of the battery. </p>
+
+
+<div style="float:left;margin:22px 24px 36px 22px;width:250px;">
+<a href=""><img src="{@docRoot}images/google/gps-location.png" alt="" height="160" style="padding-right:1.5em;margin-bottom:0"></a>
+</div>
+
+<p><strong>Hardware geofencing</strong> optimizes for power efficiency by
+performing location computation in the device hardware, rather than in
+software. On devices that support hardware geofencing, Google Play services
+geofence APIs will be able to take advantage of this optimization to save
+battery while the device is moving. </p>
+
+<p><strong>Wi-Fi scan-only mode</strong> is a new platform optimization that
+lets users keep Wi-Fi scanning on without connecting to a Wi-Fi network, to improve
+location accuracy while conserving battery. Apps that depend on Wi-Fi for
+location services can now ask users to enable scan-only mode from Wi-Fi
+advanced settings. Wi-Fi scan-only mode is not dependent on device hardware and
+is available as part of the Android 4.3 platform.</p>
+
+<p>New sensor types allow apps to better manage sensor readings. A <strong>game
+rotation vector</strong> lets game developers sense the device’s rotation
+without having to worry about magnetic interference. <strong>Uncalibrated
+gyroscope</strong> and <strong>uncalibrated magnetometer</strong> sensors report
+raw measurements as well as estimated biases to apps. </p>
+
+<p>The new hardware capabilities are already available on Nexus 7 (2013) and
+Nexus 4 devices, and any device manufacturer or chipset vendor can build them
+into their devices.</p>
+
+
+<h2 id="43-media">New Media Capabilities</h2>
+
+<h4 id="43-modular-drm">Modular DRM framework</h4>
+
+<p>To meet the needs of the next generation of media services, Android 4.3
+introduces a <strong>modular DRM framework</strong> that enables media application
+developers to more easily integrate DRM into their own streaming protocols, such
+as MPEG DASH (Dynamic Adaptive Streaming over HTTP, ISO/IEC 23009-1).</p>
+
+<p>Through a combination of new APIs and enhancements to existing APIs, the
+media DRM framework provides an <strong>integrated set of services</strong> for
+managing licensing and provisioning, accessing low-level codecs, and decoding
+encrypted media data. A new MediaExtractor API lets you get the PSSH metadata
+for DASH media. Apps using the media DRM framework manage the network
+communication with a license server and handle the streaming of encrypted data
+from a content library. </p>
+
+<h4 id="43-vp8-encoder">VP8 encoder</h4>
+
+<p>Android 4.3 introduces built-in support for <strong>VP8 encoding</strong>,
+accessible from framework and native APIs. For apps using native APIs, the
+platform includes <strong>OpenMAX 1.1.2 extension headers</strong> to support
+VP8 profiles and levels. VP8 encoding support includes settings for target
+bitrate, rate control, frame rate, token partitioning, error resilience,
+reconstruction and loop filters. The platform API introduces VP8 encoder support
+in a range of formats, so you can take advantage of the best format for your
+content. </p>
+
+<p>VP8 encoding is available in software on all compatible devices running
+Android 4.3. For highest performance, the platform also supports
+hardware-accelerated VP8 encoding on capable devices, such as Nexus 7 (2013),
+Nexus 4, and Nexus 10 devices.</p>
+
+<h4 id="43-surface">Video encoding from a surface</h4>
+
+<p>Starting in Android 4.3 you can use a surface as the input to a video
+encoder. For example, you can now direct a stream from an OpenGL ES surface
+to the encoder, rather than having to copy between buffers.</p>
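+
+<p>A minimal sketch of surface input to an encoder (the codec settings are
+illustrative values, not recommendations):</p>
+
+<pre class="prettyprint">
+// Sketch: configuring an H.264 encoder to take frames from a Surface.
+MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
+format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
+        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
+format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
+format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
+format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
+MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
+encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
+Surface inputSurface = encoder.createInputSurface();
+encoder.start();
+// Render OpenGL ES frames to inputSurface; the encoder consumes them
+// directly, with no buffer copies in your code.
+</pre>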
+
+<h4 id="43-media-muxer">Media muxer</h4>
+
+<p>Apps can use new media muxer APIs to combine elementary audio and video
+streams into a single output file. Currently apps can multiplex a single MPEG-4
+audio stream and a single MPEG-4 video stream into a <strong>single MPEG-4 output
+file</strong>. The new APIs are a counterpart to the media demuxing APIs
+introduced in Android 4.2. </p>
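+
+<p>For example, a sketch of the muxing flow (the output path is a placeholder,
+and the per-buffer write loop is elided):</p>
+
+<pre class="prettyprint">
+// Sketch: muxing encoder output into an MPEG-4 file. videoFormat and
+// audioFormat are the MediaFormats reported by your MediaCodec encoders.
+MediaMuxer muxer = new MediaMuxer("/sdcard/out.mp4",
+        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
+int videoTrack = muxer.addTrack(videoFormat);
+int audioTrack = muxer.addTrack(audioFormat);
+muxer.start();
+// For each encoded buffer:
+//     muxer.writeSampleData(track, encodedBuffer, bufferInfo);
+muxer.stop();
+muxer.release();
+</pre>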
+
+<h4 id="43-progress-scrubbing">Playback progress and scrubbing in remote control
+clients</h4>
+
+<p>Since Android 4.0, media players and similar applications have been able to
+offer playback controls from remote control clients such as the device lock
+screen, notifications, and remote devices connected over Bluetooth. Starting in
+Android 4.3, those applications can now also expose playback <strong>progress
+and speed</strong> through their remote control clients, and receive commands to
+jump to a specific <strong>playback position</strong>. </p>
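+
+<p>A sketch of how a player might wire this up (<code>rcc</code> is an
+existing {@link android.media.RemoteControlClient} and <code>player</code> a
+hypothetical media player):</p>
+
+<pre class="prettyprint">
+// Sketch: exposing playback position and speed, and handling remote
+// scrub requests.
+rcc.setPlaybackState(RemoteControlClient.PLAYSTATE_PLAYING,
+        player.getCurrentPosition(), 1.0f);
+rcc.setPlaybackPositionUpdateListener(
+        new RemoteControlClient.OnPlaybackPositionUpdateListener() {
+            @Override
+            public void onPlaybackPositionUpdate(long newPositionMs) {
+                player.seekTo((int) newPositionMs); // user scrubbed remotely
+            }
+        });
+</pre>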
+
+
+<h2 id="43-beautiful-apps">New Ways to Build Beautiful Apps</h2>
+
+
+<h3 id="43-notification-access">Access to notifications</h3>
+
+<p>Notifications have long been a popular Android feature because they let users
+see information and updates from across the system, all in one place. Now in
+Android 4.3, apps can <strong>observe the stream of notifications</strong> with the
+user's permission and display the notifications in any way they want, including
+sending them to nearby devices connected over Bluetooth. </p>
+
+<p>You can access notifications through new APIs that let you <strong>register a
+notification listener</strong> service and, with the user's permission, receive
+notifications as they are displayed in the status bar. Notifications are
+delivered to you in full, with all details on the originating app, the post
+time, the content view and style, and priority. You can evaluate fields of
+interest in the notifications, process or add context from your app, and route
+them for display in any way you choose.</p>
+
+<p>The new API gives you callbacks when a notification is added, updated, and
+removed (either because the user dismissed it or the originating app withdrew it).
+You'll be able to launch any intents attached to the notification or its actions,
+as well as dismiss it from the system, allowing your app to provide a complete
+user interface to notifications.</p>
+
+<p><strong>Users remain in control</strong> of which apps can receive
+notifications. At any time, they can look in Settings to see which apps have
+notification access and <strong>enable or disable access</strong> as needed.
+Notification access is disabled by default &mdash; apps can use a new Intent to
+take the user directly to the Settings to enable the listener service after
+installation.</p>
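+
+<p>A minimal listener might look like the following sketch (the class name is
+a placeholder; the service must also be declared in the manifest with the
+<code>BIND_NOTIFICATION_LISTENER_SERVICE</code> permission):</p>
+
+<pre class="prettyprint">
+// Sketch: a minimal notification listener service.
+public class MyListener extends NotificationListenerService {
+    @Override
+    public void onNotificationPosted(StatusBarNotification sbn) {
+        Log.d("MyListener", "Posted: " + sbn.getPackageName());
+    }
+
+    @Override
+    public void onNotificationRemoved(StatusBarNotification sbn) {
+        Log.d("MyListener", "Removed: " + sbn.getPackageName());
+    }
+}
+</pre>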
+
+<h4 id="43-view-overlays">View overlays</h4>
+
+<p>You can now create <strong>transparent overlays</strong> on top of Views and
+ViewGroups to render a temporary View hierarchy or transient animation effects
+without disturbing the underlying layout hierarchy. Overlays are particularly
+useful when you want to create animations such as sliding a view outside of its
+container or dragging items on the screen without affecting the view
+hierarchy. </p>
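+
+<p>For instance, a sketch of sliding a view out through its parent's overlay,
+leaving the layout untouched while the animation runs:</p>
+
+<pre class="prettyprint">
+// Sketch: animate a view in its parent's overlay rather than in the
+// layout hierarchy itself.
+void slideOut(final View view) {
+    final ViewGroup container = (ViewGroup) view.getParent();
+    container.getOverlay().add(view); // drawn on top, out of the layout
+    view.animate()
+            .translationX(container.getWidth())
+            .withEndAction(new Runnable() {
+                @Override
+                public void run() {
+                    container.getOverlay().remove(view);
+                }
+            });
+}
+</pre>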
+
+<h4 id="43-optical-bounds">Optical bounds layout mode</h4>
+
+<p>A new layout mode lets you manage the positioning of Views inside ViewGroups
+according to their <strong>optical bounds</strong>, rather than their clip
+bounds. Clip bounds represent a widget’s actual outer boundary, while the new
+optical bounds describe where the widget appears to be within the clip
+bounds. You can use the optical bounds layout mode to properly align widgets
+that use outer visual effects such as shadows and glows.</p>
+
+<h4 id="43-rotation-animation">Custom rotation animation types</h4>
+
+<p>Apps can now define the exit and entry animation types used on a window when the
+device is rotated. You can set window properties to enable
+<strong>jump-cut</strong>, <strong>cross-fade</strong>, or
+<strong>standard</strong> window rotation. The system uses the custom animation
+types when the window is fullscreen and is not covered by other windows.</p>
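+
+<p>A sketch of requesting a cross-fade rotation from inside an Activity:</p>
+
+<pre class="prettyprint">
+// Sketch: ask for a cross-fade animation when this fullscreen window
+// is rotated.
+WindowManager.LayoutParams lp = getWindow().getAttributes();
+lp.rotationAnimation = WindowManager.LayoutParams.ROTATION_ANIMATION_CROSSFADE;
+getWindow().setAttributes(lp);
+</pre>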
+
+<h4 id="43-screen-orientations">Screen orientation modes</h4>
+
+<p>Apps can set new orientation modes for Activities to ensure that they are
+displayed in the proper orientation when the device is flipped. Additionally,
+apps can use a new mode to <strong>lock the screen</strong> to its current
+orientation. This is useful for apps using the camera that want to
+<strong>disable rotation</strong> while shooting video. </p>
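+
+<p>For example, a camera Activity could lock and release the orientation
+around a recording session, as in this sketch:</p>
+
+<pre class="prettyprint">
+// Sketch: lock the screen to its current orientation while recording,
+// then release the lock afterwards.
+setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LOCKED);
+// ... record video ...
+setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_UNSPECIFIED);
+</pre>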
+
+<h4 id="43-quick-responses-intent">Intent for handling Quick Responses</h4>
+
+<p>Android 4.3 introduces a new public Intent that lets any app <strong>handle
+Quick Responses</strong> &mdash; text messages sent by the user in response to
+an incoming call, without needing to pick up the call or unlock the device. Your
+app can listen for the intent and send the message to the caller over your
+messaging system. The intent includes the recipient (caller) as well as the
+message itself. </p>
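+
+<p>A sketch of handling the intent inside a messaging app's service (the
+send step is a placeholder for your own messaging system):</p>
+
+<pre class="prettyprint">
+// Sketch: handling ACTION_RESPOND_VIA_MESSAGE in a Service subclass.
+@Override
+public int onStartCommand(Intent intent, int flags, int startId) {
+    if (TelephonyManager.ACTION_RESPOND_VIA_MESSAGE.equals(intent.getAction())) {
+        Uri recipient = intent.getData();                  // e.g. smsto: URI
+        String text = intent.getStringExtra(Intent.EXTRA_TEXT);
+        // Send "text" to "recipient" over your messaging system.
+    }
+    stopSelf(startId);
+    return START_NOT_STICKY;
+}
+</pre>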
+
+
+<h2 id="43-intl">Support for International Users</h2>
+
+<div style="float:right;margin:22px 0px 0px 24px;width:380px;">
+<img src="{@docRoot}images/jb-rtl-arabic-n4.png" alt="" width="180" style="margin-bottom:0;">
+<img src="{@docRoot}images/jb-rtl-hebrew-n4.png" alt="" width="180" style="margin-bottom:0;padding-left:10px;">
+<p class="img-caption" style="padding-top:1.5em;line-height:1.25em;">More parts of Android 4.3 are optimized for RTL languages.</p>
+</div>
+
+<h4 id="43-rtl">RTL improvements</h4>
+
+<p>Android 4.3 includes RTL performance enhancements and broader RTL support
+across framework UI widgets, including ProgressBar/Spinner and
+ExpandableListView. More debugging information is now visible through the
+<code>uiautomatorviewer</code> tool. In addition, more system UI components are
+now RTL aware, such as notifications, the navigation bar, and the Action Bar.</p>
+
+<p>To provide a better systemwide experience in RTL scripts, more default system
+apps now support RTL layouts, including Launcher, Quick Settings, Phone, People,
+SetupWizard, Clock, Downloads, and more.</p>
+
+<h4 id="43-localization">Utilities for localization</h4>
+
+<div style="float:right;margin:16px 12px 0px 32px;width:260px;clear:both;">
+<img src="{@docRoot}images/jb-pseudo-locale-zz.png" alt="" width="260" style="margin-bottom:0;">
+<p class="img-caption" style="padding-top:1.5em;line-height:1.25em;">Pseudo-locales make it easier to test your app's localization.</p>
+</div>
+
+<p>Android 4.3 also includes new utilities and APIs for creating better RTL
+strings and testing your localized UIs. A new <strong>BidiFormatter</strong>
+provides a set of simple APIs for wrapping Unicode strings so that you can
+fine-tune your text rendering in RTL scripts. To let you use this utility more
+broadly in your apps, the BidiFormatter APIs are also now available for earlier
+platform versions through the Support Package in the Android SDK. </p>
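+
+<p>As a sketch, wrapping an LTR user name so it renders correctly inside an
+RTL sentence (<code>R.string.welcome_message</code> and <code>userName</code>
+are hypothetical):</p>
+
+<pre class="prettyprint">
+// Sketch: wrap a possibly opposite-direction string before formatting
+// it into localized text.
+BidiFormatter bidi = BidiFormatter.getInstance();
+String title = String.format(getString(R.string.welcome_message),
+        bidi.unicodeWrap(userName));
+</pre>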
+
+<p>To assist you with managing date formatting across locales, Android 4.3
+includes a new <strong>getBestDateTimePattern()</strong> method that automatically
+generates the best possible localized form of a Unicode UTS date for a locale
+that you specify. It’s a convenient way to provide a more localized experience
+for your users. </p>
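+
+<p>A sketch of building a locale-appropriate date format from a skeleton of
+fields, rather than a hard-coded pattern:</p>
+
+<pre class="prettyprint">
+// Sketch: derive the best localized pattern for the fields you need.
+String skeleton = "MMMMdyyyy"; // desired fields, in any order
+String pattern = DateFormat.getBestDateTimePattern(
+        Locale.getDefault(), skeleton);
+SimpleDateFormat formatter =
+        new SimpleDateFormat(pattern, Locale.getDefault());
+String text = formatter.format(new Date());
+</pre>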
+
+<p>To help you test your app more easily in other locales, Android 4.3
+introduces <strong>pseudo-locales</strong> as a new developer option.
+Pseudo-locales simulate the language, script, and display characteristics
+associated with a locale or language group. Currently, you can test with a
+pseudo-locale for <strong>Accented English</strong>, which lets you see how your
+UI works with script accents and characters used in a variety of European
+languages. <!--To use the pseudo-locale, enable “Developer options” in Settings
+and then select Accented English from Language and Input settings. --></p>
+
+
+<h2 id="43-accessibility">Accessibility and UI Automation</h2>
+
+<p>Starting in Android 4.3, accessibility services can <strong>observe and
+filter key events</strong>, such as to handle keyboard shortcuts or provide
+navigation parity with gesture-based input. The service receives the events and
+can process them as needed before they are passed to the system or other
+installed apps.</p>
+
+<p>Accessibility services can declare <strong>new capability attributes</strong>
+to describe what their services can do and what platform features they use. For
+example, they can declare the capability to filter key events, retrieve window
+content, enable explore-by-touch, or enable web accessibility features. In some
+cases, services must declare a capability attribute before they can access
+related platform features. The system uses the service’s capability attributes
+to generate an opt-in dialog for users, so they can see and agree to the
+capabilities before launch.</p>
+
+<p>Building on the accessibility framework in Android 4.3, a new <strong>UI
+automation framework</strong> lets tests interact with the device’s UI by
+simulating user actions and introspecting the screen content. Through the UI
+automation framework you can perform basic operations, set rotation of the
+screen, generate input events, take screenshots, and much more. It’s a powerful
+way to automate testing in realistic user scenarios, including actions or
+sequences that span multiple apps.</p>
+
+
+<h2 id="43-enterprise-security">Enterprise and Security</h2>
+
+<h4 id="43-wpa2">Wi-Fi configuration for WPA2-Enterprise networks</h4>
+
+<p>Apps can now configure the <strong>Wi-Fi credentials</strong> they need for
+connections to <strong>WPA2 enterprise access points</strong>. Developers can
+use new APIs to configure Extensible Authentication Protocol (EAP) and
+Encapsulated EAP (Phase 2) credentials for authentication methods used in the
+enterprise. Apps with permission to access and change Wi-Fi can configure
+authentication credentials for a variety of EAP and Phase 2 authentication
+methods. </p>
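+
+<p>A sketch of configuring a PEAP/MSCHAPv2 network (the SSID and credentials
+are placeholders):</p>
+
+<pre class="prettyprint">
+// Sketch: configure and add a WPA2-Enterprise network.
+WifiEnterpriseConfig enterpriseConfig = new WifiEnterpriseConfig();
+enterpriseConfig.setEapMethod(WifiEnterpriseConfig.Eap.PEAP);
+enterpriseConfig.setPhase2Method(WifiEnterpriseConfig.Phase2.MSCHAPV2);
+enterpriseConfig.setIdentity("user@example.com");
+enterpriseConfig.setPassword("password");
+
+WifiConfiguration config = new WifiConfiguration();
+config.SSID = "\"CorpNet\"";
+config.allowedKeyManagement.set(WifiConfiguration.KeyMgmt.WPA_EAP);
+config.enterpriseConfig = enterpriseConfig;
+
+WifiManager wifi = (WifiManager) getSystemService(Context.WIFI_SERVICE);
+int netId = wifi.addNetwork(config);
+</pre>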
+
+<h4 id="43-selinux">Android sandbox reinforced with SELinux</h4>
+
+<p>Android now uses <strong>SELinux</strong>, a mandatory access control (MAC)
+system in the Linux kernel, to augment the UID-based application sandbox.
+This protects the operating system against potential security vulnerabilities.</p>
+
+<h4 id="43-keychain">KeyChain enhancements</h4>
+
+<p>The KeyChain API now provides a method that allows applications to confirm
+that system-wide keys are bound to a <strong>hardware root of trust</strong> for
+the device. This provides a place to create or store private keys that
+<strong>cannot be exported</strong> off the device, even in the event of a root or
+kernel compromise.</p>
+
+<h4 id="43-keystore">Android Keystore Provider</h4>
+
+<p>Android 4.3 introduces a keystore provider and APIs that allow applications
+to create exclusive-use keys. Using the APIs, apps can create or store private
+keys that <strong>cannot be seen or used by other apps</strong>, and can be
+added to the keystore without any user interaction. </p>
+
+<p>The keystore provider provides the same security benefits that the KeyChain
+API provides for system-wide credentials, such as binding credentials to a
+device. Private keys in the keystore cannot be exported off the device.</p>
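+
+<p>A sketch of generating an app-private key in the keystore (the alias is
+arbitrary, and <code>notBefore</code>/<code>notAfter</code> are placeholder
+validity dates):</p>
+
+<pre class="prettyprint">
+// Sketch: generate an RSA key pair that lives only in the Android
+// Keystore and is invisible to other apps.
+KeyPairGeneratorSpec spec = new KeyPairGeneratorSpec.Builder(context)
+        .setAlias("my_signing_key")
+        .setSubject(new X500Principal("CN=my_signing_key"))
+        .setSerialNumber(BigInteger.ONE)
+        .setStartDate(notBefore)
+        .setEndDate(notAfter)
+        .build();
+KeyPairGenerator generator =
+        KeyPairGenerator.getInstance("RSA", "AndroidKeyStore");
+generator.initialize(spec);
+KeyPair keyPair = generator.generateKeyPair();
+</pre>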
+
+<h4 id="43-seuid">Restrict Setuid from Android Apps</h4>
+
+<p>The <code>/system</code> partition is now mounted <code>nosuid</code> for
+zygote-spawned processes, preventing Android applications from executing
+<code>setuid</code> programs. This reduces root attack surface and likelihood of
+potential security vulnerabilities.</p>
+
+
+<h2 id="43-tools">New Ways to Analyze Performance</h2>
+
+<div style="float:right;margin:16px 6px 0px 32px;width:390px;">
+<img src="{@docRoot}images/jb-systrace.png" alt="" width="390" style="margin-bottom:0;">
+<p class="img-caption" style="padding-top:1.5em;line-height:1.25em;">Systrace uses a new command syntax and lets you collect more types of profiling data.</p>
+</div>
+
+<h4 id="43-systrace">Enhanced Systrace logging</h4>
+
+<p>Android 4.3 supports an enhanced version of the <strong>Systrace</strong>
+tool that’s easier to use and that gives you access to more types of information
+to profile the performance of your app. You can now collect trace data from
+<strong>hardware modules</strong>, <strong>kernel functions</strong>,
+the <strong>Dalvik VM</strong> (including garbage collection), <strong>resource
+loading</strong>, and more. </p>
+
+<p>Android 4.3 also includes new Trace APIs that you can use in your apps to mark
+specific sections of code to trace using Systrace <strong>begin/end
+events</strong>. When the marked sections of code execute, the system writes the
+begin/end events to the trace log. There's minimal impact on the performance of
+your app, so timings reported give you an accurate view of what your app is
+doing.</p>
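+
+<p>For example, a sketch of marking a section so it appears in the Systrace
+timeline (the section name and file path are placeholders):</p>
+
+<pre class="prettyprint">
+// Sketch: mark a code section with begin/end trace events.
+Trace.beginSection("decodeBitmap");
+try {
+    Bitmap bitmap = BitmapFactory.decodeFile(path);
+} finally {
+    Trace.endSection();
+}
+</pre>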
+
+<p>You can visualize app-specific events in a timeline in the Systrace output
+file and analyze the events in the context of other kernel and user space trace
+data. Together with existing Systrace tags, custom app sections can give you new
+ways to understand the performance and behavior of your apps.</p>
+
+<div style="float:right;margin:6px 0px 0px 32px;width:380px;">
+<img src="{@docRoot}images/jb-gpu-profile-clk-n4.png" alt="" width="180" style="margin-bottom:0;">
+<img src="{@docRoot}images/jb-gpu-profile-cal-n4.png" alt="" width="180" style="margin-bottom:0;padding-left:10px;">
+<p class="img-caption" style="padding-top:1.5em;line-height:1.25em;">On-screen GPU profiling in Android 4.3.</p>
+</div>
+
+<h4 id="43-gpu-profiling" >On-screen GPU profiling</h4>
+
+<p>Android 4.3 adds new developer options to help you analyze your app’s
+performance and pinpoint rendering issues on any device or emulator.</p>
+
+<p>In the <strong>Profile GPU rendering</strong> option you can now visualize
+your app’s effective framerate on-screen, while the app is running. You can
+choose to display profiling data as on-screen <strong>bar or line
+graphs</strong>, with colors indicating time spent creating drawing commands
+(blue), issuing the commands (orange), and waiting for the commands to complete
+(yellow). The system updates the on-screen graphs continuously, displaying a
+graph for each visible Activity, including the navigation bar and notification
+bar. </p>
+
+<p>A green line highlights the <strong>16ms threshold</strong> for rendering
+operations, so you can assess your app’s effective framerate relative
+to a 60 fps goal. If you see operations that cross the green line, you
+can analyze them further using Systrace and other tools.</p>
+
+<p class="caution" style="clear:both">On devices running Android 4.2 and higher,
+developer options are hidden by default. You can reveal them at any time by
+tapping 7 times on <strong>Settings &gt; About phone &gt; Build number</strong>
+on any compatible Android device.</p>
+
+<h4 id="43-strictmode">StrictMode warning for file URIs</h4>
+
+<p>The latest addition to the StrictMode tool is a policy constraint that warns
+when your app exposes a <code>file://</code> URI to the system or another app.
+In some cases the receiving app may not have access to the <code>file://</code>
+URI path, so when sharing files between apps, a <code>content://</code> URI should
+be used (with the appropriate permission). This new policy helps you catch and fix
+such cases. If you’re looking for a convenient way to store and expose files to other
+apps, try using the <code>FileProvider</code> content provider that’s available
+in the <a href="{@docRoot}tools/support-library/index.html">Support Library</a>.</p>
+
+</div><!-- END ANDROID 4.3 -->
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
<!-- BEGIN ANDROID 4.2 -->
<div id="android-42" class="version-section">
@@ -75,7 +646,7 @@ style="white-space:nowrap;">Jelly Bean!</span></p>
new features for users and developers. This document provides a glimpse of what's new for
developers.
-<p>See the <a href="/about/versions/android-4.2.html">Android 4.2 APIs</a>
+<p>See the <a href="{@docRoot}about/versions/android-4.2.html">Android 4.2 APIs</a>
document for a detailed look at the new developer APIs.</p>
<p>Find out more about the new Jelly Bean features for users at <a
@@ -158,7 +729,7 @@ in a single-user environment. </p>
<div>
<img src="{@docRoot}images/jb-lock-calendar.png" alt="Calendar lock screen widget" width="280" height="543" style="padding-left:1em;margin-bottom:0">
</div>
-<p class="image-caption" style="padding:1.5em">You can extend <strong>app widgets</strong> to run on the lock screen, for instant access to your content.</p>
+<p class="img-caption" style="padding-top:1.5em;line-height:1.25em;">You can extend <strong>app widgets</strong> to run on the lock screen, for instant access to your content.</p>
</div>
<h3 id="42-lockscreen-widgets">Lock screen widgets</h3>