| Commit message | Author | Age | Files | Lines |
|
|
|
| |
Change-Id: I278f796b3390e57f4309f215e4f37359a80f0e83
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
- New public APIs to find out when a user goes to the foreground,
background, and is first initializing.
- New activity manager callback to be involved in the user switch
process, allowing other services to let it know when it is safe
to stop freezing the screen.
- Wallpaper service now implements this to handle its user switch,
  telling the activity manager when it is done.  (Currently this
  only handles the old wallpaper going away; we need a little more
  work to correctly wait for the new wallpaper to be added.)
- Lock screen now implements the callback to do its user switch. It
also now locks itself when this happens, instead of relying on
some other entity making sure it is locked.
- Pre-boot broadcasts now go to all users.
- WallpaperManager now has an API to find out if a named wallpaper is
in use by any users.
Change-Id: I27877aef1d82126c0a1428c3d1861619ee5f8653
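A minimal sketch of how a component could observe the user-switch lifecycle described in the commit above, assuming the foreground/background notifications are exposed as the Intent.ACTION_USER_FOREGROUND / ACTION_USER_BACKGROUND broadcasts (the activity manager callback used to unfreeze the screen is internal and not shown):

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.util.Log;

// Sketch only: listens for the user foreground/background broadcasts mentioned above.
public class UserSwitchObserver extends BroadcastReceiver {
    private static final String TAG = "UserSwitchObserver";

    public static void register(Context context) {
        IntentFilter filter = new IntentFilter();
        filter.addAction(Intent.ACTION_USER_FOREGROUND);
        filter.addAction(Intent.ACTION_USER_BACKGROUND);
        context.registerReceiver(new UserSwitchObserver(), filter);
    }

    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_USER_FOREGROUND.equals(intent.getAction())) {
            Log.d(TAG, "Current user brought to the foreground");
        } else if (Intent.ACTION_USER_BACKGROUND.equals(intent.getAction())) {
            Log.d(TAG, "Current user sent to the background");
        }
    }
}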
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
This change is the initial check in of the screen magnification
feature. This feature enables magnification of the screen via
global gestures (assuming it has been enabled from settings)
to allow a low vision user to efficiently use an Android device.
Interaction model:
1. Triple tap toggles permanent screen magnification, magnifying the area
around the location of the triple tap. One can think of the location of
the triple tap as the center of the magnified viewport. For example, a
triple tap when not magnified magnifies the screen and leaves it in a
magnified state; a triple tap when magnified clears magnification and
leaves the screen in a non-magnified state.
2. Triple tap and hold magnifies the screen if it is not magnified and enables
viewport dragging mode until the finger goes up. One can think of this
mode as a way to move the magnified viewport, since the area around the
moving finger is magnified to fit the screen. For example, if the screen
is not magnified and the user triple taps and holds, the screen magnifies
and the viewport follows the user's finger; when the finger goes up the
screen zooms back out. If the same interaction is performed when the
screen is already magnified, the viewport movement is the same but the
screen stays magnified when the finger goes up.
In other words, the initial magnified state is sticky.
3. Pinching with any number of additional fingers while viewport dragging
is enabled, i.e. while the user triple taps and holds, adjusts the
magnification scale, which becomes the current default magnification
scale. The next time the user magnifies, the same magnification scale
is used.
4. When in a permanent magnified state the user can use two or more fingers
to pan the viewport. Note that in this mode the content is panned, as
opposed to the viewport dragging mode in which the viewport is moved.
5. When in a permanent magnified state the user can use three or more
fingers to change the magnification scale, which becomes the current
default magnification scale. The next time the user magnifies, the same
magnification scale is used.
6. The magnification scale is persisted in settings and in the cloud.
Note: Since two fingers are used to pan the content in a permanently magnified
state, no other two-finger gestures in touch exploration or applications
will work unless the user zooms out to the normal state, where all gestures
work as expected. This is an intentional tradeoff to allow efficient
panning, since in a permanently magnified state this is the dominant
action to be performed.
Design:
1. The window manager exposes APIs for setting an accessibility transformation,
which consists of a scale and offsets for the X and Y axes. The window manager
queries the window policy for which windows will not be magnified. For example,
the IME windows and the navigation bar are not magnified, nor are windows
that are attached to them.
2. Accessibility features such as screen magnification and touch
exploration are now implemented as a sequence of transformations on the
event stream. The accessibility manager service may request either
of these features or both. The behavior of each feature does not change
based on whether the other one is enabled.
3. The screen magnifier keeps a viewport of the content that is magnified,
which is surrounded by a glow while in a magnified state. Interactions outside
of the viewport are delegated directly to the application without
interpretation. For example, a triple tap on the letter 'a' of the IME
would type three letters instead of toggling magnified state. The viewport
is updated on screen rotation and on window transitions. For example,
when the IME pops up the viewport shrinks.
4. The glow around the viewport is implemented as a special type of window
that does not take input focus, cannot be touched, and is laid out in
screen coordinates with width and height matching those of the screen.
When the magnified region changes, the root view of the window draws the
highlight but the size of the window does not change - unless a rotation
happens. All changes in the viewport size, and showing or hiding it, are
animated.
5. The viewport is encapsulated in a class that knows how to show,
hide, and resize the viewport - potentially animating that.
This class uses the new animation framework for animations.
6. The magnification is handled by a magnification controller that
keeps track of the current transformation applied to the screen
content and the desired one. If these two are not the same, it is the
responsibility of the magnification controller to reconcile them by
potentially animating the transition from one to the other.
7. A display content observer watches for window transitions, screen
rotations, and requests to make a rectangle on the screen visible. This
class is responsible for handling interesting state changes such
as changing the viewport bounds on IME pop up or screen rotation,
panning the content to make a requested rectangle visible on the
screen, etc.
8. To implement viewport updates the window manager was updated with APIs
to watch for window transitions and for requests to make a rectangle visible
on the screen. These APIs are protected by a signature-level permission.
Also, a parcelable and poolable window info class has been added with
APIs for getting the window info given the window token. This enables
getting some useful information about a window. These APIs are also
signature-protected.
bug:6795382
Change-Id: Iec93da8bf6376beebbd4f5167ab7723dc7d9bd00
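The accessibility transformation in point 1 of the design is just a scale plus X/Y offsets. A rough sketch of that math, assuming the viewport is centered on the triple-tap location (class and method names are illustrative, not the actual window manager API):

// Illustrative only: scale-and-offset transform so that the area around
// (centerX, centerY) fills the screen.
final class MagnificationTransform {
    final float scale;    // e.g. the persisted default magnification scale
    final float offsetX;  // translation applied after scaling
    final float offsetY;

    MagnificationTransform(float scale, float offsetX, float offsetY) {
        this.scale = scale;
        this.offsetX = offsetX;
        this.offsetY = offsetY;
    }

    // Center the magnified viewport on (centerX, centerY) of a
    // screenWidth x screenHeight display.
    static MagnificationTransform centeredOn(float centerX, float centerY,
            float screenWidth, float screenHeight, float scale) {
        float offsetX = screenWidth / 2f - centerX * scale;
        float offsetY = screenHeight / 2f - centerY * scale;
        return new MagnificationTransform(scale, offsetX, offsetY);
    }

    // Where an unmagnified screen coordinate ends up after magnification.
    float magnifiedX(float x) { return x * scale + offsetX; }
    float magnifiedY(float y) { return y * scale + offsetY; }
}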
|
|\ |
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
The activity manager now keeps track of which users are running.
Initially, only user 0 is running.
When you switch to another user, that user is started so it is
running. It is only at this point that BOOT_COMPLETED is sent
for that user and it is allowed to execute anything.
You can stop any user except user 0, which brings it back to the
same state as when you first boot the device. This is also used
to more cleanly delete a user, by first stopping it
before removing its data.
There is a new broadcast ACTION_USER_STOPPED sent when a user is
stopped; system services need to handle this like they currently
handle ACTION_PACKAGE_RESTARTED when individual packages are
restarted.
Change-Id: I89adbd7cbaf4a0bb72ea201385f93477f40a4119
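A hedged sketch of how a system service could react to the new broadcast described above, the way such services already react to ACTION_PACKAGE_RESTARTED; the action comes from the commit message, and the extra name carrying the stopped user's id is an assumption:

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

// Sketch only: clears per-user caches when a user is stopped.
public class StoppedUserCleanupReceiver extends BroadcastReceiver {
    // Assumed extra carrying the id of the user that was stopped.
    private static final String EXTRA_USER_HANDLE = "android.intent.extra.user_handle";

    @Override
    public void onReceive(Context context, Intent intent) {
        int userId = intent.getIntExtra(EXTRA_USER_HANDLE, -1);
        if (userId > 0) {           // user 0 can never be stopped
            evictCachesForUser(userId);
        }
    }

    private void evictCachesForUser(int userId) {
        // Drop any in-memory state held on behalf of this user.
    }
}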
|
|/
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Split the DisplayManager into two parts. One part is bound
to a Context and takes care of Display compatibility and
caching Display objects on behalf of the Context. The other
part is global and takes care of communicating with the
DisplayManagerService, handling callbacks, and caching
DisplayInfo objects on behalf of the process.
Implemented support for enumerating Displays and getting
callbacks when displays are added, removed or changed.
Elaborated the roles of DisplayManagerService, DisplayAdapter,
and DisplayDevice. We now support having multiple display
adapters registered, each of which can register multiple display
devices and configure them dynamically.
Added an OverlayDisplayAdapter which is used to simulate
secondary displays by means of overlay windows. Different
configurations of overlays can be selected using a new
setting in the Developer Settings panel. The overlays can
be repositioned and resized by the user for convenience.
At the moment, all displays are mirrors of display 0 and
no display transformations are applied. This will be improved
in future patches.
Refactored the way that the window manager creates its threads.
The OverlayDisplayAdapter needs to be able to use hardware
acceleration so it must share the same UI thread as the Keyguard
and window manager policy. We now handle this explicitly as
part of starting up the system server. This puts us in a
better position to consider how we might want to share (or not
share) Loopers among components.
Overlay displays are disabled when in safe mode or in only-core
mode to reduce the number of dependencies started in these modes.
Change-Id: Ic2a661d5448dde01b095ab150697cb6791d69bb5
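A minimal sketch of the display enumeration and change callbacks described above, assuming the public DisplayManager API that this work exposes:

import android.content.Context;
import android.hardware.display.DisplayManager;
import android.os.Handler;
import android.util.Log;
import android.view.Display;

// Enumerate displays and listen for add/remove/change events via DisplayManager.
public class DisplayWatcher implements DisplayManager.DisplayListener {
    private static final String TAG = "DisplayWatcher";
    private final DisplayManager mDisplayManager;

    public DisplayWatcher(Context context) {
        mDisplayManager = (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
    }

    public void start(Handler handler) {
        for (Display display : mDisplayManager.getDisplays()) {
            Log.d(TAG, "Found display " + display.getDisplayId() + ": " + display.getName());
        }
        mDisplayManager.registerDisplayListener(this, handler);
    }

    @Override public void onDisplayAdded(int displayId)   { Log.d(TAG, "Display added: " + displayId); }
    @Override public void onDisplayRemoved(int displayId) { Log.d(TAG, "Display removed: " + displayId); }
    @Override public void onDisplayChanged(int displayId) { Log.d(TAG, "Display changed: " + displayId); }
}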
|
|
|
|
|
|
|
| |
Bug: 6987838
Now emma is only enabled for apks.
Change-Id: Id8d198467076a8dff705195a8e051f3fb00d5660
|
|
|
|
|
|
|
|
|
|
| |
Moved a bunch of methods from PackageManager to UserManager.
Fix launching of activities from recents to the correct user.
Guest creation APIs
Change-Id: I0733405e6eb2829675665e225c759d6baa2b708f
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Themes: Fused Location, Geofencing, LocationRequest.
API changes
o Fused location is always returned when asking for location by Criteria.
o Fused location is never returned as a LocationProvider object, nor returned
as a provider String. This wouldn't make sense because the current API
design assumes that LocationProvider's have fixed properties (accuracy, power
etc).
o The fused location engine will tune itself based on the criteria passed
by applications.
o Deprecate LocationProvider. Apps should use fused location (via Criteria
class), instead of enumerating through LocationProvider objects. It is
also over-engineered: designed for a world with a plethora of location
providers that never materialized.
o The Criteria class is also over-engineered, with many methods that aren't
currently used, but for now we won't deprecate them since they may have
value in the future. It is now used to tune the fused location engine.
o Deprecate getBestProvider() and getProvider().
o Add getLastKnownLocation(Criteria), so we can return last known
fused locations.
o Apps with only ACCESS_COARSE_LOCATION _can_ now use the GPS, but the location
they receive will be fudged to a 1km radius. They can also use NETWORK
and fused locations, which are fudged in the same way if necessary.
o Totally deprecate Criteria, in favor of LocationRequest.
Criteria was designed to map QOS to a location provider. What we
really need is to map QOS to _locations_.
The death knell was the conflicting ACCURACY_ constants on
Criteria, with values 1, 2, 3, 1, 2. Yes not a typo.
o Totally deprecate LocationProvider.
o Deprecate test/mock provider support. They require a named provider,
which is a concept we are moving away from. We do not yet have a
replacement, but I think it's ok to deprecate since you also
need to have 'allow mock locations' checked in developer settings.
They will continue to work.
o Deprecate event codes associated with provider status. The fused
provider is _always_ available.
o Introduce Geofence data object to provide an easier path forward
for polygons etc.
Implementation changes
o Fused implementation: incoming (GPS and NLP) location fixes are given
a weight, that exponentially decays with respect to age and accuracy.
The half-life of age is ~60 seconds, and the half-life of accuracy is
~20 meters. The fixes are weighted and combined to output a fused
location.
o Move Fused Location impl into
frameworks/base/packages/FusedLocation
o Refactor Fused Location behind the IProvider AIDL interface. This allows us
to distribute newer versions of Fused Location in a new APK, at run-time.
o Introduce ServiceWatcher.java, to refactor code used for run-time upgrades of
Fused Location, and the NLP.
o Fused Location is by default run in the system server (but can be moved to
any process or package, even at run-time).
o Plumb the Criteria requirements through to the Fused Location provider via
ILocation.sendExtraCommand(). I re-used this interface to avoid modifying the
ILocation interface, which would have broken run-time upgradability of the
NLP.
o Switch the geofence manager to using fused location.
o Clean up 'adb shell dumpsys location' output.
o Introduce config_locationProviderPackageNames and
config_overlay_locationProviderPackageNames to configure the default
and overlay package names for Geocoder, NLP and FLP.
o Lots of misc cleanup.
o Improve location fudging. Apply random vector then quantize.
o Hide internal POJO's from clients of com.android.location.provider.jar
(NLP and FLP). Introduce wrappers ProviderRequestUnbundled and
ProviderPropertiesUnbundled.
o Introduce ProviderProperties to collapse all the provider accuracy/
bearing/altitude/power plumbing (that is deprecated anyway).
o DELETE lots of code: DummyLocationProvider,
o Rename the (internal) LocationProvider to LocationProviderBase.
o Plumb pid, uid and packageName throughout
LocationManagerService#Receiver to support future features.
TODO: The FLP and Geofencer have a lot of room to be more intelligent
TODO: Documentation
TODO: test test test
Change-Id: Iacefd2f176ed40ce1e23b090a164792aa8819c55
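A back-of-the-envelope sketch of the exponential-decay weighting described in the implementation notes (half-life of roughly 60 seconds for age and 20 meters for accuracy); the constants and the combination step are illustrative, not the actual FusedLocation code:

import android.location.Location;

// Illustrative only: weight incoming GPS/NLP fixes by age and accuracy,
// then combine them with a weighted average.
final class FusionWeights {
    private static final double AGE_HALF_LIFE_MS = 60_000.0;  // ~60 seconds
    private static final double ACCURACY_HALF_LIFE_M = 20.0;  // ~20 meters

    static double weight(Location fix, long nowMillis) {
        double ageMs = Math.max(0, nowMillis - fix.getTime());
        double ageWeight = Math.pow(0.5, ageMs / AGE_HALF_LIFE_MS);
        double accuracyWeight = Math.pow(0.5, fix.getAccuracy() / ACCURACY_HALF_LIFE_M);
        return ageWeight * accuracyWeight;
    }

    // Weighted average of latitudes, as one example of combining fixes.
    static double fuseLatitude(Location[] fixes, long nowMillis) {
        double sum = 0, totalWeight = 0;
        for (Location fix : fixes) {
            double w = weight(fix, nowMillis);
            sum += w * fix.getLatitude();
            totalWeight += w;
        }
        return totalWeight > 0 ? sum / totalWeight : Double.NaN;
    }
}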
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
The purpose of this change is to remove direct reliance on
SurfaceFlinger for describing the size and characteristics of
displays.
This patch also starts to make a distinction between logical displays
and physical display devices. Currently, the window manager owns
the concept of a logical display whereas the new display
manager owns the concept of a physical display device.
Change-Id: I7e0761f83f033be6c06fd1041280c21500bcabc0
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Change-Id: Ib13d5c77416e58161df0e04d7a15ec0dddbde8b5
Conflicts:
core/java/android/bluetooth/BluetoothInputDevice.java
Conflicts:
core/java/com/android/internal/app/ShutdownThread.java
services/java/com/android/server/SystemServer.java
Conflicts:
services/java/com/android/server/SystemServer.java
services/java/com/android/server/pm/ShutdownThread.java
|
|
|
|
| |
Change-Id: I22f35073ceb131d84df6b233d1b63d20fa1b4451
|
|
|
|
| |
Change-Id: Iadc79a425b4b6e12329d86dd2ac0782adcb0174d
|
|
|
|
|
|
|
| |
Add IBluetoothHeadsetPhone.aidl for a small service in Phone to get
phone state changes
Change-Id: I1015e4a69720c4e9cd18ae4236ccd0dbff2e1b2c
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
1. The window manager was not notifying a window when the latter
has been moved. This was causing incorrect coordinates of the
nodes reported to accessibility services. To work around that
we have carried the correct window location when making a
call from the accessibility layer into a window. Now the
window manager notifies the window when it is moved and the
workaround is no longer needed. This change takes it out.
2. The left and right in the attach info were not updated properly
after a report that the window has moved.
3. The accessibility manager service was calling directly methods
on the window manager service without going through the interface
of the latter. This leads to unnecessary coupling and in the
long run increases system complexity and reduces maintainability.
bug:6623031
Change-Id: Iacb734b1bf337a47fad02c827ece45bb2f53a79d
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
These have been created to reduce the size and complexity
of frameworks/base.
mms-common was created by moving all of
frameworks/base/core/java/com/google/android/mms
to:
frameworks/opt/mms
telephony-common was created by moving some of
frameworks/base/telephony
to:
frameworks/opt/telephony
Change-Id: If6cb3c6ff952767fc10210f923dc0e4b343cd4ad
|
|
|
|
| |
Change-Id: I30d04dcde5cd505959d94c274634018b3602cb26
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Extend MediaRouter.UserRouteInfo to enable setting playback
information, which includes volume. When the user route instance
has a RemoteControlClient, forward any playback information to it.
Enable specifying a callback to be notified of volume events
on the route.
Extend MediaRouter.RouteInfo to enable retrieving playback
information.
Update RemoteControlClient javadoc to reflect which parts of the
API are not intended to be made public.
Change-Id: I59d728eb61747af6c8c89d53f0faeb07940594c3
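A sketch of publishing a user route that carries playback and volume information, using the MediaRouter APIs referenced above (the route name and volume values are placeholders):

import android.content.Context;
import android.media.MediaRouter;
import android.media.MediaRouter.RouteCategory;
import android.media.MediaRouter.RouteInfo;
import android.media.MediaRouter.UserRouteInfo;
import android.media.RemoteControlClient;

// Sketch: publish a user route and forward its playback info to a RemoteControlClient.
public class RemoteRoutePublisher {
    public UserRouteInfo publish(Context context, RemoteControlClient rcc) {
        MediaRouter router = (MediaRouter) context.getSystemService(Context.MEDIA_ROUTER_SERVICE);
        RouteCategory category = router.createRouteCategory("Remote playback", false);
        UserRouteInfo route = router.createUserRoute(category);
        route.setName("Living room speaker");
        route.setRemoteControlClient(rcc);  // playback information is forwarded to the RCC
        route.setVolumeMax(15);
        route.setVolume(7);
        route.setVolumeCallback(new MediaRouter.VolumeCallback() {
            @Override public void onVolumeSetRequest(RouteInfo info, int volume) {
                // User asked to set an absolute volume on this route.
            }
            @Override public void onVolumeUpdateRequest(RouteInfo info, int direction) {
                // User nudged the volume up (+1) or down (-1).
            }
        });
        router.addUserRoute(route);
        return route;
    }
}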
|
|
|
|
|
|
|
|
|
|
|
| |
The AudioService now has an API to call to get the currently
connected devices, and later reports of changes in connection
state. The information includes the name of the bluetooth
device if one is connected for display to the user, and states
for all of the pluggable devices. No longer requires a Bluetooth
permission to keep the routes updated.
Change-Id: I81ca421c60592fbc1592477d59bf1c9d1b64954a
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
1. The initial design was to have some accessibility gestures
handled by the system if the gesture-handling accessibility
service does not consume the gesture. However, we are not
sure what a good default is, and once we add a default handler
we cannot remove it since people may rely on it. Thus, we
take the simplest approach and let the accessibility service
handle the gestures. If no gestures are handled, the system
will work in explore-by-touch as before.
bug:5932640
Change-Id: I865a83549fa03b0141d27ce9713e9b7bb45a57b4
|
|
|
|
|
| |
Bug: 6427830
Change-Id: I39451bb1e1d4a8d976ed1c671234f0c8c61658dd
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Introduce IRingtonePlayer, which handles playback for both Ringtone
objects and Notifications. SystemUI now hosts this player, which it
registers with AudioService. It also keeps MediaPlayer instances
warm, and cleans them up after stop() or Binder death.
Move both Ringtone and NotificationManagerService to play back audio
through this new interface.
Bug: 6376128, 6350773
Change-Id: I1dcb86d16ee3c4f07cdb2248d33dcff4ead3609a
|
|
|
|
| |
Change-Id: I6178b96896ffbb3323210f93784a65d724a3e694
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
1. An accessibility service has to explicitly opt in to be notified
for gestures by the system. There is only one accessibility service
that handles gestures and in case it does not handle a gesture
the system performs default handling. This default handling ensures
that we have gesture navigation even if no accessibility service
would like to participate/customize the interaction model.
bug:5932640
Change-Id: Id8194293bd94097b455e9388b68134a45dc3b8fa
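A minimal sketch of the gesture handling described above from the service side; the opt-in itself is configured in the service's accessibility metadata (not shown, and assumed here), while onGesture(int) is the callback an accessibility service overrides. Returning false lets the system fall back to its default handling:

import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;

// Sketch: an accessibility service that handles system-detected gestures.
public class GestureNavigationService extends AccessibilityService {
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) { }

    @Override
    public void onInterrupt() { }

    @Override
    protected boolean onGesture(int gestureId) {
        switch (gestureId) {
            case GESTURE_SWIPE_RIGHT:
                // e.g. move accessibility focus to the next element
                return true;
            case GESTURE_SWIPE_LEFT:
                // e.g. move accessibility focus to the previous element
                return true;
            default:
                return false; // let the system perform its default handling
        }
    }
}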
|
|
|
|
|
|
| |
Bug 5826326
Change-Id: I67d18cacf0c4e908ec41dbed483314ece8b72ceb
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Usefulness: Keeps track of the user's current location on the screen when
            traversing it, enabling structural and directional
            navigation over all elements on the screen. This enables
            blind users that know the application layout to efficiently
            locate desired elements, as opposed to trying to touch-explore the
            region where the element should be - very tedious.
Rationale: There are two ways to implement accessibility focus. One is
to let accessibility services keep track of it since they
have access to the screen content, and another to let the view
hierarchy keep track of it. While the first approach would
require almost no work on our part it poses several challenges
which make it a sub-optimal choice. Having the accessibility focus
in the accessibility service would require that service to scrape
the window content every time it changes to sync the view tree
state and the accessibility focus location. Pretty much the service
will have to keep an off screen model of the screen content. This
could be quite challenging to get right and would incur performance
cost for the multiple IPCs to repeatedly fetch the screen content.
Further, keeping virtual accessibility focus (i.e. in the service)
would require sync of the input and accessibility focus. This could
be challenging to implement right as well. Also, having an unlimited
number of accessibility services we cannot guarantee that they will
have a proper implementation, if any, to allow users to perform structural
navigation of the screen content. Assuming two accessibility
services implement structural navigation via accessibility focus,
there is no guarantee that they will behave similarly by default,
i.e. provide some standard way to navigate the screen content.
Also feedback from experienced accessibility researchers, specifically
T.V Raman, provides evidence that having virtual accessibility focus
creates many issues and it is very hard to get right.
Therefore, keeping accessibility focus in the system will avoid
keeping an off-screen model in accessibility services, it will always
be in sync with the state of the view hierarchy and the input focus.
Also this will allow having a default behavior for traversing the
screen via this accessibility focus that is consistent in all
accessibility services. We provide accessibility services with APIs to
override this behavior but all of them will perform screen traversal
in a consistent way by default.
Behavior: If accessibility is enabled the accessibility focus is the leading one
and the input follows it. Putting accessibility focus on a view moves
the input focus there. Clearing the accessibility focus of a view, clears
the input focus of this view. If accessibility focus is on a view that
cannot take input focus, then no other view should have input focus.
In accessibility mode we initially give accessibility focus to the topmost
view and no view has input focus. This ensures consistent behavior across
all apps. Note that accessibility focus can move hierarchically in the
view tree and having it at the root is better than putting it where the
input focus would be - at the first input focusable which could be at
an arbitrary depth in the view tree. By default not all views are reported
for accessibility, only the important ones. A view may be explicitly labeled
as important or not for accessibility, or the system determines this by
default. Important views for accessibility are all views that are
not dumb layout managers used only to arrange their children. Since the same
content arrangement can be obtained via different combinations of layout
managers, such managers cannot be used to reliably determine the application
structure. For example, a user should see a list as a list view with several
list items, and each list item as a text view and a button, as opposed to seeing
all the layout managers used to arrange the list item's content.
By default only important-for-accessibility views are regarded for accessibility
purposes. Views not regarded for accessibility neither fire accessibility events
nor are reported as being on the screen. An accessibility service may request the
system to regard all views. If the target SDK of an accessibility service is
less than JellyBean, then all views are regarded for accessibility.
Note that an accessibility service that requires all views to be regarded for
accessibility may put accessibility focus on any view. Hence, it may implement
any navigational paradigm if desired, especially considering the fact that
the system detects some standard gestures and delegates their processing
to an accessibility service. The default implementation of an accessibility
service performs the default navigation.
bug:5932640
bug:5605641
Change-Id: Ieac461d480579d706a847b9325720cb254736ebe
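From an accessibility service's point of view, the accessibility focus described above boils down to performing focus actions on AccessibilityNodeInfo; a minimal sketch (views can similarly be marked important for accessibility via View.setImportantForAccessibility):

import android.view.accessibility.AccessibilityNodeInfo;

// Sketch: set and clear accessibility focus on a node using the node actions
// added for this feature.
public final class A11yFocusHelper {
    private A11yFocusHelper() { }

    public static boolean focus(AccessibilityNodeInfo node) {
        return node != null
                && node.performAction(AccessibilityNodeInfo.ACTION_ACCESSIBILITY_FOCUS);
    }

    public static boolean clearFocus(AccessibilityNodeInfo node) {
        return node != null
                && node.performAction(AccessibilityNodeInfo.ACTION_CLEAR_ACCESSIBILITY_FOCUS);
    }
}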
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
This change allows the InputManager to keep track of what input
devices are registered with the system and when they change.
It needs to do this so that it can properly clear its cache of
input device properties (especially the key map!) when changes
occur.
Added new API so that applications can register listeners for
input device changes.
Fixed a minor bug in EventHub where it didn't handle EPOLLHUP
properly so it would spam the log about unsupported epoll events
until inotify noticed that the device was gone and removed it.
Change-Id: I937d8c601f7185d4299038bce6a2934fe4fdd2b3
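A minimal sketch of the new input-device change API mentioned above, using InputManager:

import android.content.Context;
import android.hardware.input.InputManager;
import android.os.Handler;
import android.util.Log;
import android.view.InputDevice;

// Enumerate input devices and listen for add/remove/change notifications.
public class InputDeviceWatcher implements InputManager.InputDeviceListener {
    private static final String TAG = "InputDeviceWatcher";
    private final InputManager mInputManager;

    public InputDeviceWatcher(Context context) {
        mInputManager = (InputManager) context.getSystemService(Context.INPUT_SERVICE);
    }

    public void start(Handler handler) {
        for (int deviceId : mInputManager.getInputDeviceIds()) {
            InputDevice device = mInputManager.getInputDevice(deviceId);
            if (device != null) {
                Log.d(TAG, "Input device " + deviceId + ": " + device.getName());
            }
        }
        mInputManager.registerInputDeviceListener(this, handler);
    }

    @Override public void onInputDeviceAdded(int deviceId)   { Log.d(TAG, "Device added: " + deviceId); }
    @Override public void onInputDeviceRemoved(int deviceId) { Log.d(TAG, "Device removed: " + deviceId); }
    @Override public void onInputDeviceChanged(int deviceId) { Log.d(TAG, "Device changed: " + deviceId); }
}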
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Rather than normal Activities (which have a host of problems
when used for this purpose), screen savers are now a
special kind of Service that can add views to its own
special window (TYPE_DREAM, in the SCREENSAVER layer).
Dreams are now launched by the power manager; whenever it is
about to turn the screen off, it asks the window manager if
it wants to run a screen saver instead. (http://b/5677408)
Also, the new config_enableDreams bool allows the entire
feature to be switched on or off in one place. It is
currently switched off (and the APIs are all @hidden).
Change-Id: Idfe9d430568471d15f4b463cb70586a899a331f7
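The commit notes the APIs are still @hidden; for reference, a minimal sketch in the shape the later public DreamService API took (assumed here, since only the hidden JB-era service type is referenced above):

import android.service.dreams.DreamService;
import android.widget.TextView;

// Sketch of a trivial screen saver built on the DreamService shape.
public class ClockDream extends DreamService {
    @Override
    public void onAttachedToWindow() {
        super.onAttachedToWindow();
        setInteractive(false);   // the dream is dismissed on user activity
        setFullscreen(true);
        TextView view = new TextView(this);
        view.setText("Dreaming...");
        setContentView(view);
    }
}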
|
|
|
|
|
|
|
|
|
| |
Users outside system_server now explicitly communicate their
lifecycle, which keeps a strong-reference chain to any fully loaded
NetworkStatsCollection histories.
Bug: 6236498
Change-Id: I8e22739b6e89a626b676967a736d7117fd000778
|
|\ |
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
Extracted the input system from the window manager service into
a new input manager service. This will make it easier to
offer new input-related features to applications.
Cleaned up the input manager service JNI layer somewhat to get rid
of all of the unnecessary checks for whether the input manager
had been initialized. Simplified the callback layer as well.
Change-Id: I3175d01307aed1420780d3c093d2694b41edf66e
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
Move all lockscreen related settings to LockSettingsService.
LockPatternUtils uses this through IPC instead of Secure settings.
Migrate old settings to new database managed by LockSettingsService.
Passwords and patterns are stored in a new per-user location, except
for the primary user, for backward compatibility.
KeyguardViewMediator and LockPatternKeyguardView listen for user
changes and update the lockscreen.
Settings provider will look for Lock settings in the LockSettings
service now for the entries that used to be stored in Settings.
Change-Id: I956cd5b95e2d9d45a6401af7e270e6a5aa2dcc98
|
|/
|
|
| |
Change-Id: I53c0b7ebfd75e520ebb7553612f1aa8413b6b79b
|
|\
| |
| |
| |
| |
| |
| | |
API Demo" into ics-mr1
* commit 'af71f328e6b03cc6b7111b87b625c3f9cb3987e4':
sdk doc change: Added KeyChain API Demo
|
| |
| |
| |
| | |
Change-Id: I8ea879bf30f933c745e33dafa6591fce77251eb6
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
This is part of the multi-project commit to move the filter-framework
from system/media/mca to frameworks/base/media/mca.
Note that the filter-framework will soon be replaced with a refactored
version currently under API review (also to go under frameworks/base).
This move is done now to unblock the PDK efforts.
Change-Id: I9f42be5a12a9e8157512be11f04e38e4548970be
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
An "UpdateLock" works similarly to a wake lock in API: the caller is
providing a hint to the OS that now is not a good time to interrupt
the user/device in order to do intrusive work like applying OTAs.
This is particularly important for headless or kiosk-like products
where ordinarily the update process will be automatically scheduled
and proceed without user or administrator intervention.
UpdateLocks require that the caller hold the new signatureOrSystem
permission android.permission.UPDATE_LOCK. acquire() and release()
will throw security exceptions if this is not the case.
The "is now convenient?" state is expressed to interested parties
by way of a sticky broadcast sent only to registered listeners. The
broadcast is protected; only the system can send it, so listeners
can trust it to be accurate. The broadcast intent also includes a
timestamp (System.currentTimeMillis()) to help inform listeners that
wish to implement scheduling policies based on when the device became
idle.
The API change here is a tiny one: a dump(PrintWriter) method has been
added to the TokenWatcher class to facilitate getting information out
of it for dumpsys purposes. UpdateLock itself is still @hide.
Bug 5543442
Change-Id: I3709c831fc1883d7cb753cd2d3ee8e10a61e7e48
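UpdateLock itself is @hide, so the following is purely illustrative of the wake-lock-style usage described above; the constructor and method names are assumed, not a public API:

// Illustrative only: UpdateLock is @hide. Requires the signatureOrSystem
// permission android.permission.UPDATE_LOCK; acquire()/release() throw
// SecurityException otherwise.
import android.os.UpdateLock;

public class OtaScheduler {
    private final UpdateLock mUpdateLock = new UpdateLock("ota-install");

    void onUserInteractionStarted() {
        // Hint to the OS that now is NOT a convenient time for intrusive work.
        mUpdateLock.acquire();
    }

    void onUserInteractionFinished() {
        // Convenient again; listeners of the sticky broadcast may schedule updates.
        mUpdateLock.release();
    }
}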
|
|\ \ |
|
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | |
| | | |
SerialManager: provides access to serial ports
SerialPort: for reading and writing data to and from serial ports
IO with both array based and direct ByteBuffers is supported.
Accessing serial ports requires android.permission.SERIAL_PORT permission
Each platform must configure the list of supported serial ports in the
config_serialPorts resource overlay
(this is needed to prevent apps from accidentally accessing the Bluetooth
or other system UARTs).
In addition, the platform uevent.rc file must set the owner to the
/dev/tty* files to "system" so the framework can access the port.
Signed-off-by: Mike Lockwood <lockwood@android.com>
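A hedged sketch of reading from a serial port with the system-only SerialManager/SerialPort API described above; the service name and method signatures are assumptions based on the description:

// Illustrative only: SerialManager/SerialPort are guarded by
// android.permission.SERIAL_PORT and are not available to ordinary apps.
import android.content.Context;
import android.hardware.SerialManager;
import android.hardware.SerialPort;

import java.io.IOException;
import java.nio.ByteBuffer;

public class SerialReader {
    public void dumpFirstPort(Context context) throws IOException {
        SerialManager manager = (SerialManager) context.getSystemService(Context.SERIAL_SERVICE);
        String[] ports = manager.getSerialPorts();          // from the config_serialPorts overlay
        if (ports.length == 0) return;
        SerialPort port = manager.openSerialPort(ports[0], 115200);
        ByteBuffer buffer = ByteBuffer.allocateDirect(1024); // direct ByteBuffers are supported
        int count = port.read(buffer);
        // ... process 'count' bytes from 'buffer' ...
        port.close();
    }
}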
|
|/ /
| |
| |
| | |
Change-Id: If4a94bfd4a033748eb13e8f3ff25e24382746778
|
| |
| |
| |
| |
| | |
Bug: 5943637
Change-Id: I12a339f285f4db58e79acb5fd8ec2fc1acda5265
|
|\ \ |
|
| |/
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
Added new API to enable cancellation of SQLite and content provider
queries by means of a CancellationSignal object. The application
creates a CancellationSignal object and passes it as an argument
to the query. The cancellation signal can then be used to cancel
the query while it is executing.
If the cancellation signal is raised before the query is executed,
then it is immediately terminated.
Change-Id: If2c76e9a7e56ea5e98768b6d4f225f0a1ca61c61
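A minimal sketch of the new cancellation API: create a CancellationSignal, pass it into the query, and raise it from another thread to abort:

import android.content.ContentResolver;
import android.database.Cursor;
import android.net.Uri;
import android.os.CancellationSignal;
import android.os.OperationCanceledException;

// Sketch: a cancelable content provider query using CancellationSignal.
public class CancelableQuery {
    private final CancellationSignal mSignal = new CancellationSignal();

    public Cursor run(ContentResolver resolver, Uri uri) {
        try {
            // The provider periodically checks the signal and aborts when it is raised.
            return resolver.query(uri, null, null, null, null, mSignal);
        } catch (OperationCanceledException e) {
            return null; // the query was canceled before or while executing
        }
    }

    public void cancel() {
        mSignal.cancel(); // safe to call from any thread
    }
}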
|
|/
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
o Add NdefRecord.toMimeType()
Maps the record to a MIME type
o Add NdefRecord.toUri()
Maps the record to a URI
o Add hidden NfcAdapter.dispatch()
Helps test the dispatch path.
o Modify createMime(), createUri() and createExternal():
Do not try and strictly follow RFC requirements for URI or MIME content
types. This just leads to heartbreak - the RFC requirements are too strict.
For example RFC1341 forbids the use of '.' in a MIME type, however this is in
common use in types such as "application/vnd.companyname". I think the best
approach is to only remove 'obvious' whitespace issues, and to convert
uppercase to lowercase as per Android guidelines.
Change-Id: Id686f5f3b05b2dceafad48e1cfcbdb2b3890b854
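A minimal sketch of the new helpers: build MIME and URI records, then map them back with toMimeType()/toUri() (the MIME type and URI below are placeholders):

import android.net.Uri;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;

import java.nio.charset.Charset;

// Sketch: create MIME and URI NDEF records, then inspect them on the receiving side.
public class NdefRecordDemo {
    public NdefMessage build() {
        NdefRecord mime = NdefRecord.createMime(
                "application/vnd.example.game", "state".getBytes(Charset.forName("UTF-8")));
        NdefRecord uri = NdefRecord.createUri(Uri.parse("http://www.example.com"));
        return new NdefMessage(mime, uri);
    }

    public void inspect(NdefMessage message) {
        for (NdefRecord record : message.getRecords()) {
            String mimeType = record.toMimeType(); // null if the record is not MIME-like
            Uri uri = record.toUri();              // null if the record cannot map to a URI
            // ... dispatch based on mimeType / uri ...
        }
    }
}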
|
|\
| |
| |
| |
| |
| |
| | |
Merge "Android U Class: Monetization / Ads without Compromising User Experience"
* commit '3d672e1e789e171e913605945efe95a477ab0505':
  Android U Class: Monetization / Ads without Compromising User Experience
|
| |
| |
| |
| | |
Change-Id: I1aaddc6bbbc9fc2b53119893f2b70260f1b0d9a7
|
|\ \
| |/
| |
| |
| |
| |
| | |
Merge "Android U Class: Developing Android Applications for the Enterprise. This class uses a sample app."
* commit 'aed4ced6556383483209f454c9e4872e8ad28ebf':
  Android U Class: Developing Android Applications for the Enterprise. This class uses a sample app.
|
| |
| |
| |
| |
| |
| | |
This class uses a sample app.
Change-Id: I508edbb98c8e9dea1d3ea26c8dcd9da213330d87
|
| |
| |
| |
| | |
Change-Id: I9a9b13b9c7b8ae3011772a62735c788762b45f7f
|
|\ \
| |/
| |
| |
| | |
* commit '801fda548c719a8618e7f4cd64cad8404b0970b9':
AndroidU lesson on designing for multiple screens.
|