| Commit message | Author | Age | Files | Lines |
This patch gets rid of all compiler warnings for the
GPU emulation libraries when building on a Linux host.
Note that GLcommon/GLutils.h now provides two new functions
to perform 'safe' type casts between unsigned integers and
pointers: SafePointerFromUInt() and SafeUIntFromPointer().
Change-Id: I01c48bbd72f925d70eb9831f57e15815e687121f
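The message names the two helpers but not their shape; a minimal sketch of what such 'safe' casts typically look like (signatures assumed, not taken from the actual GLutils.h) is:

```cpp
#include <cassert>
#include <cstdint>

// Sketch only: the real GLcommon/GLutils.h signatures may differ.
// Routing integer<->pointer conversions through uintptr_t avoids the
// "cast to pointer from integer of different size" family of warnings
// on 64-bit Linux hosts.
static inline void* SafePointerFromUInt(unsigned int handle) {
    return reinterpret_cast<void*>(static_cast<uintptr_t>(handle));
}

static inline unsigned int SafeUIntFromPointer(const void* ptr) {
    // Truncation to 32 bits is deliberate; emugl object handles are
    // assumed to fit in an unsigned int.
    return static_cast<unsigned int>(reinterpret_cast<uintptr_t>(ptr));
}
```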
Breakage was introduced by https://android-review.googlesource.com/#/c/79332/
which modified the location of some headers referenced here.
Change-Id: I801ba2527386af0d6d2961f2c79f5db332a6d023
bug: 10456411
Fix for internal bug
Change-Id: I85181d358f1844b25cc85fbaf5f64842d5ed6f22
The pointer returned by glGetString is owned by the GL context, so
when the GL context is destroyed it may become invalid. This happens
on Mesa, for example. Make/manage our own copy of the extension string
to use after destroying the context.
Bug: 9627179
Change-Id: I605536151ee64f50403546d0d38c5b5f1f27dd73
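A minimal sketch of the approach (class and method names are illustrative, not the actual emugl code):

```cpp
#include <cassert>
#include <cstring>
#include <string>

// Illustrative sketch: keep a private copy of the extension string so it
// stays valid after the GL context that owned the original is destroyed
// (as happens on Mesa).
class ExtensionStringCache {
public:
    // 'raw' is what glGetString(GL_EXTENSIONS) returned; the context
    // owns it, so copy it while the context is still alive.
    void capture(const char* raw) {
        m_copy = raw ? raw : "";
    }
    const char* str() const { return m_copy.c_str(); }
private:
    std::string m_copy;  // std::string owns its own character buffer
};
```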
This is what the pixel format attribute lists in
MacPixelFormatsAttribs.m try to achieve, but despite this, alpha
is nonzero in every returned configuration on certain (all?)
machines (at least on 10.8.5 on a nvidia gpu). This means that
EGL won't return any configs at all with alpha == 0.
The default config chooser in GLSurfaceView requires a config with
alpha == 0. This means that previously, this view failed to start up
on the emulator on OS X, unless set up with a non-default config
chooser.
Change-Id: I2bf3e92a026c525a97d6746e491d920ce127787f
Updating or deleting data associated with a buffer object was clearing
the name->data association, but not actually deallocating the data.
Thanks to manjian2006 for finding the bug and proposing the fix.
Bug: 60468
Change-Id: Ibabfb1bace8acdeb1a4bbe5bf922845d096a8d22
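The leak pattern and the fix can be sketched like this (the map layout and function names are illustrative; the real translator code differs):

```cpp
#include <cassert>
#include <cstdlib>
#include <cstring>
#include <map>

// Illustrative model of a name -> data store for GLES buffer objects.
struct BufferData {
    void*  data = nullptr;
    size_t size = 0;
};

static std::map<unsigned int, BufferData> g_buffers;

void setBufferData(unsigned int name, const void* src, size_t size) {
    BufferData& b = g_buffers[name];
    free(b.data);              // the fix: release the old allocation
    b.data = malloc(size);     // before installing the new one
    memcpy(b.data, src, size);
    b.size = size;
}

void deleteBuffer(unsigned int name) {
    std::map<unsigned int, BufferData>::iterator it = g_buffers.find(name);
    if (it == g_buffers.end()) return;
    free(it->second.data);     // previously only the map entry was
    g_buffers.erase(it);       // erased, leaking the allocation
}
```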
* changes:
Ignore empty ranges
Fix rangeUnion return value in the successful case
Change-Id: I0cccba6795e3b9709cc646f6fa55bb60e6446ea1
Even if the ranges could be merged, rangeUnion was returning false.
Most probably this was a typo.
Change-Id: I4cf8a19bd701a8501c2d49cf0bfa996f9e12c02f
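A sketch of the fixed function (the Range type and signature are illustrative, not the translator's actual code):

```cpp
#include <algorithm>
#include <cassert>

// Illustrative Range type; the real translator's type differs.
struct Range {
    int start;
    int end;  // half-open: [start, end)
};

// rangeUnion after the fix: merge b into a when the ranges overlap,
// reporting success via the return value. The bug was that the merge
// branch returned false even though the merge had succeeded.
bool rangeUnion(const Range& a, const Range& b, Range* out) {
    if (a.end < b.start || b.end < a.start)
        return false;                         // disjoint, nothing merged
    out->start = std::min(a.start, b.start);
    out->end   = std::max(a.end, b.end);
    return true;                              // previously 'false' (typo)
}
```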
* changes:
ColorBuffer: Remove the y-invert Intel GPU bug workaround
EglMacApi: Use the right pbuffer texture target and format parameters
The fix in 57501158 makes sure pbuffers get initialized
properly. The previously incorrect pbuffer texture parameters
had different effects on different GPUs/drivers. On an Nvidia
GT 650M, the buffers were rendered properly but glReadPixels calls
were inverted, while Intel HD 3000/4000 seemed to get the rendering
inverted as well.
By passing proper pbuffer texture parameters, the bug (which in
itself was no driver bug but inconsistent behaviour when given
invalid parameters) vanishes.
This reverts the bug workaround parts of 9322c5cb (from
development.git).
Change-Id: Ibc38147967361cba6ba85cdf3b4e9a2e2ee6d881
The target and format parameters are in the EGL parameter range
(and are stored in EGLints), while nsCreatePBuffer (which calls
NSOpenGLPixelBuffer initWithTextureTarget) takes GLenums.
This is pretty much similar to the same function in EglWindowsApi.cpp,
but contrary to that function, there's nothing similar to
WGL_NO_TEXTURE_ARB in initWithTextureTarget, so something has to be
specified in all cases.
Previously, the default EGL_NO_TEXTURE (0x305C) was passed through.
While this mostly worked just fine, it had the surprising hidden
side effect of using a vertically flipped coordinate system in
glReadPixels (with the origin being the top left corner instead
of the bottom left one, which is default in OpenGL).
This makes the EncodeDecodeTest media CTS test pass with surface
output on the emulator on Mac OS X. (This test renders the decoded
video to a pbuffer and checks individual pixel values using
glReadPixels.)
Change-Id: I21a2430ce6334a5e82ea3203c4d157f5bad1558d
The EGL specs say that eglChooseConfig doesn't update num_config
if it returns a failure (which is exactly what
Translator/EGL/EglImp.cpp does). Therefore, if this function
returned a failure (e.g. due to an unsupported egl attribute),
nConfigs was left untouched, meaning that the configs array
was left uninitialized but treated as if it was full of valid
configs.
Change-Id: I3809272298ea10d65dc939849d2e3c17d1158da6
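The contract can be modeled with a toy chooseConfig (invented names, not the real EGL API) to show why a caller must check the return value before trusting the count:

```cpp
#include <cassert>
#include <vector>

// Toy model of the EGL contract described above (not the real API):
// on failure, the count out-parameter is left untouched, so a caller
// that ignores the return value sees whatever value it started with.
bool chooseConfig(bool attribsSupported,
                  std::vector<int>* configs, int* numConfig) {
    if (!attribsSupported)
        return false;          // numConfig deliberately not written
    configs->assign(3, 42);    // pretend three matching configs
    *numConfig = 3;
    return true;
}
```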
Instead of using a TLS destructor, RenderThreadInfo is now an automatic
variable of RenderThread::Main(), so is automatically destroyed on thread exit.
RenderThread::Main() now explicitly unbinds the thread's context and surfaces
from the FrameBuffer, ensuring that the thread has released all references
before it exits.
This fixes a bug where RenderThreadInfo's destructor was releasing the
references in the TLS destructor, which caused ColorBuffer's destructor to call
FrameBuffer::bind_locked() when the FrameBuffer lock wasn't held. By clearing
the references in FrameBuffer::bindContext(), locking around destruction
happens correctly.
Change-Id: I617cea838d5f352a597ccc6d3dbd8f9c08cb91bd
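The restructuring can be sketched as follows (names illustrative): the per-thread state becomes an automatic variable of the thread's main function, so its destructor runs at a well-defined point before the thread exits, instead of in a TLS destructor with unpredictable lock state.

```cpp
#include <atomic>
#include <cassert>

// Flag standing in for "all FrameBuffer references released".
static std::atomic<bool> g_refsReleased(false);

struct RenderThreadInfo {
    // In the real code this releases the thread's context/surface
    // references; here it just records that cleanup ran.
    ~RenderThreadInfo() { g_refsReleased = true; }
};

void renderThreadMain() {
    RenderThreadInfo info;  // automatic storage, not a TLS slot
    // ... process the guest's render stream ...
}   // 'info' destroyed here, on the thread, before Main() returns
```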
Fix a small bug that caused the emulator to crash when used with some
graphic libraries.
Change-Id: Ifb7e0b11a8302d0538632dac467d187dfcdfda47
In cases where RenderThread failed to start, RenderServer would free the
stream first and then the thread. However, the thread itself also attempts
to free the stream, and this caused a crash of the emulator in some corner
cases.
Change-Id: I2e508c37ab0a09c9261b30e59072bf1a44982dfe
- Lack of tlsDestruct causes resource leakage on Linux.
Change-Id: I6f5308fd00da06dbecd9246393021e3d72aa40c3
- SocketStream is passed from RenderServer::Main, but it is not deleted.
- As a result, the emulator will keep accumulating open sockets.
Change-Id: I4f2445855fc45a8d5f20f4d598e7021e8f3e000c
- Setting an int should be OK even without the lock, as there is no synchronization on the reader side.
- A deadlock can happen inside the constructor if the same error handler is already
set and an error happens inside the constructor, in a place like XSync.
Change-Id: I2f401067e0555ae092df23f57cc9ab054a1555d7
These files are already in development/tools/emulator.
Change-Id: I58988ce49804583b06e7d93380c44ba800448216
Bug: 39835
Change-Id: Ied3f43b76d2bb1bdba478f57122ec0ef4d967ae4
This skin only defines the parts that can be used. Its
layout section is expected to be generated at runtime
by the emulator when the option -dynamic-skin is used.
The assets used to generate this skin are placed in the
assets folder.
Change-Id: Ib252ed6a7b1ef16c21c3d45bdc0c977c1ad42466
EglOS::getDefaultDisplay() can return an invalid display, e.g. on X11
if $DISPLAY is not set. This is called from the EglGlobalInfo
constructor, which doesn't have a good way to indicate failure. So
instead EglGlobalInfo::addDisplay() checks that the display is valid
before wrapping it in a EglDisplay.
Bug: 7020498
Change-Id: Id18fc568dae5bff4b45b706f3322ae5e4785d95d
Change-Id: I5412777820c8b0e691d07b10df348a739f92f291
Function __dyld_func_lookup is deprecated and invisible in Mac
SDK 10.6+. Instruct the linker to resolve it at run-time.
The related CL https://android-review.googlesource.com/#/c/37355/
fixed the build, but caused a run-time "Bus error".
Change-Id: Icf3ea7a0b8ac29c69482e372f34e0b2e364472d8
When the emulator window has non-1.0 scaling, the scale is applied
when blitting the Android framebuffer to the host window. For the
OnPost callback, we were not only reading the image back from the
window (so post-scaling), we were also storing it in a 1.0-scaled
buffer. Now we read back from the Android framebuffer image, which is
always 1.0-scaled.
Change-Id: Ia9c974711a9a4b0b19f4b997f65ecc64481b4c6a
Change-Id: I84133fb36d8f15ed33e6bcba2be158e43c903901
Previously we used a hardcoded address (tcp port, unix pipe path,
etc.) for the OpenGLRender system. Multiple emulators would all try to
listen on the same address, with the system non-deterministically (?)
choosing which one accepted each new connection. This resulted in
frames going to the wrong emulator window, one emulator shutting down
another's OpenGL system, etc.
Now the OpenGLRender server requests an unused tcp port or derives a
path from the pid, and reports the address back to the emulator client
to use for future connections from the guest.
Change-Id: I6af2eac0c7f27670a3b6595772eebc7aa2b24688
This leak has always been there, but normally only leaks one socket
per emulator instance. Worse, though, is that the socket is listening
for connections on a hardcoded port, so it prevents other emulators
from listening on that port. Since we now start the GL renderer
briefly in every emulator instance (for GL string collection) this
means only one emulator can run at a time, even if none are using GL
acceleration.
Even with this fix, a GL-accelerated emulator will prevent any other
emulator (accelerated or not) from starting, since it is listening on
the hardcoded port, and the new emulator will try to listen on and
connect to that port at least for GL string collection. That will be
fixed in a future change.
Bug: 33383
Change-Id: I62a8a67eb6afb6c53cb41a19d00b6449cf5e1abe
Switch the min and mag filters for the ColorBuffer's texture object to
linear instead of nearest. This gives much better results when posting
a colorbuffer to a scaled framebuffer, which happens when the android
screen + emulator skin is too big to fit on the user's display.
The ColorBuffer's own texture object is only actually used as a
texture during post(); when used as a texture by the guest system
(e.g. by surfaceflinger composition) it's mapped through an EGLImage
to a guest-owned texture object, which has its own filter settings. So
we can just change the filter settings for the ColorBuffer texture
object at initialization without affecting anything else.
SDK Bug: 6721429
Change-Id: I8ec81125d076e0ed77a44f8b0dce412fa3cecabf
ColorBuffer wasn't destroying its blit texture and associated
EGLImage, leaking one pair per Android gralloc buffer.
Change-Id: I2fa42d2ecbb654edca7b224bd002d7513a08a633
http://code.google.com/p/android/issues/detail?id=33445
See b/5680952 "Compilation warnings in etc1.cpp" for discussion.
This is a manual merge of an update that was made to
frameworks/native/opengl/libs/ETC1/etc1.cpp.
sdk/emulator/opengl/host/libs/Translator/GLcommon/etc1.cpp is an exact
copy of frameworks/native/opengl/libs/ETC1/etc1.cpp, so we might as well
keep the two versions in sync.
Bug: 5680952
Change-Id: Icf5d5ed2e7c5c79eb9677d210b1ff5fee507271d
Because of the way the SDK and Android system images are branched,
host code that goes into the SDK tools can't live in the same
repository as code that goes into the system image. This change keeps
the emugl host code in sdk.git/emulator/opengl while moving the emugl
system code to development.git/tools/emulator/opengl.
A few changes were made beyond simply cloning the directories:
(a) Makefiles were modified to only build the relevant components. Not
doing so would break the build due to having multiple rule
definitions.
(b) Protocol spec files were moved from the guest encoder directories
to the host decoder directories. The decoder must support older
versions of the protocol, but not newer versions, so it makes
sense to keep the latest version of the protocol spec with the
decoder.
(c) Along with that, the encoder is now built from checked in
generated encoder source rather than directly from the protocol
spec. The generated code must be updated manually. This makes it
possible to freeze the system encoder version without freezing the
host decoder version, and also makes it very obvious when a
protocol change is happening that will require special
backwards-compatibility support in the decoder/renderer.
(d) Host-only and system-only code were removed from the repository
where they aren't used.
(e) README and DESIGN documents were updated to reflect this split.
No actual source code was changed due to the above.
Change-Id: I70b576a70ac3dc94155f931508b152178f1e8cd5
MIPS cannot handle unaligned accesses, so this patch changes the
direct assignment of ints/floats to use memcpy instead.
Signed-Off-By: Bhanu Chetlapalli <bhanu@mips.com>
Change-Id: I82600dece8f48f718f73b49cdf831094bbfdcde5
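The pattern this patch applies can be sketched as (helper names invented for the example):

```cpp
#include <cassert>
#include <cstring>

// On architectures that trap on unaligned loads/stores (MIPS here),
// replace direct assignment through a cast pointer with memcpy, which
// has no alignment requirement and compiles to the same code on x86.
float readFloatUnaligned(const unsigned char* p) {
    float v;
    memcpy(&v, p, sizeof(v));  // instead of: v = *(const float*)p;
    return v;
}

void writeFloatUnaligned(unsigned char* p, float v) {
    memcpy(p, &v, sizeof(v));  // instead of: *(float*)p = v;
}
```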
Now Mac OS X SDK 10.6 is the minimum requirement.
Change-Id: Ic4ad91210048120965f13957376c5c581567cda4
Also fix comment
Signed-Off-By: Bhanu Chetlapalli <bhanu@mips.com>
Change-Id: Ib39de7ccf512a81436e68ac1363010af3cb2d28a
Signed-Off-By: Bhanu Chetlapalli <bhanu@mips.com>
Bug: 6515813
Change-Id: I738fc2663d81876dc75ad560fd08506b423a21bf
Since per-frame readback is slow and clients don't need it on all the
time, this change allows the callback to be registered after
initialization, and allows it to be disabled later.
Change-Id: Ic73d4515d302a0981ee0c80b9e6f9ba5c84b82ae
emulator_renderer implicitly uses symbols defined by libX11.so
through intermediate libraries, which can cause dependency issues
if the intermediates drop the dependency. The linkers on distros
like Fedora now explicitly disable such indirect dependency
resolution, which causes a compilation failure.
More information is available at
http://fedoraproject.org/wiki/UnderstandingDSOLinkChange
Signed-Off-By: Bhanu Chetlapalli <bhanu@mips.com>
Change-Id: If378fa76142cb6c8c7641d76802dcbc7691871d6
Added OS X 10.8 to the conditional which includes the dylib during
compilation on 10.7.
Change-Id: Id078e001fd52d82b345249fcf647e0a4802c1f89
Signed-off-by: Al Sutton <al@funkyandroid.com>
This also changes the strings reported by the default OpenGL ES
1.1/2.0 to OpenGL translators so they include the strings from the
underlying OpenGL implementation. This will give more useful bug
reports and SDK deployment statistics.
Change-Id: Id2d231a4fe3c40157c24a63ec19785826e037fd3
The emulator opengles.c file duplicated the function declarations from
libOpenglRenderer's render_api.h instead of including it directly.
This led to multiple bugs since the declarations didn't actually
match, but there was no way for the compiler or dynamic loader to
check this.
This change makes opengles.c include render_api.h to get function
pointer prototypes, and changes the prototypes/implementation as
necessary to make both sides actually match. It should be much more
difficult to introduce interface mismatch bugs now.
Two bugs this change would have prevented:
(a) The interface mismatch caused by inconsistent branching which led
to GPU acceleration crashing on Windows. With this change, we
would have caught the problem at compile time.
(b) The emulator verbose log has always been printing "Can't start
OpenGLES renderer?" even when the renderer started fine. This is
because the renderer was returning a bool (true == success) but
the emulator's declaration said it returned int, and the emulator
assumed 0 meant success. This difference in return type should now
be caught at compile time.
Change-Id: Iab3b6960e221edd135b515a166cf991b62bb60c9
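Bug (b) boils down to a return-type mismatch that a shared header would have made a compile error; a hedged illustration (function names invented for the example):

```cpp
#include <cassert>

// Invented names for illustration. The renderer's real function
// returned bool, with true meaning success:
bool startRenderer() { return true; }

// The emulator's duplicated prototype declared the function as
// returning int and treated 0 as success, so a successful 'true'
// (which converts to 1) was reported as a failure:
bool emulatorThinksItFailed(int ret) { return ret != 0; }
```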