author | Tim Murray <timmurray@google.com> | 2013-03-27 21:32:31 +0000
committer | Android (Google) Code Review <android-gerrit@google.com> | 2013-03-27 21:32:31 +0000
commit | 8ff0201ad0354b6c43aebac6075298ee847c42ef (patch)
tree | 364cd6fecc6be3ea5192ce9cdfaa1f2fe9b4cc97 /docs
parent | 8565520b85bd42bc57adc551c4a72bc3f5b0025a (diff)
parent | 275812c2bb09a82efd7ea8d90b57c99ff51eab0a (diff)
download | frameworks_base-8ff0201ad0354b6c43aebac6075298ee847c42ef.zip frameworks_base-8ff0201ad0354b6c43aebac6075298ee847c42ef.tar.gz frameworks_base-8ff0201ad0354b6c43aebac6075298ee847c42ef.tar.bz2
Merge "Revert "Remove all public mention of RS graphics from docs."" into jb-mr2-dev
Diffstat (limited to 'docs')
-rw-r--r-- | docs/html/about/versions/android-4.0.jd | 54
-rw-r--r-- | docs/html/guide/topics/graphics/renderscript/graphics.jd | 994
2 files changed, 1041 insertions, 7 deletions
diff --git a/docs/html/about/versions/android-4.0.jd b/docs/html/about/versions/android-4.0.jd index 868227a..f2fd0c4 100644 --- a/docs/html/about/versions/android-4.0.jd +++ b/docs/html/about/versions/android-4.0.jd @@ -122,7 +122,7 @@ to invoke an action that indicates the user wants to add a contact to a social n receiving the app uses it to invite the specified contact to that social network. Most apps will be on the receiving-end of this operation. For example, the built-in People app invokes the invite intent when the user selects "Add connection" for a specific -social app that's listed in a person's contact details.</p> +social app that's listed in a person's contact details.</p> <p>To make your app visible as in the "Add connection" list, your app must provide a sync adapter to sync contact information from your social network. You must then indicate to the system that your @@ -327,7 +327,7 @@ image (usually done by calling the {@link android.opengl.GLES20#glTexImage2D glT function). You may provide multiple mipmap levels. If the output texture has not been bound to a texture image, it will be automatically bound by the effect as a {@link android.opengl.GLES20#GL_TEXTURE_2D} and with one mipmap level (0), which will have the same -size as the input.</p> +size as the input.</p> <p>All effects listed in {@link android.media.effect.EffectFactory} are guaranteed to be supported. However, some additional effects available from external libraries are not supported by all devices, @@ -452,7 +452,7 @@ android.hardware.Camera.Parameters#getMaxNumDetectedFaces()} and ensure the retu value is greater than zero. Also, some devices may not support identification of eyes and mouth, in which case, those fields in the {@link android.hardware.Camera.Face} object will be null.</p> - + <h4>Focus and metering areas</h4> <p>Camera apps can now control the areas that the camera uses for focus and for metering white @@ -495,7 +495,7 @@ added in API level 9.</p> <h4>Other camera features</h4> -<ul> +<ul> <li>While recording video, you can now call {@link android.hardware.Camera#takePicture takePicture()} to save a photo without interrupting the video session. Before doing so, you should call {@link android.hardware.Camera.Parameters#isVideoSnapshotSupported} to be sure the hardware @@ -775,7 +775,7 @@ methods that allow the view and its parents to add more contextual information t <li>When invoked, the {@link android.view.View#sendAccessibilityEvent sendAccessibilityEvent()} and {@link android.view.View#sendAccessibilityEventUnchecked sendAccessibilityEventUnchecked()} methods defer -to {@link android.view.View#onInitializeAccessibilityEvent onInitializeAccessibilityEvent()}. +to {@link android.view.View#onInitializeAccessibilityEvent onInitializeAccessibilityEvent()}. 
<p>Custom implementations of {@link android.view.View} might want to implement {@link android.view.View#onInitializeAccessibilityEvent onInitializeAccessibilityEvent()} to attach additional accessibility information to the {@link @@ -1022,6 +1022,46 @@ roaming or connected to Wi-Fi.</p> +<h3 id="RenderScript">RenderScript</h3> + +<p>Three major features have been added to RenderScript:</p> + +<ul> + <li>Off-screen rendering to a framebuffer object</li> + <li>Rendering inside a view</li> + <li>RS for each from the framework APIs</li> +</ul> + +<p>The {@link android.renderscript.Allocation} class now supports a {@link +android.renderscript.Allocation#USAGE_GRAPHICS_RENDER_TARGET} memory space, which allows you to +render things directly into the {@link android.renderscript.Allocation} and use it as a framebuffer +object.</p> + +<p>{@link android.renderscript.RSTextureView} provides a means to display RenderScript graphics +inside of a {@link android.view.View}, unlike {@link android.renderscript.RSSurfaceView}, which +creates a separate window. This key difference allows you to do things such as move, transform, or +animate an {@link android.renderscript.RSTextureView} as well as draw RenderScript graphics inside +a view that lies within an activity layout.</p> + +<p>The {@link android.renderscript.Script#forEach Script.forEach()} method allows you to call +RenderScript compute scripts from the VM level and have them automatically delegated to available +cores on the device. You do not use this method directly, but any compute RenderScript that you +write will have a {@link android.renderscript.Script#forEach forEach()} method that you can call in +the reflected RenderScript class. You can call the reflected {@link +android.renderscript.Script#forEach forEach()} method by passing in an input {@link +android.renderscript.Allocation} to process, an output {@link android.renderscript.Allocation} to +write the result to, and a {@link android.renderscript.FieldPacker} data structure in case the +RenderScript needs more information. 
Only one of the {@link android.renderscript.Allocation}s is +necessary and the data structure is optional.</p> + + + + + + + + + <h3 id="Enterprise">Enterprise</h3> <p>Android 4.0 expands the capabilities for enterprise application with the following features.</p> @@ -1718,7 +1758,7 @@ href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targe notes for more information.</li> </ul> </dd> - + <dt><a href="android-3.1.html">Android 3.1</a></dt> <dd> <ul> @@ -1741,7 +1781,7 @@ android.net.rtp} documentation.</li> notes for many more new APIs.</li> </ul> </dd> - + <dt><a href="android-3.2.html">Android 3.2</a></dt> <dd> <ul> diff --git a/docs/html/guide/topics/graphics/renderscript/graphics.jd b/docs/html/guide/topics/graphics/renderscript/graphics.jd new file mode 100644 index 0000000..58676ea --- /dev/null +++ b/docs/html/guide/topics/graphics/renderscript/graphics.jd @@ -0,0 +1,994 @@ +page.title=Graphics +parent.title=Renderscript +parent.link=index.html + +@jd:body + + <div id="qv-wrapper"> + <div id="qv"> + <h2>In this document</h2> + + <ol> + <li> + <a href="#creating-graphics-rs">Creating a Graphics Renderscript</a> + <ol> + <li><a href="#creating-native">Creating the Renderscript file</a></li> + <li><a href="#creating-entry">Creating the Renderscript entry point class</a></li> + <li><a href="#creating-view">Creating the view class</a></li> + <li><a href="#creating-activity">Creating the activity class</a></li> + </ol> + </li> + <li> + <a href="#drawing">Drawing</a> + <ol> + <li><a href="#drawing-rsg">Simple drawing</a></li> + <li><a href="#drawing-mesh">Drawing with a mesh</a></li> + </ol> + </li> + <li> + <a href="#shaders">Shaders</a> + <ol> + <li><a href="#shader-bindings">Shader bindings</a></li> + <li><a href="#shader-sampler">Defining a sampler</a></li> + </ol> + </li> + <li> + <a href="#fbo">Rendering to a Framebuffer Object</a> + </li> + </ol> + + <h2>Related Samples</h2> + + <ol> + <li><a href="{@docRoot}resources/samples/RenderScript/Balls/index.html">Balls</a></li> + + <li><a href="{@docRoot}resources/samples/RenderScript/Fountain/index.html">Fountain</a></li> + + <li><a href="{@docRoot}resources/samples/RenderScript/FountainFbo/index.html">FountainFbo</a></li> + + <li><a href="{@docRoot}resources/samples/RenderScript/HelloWorld/index.html">Hello +World</a></li> + + <li><a +href="{@docRoot}resources/samples/RenderScript/MiscSamples/index.html">Misc Samples</a></li> + </ol> + </div> + </div> + + <p>Renderscript provides a number of graphics APIs for rendering, both at the Android + framework level as well as at the Renderscript runtime level. For instance, the Android framework APIs let you + create meshes and define shaders to customize the graphical rendering pipeline. The native + Renderscript graphics APIs let you draw the actual meshes to render your scene. You need to + be familiar with both APIs to appropriately render graphics on an Android-powered device.</p> + + <h2 id="creating-graphics-rs">Creating a Graphics Renderscript</h2> + + <p>Renderscript applications require various layers of code, so it is useful to create the following + files to help keep your application organized:</p> + + <dl> + <dt>The Renderscript <code>.rs</code> file</dt> + + <dd>This file contains the logic to do the graphics rendering.</dd> + + <dt>The Renderscript entry point <code>.java</code> class</dt> + + <dd>This class allows the view class to interact with the code defined in the <code>.rs</code> + file. 
This class contains a Renderscript object (instance of + <code>ScriptC_<em>renderscript_file</em></code>), which allows your Android framework code to + call the Renderscript code. In general, this class does much of the setup for Renderscript + such as shader and mesh building and memory allocation and binding. The SDK samples follow the + convention of naming this file ActivityRS.java, + where Activity is the name of your main activity class.</dd> + + <dt>The view <code>.java</code> class</dt> + + <dd>This class extends {@link android.renderscript.RSSurfaceView} or {@link + android.renderscript.RSTextureView} to provide a surface to render on. A {@link + android.renderscript.RSSurfaceView} consumes a whole window, but a {@link + android.renderscript.RSTextureView} allows you to draw Renderscript graphics inside of a + view and add it to a {@link android.view.ViewGroup} alongside + other views. In this class, you create a {@link android.renderscript.RenderScriptGL} context object + with a call to {@link android.renderscript.RSSurfaceView#createRenderScriptGL + RSSurfaceView.createRenderscriptGL()} or {@link android.renderscript.RSTextureView#createRenderScriptGL + RSTextureView.createRenderscriptGL()}. The {@link android.renderscript.RenderScriptGL} context object + contains information about the current rendering state of Renderscript such as the vertex and + fragment shaders. You pass this context object to the Renderscript entry point class, so that + class can modify the rendering context if needed and bind the Renderscript code to the context. Once bound, + the view class can use the Renderscript code to display graphics. + The view class should also implement callbacks for events inherited from {@link + android.view.View}, such as {@link android.view.View#onTouchEvent onTouchEvent()} and {@link + android.view.View#onKeyDown onKeyDown()} if you want to detect these types of user interactions. + The SDK samples follow the convention of naming this file ActivityView.java, + where Activity is the name of your main activity class</dd> + + <dt>The activity <code>.java</code> class</dt> + + <dd>This class is the main activity class and sets your {@link android.renderscript.RSSurfaceView} as the main content + view for this activity or uses the {@link android.renderscript.RSTextureView} alongside other views.</dd> + </dl> + <p>Figure 1 describes how these classes interact with one another in a graphics Renderscript:</p> + + <img src="{@docRoot}images/rs_graphics.png"> + <p class="img-caption"><strong>Figure 1.</strong> Graphics Renderscript overview</p> + + + <p>The following sections describe how to create an application that uses a graphics Renderscript by using + the <a href="{@docRoot}resources/samples/RenderScript/Fountain/index.html">Renderscript Fountain + sample</a> that is provided in the SDK as a guide (some code has been modified from its original + form for simplicity).</p> + + <h3 id="creating-native">Creating the Renderscript file</h3> + + <p>Your Renderscript code resides in <code>.rs</code> and <code>.rsh</code> (headers) files in the + <code><project_root>/src/</code> directory. This code contains the logic to render your + graphics and declares all other necessary items such as variables, structs, + and pointers. 
Every graphics <code>.rs</code> file generally contains the following items:</p> + + <ul> + <li>A pragma declaration (<code>#pragma rs java_package_name(<em>package.name</em>)</code>) that declares + the package name of the <code>.java</code> reflection of this Renderscript.</li> + + <li>A pragma declaration (<code>#pragma version(1)</code>) that declares the version of Renderscript that + you are using (1 is the only value for now).</li> + + <li>A <code>#include "rs_graphics.rsh"</code> declaration.</li> + + <li>A <code>root()</code> function. This is the main worker function for your Renderscript and + calls Renderscript graphics functions to render scenes. This function is called every time a + frame refresh occurs, which is specified as its return value. A <code>0</code> (zero) specified for + the return value says to only render the frame when a property of the scene that you are + rendering changes. A non-zero positive integer specifies the refresh rate of the frame in + milliseconds. + + <p class="note"><strong>Note:</strong> The Renderscript runtime makes its best effort to + refresh the frame at the specified rate. For example, if you are creating a live wallpaper + and set the return value to 20, the Renderscript runtime renders the wallpaper at 50fps if it has just + enough or more resources to do so. It renders as fast as it can if not enough resources + are available.</p> + + <p>For more information on using the Renderscript graphics functions, see the <a href= + "#drawing">Drawing</a> section.</p> + </li> + + <li>An <code>init()</code> function. This allows you to do initialization of your + Renderscript before the <code>root()</code> function runs, such as assigning values to variables. This + function runs once and is called automatically when the Renderscript starts, before anything + else in your Renderscript. 
Creating this function is optional.</li> + + <li>Any variables, pointers, and structures that you wish to use in your Renderscript code (can + be declared in <code>.rsh</code> files if desired)</li> + </ul> + + <p>The following code shows how the <code>fountain.rs</code> file is implemented:</p> + <pre> +#pragma version(1) + +// Tell which java package name the reflected files should belong to +#pragma rs java_package_name(com.example.android.rs.fountain) + +//declare shader binding +#pragma stateFragment(parent) + +// header with graphics APIs, must include explicitly +#include "rs_graphics.rsh" + +static int newPart = 0; + +// the mesh to render +rs_mesh partMesh; + +// the point representing where a particle is rendered +typedef struct __attribute__((packed, aligned(4))) Point { + float2 delta; + float2 position; + uchar4 color; +} Point_t; +Point_t *point; + +// main worker function that renders particles onto the screen +int root() { + float dt = min(rsGetDt(), 0.1f); + rsgClearColor(0.f, 0.f, 0.f, 1.f); + const float height = rsgGetHeight(); + const int size = rsAllocationGetDimX(rsGetAllocation(point)); + float dy2 = dt * (10.f); + Point_t * p = point; + for (int ct=0; ct < size; ct++) { + p->delta.y += dy2; + p->position += p->delta; + if ((p->position.y > height) && (p->delta.y > 0)) { + p->delta.y *= -0.3f; + } + p++; + } + + rsgDrawMesh(partMesh); + return 1; +} + +// adds particles to the screen to render +static float4 partColor[10]; +void addParticles(int rate, float x, float y, int index, bool newColor) +{ + if (newColor) { + partColor[index].x = rsRand(0.5f, 1.0f); + partColor[index].y = rsRand(1.0f); + partColor[index].z = rsRand(1.0f); + } + float rMax = ((float)rate) * 0.02f; + int size = rsAllocationGetDimX(rsGetAllocation(point)); + uchar4 c = rsPackColorTo8888(partColor[index]); + + Point_t * np = &point[newPart]; + float2 p = {x, y}; + while (rate--) { + float angle = rsRand(3.14f * 2.f); + float len = rsRand(rMax); + np->delta.x = len * sin(angle); + np->delta.y = len * cos(angle); + np->position = p; + np->color = c; + newPart++; + np++; + if (newPart >= size) { + newPart = 0; + np = &point[newPart]; + } + } +} +</pre> + + <h3 id="creating-entry">Creating the Renderscript entry point class</h3> + + <p>When you create a Renderscript (<code>.rs</code>) file, it is helpful to create a + corresponding Android framework class that is an entry point into the <code>.rs</code> file. + The most important thing this class does is receive a {@link android.renderscript.RenderScriptGL} rendering context + object from the <a href="#creating-view">view class</a> and binds the actual Renderscript + code to the rendering context. This notifies your view class of the code that it needs + to render graphics. + </p> + + <p>In addition, this class should contain all of the things needed to set up Renderscript. + Some important things that you need to do in this class are:</p> + + <ul> + <li>Create a Renderscript object + <code>ScriptC_<em>rs_filename</em></code>. The Renderscript object is attached to the Renderscript bytecode, which is platform-independent and + gets compiled on the device when the Renderscript application runs. The bytecode is referenced + as a raw resource and is passed into the constructor for the Renderscript object. 
+ For example, this is how the <a href="{@docRoot}resources/samples/RenderScript/Fountain/index.html">Fountain</a> + sample creates the Renderscript object: + <pre> + RenderScriptGL rs; //obtained from the view class + Resources res; //obtained from the view class + ... + ScriptC_fountain mScript = new ScriptC_fountain(mRS, mRes, R.raw.fountain); + </pre> + </li> + <li>Allocate any necessary memory and bind it to your Renderscript code via the Renderscript object.</li> + <li>Build any necessary meshes and bind them to the Renderscript code via the Renderscript object.</li> + <li>Create any necessary programs and bind them to the Renderscript code via the Renderscript object.</li> + </ul> + + <p>The following code shows how the <a href= + "{@docRoot}resources/samples/RenderScript/Fountain/src/com/example/android/rs/fountain/FountainRS.html"> + FountainRS</a> class is implemented:</p> + <pre> +package com.example.android.rs.fountain; + +import android.content.res.Resources; +import android.renderscript.*; +import android.util.Log; + +public class FountainRS { + public static final int PART_COUNT = 50000; + + public FountainRS() { + } + + /** + * This provides us with the Renderscript context and resources + * that allow us to create the Renderscript object + */ + private Resources mRes; + private RenderScriptGL mRS; + + // Renderscript object + private ScriptC_fountain mScript; + + // Called by the view class to initialize the Renderscript context and renderer + public void init(RenderScriptGL rs, Resources res) { + mRS = rs; + mRes = res; + + /** + * Create a shader and bind to the Renderscript context + */ + ProgramFragmentFixedFunction.Builder pfb = new ProgramFragmentFixedFunction.Builder(rs); + pfb.setVaryingColor(true); + rs.bindProgramFragment(pfb.create()); + + /** + * Allocate memory for the particles to render and create the mesh to draw + */ + ScriptField_Point points = new ScriptField_Point(mRS, PART_COUNT); + Mesh.AllocationBuilder smb = new Mesh.AllocationBuilder(mRS); + smb.addVertexAllocation(points.getAllocation()); + smb.addIndexSetType(Mesh.Primitive.POINT); + Mesh sm = smb.create(); + + /** + * Create and bind the Renderscript object to the Renderscript context + */ + mScript = new ScriptC_fountain(mRS, mRes, R.raw.fountain); + mScript.set_partMesh(sm); + mScript.bind_point(points); + mRS.bindRootScript(mScript); + } + + boolean holdingColor[] = new boolean[10]; + + /** + * Calls Renderscript functions (invoke_addParticles) + * via the Renderscript object to add particles to render + * based on where a user touches the screen. + */ + public void newTouchPosition(float x, float y, float pressure, int id) { + if (id >= holdingColor.length) { + return; + } + int rate = (int)(pressure * pressure * 500.f); + if (rate > 500) { + rate = 500; + } + if (rate > 0) { + mScript.invoke_addParticles(rate, x, y, id, !holdingColor[id]); + holdingColor[id] = true; + } else { + holdingColor[id] = false; + } + + } +} +</pre> + + + <h3 id="creating-view">Creating the view class</h3> + + + <p>To display graphics, you need a view to render on. Create a class that extends {@link + android.renderscript.RSSurfaceView} or {@link android.renderscript.RSTextureView}. This class + allows you to create a {@link android.renderscript.RenderScriptGL} context object by calling and + pass it to the Rendscript entry point class to bind the two. Once bound, the content is aware + of the code that it needs to use to render graphics with. 
If your Renderscript code + depends on any type of information that the view is aware of, such as touches from the user, + you can also use this class to relay that information to the Renderscript entry point class. + The following code shows how the <code>FountainView</code> class is implemented:</p> + <pre> +package com.example.android.rs.fountain; + +import android.renderscript.RSTextureView; +import android.renderscript.RenderScriptGL; +import android.content.Context; +import android.view.MotionEvent; + +public class FountainView extends RSTextureView { + + public FountainView(Context context) { + super(context); + } + // Renderscript context + private RenderScriptGL mRS; + // Renderscript entry point object that calls Renderscript code + private FountainRS mRender; + + /** + * Create Renderscript context and initialize Renderscript entry point + */ + @Override + protected void onAttachedToWindow() { + super.onAttachedToWindow(); + android.util.Log.e("rs", "onAttachedToWindow"); + if (mRS == null) { + RenderScriptGL.SurfaceConfig sc = new RenderScriptGL.SurfaceConfig(); + mRS = createRenderScriptGL(sc); + mRender = new FountainRS(); + mRender.init(mRS, getResources()); + } + } + + @Override + protected void onDetachedFromWindow() { + super.onDetachedFromWindow(); + android.util.Log.e("rs", "onDetachedFromWindow"); + if (mRS != null) { + mRS = null; + destroyRenderScriptGL(); + } + } + + + /** + * Use callbacks to relay data to Renderscript entry point class + */ + @Override + public boolean onTouchEvent(MotionEvent ev) + { + int act = ev.getActionMasked(); + if (act == ev.ACTION_UP) { + mRender.newTouchPosition(0, 0, 0, ev.getPointerId(0)); + return false; + } else if (act == MotionEvent.ACTION_POINTER_UP) { + // only one pointer going up, we can get the index like this + int pointerIndex = ev.getActionIndex(); + int pointerId = ev.getPointerId(pointerIndex); + mRender.newTouchPosition(0, 0, 0, pointerId); + } + int count = ev.getHistorySize(); + int pcount = ev.getPointerCount(); + + for (int p=0; p < pcount; p++) { + int id = ev.getPointerId(p); + mRender.newTouchPosition(ev.getX(p), + ev.getY(p), + ev.getPressure(p), + id); + + for (int i=0; i < count; i++) { + mRender.newTouchPosition(ev.getHistoricalX(p, i), + ev.getHistoricalY(p, i), + ev.getHistoricalPressure(p, i), + id); + } + } + return true; + } +} +</pre> + + <h3 id="creating-activity">Creating the activity class</h3> + + <p>Applications that use Renderscript still behave like normal Android applications, so you + need an activity class that handles activity lifecycle callback events appropriately. 
The activity class + also sets your {@link android.renderscript.RSSurfaceView} view class to be the main content view of the + activity or uses your {@link android.renderscript.RSTextureView} + in a {@link android.view.ViewGroup} alongside other views.</p> + + <p>The following code shows how the <a href="{@docRoot}resources/samples/RenderScript/Fountain/index.html">Fountain</a> + sample declares its activity class:</p> + <pre> +package com.example.android.rs.fountain; + +import android.app.Activity; +import android.os.Bundle; +import android.util.Log; + +public class Fountain extends Activity { + + private static final String LOG_TAG = "libRS_jni"; + private static final boolean DEBUG = false; + private static final boolean LOG_ENABLED = false; + + private FountainView mView; + + @Override + public void onCreate(Bundle icicle) { + super.onCreate(icicle); + + // Create our Preview view and set it as + // the content of our activity + mView = new FountainView(this); + setContentView(mView); + } + + @Override + protected void onResume() { + Log.e("rs", "onResume"); + + // Ideally a game should implement onResume() and onPause() + // to take appropriate action when the activity looses focus + super.onResume(); + mView.resume(); + } + + @Override + protected void onPause() { + Log.e("rs", "onPause"); + + // Ideally a game should implement onResume() and onPause() + // to take appropriate action when the activity looses focus + super.onPause(); + mView.pause(); + + } + + static void log(String message) { + if (LOG_ENABLED) { + Log.v(LOG_TAG, message); + } + } +} +</pre> + +<p>Now that you have an idea of what is involved in a Renderscript graphics application, you can +start building your own. It might be easiest to begin with one of the +<a href="{@docRoot}resources/samples/RenderScript/index.html">Renderscript samples</a> as a starting +point if this is your first time using Renderscript.</p> + + <h2 id="drawing">Drawing</h2> + <p>The following sections describe how to use the graphics functions to draw with Renderscript.</p> + + <h3 id="drawing-rsg">Simple drawing</h3> + + <p>The native Renderscript APIs provide a few convenient functions to easily draw a polygon or text to + the screen. You call these in your <code>root()</code> function to have them render to the {@link + android.renderscript.RSSurfaceView} or {@link android.renderscript.RSTextureView}. These functions are + available for simple drawing and should not be used for complex graphics rendering:</p> + + <ul> + <li><code>rsgDrawRect()</code>: Sets up a mesh and draws a rectangle to the screen. It uses the + top left vertex and bottom right vertex of the rectangle to draw.</li> + + <li><code>rsgDrawQuad()</code>: Sets up a mesh and draws a quadrilateral to the screen.</li> + + <li><code>rsgDrawQuadTexCoords()</code>: Sets up a mesh and draws a quadrilateral to the screen + using the provided coordinates of a texture.</li> + + <li><code>rsgDrawText()</code>: Draws specified text to the screen. Use <code>rsgFontColor()</code> + to set the color of the text.</li> + </ul> + + <h3 id="drawing-mesh">Drawing with a mesh</h3> + + <p>When you want to render complex scenes to the screen, instantiate a {@link + android.renderscript.Mesh} and draw it with <code>rsgDrawMesh()</code>. A {@link + android.renderscript.Mesh} is a collection of allocations that represent vertex data (positions, + normals, texture coordinates) and index data that provides information on how to draw triangles + and lines with the provided vertex data. 
You can build a Mesh in three different ways:</p> + + <ul> + <li>Build the mesh with the {@link android.renderscript.Mesh.TriangleMeshBuilder} class, which + allows you to specify a set of vertices and indices for each triangle that you want to draw.</li> + + <li>Build the mesh using an {@link android.renderscript.Allocation} or a set of {@link + android.renderscript.Allocation}s with the {@link android.renderscript.Mesh.AllocationBuilder} + class. This approach allows you to build a mesh with vertices already stored in memory, which allows you + to specify the vertices in Renderscript or Android framework code.</li> + + <li>Build the mesh with the {@link android.renderscript.Mesh.Builder} class. You should use + this convenience method when you know the data types you want to use to build your mesh, but + don't want to make separate memory allocations like with {@link + android.renderscript.Mesh.AllocationBuilder}. You can specify the types that you want and this + mesh builder automatically creates the memory allocations for you.</li> + </ul> + + <p>To create a mesh using the {@link android.renderscript.Mesh.TriangleMeshBuilder}, you need to + supply it with a set of vertices and the indices for the vertices that comprise the triangle. For + example, the following code specifies three vertices, which are added to an internal array, + indexed in the order they were added. The call to {@link + android.renderscript.Mesh.TriangleMeshBuilder#addTriangle addTriangle()} draws the triangle with + vertex 0, 1, and 2 (the vertices are drawn counter-clockwise).</p> + <pre> +int float2VtxSize = 2; +Mesh.TriangleMeshBuilder triangles = new Mesh.TriangleMeshBuilder(renderscriptGL, +float2VtxSize, Mesh.TriangleMeshBuilder.COLOR); +triangles.addVertex(300.f, 300.f); +triangles.addVertex(150.f, 450.f); +triangles.addVertex(450.f, 450.f); +triangles.addTriangle(0 , 1, 2); +Mesh smP = triangle.create(true); +script.set_mesh(smP); +</pre> + + <p>To draw a mesh using the {@link android.renderscript.Mesh.AllocationBuilder}, you need to + supply it with one or more allocations that contain the vertex data:</p> + <pre> +Allocation vertices; + +... +Mesh.AllocationBuilder triangle = new Mesh.AllocationBuilder(mRS); +smb.addVertexAllocation(vertices.getAllocation()); +smb.addIndexSetType(Mesh.Primitive.TRIANGLE); +Mesh smP = smb.create(); +script.set_mesh(smP); +</pre> + + <p>In your Renderscript code, draw the built mesh to the screen:</p> + <pre> +rs_mesh mesh; +... + +int root(){ +... +rsgDrawMesh(mesh); +... +return 0; //specify a non zero, positive integer to specify the frame refresh. + //0 refreshes the frame only when the mesh changes. +} +</pre> + + <h2 id="shader">Programs</h2> + + <p>You can attach four program objects to the {@link android.renderscript.RenderScriptGL} context + to customize the rendering pipeline. For example, you can create vertex and fragment shaders in + GLSL or build a raster program object that controls culling. The four programs mirror a + traditional graphical rendering pipeline:</p> + + <table> + <tr> + <th>Android Object Type</th> + + <th>Renderscript Native Type</th> + + <th>Description</th> + </tr> + + <tr> + <td>{@link android.renderscript.ProgramVertex}</td> + + <td>rs_program_vertex</td> + + <td> + <p>The Renderscript vertex program, also known as a vertex shader, describes the stage in + the graphics pipeline responsible for manipulating geometric data in a user-defined way. 
+ The object is constructed by providing Renderscript with the following data:</p> + + <ul> + <li>An {@link android.renderscript.Element} describing its varying inputs or attributes</li> + + <li>GLSL shader string that defines the body of the program</li> + + <li>a {@link android.renderscript.Type} that describes the layout of an + Allocation containing constant or uniform inputs</li> + </ul> + + <p>Once the program is created, bind it to the {@link android.renderscript.RenderScriptGL} + graphics context by calling {@link android.renderscript.RenderScriptGL#bindProgramVertex + bindProgramVertex()}. It is then used for all subsequent draw calls until you bind a new + program. If the program has constant inputs, the user needs to bind an allocation + containing those inputs. The allocation's type must match the one provided during creation. + </p> + + <p>The Renderscript runtime then does all the necessary plumbing to send those constants to + the graphics hardware. Varying inputs to the shader, such as position, normal, and texture + coordinates are matched by name between the input {@link android.renderscript.Element} + and the mesh object that is being drawn. The signatures don't have to be exact or in any + strict order. As long as the input name in the shader matches a channel name and size + available on the mesh, the Renderscript runtime handles connecting the two. Unlike OpenGL + there is no need to link the vertex and fragment programs.</p> + + <p>To bind shader constants to the program, declare a <code>struct</code> that contains the necessary + shader constants in your Renderscript code. This <code>struct</code> is generated into a + reflected class that you can use as a constant input element during the program's creation. + It is an easy way to create an instance of this <code>struct</code> as an allocation. You would then + bind this {@link android.renderscript.Allocation} to the program and the + Renderscript runtime sends the data that is contained in the <code>struct</code> to the hardware + when necessary. To update shader constants, you change the values in the + {@link android.renderscript.Allocation} and notify the Renderscript + code of the change.</p> + + <p>The {@link android.renderscript.ProgramVertexFixedFunction.Builder} class also + lets you build a simple vertex shader without writing GLSL code. + </p> + </td> + </tr> + + <tr> + <td>{@link android.renderscript.ProgramFragment}</td> + + <td>rs_program_fragment</td> + + <td> + <p>The Renderscript fragment program, also known as a fragment shader, is responsible for + manipulating pixel data in a user-defined way. It's constructed from a GLSL shader string + containing the program body, texture inputs, and a {@link android.renderscript.Type} + object that describes the constants + used by the program. Like the vertex programs, when an {@link android.renderscript.Allocation} + with constant input + values is bound to the shader, its values are sent to the graphics program automatically. + Note that the values inside the {@link android.renderscript.Allocation} are not explicitly tracked. + If they change between two draw calls using the same program object, notify the runtime of that change by + calling <code>rsgAllocationSyncAll()</code>, so it can send the new values to hardware. Communication + between the vertex and fragment programs is handled internally in the GLSL code. 
For + example, if the fragment program is expecting a varying input called <code>varTex0</code>, the GLSL code + inside the program vertex must provide it.</p> + + <p>To bind shader constructs to the program, declare a <code>struct</code> that contains the necessary + shader constants in your Renderscript code. This <code>struct</code> is generated into a + reflected class that you can use as a constant input element during the program's creation. + It is an easy way to create an instance of this <code>struct</code> as an allocation. You would then + bind this {@link android.renderscript.Allocation} to the program and the + Renderscript runtime sends the data that is contained in the <code>struct</code> to the hardware + when necessary. To update shader constants, you change the values in the + {@link android.renderscript.Allocation} and notify the Renderscript + code of the change.</p> + + <p>The {@link android.renderscript.ProgramFragmentFixedFunction.Builder} class also + lets you build a simple fragment shader without writing GLSL code. + </p> + </td> + </tr> + + <tr> + <td>{@link android.renderscript.ProgramStore}</td> + + <td>rs_program_store</td> + + <td>The Renderscript store program contains a set of parameters that control how the graphics + hardware writes to the framebuffer. It could be used to enable and disable depth writes and + testing, setup various blending modes for effects like transparency and define write masks + for color components.</td> + </tr> + + <tr> + <td>{@link android.renderscript.ProgramRaster}</td> + + <td>rs_program_raster</td> + + <td>The Renderscript raster program is primarily used to specify whether point sprites are enabled and to + control the culling mode. By default back faces are culled.</td> + </tr> + </table> + + <p>The following example defines a vertex shader in GLSL and binds it to a Renderscript context object:</p> + <pre> + private RenderScriptGL glRenderer; //rendering context + private ScriptField_Point mPoints; //vertices + private ScriptField_VpConsts mVpConsts; //shader constants + + ... + + ProgramVertex.Builder sb = new ProgramVertex.Builder(glRenderer); + String t = "varying vec4 varColor;\n" + + "void main() {\n" + + " vec4 pos = vec4(0.0, 0.0, 0.0, 1.0);\n" + + " pos.xy = ATTRIB_position;\n" + + " gl_Position = UNI_MVP * pos;\n" + + " varColor = vec4(1.0, 1.0, 1.0, 1.0);\n" + + " gl_PointSize = ATTRIB_size;\n" + + "}\n"; + sb.setShader(t); + sb.addConstant(mVpConsts.getType()); + sb.addInput(mPoints.getElement()); + ProgramVertex pvs = sb.create(); + pvs.bindConstants(mVpConsts.getAllocation(), 0); + glRenderer.bindProgramVertex(pvs); +</pre> + + + <p>The <a href= + "{@docRoot}resources/samples/RenderScript/MiscSamples/src/com/example/android/rs/miscsamples/RsRenderStatesRS.html"> + RsRenderStatesRS</a> sample has many examples on how to create a shader without writing GLSL.</p> + + <h3 id="shader-bindings">Program bindings</h3> + + <p>You can also declare four pragmas that control default program bindings to the {@link + android.renderscript.RenderScriptGL} context when the script is executing:</p> + + <ul> + <li><code>stateVertex</code></li> + + <li><code>stateFragment</code></li> + + <li><code>stateRaster</code></li> + + <li><code>stateStore</code></li> + </ul> + + <p>The possible values for each pragma are <code>parent</code> or <code>default</code>. 
Using + <code>default</code> binds the shaders to the graphical context with the system defaults.</p> + + <p>Using <code>parent</code> binds the shaders in the same manner as it is bound in the calling + script. If this is the root script, the parent state is taken from the bind points that are set + by the {@link android.renderscript.RenderScriptGL} bind methods.</p> + + <p>For example, you can define this at the top of your graphics Renderscript code to have + the vertex and store programs inherent the bind properties from their parent scripts:</p> + <pre> +#pragma stateVertex(parent) +#pragma stateStore(parent) +</pre> + + <h3 id="shader-sampler">Defining a sampler</h3> + + <p>A {@link android.renderscript.Sampler} object defines how data is extracted from textures. + Samplers are bound to a {@link android.renderscript.ProgramFragment} alongside the texture + whose sampling they control. These + objects are used to specify such things as edge clamping behavior, whether mip-maps are used, and + the amount of anisotropy required. There might be situations where hardware does not support the + desired behavior of the sampler. In these cases, the Renderscript runtime attempts to provide the + closest possible approximation. For example, the user requested 16x anisotropy, but only 8x was + set because it's the best available on the hardware.</p> + + <p>The <a href= + "{@docRoot}resources/samples/RenderScript/MiscSamples/src/com/example/android/rs/miscsamples/RsRenderStatesRS.html"> + RsRenderStatesRS</a> sample has many examples on how to create a sampler and bind it to a + Fragment program.</p> + + + +<h2 id="fbo">Rendering to a Framebuffer Object</h2> + +<p>Framebuffer objects allow you to render offscreen instead of in the default onscreen +framebuffer. This approach might be useful for situations where you need to post-process a texture before +rendering it to the screen, or when you want to composite two scenes in one such as rendering a rear-view +mirror of a car. There are two buffers associated with a framebuffer object: a color buffer +and a depth buffer. The color buffer (required) contains the actual pixel data of the scene +that you are rendering, and the depth buffer (optional) contains the values necessary to figure +out what vertices are drawn depending on their z-values.</p> + +<p>In general, you need to do the following to render to a framebuffer object:</p> + +<ul> + <li>Create {@link android.renderscript.Allocation} objects for the color buffer and + depth buffer (if needed). Specify the {@link + android.renderscript.Allocation#USAGE_GRAPHICS_RENDER_TARGET} usage attribute for these + allocations to notify the Renderscript runtime to use these allocations for the framebuffer + object. For the color buffer allocation, you most likely need to declare the {@link + android.renderscript.Allocation#USAGE_GRAPHICS_TEXTURE} usage attribute + to use the color buffer as a texture, which is the most common use of the framebuffer object.</li> + + <li>Tell the Renderscript runtime to render to the framebuffer object instead of the default + framebuffer by calling <code>rsgBindColorTarget()</code> and passing it the color buffer + allocation. If applicable, call <code>rsgBindDepthTarget()</code> passing in the depth buffer + allocation as well.</li> + + <li>Render your scene normally with the <code>rsgDraw</code> functions. 
The scene will be + rendered into the color buffer instead of the default onscreen framebuffer.</li> + + <li>When done, tell the Renderscript runtime stop rendering to the color buffer and back + to the default framebuffer by calling <code>rsgClearAllRenderTargets()</code>.</li> + + <li>Create a fragment shader and bind a the color buffer to it as a texture.</li> + + <li>Render your scene to the default framebuffer. The texture will be used according + to the way you setup your fragment shader.</li> +</ul> + +<p>The following example shows you how to render to a framebuffer object by modifying the +<a href="{@docRoot}guide/resources/renderscript/Fountain/">Fountain</a> Renderscript sample. The end +result is the <a href="{@docRoot}guide/resources/renderscript/FountainFBO/">FountainFBO</a> sample. +The modifications render the exact same scene into a framebuffer object as it does the default +framebuffer. The framebuffer object is then rendered into the default framebuffer in a small +area at the top left corner of the screen.</p> + +<ol> + <li>Modify <code>fountain.rs</code> and add the following global variables. This creates setter + methods when this file is reflected into a <code>.java</code> file, allowing you to allocate + memory in your Android framework code and binding it to the Renderscript runtime. +<pre> +//allocation for color buffer +rs_allocation gColorBuffer; +//fragment shader for rendering without a texture (used for rendering to framebuffer object) +rs_program_fragment gProgramFragment; +//fragment shader for rendering with a texture (used for rendering to default framebuffer) +rs_program_fragment gTextureProgramFragment; +</pre> + </li> + + <li>Modify the root function of <code>fountain.rs</code> to look like the following code. The + modifications are commented: +<pre> +int root() { + float dt = min(rsGetDt(), 0.1f); + rsgClearColor(0.f, 0.f, 0.f, 1.f); + const float height = rsgGetHeight(); + const int size = rsAllocationGetDimX(rsGetAllocation(point)); + float dy2 = dt * (10.f); + Point_t * p = point; + for (int ct=0; ct < size; ct++) { + p->delta.y += dy2; + p->position += p->delta; + if ((p->position.y > height) && (p->delta.y > 0)) { + p->delta.y *= -0.3f; + } + p++; + } + //Tell Renderscript runtime to render to the frame buffer object + rsgBindColorTarget(gColorBuffer, 0); + //Begin rendering on a white background + rsgClearColor(1.f, 1.f, 1.f, 1.f); + rsgDrawMesh(partMesh); + + //When done, tell Renderscript runtime to stop rendering to framebuffer object + rsgClearAllRenderTargets(); + + //Bind a new fragment shader that declares the framebuffer object to be used as a texture + rsgBindProgramFragment(gTextureProgramFragment); + + //Bind the framebuffer object to the fragment shader at slot 0 as a texture + rsgBindTexture(gTextureProgramFragment, 0, gColorBuffer); + //Draw a quad using the framebuffer object as the texture + float startX = 10, startY = 10; + float s = 256; + rsgDrawQuadTexCoords(startX, startY, 0, 0, 1, + startX, startY + s, 0, 0, 0, + startX + s, startY + s, 0, 1, 0, + startX + s, startY, 0, 1, 1); + + //Rebind the original fragment shader to render as normal + rsgBindProgramFragment(gProgramFragment); + + //Render the main scene + rsgDrawMesh(partMesh); + + return 1; +} +</pre> + </li> + + <li>In the <code>FountainRS.java</code> file, modify the <code>init()</code> method to look + like the following code. 
The modifications are commented: + +<pre> +/* Add necessary members */ +private ScriptC_fountainfbo mScript; +private Allocation mColorBuffer; +private ProgramFragment mProgramFragment; +private ProgramFragment mTextureProgramFragment; + +public void init(RenderScriptGL rs, Resources res) { + mRS = rs; + mRes = res; + + ScriptField_Point points = new ScriptField_Point(mRS, PART_COUNT); + + Mesh.AllocationBuilder smb = new Mesh.AllocationBuilder(mRS); + smb.addVertexAllocation(points.getAllocation()); + smb.addIndexSetType(Mesh.Primitive.POINT); + Mesh sm = smb.create(); + + mScript = new ScriptC_fountainfbo(mRS, mRes, R.raw.fountainfbo); + mScript.set_partMesh(sm); + mScript.bind_point(points); + + ProgramFragmentFixedFunction.Builder pfb = new ProgramFragmentFixedFunction.Builder(rs); + pfb.setVaryingColor(true); + mProgramFragment = pfb.create(); + mScript.set_gProgramFragment(mProgramFragment); + + /* Second fragment shader to use a texture (framebuffer object) to draw with */ + pfb.setTexture(ProgramFragmentFixedFunction.Builder.EnvMode.REPLACE, + ProgramFragmentFixedFunction.Builder.Format.RGBA, 0); + + /* Set the fragment shader in the Renderscript runtime */ + mTextureProgramFragment = pfb.create(); + mScript.set_gTextureProgramFragment(mTextureProgramFragment); + + /* Create the allocation for the color buffer */ + Type.Builder colorBuilder = new Type.Builder(mRS, Element.RGBA_8888(mRS)); + colorBuilder.setX(256).setY(256); + mColorBuffer = Allocation.createTyped(mRS, colorBuilder.create(), + Allocation.USAGE_GRAPHICS_TEXTURE | + Allocation.USAGE_GRAPHICS_RENDER_TARGET); + + /* Set the allocation in the Renderscript runtime */ + mScript.set_gColorBuffer(mColorBuffer); + + mRS.bindRootScript(mScript); +} +</pre> + +<p class="note"><strong>Note:</strong> This sample doesn't use a depth buffer, but the following code +shows you how to declare an example depth buffer if you need to use +one for your application. The depth buffer must have the same dimensions as the color buffer: + +<pre> +Allocation mDepthBuffer; + +... + +Type.Builder b = new Type.Builder(mRS, Element.createPixel(mRS, DataType.UNSIGNED_16, + DataKind.PIXEL_DEPTH)); +b.setX(256).setY(256); +mDepthBuffer = Allocation.createTyped(mRS, b.create(), +Allocation.USAGE_GRAPHICS_RENDER_TARGET); + +</pre> +</p> +</li> + + <li>Run and use the sample. The smaller, white quad on the top-left corner is using the + framebuffer object as a texture, which renders the same scene as the main rendering.</li> +</ol> |