The previous discussion about rendering audio and video on Android used MediaCodec, which has drawbacks such as the inability to edit video. Videos can instead be rendered with OpenGL ES, which enables richer processing such as adding filters. This article introduces OpenGL on Android, specifically OpenGL ES: a free, cross-platform, fully featured 2D/3D graphics API designed for embedded systems, carefully extracted as a subset of OpenGL. The main contents are as follows:
- Introduction
- GLSurfaceView
- Renderer
- Coordinate Mapping
- Drawing Triangles
- Drawing Effects
Introduction#
Android supports high-performance 2D and 3D graphics through the Open Graphics Library OpenGL ES. OpenGL is a cross-platform graphics API that specifies a standard software interface for 3D graphics processing hardware. OpenGL ES is a form of the OpenGL specification suitable for embedded devices. Android supports multiple versions of the OpenGL ES API, as follows:
- OpenGL ES 1.0 and 1.1 - supported on Android 1.0 and higher.
- OpenGL ES 2.0 - supported on Android 2.2 (API level 8) and higher.
- OpenGL ES 3.0 - supported on Android 4.3 (API level 18) and higher.
- OpenGL ES 3.1 - supported on Android 5.0 (API level 21) and higher.
Declare the required OpenGL ES version in AndroidManifest.xml:
<uses-feature android:glEsVersion="0x00020000" android:required="true" />
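Besides the manifest declaration, the supported version can also be queried at runtime through ActivityManager. The sketch below assumes it runs inside an Activity (Android-only code, so it only illustrates the idea):

```kotlin
// Sketch: query the device's supported OpenGL ES version at runtime.
// Assumes this runs inside an Activity (Android framework code).
val activityManager = getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
val configurationInfo = activityManager.deviceConfigurationInfo
// reqGlEsVersion packs the major version in the high 16 bits,
// so 0x20000 corresponds to OpenGL ES 2.0.
val supportsEs2 = configurationInfo.reqGlEsVersion >= 0x20000
```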
GLSurfaceView#
GLSurfaceView is the OpenGL implementation of SurfaceView, added in Android 1.5. On top of SurfaceView it adds EGL management and a built-in rendering thread, GLThread. Its main functions are as follows:
- Manages a Surface, a special block of memory that can be composited into Android's View system, meaning it can be used alongside Views.
- Manages EGL, which allows OpenGL to render onto this Surface. EGL is the bridge between Android and OpenGL.
- Accepts a user-defined Renderer object that performs the actual rendering.
- Renders on a dedicated thread.
- Supports on-demand and continuous rendering.
- Optionally wraps, traces, and/or error-checks the renderer's OpenGL calls.
EGL window, OpenGL surface, and GL surface all mean the same thing.
Common settings for GLSurfaceView are as follows:
EGL Configuration#
By default, GLSurfaceView chooses a surface with a depth buffer of at least 16 bits in PixelFormat.RGB_888 format. The default EGLConfigChooser implementation is SimpleEGLConfigChooser, as follows:
private class SimpleEGLConfigChooser extends ComponentSizeChooser {
    public SimpleEGLConfigChooser(boolean withDepthBuffer) {
        // RGB 8-8-8, no alpha, a 16-bit depth buffer if requested, no stencil
        super(8, 8, 8, 0, withDepthBuffer ? 16 : 0, 0);
    }
}
You can modify the default EGLConfig behavior as follows:
// Set the default EGLConfig depth buffer, true for 16-bit depth buffer
setEGLConfigChooser(boolean needDepth)
// Specify a custom EGLConfigChooser
setEGLConfigChooser(android.opengl.GLSurfaceView.EGLConfigChooser configChooser)
// Specify the values for each component
public void setEGLConfigChooser(int redSize, int greenSize, int blueSize,
int alphaSize, int depthSize, int stencilSize)
Rendering#
Set the renderer and start the rendering thread GLThread by calling setRenderer. There are two rendering modes:
- RENDERMODE_CONTINUOUSLY: renders repeatedly; this is the default mode, suitable for scenes that must be redrawn continuously.
- RENDERMODE_WHEN_DIRTY: renders once after the Surface is created, then renders again only when requestRender is called.
The rendering mode can be set using setRenderMode, as follows:
// Set the renderer
public void setRenderer(Renderer renderer)
// Set the rendering mode, effective only after setRenderer is called
public void setRenderMode(int renderMode)
setDebugFlags and setGLWrapper#
setDebugFlags sets debug flags for easier debugging and code tracing; the optional values are DEBUG_CHECK_GL_ERROR and DEBUG_LOG_GL_CALLS. setGLWrapper can wrap the GL interface with a custom GLWrapper to add custom behavior, as follows:
// DEBUG_CHECK_GL_ERROR: Checks every GL call, throwing an exception if glError occurs
// DEBUG_LOG_GL_CALLS: Logs GL calls at verbose level with TAG as GLSurfaceView
setDebugFlags(int debugFlags)
// For debugging and tracing code, can customize GLWrapper to wrap GL interface and return GL interface
setGLWrapper(android.opengl.GLSurfaceView.GLWrapper glWrapper)
Renderer#
This was touched on above, but here we discuss it separately. To draw on the GL surface, you implement a Renderer object that performs the actual rendering. Set the Renderer on the GLSurfaceView and specify the rendering mode as follows:
// Set the renderer object Renderer for GLSurfaceView
public void setRenderer(Renderer renderer)
// Set the rendering mode, effective only after setRenderer is called
public void setRenderMode(int renderMode)
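As a sketch of how these pieces fit together (Android-only code; the Activity name is illustrative, and MRenderer is the renderer implemented later in this article):

```kotlin
// Sketch: wiring a GLSurfaceView to a renderer inside an Activity.
// Assumes an MRenderer implementation like the one shown later.
class TriangleActivity : Activity() {

    private lateinit var glSurfaceView: GLSurfaceView

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        glSurfaceView = GLSurfaceView(this).apply {
            setEGLContextClientVersion(2)   // request an OpenGL ES 2.0 context
            setRenderer(MRenderer(context)) // creates and starts GLThread
            // Must be set after setRenderer
            renderMode = GLSurfaceView.RENDERMODE_WHEN_DIRTY
        }
        setContentView(glSurfaceView)
    }

    // Forward lifecycle events so the rendering thread pauses with the Activity
    override fun onResume() { super.onResume(); glSurfaceView.onResume() }
    override fun onPause() { super.onPause(); glSurfaceView.onPause() }
}
```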
When the renderer Renderer is set, an independent thread, GLThread, is created and started. This is the rendering thread, separate from the UI thread.
Since two threads are involved, the UI thread and the rendering thread, communication between threads is naturally required. You can use volatile and synchronized to implement it.
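As a minimal, non-Android sketch of that pattern (plain Kotlin threads stand in for GLThread, and all names here are illustrative), a @Volatile flag lets one thread signal the rendering loop while synchronized guards shared data:

```kotlin
// Sketch of UI-thread <-> render-thread communication using
// @Volatile and synchronized; names are illustrative.
class SharedRenderState {
    // Visible across threads without locking: a simple stop signal.
    @Volatile var running = true

    private val lock = Any()
    private var pendingColor: FloatArray? = null

    // Called from the "UI thread" to hand new data to the renderer.
    fun postColor(color: FloatArray) = synchronized(lock) {
        pendingColor = color
    }

    // Called from the "render thread" each frame; consumes pending data.
    fun takeColor(): FloatArray? = synchronized(lock) {
        val c = pendingColor
        pendingColor = null
        c
    }
}

fun main() {
    val state = SharedRenderState()
    val consumed = mutableListOf<FloatArray>()
    val renderThread = Thread {
        while (state.running) {
            state.takeColor()?.let { consumed.add(it) } // "draw" with the new color
        }
    }
    renderThread.start()
    state.postColor(floatArrayOf(0f, 0f, 1f, 1f)) // UI thread posts an update
    Thread.sleep(100)
    state.running = false // volatile write stops the render loop
    renderThread.join()
    println(consumed.size)
}
```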
If the UI thread needs to run an operation on the rendering thread, use GLSurfaceView's queueEvent method to execute it there; this is typically needed when customizing GLSurfaceView. Conversely, from the rendering thread you can use runOnUiThread to execute UI-related operations on the UI thread.
Now let's look at a basic implementation of the Renderer:
public class GLES20Renderer implements Renderer {
    private static final String TAG = GLES20Renderer.class.getSimpleName();

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Log.i(TAG, "onSurfaceCreated");
        GLES20.glClearColor(0.0f, 0.0f, 1.0f, 1);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.i(TAG, "onSurfaceChanged");
        GLES20.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        Log.i(TAG, "onDrawFrame");
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    }
}
Coordinate Mapping#
First, let's understand the world coordinate system of OpenGL and the corresponding texture coordinate system on Android, as shown in the following image:
When using OpenGL in Android, corresponding coordinate transformations must be performed. Next, let's look at the mapping relationship of the OpenGL coordinate system on the Android screen, as shown in the following image:
As shown in the image, the left side is the default OpenGL coordinate system, and the right side is its mapping onto the Android screen. The triangle is clearly distorted. To preserve the aspect ratio, an OpenGL projection and camera view must be applied to transform the coordinates, which involves projection and view matrices; this will be introduced in subsequent articles.
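As a taste of that correction, the sketch below derives an aspect-ratio-preserving orthographic projection in plain Kotlin, with no Android dependency. It mirrors what android.opengl.Matrix.orthoM would produce for symmetric bounds; the function name and approach are illustrative, not the article's eventual implementation:

```kotlin
// Sketch: build a column-major orthographic projection matrix that keeps
// the shorter screen axis at [-1, 1] and stretches the longer one, so
// geometry keeps its proportions on a non-square viewport.
fun aspectOrtho(width: Int, height: Int): FloatArray {
    val aspect = width.toFloat() / height
    val (right, top) = if (aspect >= 1f) aspect to 1f else 1f to (1f / aspect)
    // Symmetric ortho bounds (left = -right, bottom = -top, near = -1, far = 1)
    // reduce the general orthographic formula to simple reciprocal scales.
    return floatArrayOf(
        1f / right, 0f, 0f, 0f,
        0f, 1f / top, 0f, 0f,
        0f, 0f, -1f, 0f,
        0f, 0f, 0f, 1f
    )
}
```

For a 1920x1080 landscape surface this scales x by 1080/1920 = 0.5625 and leaves y untouched, so a triangle defined in normalized device coordinates no longer appears stretched.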
Drawing Triangles#
With the above content, we have a preliminary understanding of OpenGL on Android. Following tradition, let's do a small case: drawing a triangle with OpenGL. The Triangle class encapsulates the triangle's data and shader usage; subsequent rendering simply calls its draw method, as follows:
// Triangle
class Triangle(context: Context) {

    companion object {
        // Number of coordinates per vertex in the coordinate array
        private const val COORDINATE_PER_VERTEX = 3
    }

    private var programHandle: Int = 0
    private var positionHandle: Int = 0
    private var colorHandler: Int = 0
    private var vPMatrixHandle: Int = 0

    // Bytes per vertex: 3 coordinates * 4 bytes per float
    private var vertexStride = COORDINATE_PER_VERTEX * 4

    // The three vertices of the triangle, in counter-clockwise order
    private var triangleCoordinate = floatArrayOf(
        0.0f, 0.5f, 0.0f,   // top
        -0.5f, -0.5f, 0.0f, // bottom left
        0.5f, -0.5f, 0.0f   // bottom right
    )

    // Color array (RGBA)
    private val color = floatArrayOf(0.63671875f, 0.76953125f, 0.22265625f, 1.0f)

    private var vertexBuffer: FloatBuffer =
        // (number of coordinate values * 4 bytes per float)
        ByteBuffer.allocateDirect(triangleCoordinate.size * 4).run {
            // ByteBuffer uses native byte order
            this.order(ByteOrder.nativeOrder())
            // ByteBuffer to FloatBuffer
            this.asFloatBuffer().apply {
                put(triangleCoordinate)
                position(0)
            }
        }

    init {
        // Read shader source code
        val vertexShaderCode =
            GLUtil.readShaderSourceCodeFromRaw(context, R.raw.vertex_shader_triangle_default)
        val fragmentShaderCode =
            GLUtil.readShaderSourceCodeFromRaw(context, R.raw.fragment_shader_triangle)
        if (vertexShaderCode.isNullOrEmpty() || fragmentShaderCode.isNullOrEmpty()) {
            throw RuntimeException("vertexShaderCode or fragmentShaderCode is null or empty")
        }
        // Compile shaders
        val vertexShaderHandler = GLUtil.compileShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode)
        val fragmentShaderHandler =
            GLUtil.compileShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode)
        // Create and link the program
        programHandle = GLUtil.createAndLinkProgram(vertexShaderHandler, fragmentShaderHandler)
    }

    /**
     * Drawing method
     */
    fun draw(mvpMatrix: FloatArray) {
        GLES20.glUseProgram(programHandle)
        // Get handle to the vertex shader's vPosition attribute
        positionHandle = GLES20.glGetAttribLocation(programHandle, "vPosition").also {
            // Enable the vertex attribute (disabled by default)
            GLES20.glEnableVertexAttribArray(it)
            GLES20.glVertexAttribPointer(
                it,                    // location of the vertex attribute in the shader
                COORDINATE_PER_VERTEX, // number of components per vertex
                GLES20.GL_FLOAT,
                false,
                vertexStride,          // byte offset between consecutive vertices
                vertexBuffer
            )
        }
        // Get handle to the fragment shader's vColor uniform
        colorHandler = GLES20.glGetUniformLocation(programHandle, "vColor").also {
            GLES20.glUniform4fv(it, 1, color, 0)
        }
        // Draw the triangle
        GLES20.glDrawArrays(GLES20.GL_TRIANGLES, 0, triangleCoordinate.size / COORDINATE_PER_VERTEX)
        GLES20.glDisableVertexAttribArray(positionHandle)
    }
}
The renderer implementation is as follows:
// Renderer implementation
class MRenderer(private var context: Context) : GLSurfaceView.Renderer {

    private val tag = MRenderer::class.java.simpleName

    private lateinit var triangle: Triangle
    private val vPMatrix = FloatArray(16) // model view projection matrix
    private val projectionMatrix = FloatArray(16)
    private val viewMatrix = FloatArray(16)

    override fun onSurfaceCreated(gl: GL10?, config: EGLConfig?) {
        // Called when the Surface is created; create rendering resources here
        Log.d(tag, "onSurfaceCreated")
        triangle = Triangle(context)
    }

    override fun onSurfaceChanged(gl: GL10?, width: Int, height: Int) {
        // Called when the Surface changes size; set the viewport
        Log.d(tag, "onSurfaceChanged")
        GLES20.glViewport(0, 0, width, height)
    }

    override fun onDrawFrame(gl: GL10?) {
        // Called to draw the current frame
        Log.d(tag, "onDrawFrame")
        triangle.draw(vPMatrix)
    }
}
These are the basic drawing operations. The shader workflow will be introduced in subsequent articles, so no additional code is posted here; if you're interested, check the source code at the end of the article.
Drawing Effects#
The drawing above did not use projection and camera-view matrices for coordinate transformation, so the output may be distorted when switching between landscape and portrait; this will be corrected in the next article. The rendered result of the code above is shown in the following image:
Leave a message with the keyword 【OpenGL】 to obtain the source code.