There are two magic classes that allow you to do this: Surface and SurfaceTexture. I consider both to be very poorly documented, and the documentation would have you believe they are quite limited in their usage, but they are actually quite powerful. For example, SurfaceTexture's documentation states: "The image stream may come from either camera preview or video decode." This made me think no other source was possible, but as I explain below, you can actually connect a Surface to a SurfaceTexture, and that opens up your possibilities a whole lot more.
The SurfaceTexture is basically your entry point into the OpenGL layer. It is initialised with an OpenGL texture id, and performs all of its rendering onto that texture.
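As a rough sketch of that first step (names like `textureId` are mine; this must run on a thread with a current OpenGL ES context), generating the texture and handing its id to a SurfaceTexture looks something like:

```java
// Generate a texture id. SurfaceTexture streams its frames into a
// GL_TEXTURE_EXTERNAL_OES target, not an ordinary GL_TEXTURE_2D.
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
int textureId = textures[0];

GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

// The SurfaceTexture will render any frames it receives onto textureId.
SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
```

Note that because the texture is an external OES texture, your fragment shader needs to sample it with `samplerExternalOES` rather than `sampler2D`.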
The Surface class provides the abstraction required to perform View drawing onto hardware. It provides a Canvas object (through the
lockCanvas method), which is basically what an Android View uses for all of its drawing (the
onDraw() method of View actually takes a Canvas). As it turns out, the Surface class takes a SurfaceTexture in its constructor.
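To make that connection concrete, here is a minimal sketch (assuming `surfaceTexture` is a SurfaceTexture you have already created) of constructing a Surface from a SurfaceTexture and drawing into its Canvas:

```java
Surface surface = new Surface(surfaceTexture);

// Passing null locks the whole surface as the dirty region.
Canvas canvas = surface.lockCanvas(null);
try {
    // Any ordinary Canvas drawing works here.
    canvas.drawColor(Color.RED);
} finally {
    // Posts the frame; the SurfaceTexture will receive it.
    surface.unlockCanvasAndPost(canvas);
}
```

The unlockCanvasAndPost call is what actually hands the finished frame over to the SurfaceTexture's image stream.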
And that's basically it. The steps to render your View to OpenGL:
- Initialise an OpenGL texture
- Within an OpenGL context construct a SurfaceTexture with the texture id. Use
SurfaceTexture.setDefaultBufferSize(int width, int height) to make sure you have enough space on the texture for the view to render.
- Create a Surface constructed with the above SurfaceTexture.
- Use the Canvas returned by Surface.lockCanvas to do the View drawing (for example, by passing it to the View's draw method), and call Surface.unlockCanvasAndPost when you're done. You can obviously do this with any View, not just WebView. Plus, Canvas has a whole bunch of drawing methods, allowing you to do funky, funky things.
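The steps above can be sketched end to end as follows. This is an assumption-laden outline, not a drop-in implementation: `view` is assumed to be already measured and laid out, and everything runs on a thread with a current OpenGL ES context.

```java
// 1. Initialise an OpenGL texture (external OES target).
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

// 2. Construct a SurfaceTexture with the texture id, sized for the view.
SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
surfaceTexture.setDefaultBufferSize(view.getWidth(), view.getHeight());

// 3. Create a Surface from the SurfaceTexture.
Surface surface = new Surface(surfaceTexture);

// 4. Draw the view into the Surface's Canvas.
Canvas canvas = surface.lockCanvas(null);
view.draw(canvas);
surface.unlockCanvasAndPost(canvas);

// Back on the GL thread: latch the newly posted frame onto the texture.
surfaceTexture.updateTexImage();
// tex[0] now holds the view's pixels and can be sampled
// with samplerExternalOES in a fragment shader.
```

One detail worth calling out: updateTexImage must be called from the thread that owns the GL context, and until you call it, the posted frame is not visible on the texture.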