react-native-webgl
implements WebGL 2 in React Native.
```sh
npm i --save rn-webgl2
# OR
yarn add rn-webgl2

react-native link rn-webgl2
```
IMPORTANT: you must also do the following manually:

on iOS:

In your Xcode project:
- if it is not already there, add `libRNWebGL.a` to the Linked Libraries of your project target (and remove the potentially present `libGPUImage.a` if it is not needed). There is a "bug" with `react-native link` here.
on Android:

react-native-webgl is implemented with some C++ bricks, and running `react-native link` alone is not enough to install and configure your project for Android:

- `android/local.properties`: Make sure you have an up-to-date Android NDK (needed to compile the native C++ code) and that it is properly configured, either in the `ANDROID_NDK` environment variable or in the `local.properties` file (e.g. `ndk.dir=/usr/local/share/android-ndk`).
- `android/build.gradle`: If it is not already there, add the `gradle-download-task` buildscript dependency: `classpath 'de.undercouch:gradle-download-task:3.1.2'` (see the sketch after this list). If you skip this, the `downloadJSCHeaders` task will likely fail.
- `android/app/build.gradle`: Make sure you have `minSdkVersion 17` or higher.
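The `gradle-download-task` line goes inside the `buildscript { dependencies { ... } }` block of the project-level `android/build.gradle`. A minimal sketch for reference only; the other entries and version numbers are placeholders and will already differ in your project:

```gradle
// android/build.gradle (project level), sketch only; keep your existing entries
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:2.3.3'     // placeholder version, keep yours
        classpath 'de.undercouch:gradle-download-task:3.1.2' // required by this library
    }
}
```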
This library exposes a `WebGLView` component that implements WebGL in React Native.
Basic gist:
```js
import React, { Component } from "react";
import { WebGLView } from "rn-webgl2";

class RedSquareWebGL extends Component {
  onContextCreate = (gl: WebGLRenderingContext) => {
    const rngl = gl.getExtension("RN");
    // clear the whole surface with red
    gl.clearColor(1, 0, 0, 1);
    gl.clear(gl.COLOR_BUFFER_BIT);
    // tell the implementation the frame is finished and can be presented
    rngl.endFrame();
  };
  render() {
    return (
      <WebGLView
        style={{ width: 100, height: 100 }}
        onContextCreate={this.onContextCreate}
      />
    );
  }
}
```
For a better example, see Image drawn through a Shader (vanilla WebGL). Then, feel free to use your preferred library, like https://github.com/regl-project/regl, https://github.com/mrdoob/three.js or https://github.com/gre/gl-react (`gl-react-native` is backed by this implementation).
The first noticeable difference from the browser WebGL API is the addition of an extension called `"RN"`, which you can get with `gl.getExtension("RN")`. It returns an object with a few functions:
- `endFrame()`: the mandatory call to get anything drawn on screen. It is the way to tell the current implementation that everything is finished for the current frame. (We might later introduce a better way.)
- `loadTexture(config)`: a way to load a `Texture` with a configuration object. For the config object format, see the Texture Config Formats section. This function returns a Promise of `{ texture, width, height }`, where `texture` is the actual `WebGLTexture` instance you can use in a `gl.bindTexture` call, and `width` and `height` are the texture dimensions.
- `unloadTexture(texture)`: a way to unload a `Texture`, given the texture object returned by a previous `loadTexture` call. This must be invoked when a texture is no longer required, so that it can be removed and memory leaks avoided. This is especially important when using a third-party library (such as three.js): not only do the objects created by that library need to be disposed, but the texture object itself also needs to be unloaded. For an example of how to safely remove all references to textures, see this memory leak issue discussion. A usage sketch follows this list.
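As an illustration of this lifecycle (not an official example), here is a minimal sketch that loads a texture, draws with it, and unloads it when the component goes away; the asset path is hypothetical:

```js
import React, { Component } from "react";
import { WebGLView } from "rn-webgl2";

class TexturedWebGL extends Component {
  onContextCreate = async (gl: WebGLRenderingContext) => {
    const rngl = gl.getExtension("RN");
    this.rngl = rngl;

    // loadTexture resolves with { texture, width, height }
    const { texture } = await rngl.loadTexture({
      image: require("./texture.png"), // hypothetical asset
    });
    this.texture = texture;

    gl.bindTexture(gl.TEXTURE_2D, texture);
    // ... set up shaders/buffers and issue draw calls here ...
    rngl.endFrame();
  };

  componentWillUnmount() {
    // unload the texture when it is no longer needed to avoid memory leaks
    if (this.rngl && this.texture) this.rngl.unloadTexture(this.texture);
  }

  render() {
    return (
      <WebGLView
        style={{ width: 100, height: 100 }}
        onContextCreate={this.onContextCreate}
      />
    );
  }
}
```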
Texture formats are provided in an extensible and loosely-coupled way by adding more "Loaders" to the project (as soon as they are linked, they get discovered by RNWebGL via the RN bridge). This library comes with one built-in loader: the Image Loader. More loaders can come from libraries like `react-native-webgl-camera` and `react-native-webgl-video`. Feel free to implement your own.
The Image Loader config format is `{ image }`, where `image` has the same format as the `source` prop of React Native's `<Image />` component.
There are also config options shared (by convention) across the loaders:

- `yflip` (boolean): vertically flips the texture when it is loaded. You likely always want to set this to `true` (the default is `false` because the flip has an extra cost). See the sketch below.
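For instance, a loader call requesting a vertically flipped image texture could look like this sketch (the asset path is hypothetical):

```js
const rngl = gl.getExtension("RN");
rngl
  .loadTexture({ image: require("./photo.png"), yflip: true }) // hypothetical asset
  .then(({ texture, width, height }) => {
    gl.bindTexture(gl.TEXTURE_2D, texture);
    // ... draw using the flipped texture ...
  });
```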
Once the component is mounted and the OpenGL ES context has been created, the `gl` object received through the `onContextCreate` prop becomes the interface to the OpenGL ES context, providing a WebGL API. It resembles a `WebGL2RenderingContext` from the WebGL 2 spec. However, some older Android devices may not support WebGL 2 features. To check whether the device supports WebGL 2, it is recommended to use `gl instanceof WebGL2RenderingContext`.
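For instance, inside `onContextCreate` you might branch on WebGL 2 availability like this (a sketch; it assumes the `WebGL2RenderingContext` constructor is exposed globally, as the recommended check implies):

```js
onContextCreate = (gl: WebGLRenderingContext) => {
  if (gl instanceof WebGL2RenderingContext) {
    // safe to use WebGL 2 features (e.g. vertex array objects, 3D textures)
  } else {
    // fall back to WebGL 1 level features on older Android devices
  }
};
```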
An additional method, `gl.endFrame()`, is present; it notifies the context that the current frame is ready to be presented. This is similar to a 'swap buffers' API call on other OpenGL platforms.
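A typical animation loop therefore ends every frame with `endFrame()`. A minimal sketch, using React Native's global `requestAnimationFrame` and the `"RN"` extension as in the gist above:

```js
onContextCreate = (gl: WebGLRenderingContext) => {
  const rngl = gl.getExtension("RN");
  let t = 0;

  const frame = () => {
    t += 1 / 60;
    // animate the clear color over time
    gl.clearColor(Math.abs(Math.sin(t)), 0, 0, 1);
    gl.clear(gl.COLOR_BUFFER_BIT);
    rngl.endFrame(); // present the frame, similar to a "swap buffers" call
    requestAnimationFrame(frame);
  };
  requestAnimationFrame(frame);
};
```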
The following `WebGL2RenderingContext` methods are currently unimplemented:

- `getFramebufferAttachmentParameter()`
- `getRenderbufferParameter()`
- `compressedTexImage2D()`
- `compressedTexSubImage2D()`
- `getTexParameter()`
- `getUniform()`
- `getVertexAttrib()`
- `getVertexAttribOffset()`
- `getBufferSubData()`
- `getInternalformatParameter()`
- `renderbufferStorageMultisample()`
- `compressedTexImage3D()`
- `compressedTexSubImage3D()`
- `fenceSync()`
- `isSync()`
- `deleteSync()`
- `clientWaitSync()`
- `waitSync()`
- `getSyncParameter()`
- `getActiveUniformBlockParameter()`
The `pixels` argument of `texImage2D()` must be `null`, an `ArrayBuffer` with pixel data, or an object of the form `{ localUri }` where `localUri` is the `file://` URI of an image in the device's file system. Thus an `Expo.Asset` object can be used once `.downloadAsync()` has been called on it (and has completed) to fetch the resource.
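As an illustration, uploading a downloaded asset as a texture could look like this sketch; the `expo` import path and the `Asset` API usage are assumptions following the description above:

```js
import { Asset } from "expo"; // assumption: Expo's Asset API, as described above

async function uploadAssetTexture(gl: WebGLRenderingContext, imageModule: number) {
  const asset = Asset.fromModule(imageModule);
  await asset.downloadAsync(); // ensures a file:// localUri is available

  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // pixels passed as an object of the form { localUri }
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, {
    localUri: asset.localUri,
  });
  return texture;
}
```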
This implementation is a standalone fork of Expo GLView (MIT License), available at https://github.com/expo/expo and https://github.com/expo/expo-sdk. Huge kudos to the Expo team, and especially @nikki93, for implementing it.