Agora Component for Customization (Android)

Customize the Video Source with the Agora Component

1. AgoraBufferedCamera2 Class

The AgoraBufferedCamera2.java class shows how to customize a video source that delivers data as ByteBuffer or ByteArray.

The constructor of AgoraBufferedCamera2 requires a Context and accepts an optional CaptureParameter that defines the camera parameters and the buffer type. By default, the resolution is 640 x 480, the video format is YUV420P, and the data type is ByteBuffer.


AgoraBufferedCamera2 source = new AgoraBufferedCamera2(this);
source.useFrontCamera(true);
rtcEngine.setVideoSource(source);

When using a ByteArray or ByteBuffer video source, the SDK accepts video frames in the YUV420P, NV21, or RGBA pixel format.
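
If the defaults do not fit, a CaptureParameter can be passed to the constructor. The sketch below is only an assumption about its shape (the field names shown are illustrative); check AgoraBufferedCamera2.java in the component for the actual definition.

// Hypothetical CaptureParameter usage; verify the real fields in AgoraBufferedCamera2.java.
CaptureParameter parameter = new CaptureParameter();
parameter.width = 1280;                                // capture width
parameter.height = 720;                                // capture height
parameter.pixelFormat = MediaIO.PixelFormat.NV21;      // one of YUV420P, NV21, RGBA
parameter.bufferType = MediaIO.BufferType.BYTE_ARRAY;  // ByteArray instead of the default ByteBuffer
AgoraBufferedCamera2 source = new AgoraBufferedCamera2(this, parameter);
rtcEngine.setVideoSource(source);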

2. AgoraTextureCamera Class

The AgoraTextureCamera.java class shows how to customize a textured video source. AgoraTextureCamera can be used directly as the video source.


IVideoSource source = new AgoraTextureCamera(this, 640, 480);
rtcEngine.setVideoSource(source);

3. Helper Class and Component

Compared with YUV or RGB data, a textured video source is more complex to transmit because of the GL environment and thread requirements involved.

SurfaceTextureHelper Class

The SurfaceTextureHelper class is a helper class provided by the Agora SDK. It lets users work with SurfaceTexture without having to build a GL environment, create textures, or handle cross-thread interaction themselves.

Major functions of SurfaceTextureHelper:

  1. Creates a texture object and builds a SurfaceTexture with it.
  2. Notifies developers of a texture update once SurfaceTexture has captured a video frame.

Create a SurfaceTextureHelper


public static SurfaceTextureHelper create(final String threadName, final EglBase.Context sharedContext);

Call the create method to create a SurfaceTextureHelper. This method builds the GL thread, together with the texture and the SurfaceTexture.

Get the SurfaceTexture


public EglBase.Context getEglContext();
public Handler getHandler();
public SurfaceTexture getSurfaceTexture();

The getSurfaceTexture method gets the created SurfaceTexture. If the GL environment or its thread is required, call the getEglContext and getHandler methods.

Monitor the SurfaceTexture


public interface OnTextureFrameAvailableListener {
    abstract void onTextureFrameAvailable(int oesTextureId, float[] transformMatrix, long timestampNs);
}

public void startListening(final OnTextureFrameAvailableListener listener);
public void stopListening();

Implement this listener to be notified of new video frames from SurfaceTexture, and start or stop monitoring by calling the startListening and stopListening methods.

Release SurfaceTexture


void dispose();

Call this method to release relevant resources when SurfaceTexture is no longer needed.
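
Putting these methods together, a minimal usage sketch might look as follows. The thread name and the listener body are illustrative, and the listener is assumed to be nested in SurfaceTextureHelper, matching the signatures above:

// Create the helper on its own GL thread; pass a shared EGL context if one exists.
SurfaceTextureHelper helper = SurfaceTextureHelper.create("AgoraTextureThread", null);

// Hand the SurfaceTexture to a producer (camera, virtual display, ...) and monitor frames.
SurfaceTexture surfaceTexture = helper.getSurfaceTexture();
helper.startListening(new SurfaceTextureHelper.OnTextureFrameAvailableListener() {
    @Override
    public void onTextureFrameAvailable(int oesTextureId, float[] transformMatrix, long timestampNs) {
        // A new frame is available as an OES texture; forward it to a consumer here.
    }
});

// When the texture is no longer needed, stop monitoring and release the GL resources.
helper.stopListening();
helper.dispose();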

TextureSource Class

The TextureSource class combines SurfaceTextureHelper and the IVideoFrameConsumer methods to implement a customized textured video source. SurfaceTextureHelper creates the SurfaceTexture, which captures video frames and converts them into textures to be sent to RtcEngine. With TextureSource, developers only need to take care of the following:

  • Video source functionality and compatibility.
  • Capturing the SurfaceTexture video frame.
  • Sending the updated texture to RtcEngine.
  1. Implement the four TextureSource callbacks for video source functionality and compatibility.


    abstract protected boolean onCapturerOpened();
    abstract protected boolean onCapturerStarted();
    abstract protected void onCapturerStopped();
    abstract protected void onCapturerClosed();

  2. Use SurfaceTexture to capture the video frame.


    public SurfaceTexture getSurfaceTexture();

    See SurfaceTexture for the methods used to capture video frames.

  3. Once the video frame is captured and updated as a texture, call onTextureFrameAvailable to send the video frame to RtcEngine.


    public void onTextureFrameAvailable(int oesTextureId, float[] transformMatrix, long timestampNs);

  4. Release the resources when the video source is no longer needed.


    public void release();

4. Example of Using External Screen Recording as the Video Source with TextureSource

The following sequence shows an example of how to use external screen recording as the video source.

Step 1. Implement the following callbacks:


public class ScreenRecordSource extends TextureSource {
    private Context mContext;
    private boolean mIsStart;
    private int mDpi; // stored for createVirtualDisplay()
    private VirtualDisplay mVirtualDisplay;
    private MediaProjection mMediaProjection;

    public ScreenRecordSource(Context context, int width, int height, int dpi, MediaProjection mediaProjection) {
        super(null, width, height);
        mContext = context;
        mDpi = dpi;
        mMediaProjection = mediaProjection;
    }

    @Override
    protected boolean onCapturerOpened() {
        createVirtualDisplay();
        return true;
    }

    @Override
    protected boolean onCapturerStarted() {
        return mIsStart = true;
    }

    @Override
    protected void onCapturerStopped() {
        mIsStart = false;
    }

    @Override
    protected void onCapturerClosed() {
        releaseVirtualDisplay();
    }
}

Step 2. Use SurfaceTexture to create a virtual display for capturing the screen data.


private void createVirtualDisplay() {
    Surface inputSurface = new Surface(getSurfaceTexture());
    if (mVirtualDisplay == null) {
        mVirtualDisplay = mMediaProjection.createVirtualDisplay("MainScreen", mWidth, mHeight, mDpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR, inputSurface, null, null);
    }
}

private void releaseVirtualDisplay() {
    if (mVirtualDisplay != null) {
        mVirtualDisplay.release();
    }
    mVirtualDisplay = null;
}

Step 3. Reimplement the callback that delivers the captured video data.


@Override
public void onTextureFrameAvailable(int oesTextureId, float[] transformMatrix, long timestampNs) {
    super.onTextureFrameAvailable(oesTextureId, transformMatrix, timestampNs);
    if (mIsStart && mConsumer != null && mConsumer.get() != null) {
        mConsumer.get().consumeTextureFrame(oesTextureId, MediaIO.PixelFormat.TEXTURE_OES.intValue(), mWidth, mHeight,
                0, System.currentTimeMillis(), transformMatrix);
    }
}

Make sure to call the parent class method super.onTextureFrameAvailable(oesTextureId, transformMatrix, timestampNs).

Step 4. Release the resources when the external video source is no longer needed.


public void sourceRelease() {
    releaseProjection(); // stop the MediaProjection session (definition not shown in this snippet)
    release();           // release the TextureSource resources (GL thread and SurfaceTexture)
}
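
To put the source to work, pass it to RtcEngine as the video source. The sketch below assumes mMediaProjection was already obtained through the standard MediaProjectionManager permission flow (createScreenCaptureIntent and onActivityResult), and that it runs inside an Activity:

// Assumes mMediaProjection was granted via MediaProjectionManager beforehand.
DisplayMetrics metrics = getResources().getDisplayMetrics();
ScreenRecordSource source = new ScreenRecordSource(this, metrics.widthPixels,
        metrics.heightPixels, metrics.densityDpi, mMediaProjection);
rtcEngine.setVideoSource(source);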

Customize the Video Sink with the Agora Component

The Agora SDK uses its default renderer to render the local and remote video. The IVideoSink interface can be used for more advanced functions, such as:

  • To render the local or remote video frame somewhere other than directly on the view component.
  • To use a general SurfaceView object or a customized view component.
  • To render images in specific areas, such as in gaming.

AgoraSurfaceView Class

AgoraSurfaceView inherits SurfaceView and implements the IVideoSink interface to render video frames in YUV420P, RGB, and Texture (2D/OES).


AgoraSurfaceView render = new AgoraSurfaceView(this);
render.init(MediaIO.BufferType.BYTE_ARRAY, MediaIO.PixelFormat.I420, null);
render.setZOrderOnTop(true);
rtcEngine.setLocalVideoRenderer(render);

AgoraTextureView Class

AgoraTextureView inherits TextureView and implements the IVideoSink interface to render video frames in YUV420P, RGB, and Texture (2D/OES).

The following code shows how to use AgoraTextureView with an external video source, where TextureSource creates the GL environment:


AgoraTextureCamera source = new AgoraTextureCamera(this, 640, 480);
AgoraTextureView render = (AgoraTextureView) findViewById(R.id.agora_texture_view);
render.init(source.getEglContext());
render.setBufferType(MediaIO.BufferType.TEXTURE);
render.setPixelFormat(MediaIO.PixelFormat.TEXTURE_OES);
rtcEngine.setVideoSource(source);
rtcEngine.setLocalVideoRenderer(render);

Helper Class and Component

BaseVideoRenderer Class

Major functions of the BaseVideoRenderer class:

  • Supports rendering various formats: I420, RGBA, and TEXTURE_2D/OES.
  • Supports various rendering targets: SurfaceView, TextureView, Surface, and SurfaceTexture.

Follow these steps to use the BaseVideoRenderer class; a minimal sketch follows the list:

  1. Create a customized renderer class to implement the IVideoSink interface and embed the BaseVideoRenderer object.

  2. Specify the type and format of the video frame by calling the setBufferType and setPixelFormat methods of the embedded BaseVideoRenderer object.

  3. Share the EGL context with RtcEngine, since BaseVideoRenderer uses OpenGL as its renderer and has already created the EGLContext.

  4. Set the render target by calling the setRenderView or setRenderSurface method of the embedded BaseVideoRenderer object.

  5. Control the renderer by implementing the onInitialize, onStart, onStop, and onDispose methods.

  6. Implement the IVideoFrameConsumer methods, and pass each received video frame to the embedded BaseVideoRenderer object in the corresponding format so that it is rendered on the target.
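
As a sketch of steps 1 to 6, a customized renderer might look like the following. This is not a drop-in implementation: the BaseVideoRenderer constructor and the calls marked as assumed are illustrative, and the IVideoSink signatures should be verified against the SDK:

// Sketch only: names marked "assumed" are illustrative, not confirmed API.
public class MyVideoRenderer implements IVideoSink {
    // Step 1: embed a BaseVideoRenderer object in the customized renderer.
    private final BaseVideoRenderer mRenderer = new BaseVideoRenderer("MyVideoRenderer"); // constructor assumed

    public MyVideoRenderer(SurfaceView view) {
        // Step 2: specify the type and format of the video frame.
        mRenderer.setBufferType(MediaIO.BufferType.BYTE_BUFFER);
        mRenderer.setPixelFormat(MediaIO.PixelFormat.I420);
        // Step 4: set the render target.
        mRenderer.setRenderView(view);
    }

    // Step 3: share the EGL context created by BaseVideoRenderer with RtcEngine.
    @Override
    public long getEGLContextHandle() {
        return mRenderer.getEGLContextHandle(); // accessor name assumed
    }

    @Override
    public int getBufferType() {
        return MediaIO.BufferType.BYTE_BUFFER.intValue();
    }

    @Override
    public int getPixelFormat() {
        return MediaIO.PixelFormat.I420.intValue();
    }

    // Step 5: control the renderer through the lifecycle callbacks.
    @Override
    public boolean onInitialize() { return true; }

    @Override
    public boolean onStart() { return true; }

    @Override
    public void onStop() { }

    @Override
    public void onDispose() { }

    // Step 6: consume each received video frame in the corresponding format.
    @Override
    public void consumeByteBufferFrame(ByteBuffer buffer, int format, int width,
                                       int height, int rotation, long timestamp) {
        mRenderer.consume(buffer, format, width, height, rotation, timestamp); // method name assumed
    }

    @Override
    public void consumeByteArrayFrame(byte[] data, int format, int width,
                                      int height, int rotation, long timestamp) { }

    @Override
    public void consumeTextureFrame(int textureId, int format, int width, int height,
                                    int rotation, long timestamp, float[] matrix) { }
}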
