You are viewing the Interactive Live Streaming v3.x documentation. The latest version is Interactive Live Streaming 4.x.

Share the Screen

Introduction

Screen sharing enables the host of an interactive live streaming broadcast or video call to display what is on their screen to other users in the channel. Screen sharing is particularly useful for communicating information in the following scenarios:

  • During video conferencing, the speaker can share a local image, web page, or full presentation with other participants.
  • For online instruction, the teacher can share slides or notes with students.

Share a screen

Implementations for v3.7.0 and later

This section describes how to implement screen sharing using the Android SDK v3.7.0 and later.

Before you begin, ensure that you understand how to start a video call or start interactive video streaming. For details, see Start a Video Call or Start Interactive Video Streaming.

1. Integrate the AgoraScreenShareExtension

  1. Copy the AgoraScreenShareExtension.aar file of the SDK to the /app/libs/ directory.

  2. Add the following line to the dependencies node in the /app/build.gradle file to support importing files in the AAR format.


    implementation fileTree(dir: "libs", include: ["*.jar", "*.aar"])

2. Call the API

Call startScreenCapture to start screen sharing.
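
For reference, the following is a minimal sketch of the call sequence, assuming the ScreenCaptureParameters class that ships with v3.7.0 and later; the capture settings are illustrative values, and engine is your RtcEngine instance:

    // Hedged sketch: the parameter fields follow the ScreenCaptureParameters
    // class in v3.7.0 and later; adjust the values to your scenario.
    ScreenCaptureParameters parameters = new ScreenCaptureParameters();
    parameters.captureVideo = true;
    parameters.captureAudio = true;
    parameters.videoCaptureParameters.width = 1280;
    parameters.videoCaptureParameters.height = 720;
    parameters.videoCaptureParameters.framerate = 15;
    // Starts screen sharing
    engine.startScreenCapture(parameters);

    // Call stopScreenCapture() when sharing ends
    engine.stopScreenCapture();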

Sample project

Agora provides an open-source sample project on GitHub for your reference.

API reference

Some restrictions and cautions exist for the screen sharing feature, and charges can apply. Agora recommends that you read the API reference for startScreenCapture before calling the method.

Implementations earlier than v3.7.0

This section describes how to implement screen sharing using the Android SDK earlier than v3.7.0.

Before you begin, ensure that you understand how to start a video call or start interactive video streaming. For details, see Start a Video Call or Start Interactive Video Streaming.

The Agora SDK does not provide a method for screen sharing on Android. Therefore, you need to implement this function using the native screen-capture APIs provided by Android and the custom video-source APIs provided by Agora.

  1. Use android.media.projection and android.hardware.display.VirtualDisplay to get and pass the screen-capture data.
  2. Create an OpenGL ES environment. Create a SurfaceView object and pass the object to VirtualDisplay, which works as the recipient of the screen-capture data.
  3. You can get the screen-capture data from the callbacks of SurfaceView. Use either the Push mode or mediaIO mode to push the screen-capture data to the SDK. For details, see Custom Video Source and Renderer.

Data transfer

The following diagram shows how data is transferred during screen sharing on Android:

[Diagram: data transfer during screen sharing on Android]

Sample code

The code samples in this section use the MediaProjection and VirtualDisplay APIs provided by Android and have the following version requirements:

  • The Android version must be Lollipop or higher.
  • To use the MediaProjection APIs, the Android API level must be 21 or higher.
  • To use the VirtualDisplay APIs, the Android API level must be 19 or higher.

For detailed usage and considerations, see the Android documentation for MediaProjection and VirtualDisplay.
  1. Implement IVideoSource and IVideoFrameConsumer, and override the callbacks in IVideoSource.


    // Implements the IVideoSource interface
    public class ExternalVideoInputManager implements IVideoSource {
        ...
        // Gets the IVideoFrameConsumer object when initializing the video source
        @Override
        public boolean onInitialize(IVideoFrameConsumer consumer) {
            mConsumer = consumer;
            return true;
        }

        @Override
        public boolean onStart() {
            return true;
        }

        @Override
        public void onStop() {
        }

        // Sets IVideoFrameConsumer as null when IVideoFrameConsumer is released by the media engine
        @Override
        public void onDispose() {
            Log.e(TAG, "SwitchExternalVideo-onDispose");
            mConsumer = null;
        }

        @Override
        public int getBufferType() {
            return TEXTURE.intValue();
        }

        @Override
        public int getCaptureType() {
            return CAMERA;
        }

        @Override
        public int getContentHint() {
            return MediaIO.ContentHint.NONE.intValue();
        }
        ...
    }


    // The IVideoFrameConsumer object; the media engine passes it in through onInitialize
    private volatile IVideoFrameConsumer mConsumer;

  2. Set the custom video source before joining a channel.


    // Sets the input thread of the custom video source
    // In the sample project, we use classes from the open-source grafika project, which encapsulates the graphics architecture of Android. For details, see https://source.android.com/devices/graphics/architecture
    // For the detailed implementation of EglCore, GlUtil, EGLContext, and ProgramTextureOES, see https://github.com/google/grafika
    // The GLThreadContext class contains EglCore, EGLContext, and ProgramTextureOES
    private void prepare() {
        // Creates an OpenGL ES environment based on EglCore
        mEglCore = new EglCore();
        mEglSurface = mEglCore.createOffscreenSurface(1, 1);
        mEglCore.makeCurrent(mEglSurface);
        // Creates an EGL texture object based on GlUtil
        mTextureId = GlUtil.createTextureObject(GLES11Ext.GL_TEXTURE_EXTERNAL_OES);
        // Creates a SurfaceTexture object based on the EGL texture
        mSurfaceTexture = new SurfaceTexture(mTextureId);
        // Creates a Surface object based on the SurfaceTexture
        mSurface = new Surface(mSurfaceTexture);
        // Passes EglCore, the EGL context, and ProgramTextureOES to GLThreadContext as its members
        mThreadContext = new GLThreadContext();
        mThreadContext.eglCore = mEglCore;
        mThreadContext.context = mEglCore.getEGLContext();
        mThreadContext.program = new ProgramTextureOES();

        // Sets the custom video source
        ENGINE.setVideoSource(ExternalVideoInputManager.this);
    }

  3. Create an intent based on MediaProjection, and pass the intent to the startActivityForResult() method to start capturing screen data.


    private class VideoInputServiceConnection implements ServiceConnection {
        @Override
        public void onServiceConnected(ComponentName componentName, IBinder iBinder) {
            mService = (IExternalVideoInputService) iBinder;
            // Starts capturing screen data. The Android version must be Lollipop or higher.
            if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
                // Instantiates a MediaProjectionManager object
                MediaProjectionManager mpm = (MediaProjectionManager)
                        getContext().getSystemService(Context.MEDIA_PROJECTION_SERVICE);
                // Creates an intent
                Intent intent = mpm.createScreenCaptureIntent();
                // Starts screen capturing
                startActivityForResult(intent, PROJECTION_REQ_CODE);
            }
        }
        ...
    }

  4. Get the screen-capture data information from the activity result.


    // Gets the data information intent from the activity result
    @Override
    public void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == PROJECTION_REQ_CODE && resultCode == RESULT_OK) {
            ...
            try {
                // Sets the custom video source as the screen-capture data
                mService.setExternalVideoInput(ExternalVideoInputManager.TYPE_SCREEN_SHARE, data);
            } catch (RemoteException e) {
                e.printStackTrace();
            }
        }
    }

    The implementation of setExternalVideoInput(int type, Intent intent) is as follows:


    // Gets the parameters of the screen-capture data from the intent
    boolean setExternalVideoInput(int type, Intent intent) {

        if (mCurInputType == type && mCurVideoInput != null
                && mCurVideoInput.isRunning()) {
            return false;
        }

        IExternalVideoInput input;
        switch (type) {
            ...
            case TYPE_SCREEN_SHARE:
                // Gets the screen-capture data from the intent of MediaProjection
                int width = intent.getIntExtra(FLAG_SCREEN_WIDTH, DEFAULT_SCREEN_WIDTH);
                int height = intent.getIntExtra(FLAG_SCREEN_HEIGHT, DEFAULT_SCREEN_HEIGHT);
                int dpi = intent.getIntExtra(FLAG_SCREEN_DPI, DEFAULT_SCREEN_DPI);
                int fps = intent.getIntExtra(FLAG_FRAME_RATE, DEFAULT_FRAME_RATE);
                Log.i(TAG, "ScreenShare:" + width + "|" + height + "|" + dpi + "|" + fps);
                // Instantiates a ScreenShareInput class using the screen-capture data
                input = new ScreenShareInput(context, width, height, dpi, fps, intent);
                break;
            default:
                input = null;
        }
        // Sets the captured video data as the ScreenShareInput object, and creates an input thread for the external video data
        setExternalVideoInput(input);
        mCurInputType = type;
        return true;
    }

  5. During the initialization of the input thread for the external video data, create a VirtualDisplay object with MediaProjection, and render the VirtualDisplay on the SurfaceView.


    public void onVideoInitialized(Surface target) {
        MediaProjectionManager pm = (MediaProjectionManager)
                mContext.getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        mMediaProjection = pm.getMediaProjection(Activity.RESULT_OK, mIntent);

        if (mMediaProjection == null) {
            Log.e(TAG, "media projection start failed");
            return;
        }
        // Creates a VirtualDisplay with MediaProjection, and renders the VirtualDisplay on the SurfaceView
        mVirtualDisplay = mMediaProjection.createVirtualDisplay(
                VIRTUAL_DISPLAY_NAME, mSurfaceWidth, mSurfaceHeight, mScreenDpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC, target,
                null, null);
    }
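
The snippets above do not show teardown. The following is a hedged sketch of releasing the capture objects when the input thread stops; onVideoStopped is a hypothetical counterpart to onVideoInitialized, and the field names follow the snippet above:

    public void onVideoStopped() {
        // Releases the VirtualDisplay and stops the MediaProjection session
        if (mVirtualDisplay != null) {
            mVirtualDisplay.release();
            mVirtualDisplay = null;
        }
        if (mMediaProjection != null) {
            mMediaProjection.stop();
            mMediaProjection = null;
        }
    }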

  6. Use SurfaceView as the custom video source. After the user joins the channel, the custom video module gets the screen-capture data using consumeTextureFrame in ExternalVideoInputThread and passes the data to the SDK.


    public void run() {
        ...
        // Calls updateTexImage() to update the data to the OpenGL ES texture object
        // Calls getTransformMatrix() to get the texture transform matrix
        try {
            mSurfaceTexture.updateTexImage();
            mSurfaceTexture.getTransformMatrix(mTransform);
        } catch (Exception e) {
            e.printStackTrace();
        }

        // Gets the screen-capture data from onFrameAvailable, which is overridden in ScreenShareInput and receives information such as the texture ID and the transform matrix
        // There is no need to render the screen-capture data on the local view
        if (mCurVideoInput != null) {
            mCurVideoInput.onFrameAvailable(mThreadContext, mTextureId, mTransform);
        }

        mEglCore.makeCurrent(mEglSurface);
        GLES20.glViewport(0, 0, mVideoWidth, mVideoHeight);

        if (mConsumer != null) {
            Log.e(TAG, "SDK encoding->width:" + mVideoWidth + ",height:" + mVideoHeight);
            // Calls consumeTextureFrame to pass the video data to the SDK
            mConsumer.consumeTextureFrame(mTextureId,
                    TEXTURE_OES.intValue(),
                    mVideoWidth, mVideoHeight, 0,
                    System.currentTimeMillis(), mTransform);
        }

        // Waits for the next frame
        waitForNextFrame();
        ...
    }

Sample project

Agora provides an open-source sample project on GitHub to show how to switch the video between screen sharing and the camera.

Send video streams from screen sharing and local camera

Currently, the Agora Video SDK for Android supports creating only one RtcEngine instance per app. To send video streams from both screen sharing and the local camera at the same time, you need to use two separate processes.

Implementation

[Diagram: multi-process implementation for screen sharing and local camera]

You need to create separate processes for screen sharing and for the video captured by the local camera. The two processes send video data to the SDK as follows:

  • Local-camera process: Implemented with joinChannel. This process is often the main process and communicates with the screen sharing process via AIDL (Android Interface Definition Language). See the Android documentation to learn more about AIDL.
  • Screen sharing process: Implemented with MediaProjection, VirtualDisplay, and custom video capture. The screen sharing process creates an RtcEngine object and uses the object to create and join a channel for screen sharing.

The users of both processes must join the same channel. After joining a channel, a user subscribes to the audio and video streams of all other users in the channel by default, incurring all associated usage costs. Because the screen sharing process only publishes the screen sharing stream, you can call muteAllRemoteAudioStreams(true) and muteAllRemoteVideoStreams(true) to mute remote audio and video streams.

Sample code

The following sample code uses two processes to send both the screen-sharing video and the locally captured video, with AIDL used to communicate between the processes.

  1. Configure android:process in AndroidManifest.xml for the components.


    <application>
        <activity
            android:name=".impl.ScreenCapture$ScreenCaptureAssistantActivity"
            android:process=":screensharingsvc"
            android:screenOrientation="fullUser"
            android:theme="@android:style/Theme.Translucent" />
        <service
            android:name=".impl.ScreenSharingService"
            android:process=":screensharingsvc">
            <intent-filter>
                <action android:name="android.intent.action.screenshare" />
            </intent-filter>
        </service>
    </application>

  2. Create an AIDL interface, which includes methods to communicate between processes.


    // Includes methods to manage the screen sharing process
    // IScreenSharing.aidl
    package io.agora.rtc.ss.aidl;

    import io.agora.rtc.ss.aidl.INotification;

    interface IScreenSharing {
        void registerCallback(INotification callback);
        void unregisterCallback(INotification callback);
        void startShare();
        void stopShare();
        void renewToken(String token);
    }


    // Includes callbacks to receive notifications from the screen sharing process
    // INotification.aidl
    package io.agora.rtc.ss.aidl;

    interface INotification {
        void onError(int error);
        void onTokenWillExpire();
    }
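
In the screen sharing service, onBind (shown in the next step) returns a binder that implements IScreenSharing. The following is a hedged sketch of such a binder, assuming a RemoteCallbackList for managing registered INotification callbacks; except for the AIDL method names, the identifiers are illustrative rather than the sample project's exact code:

    // Manages the callbacks registered by client processes
    private final RemoteCallbackList<INotification> mCallbacks = new RemoteCallbackList<>();

    private final IScreenSharing.Stub mBinder = new IScreenSharing.Stub() {
        @Override
        public void registerCallback(INotification callback) {
            if (callback != null) mCallbacks.register(callback);
        }

        @Override
        public void unregisterCallback(INotification callback) {
            if (callback != null) mCallbacks.unregister(callback);
        }

        @Override
        public void startShare() {
            // Starts the MediaProjection/VirtualDisplay capture
        }

        @Override
        public void stopShare() {
            // Stops capturing and releases the capture objects
        }

        @Override
        public void renewToken(String token) {
            // Passes the new token to the RtcEngine of this process
        }
    };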

  3. Implement the screen sharing process. Screen sharing is implemented with MediaProjection, VirtualDisplay, and custom video source. The screen sharing process creates an RtcEngine object and uses the object to create and join a channel for screen sharing.


    // Define the ScreenSharingClient object
    public class ScreenSharingClient {
        private static final String TAG = ScreenSharingClient.class.getSimpleName();
        private static IScreenSharing mScreenShareSvc;
        private IStateListener mStateListener;
        private static volatile ScreenSharingClient mInstance;

        public static ScreenSharingClient getInstance() {
            if (mInstance == null) {
                synchronized (ScreenSharingClient.class) {
                    if (mInstance == null) {
                        mInstance = new ScreenSharingClient();
                    }
                }
            }
            return mInstance;
        }

        // Start screen sharing
        public void start(Context context, String appId, String token, String channelName, int uid, VideoEncoderConfiguration vec) {
            if (mScreenShareSvc == null) {
                Intent intent = new Intent(context, ScreenSharingService.class);
                intent.putExtra(Constant.APP_ID, appId);
                intent.putExtra(Constant.ACCESS_TOKEN, token);
                intent.putExtra(Constant.CHANNEL_NAME, channelName);
                intent.putExtra(Constant.UID, uid);
                intent.putExtra(Constant.WIDTH, vec.dimensions.width);
                intent.putExtra(Constant.HEIGHT, vec.dimensions.height);
                intent.putExtra(Constant.FRAME_RATE, vec.frameRate);
                intent.putExtra(Constant.BITRATE, vec.bitrate);
                intent.putExtra(Constant.ORIENTATION_MODE, vec.orientationMode.getValue());
                context.bindService(intent, mScreenShareConn, Context.BIND_AUTO_CREATE);
            } else {
                try {
                    mScreenShareSvc.startShare();
                } catch (RemoteException e) {
                    e.printStackTrace();
                    Log.e(TAG, Log.getStackTraceString(e));
                }
            }
        }

        // Stop screen sharing
        public void stop(Context context) {
            if (mScreenShareSvc != null) {
                try {
                    mScreenShareSvc.stopShare();
                    mScreenShareSvc.unregisterCallback(mNotification);
                } catch (RemoteException e) {
                    e.printStackTrace();
                    Log.e(TAG, Log.getStackTraceString(e));
                } finally {
                    mScreenShareSvc = null;
                }
            }
            context.unbindService(mScreenShareConn);
        }
        ...
    }

    After binding the screen sharing service, the screen sharing process creates an RtcEngine object and joins the channel for screen sharing.


    @Override
    public IBinder onBind(Intent intent) {
        // Creates an RtcEngine object
        setUpEngine(intent);
        // Sets the video encoding configuration
        setUpVideoConfig(intent);
        // Joins the channel
        joinChannel(intent);
        return mBinder;
    }

    When creating the RtcEngine object in the screen sharing process, perform the following configurations:


    // Mutes all remote audio streams
    mRtcEngine.muteAllRemoteAudioStreams(true);
    // Mutes all remote video streams
    mRtcEngine.muteAllRemoteVideoStreams(true);
    // Disables the audio module
    mRtcEngine.disableAudio();
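
For reference, the following is a hedged sketch of what the joinChannel(intent) call above might do; it assumes the Constant extras set in ScreenSharingClient.start() and the v3.x joinChannel signature, and the exact sample-project code may differ:

    private void joinChannel(Intent intent) {
        // Reads the channel parameters passed from the main process
        String token = intent.getStringExtra(Constant.ACCESS_TOKEN);
        String channelName = intent.getStringExtra(Constant.CHANNEL_NAME);
        int uid = intent.getIntExtra(Constant.UID, 0);
        // Joins the channel with the dedicated screen-sharing uid
        mRtcEngine.joinChannel(token, channelName, "", uid);
    }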

  4. Implement the local-camera process and the code logic to start the screen sharing process.


    public class MultiProcess extends BaseFragment implements View.OnClickListener
    {
        private static final String TAG = MultiProcess.class.getSimpleName();
        // Define the uid for the screen sharing process
        private static final Integer SCREEN_SHARE_UID = 10000;
        ...
        @Override
        public void onActivityCreated(@Nullable Bundle savedInstanceState)
        {
            super.onActivityCreated(savedInstanceState);
            Context context = getContext();
            if (context == null)
            {
                return;
            }
            try
            {
                // Create an RtcEngine instance
                engine = RtcEngine.create(context.getApplicationContext(), getString(R.string.agora_app_id), iRtcEngineEventHandler);
                // Initialize the screen sharing process
                mSSClient = ScreenSharingClient.getInstance();
                mSSClient.setListener(mListener);
            }
            catch (Exception e)
            {
                e.printStackTrace();
                getActivity().onBackPressed();
            }
        }
        ...
        // Run the screen sharing process and send information such as App ID and channel ID to the screen sharing process
        else if (v.getId() == R.id.screenShare){
            String channelId = et_channel.getText().toString();
            if (!isSharing) {
                mSSClient.start(getContext(), getResources().getString(R.string.agora_app_id), null,
                        channelId, SCREEN_SHARE_UID, new VideoEncoderConfiguration(
                                VD_640x360,
                                FRAME_RATE_FPS_15,
                                STANDARD_BITRATE,
                                ORIENTATION_MODE_ADAPTIVE
                        ));
                screenShare.setText(getResources().getString(R.string.stop));
                isSharing = true;
            } else {
                mSSClient.stop(getContext());
                screenShare.setText(getResources().getString(R.string.screenshare));
                isSharing = false;
            }
        }
        ...
        // Create a view for local preview
        SurfaceView surfaceView = RtcEngine.CreateRendererView(context);
        if (fl_local.getChildCount() > 0)
        {
            fl_local.removeAllViews();
        }
        ...
        // Join channel
        int res = engine.joinChannel(accessToken, channelId, "Extra Optional Data", 0);
        if (res != 0)
        {
            showAlert(RtcEngine.getErrorDescription(Math.abs(res)));
            return;
        }
        join.setEnabled(false);
    }

Sample project

Agora provides an open-source sample project on GitHub to show how to send video from both screen sharing and the camera by using dual processes.
