Call quality best practice

Customer satisfaction with your Video Calling app depends on the quality of the video and audio it provides. The quality of audiovisual communication through your app is affected by the following factors:

  • Bandwidth of network connection: Bandwidth is the volume of information that an Internet connection can handle per unit of time. When the available bandwidth is not sufficient to transmit the amount of data necessary to provide the desired video quality, your users see jerky or frozen video along with audio that cuts in and out.

  • Stability of network connection: Network connections are often unstable with the network quality going up and down. Users get temporarily disconnected and come back online after an interruption. These issues lead to a poor audiovisual experience for your users unless your app is configured to respond to these situations and take remedial actions.

  • Hardware quality: The camera and microphone used to capture video and audio must be of sufficiently good quality. If the user's hardware does not capture the audiovisual information in suitably high definition, it limits the quality of audio and video that is available to the remote user.

  • Video and audio settings: The sharpness, smoothness, and overall quality of the video is directly linked to the frame rate, bitrate and other video settings. Similarly, the audio quality depends on the sample rate, bitrate, number of channels and other audio parameters. If you do not choose proper settings, the audio and video transmitted are of poor quality. On the other hand, if the settings are too demanding, the available bandwidth quickly gets choked, leading to suboptimal experience for your users.

  • Echo: Echo is produced when your audio signal is played by a remote user through a speakerphone or an external device. This audio is captured by the remote user's microphone and sent back to you. Echo negatively affects audio quality, making speech difficult to understand.

  • Multiple users in a channel: When multiple users engage in real-time audio and video communication in a channel, the available bandwidth is quickly used up due to several incoming audio and video streams. The device performance also deteriorates due to the excessive workload required to decode and render multiple video streams.

This page shows you how to use Video SDK features to account for these factors and ensure optimal audio and video quality in your app.

Understand the tech

Video SDK provides the following features to deal with channel quality issues:

  • Network probe test: The network probe test checks the last-mile network quality before you join a channel. The method returns network quality statistics including round-trip latency, packet loss rate, and network bandwidth.

  • Echo test: The echo test captures audio through the microphone on the user’s device, and sends it to Agora SD-RTN™. After a short delay, Agora SD-RTN™ sends the audio back to the sender to be played. The quality of the returned audio enables a user to judge if their hardware and network connection are adequate. Agora recommends that an echo test be performed before a network probe test.

  • Audio profiles: Delivering the best quality audio to your users requires choosing audio settings customized for your particular application. In Video SDK you can choose from pre-configured audio profiles and audio scenarios to optimize audio settings for a wide range of applications.

    • An audio profile sets the audio sample rate, bitrate, encoding scheme, and the number of channels for your audio. Video SDK offers several preset audio profiles to choose from. To pick the most suitable audio profile for your application, refer to the List of audio profiles.
    • An audio scenario specifies the audio performance in terms of volume, audio quality, and echo cancellation. Based on the nature of your application, you can pick the most suitable option from the List of audio scenarios.
  • Video profiles: In real-time engagement scenarios, user experience is closely tied to the sharpness, smoothness, and overall quality of the video. In Video SDK you can set the video dimensions, framerate, bitrate, orientation mode, and mirror mode by specifying a video profile. You can also set the degradation preference to specify how video quality is degraded under suboptimal network conditions. To find the suitable bitrate for a given combination of dimensions and framerate, refer to the Video profile table.

  • In-call quality statistics: Video SDK provides several callbacks and methods to monitor channel quality in real time. These callbacks deliver vital statistics that you use to evaluate communication quality and take remedial action. Video SDK provides you the following statistics:

    • Network quality: The uplink and downlink network quality in terms of the transmission bitrate, packet loss rate, average Round-Trip Time, and jitter in your network.

    • Call quality: Information on the current user session and the resources being used by the channel in terms of the number of users in a channel, packet loss rate, CPU usage and call duration. Use these statistics to troubleshoot call quality issues.

    • Local audio quality: Local audio measurements such as audio channels, sample rate, sending bitrate, and packet loss rate in the audio stream.

    • Remote audio quality: These statistics provide information such as the number of channels, received bitrate, jitter in the audio stream, audio loss rate, and packet loss rate.

    • Local video quality: Local video quality statistics such as packet loss rate, frame rate, encoded frame width, and sent bitrate.

    • Remote video quality: These statistics include information about the width and height of video frames, packet loss rate, receiving stream type, and bitrate in the reported interval.

    • Video and Audio states: Agora SD-RTN™ reports the new state, and the reason for state change, whenever the state of an audio or video stream changes.

  • Dual stream mode: In dual-stream mode, Video SDK transmits a high-quality and a low-quality video stream from the sender. The high-quality stream has a higher resolution and bitrate than the low-quality video stream. Remote users subscribe to the low-quality stream to improve communication continuity, as it reduces bandwidth consumption. Subscribers should also choose the low-quality video stream when network conditions are unreliable, or when multiple users publish streams in a channel.

  • Video stream fallback: When network conditions deteriorate, Video SDK automatically switches the video stream from high-quality to low-quality, or disables video to ensure audio delivery. Agora SD-RTN™ continues to monitor the network quality after fallback, and restores the video stream when network conditions allow it. To improve communication quality under extremely poor network conditions, implement a fallback option in your app.

  • Video for multiple users: When multiple users join a channel, several incoming high-quality video streams negatively impact network and device performance. In such cases, you can manage the excess load by playing high-quality video from the user who has focus, and low-quality streams from all other users. To implement this feature, all users in the channel must enable dual-stream mode.

  • Echo cancellation when playing audio files: Video SDK offers audio mixing functionality to play media in a channel. You can mix a local or online audio file with the audio captured through the microphone, or completely replace the microphone audio. Audio mixing takes advantage of the echo cancellation features of Video SDK to reduce echo in a channel. Refer to Audio and voice effects to learn more about audio mixing in Video SDK.

  • Connection state monitoring: The connection state between an app and Agora SD-RTN™ changes when the app joins or leaves a channel, or goes offline due to network or authentication issues. Video SDK provides connection state monitoring to detect when and why a network connection is interrupted. When the connection state changes, Agora SD-RTN™ sends a callback to notify the app. Video SDK then automatically tries to reconnect to the server to restore the connection.

  • Log files: Video SDK provides configuration options that you use to customize the location, content and size of log files containing key data of Video SDK operation. When you set up logging, Video SDK writes information messages, warnings, and errors regarding activities such as initialization, configuration, connection and disconnection to log files. Log files are useful in detecting and resolving channel quality issues.
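The quality values reported by these statistics callbacks arrive as integer codes. As a minimal sketch (assuming the standard Agora quality codes, where 0 is unknown, 1 is excellent, and 6 is down), you might map them to display labels for a network status indicator:

```kotlin
// Maps the integer quality codes reported by callbacks such as
// onNetworkQuality and onLastmileQuality to display strings.
// The numeric values assume Agora's QUALITY_* constants (0 = unknown ... 6 = down).
fun networkQualityLabel(quality: Int): String = when (quality) {
    1 -> "Excellent"
    2 -> "Good"
    3 -> "Poor"
    4 -> "Bad"
    5 -> "Very bad"
    6 -> "Down"
    else -> "Unknown"
}
```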

The following figure shows the workflow you need to implement to ensure channel quality in your app:

Ensure Channel Quality

Prerequisites

  • Android Studio 4.1 or higher.
  • Android SDK API Level 24 or higher.
  • A mobile device that runs Android 4.1 or higher.
  • An Agora account and project.

  • A computer with Internet access.

    Ensure that no firewall is blocking your network communication.

Implement best practice to optimize call quality

This section shows you how to integrate call quality optimization features of Video SDK into your app, step by step.

  1. Import the required Agora libraries


    import io.agora.rtc2.*
    import io.agora.rtc2.video.VideoCanvas
    import io.agora.rtc2.internal.LastmileProbeConfig
    import io.agora.rtc2.video.VideoEncoderConfiguration
    import io.agora.rtc2.IRtcEngineEventHandler.RemoteVideoStats

  2. Use a probe test to check network health


    fun startProbeTest() {
        if (agoraEngine == null) setupAgoraEngine()
        // Configure a LastmileProbeConfig instance
        val config = LastmileProbeConfig()
        // Probe the uplink network quality
        config.probeUplink = true
        // Probe the downlink network quality
        config.probeDownlink = true
        // The expected uplink bitrate (bps). The value range is [100000, 5000000]
        config.expectedUplinkBitrate = 100000
        // The expected downlink bitrate (bps). The value range is [100000, 5000000]
        config.expectedDownlinkBitrate = 100000
        agoraEngine!!.startLastmileProbeTest(config)
        sendMessage("Running the last mile probe test ...")
        // Test results are reported through the onLastmileProbeResult callback
    }

  3. Implement best practice for app initialization

    Use the following Video SDK features when you set up an instance of the Agora Engine:

    • Enable and configure logging: For optimization and debugging.
    • Enable dual stream mode: Required for multi-user scenarios.
    • Set an audio profile and audio scenario: Setting an audio profile is optional and only required if you have special requirements such as streaming music.
    • Set the video profile: Setting a video profile is also optional. It is useful when you want to change one or more of mirrorMode, frameRate, bitrate, dimensions, orientationMode, degradationPrefer or compressionPrefer from the default setting to custom values. For more information, see video profile table.
    • Start a network probe test: A quick test at startup to gauge network quality.

    override fun setupAgoraEngine(): Boolean {
        try {
            val config = RtcEngineConfig()
            config.mContext = mContext
            config.mAppId = appId
            config.mEventHandler = iRtcEngineEventHandler

            // Configure the log file
            val logConfig = RtcEngineConfig.LogConfig()
            logConfig.fileSizeInKB = 256 // Range 128-1024 KB
            logConfig.level = Constants.LogLevel.getValue(Constants.LogLevel.LOG_LEVEL_WARN)
            config.mLogConfig = logConfig

            agoraEngine = RtcEngine.create(config)
            // Enable video mode
            agoraEngine!!.enableVideo()
        } catch (e: Exception) {
            sendMessage(e.toString())
            return false
        }

        // Enable dual-stream mode
        agoraEngine!!.setDualStreamMode(Constants.SimulcastStreamMode.ENABLE_SIMULCAST_STREAM)
        // If you set the dual-stream mode to AUTO_SIMULCAST_STREAM, the low-quality video
        // stream is not sent by default; the SDK automatically switches to low quality after
        // it receives a request to subscribe to a low-quality video stream.

        // Set an audio profile and an audio scenario
        agoraEngine!!.setAudioProfile(
            Constants.AUDIO_PROFILE_DEFAULT,
            Constants.AUDIO_SCENARIO_GAME_STREAMING
        )

        // Set the video profile
        val videoConfig = VideoEncoderConfiguration()
        // Set mirror mode
        videoConfig.mirrorMode = VideoEncoderConfiguration.MIRROR_MODE_TYPE.MIRROR_MODE_AUTO
        // Set frame rate
        videoConfig.frameRate = VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_10.value
        // Set bitrate
        videoConfig.bitrate = VideoEncoderConfiguration.STANDARD_BITRATE
        // Set dimensions
        videoConfig.dimensions = VideoEncoderConfiguration.VD_640x360
        // Set orientation mode
        videoConfig.orientationMode =
            VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE
        // Set degradation preference
        videoConfig.degradationPrefer =
            VideoEncoderConfiguration.DEGRADATION_PREFERENCE.MAINTAIN_BALANCED
        // Set compression preference: low latency or quality
        videoConfig.advanceOptions.compressionPreference =
            VideoEncoderConfiguration.COMPRESSION_PREFERENCE.PREFER_LOW_LATENCY
        // Apply the configuration
        agoraEngine!!.setVideoEncoderConfiguration(videoConfig)
        return true
    }

  4. Test the user's hardware

    The echo test checks that the user's hardware is working properly.


    fun startEchoTest(): SurfaceView {
        if (agoraEngine == null) setupAgoraEngine()
        // Set test configuration parameters
        val echoConfig = EchoTestConfiguration()
        echoConfig.enableAudio = true
        echoConfig.enableVideo = true
        echoConfig.channelId = channelName
        echoConfig.intervalInSeconds = 2 // Interval between recording and playback

        // Set up a SurfaceView
        val localSurfaceView = SurfaceView(mContext)
        localSurfaceView.visibility = View.VISIBLE
        // Call setupLocalVideo with a VideoCanvas having uid set to 0
        agoraEngine!!.setupLocalVideo(
            VideoCanvas(
                localSurfaceView,
                VideoCanvas.RENDER_MODE_HIDDEN,
                0
            )
        )
        echoConfig.view = localSurfaceView

        // Get a token from the server or from the config file
        if (serverUrl.contains("http")) { // A valid server url is available
            // Fetch a token from the server for channelName
            fetchToken(channelName, 0, object : TokenCallback {
                override fun onTokenReceived(rtcToken: String?) {
                    // Set the token in the config
                    echoConfig.token = rtcToken
                    // Start the echo test
                    agoraEngine!!.startEchoTest(echoConfig)
                }

                override fun onError(errorMessage: String) {
                    // Handle the error
                    sendMessage("Error: $errorMessage")
                }
            })
        } else { // Use the token from the config.json file
            echoConfig.token = config!!.optString("rtcToken")
            // Start the echo test
            agoraEngine!!.startEchoTest(echoConfig)
        }
        return localSurfaceView
    }

    fun stopEchoTest() {
        agoraEngine!!.stopEchoTest()
        destroyAgoraEngine()
    }

  5. Listen to Agora Engine events to receive state change notifications and quality statistics

    Use the following IRtcEngineEventHandler callbacks to monitor and ensure channel quality:


    override fun onLastmileQuality(quality: Int) {
        // Reports the last-mile network quality of the local user
        (mListener as CallQualityManagerListener).onLastMileQuality(quality)
    }

    override fun onLastmileProbeResult(result: LastmileProbeResult) {
        // Reports the last-mile network probe result
        agoraEngine!!.stopLastmileProbeTest()
        // The result object contains the detailed test results that help you
        // manage call quality, for example, the downlink bandwidth
        sendMessage("Available downlink bandwidth: " + result.downlinkReport.availableBandwidth)
    }

    override fun onNetworkQuality(uid: Int, txQuality: Int, rxQuality: Int) {
        // Reports the last-mile network quality of each user in the channel
        (mListener as CallQualityManagerListener).onNetworkQuality(
            uid, txQuality, rxQuality
        )
    }

    override fun onRtcStats(rtcStats: RtcStats) {
        // Reports the statistics of the current session
        counter += 1
        var msg = ""
        if (counter == 5) {
            msg = rtcStats.users.toString() + " user(s)"
        } else if (counter == 10) {
            msg = "Packet loss rate: " + rtcStats.rxPacketLossRate
            counter = 0
        }
        if (msg.isNotEmpty()) sendMessage(msg)
    }

    override fun onConnectionStateChanged(state: Int, reason: Int) {
        // Occurs when the network connection state changes
        sendMessage(
            "Connection state changed\n" +
                "New state: $state\n" +
                "Reason: $reason"
        )
    }

    override fun onRemoteVideoStateChanged(uid: Int, state: Int, reason: Int, elapsed: Int) {
        // Occurs when the remote video stream state changes
        val msg = "Remote video state changed:\n" +
            "Uid = $uid\n" +
            "NewState = $state\n" +
            "Reason = $reason\n" +
            "Elapsed = $elapsed"
        sendMessage(msg)
    }

    override fun onRemoteVideoStats(stats: RemoteVideoStats) {
        // Reports the statistics of the video stream sent by each remote user
        (mListener as CallQualityManagerListener).onRemoteVideoStats(stats)
    }

  6. Switch stream quality

    Take advantage of dual-stream mode and switch remote video quality to high or low.


    fun setStreamQuality(remoteUid: Int, highQuality: Boolean) {
        // Set the stream type of the remote video
        if (highQuality) {
            agoraEngine!!.setRemoteVideoStreamType(remoteUid, Constants.VIDEO_STREAM_HIGH)
        } else {
            agoraEngine!!.setRemoteVideoStreamType(remoteUid, Constants.VIDEO_STREAM_LOW)
        }
    }
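To apply the multi-user pattern from the overview, subscribe to the high-quality stream only for the user who has focus and to low-quality streams for everyone else. The helper below is a pure-logic sketch (true means high quality); in the app you would pass each entry to setStreamQuality:

```kotlin
// Chooses a stream quality for each remote user: high quality for the
// user in focus, low quality for all others to limit bandwidth and
// decoding load. Returns a map of uid -> highQuality flag.
fun chooseStreamQualities(remoteUids: List<Int>, focusedUid: Int?): Map<Int, Boolean> =
    remoteUids.associateWith { uid -> uid == focusedUid }
```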

Test your implementation

To ensure that you have implemented call quality features into your app:

  1. Generate a temporary token in Agora Console.

  2. Configure the web demo you use to connect to your app:

    In your browser, navigate to the Agora web demo and update App ID, Channel, and Token with the values for your temporary token, then click Join.

  1. Set the APP ID

    In agora-manager/res/raw/config.json, set appId to the AppID of your project.

  2. Set the authentication method

    Choose one of the following authentication methods:

    • Temporary token:
      1. Set rtcToken with the value of your temporary token.
      2. Set channelName with the name of the channel you used to create the token.
    • Authentication server:
      1. Set up an authentication server
      2. In config.json, set:
        • channelName with the name of a channel you want to join.
        • rtcToken to an empty string.
        • serverUrl to the base URL of your authentication server. For example, https://agora-token-service-production-1234.up.railway.app.
  3. Start the Android reference app

    1. In Android Studio, connect a physical Android device to your development machine.

    2. Click Run to start the app.

      A moment later you see the project installed on your device. If this is the first time you run the project, you need to grant microphone and camera access to your app.

  1. Choose this sample in the reference app

    From the main screen of the app, choose Interactive Live Streaming from the dropdown and then select Call quality best practice.

  2. Initialization best practice

    When the app initializes the Agora Engine, it does the following:

    • Sets the log file location, size, and logging level according to your preference.
    • Enables the dual-stream mode.
    • Sets the audio profile.
    • Sets the video profile.
    • Starts a network probe test.

    You see the result of the network probe test displayed in the network status icon.

  3. Run the echo test

    1. Press Start Echo Test. You see the local camera feed.

    2. Speak into the device microphone. You hear the recorded audio after a short delay.

      This test confirms that the user's hardware is working properly.

    3. Press Stop Echo Test to end the test.

  4. Join a channel

    Press Join to connect to the same channel as your web demo.

  5. Monitor network health

    You see the network status indicator updated periodically based on the result of the onNetworkQuality callback.

  6. View connection statistics

    After joining a channel, you receive toast messages informing you of some selected call statistics, including:

    • The number of users in the channel
    • Packet loss rate
    • Remote video state changes
  7. Test the dual stream functionality

    Tap on a remote video in one of the smaller frames. The app moves the video to the larger frame and switches video quality to high. Tap on another video in a small frame. The app moves the video in the larger frame back to a smaller frame and switches back to the low quality stream.

  8. View remote video statistics

    When a remote video is displayed in the larger frame, the app overlays the video with some selected stats for that video.

Reference

This section contains information that completes the information on this page, or points you to documentation that explains other aspects of this product.

The recommended video settings vary by scenario. For example, in a one-to-one online class, the video windows of the teacher and student are both large, which calls for higher resolutions, frame rate, and bitrate. However, in a one-to-many online class, the video windows are smaller. You can set lower resolution, frame rate, and bitrate to accommodate bandwidth limitations. The recommended settings for these different scenarios are:

  • One-to-one video call:

    • Resolution: 320 x 240; Frame rate: 15 fps; Bitrate: 200 Kbps.
    • Resolution: 640 x 360; Frame rate: 15 fps; Bitrate: 400 Kbps.
  • One-to-many video call:

    • Resolution: 160 x 120; Frame rate: 15 fps; Bitrate: 65 Kbps.
    • Resolution: 320 x 180; Frame rate: 15 fps; Bitrate: 140 Kbps.
    • Resolution: 320 x 240; Frame rate: 15 fps; Bitrate: 200 Kbps.
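One way to apply these recommendations is to pick the encoder settings from the participant count before calling setVideoEncoderConfiguration. The sketch below encodes one row from each list above; the two-participant threshold is an assumption you would tune for your app:

```kotlin
// Recommended encoder settings drawn from the lists above.
data class VideoSettings(val width: Int, val height: Int, val fps: Int, val bitrateKbps: Int)

// Picks a one-to-one profile for two participants, otherwise a lighter
// one-to-many profile to accommodate bandwidth limitations.
fun recommendedSettings(participantCount: Int): VideoSettings =
    if (participantCount <= 2) VideoSettings(640, 360, 15, 400)
    else VideoSettings(320, 180, 15, 140)
```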

Video profile table

Video SDK provides a selection of video dimensions, framerate, and bitrate to choose from. You can also customize the values according to the table below.

| Video profile | Resolution (Width × Height) | Frame rate (fps) | Bitrate (Kbps) |
|---|---|---|---|
| 120p | 160 × 120 | 15 | 65 |
| 120p_1 | 160 × 120 | 15 | 65 |
| 120p_3 | 120 × 120 | 15 | 50 |
| 180p | 320 × 180 | 15 | 140 |
| 180p_1 | 320 × 180 | 15 | 140 |
| 180p_3 | 180 × 180 | 15 | 100 |
| 180p_4 | 240 × 180 | 15 | 120 |
| 240p | 320 × 240 | 15 | 200 |
| 240p_1 | 320 × 240 | 15 | 200 |
| 240p_3 | 240 × 240 | 15 | 140 |
| 240p_4 | 424 × 240 | 15 | 220 |
| 360p | 640 × 360 | 15 | 400 |
| 360p_1 | 640 × 360 | 15 | 400 |
| 360p_3 | 360 × 360 | 15 | 260 |
| 360p_4 | 640 × 360 | 30 | 600 |
| 360p_6 | 360 × 360 | 30 | 400 |
| 360p_7 | 480 × 360 | 15 | 320 |
| 360p_8 | 480 × 360 | 30 | 490 |
| 360p_9 | 640 × 360 | 15 | 800 |
| 360p_10 | 640 × 360 | 24 | 800 |
| 360p_11 | 640 × 360 | 24 | 1000 |
| 480p | 640 × 480 | 15 | 500 |
| 480p_1 | 640 × 480 | 15 | 500 |
| 480p_2 | 640 × 480 | 30 | 1000 |
| 480p_3 | 480 × 480 | 15 | 400 |
| 480p_4 | 640 × 480 | 30 | 750 |
| 480p_6 | 480 × 480 | 30 | 600 |
| 480p_8 | 848 × 480 | 15 | 610 |
| 480p_9 | 848 × 480 | 30 | 930 |
| 480p_10 | 640 × 480 | 10 | 400 |
| 540p | 960 × 540 | 15 | 1100 |
| 720p | 1280 × 720 | 15 | 1130 |
| 720p_1 | 1280 × 720 | 15 | 1130 |
| 720p_2 | 1280 × 720 | 30 | 2000 |
| 720p_3 | 1280 × 720 | 30 | 1710 |
| 720p_5 | 960 × 720 | 15 | 910 |
| 720p_6 | 960 × 720 | 30 | 1380 |
| 720p_auto | 1280 × 720 | 30 | 3000 |
| 1080p | 1920 × 1080 | 15 | 2080 |
| 1080p_1 | 1920 × 1080 | 15 | 2080 |
| 1080p_2 | 1920 × 1080 | 30 | 3000 |
| 1080p_3 | 1920 × 1080 | 30 | 3150 |
| 1080p_5 | 1920 × 1080 | 60 | 4780 |

The default video profile is 540p. For more details, see VideoEncoderConfiguration.

Mainstream video profiles

You can also refer to the following tables to learn the default resolution, frame rate, and bitrate of the low-quality video stream for different mainstream video profiles of the high-quality video stream.

Each entry lists the resolution, frame rate (fps), and bitrate (Kbps).

| High-quality stream video profile: Communication | Default low-quality stream video profile: Communication |
|---|---|
| 320 × 240, 15, 200 | 144 × 108, 5, 20 |
| 640 × 360, 15, 400 | 288 × 162, 5, 40 |
| 640 × 480, 15, 500 | 288 × 216, 5, 50 |
| 1280 × 720, 15, 1130 | 288 × 162, 5, 113 |
| 240 × 320, 15, 200 | 108 × 144, 5, 20 |
| 360 × 640, 15, 400 | 164 × 288, 5, 40 |
| 480 × 640, 15, 500 | 216 × 288, 5, 50 |
| 720 × 1280, 15, 1130 | 164 × 288, 5, 113 |

| High-quality stream video profile: Live-broadcast | Default low-quality stream video profile: Live-broadcast |
|---|---|
| 320 × 240, 15, 350 | 160 × 120, 5, 45 |
| 640 × 360, 15, 650 | 192 × 108, 5, 50 |
| 640 × 480, 15, 800 | 160 × 120, 5, 45 |
| 1280 × 720, 15, 1600 | 192 × 108, 5, 50 |
| 240 × 320, 15, 350 | 120 × 160, 5, 45 |
| 360 × 640, 15, 650 | 108 × 192, 5, 50 |
| 480 × 640, 15, 800 | 120 × 160, 5, 45 |
| 720 × 1280, 15, 1600 | 108 × 192, 5, 50 |

This section provides the recommended video resolution, frame rate, and bitrate for high-quality and low-quality streams.

| Channel profile | Video stream type | Device system | Recommended video profile |
|---|---|---|---|
| Communication | High-quality stream | macOS, Windows | 640 × 480, 15, 500 |
| | | Android, iOS | 640 × 360, 15, 400 |
| | Low-quality stream | macOS, Windows | 320 × 180, 7, 75 |
| | | Android, iOS | 160 × 90, 7, 45 |
| Live-broadcast | High-quality stream | macOS, Windows | 640 × 480, 15, 800 |
| | | Android, iOS | 640 × 360, 15, 650 |
| | Low-quality stream | macOS, Windows | 320 × 180, 7, 126 |
| | | Android, iOS | 160 × 90, 7, 64 |

In practice, different user devices, user network conditions, application service locations, and user requirements affect which kinds of video profiles you use. Therefore, if the recommended video profiles are not suitable for you, contact technical support for assistance.

Mirror mode

By default, Video SDK does not mirror the video during encoding. You can use the mirrorMode parameter to decide whether to mirror the video that remote users see.

Connection states

When the connection state changes, Agora sends the onConnectionStateChanged callback. The following diagram illustrates the various states and how the states change as a client app joins and leaves a channel:

When the network connection is interrupted, the SDK automatically tries to reconnect to the server. The following diagram shows the callbacks received by the local user (UID1) and the remote user (UID2) when the local user joins the channel, experiences a network exception, loses the connection, and rejoins the channel.

As shown in the above diagram:

  • T0: The SDK receives the joinChannel request from UID1.
  • T1: 200 ms after calling joinChannel, UID1 joins the channel. In the process, UID1 also receives the onConnectionStateChanged(CONNECTING, CONNECTING) callback. When successfully joining the channel, UID 1 receives the onConnectionStateChanged(CONNECTED, JOIN_SUCCESS) and onJoinChannelSuccess callbacks.
  • T2: 100 ms after UID1 joins the channel, UID2 receives the onUserJoined callback.
  • T3: The uplink network condition of UID1 deteriorates. The SDK automatically tries rejoining the channel.
  • T4: If UID1 fails to receive any data from the server in four seconds, UID1 receives onConnectionStateChanged(RECONNECTING, INTERRUPTED); meanwhile the SDK continues to try rejoining the channel.
  • T5: If UID1 fails to receive any data from the server in ten seconds, UID1 receives onConnectionLost; meanwhile the SDK continues to try rejoining the channel.
  • T6: If UID2 fails to receive any data from UID1 in 20 seconds, the SDK decides that UID1 is offline. UID2 receives onUserOffline.
  • T7: If UID1 fails to rejoin the channel in 20 minutes, the SDK stops trying to rejoin the channel. UID1 receives onConnectionStateChanged(FAILED, JOIN_FAILED).
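The timeline above boils down to three thresholds for the local user: 4 seconds without data triggers the interrupted state, 10 seconds triggers onConnectionLost, and 20 minutes of failed rejoin attempts ends in failure. A sketch that models what UID1 observes after it stops receiving data:

```kotlin
// Models the reconnection timeline described above, from the point
// at which UID1 stops receiving any data from the server.
fun connectionPhase(secondsWithoutData: Int): String = when {
    secondsWithoutData < 4 -> "CONNECTED"                            // no state change yet
    secondsWithoutData < 10 -> "RECONNECTING (INTERRUPTED)"          // T4: 4 seconds
    secondsWithoutData < 20 * 60 -> "RECONNECTING (connection lost)" // T5: 10 seconds
    else -> "FAILED (JOIN_FAILED)"                                   // T7: 20 minutes
}
```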

For more detailed information about the connection states and reasons for state changes, see IRtcEngineEventHandler.onConnectionStateChanged.

List of audio profiles

Video SDK provides the following audio profile options:

List of audio scenarios

Video SDK provides the following audio scenarios to choose from:

| Audio scenario | Purpose |
|---|---|
| Default | Basic communication. |
| Chatroom Entertainment | Entertainment scenario where users need to frequently switch the user role. |
| Education | Education scenario where users want smoothness and stability. |
| Game Streaming | High-quality audio chatroom scenario where hosts mainly play music. |
| Showroom | Showroom scenario where a single host wants high-quality audio. |
| Chatroom Gaming | Gaming scenario for group chat that only contains human voice. |
| IoT | Internet of Things scenario for devices that require low power consumption. |
| Meeting | Meeting scenario that mainly contains human voice. |

Profile and scenario parameter settings for some typical applications

| Application | Profile | Scenario | Features |
|---|---|---|---|
| One-to-one classroom | Default | Default | Prioritizes the call quality with smooth transmission and high-fidelity audio. |
| Battle Royale Game | Speech Standard | Chatroom Gaming | Noise reduction. Transmits voice only. Reduces the transmission rate. Suitable for multiplayer games. |
| Murder Mystery Game | Music Standard | Chatroom Entertainment | High-fidelity audio encoding and decoding. No volume or audio quality change when you mute/unmute the microphone. |
| KTV | Music High-quality | Game Streaming | High-fidelity audio and effects. Adapts to high-fidelity audio applications. |
| Podcast | Music High-quality Stereo | Showroom | High-fidelity audio and stereo panning. Support for professional audio hardware. |
| Music education | Music Standard Stereo | Game Streaming | Prioritizes audio quality. Suitable for transmitting live external audio effects. |
| Collaborative teaching | Music Standard Stereo | Chatroom Entertainment | High-fidelity audio and effects. No volume or audio quality change when you mute/unmute the microphone. |
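The table above can be encoded directly as a lookup when your app supports several modes. The helper below returns the (profile, scenario) name pair for each application type listed; mapping these names to the corresponding Constants.AUDIO_PROFILE_* and Constants.AUDIO_SCENARIO_* values for setAudioProfile is left to the app:

```kotlin
// Returns the recommended (audio profile, audio scenario) names for an
// application type, following the parameter-settings table above.
fun audioSettingsFor(application: String): Pair<String, String>? = when (application) {
    "One-to-one classroom" -> "Default" to "Default"
    "Battle Royale Game" -> "Speech Standard" to "Chatroom Gaming"
    "Murder Mystery Game" -> "Music Standard" to "Chatroom Entertainment"
    "KTV" -> "Music High-quality" to "Game Streaming"
    "Podcast" -> "Music High-quality Stereo" to "Showroom"
    "Music education" -> "Music Standard Stereo" to "Game Streaming"
    "Collaborative teaching" -> "Music Standard Stereo" to "Chatroom Entertainment"
    else -> null
}
```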

How video is oriented on the playing device

The way video is displayed on the playing device depends on orientationMode used on the encoding device, orientation of the capturing device, orientation of the playing device, and whether screen rotation is enabled on the playing device. The following images show how the video is finally oriented based on these factors.

Orientation mode: Adaptive

  • Screen rotation: Disabled; Capturing device orientation: Landscape
  • Screen rotation: Disabled; Capturing device orientation: Portrait
  • Screen rotation: Enabled; Capturing device orientation: Landscape
  • Screen rotation: Enabled; Capturing device orientation: Portrait

Orientation mode: Landscape

  • Capturing device orientation: Landscape
  • Capturing device orientation: Portrait

Orientation mode: Portrait

  • Capturing device orientation: Portrait
  • Capturing device orientation: Landscape

API reference

Interactive Live Streaming