
LiveData Conversation Intelligence

The LiveData Conversation Intelligence extension enables you to quickly add real-time speech transcription and translation features to your app.

This page shows you how to integrate and use the extension in your app.

Understand the tech

The LiveData Conversation Intelligence extension encapsulates a cloud-based real-time speech recognition and translation API. You can quickly integrate LiveData Conversation Intelligence capabilities by passing the specified key and value parameters to the setExtensionProperty method provided in Agora SDK v4.x.

The key parameter of setExtensionProperty corresponds to the name of a cloud API, and the value parameter wraps some or all of that API's parameters in JSON format. By passing in the appropriate key and value, you call the corresponding cloud API and implement the LiveData Conversation Intelligence features.
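
For illustration, a call that follows this pattern is sketched below; engine is your RtcEngine instance, and the provider name, extension name, key, and JSON fields shown here are placeholders rather than the extension's real values. The actual values used by LiveData Conversation Intelligence appear in the integration steps later on this page.

    // Illustrative sketch only: placeholder provider, extension, key, and parameters.
    JSONObject params = new JSONObject();
    params.put("someParameter", "someValue"); // the cloud API's parameters, wrapped as JSON

    engine.setExtensionProperty(
            "providerName",     // extension provider
            "extensionName",    // extension name
            "cloudApiName",     // key: the name of the cloud API to call
            params.toString()); // value: the API parameters as a JSON string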

Prerequisites

Ensure that your development environment meets the following requirements:

  • Android Studio 4.1 or later.
  • A physical device (not an emulator) running Android 5.0 or later.
  • The LiveData Conversation Intelligence extension is used together with Agora Video SDK v4.x.

    Refer to the SDK quickstart to integrate Video SDK v4.x and implement basic video calling. A minimal sketch of this setup follows this list.
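
The following is a hedged, minimal sketch of the kind of engine setup and channel join that the SDK quickstart walks through. The app ID, token, and channel name are placeholders you supply from your own Agora project, and error handling is reduced to the bare minimum.

    // Minimal Video SDK v4.x setup sketch (classes from the io.agora.rtc2 package).
    // Values in angle brackets are placeholders; getApplicationContext() assumes
    // this code runs inside an Activity or other Android Context.
    try {
        RtcEngineConfig config = new RtcEngineConfig();
        config.mContext = getApplicationContext();
        config.mAppId = "<your Agora app ID>";
        config.mEventHandler = new IRtcEngineEventHandler() { };
        engine = RtcEngine.create(config);
    } catch (Exception e) {
        throw new RuntimeException("Failed to create RtcEngine: " + e.getMessage());
    }
    engine.enableVideo();

    ChannelMediaOptions options = new ChannelMediaOptions();
    options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER;
    engine.joinChannel("<your token>", "<channel name>", 0, options);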

Project setup

The LiveData Conversation Intelligence extension provides a sample project on GitHub to help you get started quickly.

  1. Set up the sample project

    To set up and run the LiveData Conversation Intelligence sample project, do the following:

    1. Clone the GitHub repository.

      Execute the following command in the terminal:


      git clone https://github.com/highras/rtvt-agora-marketplace.git

    2. Open the sample project in Android Studio.

    3. Sync project with Gradle files.

    4. Connect a physical Android device (not an emulator) and run the project.

  2. Test translation and transcription features

    Once you have installed the sample project on your device, follow these steps to test the translation and transcription features:

    1. Start the app. Fill in the channel name in the input box, and click Join.
    2. Click Start Translation to begin the translation. Speak into the device. You see the transcription and translation on the screen in real time.
    3. Click End Translation to end transcription and translation.
    4. Click End Plug-in to stop using the extension.

Integrate the extension

This section shows you how to integrate the LiveData Conversation Intelligence extension, and call the core API to perform real-time speech recognition and translation.

To integrate the extension in your project:

  1. Purchase and activate the extension

    Visit the Extensions Marketplace and follow the prompts to purchase the LiveData Conversation Intelligence extension. Save the appKey and appSecret you obtain. You use these values to initialize the extension.

  2. Integrate the extension

    Refer to the following steps:

    1. From the Extensions Marketplace, download the Android extension package of the LiveData Conversation Intelligence extension. Unzip the package and save all .aar files to the /app/libs path in your project folder.

    2. Open app/build.gradle and add the following line under dependencies:


      implementation fileTree(dir: "libs", include: ["*.jar", "*.aar"])

  3. Enable the extension

    When initializing the Agora Engine, call addExtension to load the extension and then call enableExtension to enable it:


    RtcEngineConfig config = new RtcEngineConfig();
    // Set the remaining RtcEngineConfig fields (context, app ID, event handler) as in the SDK quickstart.
    // Load the LiveData Conversation Intelligence extension.
    config.addExtension("agora-iLiveData-filter");
    engine = RtcEngine.create(config);
    // Enable the extension by provider name and extension name.
    engine.enableExtension("iLiveData", "RTVT", true);

  4. Start transcription and translation

    Prepare a JSON object with the appKey and appSecret values, and the source and target languages:


    JSONObject jsonObject = new JSONObject();
    // Pass in the `appKey` and `appSecret` obtained when purchasing and activating the extension in the Agora console.
    jsonObject.put("appKey", "80001000");
    jsonObject.put("appSecret", "qwerty");
    // Set source language
    jsonObject.put("srclang", "zh");
    // Set target language
    jsonObject.put("dstLang", "en");

    Call setExtensionProperty with the startAudioTranslation key:


    engine.setExtensionProperty(EXTENSION_VENDOR_NAME,
            EXTENSION_AUDIO_FILTER_VOLUME, "startAudioTranslation", jsonObject.toString());

  5. Get transcription and translation results

    After transcription and translation start successfully, the LiveData Conversation Intelligence extension returns the transcription and translation results through the onEvent callback. A hedged sketch of handling this callback follows.
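
    This sketch assumes the results arrive through the onEvent callback of the SDK's IMediaExtensionObserver (the sample project shows how the observer is registered). The JSON field names parsed from value ("asr", "trans") are placeholders for illustration only; check the extension's API reference or the sample project for the actual payload format.

    // Inside your IMediaExtensionObserver implementation.
    // Requires org.json.JSONObject, org.json.JSONException, and android.util.Log.
    @Override
    public void onEvent(String provider, String extension, String key, String value) {
        try {
            JSONObject result = new JSONObject(value);
            String transcription = result.optString("asr");  // hypothetical field name
            String translation = result.optString("trans");  // hypothetical field name
            Log.d("RTVT", key + ": " + transcription + " -> " + translation);
        } catch (JSONException e) {
            Log.e("RTVT", "Failed to parse extension event: " + value, e);
        }
    }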

  6. Stop using the extension

    Call the setExtensionProperty method and specify the key as closeAudioTranslation to end the use of the LiveData Conversation Intelligence extension:


    engine.setExtensionProperty(EXTENSION_VENDOR_NAME,
            EXTENSION_AUDIO_FILTER_VOLUME, "closeAudioTranslation", "end");

Reference

This section contains content that completes the information on this page, or points you to documentation that explains other aspects of this product.

API reference
