Download Symbl.ai's demo app for AWS Chime. The continuous conversation intelligence you receive extends beyond mere transcription to contextual insights such as questions, action items, follow-ups, and topics, and lets you track personalized conversation intentions among speakers.

Demo App: Symbl Conversation AI Adapter for Chime SDK

Symbl's APIs empower developers to enable:

  • Real-time analysis of free-flowing discussions to automatically surface highly relevant summary discussion topics, contextual insights, suggestive action items, follow-ups, decisions, and questions.
  • Voice APIs that make it easy to add AI-powered conversation intelligence to either telephony or WebSocket interfaces.
  • Conversation APIs that provide a REST interface for managing and processing your conversation data.
  • Summary UI with a fully customizable and editable reference experience that indexes a searchable transcript and shows generated actionable insights, topics, timecodes, and speaker information.

Features

An application that demonstrates the capabilities and easy integration of Symbl's Conversational AI Adapter with the Chime SDK.

Initialization

The Symbl adapter should be initialized after you have connected to your Chime video conference and your access token has been generated by your backend service. For instructions on how to do this, please see here.
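
As a rough sketch, retrieving that join information from your backend might look like the following (the /join endpoint, its query parameters, and the response shape are illustrative assumptions, not part of the adapter):

// Illustrative sketch: ask your backend for the Chime join info plus a Symbl access token.
// The endpoint name, query parameters, and response shape depend entirely on your backend.
const response = await fetch(
    `/join?title=${encodeURIComponent(meetingTitle)}&name=${encodeURIComponent(attendeeName)}`,
    { method: 'POST' }
);
const joinInfo = await response.json();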

Once your Symbl Access Token is retrieved, you can assign it to the static ACCESS_TOKEN property on the Symbl class.

Symbl.ACCESS_TOKEN = joinInfo.Symbl.accessToken;

Once your access token is set, you can call the constructor as shown below.

/**
    @param {object} chime - Chime instance
        @property {object} configuration : {
            @property {object} credentials: {
                ...
                @property {string} attendeeId -- Client attendee ID
                @property {string} externalUserId
                @property {string} joinToken
                ...
            },
            @property {string} meetingId -- UUID of the meeting
        },
        @property {string} meeting -- Meeting name
    }
    @param {object} config - Symbl configuration
        @property {number} confidenceThreshold  optional | default: 0.5 | 0.0 - 1.0 minimum confidence value to produce a valid insight
        @property {string} languageCode         optional | default: 'en-US' | The language code as per the BCP 47 specification
        @property {boolean} insightsEnabled     optional | default: true  | Set to false if the language code is not English
        @property {boolean} speechRecognition   optional | default: false | Speaker identity to use for audio in this WebSocket connection. If omitted, no speaker identification will be used for processing
*/
const symbl = new Symbl({
    configuration: {
        credentials: {
            attendeeId: chime.credentials.attendeeId,
            externalUserId: chime.credentials.externalUserId,
        },
        meetingId: "acbd0689-9b84-42f7-b8b8-9bc3aa7b057a",
    },
}, {
    confidenceThreshold: 0.5,
    languageCode: 'en-US',
    insightsEnabled: true,
    speechRecognition: true,
});

Realtime Closed Captioning

Realtime closed captioning can easily be added to your Chime SDK video chat application by creating a handler with three callback functions:

  • onClosedCaptioningToggled - Called whenever closed captioning is toggled on or off.
  • subtitleCreated - Called whenever speech is first detected and a new captioning object is created.
  • subtitleUpdated - Called whenever subsequent speech is detected and the existing caption is updated.

The handler can be added by calling the subscribeToCaptioningEvents function of your Symbl instance.

const captioningHandler = {
    onClosedCaptioningToggled: (ccEnabled: boolean) => {
        // Implement
    },
    subtitleCreated: (subtitle: Caption) => {
        console.warn('Subtitle created', subtitle);
        // Retrieve the video element that you wish to add the subtitle tracks to.
        const activeVideoElement = getActiveVideoElement() as HTMLVideoElement;
        if (activeVideoElement) {
            subtitle.setVideoElement(activeVideoElement);
        }
    },
    subtitleUpdated: (subtitle: Caption) => {
        const activeVideoElement = getActiveVideoElement() as HTMLVideoElement;
        // Check if the video element is set correctly
        if (!subtitle.videoElement && activeVideoElement) {
            subtitle.setVideoElement(activeVideoElement);
        }
        if (activeVideoElement && subtitle.videoElement !== activeVideoElement) {
            console.log('Active video element changed', activeVideoElement);
            subtitle.setVideoElement(activeVideoElement);
        }
    },
};
symbl.subscribeToCaptioningEvents(captioningHandler);

Setting the video element that subtitles will be superimposed over should be done by calling the setVideoElement function on the Caption class.

If your video chat application has alternating primary video tiles, this can be used to change which element is active.
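
For example, a minimal sketch of re-pointing captions when the primary tile changes might look like this (getActiveVideoElement and currentCaption are placeholders for your own application state, not part of the adapter):

// Sketch: re-attach the current caption when the primary video tile changes.
// getActiveVideoElement() and currentCaption are assumed to come from your own application state.
function onPrimaryTileChanged() {
    const activeVideoElement = getActiveVideoElement() as HTMLVideoElement;
    if (currentCaption && activeVideoElement) {
        // Superimpose the caption over the newly active tile.
        currentCaption.setVideoElement(activeVideoElement);
    }
}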

Realtime Insights

Realtime insights are generated as Symbl processes the conversation in your video chat platform.

The Symbl adapter exposes a function, subscribeToInsightEvents, that takes a handler with a function called onInsightsCreated. Insights are enabled by default and can be controlled explicitly by passing the insightsEnabled property in the config parameter of the Symbl constructor.

new Symbl(chimeConfiguration, {insightsEnabled: true});

By creating a handler and passing it into the subscribeToInsightEvents function, you can either create a default element or use the data included in the Insight object returned in the handler's callback.

const insightHandler = {
    onInsightCreated: (insight: Insight) => {
        // Creates a predesigned insight widget.
        const element = insight.createElement();
        // Customize any styling
        element.classList.add('mx-auto');
        element.style.width = '98%';
        // Get the container you wish to add insights to.
        const insightContainer = document.getElementById('receive-insight');
        // Call add on the insight object to add it to the div.
        insight.add(insightContainer);
    }
};
// Subscribe to realtime insight events using the handler created above
this.symbl.subscribeToInsightEvents(insightHandler);

Realtime Topics

Realtime topics are generated as Symbl processes the conversation in your video chat platform. When a topic is detected by Symbl and the onTopicCreated event is emitted, a Topic object is passed to the callback function provided in the topic handler. The Symbl adapter exposes a function, subscribeToTopicEvents, that takes a handler with a callback function onTopicCreated. Topics are enabled by default.

  • Example:

When a topic event is emitted from the onTopicCreated handler, you can use the returned Topic object to either create a default element with the createElement function, or use the data it contains to create your own element, capture the data and store it as a metric, etc.

        const topicHandler = {
            onTopicCreated: (topic: Topic) => {
                // Scale the font size of the new topic element based on its score
                const content = topic.phrases;
                const score = topic.score;
                const fontSize = score * 40 + 8;
                let element = topic.createElement();
                element.innerText = content;
                element.style.fontSize = String(fontSize) + 'px';
                // If you have a Topics container you can add this element, with the font size varying based on the topic's score
                document.getElementById('Topics').appendChild(element);
            }
        };
        // Subscribe to realtime topic events using the handler created above
        this.symbl.subscribeToTopicEvents(topicHandler);

Realtime Trackers

Realtime trackers are generated as Symbl processes the conversation in your video chat platform. When a tracker is detected by Symbl and the onTrackerCreated event is emitted, a Tracker object is passed to the callback function provided in the tracker handler. The Tracker class holds data about the generated tracker. The Symbl adapter exposes a function, subscribeToTrackerEvents, that takes a handler with a callback function onTrackerCreated. Trackers are enabled by passing a list of dictionaries, each containing a name and the vocabulary phrases to be found in the conversation.

new Symbl(chimeConfiguration, {
    trackers: [
        {
            name: "COVID-19",
            vocabulary: [
                "social distancing",
                "cover your face with mask",
                "vaccination"
            ]
        }
    ]
});

Subscribing to the Tracker publisher is achieved by passing a handler to the subscribeToTrackerEvents function of your Symbl instance.

  • Example

When a tracker event is emitted from the onTrackerCreated handler, you can use the returned Tracker object to either create a default element with the createElement function, or use the data it contains to create your own element, capture the data and store it as a metric, etc.

        const trackerHandler = {
            onTrackerCreated: (tracker: Tracker) => {
                const name = tracker.name;
                const matches = tracker.matches;
                let currentCategoryHit = 0;
                // Count the number of non-empty messageRefs in the current tracker
                for (let i = 0; i < matches.length; i++) {
                    if (matches[i]["messageRefs"].length > 0) {
                        currentCategoryHit += 1;
                    }
                }
                let element = tracker.createElement();
                element.innerText = name + ':' + String(currentCategoryHit);
                element.style.fontSize = String(12 + currentCategoryHit) + 'px';
                // If you have a Trackers container you can add this element, with the font size varying based on
                // the number of messageRefs, i.e. how many times the tracker was found in the conversation
                document.getElementById('Trackers').appendChild(element);
            }
        };
        // Subscribe to realtime tracker events using the handler created above
        this.symbl.subscribeToTrackerEvents(trackerHandler);

Prerequisites

You must have the following installed:

  • Node.js version 10 or higher
  • npm version 6.11 or higher

Installation

Make sure you have Node.js version 10 or higher. Clone this repository and install the dependencies for the root project, the browser demo, and the serverless demo:

git clone https://github.com/symblai/symbl-adapter-for-chime-demo
cd symbl-adapter-for-chime-demo
npm install
cd demos/browser
npm install
cd ../serverless
npm install

Set Up for Symbl

Symbl Credentials

  • Create an account in the Symbl Console if you don't have one already.
  • After you login, you will find your appId and appSecret on the home page.
  • Create a .env file in demos/browser and demos/serverless/src that includes your appId and appSecret as shown below.
SYMBL_APP_ID=<xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx>
SYMBL_APP_SECRET=<xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx>

The App ID and App Secret are used to authenticate your session with Symbl by generating an access token. Your App ID and Secret should not be shared or posted publicly.
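
As a rough server-side sketch, exchanging these credentials for an access token generally looks like the following (check Symbl's authentication documentation for the current request shape before relying on it):

// Server-side sketch: exchange your appId/appSecret for a Symbl access token.
// Keep this on your backend so the credentials are never exposed to the browser.
// Requires a fetch implementation (Node 18+, or a polyfill such as node-fetch).
const response = await fetch('https://api.symbl.ai/oauth2/token:generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
        type: 'application',
        appId: process.env.SYMBL_APP_ID,
        appSecret: process.env.SYMBL_APP_SECRET,
    }),
});
const { accessToken } = await response.json();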

Local Demo

This demo shows how one might implement the Symbl Conversational AI Adapter for the Amazon Chime SDK to build meeting applications with realtime transcription and insights using Symbl's Realtime WebSocket API.

Prerequisites - Local

To build, test, and run demos from source you will need:

  • Node 10 or higher
  • npm 6.11 or higher

Serverless Deployment

Run deployment script

Make sure you have built the browser application by running

cd demos/browser
npm run build

The following will create a CloudFormation stack containing a Lambda and API Gateway deployment that runs the meetingV2 demo. Make sure the bucket and stack names are unique within AWS.

cd demos/serverless/
npm run deploy -- -r us-east-1 -b <my-bucket> -s <my-stack-name> -a meetingV2

This script will create an S3 bucket and CloudFormation stack with Lambda and API Gateway resources required to run the demo. After the script finishes, it will output a URL that can be opened in a browser.

Running the browser demos with a local server

  1. Navigate to the demos/browser folder: cd demos/browser

  2. Start the demo application: npm run start

  3. Open http://localhost:8080 in your browser.

Running

Browser demo applications are located in the app folder.

To run the Symbl Conversation AI demo application use:

npm run start

After running start the first time, you can speed things up on subsequent iterations by using start:fast, e.g.

npm run start:fast

Community

If you have any questions, feel free to reach out to us at [email protected] or through our Community Slack or our forum.

This guide is actively developed, and we love to hear from you! Please feel free to create an issue or open a pull request with your questions, comments, suggestions and feedback. If you liked our integration guide, please star our repo!

This library is released under the Apache License.
