WebRTC plugin for Flutter Mobile/Desktop/Web
Feature | Android | iOS | Web | macOS | Windows | Linux | Fuchsia |
---|---|---|---|---|---|---|---|
Audio/Video | ✔️ | ✔️ | ✔️ | ✔️ | [WIP] | [WIP] | |
Data Channel | ✔️ | ✔️ | ✔️ | ✔️ | [WIP] | [WIP] | |
Screen Capture | ✔️ | ✔️ | ✔️ | | | | |
Unified-Plan | | | | | | | |
MediaRecorder | ✔️ | | | | | | |
Add `flutter_webrtc` as a dependency in your `pubspec.yaml` file.
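For example (the version constraint below is a placeholder; check pub.dev for the latest release before pinning one):

```yaml
dependencies:
  flutter_webrtc: any # placeholder; pin the latest version from pub.dev
```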
Add the following entries to your `Info.plist` file, located in `<project root>/ios/Runner/Info.plist`:

```xml
<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) Camera Usage!</string>
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) Microphone Usage!</string>
```
These entries allow your app to access the camera and microphone.
Ensure the following permissions are present in your Android Manifest file, located in `<project root>/android/app/src/main/AndroidManifest.xml`:

```xml
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
```
If you need to use a Bluetooth device, please add:

```xml
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" />
```
The Flutter project template already adds some of these entries, so they may already be present.
You will also need to set your build settings to Java 8, because the official WebRTC JAR now uses static methods in the `EglBase` interface. Just add this to your app-level `build.gradle`:

```groovy
android {
    //...
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}
```
The modification requires a specific initialization order (see the sketch after this list):

- You need to call `WebRTCInit.initialize` before using the library. It takes three parameters: the notification title, text, and channel name. It starts the foreground service and sets up the sticky notification.
- You then must call either `getUserMedia` or `getDisplayMedia`. This initializes the rest of the library and selects the correct audio source (if available) on Android.
- Only then can you call other methods (e.g. `createPeerConnection`).
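A minimal Dart sketch of that order, assuming the three-parameter `WebRTCInit.initialize` described above and the plugin's `navigator.mediaDevices` / `createPeerConnection` entry points (exact names and signatures may differ between plugin versions):

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

Future<void> startScreenShare() async {
  // 1. Start the foreground service and its sticky notification first.
  //    Assumed positional parameters: notification title, text, channel name.
  await WebRTCInit.initialize(
    'Screen sharing',            // notification title
    'Your screen is being cast', // notification text
    'screen_share',              // notification channel name
  );

  // 2. Acquire media next; this initializes the rest of the library and
  //    selects the correct audio source on Android.
  final MediaStream stream = await navigator.mediaDevices
      .getDisplayMedia({'audio': true, 'video': true});

  // 3. Only now is it safe to call other methods such as createPeerConnection.
  final pc = await createPeerConnection({
    'iceServers': [
      {'urls': 'stun:stun.l.google.com:19302'}, // example STUN server
    ],
  });
  for (final track in stream.getTracks()) {
    await pc.addTrack(track, stream);
  }
}
```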
The app requires one additional permission in the main Android Manifest:

```xml
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
```
Additionally, you need to define a service for the main application:

```xml
<service android:name="com.cloudwebrtc.webrtc.GetUserMediaImpl" android:foregroundServiceType="mediaProjection" />
```
In the main `build.gradle` you will need to increase the `minSdkVersion` of `defaultConfig` up to `21` (or `24`, because of a bug in Android Studio) and set `targetSdkVersion` to `29`. The same goes for `compileSdkVersion` (it also needs to be set to `29`).
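Put together, the relevant part of the app-level `build.gradle` might look like this (a sketch using the version numbers above):

```groovy
android {
    compileSdkVersion 29

    defaultConfig {
        //...
        minSdkVersion 21   // or 24, if you hit the Android Studio bug mentioned above
        targetSdkVersion 29
    }
}
```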
The app needs to create a Broadcast Upload Extension in order to cast a full screen share (beyond the app's own screens). The `example` directory contains all the working code needed to handle sample buffers (both video and audio) properly using the modified `GoogleWebRTC` framework.
The app needs to share data with the extension in some way, and one option is to create an App Group and share the same `NSUserDefaults`. This approach is used in the `SampleHandler` file of the example project. In order for this to work, both the app and the extension must be in the same App Group.
Some parts of `GoogleWebRTC` that are being used support only the ARM64 architecture, so the project won't compile if the `armv7` or `armv7s` architectures are included.
The project is inseparable from the contributors of the community.
- CloudWebRTC - Original Author
- RainwayApp - Sponsor
- 亢少军 - Sponsor
For more examples, please refer to flutter-webrtc-demo.
This project exists thanks to all the people who contribute. [Contribute].
Become a financial contributor and help us sustain our community. [Contribute]
Support this project with your organization. Your logo will show up here with a link to your website. [Contribute]