# fluttertpc_livekit_client

**Repository Path**: openharmony-sig/fluttertpc_livekit_client

## Basic Information

- **Project Name**: fluttertpc_livekit_client
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 1
- **Created**: 2024-06-25
- **Last Updated**: 2025-05-07

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# 🚨 **IMPORTANT**

> **⚠️ This repository has been archived. For the new address, please visit [fluttertpc_livekit_client](https://gitcode.com/openharmony-sig/fluttertpc_livekit_client).**

---

[![pub package](https://img.shields.io/pub/v/livekit_client?label=livekit_client&color=blue)](https://pub.dev/packages/livekit_client)

# LiveKit Flutter SDK

Use this SDK to add real-time video, audio and data features to your Flutter app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications such as interactive live streaming or video calls with just a few lines of code.

## NOTE

Version 2 of the SDK contains a small set of breaking changes. Read the [migration guide](https://docs.livekit.io/guides/migrate-from-v1/) for a detailed overview of what has changed.

This package is published to pub.dev as [livekit_client](https://pub.dev/packages/livekit_client).

## Docs

More docs and guides are available at [https://docs.livekit.io](https://docs.livekit.io).

## Current supported features

| Feature | Subscribe/Publish | Simulcast | Background audio | Screen sharing | End to End Encryption | Multi Codec Simulcast |
| :-----: | :---------------: | :-------: | :--------------: | :------------: | :-------------------: | :-------------------: |
| iOS | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 |
| Android | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 |
| Mac | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 |
| Windows | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 |
| Linux | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 | 🟢 |

🟢 = Available 🟡 = Coming soon (work in progress) 🔴 = Not currently available (possibly in the future)

## Example app

We built a multi-user conferencing app as an example in the [example/](example/) folder. You can join the same room from any supported LiveKit client.

Online demo: https://livekit.github.io/client-sdk-flutter/

## Installation

Add this package to your `pubspec.yaml`:

```yaml
dependencies:
  livekit_client:
```
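After running `flutter pub get`, everything shown in this README comes from a single import (a quick sanity check that the install worked):

```dart
import 'package:livekit_client/livekit_client.dart';
```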
### iOS

Camera and microphone usage must be declared in your `Info.plist` file:

```xml
<dict>
  ...
  <key>NSCameraUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your camera</string>
  <key>NSMicrophoneUsageDescription</key>
  <string>$(PRODUCT_NAME) uses your microphone</string>
  ...
</dict>
```

Your application can continue a voice call when it is switched to the background if the background mode is enabled. Select the app target in Xcode, click the Capabilities tab, enable Background Modes, and check **Audio, AirPlay, and Picture in Picture**. Your `Info.plist` should then contain the following entries:

```xml
<dict>
  ...
  <key>UIBackgroundModes</key>
  <array>
    <string>audio</string>
  </array>
  ...
</dict>
```

#### Notes

Since [Xcode 14](https://developer.apple.com/news/upcoming-requirements/?id=06062022a) no longer supports 32-bit builds, and our latest version is based on libwebrtc m104+, the iOS framework no longer supports 32-bit builds. We strongly recommend upgrading to Flutter 3.3.0+.

If you are using Flutter 3.0.0 or below, there is a high chance that your app cannot be compiled correctly due to the missing i386 and arm 32-bit frameworks ([#132](https://github.com/livekit/client-sdk-flutter/issues/132), [#172](https://github.com/livekit/client-sdk-flutter/issues/172)). You can try modifying your `{projects_dir}/ios/Podfile` to work around this issue:

```ruby
post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)
    target.build_configurations.each do |config|
      # Workaround for https://github.com/flutter/flutter/issues/64502
      config.build_settings['ONLY_ACTIVE_ARCH'] = 'YES' # <= this line
    end
  end
end
```

For iOS, the minimum supported deployment target is `12.1`. You will need to add the following to your Podfile:

```ruby
platform :ios, '12.1'
```

You may need to delete `Podfile.lock` and re-run `pod install` after updating the deployment target.

### Android

A set of permissions needs to be declared in your `AndroidManifest.xml`. These are required by Flutter WebRTC, which we depend on (see the flutter_webrtc docs for the authoritative list):

```xml
<manifest ...>
  ...
  <uses-permission android:name="android.permission.INTERNET" />
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
  <uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
  <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
  <uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30" />
  <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
  ...
</manifest>
```

For Bluetooth headsets to work correctly on Android devices, add `permission_handler` to your project and call the following code after launching your app for the first time:

```dart
import 'package:permission_handler/permission_handler.dart';

Future<void> _checkPermissions() async {
  var status = await Permission.bluetooth.request();
  if (status.isPermanentlyDenied) {
    print('Bluetooth Permission disabled');
  }
  status = await Permission.bluetoothConnect.request();
  if (status.isPermanentlyDenied) {
    print('Bluetooth Connect Permission disabled');
  }
}

void main() async {
  WidgetsFlutterBinding.ensureInitialized();
  await _checkPermissions();
  runApp(MyApp());
}
```

#### Audio Modes

By default, we use the `communication` audio mode on Android, which works best for two-way voice communication. If your app is oriented toward media playback and does not need the device's microphone, you can use the `media` audio mode, which provides better audio quality:

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart' as webrtc;

Future<void> _initializeAndroidAudioSettings() async {
  await webrtc.WebRTC.initialize(options: {
    'androidAudioConfiguration': webrtc.AndroidAudioConfiguration.media.toMap()
  });
  webrtc.Helper.setAndroidAudioConfiguration(
      webrtc.AndroidAudioConfiguration.media);
}

void main() async {
  await _initializeAndroidAudioSettings();
  runApp(const MyApp());
}
```

Note: with the `media` mode, audio routing is controlled by the system and cannot be changed manually with functions like `Hardware.selectAudioOutput`.

### Desktop support

To enable Flutter desktop development, follow the [instructions here](https://docs.flutter.dev/desktop#set-up). On Windows, [VS 2019](https://visualstudio.microsoft.com/thank-you-downloading-visual-studio/?sku=community&rel=16) is needed (the link in the Flutter docs downloads VS 2022).

## Usage

### Connecting to a room, publishing video & audio

```dart
final roomOptions = RoomOptions(
  adaptiveStream: true,
  dynacast: true,
  // ... your room options
);

final room = Room();
await room.connect(url, token, roomOptions: roomOptions);
try {
  // video will fail when running in ios simulator
  await room.localParticipant.setCameraEnabled(true);
} catch (error) {
  print('Could not publish video, error: $error');
}

await room.localParticipant.setMicrophoneEnabled(true);
```
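When a session ends, disconnect and stop publishing local media. A minimal teardown sketch (`room` is the instance created above; `Room.disconnect()` is part of the SDK, the rest is just tidy-up):

```dart
Future<void> leaveRoom(Room room) async {
  // Stop publishing local media before leaving (optional, but tidy).
  await room.localParticipant?.setCameraEnabled(false);
  await room.localParticipant?.setMicrophoneEnabled(false);

  // Disconnect from the server; subscribed tracks are torn down automatically.
  await room.disconnect();
}
```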
### Screen sharing

Screen sharing is supported across all platforms. You can enable it with:

```dart
room.localParticipant.setScreenShareEnabled(true);
```

#### Android

On Android, you have to define a foreground service in your `AndroidManifest.xml`:

```xml
<application>
  ...
  <service
      android:name="de.julianassmann.flutter_background.IsolateHolderService"
      android:foregroundServiceType="mediaProjection"
      android:exported="false" />
  ...
</application>
```

#### iOS

On iOS, a broadcast extension is needed in order to capture screen content from other apps. See the [setup guide](https://github.com/flutter-webrtc/flutter-webrtc/wiki/iOS-Screen-Sharing#broadcast-extension-quick-setup) for instructions.

#### Desktop (Windows/macOS)

On desktop you can use `ScreenSelectDialog` to select the window or screen you want to share:

```dart
try {
  final source = await showDialog<DesktopCapturerSource>(
    context: context,
    builder: (context) => ScreenSelectDialog(),
  );
  if (source == null) {
    print('cancelled screenshare');
    return;
  }
  print('DesktopCapturerSource: ${source.id}');
  var track = await LocalVideoTrack.createScreenShareTrack(
    ScreenShareCaptureOptions(
      sourceId: source.id,
      maxFrameRate: 15.0,
    ),
  );
  await room.localParticipant.publishVideoTrack(track);
} catch (e) {
  print('could not publish screen sharing: $e');
}
```

### End to End Encryption

LiveKit supports end-to-end encryption for audio/video data sent over the network. On native platforms, E2EE works without any extra setup; for Flutter web, you need the following steps to create the `e2ee.worker.dart.js` file:

```bash
# for example app
dart compile js web/e2ee.worker.dart -o example/web/e2ee.worker.dart.js -m

# for your project
export YOU_PROJECT_DIR=your_project_dir
git clone https://github.com/livekit/client-sdk-flutter.git
cd client-sdk-flutter && flutter pub get
dart compile js web/e2ee.worker.dart -o ${YOU_PROJECT_DIR}/web/e2ee.worker.dart.js -m
```
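To turn E2EE on, pass encryption options when connecting. A sketch assuming the shared-key provider API used by the SDK's e2ee example (`BaseKeyProvider`, `E2EEOptions`; check the current API docs if the names differ):

```dart
// Shared-key E2EE sketch: every participant must set the same passphrase.
final keyProvider = await BaseKeyProvider.create();
await keyProvider.setKey('my-shared-passphrase'); // hypothetical shared key

final roomOptions = RoomOptions(
  e2eeOptions: E2EEOptions(keyProvider: keyProvider),
);

final room = Room();
await room.connect(url, token, roomOptions: roomOptions);
```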
### Advanced track manipulation

The `setCameraEnabled`/`setMicrophoneEnabled` helpers are wrappers around the Track API. You can also manually create and publish tracks:

```dart
var localVideo = await LocalVideoTrack.createCameraTrack();
await room.localParticipant.publishVideoTrack(localVideo);
```

### Rendering video

Each track can be rendered separately with the provided `VideoTrackRenderer` widget.

```dart
VideoTrack? track;

@override
Widget build(BuildContext context) {
  var track = this.track;
  if (track != null) {
    return VideoTrackRenderer(track);
  } else {
    return Container(
      color: Colors.grey,
    );
  }
}
```

### Audio handling

Audio tracks are played automatically as long as you are subscribed to them.

### Handling changes

LiveKit client makes it simple to build declarative UI that reacts to state changes. It notifies changes in two ways:

- `ChangeNotifier` - generic notification of changes. This is useful when you are building reactive UI and only care about changes that may impact rendering.
- `EventsListener` - listener pattern to listen to specific events (see [events.dart](https://github.com/livekit/client-sdk-flutter/blob/main/lib/src/events.dart)).

This example will show you how to use both to react to room events.

```dart
class RoomWidget extends StatefulWidget {
  final Room room;

  RoomWidget(this.room);

  @override
  State<StatefulWidget> createState() {
    return _RoomState();
  }
}

class _RoomState extends State<RoomWidget> {
  late final EventsListener<RoomEvent> _listener = widget.room.createListener();

  @override
  void initState() {
    super.initState();
    // used for generic change updates
    widget.room.addListener(_onChange);

    // used for specific events
    _listener
      ..on<RoomDisconnectedEvent>((_) {
        // handle disconnect
      })
      ..on<ParticipantConnectedEvent>((e) {
        print("participant joined: ${e.participant.identity}");
      });
  }

  @override
  void dispose() {
    // be sure to dispose listener to stop listening to further updates
    _listener.dispose();
    widget.room.removeListener(_onChange);
    super.dispose();
  }

  void _onChange() {
    // perform computations and then call setState
    // setState will trigger a build
    setState(() {
      // your updates here
    });
  }

  @override
  Widget build(BuildContext context) {
    // your build function
  }
}
```

Similarly, you can do the same when rendering participants. Reacting to changes makes it possible to handle tracks being published or unpublished, and to re-order participants in your UI.

```dart
class VideoView extends StatefulWidget {
  final Participant participant;

  VideoView(this.participant);

  @override
  State<StatefulWidget> createState() {
    return _VideoViewState();
  }
}

class _VideoViewState extends State<VideoView> {
  TrackPublication? videoPub;

  @override
  void initState() {
    super.initState();
    widget.participant.addListener(this._onParticipantChanged);
    // trigger initial change
    _onParticipantChanged();
  }

  @override
  void dispose() {
    widget.participant.removeListener(this._onParticipantChanged);
    super.dispose();
  }

  @override
  void didUpdateWidget(covariant VideoView oldWidget) {
    oldWidget.participant.removeListener(_onParticipantChanged);
    widget.participant.addListener(_onParticipantChanged);
    _onParticipantChanged();
    super.didUpdateWidget(oldWidget);
  }

  void _onParticipantChanged() {
    var subscribedVideos = widget.participant.videoTracks.values.where((pub) {
      return pub.kind == TrackType.VIDEO &&
          !pub.isScreenShare &&
          pub.subscribed;
    });

    setState(() {
      if (subscribedVideos.length > 0) {
        var videoPub = subscribedVideos.first;
        // when muted, show placeholder
        if (!videoPub.muted) {
          this.videoPub = videoPub;
          return;
        }
      }
      this.videoPub = null;
    });
  }

  @override
  Widget build(BuildContext context) {
    var videoPub = this.videoPub;
    if (videoPub != null) {
      return VideoTrackRenderer(videoPub.track as VideoTrack);
    } else {
      return Container(
        color: Colors.grey,
      );
    }
  }
}
```

### Mute, unmute local tracks

On `LocalTrackPublication`s, you can control whether the track is muted by setting its `muted` property. Changing the mute status will generate an `onTrackMuted` or `onTrackUnmuted` delegate call for the local participant. Other participants will receive the status change as well.

```dart
// mute track
trackPub.muted = true;

// unmute track
trackPub.muted = false;
```

### Subscriber controls

When subscribing to remote tracks, the client has precise control over the status of its subscriptions. You can subscribe or unsubscribe to a track, change its quality, or disable the track temporarily. These controls are accessible on the `RemoteTrackPublication` object; a sketch follows below. For more info, see [Subscriber controls](https://docs.livekit.io/guides/room/receive#subscriber-controls).
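A sketch of what those controls look like in code. The method names (`subscribe`, `unsubscribe`, `setVideoQuality`, `disable`, `enable`) are assumed from the current `RemoteTrackPublication` API; consult the linked guide if they differ:

```dart
Future<void> tuneSubscription(RemoteTrackPublication pub) async {
  await pub.unsubscribe();                     // stop receiving the track
  await pub.subscribe();                       // start receiving it again
  await pub.setVideoQuality(VideoQuality.LOW); // request a lower simulcast layer
  await pub.disable();                         // pause delivery without unsubscribing
  await pub.enable();                          // resume delivery
}
```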
## Getting help / Contributing

Please join us on [Slack](https://livekit.io/join-slack) to get help from our devs and community members. We welcome your contributions (PRs), and details can be discussed there.

## License

Apache License 2.0

## Thanks

A huge thank you to [flutter-webrtc](https://github.com/flutter-webrtc/flutter-webrtc) for making it possible to use WebRTC in Flutter.
---

**LiveKit Ecosystem**

- **Real-time SDKs**: React Components · JavaScript · iOS/macOS · Android · Flutter · React Native · Rust · Python · Unity (web) · Unity (beta)
- **Server APIs**: Node.js · Golang · Ruby · Java/Kotlin · Python · Rust · PHP (community)
- **Agents Frameworks**: Python · Playground
- **Services**: LiveKit server · Egress · Ingress · SIP
- **Resources**: Docs · Example apps · Cloud · Self-hosting · CLI