# LiveKit Unity SDK
> ⚠️ **Warning**
>
> This SDK is currently in Developer Preview mode and not ready for production use. There will be bugs, and APIs may change during this period.
>
> We welcome and appreciate any feedback or contributions. You can create issues here or chat live with us in the #rust-developer-preview channel within the [LiveKit Community Slack](https://livekit.io/join-slack).
Use this SDK to add real-time video, audio and data features to your Unity app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications like interactive live streaming or video calls with just a few lines of code.
## Platform Support
- [x] Windows
- [x] macOS
- [x] Linux
- [ ] iOS
- [ ] Android
- [ ] WebGL
We plan to support all Unity platforms with this SDK. WebGL is currently supported with [client-sdk-unity-web](https://github.com/livekit/client-sdk-unity-web).
## Installation
Follow this [Unity tutorial](https://docs.unity3d.com/Manual/upm-ui-giturl.html), using `https://github.com/livekit/client-sdk-unity.git` as the Git URL.
You can then import the samples directly from the Package Manager.
This repo uses [Git LFS](https://git-lfs.com/); make sure it is installed before cloning the repo.
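Alternatively, the same dependency can be declared by hand in `Packages/manifest.json`. Note the package name key below is an assumption for illustration; use whatever name the package's own `package.json` declares:

```json
{
  "dependencies": {
    "io.livekit.livekit-sdk": "https://github.com/livekit/client-sdk-unity.git"
  }
}
```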
## Examples
### Connect to a room
```cs
using LiveKit;
IEnumerator Start()
{
var room = new Room();
room.TrackSubscribed += TrackSubscribed;
var connect = room.Connect("ws://localhost:7880", "");
yield return connect;
if (!connect.IsError)
{
Debug.Log("Connected to " + room.Name);
}
}
```
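The snippet above passes an empty token for brevity; in practice the token is a JWT minted by your own server. A minimal sketch of wiring the URL and token through the Inspector, using only the `Room` API shown above (the class and field names here are illustrative, not part of the SDK):

```cs
using System.Collections;
using LiveKit;
using UnityEngine;

public class RoomConnector : MonoBehaviour
{
    [SerializeField] private string url = "ws://localhost:7880";
    [SerializeField] private string token = ""; // JWT minted by your server

    private Room _room;

    IEnumerator Start()
    {
        _room = new Room();
        var connect = _room.Connect(url, token);
        yield return connect;
        Debug.Log(connect.IsError ? "Connection failed" : "Connected to " + _room.Name);
    }
}
```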
### Publishing microphone
```cs
// Publish Microphone
var source = GetComponent<AudioSource>();
source.clip = Microphone.Start("MacBook Pro Microphone", true, 2, 48000);
source.loop = true;
source.Play();
var rtcSource = new RtcAudioSource(source);
var track = LocalAudioTrack.CreateAudioTrack("my-track", rtcSource);
var options = new TrackPublishOptions();
options.Source = TrackSource.SourceMicrophone;
var publish = room.LocalParticipant.PublishTrack(track, options);
yield return publish;
if (!publish.IsError)
{
Debug.Log("Track published!");
}
```
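A note on device selection: the example above hard-codes a macOS device name. In standard Unity, passing `null` to `Microphone.Start` selects the system default device, and `Microphone.devices` lists what is available:

```cs
// List the available capture devices (standard Unity API).
foreach (var device in Microphone.devices)
    Debug.Log("Mic available: " + device);

// Passing null uses the system default microphone.
var clip = Microphone.Start(null, true, 2, 48000);
```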
### Publishing a texture (e.g. a Unity Camera)
```cs
var rt = new UnityEngine.RenderTexture(1920, 1080, 24, RenderTextureFormat.ARGB32);
rt.Create();
Camera.main.targetTexture = rt;
var source = new TextureVideoSource(rt);
var track = LocalVideoTrack.CreateVideoTrack("my-track", source);
var options = new TrackPublishOptions();
options.VideoCodec = VideoCodec.H264;
options.Source = TrackSource.SourceCamera;
var publish = room.LocalParticipant.PublishTrack(track, options);
yield return publish;
if (!publish.IsError)
{
Debug.Log("Track published!");
}
```
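When publishing stops, the `RenderTexture` should be detached from the camera and released so its GPU memory is freed. These are standard Unity calls; tearing down the LiveKit track itself is not shown here and may require additional SDK calls:

```cs
// Detach the render target and free the texture (standard Unity API).
Camera.main.targetTexture = null;
rt.Release();
UnityEngine.Object.Destroy(rt);
```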
### Receiving tracks
```cs
IEnumerator Start()
{
var room = new Room();
room.TrackSubscribed += TrackSubscribed;
var connect = room.Connect("ws://localhost:7880", "");
yield return connect;
// ....
}
void TrackSubscribed(IRemoteTrack track, RemoteTrackPublication publication, RemoteParticipant participant)
{
if (track is RemoteVideoTrack videoTrack)
{
var rawImage = GetComponent<RawImage>();
var stream = new VideoStream(videoTrack);
stream.TextureReceived += (tex) =>
{
rawImage.texture = tex;
};
StartCoroutine(stream.Update());
// The video data is displayed on the rawImage
}
else if (track is RemoteAudioTrack audioTrack)
{
var source = GetComponent<AudioSource>();
var stream = new AudioStream(audioTrack, source);
// Audio is being played on the source ..
}
}
```