Original article link
Unity Render Streaming official documentation
Unity: 2021.3.8f1c1
RenderStreaming: 3.1.0-exp.4 (pre-release)
RenderStreaming WebServer: 3.1.0-exp.3
This guide uses Render Streaming package 3.1.0-exp.4. There is no stable release yet, but exp.4 exposes settings that exp.3 does not (video codec, frame rate, bitrate, scale factor, and so on), has been stable in use, and makes it easy to tune the stream to your project's needs.
The WebServer uses 3.1.0-exp.3, because the exp.4 server has a broken check on the page-access path.
Git repository: UnityRenderStreaming
Choose the server binary for your platform; you need to download and install Node.js first.
Run node -v and npm -v to verify the installation.
Windows startup is as follows (see the official documentation for other platforms):
Locate webserver.exe, type cmd into the Explorer address bar, and press Enter.
Run .\webserver.exe -w to start the server in WebSocket mode.
Open a browser and enter this machine's IP; the image below shows the page after a WebSocket start.
After unpacking the source, copy the com.unity.renderstreaming folder; I placed it in a new folder under the sample project's Packages directory (put it wherever you prefer).
Open the Package Manager, click +, and choose Add package from disk.
Locate the package.json in the folder you just copied and open it.
After opening it looks like this:
Next choose Add package by name, enter com.unity.webrtc, and click Add.
After installation it looks like this:
Create an empty GameObject named RenderStreaming and add the RenderStreaming, Broadcast, and VideoStreamSender components (a code sketch of the same setup follows below).
Set the parameters as follows:
Add an AudioStreamSender component to the object holding the AudioListener; this component must sit on an object that has one.
Remember to register the AudioStreamSender on the Broadcast component.
Select ReceiverSample.
The result looks like this; adjust the VideoStreamSender parameters to suit your needs.
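If you prefer to do the same setup in code, here is a minimal sketch. The component types are the real ones from com.unity.renderstreaming, but this is illustrative only; registering the sender with the Broadcast handler is still left to the Inspector, as in the steps above.

```csharp
using Unity.RenderStreaming;
using UnityEngine;

public class StreamingBootstrap : MonoBehaviour
{
    private void Awake()
    {
        // Create the host object and attach the streaming components,
        // mirroring the manual steps above.
        var go = new GameObject("RenderStreaming");
        go.AddComponent<RenderStreaming>();
        go.AddComponent<Broadcast>();
        go.AddComponent<VideoStreamSender>();
    }
}
```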
Install the Input System package.
The settings below target Unity 2021.2 and newer; for older versions see the linked documentation.
(The HDRP sample already uses the Input System by default.)
When the dialog below appears, choose Yes to enable the new Input System.
Alternatively, set New or Both under Project Settings/Player/Other Settings/Active Input Handling.
Enable Run In Background (2021.2):
Project Settings/Player/Resolution and Presentation/Run In Background
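The same flag can also be forced at runtime, for example from a bootstrap script; a one-line sketch (this mirrors the Player setting above, it is not part of Render Streaming itself):

```csharp
using UnityEngine;

public class RunInBackground : MonoBehaviour
{
    private void Awake()
    {
        // Keep rendering and streaming while the player window is unfocused.
        Application.runInBackground = true;
    }
}
```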
Change the Input System package settings (2021.2); see the official documentation for the exact values (it has you adjust the package's background behavior so input keeps being processed while the app is unfocused).
Add an InputReceiver component and register it on the Broadcast component.
Create an InputActions asset (Create/Input Actions) and open it.
Configure the actions your project needs.
Drag the configured InputActions asset into InputReceiver/Actions, then expand Events to see one event per configured action and bind them as needed; a handler sketch follows below.
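The bound methods receive an InputAction.CallbackContext, in the same style as PlayerInput events. A minimal sketch, assuming a Vector2-valued action named "Move" (the action name and value type are placeholders for whatever you configured):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class MoveHandler : MonoBehaviour
{
    // Bind this method to the "Move" entry under InputReceiver/Events
    // in the Inspector.
    public void OnMove(InputAction.CallbackContext context)
    {
        Vector2 value = context.ReadValue<Vector2>();
        Debug.Log($"Move: {value}");
    }
}
```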
Set the EventSystem to use InputSystemUIInputModule.
The display resolution must match the VideoStreamSender's StreamingSize, otherwise the recognized pointer positions will be offset.
For most WebRTC applications, a server is required to relay traffic between peers, because a direct socket between clients is usually impossible (unless they are on the same local network). The common solution is a TURN server; the name stands for Traversal Using Relays around NAT, a protocol for relaying network traffic.
See the TURN server setup section in the Unity documentation.
The ports used by the TURN server must be publicly reachable; minimum and maximum values can be configured:
| Protocol | Ports |
| --- | --- |
| TCP | 32355-65535, 3478-3479 |
| UDP | 32355-65535, 3478-3479 |
On the web side, change config.iceServers in the config.js file:
```js
config.iceServers = [{
  urls: ['stun:stun.l.google.com:19302']
}, {
  urls: ['turn:xx.xx.xx.xx:3478?transport=tcp'],
  username: 'username',
  credential: 'password'
}];
```
In Unity, edit the Ice Server list on the RenderStreaming component to match.
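If you wire the configuration from code instead of the Inspector, the same entries can be expressed with Unity.WebRTC's RTCIceServer struct (a sketch; the TURN address and credentials are placeholders, as above):

```csharp
using Unity.WebRTC;

public static class IceServerConfig
{
    public static RTCIceServer[] CreateIceServers()
    {
        return new[]
        {
            // Public STUN server for address discovery.
            new RTCIceServer { urls = new[] { "stun:stun.l.google.com:19302" } },
            new RTCIceServer
            {
                // Relay via your own TURN server over TCP.
                urls = new[] { "turn:xx.xx.xx.xx:3478?transport=tcp" },
                username = "username",
                credential = "password"
            }
        };
    }
}
```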
Here is how to build your own WebServer; see the official documentation for more detail.
1. Download the exp.3 source.
2. Unpack it and locate WebApp.
3. Type cmd into the address bar.
4. Run npm install (or npm i) to install the dependencies.
If installation is slow or fails with a timeout, first run npm config set registry https://registry.npm.taobao.org to switch npm to a mirror inside China.
If you hit the error below, run npm config set legacy-peer-deps true, then run npm install (or npm i) again.
5. Build the server: npm run build.
6. Start the server: npm run start -- -w.
If startup fails as shown below, start with npm run dev -- -w instead.
After starting it looks like this. I am not yet sure what causes the failure above; if you know the reason, please let me know, thanks.
7. Create a startup shortcut. For quick launches later, you can create a file like the one shown below.
8. Package it with npm run pack; once the progress completes it produces an exe.
Start it the same way as above: .\webserver.exe -w.
Messages are sent with the Send function of WebRTC's RTCDataChannel and received with OnMessage.
Create a script like the following in the Unity project, and send through the Channel object of the InputReceiver:
```csharp
using Unity.RenderStreaming;
using UnityEngine;

public class RenderStreamingManager : MonoBehaviour
{
    private InputReceiver inputReceiver;

    private void Awake()
    {
        inputReceiver = transform.GetComponent<InputReceiver>();
    }

    // Send a string to the web page over the input data channel.
    public void SendMsg(string msg)
    {
        inputReceiver.Channel.Send(msg);
    }

    private void Update()
    {
        // Test hook: press P to send a message.
        if (Input.GetKeyDown(KeyCode.P))
        {
            SendMsg("Send Msg to web");
        }
    }
}
```
Remember to add the script to an object in your project.
Receiving on the web side:
Find the inputSenderChannel object in the receiver.js file and receive messages with onmessage.
The received message looks like the following; the string arrives in data.
On the web side, send messages with the send function:

```js
this.inputSenderChannel.send("msg to unity");
```
In Unity, OnMessage in Receiver.cs receives the messages sent from the web through the RTCDataChannel.
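For reference, a minimal sketch of such a handler, assuming you have the Unity.WebRTC RTCDataChannel instance in hand (where exactly to hook it depends on your copy of Receiver.cs):

```csharp
using System.Text;
using Unity.WebRTC;
using UnityEngine;

public static class DataChannelHooks
{
    public static void HookMessages(RTCDataChannel channel)
    {
        // RTCDataChannel delivers raw bytes; decode them back into a string.
        channel.OnMessage = bytes =>
        {
            Debug.Log($"Message from web: {Encoding.UTF8.GetString(bytes)}");
        };
    }
}
```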
Within the server bandwidth you have, raise the transmission quality as far as possible; below are the settings from my project.
Adjust Depth Buffer, StreamingSize, Framerate, Bitrate, and ScaleResolution to match your bandwidth.
You can also grab the VideoStreamSender component and change these parameters at runtime, which is handy for testing after a build:
```csharp
VideoStreamSender videoStreamSender = transform.GetComponent<VideoStreamSender>();
// Output resolution, frame rate, downscale factor, and bitrate range
// come from the project's own config object here.
videoStreamSender.SetTextureSize(new Vector2Int(gameConfig.RSResolutionWidth, gameConfig.RSResolutionHeight));
videoStreamSender.SetFrameRate(gameConfig.RSFrameRate);
videoStreamSender.SetScaleResolutionDown(gameConfig.RSScale);
videoStreamSender.SetBitrate((uint)gameConfig.RSMinBitRate, (uint)gameConfig.RSMaxBitRate);
```
Because resolution changes currently cause Unity buttons to miss clicks quite often, the web team builds the corresponding buttons and triggers the operations by sending messages from the web to Unity.
My approach is to give every button an id in a config table; the web sends the button id, and Unity receives it, looks the id up in the table, and Invokes the bound function:
```csharp
private void OnClickButton(int elementId)
{
    // Look up UIName, FunctionName, and parameters by id in the config table.
    List<UIMatchFuncModel> uiMatchFuncModels = UniversalConfig.Instance.GetUIMatchModels();
    foreach (var item in uiMatchFuncModels.Where(item => item.id == elementId))
    {
        ExecuteFunction(item);
        break;
    }
}

private void ExecuteFunction(UIMatchFuncModel matchFuncModel)
{
    // Open the panel if it is not already open, then invoke the configured method.
    UIPanel uiPanel = UIKit.GetPanel(matchFuncModel.uiName);
    if (uiPanel == null)
    {
        uiPanel = UIKit.OpenPanel(matchFuncModel.uiName);
    }
    uiPanel.Invoke(matchFuncModel.functionName, 0);
}
```
To match different device resolutions, the web side reads the current device resolution and, after connecting, sends the device width and height to Unity; Unity then resets its output resolution so the picture fills the web device's screen.
```csharp
public void ChangeScreenSize(int width, int height)
{
    videoStreamSender.SetTextureSize(new Vector2Int(width, height));
}
```
Because the picture is too small in portrait, phones usually display landscape content while held in portrait. Therefore:
1. When setting the resolution as described above, note whether the web device is in portrait or landscape;
2. If the project has swipe interactions, invert them according to the orientation; see the sketch after this list.
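A minimal sketch of such an inversion, assuming landscape content rotated 90° clockwise on a portrait device; the exact mapping is an assumption and depends on how your content is rotated:

```csharp
using UnityEngine;

public static class SwipeCorrection
{
    // Remap a swipe delta received from a portrait web device onto
    // landscape content rotated 90 degrees clockwise.
    public static Vector2 Correct(Vector2 delta, bool webIsPortrait)
    {
        return webIsPortrait ? new Vector2(delta.y, -delta.x) : delta;
    }
}
```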
My project had the connection drop every half hour to an hour, so I added a heartbeat check that reconnects.
As shown below, add an OnHeartBeatHandler to ISignaling:
```csharp
using Unity.WebRTC;

namespace Unity.RenderStreaming.Signaling
{
    public delegate void OnStartHandler(ISignaling signaling);
    public delegate void OnConnectHandler(ISignaling signaling, string connectionId, bool polite);
    public delegate void OnDisconnectHandler(ISignaling signaling, string connectionId);
    public delegate void OnOfferHandler(ISignaling signaling, DescData e);
    public delegate void OnAnswerHandler(ISignaling signaling, DescData e);
    public delegate void OnIceCandidateHandler(ISignaling signaling, CandidateData e);
    // add
    public delegate void OnHeartBeatHandler(ISignaling signaling);

    public interface ISignaling
    {
        void Start();
        void Stop();

        event OnStartHandler OnStart;
        event OnConnectHandler OnCreateConnection;
        event OnDisconnectHandler OnDestroyConnection;
        event OnOfferHandler OnOffer;
        event OnAnswerHandler OnAnswer;
        event OnIceCandidateHandler OnIceCandidate;
        // add
        event OnHeartBeatHandler OnHeartBeat;

        string Url { get; }
        float Interval { get; }

        void OpenConnection(string connectionId);
        void CloseConnection(string connectionId);
        void SendOffer(string connectionId, RTCSessionDescription offer);
        void SendAnswer(string connectionId, RTCSessionDescription answer);
        void SendCandidate(string connectionId, RTCIceCandidate candidate);
        // add
        void SendHeartBeat();
    }
}
```
Fix the resulting compile errors: every class implementing ISignaling must now provide the new members. In WebSocketSignaling, fill in the SendHeartBeat function as follows:
```csharp
public void SendHeartBeat()
{
    // Send a heartbeat message over the signaling WebSocket.
    this.WSSend("{\"type\":\"heart\"}");
}
```
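The other ISignaling implementations (e.g. HttpSignaling) must also declare the new event and method to compile; if you only use WebSocket signaling, empty stubs are enough. A sketch:

```csharp
// In HttpSignaling (and any other ISignaling implementation):
#pragma warning disable 0067 // the event is never raised here
public event OnHeartBeatHandler OnHeartBeat;
#pragma warning restore 0067

public void SendHeartBeat()
{
    // Heartbeat is only supported over WebSocket signaling.
}
```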
Create a RenderStreamingManager script with the following code:
```csharp
using System.Collections;
using System.Threading;
using Unity.RenderStreaming;
using Unity.RenderStreaming.Signaling;
using UnityEngine;

public class RenderStreamingManager : MonoBehaviour
{
    private RenderStreaming _renderStreaming;
    private VideoStreamSender _videoStreamSender;
    private ISignaling _signaling;
    private InputReceiver _inputReceiver;

    // heartbeat check
    private Coroutine _heartBeatCoroutine;
    private bool _isReceiveHeart;
    private float _heartBeatInterval;

    private void Awake()
    {
        // websocket (placeholder address; use your own signaling server)
        _signaling = new WebSocketSignaling("ws://192.0.0.0", 5, SynchronizationContext.Current);
        if (_signaling != null)
        {
            _renderStreaming = transform.GetComponent<RenderStreaming>();
            _videoStreamSender = transform.GetComponent<VideoStreamSender>();
            _inputReceiver = transform.GetComponent<InputReceiver>();
            _renderStreaming.Run(_signaling);

            // heart beat
            _signaling.OnStart += OnWebSocketStart;
            _signaling.OnHeartBeat += OnHeartBeat;
        }
    }

    #region HeartBeat

    private void OnWebSocketStart(ISignaling signaling)
    {
        StartHeartCheck();
    }

    private void OnHeartBeat(ISignaling signaling)
    {
        _isReceiveHeart = true;
    }

    private void StartHeartCheck()
    {
        // Send a heartbeat, then wait to see whether the server echoes it.
        _isReceiveHeart = false;
        _signaling.SendHeartBeat();
        _heartBeatCoroutine = StartCoroutine(HeartBeatCheck());
    }

    private IEnumerator HeartBeatCheck()
    {
        yield return new WaitForSeconds(5);
        if (!_isReceiveHeart)
        {
            // No reply arrived: reconnect.
            Reconnect();
        }
        else
        {
            StartHeartCheck();
        }
    }

    private void Reconnect()
    {
        StartCoroutine(ReconnectDelay());
    }

    private IEnumerator ReconnectDelay()
    {
        _signaling.Stop();
        yield return new WaitForSeconds(1);
        _signaling.Start();
    }

    #endregion
}
```
Remember to turn off Run On Awake on the RenderStreaming component, since this script calls Run itself.
In the WebServer, find websocket.ts and websockethandler.ts and add the content shown below.
Rebuild the server and start it again; Unity and the server can then exchange heartbeat messages over the WebSocket.
Bind the channel events in RenderStreamingManager:
```csharp
private void Awake()
{
    // connect & disconnect
    _inputReceiver.OnStartedChannel += OnStartChannel;
    _inputReceiver.OnStoppedChannel += OnStopChannel;
}

private void OnStartChannel(string connectionid)
{
}

private void OnStopChannel(string connectionid)
{
}
```
Create an object as shown and add the RenderStreaming, SingleConnection, VideoStreamReceiver, and InputSender components.
AudioStreamReceiver handles sound; add it only if you need it.
Create a Canvas with a RawImage as the carrier for the received texture.
Configure the audio parts as in the sample; ignore them if you don't need sound.
Awake fetches the components and binds the events.
Calling StartConnect opens the WebSocket connection; once it succeeds, the WebRTC connection starts, and the OnUpdateReceiveTexture event bound on VideoStreamReceiver then updates the picture in the RawImage.
OnStartedChannel sends the current device's resolution; the message structure is up to you. I created a ScreenMessage carrying width and height and send it to the client.
The script is as follows:
```csharp
using System;
using System.Threading;
using Unity.RenderStreaming;
using Unity.RenderStreaming.Signaling;
using UnityEngine;
using UnityEngine.UI;

public class RenderStreamingReceiverManager : MonoSingleton<RenderStreamingReceiverManager>
{
    private RenderStreaming _renderStreaming;
    private VideoStreamReceiver _videoStreamReceiver;
    private AudioStreamReceiver _audioStreamReceiver;
    private InputSender _inputSender;
    private SingleConnection _connection;

    [SerializeField] private RawImage remoteVideoImage;
    [SerializeField] private AudioSource remoteAudioSource;

    private ISignaling _signaling;
    private string _connectionId;

    private void Awake()
    {
        // Get components
        _renderStreaming = transform.GetComponent<RenderStreaming>();
        _videoStreamReceiver = transform.GetComponent<VideoStreamReceiver>();
        _inputSender = transform.GetComponent<InputSender>();
        _connection = transform.GetComponent<SingleConnection>();

        // Connect / disconnect events
        _inputSender.OnStartedChannel += OnStartedChannel;
        _inputSender.OnStoppedChannel += OnStopChannel;
        _videoStreamReceiver.OnUpdateReceiveTexture += OnUpdateReceiveTexture;

        _audioStreamReceiver = transform.GetComponent<AudioStreamReceiver>();
        if (_audioStreamReceiver)
        {
            _audioStreamReceiver.OnUpdateReceiveAudioSource += source =>
            {
                source.loop = true;
                source.Play();
            };
        }
    }

    // Called externally to start
    public void StartConnect()
    {
        // Placeholder address; use your own signaling server.
        _signaling = new WebSocketSignaling("ws://172.0.0.1:80", 5, SynchronizationContext.Current);
        _signaling.OnStart += OnWebSocketStart;
        _renderStreaming.Run(_signaling);
    }

    // Called externally to stop
    public void StopConnect()
    {
        StopRenderStreaming();
        _signaling.OnStart -= OnWebSocketStart;
        _renderStreaming.Stop();
    }

    private void OnWebSocketStart(ISignaling signaling)
    {
        StartRenderStreaming();
    }

    private void StartRenderStreaming()
    {
        if (string.IsNullOrEmpty(_connectionId))
        {
            _connectionId = Guid.NewGuid().ToString("N");
        }
        if (_audioStreamReceiver)
        {
            _audioStreamReceiver.targetAudioSource = remoteAudioSource;
        }
        _connection.CreateConnection(_connectionId);
    }

    private void StopRenderStreaming()
    {
        _connection.DeleteConnection(_connectionId);
        _connectionId = String.Empty;
    }

    void OnUpdateReceiveTexture(Texture texture)
    {
        remoteVideoImage.texture = texture;
        SetInputChange();
    }

    void OnStartedChannel(string connectionId)
    {
        Debug.Log("Connected: " + connectionId);
        // Send this device's resolution to the client
        SendScreenToClient();
        SetInputChange();
    }

    private void OnStopChannel(string connectionId)
    {
        Debug.Log("Disconnected: " + connectionId);
    }

    void SetInputChange()
    {
        if (!_inputSender.IsConnected || remoteVideoImage.texture == null)
            return;

        // Correct pointer position: map the RawImage's screen rect
        // to the size of the received texture.
        Vector3[] corners = new Vector3[4];
        remoteVideoImage.rectTransform.GetWorldCorners(corners);
        Camera camera = remoteVideoImage.canvas.worldCamera;
        var corner0 = RectTransformUtility.WorldToScreenPoint(camera, corners[0]);
        var corner2 = RectTransformUtility.WorldToScreenPoint(camera, corners[2]);
        var region = new Rect(
            corner0.x,
            corner0.y,
            corner2.x - corner0.x,
            corner2.y - corner0.y
        );
        var size = new Vector2Int(remoteVideoImage.texture.width, remoteVideoImage.texture.height);
        _inputSender.SetInputRange(region, size);
        _inputSender.EnableInputPositionCorrection(true);
    }

    private void SendScreenToClient()
    {
        ScreenMessage screenMessage = new ScreenMessage(MessageType.Screen, Screen.width, Screen.height);
        string msg = JsonUtility.ToJson(screenMessage);
        SendMsg(msg);
    }

    private void SendMsg(string msg)
    {
        _inputSender.Channel.Send(msg);
    }
}
```
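Note that MonoSingleton<T> comes from the project's own utility code, not from Render Streaming; if you have no equivalent, deriving from MonoBehaviour works just as well.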
Define the Message structure however you prefer:
```csharp
public enum MessageType
{
    Screen
}

public class RenderStreamingMessage
{
    public MessageType MessageType;
}

public class ScreenMessage : RenderStreamingMessage
{
    public int Width;
    public int Height;

    public ScreenMessage(MessageType messageType, int width, int height)
    {
        MessageType = messageType;
        Width = width;
        Height = height;
    }
}
```
After receiving the message, parse out the width and height and compute a suitable output resolution from the mobile device's resolution.
Use _videoStreamSender.SetTextureSize to change the output resolution.
Use _inputReceiver.SetInputRange and _inputReceiver.SetEnableInputPositionCorrection(true) to correct the input mapping; the web side must change its resolution to match.
```csharp
private void OnMessage(byte[] bytes)
{
    string msg = System.Text.Encoding.Default.GetString(bytes);
    if (string.IsNullOrEmpty(msg) || !msg.Contains("MessageType"))
    {
        return;
    }
    RenderStreamingMessage messageObject = JsonUtility.FromJson<RenderStreamingMessage>(msg);
    if (messageObject == null)
    {
        return;
    }
    switch (messageObject.MessageType)
    {
        case MessageType.Screen:
            ScreenMessage screenMessage = JsonUtility.FromJson<ScreenMessage>(msg);
            ResetOutputScreen(screenMessage);
            break;
        default:
            break;
    }
}

private void ResetOutputScreen(ScreenMessage screenMessage)
{
    int width = screenMessage.Width;
    int height = screenMessage.Height;
    float targetWidth, targetHeight;
    int configWidth = 1920;
    int configHeight = 1080;
    float configRate = (float)configWidth / configHeight;
    float curRate = (float)width / height;
    if (curRate > configRate)
    {
        // Wider than the configured aspect: clamp the width.
        targetWidth = configWidth;
        targetHeight = targetWidth / curRate;
    }
    else
    {
        // Taller than the configured aspect: clamp the height.
        targetHeight = configHeight;
        targetWidth = targetHeight * curRate;
    }
    _videoStreamSender.SetTextureSize(new Vector2Int((int)targetWidth, (int)targetHeight));
    // This is the key step that fixes the pointer offset after a resolution change.
    _inputReceiver.SetInputRange(
        new Vector2Int((int)_videoStreamSender.width, (int)_videoStreamSender.height),
        new Rect(0, 0, Screen.width, Screen.height));
    _inputReceiver.SetEnableInputPositionCorrection(true);
}
```
1. Notes on macOS and iOS release bugs