Implementing Real-Time Audio/Video Calls in Flutter with WebRTC
With the rapid development of real-time communication technology, WebRTC has become a mainstream choice for building video calls and live streaming. In Flutter, you can just as easily use WebRTC to implement cross-platform real-time audio and video communication.
### 1. Adding the Dependency
First, open `pubspec.yaml` and add the flutter_webrtc plugin:
```yaml
dependencies:
  flutter_webrtc: ^0.9.48
```
Then run the following command to install it:
```shell
flutter pub get
```
### ⚙️ 2. Configuring Platform Permissions
- **Android** (`android/app/src/main/AndroidManifest.xml`):
```xml
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```
- **iOS** (`ios/Runner/Info.plist`):
```xml
<key>NSCameraUsageDescription</key>
<string>This app needs camera access for video calls</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app needs microphone access for audio calls</string>
```
### 3. WebRTC Usage Example
The following walks through a peer-to-peer video call to show how Flutter calls into WebRTC:
#### 1. Import the Library
```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';
```
#### 2. Create a Media Stream (MediaStream)
```dart
MediaStream? _localStream;

Future<MediaStream> _getUserMedia() async {
  final Map<String, dynamic> mediaConstraints = {
    'audio': true,
    'video': {
      'facingMode': 'user', // front-facing camera
    }
  };
  _localStream = await navigator.mediaDevices.getUserMedia(mediaConstraints);
  return _localStream!;
}
```
#### 3. Initialize the PeerConnection and Exchange SDP (Simplified)
Establishing a peer-to-peer connection requires creating a PeerConnection on each endpoint and exchanging SDP (Session Description Protocol) information. In practice a server is usually needed to relay this exchange; a simplified example follows:
```dart
RTCPeerConnection? _peerConnection;

// Named with an underscore to avoid shadowing the library's
// top-level createPeerConnection() function called below.
Future<void> _createPeerConnection() async {
  final Map<String, dynamic> config = {
    "iceServers": [
      {"urls": "stun:stun.l.google.com:19302"},
    ]
  };
  _peerConnection = await createPeerConnection(config);

  _localStream!.getTracks().forEach((track) {
    _peerConnection!.addTrack(track, _localStream!);
  });

  _peerConnection!.onIceCandidate = (candidate) {
    // Send the candidate to the remote client
  };

  _peerConnection!.onTrack = (RTCTrackEvent event) {
    // The remote stream arrives here
    MediaStream remoteStream = event.streams[0];
  };

  RTCSessionDescription offer = await _peerConnection!.createOffer();
  await _peerConnection!.setLocalDescription(offer);
  // Send offer.sdp and offer.type to the other peer
}
```
After receiving the offer, the remote side responds:
```dart
Future<void> _handleOffer(String remoteSdp) async {
  await _peerConnection!.setRemoteDescription(
      RTCSessionDescription(remoteSdp, 'offer'));
  RTCSessionDescription answer = await _peerConnection!.createAnswer();
  await _peerConnection!.setLocalDescription(answer);
  // Send answer.sdp and answer.type back to the other peer
}
```
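One step the simplified flow above leaves implicit: each peer must also apply the ICE candidates it receives from the other side. A minimal sketch follows; the handler name and the incoming map shape (which assumes the sender serialized its candidate with `RTCIceCandidate.toMap()`) are illustrative:

```dart
// Apply an ICE candidate received over the signaling channel.
// Assumes the map came from RTCIceCandidate.toMap() on the sender.
Future<void> _onRemoteCandidate(Map<String, dynamic> data) async {
  await _peerConnection!.addCandidate(
    RTCIceCandidate(
      data['candidate'],
      data['sdpMid'],
      data['sdpMLineIndex'],
    ),
  );
}
```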
#### 4. Display the Video Streams with RTCVideoView
Place two video containers in your Flutter widget:
```dart
// RTCVideoRenderer.initialize() is async, so the renderer must be
// set up before building the widget (e.g. in initState), not inline.
final localRenderer = RTCVideoRenderer();
await localRenderer.initialize();
localRenderer.srcObject = _localStream;

RTCVideoView(localRenderer, mirror: true);
// Handle the remote stream the same way: set srcObject to the remote stream
```
### ✅ 4. Notes
- In production you need a signaling server (e.g. over WebSocket, or via Firebase) to exchange signaling messages.
- The STUN entry in `iceServers` provides NAT traversal; to get through restrictive firewalls you may also need a TURN server (e.g. coturn).
- For multi-party video calls, consider an SFU or MCU server architecture.
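A TURN entry sits alongside the STUN entry in the `iceServers` configuration. The host, username, and credential below are placeholders standing in for your own coturn deployment, not a real server:

```dart
final Map<String, dynamic> config = {
  "iceServers": [
    {"urls": "stun:stun.l.google.com:19302"},
    {
      // Placeholder values: substitute your own TURN server and credentials
      "urls": "turn:turn.example.com:3478",
      "username": "demo-user",
      "credential": "demo-pass",
    },
  ]
};
```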
### Summary
With the flutter_webrtc package, we can quickly build real-time audio and video communication apps. This article covered the basic concepts and key code needed to use WebRTC from Flutter, so you can get started building an app with audio/video features.
To make a Flutter app communicate with a WebRTC demo running in a browser, the key is handling the signaling exchange correctly and ensuring cross-platform compatibility. The implementation flow is detailed below.
### 1. Setting Up the Signaling Server
For the two platforms to talk to each other, they first need a shared signaling server. The following is a simple signaling server based on Node.js and Socket.IO:
```javascript
const express = require('express');
const http = require('http');
const socketIo = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = socketIo(server);

// Track which room each connected user is in
let users = {};

io.on('connection', (socket) => {
  console.log('User connected:', socket.id);

  // A user joins a room
  socket.on('join', (roomId) => {
    socket.join(roomId);
    users[socket.id] = roomId;
    // Notify the other users in the room
    socket.to(roomId).emit('user-joined', socket.id);
  });

  // Relay offers
  socket.on('offer', (data) => {
    socket.to(data.target).emit('offer', {
      sdp: data.sdp,
      type: data.type,
      from: socket.id
    });
  });

  // Relay answers
  socket.on('answer', (data) => {
    socket.to(data.target).emit('answer', {
      sdp: data.sdp,
      type: data.type,
      from: socket.id
    });
  });

  // Relay ICE candidates
  socket.on('ice-candidate', (data) => {
    socket.to(data.target).emit('ice-candidate', {
      candidate: data.candidate,
      from: socket.id
    });
  });

  // Handle disconnects
  socket.on('disconnect', () => {
    const roomId = users[socket.id];
    if (roomId) {
      socket.to(roomId).emit('user-left', socket.id);
      delete users[socket.id];
    }
  });
});

server.listen(3000, () => {
  console.log('Signaling server running at http://localhost:3000');
});
```
### 2. Flutter-Side Implementation
Building on the earlier code, adjust the Flutter WebRTC implementation to communicate with the browser:
```dart
import 'package:flutter/material.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart';
import 'package:socket_io_client/socket_io_client.dart' as IO;

class WebRTCPage extends StatefulWidget {
  @override
  _WebRTCPageState createState() => _WebRTCPageState();
}

class _WebRTCPageState extends State<WebRTCPage> {
  final RTCVideoRenderer _localRenderer = RTCVideoRenderer();
  final RTCVideoRenderer _remoteRenderer = RTCVideoRenderer();
  MediaStream? _localStream;
  RTCPeerConnection? _peerConnection;
  IO.Socket? _socket;
  String roomId = "test_room";
  String? _remotePeerId; // socket.id of the peer we are talking to

  @override
  void initState() {
    super.initState();
    initRenderers();
    _connectSocket();
  }

  // Initialize the video renderers
  Future<void> initRenderers() async {
    await _localRenderer.initialize();
    await _remoteRenderer.initialize();
  }

  // Connect to the signaling server
  void _connectSocket() {
    _socket = IO.io('http://your-signaling-server:3000', <String, dynamic>{
      'transports': ['websocket'],
      'autoConnect': true,
    });

    _socket!.on('connect', (_) {
      print('Connected to the signaling server');
      _socket!.emit('join', roomId);
      _initWebRTC();
    });

    _socket!.on('user-joined', (id) {
      print('New user joined: $id');
      _remotePeerId = id;
      _createOffer(id);
    });

    _socket!.on('offer', (data) async {
      print('Received offer');
      _remotePeerId = data['from'];
      await _handleOffer(data);
    });

    _socket!.on('answer', (data) async {
      print('Received answer');
      await _handleAnswer(data);
    });

    _socket!.on('ice-candidate', (data) async {
      print('Received ICE candidate');
      await _addIceCandidate(data);
    });
  }

  // Initialize WebRTC
  Future<void> _initWebRTC() async {
    // Get the local media stream
    final Map<String, dynamic> mediaConstraints = {
      'audio': true,
      'video': {
        'facingMode': 'user',
      }
    };
    _localStream = await navigator.mediaDevices.getUserMedia(mediaConstraints);
    setState(() {
      _localRenderer.srcObject = _localStream;
    });

    // PeerConnection configuration
    final Map<String, dynamic> config = {
      "iceServers": [
        {"urls": "stun:stun.l.google.com:19302"},
        // Add a TURN server here to improve connection success rates
      ]
    };

    // RTC constraints: explicitly ask to receive audio and video
    final Map<String, dynamic> offerSdpConstraints = {
      "mandatory": {
        "OfferToReceiveAudio": true,
        "OfferToReceiveVideo": true,
      },
      "optional": [],
    };

    // Create the PeerConnection
    _peerConnection = await createPeerConnection(config, offerSdpConstraints);

    // Add the local media stream
    _localStream!.getTracks().forEach((track) {
      _peerConnection!.addTrack(track, _localStream!);
    });

    // Listen for the remote stream
    _peerConnection!.onTrack = (RTCTrackEvent event) {
      print("Received remote media stream");
      if (event.streams.isNotEmpty) {
        setState(() {
          _remoteRenderer.srcObject = event.streams[0];
        });
      }
    };

    // Listen for local ICE candidates and forward them to the remote peer
    _peerConnection!.onIceCandidate = (RTCIceCandidate candidate) {
      if (_socket != null && candidate.candidate != null && _remotePeerId != null) {
        _socket!.emit('ice-candidate', {
          'target': _remotePeerId, // the remote peer's socket.id
          'candidate': candidate.toMap(),
        });
      }
    };
  }

  // Create an offer
  Future<void> _createOffer(String targetId) async {
    RTCSessionDescription description = await _peerConnection!.createOffer();
    await _peerConnection!.setLocalDescription(description);
    // Send the offer to the other peer
    _socket!.emit('offer', {
      'target': targetId,
      'type': description.type,
      'sdp': description.sdp,
    });
  }

  // Handle an incoming offer
  Future<void> _handleOffer(dynamic data) async {
    // Make sure the PeerConnection exists
    if (_peerConnection == null) {
      await _initWebRTC();
    }
    // Set the remote description
    await _peerConnection!.setRemoteDescription(
      RTCSessionDescription(data['sdp'], data['type']),
    );
    // Create and send the answer
    RTCSessionDescription answer = await _peerConnection!.createAnswer();
    await _peerConnection!.setLocalDescription(answer);
    _socket!.emit('answer', {
      'target': data['from'],
      'type': answer.type,
      'sdp': answer.sdp,
    });
  }

  // Handle an incoming answer
  Future<void> _handleAnswer(dynamic data) async {
    await _peerConnection!.setRemoteDescription(
      RTCSessionDescription(data['sdp'], data['type']),
    );
  }

  // Add a remote ICE candidate
  Future<void> _addIceCandidate(dynamic data) async {
    if (_peerConnection != null) {
      await _peerConnection!.addCandidate(
        RTCIceCandidate(
          data['candidate']['candidate'],
          data['candidate']['sdpMid'],
          data['candidate']['sdpMLineIndex'],
        ),
      );
    }
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text('Flutter WebRTC')),
      body: Column(
        children: [
          Expanded(
            child: Container(
              margin: EdgeInsets.all(8.0),
              decoration: BoxDecoration(color: Colors.black),
              child: RTCVideoView(_localRenderer, mirror: true),
            ),
          ),
          Expanded(
            child: Container(
              margin: EdgeInsets.all(8.0),
              decoration: BoxDecoration(color: Colors.black),
              child: RTCVideoView(_remoteRenderer),
            ),
          ),
        ],
      ),
    );
  }

  @override
  void dispose() {
    _localRenderer.dispose();
    _remoteRenderer.dispose();
    _localStream?.getTracks().forEach((track) => track.stop());
    _peerConnection?.close();
    _socket?.disconnect();
    super.dispose();
  }
}
```
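To try the page out, a minimal app entry point might look like this (assuming the `WebRTCPage` widget defined above):

```dart
import 'package:flutter/material.dart';

void main() {
  // Launch the app directly into the WebRTC demo page
  runApp(MaterialApp(home: WebRTCPage()));
}
```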
### 3. Browser-Side WebRTC Demo
On the browser side, the following JavaScript code communicates with the Flutter app:
```html