iOS screen recording is tied to the system version: each release uses different frameworks and APIs and supports different capabilities, as follows:
1. Before iOS 9
Before iOS 9 the system exposed no public screen-recording API, so the feature required private frameworks. Apple has since disabled those private frameworks entirely, and current frameworks target iOS 9+ anyway, so that era is not covered here.
2. iOS 9
Starting with iOS 9, Apple introduced the ReplayKit framework. Its recording API captures only the app's own screen and gives no access to the video/audio streams; the user can only "Save", "Copy", or "Share" the result from the preview page. Usage:
// Check whether a recording is already in progress
if ([RPScreenRecorder sharedRecorder].isRecording) {
    NSLog(@"Already recording");
    return;
}
// Check whether screen recording is available
if (![[RPScreenRecorder sharedRecorder] isAvailable]) {
    NSLog(@"Screen recording is unavailable");
    return;
}
// Start recording (in-app screen only on iOS 9)
[[RPScreenRecorder sharedRecorder] startRecordingWithMicrophoneEnabled:YES handler:^(NSError * _Nullable error) {
    if (error) {
        NSLog(@"Failed to start recording: %@", error);
    } else {
        NSLog(@"Recording started");
    }
}];
// Stop recording and present the preview controller
[[RPScreenRecorder sharedRecorder] stopRecordingWithHandler:^(RPPreviewViewController * _Nullable previewViewController, NSError * _Nullable error) {
    if (error) {
        NSLog(@"Failed to stop recording: %@", error);
        return;
    }
    previewViewController.previewControllerDelegate = self;
    dispatch_async(dispatch_get_main_queue(), ^{
        [self presentViewController:previewViewController animated:YES completion:nil];
    });
}];
#pragma mark - RPPreviewViewControllerDelegate
- (void)previewController:(RPPreviewViewController *)previewController didFinishWithActivityTypes:(NSSet<NSString *> *)activityTypes {
    if ([activityTypes containsObject:UIActivityTypeSaveToCameraRoll]) {
        // The user saved the recording to the photo library
    }
    [previewController dismissViewControllerAnimated:YES completion:nil];
}
3. iOS 10
Starting with iOS 10, Apple added APIs for recording the system screen, i.e. recording continues even after the app leaves the foreground. Below this is called "system screen recording", as opposed to "in-app screen recording".
- In-app screen recording
The API is similar to iOS 9, except that the microphone flag moves from the startRecordingWithMicrophoneEnabled: parameter to a property:
[RPScreenRecorder sharedRecorder].microphoneEnabled = YES;
[[RPScreenRecorder sharedRecorder] startRecordingWithHandler:^(NSError * _Nullable error) {
    if (error) {
        NSLog(@"Failed to start recording: %@", error);
    } else {
        NSLog(@"Recording started");
    }
}];
Everything else works the same as on iOS 9.
- System screen recording
This feature is also built on ReplayKit, but it is implemented not in the host app itself, rather in two extensions of the host app's project: a Broadcast Setup UI Extension, which draws the setup UI, and a Broadcast Upload Extension, which processes the data.
The overall flow:
The host app brings up the corresponding extension of an app that implements a Broadcast Setup UI Extension, and that extension's UI is presented inside the host app. Calling userDidFinishSetup from that page launches the Broadcast Upload Extension, which starts the screen recording; the video and audio streams are then processed inside that extension.
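For context, the two extension types are distinguished by the NSExtensionPointIdentifier in each extension target's Info.plist. The identifiers below match Xcode's broadcast extension templates; the principal class names are the template defaults and may differ in your project:

```xml
<!-- Broadcast Setup UI Extension: Info.plist -->
<key>NSExtension</key>
<dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.broadcast-services-setupui</string>
    <key>NSExtensionPrincipalClass</key>
    <string>BroadcastSetupViewController</string>
</dict>

<!-- Broadcast Upload Extension: Info.plist -->
<key>NSExtension</key>
<dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.broadcast-services-upload</string>
    <key>NSExtensionPrincipalClass</key>
    <string>SampleHandler</string>
</dict>
```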
Broadcast Setup UI Extension:
// Call this method when the user has finished interacting with the view controller and a broadcast stream can start
- (void)userDidFinishSetup {
    // URL of the resource where the broadcast can be viewed; returned to the application
    NSURL *broadcastURL = [NSURL URLWithString:@"http://apple.com/broadcast/streamID"];
    // Dictionary with setup information that will be provided to the broadcast extension when the broadcast starts
    NSDictionary *setupInfo = @{ @"broadcastName" : @"example" };
    // Tell ReplayKit that the extension has finished setting up and can begin broadcasting
    [self.extensionContext completeRequestWithBroadcastURL:broadcastURL setupInfo:setupInfo];
}

- (void)userDidCancelSetup {
    // Tell ReplayKit that setup was cancelled by the user
    [self.extensionContext cancelRequestWithError:[NSError errorWithDomain:@"YourAppDomain" code:-1 userInfo:@{NSLocalizedDescriptionKey : @"StopBroadcast"}]];
}
Broadcast Upload Extension:
- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    // User has requested to start the broadcast.
    // Setup info from the UI extension may be supplied, but is optional.
    NSLog(@"Broadcast started");
}

- (void)broadcastPaused {
    // User has requested to pause the broadcast. Samples will stop being delivered.
    NSLog(@"Broadcast paused");
}

- (void)broadcastResumed {
    // User has requested to resume the broadcast. Sample delivery will resume.
    NSLog(@"Broadcast resumed");
}

- (void)broadcastFinished {
    // User has requested to finish the broadcast.
    NSLog(@"Broadcast finished");
}
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Handle the video sample buffer (carries the YUV frame data)
            NSLog(@"Video sample");
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle the audio sample buffer for app audio
            NSLog(@"App audio sample");
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle the audio sample buffer for microphone audio
            NSLog(@"Mic audio sample");
            break;
        default:
            break;
    }
}
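As a sketch of what "getting the YUV data" in the video branch above can look like: the frame lives in a CVPixelBuffer attached to the sample buffer. The helper below is hypothetical (not part of the ReplayKit template) and assumes the buffer uses a biplanar YUV (NV12-style) pixel format, which is what ReplayKit typically delivers but is not a documented guarantee:

```objectivec
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Hypothetical helper: inspect one ReplayKit video sample.
static void InspectVideoSampleBuffer(CMSampleBufferRef sampleBuffer) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }
    // Lock before touching pixel memory; read-only is enough here.
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // For an NV12-style format there are two planes:
    // plane 0 is Y (luma), plane 1 is interleaved CbCr (chroma).
    size_t planeCount = CVPixelBufferGetPlaneCount(pixelBuffer);
    void *lumaBase = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t lumaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    NSLog(@"frame %zux%zu, %zu planes, luma at %p, stride %zu",
          width, height, planeCount, lumaBase, lumaBytesPerRow);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```

Such a helper would be called from the RPSampleBufferTypeVideo branch of processSampleBuffer:withType: before handing the data to an encoder or uploader.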
Bringing up the recording from another app (as opposed to the host app that contains the extensions):
[RPBroadcastActivityViewController loadBroadcastActivityViewControllerWithHandler:^(RPBroadcastActivityViewController * _Nullable broadcastActivityViewController, NSError * _Nullable error) {
    if (error) {
        NSLog(@"RPBroadcast err %@", [error localizedDescription]);
    } else {
        dispatch_async(dispatch_get_main_queue(), ^{
            broadcastActivityViewController.delegate = self;
            broadcastActivityViewController.modalPresentationStyle = UIModalPresentationPopover;
            [self presentViewController:broadcastActivityViewController animated:YES completion:nil];
        });
    }
}];
// Called after the user picks an extension; this is where the broadcast is actually started
- (void)broadcastActivityViewController:(RPBroadcastActivityViewController *)broadcastActivityViewController
       didFinishWithBroadcastController:(RPBroadcastController *)broadcastController
                                  error:(NSError *)error {
    dispatch_async(dispatch_get_main_queue(), ^{
        [broadcastActivityViewController dismissViewControllerAnimated:YES completion:nil];
    });
    if (error) {
        NSLog(@"BAC: %@ didFinishWBC: %@, err: %@",
              broadcastActivityViewController,
              broadcastController,
              error);
        return;
    }
    NSLog(@"BundleID %@", broadcastController.broadcastExtensionBundleID);
    self.broadcastController = broadcastController;
    [broadcastController startBroadcastWithHandler:^(NSError * _Nullable error) {
        if (!error) {
            NSLog(@"startBroadcast succeeded");
        } else {
            NSLog(@"startBroadcast %@", error.localizedDescription);
        }
    }];
}
// Receives service info dictionaries sent back by the broadcast service
- (void)broadcastController:(RPBroadcastController *)broadcastController
       didUpdateServiceInfo:(NSDictionary<NSString *, NSObject<NSCoding> *> *)serviceInfo {
    NSLog(@"didUpdateServiceInfo: %@", serviceInfo);
}

// Called when the broadcast finishes
- (void)broadcastController:(RPBroadcastController *)broadcastController
         didFinishWithError:(NSError *)error {
    NSLog(@"didFinishWithError: %@", error);
}
4. iOS 11
iOS 11 opened up sample-stream processing for in-app recording: the video and audio buffers can be handled directly instead of only being previewed, saved, or shared.
- In-app screen recording
- (NSString *)videoPath {
    if (!_videoPath) {
        NSArray *pathDocuments = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *outputURL = pathDocuments[0];
        uint32_t random = arc4random() % 1000;
        _videoPath = [[outputURL stringByAppendingPathComponent:[NSString stringWithFormat:@"%u", random]] stringByAppendingPathExtension:@"mp4"];
        self.compressVideoPath = [[outputURL stringByAppendingPathComponent:[NSString stringWithFormat:@"%u_compress", random]] stringByAppendingPathExtension:@"mp4"];
        NSLog(@"%@", _videoPath);
        NSLog(@"%@", self.compressVideoPath);
    }
    return _videoPath;
}
- (AVAssetWriter *)assetWriter {
    if (!_assetWriter) {
        NSError *error = nil;
        _assetWriter = [AVAssetWriter assetWriterWithURL:[NSURL fileURLWithPath:self.videoPath]
                                                fileType:AVFileTypeMPEG4
                                                   error:&error];
        if (error) {
            NSLog(@"Failed to create AVAssetWriter: %@", error);
        }
        if ([_assetWriter canAddInput:self.assetWriterInput]) {
            [_assetWriter addInput:self.assetWriterInput];
        }
    }
    return _assetWriter;
}
- (AVAssetWriterInput *)assetWriterInput {
    if (!_assetWriterInput) {
        NSDictionary *compressionProperties = @{
            AVVideoAverageBitRateKey : @(2000 * 1000)
        };
        NSDictionary *videoSettings = @{
            AVVideoCompressionPropertiesKey : compressionProperties,
            AVVideoCodecKey : AVVideoCodecH264,
            AVVideoWidthKey : @(self.view.frame.size.width),
            AVVideoHeightKey : @(self.view.frame.size.height)
        };
        _assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
        _assetWriterInput.expectsMediaDataInRealTime = YES;
    }
    return _assetWriterInput;
}
// Start capturing; sample buffers are delivered to this handler
[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
    if (self.assetWriter.status == AVAssetWriterStatusUnknown && bufferType == RPSampleBufferTypeVideo) {
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        // Drop frames with an invalid (negative) timestamp before starting the session
        if (CMTimeGetSeconds(pts) < 0) {
            NSLog(@"Dropping frame with negative timestamp");
            return;
        }
        [self.assetWriter startWriting];
        [self.assetWriter startSessionAtSourceTime:pts];
    }
    if (self.assetWriter.status == AVAssetWriterStatusFailed) {
        NSLog(@"An error occurred: %@", self.assetWriter.error);
        [self stopRecord:nil];
        return;
    }
    if (bufferType == RPSampleBufferTypeVideo) {
        if (self.assetWriterInput.isReadyForMoreMediaData) {
            // Append the sample buffer to the video input
            [self.assetWriterInput appendSampleBuffer:sampleBuffer];
        } else {
            NSLog(@"Video input not ready for more data");
        }
    }
} completionHandler:^(NSError * _Nullable error) {
    if (error) {
        NSLog(@"Failed to start capture: %@", error);
    } else {
        NSLog(@"Capture started");
    }
}];
// Stop capturing
[[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
    if (error) {
        NSLog(@"stopCaptureWithHandler: %@", error);
    }
    // Finish writing the file
    __weak typeof(self) weakSelf = self;
    [self.assetWriter finishWritingWithCompletionHandler:^{
        __strong typeof(weakSelf) strongSelf = weakSelf;
        NSLog(@"Screen recording finished, video saved at: %@", strongSelf.videoPath);
        strongSelf.assetWriter = nil;
        strongSelf.assetWriterInput = nil;
    }];
}];
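The compressVideoPath set up in the videoPath getter is never written to in the snippets above. One way to fill it in, sketched here under the assumption that a further-compressed copy is wanted after recording ends, is to transcode the finished file with AVAssetExportSession from inside finishWritingWithCompletionHandler:; the preset choice is illustrative, not prescribed by the original code:

```objectivec
#import <AVFoundation/AVFoundation.h>

// Hypothetical follow-up step: re-encode the recording at self.videoPath
// into a smaller file at self.compressVideoPath.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:self.videoPath] options:nil];
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetMediumQuality];
exporter.outputURL = [NSURL fileURLWithPath:self.compressVideoPath];
exporter.outputFileType = AVFileTypeMPEG4;
exporter.shouldOptimizeForNetworkUse = YES;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Compressed video written to %@", self.compressVideoPath);
    } else {
        NSLog(@"Compression failed: %@", exporter.error);
    }
}];
```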
- System screen recording
Same as iOS 10, except the API that brings up the broadcast picker gained a preferredExtension parameter (the extension's bundle ID); when it is set, that extension is preselected in the list of recording extensions:
[RPBroadcastActivityViewController loadBroadcastActivityViewControllerWithPreferredExtension:@"com.asiainfo.LivePlay.UIExtension" handler:^(RPBroadcastActivityViewController * _Nullable broadcastActivityViewController, NSError * _Nullable error) {
    if (error) {
        NSLog(@"RPBroadcast err %@", [error localizedDescription]);
    } else {
        dispatch_async(dispatch_get_main_queue(), ^{
            broadcastActivityViewController.delegate = self;
            broadcastActivityViewController.modalPresentationStyle = UIModalPresentationPopover;
            [self presentViewController:broadcastActivityViewController animated:YES completion:nil];
        });
    }
}];
5. iOS 12
- In-app screen recording
Same as iOS 11.
- System screen recording
The recording can now be started directly from the app that implements the Broadcast Setup UI Extension and Broadcast Upload Extension, via RPSystemBroadcastPickerView, without having to be brought up from another app:
if (@available(iOS 12.0, *)) {
    if (!self.broadPickerView) {
        self.broadPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(-100, -100, 100, 100)];
        self.broadPickerView.preferredExtension = @"com.asiainfo.LivePlay.UploadExtension";
        self.broadPickerView.showsMicrophoneButton = YES;
        [self.view addSubview:self.broadPickerView];
    }
    // Programmatically tap the picker's internal button to show the system
    // broadcast sheet (fragile: relies on the picker's private view hierarchy)
    for (UIView *subView in self.broadPickerView.subviews) {
        if ([subView isKindOfClass:[UIButton class]]) {
            [(UIButton *)subView sendActionsForControlEvents:UIControlEventTouchDown];
            break;
        }
    }
}