Audio/video playback is one of the simpler parts of video development: create a few objects, observe a few properties, and you have a working player.
We need to understand the following classes:
- AVPlayer
- AVPlayerLayer
- AVPlayerItem
AVPlayer is the player object. It controls play and pause, and can also seek to a position.
AVPlayerLayer displays the video; an AVPlayer must be attached to an AVPlayerLayer to render on screen.
AVPlayerItem manages a media asset and handles loading it.
Creating the Player
```objectivec
- (void)initializedPlayerWith:(NSURL *)url {
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:url]; // the item that manages the asset
    if (_player.currentItem != nil) {
        [self removeObserverFromPlayerItem:_player.currentItem]; // remove observers from the previous item
        [_player replaceCurrentItemWithPlayerItem:playerItem];   // switch to the new item
        [self pause];
    } else {
        _player = [AVPlayer playerWithPlayerItem:playerItem];
    }
    [_playerLayer removeFromSuperlayer];
    _playerLayer = nil;
    if (_containerView != nil) {
        _playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
        _playerLayer.videoGravity = _videoGravity; // video fill mode
        _playerLayer.frame = _containerView.bounds;
        [_containerView.layer addSublayer:_playerLayer];
    }
    // observe the player item
    [self addObserverFromPlayerItem:playerItem];
    [self addProgressNotification];
    [self addPlayerFinishNotification];
}
```
When the frame of the AVPlayerLayer's superview changes, remember to update the AVPlayerLayer's frame accordingly.
The rest of the code is straightforward; the observer methods are covered below.
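One common way to keep the layer and its superview in sync is to resize the layer whenever the view lays out. This is a minimal sketch; the container view subclass and its `playerLayer` property are illustrative, not part of the original code:

```objectivec
// In a hypothetical UIView subclass that hosts the AVPlayerLayer.
// `playerLayer` is an assumed property; adapt the name to your own class.
- (void)layoutSubviews {
    [super layoutSubviews];
    self.playerLayer.frame = self.bounds; // keep the video layer matching the view
}
```

With this in place, rotation and other frame changes automatically propagate to the video layer.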
Buffering, Playing, Pausing, and Seeking
Buffering
After the player is created, buffering starts automatically.
```objectivec
// buffer
- (void)buffer {
    if (_playerLayer == nil && self.fileUrl) {
        [self initializedPlayerWith:self.fileUrl];
    }
}
```
Playing
```objectivec
- (void)play {
    [self buffer];
    if (_player && _player.rate == 0) {
        [_player play];
    }
}
```
Pausing
```objectivec
- (void)pause {
    if (_player && _player.rate != 0) {
        [_player pause];
    }
}
```
Seeking
Here value ranges from 0 to 1.
One note: the commented-out seekToTime: call can only seek with roughly one-second precision. For an exact seek, use the tolerance variant below it.
```objectivec
- (void)jumpProgressWith:(CGFloat)value {
    if (_playerLayer == nil) return;
    CMTime time = CMTimeMakeWithSeconds(value * CMTimeGetSeconds(_player.currentItem.duration), _player.currentItem.currentTime.timescale);
    [self jumpProgressWithTime:time];
}

- (void)jumpProgressWithTime:(CMTime)time {
    if (_playerLayer == nil) return;
    // [_player seekToTime:time]; // only seeks with about one-second precision
    [self.player seekToTime:time toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero]; // zero tolerance gives a sub-second, exact seek
}
```
Video Fill Modes
- AVLayerVideoGravityResizeAspect: the default. Preserves the video's aspect ratio until the width or height fills the layer; the superview shows through in the unfilled areas (letterboxing).
- AVLayerVideoGravityResizeAspectFill: preserves the aspect ratio and fills the layer on both axes; part of the video may be cropped out of view.
- AVLayerVideoGravityResize: stretches the video to the superview's size, which may distort the original aspect ratio.
These are set via the AVPlayerLayer.videoGravity property.
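As a sketch of how a player class might expose this, here is an illustrative setter (the `_videoGravity` ivar mirrors the one used in the creation code above; the setter itself is not from the original code):

```objectivec
// Illustrative setter: store the mode and apply it to the live layer.
- (void)setVideoGravity:(NSString *)videoGravity {
    _videoGravity = videoGravity;          // remembered for the next player creation
    _playerLayer.videoGravity = videoGravity; // takes effect immediately on the current layer
}
```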
Observing Player Status
After creating the player, observing properties of the AVPlayerItem tells us the media's total duration and its buffering progress.
```objectivec
// start observing
- (void)addObserverFromPlayerItem:(AVPlayerItem *)playerItem {
    [playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];           // ready-to-play status
    [playerItem addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil]; // buffering progress
}
```
```objectivec
// observer callback
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    AVPlayerItem *playerItem = object;
    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerStatus status = [[change valueForKey:@"new"] integerValue];
        if (status == AVPlayerStatusReadyToPlay) {
            CGFloat totalTime = CMTimeGetSeconds(playerItem.duration);
            NSLog(@"Ready to play; total duration is %.2f s", totalTime);
        }
    } else if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
        NSArray *array = playerItem.loadedTimeRanges;
        // the buffered time range for this pass
        CMTimeRange timeRange = [array.firstObject CMTimeRangeValue];
        CGFloat startSecond = CMTimeGetSeconds(timeRange.start);
        CGFloat durationSecond = CMTimeGetSeconds(timeRange.duration);
        CGFloat totalTime = CMTimeGetSeconds(playerItem.duration);
        // total buffered length measured from the start
        NSTimeInterval totalBuffer = startSecond + durationSecond;
        CGFloat bufferProgress = totalBuffer / totalTime;
        if (_bufferBlock) {
            _bufferBlock(bufferProgress);
        }
    }
}
```
```objectivec
// don't forget to remove the observers
- (void)removeObserverFromPlayerItem:(AVPlayerItem *)playerItem {
    [playerItem removeObserver:self forKeyPath:@"status"];
    [playerItem removeObserver:self forKeyPath:@"loadedTimeRanges"];
}
```
Playback Progress
```objectivec
// observe playback progress
- (void)addProgressNotification {
    AVPlayerItem *playerItem = _player.currentItem;
    if (playerItem == nil) return;
    // _playProgressObserver must be stored in an instance variable:
    // a local variable would always be nil here, so the previous
    // video's time observer would never actually be removed.
    if (_playProgressObserver) {
        [_player removeTimeObserver:_playProgressObserver];
    }
    // report progress once per second
    __weak typeof(self) weakSelf = self;
    /*
     CMTimeMake(value, timescale):
     value is the frame index and timescale is the frame rate (frames per second).
     CMTimeMake(1, 10) means frame 1 at 10 fps; in seconds that is value / timescale = 1/10 = 0.1 s.
     CMTimeMakeWithSeconds takes a float number of seconds as its first argument but is
     otherwise the same; it is usually the more convenient of the two.
     */
    _playProgressObserver = [_player addPeriodicTimeObserverForInterval:CMTimeMake(playerProgressTime, playerProgressTime) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
        // value == timescale above, so the interval is exactly 1 second
        CGFloat currentTime = CMTimeGetSeconds(time);
        CGFloat totalTime = CMTimeGetSeconds(playerItem.duration);
        if (currentTime) {
            CGFloat playProgress = currentTime / totalTime;
            if (weakSelf.progressBlock) {
                weakSelf.progressBlock(playProgress, currentTime, totalTime);
            }
        }
    }];
}
```
It is worth reading up on CMTime first; it will greatly help your understanding of video development.
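As a quick illustration of the value/timescale model, here is a minimal sketch using the CoreMedia calls mentioned above:

```objectivec
#import <CoreMedia/CoreMedia.h>

// CMTimeMake(value, timescale) represents value / timescale seconds.
CMTime tenth     = CMTimeMake(1, 10);                // 1/10 = 0.1 s
CMTime alsoTenth = CMTimeMakeWithSeconds(0.1, 600);  // 0.1 s at the common 600 timescale
// Both convert back to the same number of seconds:
CGFloat a = CMTimeGetSeconds(tenth);     // 0.1
CGFloat b = CMTimeGetSeconds(alsoTenth); // 0.1
```

A timescale of 600 is a common choice because it divides evenly by the usual frame rates (24, 25, 30, 60).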
Playback-Finished Callback
```objectivec
- (void)addPlayerFinishNotification {
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playFinishNotification) name:AVPlayerItemDidPlayToEndTimeNotification object:_player.currentItem];
}
```
We now have a complete audio/video player. Try it with a local or network asset.
GitHub link