iOS: Fixing Face Detection Lag

A while back, a work project needed face detection. I implemented it with AVFoundation and Core Image's CIFaceFeature, and once everything was wired up I found that face detection pushed CPU usage to around 70% on an iPhone 6s, while on devices older than the iPhone 6 it shot past 150% and the app stuttered badly.
Profiling pointed straight at the face-detection delegate callback:

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Lock the pixel buffer so its base address can be read directly.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and copy it out as a CGImage
    // (this assumes the video data output is configured for kCVPixelFormatType_32BGRA).
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress,
                                                    width, height, 8, bytesPerRow, colorSpace,
                                                    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    // Mirror the image so it matches the front-camera preview orientation.
    UIImage *image = [UIImage imageWithCGImage:newImage scale:1 orientation:UIImageOrientationLeftMirrored];
    CGImageRelease(newImage);

    // Kick off face detection on the main thread; this happens for every single frame,
    // which is what drives the CPU usage up.
    [self performSelectorOnMainThread:@selector(detectForFacesInUIImage:)
                           withObject:image
                        waitUntilDone:NO];
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}
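The detectForFacesInUIImage: method itself is not shown in the post. As a rough idea of what it could look like with Core Image's CIDetector (the API behind CIFaceFeature), here is a minimal sketch; the lazily created faceDetector_ ivar, the low-accuracy option, and the NSLog are my assumptions, not code from the post.

// Assumed ivar: CIDetector *faceDetector_;
- (void)detectForFacesInUIImage:(UIImage *)image {
    if (!faceDetector_) {
        // CIDetectorAccuracyLow is noticeably cheaper than CIDetectorAccuracyHigh on old CPUs.
        faceDetector_ = [CIDetector detectorOfType:CIDetectorTypeFace
                                           context:nil
                                           options:@{ CIDetectorAccuracy : CIDetectorAccuracyLow }];
    }
    CIImage *ciImage = [[CIImage alloc] initWithCGImage:image.CGImage];
    NSArray *features = [faceDetector_ featuresInImage:ciImage];
    for (CIFaceFeature *face in features) {
        // face.bounds is in Core Image coordinates (origin at the bottom-left).
        NSLog(@"face found at %@", NSStringFromCGRect(face.bounds));
    }
}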

Video capture normally runs at 30 fps, and pushing that many frames per second through this pipeline is more than the device can handle, especially on older, slower CPUs.
My fix came in two parts:
1. Limit the camera's frame rate

[videoDevice setActiveVideoMinFrameDuration:CMTimeMake(1, 25)];
[videoDevice setActiveVideoMaxFrameDuration:CMTimeMake(1, 25)];

This pins the capture rate at 25 fps. Note that activeVideoMinFrameDuration (the shortest frame duration the device may use) is what caps the maximum frame rate; activeVideoMaxFrameDuration on its own only sets a floor.
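Both properties can only be changed while holding the device's configuration lock, otherwise AVFoundation throws an exception. A minimal sketch of the whole step, assuming videoDevice is the active AVCaptureDevice (the error handling is illustrative):

NSError *error = nil;
if ([videoDevice lockForConfiguration:&error]) {
    // Pinning both the min and max frame duration to 1/25 s locks capture at 25 fps.
    videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, 25);
    videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 25);
    [videoDevice unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for configuration: %@", error);
}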
2. Only process one frame out of every five

fpsCount_++;
if (fpsCount_ % 5 != 1) {
    // Only every 5th frame makes it past this point to the face detector.
    return;
}

Declare a counter variable fpsCount_ up front (I used a member variable), then put this guard at the very top of captureOutput:didOutputSampleBuffer:fromConnection:; see the sketch below for where everything lives.
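A sketch of how the pieces fit together; the class name and protocol conformance here are illustrative, only fpsCount_ comes from the post:

@interface FaceViewController () <AVCaptureVideoDataOutputSampleBufferDelegate> {
    NSUInteger fpsCount_;   // counts delivered frames so most of them can be skipped
}
@end

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    fpsCount_++;
    if (fpsCount_ % 5 != 1) {
        return;   // drop this frame; only every 5th frame is converted and analysed
    }
    // ... CGImage conversion and detectForFacesInUIImage: call from above ...
}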
With those two changes the app finally runs smoothly even on the older devices, and CPU usage during detection sits at only 20-30%.
