Timing Statistics for UIImage Face Detection Approaches on iOS

Preface

  • This article benchmarks the performance and latency of face detection on static images across the approaches available on iOS. The test results come first; the implementation code for each approach is given at the end.

CoreImage approach:

Test data
test_1080x1920.JPG

Detection parameter settings

[screenshot omitted]

Performance cost
CIDetectorAccuracyLow (low accuracy):

CIDetectorAccuracyHigh (high accuracy):

[screenshot omitted]

Test data
test_3024x3024.JPG

Detection parameter settings

[screenshot omitted]

Performance cost

CIDetectorAccuracyLow (low accuracy):

[screenshot omitted]

CIDetectorAccuracyHigh (high accuracy):

[screenshot omitted]

OpenCV approach:

Test data
test_1080x1920.JPG

Detection parameter settings
Notes:

    1. scaleFactor is the scale step between successive image scales (default 1.1). It determines how large the jump is between two scan-window sizes: a larger value speeds up the computation, but if the window skips past the size of an actual face, that face can be missed.
    2. minNeighbors is the number of neighboring detections each candidate rectangle must accumulate to be kept (default 3). It controls false positives: the default of 3 means a face is accepted only when at least 3 overlapping detections agree.
    3. cvSize() specifies the minimum region in which to search for a face. Setting it too large reduces computation at the cost of missing small faces.

scaleFactor: 1.1
minNeighbors: 3
cvSize(): Size(30, 30)

[screenshot omitted]

Performance cost
Face detector (Haar_1):

[screenshot omitted]

Face detector (fast Haar):

[screenshot omitted]

Detection parameter settings
scaleFactor: 1.1
minNeighbors: 1
cvSize(): Size(100, 100)

[screenshot omitted]

Performance cost
Face detector (Haar_1):

[screenshot omitted]

Face detector (fast Haar):

[screenshot omitted]

Vision approach:

System requirement: iOS 11 or later

Test data
test_1080x1920.JPG

Detection parameter settings

[screenshot omitted]

Performance cost

[screenshot omitted]

Test data
test_3024x3024.JPG

Detection parameter settings

[screenshot omitted]

Performance cost

[screenshot omitted]

AVFoundation approach:

Supports real-time face detection during video capture preview.
Face detection is implemented via AVCaptureMetadataOutput, a specific subclass of AVCaptureOutput. It is hardware-accelerated and can detect up to 10 faces simultaneously in real time.

  • PS:
    1. For every approach the detector is initialized ahead of time; the timings cover only the span from taking in the UIImage to producing the detection result.

Overview:

CoreImage framework:

Supports face detection on static images.
CPU usage during processing is relatively low: about 50+% at high accuracy, about 100% at low accuracy.
At low accuracy, detection takes only tens of milliseconds, but sideways faces fail to be detected (a sideways photo can first be straightened using its orientation metadata).
At high accuracy, detection takes on the order of 100+ ms, and sideways faces are detected successfully.

Possible extensions: filters and facial-feature detection (detecting characteristics of the processed image, e.g. the eyes, mouth, etc. of a face in the picture).

OpenCV framework:

Supports face detection on static images.
Cross-platform.
CPU usage during processing is very high: 400+% to 500+%.
Latency depends heavily on the parameter settings and the required accuracy,
but the detection parameters are highly customizable.
Sideways face detection failed (possibly an orientation/pose issue).

Vision framework:

Supported from iOS 11 onward.
Accepts multiple image types:
CIImage, NSURL, NSData, CGImageRef, CVPixelBufferRef (the format of frames extracted from video).
CPU usage during processing is relatively low: 50+%.
Latency is about the same as CoreImage's high-accuracy detection, on the order of 100+ ms.
Sideways face detection succeeds.

Possible extensions:
Facial landmarks, face detection, QR-code recognition, text detection and recognition, object tracking (faces, rectangles, and generic templates), rectangle detection, etc.

  • Considering CPU usage during detection and detection speed, for face-based filtering of the photo album we recommend the Vision approach on iOS 11 and later, and CoreImage high-accuracy detection on systems below iOS 11. For lack of a sufficiently large face dataset, detection accuracy could not yet be measured.

Appendix

  • KRFaceDetectTool.h // face detection utility class
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>

NS_ASSUME_NONNULL_BEGIN

typedef NS_ENUM(NSInteger, KRFaceDetectResolutionType) {
    KRFaceDetectResolutionTypeCoreImage,     // static-image face detection
    KRFaceDetectResolutionTypeOpenCV,
    KRFaceDetectResolutionTypeVision,
    KRFaceDetectResolutionTypeAVFoundation   // live (real-time) face detection
};

@interface KRFaceDetectTool : NSObject

- (instancetype)init NS_UNAVAILABLE;
+ (instancetype)new NS_UNAVAILABLE;

- (instancetype)initWithType:(KRFaceDetectResolutionType)type;

- (void)testForDetectFace;

- (BOOL)detectFaceWithImage:(UIImage*)image;

@end
NS_ASSUME_NONNULL_END
  • KRFaceDetectTool.mm

#import "KRFaceDetectTool.h"

#import <CoreImage/CoreImage.h>
#import "KRCVFaceDetectTool.h"
#import <Vision/Vision.h>
#import "KRGCDExtension.h"

@interface KRFaceDetectTool ()
@property (nonatomic, assign) KRFaceDetectResolutionType type;

//CoreImage
@property (nonatomic, strong) CIDetector *ciDetector;
@property (nonatomic, strong) KRCVFaceDetectTool *cvDetector;
@property (nonatomic, strong) VNImageBasedRequest *visionFaceDetectRequest;

@end

@implementation KRFaceDetectTool

- (instancetype)init {
    NSAssert(NO, @"Please use the designated initializer!");
    return nil;
}

+ (instancetype)new {
    NSAssert(NO, @"Please use the designated initializer!");
    return nil;
}

- (instancetype)initWithType:(KRFaceDetectResolutionType)type
{
    self = [super init];
    if (self) {
        _type = type;
        [self prepareToDetectWithType:type];
    }
    return self;
}

- (void)dealloc {
    if (self.ciDetector) {
        self.ciDetector = nil;
    }
    if (self.cvDetector) {
        self.cvDetector = nil;
    }
}

- (void)prepareToDetectWithType:(KRFaceDetectResolutionType)type {
    switch (type) {
        case KRFaceDetectResolutionTypeCoreImage:
        {
            if (!self.ciDetector) {
                NSDictionary *options = @{CIDetectorAccuracy:CIDetectorAccuracyLow};
                self.ciDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:options];
            }
        }
            break;
        case KRFaceDetectResolutionTypeVision:
        {
            void (^completionHandler)(VNRequest *, NSError * _Nullable) = ^(VNRequest *request, NSError * _Nullable error) {
                if (request.results.count > 0) {
                    NSLog(@"KRFaceDetectTool: has face!");
                } else {
                    NSLog(@"KRFaceDetectTool: no face!");
                }
            };
            
            self.visionFaceDetectRequest = [[VNDetectFaceRectanglesRequest alloc] initWithCompletionHandler:completionHandler];
        }
            break;
        case KRFaceDetectResolutionTypeOpenCV:
        {
            if (!self.cvDetector) {
                self.cvDetector = [[KRCVFaceDetectTool alloc] initWithType:KRCVFaceXMLTypeHaarcascadeFrontalfaceAlt];
            }
        }
            break;
        case KRFaceDetectResolutionTypeAVFoundation:
        {
            
        }
            break;
        default:
            break;
    }
}

- (BOOL)detectFaceWithImage:(UIImage*)image{
    switch (self.type) {
        case KRFaceDetectResolutionTypeCoreImage:
        {
            CIImage *ciImage = [[CIImage alloc] initWithCGImage:image.CGImage];
            NSArray *features = [self.ciDetector featuresInImage:ciImage];
            if (features.count) {
                return YES;
            }
            return NO;
        }
            break;
        case KRFaceDetectResolutionTypeVision:
        {
            VNImageRequestHandler *visionRequestHandler = [[VNImageRequestHandler alloc] initWithCGImage:image.CGImage options:@{}];
            // -performRequests:error: is synchronous, so the results are
            // available as soon as it returns.
            [visionRequestHandler performRequests:@[self.visionFaceDetectRequest] error:nil];
            return self.visionFaceDetectRequest.results.count > 0;
        }
            break;
        case KRFaceDetectResolutionTypeOpenCV:
        {
            return [self.cvDetector detectFaceWithImage:image];
        }
            break;
        case KRFaceDetectResolutionTypeAVFoundation:
        {
            
        }
            break;
        default:
            break;
    }
    return NO;
}

- (void)testForDetectFace {
    [self prepareToDetectWithType:KRFaceDetectResolutionTypeCoreImage];
    [self prepareToDetectWithType:KRFaceDetectResolutionTypeOpenCV];
    [self prepareToDetectWithType:KRFaceDetectResolutionTypeVision];
    [self prepareToDetectWithType:KRFaceDetectResolutionTypeAVFoundation];
    
    UIImage *testImage = [UIImage imageNamed:@"test_1080x1920.JPG"];
    UIImage *testImage2 = [UIImage imageNamed:@"test_3024x3024.JPG"];
    //    testImage2 = [testImage2 imageByRotateLeft90];
    
    void (^coreImageHighAccuracyBlock)(void) = ^{
        @autoreleasepool {
            CIImage *ciImage = [[CIImage alloc] initWithCGImage:testImage.CGImage];
            NSArray *features = [self.ciDetector featuresInImage:ciImage];
            if (features.count) {
                NSLog(@"KRFaceDetectTool: has face!");
            } else {
                NSLog(@"KRFaceDetectTool: no face!");
            }
        }
    };
    
    void (^cvDetectBlock)(void) = ^{
        @autoreleasepool {
            BOOL result = [self.cvDetector detectFaceWithImage:testImage];
            NSLog(@"KRFaceDetectTool: %@",result ? @"has face!": @"no face!");
        }
    };
    
    
    void (^visionDetectBlock)(void) = ^{
        VNImageRequestHandler *visionRequestHandler = [[VNImageRequestHandler alloc] initWithCGImage:testImage.CGImage options:@{}];
        [visionRequestHandler performRequests:@[self.visionFaceDetectRequest] error:nil];
    };
    
    int64_t result = kr_dispatch_benchmark(100, coreImageHighAccuracyBlock);
    NSLog(@"KRFaceDetectTool:xxx coreImageHighAccuracyBlock cost time:%lld",result);

    int64_t result2 = kr_dispatch_benchmark(100, cvDetectBlock);
    NSLog(@"KRFaceDetectTool:xxx cvDetectBlock cost time:%lld",result2);

    int64_t result3 = kr_dispatch_benchmark(100, visionDetectBlock);
    NSLog(@"KRFaceDetectTool:xxx visionDetectBlock cost time:%lld",result3);

}

@end
  • KRCVFaceDetectTool.h // OpenCV-based face detection utility

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
//OpenCV
#import <opencv2/opencv.hpp>
#import <opencv2/objdetect/objdetect.hpp>
#import <opencv2/imgcodecs/ios.h>   // UIImageToMat

NS_ASSUME_NONNULL_BEGIN

typedef NS_ENUM(NSInteger, KRCVFaceXMLType) {
    KRCVFaceXMLTypeHaarcascadeFrontalfaceAlt,  // matches the corresponding OpenCV cascade XML file
    KRCVFaceXMLTypeHaarcascadeFrontalfaceAlt2
};

@interface KRCVFaceDetectTool : NSObject
{
    cv::CascadeClassifier faceDetector;
}

- (instancetype)init NS_UNAVAILABLE;
+ (instancetype)new NS_UNAVAILABLE;

- (instancetype)initWithType:(KRCVFaceXMLType)type;

- (BOOL)detectFaceWithImage:(UIImage*)image;

@end
NS_ASSUME_NONNULL_END

  • KRCVFaceDetectTool.mm
#import "KRCVFaceDetectTool.h"

@implementation KRCVFaceDetectTool

- (instancetype)initWithType:(KRCVFaceXMLType)type
{
    self = [super init];
    if (self) {
        [self prepareForDetectInOpenCV:type];
    }
    return self;
}

- (void)dealloc {
    
}

- (void)prepareForDetectInOpenCV:(KRCVFaceXMLType)type {
    switch (type) {
        case KRCVFaceXMLTypeHaarcascadeFrontalfaceAlt:
        {
            NSString* cascadePath = [[NSBundle mainBundle]
                                     pathForResource:@"haarcascade_frontalface_alt"
                                     ofType:@"xml"];
            faceDetector.load([cascadePath UTF8String]);
        }
            break;
        case KRCVFaceXMLTypeHaarcascadeFrontalfaceAlt2:
        {
            NSString* cascadePath = [[NSBundle mainBundle]
                                     pathForResource:@"haarcascade_frontalface_alt2"
                                     ofType:@"xml"];
            faceDetector.load([cascadePath UTF8String]);
        }
            break;
        default:
            break;
    }
}

- (BOOL)detectFaceWithImage:(UIImage*)image {
    cv::Mat cvImage;
    UIImageToMat(image, cvImage);
    if (!cvImage.empty()) {
        // convert to grayscale (UIImageToMat yields a 4-channel RGBA Mat)
        cv::Mat gray;
        cvtColor(cvImage, gray, CV_RGBA2GRAY);
        
        // face detection
        std::vector<cv::Rect> faces;
        faceDetector.detectMultiScale(gray,
                                      faces,
                                      1.1,
                                      1,
                                      0|CV_HAAR_SCALE_IMAGE,
                                      cv::Size(100,100)
                                      );
        
        if (faces.size() > 0) {
            return YES;
        }
        return NO;
    }
    return NO;
}

@end

  • benchmark helper

#import <Foundation/Foundation.h>

NS_ASSUME_NONNULL_BEGIN

#ifdef __cplusplus
extern "C" {
#endif
int64_t kr_dispatch_benchmark(size_t count, void (^block)(void));
#ifdef __cplusplus
}
#endif

NS_ASSUME_NONNULL_END

#import "KRGCDExtension.h"

// dispatch_benchmark is a private libdispatch API: it runs the block `count`
// times and returns the average execution time in nanoseconds. It has to be
// declared manually because no public header exposes it.
extern uint64_t dispatch_benchmark(size_t count, void (^block)(void));

int64_t kr_dispatch_benchmark(size_t count, void (^block)(void)) {
    return dispatch_benchmark(count, block);
}
