Getting the iPhone Camera Device

I have also written a newer version of this post; if you could not work out how to use the original, read this new one. If it is still unclear, there is not much more I can do.

Getting the iPhone Camera Device (separated, simplified version)







Goal: open and close the front camera, draw its image on screen, and obtain the camera's raw binary data.

Required frameworks

AVFoundation.framework, CoreVideo.framework, CoreMedia.framework, QuartzCore.framework

Camera capture only builds for a real device; it will not compile for the simulator.
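If you still want the project itself to build for the simulator, one option (a sketch using the standard TargetConditionals macros, similar in spirit to the PRODUCER_HAS_VIDEO_CAPTURE guard in the simplified version below) is to compile the capture code out:

#import <TargetConditionals.h>

#if TARGET_IPHONE_SIMULATOR
// Building for the simulator: no camera available, so stub the capture out
#else
// Building for a device: the AVFoundation capture classes can be used
#endif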

Method overview



- (void)createControl

{

// Create the UI controls

}



- (AVCaptureDevice *)getFrontCamera;

Returns the front-facing camera device.

- (void)startVideoCapture;

Opens the camera and starts capturing frames.

The key code inside it:

AVCaptureVideoPreviewLayer* previewLayer = [AVCaptureVideoPreviewLayer layerWithSession: self->avCaptureSession];

previewLayer.frame = localView.bounds;

previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

[self->localView.layer addSublayer: previewLayer]; 

This is what draws the camera image into the UIView.



- (void)stopVideoCapture:(id)arg;



Closes the camera and stops capturing frames.

The key code inside it:



for (CALayer *layer in [[self->localView.layer.sublayers copy] autorelease]) {
    [layer removeFromSuperlayer];
}





This removes the camera preview from localView. Note that the preview was added as a sublayer of localView.layer, so removing subviews alone would not clear it.

See the full code below; you can copy it over and use it directly.





Code:

Header file:



//

//  AVCallController.h

//  Pxlinstall

//

//  Created by Lin Charlie C. on 11-3-24.

//  Copyright 2011  xxxx. All rights reserved.

//





#import <UIKit/UIKit.h>

#import <AVFoundation/AVFoundation.h>









@interface AVCallController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>

{

// UI
UILabel *labelState;
UIButton *btnStartVideo;
UIView *localView;

AVCaptureSession *avCaptureSession;
AVCaptureDevice *avCaptureDevice;
BOOL firstFrame;    // whether this is the first frame
int producerFps;





}

@property (nonatomic, retain) AVCaptureSession *avCaptureSession;

@property (nonatomic, retain) UILabel *labelState;





- (void)createControl;

- (AVCaptureDevice *)getFrontCamera;

- (void)startVideoCapture;

- (void)stopVideoCapture:(id)arg;

@end

/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

Implementation file:

//

//  AVCallController.m

//  Pxlinstall

//

//  Created by Lin Charlie C. on 11-3-24.

//  Copyright 2011  高鸿移通. All rights reserved.

//





#import "AVCallController.h"









@implementation AVCallController





@synthesize avCaptureSession;

@synthesize labelState;





// The designated initializer.  Override if you create the controller programmatically and want to perform customization that is not appropriate for viewDidLoad.

/*

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil {

    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];

    if (self) {

        // Custom initialization.

    }

    return self;

}

*/

- (id)init
{
    if (self = [super init])
    {
        firstFrame = YES;
        producerFps = 50;
    }
    return self;
}





// Implement loadView to create a view hierarchy programmatically, without using a nib.

- (void)loadView {
    [super loadView];
    [self createControl];
}









/*

// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.

- (void)viewDidLoad {

    [super viewDidLoad];

}

*/





/*

// Override to allow orientations other than the default portrait orientation.

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {

    // Return YES for supported orientations.

    return (interfaceOrientation == UIInterfaceOrientationPortrait);

}

*/





- (void)didReceiveMemoryWarning {
    // Releases the view if it doesn't have a superview.
    [super didReceiveMemoryWarning];

    // Release any cached data, images, etc. that aren't in use.
}





- (void)viewDidUnload {
    [super viewDidUnload];
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;
}









- (void)dealloc {
    // Make sure the session is stopped and released before we go away
    [self stopVideoCapture:nil];
    [super dealloc];
}





#pragma mark -

#pragma mark createControl

- (void)createControl
{
    // Build the UI
    self.view.backgroundColor = [UIColor grayColor];
    labelState = [[UILabel alloc] initWithFrame:CGRectMake(10, 20, 220, 30)];
    labelState.backgroundColor = [UIColor clearColor];
    [self.view addSubview:labelState];
    [labelState release];

    btnStartVideo = [[UIButton alloc] initWithFrame:CGRectMake(20, 350, 80, 50)];
    [btnStartVideo setTitle:@"Start" forState:UIControlStateNormal];
    [btnStartVideo setBackgroundImage:[UIImage imageNamed:@"Images/button.png"] forState:UIControlStateNormal];
    [btnStartVideo addTarget:self action:@selector(startVideoCapture) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:btnStartVideo];
    [btnStartVideo release];

    UIButton *stop = [[UIButton alloc] initWithFrame:CGRectMake(120, 350, 80, 50)];
    [stop setTitle:@"Stop" forState:UIControlStateNormal];
    [stop setBackgroundImage:[UIImage imageNamed:@"Images/button.png"] forState:UIControlStateNormal];
    [stop addTarget:self action:@selector(stopVideoCapture:) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:stop];
    [stop release];

    localView = [[UIView alloc] initWithFrame:CGRectMake(40, 50, 200, 300)];
    [self.view addSubview:localView];
    [localView release];
}

#pragma mark -

#pragma mark VideoCapture

- (AVCaptureDevice *)getFrontCamera
{
    // Return the front-facing camera, falling back to the default video device
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in cameras)
    {
        if (device.position == AVCaptureDevicePositionFront)
            return device;
    }
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
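// Added sketch: a symmetric helper for the back camera would look the same
// (the simplified CameraHelp class below exposes setBackCamera for this purpose):
- (AVCaptureDevice *)getBackCamera
{
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo])
    {
        if (device.position == AVCaptureDevicePositionBack)
            return device;
    }
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}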

- (void)startVideoCapture
{
    // Open the capture device and start capturing frames
    [labelState setText:@"Starting Video stream"];
    if (self->avCaptureDevice || self->avCaptureSession)
    {
        [labelState setText:@"Already capturing"];
        return;
    }

    if ((self->avCaptureDevice = [self getFrontCamera]) == nil)
    {
        [labelState setText:@"Failed to get valid capture device"];
        return;
    }

    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:self->avCaptureDevice error:&error];
    if (!videoInput)
    {
        [labelState setText:@"Failed to get video input"];
        self->avCaptureDevice = nil;
        return;
    }

    self->avCaptureSession = [[AVCaptureSession alloc] init];
    self->avCaptureSession.sessionPreset = AVCaptureSessionPresetLow;
    [self->avCaptureSession addInput:videoInput];

    // Currently, the only supported key is kCVPixelBufferPixelFormatTypeKey. Recommended pixel format choices are
    // kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange or kCVPixelFormatType_32BGRA.
    // On iPhone 3G, the recommended pixel format choices are kCVPixelFormatType_422YpCbCr8 or kCVPixelFormatType_32BGRA.
    AVCaptureVideoDataOutput *avCaptureVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    NSDictionary *settings = [[NSDictionary alloc] initWithObjectsAndKeys:
                              //[NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
                              [NSNumber numberWithInt:240], (id)kCVPixelBufferWidthKey,
                              [NSNumber numberWithInt:320], (id)kCVPixelBufferHeightKey,
                              nil];
    avCaptureVideoDataOutput.videoSettings = settings;
    [settings release];
    avCaptureVideoDataOutput.minFrameDuration = CMTimeMake(1, self->producerFps);
    /* We create a serial queue to handle the processing of our frames */
    dispatch_queue_t queue = dispatch_queue_create("org.doubango.idoubs", NULL);
    [avCaptureVideoDataOutput setSampleBufferDelegate:self queue:queue];
    [self->avCaptureSession addOutput:avCaptureVideoDataOutput];
    [avCaptureVideoDataOutput release];
    dispatch_release(queue);

    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self->avCaptureSession];
    previewLayer.frame = localView.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

    [self->localView.layer addSublayer:previewLayer];

    self->firstFrame = YES;
    [self->avCaptureSession startRunning];

    [labelState setText:@"Video capture started"];
}
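// Added sketch: AVCaptureSession also offers canAddInput: / canAddOutput:,
// so a more defensive version of the setup above would guard those calls, e.g.:
//
//     if ([self->avCaptureSession canAddInput:videoInput])
//         [self->avCaptureSession addInput:videoInput];
//     if ([self->avCaptureSession canAddOutput:avCaptureVideoDataOutput])
//         [self->avCaptureSession addOutput:avCaptureVideoDataOutput];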

- (void)stopVideoCapture:(id)arg
{
    // Stop the camera capture
    if (self->avCaptureSession) {
        [self->avCaptureSession stopRunning];
        [self->avCaptureSession release];   // balances the alloc in startVideoCapture
        self->avCaptureSession = nil;
        [labelState setText:@"Video capture stopped"];
    }
    self->avCaptureDevice = nil;
    // Remove the preview from localView (it was added as a sublayer,
    // so removing subviews would not clear it)
    for (CALayer *layer in [[self->localView.layer.sublayers copy] autorelease]) {
        [layer removeFromSuperlayer];
    }
}

#pragma mark -

#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Captured frame output; what you do with the data is up to you
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    /* Lock the buffer */
    if (CVPixelBufferLockBaseAddress(pixelBuffer, 0) == kCVReturnSuccess)
    {
        // bufferPtr/bufferSize give you the raw binary frame data
        UInt8 *bufferPtr = (UInt8 *)CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bufferSize = CVPixelBufferGetDataSize(pixelBuffer);

        if (self->firstFrame)
        {
            // On the first frame, record the width, height, and pixel format
            size_t width = CVPixelBufferGetWidth(pixelBuffer);
            size_t height = CVPixelBufferGetHeight(pixelBuffer);

            OSType pixelFormat = CVPixelBufferGetPixelFormatType(pixelBuffer);
            switch (pixelFormat) {
                case kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange:
                    //TMEDIA_PRODUCER(producer)->video.chroma = tmedia_nv12; // iPhone 3GS or 4
                    NSLog(@"Capture pixel format=NV12");
                    break;
                case kCVPixelFormatType_422YpCbCr8:
                    //TMEDIA_PRODUCER(producer)->video.chroma = tmedia_uyvy422; // iPhone 3
                    NSLog(@"Capture pixel format=UYUY422");
                    break;
                default:
                    //TMEDIA_PRODUCER(producer)->video.chroma = tmedia_rgb32;
                    NSLog(@"Capture pixel format=RGB32");
                    break;
            }

            self->firstFrame = NO;
        }
        /* Unlock the buffer */
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }

/*We create an autorelease pool because as we are not in the main_queue our code is

 not executed in the main thread. So we have to create an autorelease pool for the thread we are in*/

// NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];

// 

//    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

//    /*Lock the image buffer*/

//    CVPixelBufferLockBaseAddress(imageBuffer,0); 

//    /*Get information about the image*/

//    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 

//    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 

//    size_t width = CVPixelBufferGetWidth(imageBuffer); 

//    size_t height = CVPixelBufferGetHeight(imageBuffer);  

//    

//    /*Create a CGImageRef from the CVImageBufferRef*/

//    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

//    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

//    CGImageRef newImage = CGBitmapContextCreateImage(newContext); 

// 

//    /*We release some components*/

//    CGContextRelease(newContext); 

//    CGColorSpaceRelease(colorSpace);

//    

//    /*We display the result on the custom layer. All the display stuff must be done in the main thread because

//  UIKit is no thread safe, and as we are not in the main thread (remember we didn't use the main_queue)

//  we use performSelectorOnMainThread to call our CALayer and tell it to display the CGImage.*/

// [self.customLayer performSelectorOnMainThread:@selector(setContents:) withObject: (id) newImage waitUntilDone:YES];

// 

// /*We display the result on the image view (We need to change the orientation of the image so that the video is displayed correctly).

//  Same thing as for the CALayer we are not in the main thread so ...*/

// UIImage *image= [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationRight];

// 

// /*We release the CGImageRef*/

// CGImageRelease(newImage);

// 

// [self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];

// 

// /*We unlock the  image buffer*/

// CVPixelBufferUnlockBaseAddress(imageBuffer,0);

// 

// [pool drain];

}

@end
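The delegate above only logs the pixel format. To actually hand the raw bytes out (the stated goal of this post), you have to account for row padding, since bytesPerRow can be larger than width times bytes-per-pixel. A minimal sketch, assuming a packed single-plane format such as kCVPixelFormatType_32BGRA (the helper name copyFrameBytes is my own, not from the original code):

- (NSData *)copyFrameBytes:(CVPixelBufferRef)pixelBuffer
{
    NSMutableData *data = nil;
    if (CVPixelBufferLockBaseAddress(pixelBuffer, 0) == kCVReturnSuccess)
    {
        UInt8 *base = (UInt8 *)CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        size_t width = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);
        size_t packedRow = width * 4;   // 4 bytes per pixel for 32BGRA

        data = [NSMutableData dataWithCapacity:packedRow * height];
        // bytesPerRow may exceed width * 4, so copy row by row
        for (size_t y = 0; y < height; ++y) {
            [data appendBytes:(base + y * bytesPerRow) length:packedRow];
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }
    return data;
}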

Simplified version:

This time the video capture is completely split out into its own class. It should be clear how to use each piece.

Here is the code, with a brief explanation.

Required frameworks

AVFoundation.framework, CoreVideo.framework, CoreMedia.framework

Camera capture must be built for a real device against SDK 4.0 or later; it will not compile for the simulator.

File overview:

CameraHelp.h/.m is the main file, i.e. the camera capture itself

VideoController.h/.m is a usage example



//

//  CameraHelp.h

//  

//

//  Created by Zhuang Chuan Xian. on 11-6-28.

//  Copyright 2011  . All rights reserved.

//

#import <UIKit/UIKit.h>

#import <Foundation/Foundation.h>

#import <AVFoundation/AVFoundation.h>





// Only compile the capture code when targeting a real device on iOS 4.0+
#undef PRODUCER_HAS_VIDEO_CAPTURE
#define PRODUCER_HAS_VIDEO_CAPTURE (__IPHONE_OS_VERSION_MIN_REQUIRED >= 40000 && TARGET_OS_EMBEDDED)

@protocol CameraHelpDelegate
// Called for each captured frame with the raw pixel data
- (void)VideoDataOutputBuffer:(char *)videoBuffer dataSize:(int)size;
@end





@interface CameraHelp : NSObject

#if PRODUCER_HAS_VIDEO_CAPTURE

<AVCaptureVideoDataOutputSampleBufferDelegate>

#endif 

{

@private

int mWidth;

int mHeight;

int mFps;

BOOL mFrontCamera;

BOOL mFirstFrame;

BOOL mStarted;

UIView* mPreview;

id<CameraHelpDelegate> outDelegate;

#if PRODUCER_HAS_VIDEO_CAPTURE

AVCaptureSession* mCaptureSession;

AVCaptureDevice *mCaptureDevice;

#endif

}

// Singleton accessors
+ (CameraHelp *)shareCameraHelp;
+ (void)closeCamera;
// Select the front camera
- (BOOL)setFrontCamera;
// Select the back camera
- (BOOL)setBackCamera;
// Configure the capture parameters before starting (a usage sketch follows this interface)
- (void)prepareVideoCapture:(int)width andHeight:(int)height andFps:(int)fps andFrontCamera:(BOOL)bfront andPreview:(UIView *)view;
// Start capturing
- (void)startVideoCapture;
// Stop capturing
- (void)stopVideoCapture;
// Set the view the preview is displayed in
- (void)setPreview:(UIView *)preview;
// Set the raw-data output delegate
- (void)setVideoDataOutputBuffer:(id<CameraHelpDelegate>)delegate;

@end
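A minimal usage sketch for the configuration call (the concrete values and the someView variable are illustrative assumptions, not from the post):

// Ask for 320x240 at 15 fps from the front camera, previewed in someView
[[CameraHelp shareCameraHelp] prepareVideoCapture:320 andHeight:240 andFps:15 andFrontCamera:YES andPreview:someView];
[[CameraHelp shareCameraHelp] startVideoCapture];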





Download the sample project for the implementation; it compiles and runs as-is. If any method is unclear, refer back to the first part of this post.
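CameraHelp.m is not reproduced here, but as a rough sketch the singleton accessors declared above might look like the following under manual reference counting (the download is authoritative; this is only a guess consistent with the header):

static CameraHelp *gCameraHelp = nil;   // the shared instance

+ (CameraHelp *)shareCameraHelp
{
    if (gCameraHelp == nil) {
        gCameraHelp = [[CameraHelp alloc] init];
    }
    return gCameraHelp;
}

+ (void)closeCamera
{
    if (gCameraHelp) {
        [gCameraHelp stopVideoCapture];
        [gCameraHelp release];
        gCameraHelp = nil;
    }
}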





Using it takes just a few calls, shown below:

//

//  VideoController.m

//  

//

//  Created by zcx. on 11-6-28.

//  Copyright 2011  . All rights reserved.

//





#import "VideoController.h"

#import "CameraHelp.h"





@implementation VideoController

@synthesize videoView;

// The designated initializer.  Override if you create the controller programmatically and want to perform customization that is not appropriate for viewDidLoad.





- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil {

    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];

    if (self) {

// Custom initialization.

self.modalTransitionStyle = UIModalTransitionStyleFlipHorizontal;

self.modalPresentationStyle = UIModalPresentationFullScreen;

    }

return self;

}













// Implement viewDidLoad to do additional setup after loading the view, typically from a nib.

- (void)viewDidLoad {

    [super viewDidLoad];

[self setTitle:@"Video Capture"];

// Capturing takes just the following few calls
// Set the view the preview is rendered into
[[CameraHelp shareCameraHelp] setPreview:videoView];
// Route the captured data to this object
[[CameraHelp shareCameraHelp] setVideoDataOutputBuffer:self];
// Start capturing
[[CameraHelp shareCameraHelp] startVideoCapture];

}

// If you want the captured data, remember to implement this delegate method; otherwise it will crash, and don't say you weren't warned.

-(void)VideoDataOutputBuffer:(char*)videoBuffer dataSize:(int)size

{

NSLog(@"Recv Data size=%d",size);

}





/*

// Override to allow orientations other than the default portrait orientation.

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {

    // Return YES for supported orientations.

    return (interfaceOrientation == UIInterfaceOrientationPortrait);

}

*/





- (void)didReceiveMemoryWarning {

// Releases the view if it doesn't have a superview.

    [super didReceiveMemoryWarning];



// Release any cached data, images, etc. that aren't in use.

}





- (void)viewDidUnload {

    [super viewDidUnload];

// Release any retained subviews of the main view.

// e.g. self.myOutlet = nil;

}









- (void)dealloc {
    // Stop capturing
    [[CameraHelp shareCameraHelp] stopVideoCapture];
    // Detach the data output
    [[CameraHelp shareCameraHelp] setVideoDataOutputBuffer:nil];
    [super dealloc];
}





- (IBAction) onButtonEndClick: (id)sender

{

[self dismissModalViewControllerAnimated:YES];

}

@end





Finally, when the app shuts down, remember to call

[CameraHelp closeCamera];

otherwise the singleton will leak.
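A natural place for that call is the app delegate; applicationWillTerminate: is a standard UIApplicationDelegate method (the delegate class itself is whatever your project uses):

- (void)applicationWillTerminate:(UIApplication *)application
{
    // Tear down the shared camera instance before the process exits
    [CameraHelp closeCamera];
}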

If you found this useful, please give the thread a bump.
  
