GPUImage

As a note: if you run into the error "Unknown class GPUImageView in Interface Builder" or the like when trying to build an interface with Interface Builder, you may need to add -ObjC to your Other Linker Flags in your project's build settings.


GPUImage provides a simplified Objective-C interface. This interface lets you define input sources for images and video, attach filters in a chain, and send the resulting processed image or video to the screen, to a UIImage, or to a movie on disk.

Images or frames of video are uploaded from source objects, which are subclasses of GPUImageOutput. These include GPUImageVideoCamera (for live video from an iOS camera), GPUImageStillCamera (for taking photos with the camera), GPUImagePicture (for still images), and GPUImageMovie (for movies). Source objects upload still image frames to OpenGL ES as textures, then hand those textures off to the next objects in the processing chain.
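
As a quick reference, here is how each of those source types is created; the initializers below are the same ones used in the examples later in this document (sampleURL is assumed to be an NSURL pointing at a local movie file):

GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
GPUImageStillCamera *stillCamera = [[GPUImageStillCamera alloc] init];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"Lambeau.jpg"]];
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];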

Filters and other subsequent elements in the chain conform to the GPUImageInput protocol, which lets them take in the supplied or processed texture from the previous link in the chain and do something with it. Objects one step further down the chain are considered targets, and processing can be branched by adding multiple targets to a single output or filter.
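
For instance, to branch one camera feed into a filtered preview and an unfiltered preview, you add two targets to the same output (a minimal sketch: videoCamera is created as in the live-video example below, while sepiaFilter, rawPreviewView, and filteredPreviewView are hypothetical names for a GPUImageSepiaFilter and two GPUImageViews you have already set up):

[videoCamera addTarget:rawPreviewView];      // unfiltered branch, straight to the screen
[videoCamera addTarget:sepiaFilter];         // filtered branch
[sepiaFilter addTarget:filteredPreviewView];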

For example, an application that takes in live video from the camera, converts that video to a sepia tone, then displays the video onscreen would set up a chain looking something like the following:

GPUImageVideoCamera -> GPUImageSepiaFilter -> GPUImageView
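
In code, that chain comes out to just a few lines (a minimal sketch; as with the other examples here, the GPUImageView must also be added to your view hierarchy to be visible):

GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
GPUImageView *filteredView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, 480.0, 640.0)];

[videoCamera addTarget:sepiaFilter];
[sepiaFilter addTarget:filteredView];
[videoCamera startCameraCapture];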

GPUImage needs a few other frameworks to be linked into your application, so you'll need to add the following as linked libraries in your application target:

CoreMedia
CoreVideo
OpenGLES
AVFoundation
QuartzCore

Filtering live video

To filter live video from an iOS device's camera, you can use code like the following:

GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomShader"];
GPUImageView *filteredVideoView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, viewWidth, viewHeight)];
// Add the view somewhere so it's visible

[videoCamera addTarget:customFilter];
[customFilter addTarget:filteredVideoView];

[videoCamera startCameraCapture];

Capturing and filtering a still photo

To capture and filter still photos, you can use a process similar to the one for filtering video. Instead of a GPUImageVideoCamera, you use a GPUImageStillCamera:

stillCamera = [[GPUImageStillCamera alloc] init];
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

filter = [[GPUImageGammaFilter alloc] init];
[stillCamera addTarget:filter];
GPUImageView *filterView = (GPUImageView *)self.view;
[filter addTarget:filterView];

[stillCamera startCameraCapture];

This will give you a live, filtered feed of the still camera's preview video. Note that this preview video is only provided on iOS 4.3 and higher, so you may need to set that as your deployment target if you wish to have this functionality.

Once you want to capture a photo, you use a callback block like the following:

[stillCamera capturePhotoProcessedUpToFilter:filter withCompletionHandler:^(UIImage *processedImage, NSError *error){
    NSData *dataForJPEGFile = UIImageJPEGRepresentation(processedImage, 0.8);

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];

    NSError *error2 = nil;
    if (![dataForJPEGFile writeToFile:[documentsDirectory stringByAppendingPathComponent:@"FilteredPhoto.jpg"] options:NSAtomicWrite error:&error2])
    {
        return;
    }
}];

Processing a still image

There are a couple of ways to process a still image and create a result. The first way you can do this is by creating a still image source object and manually creating a filter chain:

UIImage *inputImage = [UIImage imageNamed:@"Lambeau.jpg"];

GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageSepiaFilter *stillImageFilter = [[GPUImageSepiaFilter alloc] init];

[stillImageSource addTarget:stillImageFilter];
[stillImageFilter useNextFrameForImageCapture];
[stillImageSource processImage];

UIImage *currentFilteredVideoFrame = [stillImageFilter imageFromCurrentFramebuffer];
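
The second, more convenient way is the imageByFilteringImage: method that GPUImage filters provide, which performs those same steps in a single call:

GPUImageSepiaFilter *stillImageFilter2 = [[GPUImageSepiaFilter alloc] init];
UIImage *quickFilteredImage = [stillImageFilter2 imageByFilteringImage:inputImage];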

Filtering and re-encoding a movie

The following is an example of how you would load a sample movie, pass it through a pixellation filter, then record the result to disk as a 480 x 640 H.264 movie:

movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
pixellateFilter = [[GPUImagePixellateFilter alloc] init];
[movieFile addTarget:pixellateFilter];

NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]);
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
[pixellateFilter addTarget:movieWriter];

movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];

[movieWriter startRecording];
[movieFile startProcessing];
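
The movie won't be playable until recording has been finished off. One way to do that (the pattern used in the project's SimpleVideoFileFilter example) is to detach the writer and finish recording once the source movie completes:

[movieWriter setCompletionBlock:^{
    [pixellateFilter removeTarget:movieWriter];
    [movieWriter finishRecording];
}];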

Combining multiple GPUImage filters

First, declare the following in your .h file:

GPUImagePicture *staticPicture;
GPUImageOutput *brightnessFilter;    // brightness
GPUImageOutput *contrastFilter;      // contrast
GPUImageFilterPipeline *pipeline;    // used below; the original listing omitted this declaration
NSMutableArray *arrayTemp;
UISlider *brightnessSlider;
UISlider *contrastSlider;

Then, in viewDidLoad in your .m file:

UIImage *image = [UIImage imageNamed:@"sample1.jpg"];
staticPicture = [[GPUImagePicture alloc] initWithImage:image smoothlyScaleOutput:YES];

// Brightness
brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
CGRect mainScreenFrame = [[UIScreen mainScreen] applicationFrame];
GPUImageView *GPUView = [[GPUImageView alloc] initWithFrame:mainScreenFrame];
[brightnessFilter forceProcessingAtSize:GPUView.sizeInPixels];
self.view = GPUView;
[brightnessFilter addTarget:GPUView];

brightnessSlider = [[UISlider alloc] initWithFrame:CGRectMake(25.0, mainScreenFrame.size.height - 250, mainScreenFrame.size.width - 50.0, 40.0)];
[brightnessSlider addTarget:self action:@selector(updateSliderValue:) forControlEvents:UIControlEventValueChanged];
brightnessSlider.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleTopMargin;
brightnessSlider.minimumValue = 0.0;
brightnessSlider.maximumValue = 1.0;
brightnessSlider.tag = 10;
brightnessSlider.value = 0.0;
[GPUView addSubview:brightnessSlider];
[staticPicture processImage];

// Contrast
contrastFilter = [[GPUImageContrastFilter alloc] init];
[contrastFilter forceProcessingAtSize:GPUView.sizeInPixels];
[contrastFilter addTarget:GPUView];

contrastSlider = [[UISlider alloc] initWithFrame:CGRectMake(25.0, mainScreenFrame.size.height - 190, mainScreenFrame.size.width - 50.0, 40.0)];
[contrastSlider addTarget:self action:@selector(updateSliderValue:) forControlEvents:UIControlEventValueChanged];
contrastSlider.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleTopMargin;
contrastSlider.minimumValue = 0.0;
contrastSlider.maximumValue = 4.0; // GPUImageContrastFilter's documented range is 0.0 to 4.0
contrastSlider.tag = 11;
contrastSlider.value = 1.0; // 1.0 is normal contrast; starting at 0.0 would flatten the image to gray
[GPUView addSubview:contrastSlider];
[staticPicture processImage];

// Combine: put every filter you want to apply into an array and build a pipeline
[staticPicture addTarget:brightnessFilter];
[staticPicture addTarget:contrastFilter];
arrayTemp = [[NSMutableArray alloc] initWithObjects:brightnessFilter, contrastFilter, nil];
pipeline = [[GPUImageFilterPipeline alloc] initWithOrderedFilters:arrayTemp input:staticPicture output:(GPUImageView *)self.view];

Finally, add the method that makes the color adjustments interactive via the UISliders:

- (void)updateSliderValue:(UISlider *)sender
{
    NSInteger index = sender.tag - 10;
    switch (index)
    {
        case 0:
        {
            GPUImageBrightnessFilter *GPU = (GPUImageBrightnessFilter *)brightnessFilter;
            [GPU setBrightness:brightnessSlider.value];
            [staticPicture processImage];
            NSLog(@"brightness = %f", brightnessSlider.value);
            break;
        }
        case 1:
        {
            GPUImageContrastFilter *GPU = (GPUImageContrastFilter *)contrastFilter;
            [GPU setContrast:contrastSlider.value];
            [staticPicture processImage];
            NSLog(@"contrast = %f", contrastSlider.value);
            break;
        }
        default:
            break;
    }
}

https://github.com/BradLarson/GPUImage
