http://codethink.no-ip.org/wordpress/archives/673
A while back I posted a handful of simple iOS utilities. Among them was a basic ScreenCaptureView implementation that would periodically render the contents of its subview(s) into a UIImage that was exposed as a publicly accessible property. This provides the ability to quickly and easily take a snapshot of your running application, or of any arbitrary component within it. And while not superbly impressive (the iPhone has a built-in screenshot feature, after all), I noted that the control theoretically allowed for captured frames to be sent off to an AVCaptureSession in order to record live video of a running application.
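For instance, grabbing a one-off snapshot is just a matter of reading the 'currentScreen' property (a minimal sketch; the 'captureView' reference and the output path are purely illustrative):
//assumes 'captureView' is a ScreenCaptureView already in your view hierarchy
UIImage* snapshot = captureView.currentScreen;
NSString* pngPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/snapshot.png"];
[UIImagePNGRepresentation(snapshot) writeToFile:pngPath atomically:YES];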
Recently I returned to this bit of code, and the ability to record live video of an application is theoretical no longer. To get straight to the point, here is the revised code:
//
//ScreenCaptureView.h
//
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

/**
 * Delegate protocol. Implement this if you want to receive a notification when the
 * view completes a recording.
 *
 * When a recording is completed, the ScreenCaptureView will notify the delegate, passing
 * it the path to the created recording file if the recording was successful, or a value
 * of nil if the recording failed/could not be saved.
 */
@protocol ScreenCaptureViewDelegate <NSObject>
- (void) recordingFinished:(NSString*)outputPathOrNil;
@end

/**
 * ScreenCaptureView, a UIView subclass that periodically samples its current display
 * and stores it as a UIImage available through the 'currentScreen' property. The
 * sample/update rate can be configured (within reason) by setting the 'frameRate'
 * property.
 *
 * This class can also be used to record real-time video of its subviews, using the
 * 'startRecording' and 'stopRecording' methods. A new recording will overwrite any
 * previously made recording file, so if you want to create multiple recordings per
 * session (or across multiple sessions) then it is your responsibility to copy/back-up
 * the recording output file after each session.
 *
 * To use this class, you must link against the following frameworks:
 *
 *  - AssetsLibrary
 *  - AVFoundation
 *  - CoreGraphics
 *  - CoreMedia
 *  - CoreVideo
 *  - QuartzCore
 *
 */
@interface ScreenCaptureView : UIView {
    //video writing
    AVAssetWriter *videoWriter;
    AVAssetWriterInput *videoWriterInput;
    AVAssetWriterInputPixelBufferAdaptor *avAdaptor;

    //recording state
    BOOL _recording;
    NSDate* startedAt;
    void* bitmapData;
}

//for recording video
- (bool) startRecording;
- (void) stopRecording;

//for accessing the current screen and adjusting the capture rate, etc.
@property(retain) UIImage* currentScreen;
@property(assign) float frameRate;
@property(nonatomic, assign) id<ScreenCaptureViewDelegate> delegate;

@end
//
//ScreenCaptureView.m
//
#import "ScreenCaptureView.h"
#import <QuartzCore/QuartzCore.h>
#import <MobileCoreServices/UTCoreTypes.h>
#import <AssetsLibrary/AssetsLibrary.h>

@interface ScreenCaptureView (Private)
- (void) writeVideoFrameAtTime:(CMTime)time;
@end

@implementation ScreenCaptureView

@synthesize currentScreen, frameRate, delegate;

- (void) initialize {
    // Initialization code
    self.clearsContextBeforeDrawing = YES;
    self.currentScreen = nil;
    self.frameRate = 10.0f;     //10 frames per second
    _recording = false;
    videoWriter = nil;
    videoWriterInput = nil;
    avAdaptor = nil;
    startedAt = nil;
    bitmapData = NULL;
}

- (id) initWithCoder:(NSCoder *)aDecoder {
    self = [super initWithCoder:aDecoder];
    if (self) {
        [self initialize];
    }
    return self;
}

- (id) init {
    self = [super init];
    if (self) {
        [self initialize];
    }
    return self;
}

- (id) initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        [self initialize];
    }
    return self;
}

- (CGContextRef) createBitmapContextOfSize:(CGSize)size {
    CGContextRef    context = NULL;
    CGColorSpaceRef colorSpace;
    int             bitmapByteCount;
    int             bitmapBytesPerRow;

    bitmapBytesPerRow = (size.width * 4);
    bitmapByteCount   = (bitmapBytesPerRow * size.height);
    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (bitmapData != NULL) {
        free(bitmapData);
    }
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL) {
        fprintf(stderr, "Memory not allocated!");
        CGColorSpaceRelease(colorSpace);
        return NULL;
    }

    context = CGBitmapContextCreate(bitmapData,
                                    size.width,
                                    size.height,
                                    8,      // bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaNoneSkipFirst);
    if (context == NULL) {
        free(bitmapData);
        bitmapData = NULL;
        fprintf(stderr, "Context not created!");
        CGColorSpaceRelease(colorSpace);
        return NULL;
    }
    //only disable antialiasing once we know the context was actually created
    CGContextSetAllowsAntialiasing(context, NO);
    CGColorSpaceRelease(colorSpace);

    return context;
}

//static int frameCount = 0;            //debugging

- (void) drawRect:(CGRect)rect {
    NSDate* start = [NSDate date];
    CGContextRef context = [self createBitmapContextOfSize:self.frame.size];
    if (context == NULL) {
        //bail out if the bitmap context could not be created
        return;
    }

    //not sure why this is necessary...image renders upside-down and mirrored otherwise
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, self.frame.size.height);
    CGContextConcatCTM(context, flipVertical);

    [self.layer renderInContext:context];

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage* background = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    self.currentScreen = background;

    //debugging
    //if (frameCount < 40) {
    //    NSString* filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
    //    NSString* pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    //    [UIImagePNGRepresentation(self.currentScreen) writeToFile:pngPath atomically:YES];
    //    frameCount++;
    //}

    //NOTE: to record a scrollview while it is scrolling you need to implement your
    //UIScrollViewDelegate such that it calls 'setNeedsDisplay' on the ScreenCaptureView.
    if (_recording) {
        float millisElapsed = [[NSDate date] timeIntervalSinceDate:startedAt] * 1000.0;
        [self writeVideoFrameAtTime:CMTimeMake((int)millisElapsed, 1000)];
    }

    float processingSeconds = [[NSDate date] timeIntervalSinceDate:start];
    float delayRemaining = (1.0 / self.frameRate) - processingSeconds;

    CGContextRelease(context);

    //redraw at the specified framerate
    [self performSelector:@selector(setNeedsDisplay) withObject:nil
               afterDelay:delayRemaining > 0.0 ? delayRemaining : 0.01];
}

- (void) cleanupWriter {
    [avAdaptor release];
    avAdaptor = nil;

    [videoWriterInput release];
    videoWriterInput = nil;

    [videoWriter release];
    videoWriter = nil;

    [startedAt release];
    startedAt = nil;

    if (bitmapData != NULL) {
        free(bitmapData);
        bitmapData = NULL;
    }
}

- (void) dealloc {
    [self cleanupWriter];
    [super dealloc];
}

- (NSURL*) tempFileURL {
    NSString* outputPath = [[NSString alloc] initWithFormat:@"%@/%@",
        [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0],
        @"output.mp4"];
    NSURL* outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager* fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath]) {
        NSError* error = nil;
        if ([fileManager removeItemAtPath:outputPath error:&error] == NO) {
            NSLog(@"Could not delete old recording file at path: %@", outputPath);
        }
    }
    [outputPath release];
    return [outputURL autorelease];
}

- (BOOL) setUpWriter {
    NSError* error = nil;
    //NOTE: the writer produces a QuickTime container here, even though the output file is named 'output.mp4'
    videoWriter = [[AVAssetWriter alloc] initWithURL:[self tempFileURL] fileType:AVFileTypeQuickTimeMovie error:&error];
    NSParameterAssert(videoWriter);

    //Configure video
    NSDictionary* videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithDouble:1024.0*1024.0], AVVideoAverageBitRateKey,
        nil];

    NSDictionary* videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecH264, AVVideoCodecKey,
        [NSNumber numberWithInt:self.frame.size.width], AVVideoWidthKey,
        [NSNumber numberWithInt:self.frame.size.height], AVVideoHeightKey,
        videoCompressionProps, AVVideoCompressionPropertiesKey,
        nil];

    videoWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings] retain];
    NSParameterAssert(videoWriterInput);
    videoWriterInput.expectsMediaDataInRealTime = YES;

    NSDictionary* bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey,
        nil];
    avAdaptor = [[AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput sourcePixelBufferAttributes:bufferAttributes] retain];

    //add input
    [videoWriter addInput:videoWriterInput];
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:CMTimeMake(0, 1000)];

    return YES;
}

- (void) completeRecordingSession {
    NSAutoreleasePool* pool = [[NSAutoreleasePool alloc] init];

    [videoWriterInput markAsFinished];

    // Wait for the video writer to leave the 'unknown' state
    int status = videoWriter.status;
    while (status == AVAssetWriterStatusUnknown) {
        NSLog(@"Waiting...");
        [NSThread sleepForTimeInterval:0.5f];
        status = videoWriter.status;
    }

    @synchronized(self) {
        BOOL success = [videoWriter finishWriting];
        if (!success) {
            NSLog(@"finishWriting returned NO");
        }

        [self cleanupWriter];

        id delegateObj = self.delegate;
        NSString *outputPath = [[NSString alloc] initWithFormat:@"%@/%@",
            [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0],
            @"output.mp4"];
        NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];

        NSLog(@"Completed recording, file is stored at: %@", outputURL);
        if ([delegateObj respondsToSelector:@selector(recordingFinished:)]) {
            //pass the path string (or nil), matching the type declared by the delegate protocol
            [delegateObj performSelectorOnMainThread:@selector(recordingFinished:)
                                          withObject:(success ? outputPath : nil)
                                       waitUntilDone:YES];
        }

        [outputPath release];
        [outputURL release];
    }

    [pool drain];
}

- (bool) startRecording {
    bool result = NO;
    @synchronized(self) {
        if (!_recording) {
            result = [self setUpWriter];
            startedAt = [[NSDate date] retain];
            _recording = true;
        }
    }
    return result;
}

- (void) stopRecording {
    @synchronized(self) {
        if (_recording) {
            _recording = false;
            [self completeRecordingSession];
        }
    }
}

- (void) writeVideoFrameAtTime:(CMTime)time {
    if (![videoWriterInput isReadyForMoreMediaData]) {
        NSLog(@"Not ready for video data");
    }
    else {
        @synchronized(self) {
            UIImage* newFrame = [self.currentScreen retain];
            CVPixelBufferRef pixelBuffer = NULL;
            CGImageRef cgImage = CGImageCreateCopy([newFrame CGImage]);
            CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

            int status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, avAdaptor.pixelBufferPool, &pixelBuffer);
            if (status != 0) {
                //could not get a buffer from the pool
                NSLog(@"Error creating pixel buffer: status=%d", status);
            }
            else {
                //set image data into pixel buffer
                CVPixelBufferLockBaseAddress(pixelBuffer, 0);
                uint8_t* destPixels = CVPixelBufferGetBaseAddress(pixelBuffer);
                //XXX: will work if the pixel buffer is contiguous and has the same bytesPerRow as the input data
                CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);

                BOOL success = [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
                if (!success) {
                    NSLog(@"Warning: Unable to write buffer to video");
                }

                CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
                CVPixelBufferRelease(pixelBuffer);
            }

            //clean up
            [newFrame release];
            CFRelease(image);
            CGImageRelease(cgImage);
        }
    }
}

@end
This class will let you record high-quality video of any other view in your application. To use it, simply set it up as the superview of the UIView(s) that you want to record, add a reference to it in your corresponding UIViewController (using Interface Builder or whatever your preferred method happens to be), and then call ‘startRecording’ when you are ready to start recording video. When you’ve recorded enough, call ‘stopRecording’ to complete the process. You will get a nice .mp4 file stored under your application’s ‘Documents’ directory that you can copy off or do whatever else you want with.
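Putting that together, a minimal usage sketch might look like the following (the controller class and the ‘captureView’ outlet are hypothetical names; the delegate callback matches the protocol declared above):
//
//RecordingViewController.m (illustrative)
//
#import "ScreenCaptureView.h"

@interface RecordingViewController : UIViewController <ScreenCaptureViewDelegate> {
    IBOutlet ScreenCaptureView* captureView;    //superview of the content to record
}
@end

@implementation RecordingViewController

- (void) viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    captureView.delegate = self;
    [captureView startRecording];
    //stop after 30 seconds, for example
    [self performSelector:@selector(finish) withObject:nil afterDelay:30.0];
}

- (void) finish {
    [captureView stopRecording];
}

//ScreenCaptureViewDelegate
- (void) recordingFinished:(NSString*)outputPathOrNil {
    if (outputPathOrNil != nil) {
        NSLog(@"Recording saved to: %@", outputPathOrNil);
    } else {
        NSLog(@"Recording failed/could not be saved");
    }
}

@end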
Note that if you want to record a UIScrollView while it is scrolling, you will need to implement your UIScrollViewDelegate such that it calls ‘setNeedsDisplay’ on the ScreenCaptureView while the scroll-view is scrolling. For instance:
- (void) scrollViewDidScroll:(UIScrollView*)scrollView {
    [captureView setNeedsDisplay];
}
I haven’t tested this code on a physical device yet, but there’s no reason why it should not work on any device that includes H.264 video codec support (iPhone 3GS and later). However, given the amount of drawing that it does, it’s safe to say that the more horsepower behind it, the better.
Here is a rather unimpressive 30-second recording of a UITableView that I created using this class (if your browser doesn’t support HTML5, use the link below):
Example iPhone Recording
Lastly, I haven’t tested this class with any OpenGL-based subviews, so I can’t say if it will work in that case. If you try it in this configuration, please feel free to reply with your results.
Update
For anyone looking for a working example, you can download this sample project. This project simply creates a 30-second recording of a ‘UITableView’.