Unity3D Research Institute: iOS Screen Capture, Microphone Recording, and Saving Screen-Recorded Video to the Sandbox

No blog updates for two weeks. MOMO has been swamped with overtime lately, developing for iOS and Android at the same time; the exciting days roll on, haha. Yesterday a friend called and told me his team's U3D project had hit a tricky problem: they wanted to add screen-capture video recording to Unity3D, with the result exported and saved in .mp4 format. As far as I know, Unity3D has no built-in screen recording, only still-screenshot capture. I hadn't touched Unity3D in a while and my curiosity got the better of me, so I decided to dig in. Hard work pays off: I finally worked out how to record screen-capture video by combining Unity3D with the native iOS front end.


First, the principle behind my implementation:

1. Capture a screenshot of every frame, then stitch the N captured images into a silent .mp4 video file.
2. At the same time, record the microphone input and save it as a .caf file.
3. Finally, merge the silent video and the audio recording into a brand-new video file and save it in the app sandbox.


Their Unity3D project is a bit special: think of it as a native iOS app built on top of the Unity3D engine. Unity3D is only responsible for displaying a 3D model, while the entire UI is implemented in native Objective-C. That creates a problem: an Objective-C screenshot contains only the UI layer, and a U3D screenshot contains only the 3D layer. To solve it, each time we capture the screen we merge the two images into one new image. As an aside, Apple's private screenshot API can capture the UI layer and the U3D layer in a single image, but apps that use it risk being rejected from the App Store, so we will do it the honest way and merge.


OK, now MOMO will walk you through the implementation.


First, create a brand-new Unity project and add a cube. To make the result easy to see, we write a script that keeps the cube rotating, so the motion will be obvious in the recorded video.

Test.cs is attached directly to the cube. The code is simple. One detail worth knowing: on iOS, Application.CaptureScreenshot writes its file under Application.persistentDataPath, which maps to the app's Documents folder. That is why the Objective-C side can later read the "{count}u3d.JPG" files straight out of the sandbox.


using UnityEngine;
using System.Collections;

public class Test : MonoBehaviour
{
    int count = 0;

    // Update is called once per frame
    void Update ()
    {
        // keep the cube spinning so the recorded video clearly shows motion
        this.transform.Rotate(new Vector3(0, 1, 0));
    }

    // called from the Objective-C side (via UnitySendMessage) to take one screenshot
    void StartScreenshot(string str)
    {
        Application.CaptureScreenshot(count + "u3d.JPG");
        count++;
    }
}
Next, export the Unity3D project as an iOS build; Unity generates the corresponding Xcode project. We write a brand-new view controller layered on top of the OpenGL view controller that U3D generates, and put our UIKit controls on it. Then open AppController.mm and add the marked lines at the end of the following method:
int OpenEAGL_UnityCallback(UIWindow** window, int* screenWidth, int* screenHeight, int* openglesVersion)
{
    CGRect rect = [[UIScreen mainScreen] bounds];

    // Create a full-screen window
    _window = [[UIWindow alloc] initWithFrame:rect];
    EAGLView* view = [[EAGLView alloc] initWithFrame:rect];
    UnityViewController *controller = [[UnityViewController alloc] init];

    sGLViewController = controller;
    sGLView = view;

#if defined(__IPHONE_3_0)
    if( _ios30orNewer )
        controller.wantsFullScreenLayout = TRUE;
#endif

    controller.view = view;

    CreateSplashView( UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone ? (UIView*)_window : (UIView*)view );
    CreateActivityIndicator(_splashView);

    // add only now so controller have chance to reorient *all* added views
    [_window addSubview:view];
    if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone)
        [_window bringSubviewToFront:_splashView];

    _autorotEnableHandling = true;
    [[NSNotificationCenter defaultCenter] postNotificationName: UIDeviceOrientationDidChangeNotification object: [UIDevice currentDevice]];

    // reposition activity indicator after we rotated views
    if (_activityIndicator)
        _activityIndicator.center = CGPointMake([_splashView bounds].size.width/2, [_splashView bounds].size.height/2);

    int openglesApi =
#if defined(__IPHONE_3_0) && USE_OPENGLES20_IF_AVAILABLE
    kEAGLRenderingAPIOpenGLES2;
#else
    kEAGLRenderingAPIOpenGLES1;
#endif

    for (; openglesApi >= kEAGLRenderingAPIOpenGLES1 && !_context; --openglesApi)
    {
        if (!UnityIsRenderingAPISupported(openglesApi))
            continue;

        _context = [[EAGLContext alloc] initWithAPI:openglesApi];
    }

    if (!_context)
        return false;

    if (![EAGLContext setCurrentContext:_context]) {
        _context = 0;
        return false;
    }

    const GLuint colorFormat = UnityUse32bitDisplayBuffer() ? GL_RGBA8_OES : GL_RGB565_OES;

    if (!CreateWindowSurface(view, colorFormat, GL_DEPTH_COMPONENT16_OES, UnityGetDesiredMSAASampleCount(MSAA_DEFAULT_SAMPLE_COUNT), NO, &_surface)) {
        return false;
    }

    glViewport(0, 0, _surface.w, _surface.h);
    [_window makeKeyAndVisible];
    [view release];

    *window = _window;
    *screenWidth = _surface.w;
    *screenHeight = _surface.h;
    *openglesVersion = _context.API;

    _glesContextCreated = true;

    //-------------------- the MyViewController below is the controller we just wrote ----------------
    MyViewController * myView = [[MyViewController alloc] init];
    [sGLViewController.view addSubview:myView.view];
    //-------------------- the MyViewController above is the controller we just wrote ----------------
    return true;
}
If your project is not a U3D project, remember to link AVFoundation.framework and MediaPlayer.framework yourself; a Unity-generated Xcode project already includes these two frameworks.
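One piece of glue worth spelling out: the Objective-C side asks the C# script for a screenshot by calling UnitySendMessage, which Unity declares in its generated iOS glue code. For reference, its C signature is shown below (in a plain iOS project without Unity there is no such function, so you would simply drop that call):

// Declared by Unity's generated iOS glue code; reproduced here for reference only.
//   obj    : name of the GameObject in the scene ("Cube" in this demo)
//   method : name of the C# method to invoke ("StartScreenshot")
//   msg    : string argument delivered to that method
void UnitySendMessage(const char* obj, const char* method, const char* msg);

With that out of the way, here is the header, MyViewController.h: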
//
//  MyViewController.h
//  avcount
//
//  Created by 雨松MOMO on 12-9-14.
//  Copyright (c) 2012 雨松MOMO. All rights reserved.
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>
#import <MediaPlayer/MediaPlayer.h>

// conforms to AVAudioRecorderDelegate because the implementation sets itself
// as the recorder's delegate and implements the delegate callbacks
@interface MyViewController : UIViewController <AVAudioRecorderDelegate>
{
    // capture timer
    NSTimer *_timer;
    int _count;
    UILabel * _labe;
    // audio recorder
    AVAudioRecorder * _recorder;
    // loading overlay
    UITextView *_sharedLoadingTextView;
    UIActivityIndicatorView* _sharedActivityView;
}

@end
Below is the concrete implementation. The core pieces MOMO learned from articles by foreign developers and finally stitched together; several hours of research and tears of joy. There is quite a lot of code, so please read it carefully.
//
//  MyViewController.m
//  avcount
//
//  Created by 雨松MOMO on 12-9-14.
//  Copyright (c) 2012 雨松MOMO. All rights reserved.
//

#import "MyViewController.h"

@interface MyViewController ()

@end

@implementation MyViewController

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization
    }
    return self;
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    // quick sanity test of the UIKit capture path: snapshot the key window
    // once and save it to the photo album
    self.view.backgroundColor = [UIColor redColor];
    UIWindow *screenWindow = [[UIApplication sharedApplication] keyWindow];
    UIGraphicsBeginImageContext(screenWindow.frame.size);
    [screenWindow.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);

#if !TARGET_IPHONE_SIMULATOR
    self.view.backgroundColor = [UIColor greenColor];
#else
    self.view.backgroundColor = [UIColor clearColor];
#endif

    UIButton * start = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    [start setFrame:CGRectMake(0, 0, 200, 30)];
    [start setTitle:@"Start capturing" forState:UIControlStateNormal];
    [start addTarget:self action:@selector(startPress) forControlEvents:UIControlEventTouchDown];

    UIButton * end = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    [end setFrame:CGRectMake(0, 50, 200, 30)];
    [end setTitle:@"Stop capturing (build the video)" forState:UIControlStateNormal];
    [end addTarget:self action:@selector(endPress) forControlEvents:UIControlEventTouchDown];

    [self.view addSubview:start];
    [self.view addSubview:end];

    _labe = [[[UILabel alloc] initWithFrame:CGRectMake(30, 200, 300, 30)] autorelease];
    _labe.text = [NSString stringWithFormat:@"MOMO frame counter: %d", _count];
    [self.view addSubview:_labe];

    // set up the audio recorder
    [self prepareToRecord];
}

-(void)addLoading:(NSString*) info
{
    // dark rounded text box in the middle of the screen
    _sharedLoadingTextView = [[[UITextView alloc] initWithFrame:CGRectMake(0, 0, 130, 130)] autorelease];
    [_sharedLoadingTextView setBackgroundColor:[UIColor blackColor]];
    [_sharedLoadingTextView setText:info];
    [_sharedLoadingTextView setTextColor:[UIColor whiteColor]];
    [_sharedLoadingTextView setFont:[UIFont systemFontOfSize:15]];
    _sharedLoadingTextView.textAlignment = UITextAlignmentCenter;
    _sharedLoadingTextView.alpha = 0.8f;
    _sharedLoadingTextView.center = self.view.center;
    _sharedLoadingTextView.layer.cornerRadius = 10;
    _sharedLoadingTextView.layer.masksToBounds = YES;

    // large white spinner on top of it
    _sharedActivityView = [[[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleWhiteLarge] autorelease];
    _sharedActivityView.frame = CGRectMake(0, 0, 320, 480);
    _sharedActivityView.center = self.view.center;

    // start the animation
    [_sharedActivityView startAnimating];

    [self.view addSubview:_sharedLoadingTextView];
    [self.view addSubview:_sharedActivityView];
}

-(void)removeLoading
{
    [_sharedLoadingTextView removeFromSuperview];
    [_sharedActivityView removeFromSuperview];
}

-(void)startPress
{
    _count = 0;
    // capture one frame every 0.1 s (10 fps)
    _timer = [NSTimer scheduledTimerWithTimeInterval: 0.1
                                              target: self
                                            selector: @selector(heartBeat:)
                                            userInfo: nil
                                             repeats: YES];

    // start recording audio
    [_recorder record];
}

-(void)endPress
{
    if(_timer != nil)
    {
        [_timer invalidate];
        _timer = nil;

        [self addLoading:@"Building video..."];
        [NSThread detachNewThreadSelector:@selector(startThreadMainMethod) toTarget:self withObject:nil];
    }
}

-(void)startThreadMainMethod
{
    // a detached thread needs its own autorelease pool under MRC
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // build the video here
    NSMutableArray *_array = [[[NSMutableArray alloc] init] autorelease];
    NSString * Path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    for(int i = 0; i < _count; i++)
    {
        // paths of the two screenshots saved for frame i in the sandbox
        NSString * _pathSecond = [NSString stringWithFormat:@"%@/%d%@", Path, i, @".JPG"];
        NSString * _pathFirst  = [NSString stringWithFormat:@"%@/%d%@", Path, i, @"u3d.JPG"];

        // load each file into an NSData, then decode it into a UIImage
        NSData *data0 = [NSData dataWithContentsOfFile:_pathFirst];
        NSData *data1 = [NSData dataWithContentsOfFile:_pathSecond];
        UIImage *img0 = [UIImage imageWithData:data0];
        UIImage *img1 = [UIImage imageWithData:data1];

        [_array addObject:[self MergerImage:img0 :img1]];
    }

    Path = [NSString stringWithFormat:@"%@/%@%@", Path, @"veido", @".mp4"];

    [_recorder stop];
    [self writeImages:_array ToMovieAtPath:Path withSize:CGSizeMake(320, 480) inDuration:_count*0.1 byFPS:10];

    // UIKit must only be touched on the main thread
    dispatch_async(dispatch_get_main_queue(), ^{
        [self removeLoading];

        NSLog(@"recorder successfully");
        UIAlertView *recorderSuccessful = [[UIAlertView alloc] initWithTitle:@"" message:@"Video recorded successfully"
                                                                    delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [recorderSuccessful show];
        [recorderSuccessful release];
    });

    [pool release];
}

- (void) heartBeat: (NSTimer*) timer
{
    _labe.text = [NSString stringWithFormat:@"MOMO frame counter: %d", _count];

    // UIGetScreenImage is a private API; with bad luck your app gets rejected.
    // It is powerful, though: it grabs the iOS UI layer and the U3D layer in one shot.
    //extern CGImageRef UIGetScreenImage();
    //UIImage *image = [UIImage imageWithCGImage:UIGetScreenImage()];
    //UIImageWriteToSavedPhotosAlbum(image,nil,nil,nil);

    // to be safe we capture with the public API instead;
    // note that this method cannot capture the U3D (OpenGL) content
    UIWindow *screenWindow = [[UIApplication sharedApplication] keyWindow];
    UIGraphicsBeginImageContext(screenWindow.frame.size);
    [screenWindow.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *data = UIImagePNGRepresentation(image);
    if (data == nil)
    {
        data = UIImageJPEGRepresentation(image, 1);
    }

    NSFileManager *fileManager = [NSFileManager defaultManager];

    NSString * Path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];

    [fileManager createDirectoryAtPath:Path withIntermediateDirectories:YES attributes:nil error:nil];
    Path = [NSString stringWithFormat:@"%@/%d%@", Path, _count, @".JPG"];
    [fileManager createFileAtPath:Path contents:data attributes:nil];

    // tell U3D to take its screenshot for this frame
    UnitySendMessage("Cube", "StartScreenshot", "");

    _count++;
}

// merge the iOS UI screenshot and the U3D screenshot into one image
-(UIImage*) MergerImage:(UIImage*) firstImg :(UIImage*) secondImg
{
    UIGraphicsBeginImageContext(CGSizeMake(320, 480));

    // draw the U3D frame first, then composite the UI capture on top
    [firstImg drawInRect:CGRectMake(0, 0, firstImg.size.width, firstImg.size.height)];
    [secondImg drawInRect:CGRectMake(0, 0, secondImg.size.width, secondImg.size.height)];

    UIImage *resultImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return resultImage;
}

- (void)viewDidUnload
{
    [super viewDidUnload];
    // Release any retained subviews of the main view.
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
}

// wrap a CGImage in a CVPixelBuffer so AVAssetWriter can consume it
- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image andSize:(CGSize) size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
                                          size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    // draw the image straight into the pixel buffer's memory
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                                 size.height, 8, 4*size.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    // the caller is responsible for CVPixelBufferRelease
    return pxbuffer;
}

- (void) writeImages:(NSArray *)imagesArray ToMovieAtPath:(NSString *) path withSize:(CGSize) size
          inDuration:(float)duration byFPS:(int32_t)fps
{
    // stitch the previously captured images into a silent movie file
    // Wire the writer:
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                            fileType:AVFileTypeQuickTimeMovie
                                                               error:&error] autorelease];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];

    // the writer retains its inputs, so no extra retain is needed here
    AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
                                            assetWriterInputWithMediaType:AVMediaTypeVideo
                                            outputSettings:videoSettings];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];
    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    [videoWriter addInput:videoWriterInput];

    // Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    // Write some samples:
    CVPixelBufferRef buffer = NULL;

    int frameCount = 0;

    int imagesCount = (int)[imagesArray count];
    float averageTime = duration/imagesCount;
    int averageFrame = (int)(averageTime * fps);

    for(UIImage * img in imagesArray)
    {
        buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:size];

        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30)
        {
            if (adaptor.assetWriterInput.readyForMoreMediaData)
            {
                printf("appending %d attempt %d\n", frameCount, j);

                CMTime frameTime = CMTimeMake(frameCount, (int32_t)fps);
                float frameSeconds = CMTimeGetSeconds(frameTime);
                NSLog(@"frameCount:%d,kRecordingFPS:%d,frameSeconds:%f", frameCount, fps, frameSeconds);
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];

                if(buffer)
                    [NSThread sleepForTimeInterval:0.05];
            }
            else
            {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d\n", frameCount, j);
        }

        // release the pixel buffer we created, otherwise every frame leaks
        if (buffer) {
            CVPixelBufferRelease(buffer);
            buffer = NULL;
        }

        frameCount = frameCount + averageFrame;
    }

    // Finish the session:
    [videoWriterInput markAsFinished];
    [videoWriter finishWriting];
    NSLog(@"finishWriting");

    // now merge the silent movie and the recorded audio into a new video
    [self CompileFilesToMakeMovie];
}

- (void) prepareToRecord
{
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    NSError *err = nil;
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
    if(err){
        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }

    // note: reset err *before* the call, so the check below actually works
    err = nil;
    [audioSession setActive:YES error:&err];
    if(err){
        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        return;
    }

    // uncompressed 16-bit 44.1 kHz stereo PCM
    NSMutableDictionary *recordSetting = [[[NSMutableDictionary alloc] init] autorelease];
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
    [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
    [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
    [recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
    [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

    // record to Documents/sound.caf in the sandbox
    NSString * recorderFilePath = [NSString stringWithFormat:@"%@/%@.caf", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], @"sound"];
    NSURL *url = [NSURL fileURLWithPath:recorderFilePath];
    err = nil;
    _recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err];
    if(!_recorder){
        NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        UIAlertView *alert =
        [[UIAlertView alloc] initWithTitle: @"Warning"
                                   message: [err localizedDescription]
                                  delegate: nil
                         cancelButtonTitle:@"OK"
                         otherButtonTitles:nil];
        [alert show];
        [alert release];
        return;
    }

    // prepare to record
    [_recorder setDelegate:self];
    [_recorder prepareToRecord];
    _recorder.meteringEnabled = YES;

    BOOL audioHWAvailable = audioSession.inputIsAvailable;
    if (!audioHWAvailable) {
        UIAlertView *cantRecordAlert =
        [[UIAlertView alloc] initWithTitle: @"Warning"
                                   message: @"Audio input hardware not available"
                                  delegate: nil
                         cancelButtonTitle:@"OK"
                         otherButtonTitles:nil];
        [cantRecordAlert show];
        [cantRecordAlert release];
        return;
    }
}

// delegate callback: recording finished successfully
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *) aRecorder successfully:(BOOL)flag
{
//    NSLog(@"recorder successfully");
//    UIAlertView *recorderSuccessful = [[UIAlertView alloc] initWithTitle:@"" message:@"Recording finished"
//                                                                delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
//    [recorderSuccessful show];
//    [recorderSuccessful release];
}

// delegate callback: an encoding error occurred while recording
- (void)audioRecorderEncodeErrorDidOccur:(AVAudioRecorder *)arecorder error:(NSError *)error
{
//    UIAlertView *recorderFailed = [[UIAlertView alloc] initWithTitle:@"" message:@"An error occurred"
//                                                            delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
//    [recorderFailed show];
//    [recorderFailed release];
}

-(void)CompileFilesToMakeMovie
{
    // merge the silent movie and the recorded audio in the sandbox into one new video
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    NSString* audio_inputFileName = @"sound.caf";
    NSString* audio_inputFilePath = [NSString stringWithFormat:@"%@/%@", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], audio_inputFileName];
    NSURL*    audio_inputFileUrl = [NSURL fileURLWithPath:audio_inputFilePath];

    NSString* video_inputFileName = @"veido.mp4";
    NSString* video_inputFilePath = [NSString stringWithFormat:@"%@/%@", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], video_inputFileName];
    NSURL*    video_inputFileUrl = [NSURL fileURLWithPath:video_inputFilePath];

    NSString* outputFileName = @"outputVeido.mov";
    NSString* outputFilePath = [NSString stringWithFormat:@"%@/%@", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], outputFileName];
    NSURL*    outputFileUrl = [NSURL fileURLWithPath:outputFilePath];

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    CMTime nextClipStartTime = kCMTimeZero;

    // video track from the silent movie
    AVURLAsset* videoAsset = [[[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil] autorelease];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    // audio track from the .caf recording
    AVURLAsset* audioAsset = [[[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil] autorelease];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.outputURL = outputFileUrl;

    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        // a production app should inspect _assetExport.status here
        // (Completed / Failed / Cancelled) before touching the output file
        [_assetExport release];
    }];
}

@end
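A quick note on the arithmetic inside writeImages:ToMovieAtPath:withSize:inDuration:byFPS:. Since startPress fires heartBeat: every 0.1 s, and startThreadMainMethod passes duration = _count * 0.1 with fps = 10, averageFrame works out to exactly one video frame per captured image. A standalone sanity check of that math (an illustrative snippet with a made-up image count, not part of the project):

#import <Foundation/Foundation.h>

int main(void)
{
    int   imagesCount  = 50;                       // hypothetical: frames captured by the 0.1 s timer
    float duration     = imagesCount * 0.1f;       // what startThreadMainMethod passes: _count * 0.1
    int   fps          = 10;
    float averageTime  = duration / imagesCount;   // 0.1 s per image
    int   averageFrame = (int)(averageTime * fps); // = 1 frame per image
    // frameCount therefore advances by one per image, and CMTimeMake(frameCount, fps)
    // stamps successive frames 0.1 s apart, exactly matching the capture rate.
    NSLog(@"averageFrame = %d", averageFrame);
    return 0;
}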
In the running demo, the buttons are UIKit controls, while the cube behind them is drawn by U3D and rotates continuously. When you tap "Start capturing", the Objective-C side and U3D each take a screenshot on every tick, and audio recording starts at the same moment. When you tap "Stop capturing", the program first merges each pair of OC and U3D screenshots into a new image, builds a silent video from the sequence, and finally combines that silent video with the recorded audio into a brand-new video saved in the sandbox.

 

Now look at the sandbox files in the simulator: the "number.JPG" files are the screenshots taken by Objective-C, the "numberu3d.JPG" files are the screenshots taken inside U3D, sound.caf is the recorded audio, veido.mp4 is the silent video stitched from the image sequence, and outputVeido.mov is the final video that combines the silent video with the audio.
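If you would rather verify this from code than by digging through the simulator's folders, a minimal sketch like the one below, using only standard NSFileManager calls, prints everything in the app's Documents directory:

#import <Foundation/Foundation.h>

// List the sandbox Documents directory. Handy for confirming that the .JPG
// frames, sound.caf, veido.mp4 and outputVeido.mov all landed where expected.
// (Illustrative helper, not part of the original project.)
static void DumpDocumentsDirectory(void)
{
    NSString *docs = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    NSError *error = nil;
    NSArray *files = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:docs error:&error];
    if (files == nil) {
        NSLog(@"could not list %@: %@", docs, error);
        return;
    }
    for (NSString *name in files)
        NSLog(@"sandbox file: %@", name);
}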


 

Double-click the outputVeido.mov file and you can play it directly in QuickTime Player. Pretty powerful, right? Hahaha. So U3D can pull off screen recording after all~~
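Since MediaPlayer.framework is already linked, you could also play the finished movie inside the app instead of pulling it onto the desktop. A minimal sketch, assuming it is called from a view controller such as MyViewController (an illustrative helper, not part of the original project):

#import <MediaPlayer/MediaPlayer.h>

// Present the merged movie with MPMoviePlayerViewController,
// the standard full-screen player of this iOS era.
- (void)playOutputMovie
{
    NSString *path = [[NSHomeDirectory() stringByAppendingPathComponent:@"Documents"]
                      stringByAppendingPathComponent:@"outputVeido.mov"];
    MPMoviePlayerViewController *player =
        [[[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:path]] autorelease];
    [self presentMoviePlayerViewControllerAnimated:player];
}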


 

One last note: this code also works in an ordinary iOS app; the U3D case merely adds the extra image-merging step. All the code lives in MyViewController, so please read it carefully. Over the next couple of days MOMO will also find time to look into screen recording on Android, so stay tuned, wahaha.

Since the Unity-generated project is quite large, I won't upload the Unity-generated Xcode code; instead, here is a download link for the pure Objective-C version. Finally, 雨松MOMO wishes you happy studying. Download: http://vdisk.weibo.com/s/cvQOT
