This post is loosely organized; I will clean it up over time.
It collects conclusions drawn from three sample projects on camera usage.
1. AVCam
AVCam demonstrates use of the AV Foundation capture API for recording movies, taking still images, and switching between cameras. It runs only on an actual device, either an iPad or iPhone, and cannot be run in the simulator.
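A minimal sketch of the AVCaptureSession setup that a sample like AVCam is built around (session configuration only; the real sample also adds a still-image output, a preview layer, camera switching, and error handling — this must run on a real device, since the simulator has no camera):

```swift
import AVFoundation

// One camera input, one movie-file output, attached to a capture session.
let session = AVCaptureSession()
session.sessionPreset = .high

if let camera = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

let movieOutput = AVCaptureMovieFileOutput()
if session.canAddOutput(movieOutput) {
    session.addOutput(movieOutput)
}

session.startRunning()
```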
GLKViewController can be used with OpenGL to drive frame-by-frame animation.
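A small sketch of that frame-by-frame pattern: GLKViewController runs a render loop for you, calling `update()` once per frame for logic and `glkView(_:drawIn:)` for drawing (the class name and the trivial color animation here are my own):

```swift
import GLKit

class AnimationViewController: GLKViewController {
    var time: Float = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        let glkView = view as! GLKView
        glkView.context = EAGLContext(api: .openGLES2)!
        preferredFramesPerSecond = 60
    }

    override func update() {
        // Called once per frame, before drawing.
        time += Float(timeSinceLastUpdate)
    }

    override func glkView(_ view: GLKView, drawIn rect: CGRect) {
        // Animate the clear color as a trivial per-frame effect.
        glClearColor(0.5 + 0.5 * sin(time), 0.2, 0.4, 1.0)
        glClear(GLbitfield(GL_COLOR_BUFFER_BIT))
    }
}
```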
2. GLCameraRipple
This sample demonstrates how to use the AVFoundation framework to capture YUV frames from the camera and process them with shaders in OpenGL ES 2.0.
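The capture half of that pipeline can be sketched as follows: an AVCaptureVideoDataOutput configured to deliver biplanar YUV frames (plane 0 is luminance, plane 1 is interleaved chrominance), which GLCameraRipple would then upload as separate GL textures for its shaders. The class name and queue label here are my own:

```swift
import AVFoundation

final class YUVCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        // Request biplanar YUV instead of the default BGRA.
        output.videoSettings = [
            kCVPixelBufferPixelFormatTypeKey as String:
                kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
        ]
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Plane 0: Y (luminance); plane 1: CbCr (chrominance).
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let lumaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
        _ = lumaWidth // each plane would become a GL texture here
    }
}
```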
3. SquareCam
SquareCam demonstrates improvements to the AVCaptureStillImageOutput class in iOS 5, highlighting the following features:
- KVO observation of the @"capturingStillImage" property to know when to perform an animation
- Use of setVideoScaleAndCropFactor: to achieve a "digital zoom" effect on captured images
- Switching between front and back cameras while showing a real-time preview
- Integrating with Core Image's CIDetector (CIDetectorTypeFace) to find faces in a real-time VideoDataOutput as well as in a captured still image; found faces are indicated with a red square.
- The overlaid square is rotated appropriately for the four supported device orientations.
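The face-detection part of the list above can be sketched like this: a CIDetector created once (creation is expensive) and reused per frame, returning the bounds to draw the red squares over. The helper name is my own:

```swift
import CoreImage

// Create the detector once and reuse it; low accuracy is the usual
// choice for real-time video, high accuracy for a captured still image.
let faceDetector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyLow])

func faceRects(in image: CIImage) -> [CGRect] {
    // Each CIFaceFeature carries the bounds to overlay a square on.
    let features = faceDetector?.features(in: image) ?? []
    return features.map { $0.bounds }
}
```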