I have been working on Android projects for a while and kept meaning to write a summary, but laziness got in the way until now.
The code (especially the focus-box display) borrows from many blog posts and GitHub repos, with quite a few patches and tweaks of my own; this post is a record of it. If anything here needs attribution or removal, please message me and I will add a credit or take it down.
(1) Initialize Camera2Helper.
(2) Call the Camera2 API to read the camera characteristics and get the supported resolutions (a preview resolution and a recording (MediaRecorder) resolution; below they are kept identical, but they don't have to be).
(3) Open the camera and start the preview (output to both the preview Surface and the recording Surface).
(4) Start recording (wrapped in the startRecordingVideo method below, which internally calls MediaRecorder.start()).
(5) Stop recording.
A sketch of driving this flow from an Activity follows this list.
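As a rough usage sketch (not part of the original project): the Camera2Helper constructor, onResume()/onPause() and startRecordingVideo() are the real methods shown later in this post, while the Activity class name, layout name, the videoRecord_textureView id, the onRecordClick handler and the output path are made-up placeholders. The layout is assumed to also contain the AnimationImageView with id videoRecord_focus, which the helper looks up in its constructor.
import android.app.Activity;
import android.os.Bundle;
import android.view.View;

public class VideoRecordActivity extends Activity {
    private Camera2Helper mCamera2Helper;
    private AutoFitTextureView mTextureView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_video_record); // hypothetical layout
        mTextureView = findViewById(R.id.videoRecord_textureView); // hypothetical id
        // Steps (1) and (2): construct the helper, which reads the camera parameters.
        mCamera2Helper = new Camera2Helper(this, mTextureView, getExternalFilesDir(null) + "/record.mp4");
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Step (3): open the camera and start previewing once the TextureView is available.
        mCamera2Helper.onResume();
    }

    @Override
    protected void onPause() {
        // Release the camera and the background thread when leaving the screen.
        mCamera2Helper.onPause();
        super.onPause();
    }

    public void onRecordClick(View v) {
        // Step (4): start recording on the helper's worker thread.
        // A corresponding stopRecordingVideo() would be needed to finish the clip (not shown in the original post).
        mCamera2Helper.startRecordingVideo();
    }
}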
A TextureView is used for display, and the View rotation methods can be used to orient it. The recording/preview view classes follow.
public class AutoFitTextureView extends TextureView {
private int mRatioWidth = 0;
private int mRatioHeight = 0;
private MyTextureViewTouchEvent mMyTextureViewTouchEvent;
private FocusPositionTouchEvent mFocusPositionTouchEvent;
public AutoFitTextureView(Context context) {
this(context, null);
}
public AutoFitTextureView(Context context, AttributeSet attrs) {
this(context, attrs, 0);
}
public AutoFitTextureView(Context context, AttributeSet attrs, int defStyle) {
super(context, attrs, defStyle);
}
/**
* Sets the aspect ratio for this view. The size of the view will be measured based on the ratio
* calculated from the parameters. Note that the actual sizes of parameters don't matter, that
* is, calling setAspectRatio(2, 3) and setAspectRatio(4, 6) make the same result.
*
* @param width Relative horizontal size
* @param height Relative vertical size
*/
public void setAspectRatio(int width, int height) {
if (width < 0 || height < 0) {
throw new IllegalArgumentException("Size cannot be negative.");
}
mRatioWidth = width;
mRatioHeight = height;
requestLayout();
}
@Override
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
super.onMeasure(widthMeasureSpec, heightMeasureSpec);
int width = MeasureSpec.getSize(widthMeasureSpec);
int height = MeasureSpec.getSize(heightMeasureSpec);
if (0 == mRatioWidth || 0 == mRatioHeight) {
setMeasuredDimension(width, height);
} else {
if (width < height * mRatioWidth / mRatioHeight) {
setMeasuredDimension(width, width * mRatioHeight / mRatioWidth);
} else {
setMeasuredDimension(height * mRatioWidth / mRatioHeight, height);
}
}
}
@Override
public boolean onTouchEvent(MotionEvent event) {
// Guard against missing listeners to avoid an NPE before the callbacks are wired.
if (mFocusPositionTouchEvent != null) {
mFocusPositionTouchEvent.getPosition(event);
}
if (mMyTextureViewTouchEvent != null) {
return mMyTextureViewTouchEvent.onAreaTouchEvent(event);
}
return true;
}
public void setMyTextureViewTouchEvent(MyTextureViewTouchEvent myTextureViewTouchEvent) {
this.mMyTextureViewTouchEvent = myTextureViewTouchEvent;
}
public void setFocusPositionTouchEvent(FocusPositionTouchEvent mFocusPositionTouchEvent) {
this.mFocusPositionTouchEvent = mFocusPositionTouchEvent;
}
public interface MyTextureViewTouchEvent {
boolean onAreaTouchEvent(MotionEvent event);
}
public interface FocusPositionTouchEvent {
void getPosition(MotionEvent event);
}
}
Notes:
(1) Two interfaces are defined: getPosition lets the UI layer obtain the focus-box position, while onAreaTouchEvent lets the preview layer locate the focus point, set the request parameters, and re-issue the preview request.
(2) One public method sets the aspect ratio so the preview is not distorted.
(3) onMeasure is overridden and, together with (2), keeps the preview undistorted.
(4) The touch event is overridden to invoke the two listener methods (the interfaces from (1)); a wiring sketch follows.
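As an isolated illustration of the two callbacks (in this post the actual wiring is done by PreviewSessionCallback and TextureViewTouchEvent further down), a minimal sketch with a hypothetical log tag:
// Hypothetical wiring of the two AutoFitTextureView callbacks, e.g. called from an Activity.
private void wireTouchListeners(AutoFitTextureView textureView) {
    textureView.setFocusPositionTouchEvent(new AutoFitTextureView.FocusPositionTouchEvent() {
        @Override
        public void getPosition(MotionEvent event) {
            // UI layer: remember where the focus box should be drawn.
            Log.d("FocusDemo", "raw tap at " + event.getRawX() + ", " + event.getRawY());
        }
    });
    textureView.setMyTextureViewTouchEvent(new AutoFitTextureView.MyTextureViewTouchEvent() {
        @Override
        public boolean onAreaTouchEvent(MotionEvent event) {
            // Preview layer: translate the tap into AF/AE regions and re-send the preview request
            // (TextureViewTouchEvent below is the real implementation used in this post).
            return true;
        }
    });
}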
public class AnimationImageView extends android.support.v7.widget.AppCompatImageView {
private Handler mMainHandler;
private Animation mAnimation;
private Context mContext;
/**
 * Guards against a stale hide: when a new focus box replaces the old one, the old box's pending
 * "disappear" message must not hide the new box. Each focus action bumps mTimes, and only a
 * matching value is allowed to hide the view (see the FOCUS_DISAPPEAR handler below).
 */
public int mTimes = 0;
public AnimationImageView(Context context) {
super(context);
mContext = context;
}
public AnimationImageView(Context context, AttributeSet attrs) {
super(context, attrs);
mContext = context;
}
public AnimationImageView(Context context, AttributeSet attrs, int defStyleAttr) {
super(context, attrs, defStyleAttr);
mContext = context;
}
public void setmMainHandler(Handler mMainHandler) {
this.mMainHandler = mMainHandler;
}
public void setmAnimation(Animation mAnimation) {
this.mAnimation = mAnimation;
}
public void initFocus() {
this.setVisibility(VISIBLE);
new Thread(new SleepThread(mMainHandler, Camera2Helper.FOCUS_DISAPPEAR, 1, null)).start();
}
public void startFocusing() {
mTimes++;
this.setVisibility(View.VISIBLE);
this.startAnimation(mAnimation);
this.setBackground(mContext.getDrawable(R.mipmap.ic_focus_start));
new Thread(new SleepThread(mMainHandler, Camera2Helper.FOCUS_DISAPPEAR, 1000, mTimes)).start();
}
public void focusFailed() {
mTimes++;
this.setBackground(mContext.getDrawable(R.mipmap.ic_focus_failed));
new Thread(new SleepThread(mMainHandler, Camera2Helper.FOCUS_DISAPPEAR, 800, mTimes)).start();
}
public void focusSuccess() {
mTimes++;
this.setVisibility(View.VISIBLE);
this.setBackground(mContext.getDrawable(R.mipmap.ic_focus_succeed));
new Thread(new SleepThread(mMainHandler, Camera2Helper.FOCUS_DISAPPEAR, 800, mTimes)).start();
}
public void stopFocus() {
this.setVisibility(INVISIBLE);
}
}
SleepThread sleeps for the given time and then posts a message carrying mTimes, so the handler can tell whether the "disappear" message still belongs to the current focus attempt.
public class SleepThread implements Runnable {
private Handler mMainHandler;
private int what;
private long mTime;
private Object mObject;
public SleepThread(Handler mainHandler, int what, long mTime, Object mObject) {
this.mMainHandler = mainHandler;
this.what = what;
this.mTime = mTime;
this.mObject = mObject;
}
@Override
public void run() {
try {
Thread.sleep(mTime);
} catch (InterruptedException e) {
e.printStackTrace();
}
Message message = mMainHandler.obtainMessage();
message.what = what;
message.obj = mObject;
mMainHandler.sendMessage(message);
}
}
public class PreviewSessionCallback extends CameraCaptureSession.CaptureCallback implements AutoFitTextureView.FocusPositionTouchEvent {
private final static String TAG = "PreviewSessionCallback";
private int mAfState = CameraMetadata.CONTROL_AF_STATE_INACTIVE;
private int mRawX;
private int mRawY;
private boolean mFlagShowFocusImage = false;
private AnimationImageView mFocusImage;
/**
 * Handler on the UI thread, used for UI updates.
 */
private Handler mMainHandler;
public PreviewSessionCallback(AnimationImageView mFocusImage, Handler mMainHandler, AutoFitTextureView mAutoFitTextureView) {
this.mFocusImage = mFocusImage;
this.mMainHandler = mMainHandler;
mAutoFitTextureView.setFocusPositionTouchEvent(this);
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull final TotalCaptureResult result) {
Integer nowAfState = result.get(CaptureResult.CONTROL_AF_STATE);
// LogUtil.getInstance().d("status", "nowAfState:" + nowAfState + "mAfState:" + mAfState);
// The AF state could not be read from this result.
if (nowAfState == null) {
return;
}
// Same value as last time: nothing to update, ignore it.
if (nowAfState == mAfState) {
return;
}
mAfState = nowAfState;
mMainHandler.post(new Runnable() {
@Override
public void run() {
judgeFocus();
}
});
}
private void judgeFocus() {
switch (mAfState) {
case CameraMetadata.CONTROL_AF_STATE_ACTIVE_SCAN:
LogUtil.getInstance().d("status", "CONTROL_AF_STATE_ACTIVE_SCAN");
// deliberate fall-through: both scan states show the focusing animation
case CameraMetadata.CONTROL_AF_STATE_PASSIVE_SCAN:
LogUtil.getInstance().d("status", "CONTROL_AF_STATE_PASSIVE_SCAN");
focusFocusing();
break;
case CameraMetadata.CONTROL_AF_STATE_FOCUSED_LOCKED:
LogUtil.getInstance().d("status", "CONTROL_AF_STATE_FOCUSED_LOCKED");
// deliberate fall-through: both focused states show the success frame
case CameraMetadata.CONTROL_AF_STATE_PASSIVE_FOCUSED:
LogUtil.getInstance().d("status", "CONTROL_AF_STATE_PASSIVE_FOCUSED");
focusSucceed();
break;
case CameraMetadata.CONTROL_AF_STATE_INACTIVE:
LogUtil.getInstance().d("status", "CONTROL_AF_STATE_INACTIVE");
focusInactive();
break;
case CameraMetadata.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED:
LogUtil.getInstance().d("status", "CONTROL_AF_STATE_NOT_FOCUSED_LOCKED");
// deliberate fall-through: both unfocused states show the failure frame
case CameraMetadata.CONTROL_AF_STATE_PASSIVE_UNFOCUSED:
LogUtil.getInstance().d("status", "CONTROL_AF_STATE_PASSIVE_UNFOCUSED");
focusFailed();
break;
}
}
private void focusFocusing() {
// Measure the focus box
int width = mFocusImage.getWidth();
int height = mFocusImage.getHeight();
// Center the box on the tapped position
ViewGroup.MarginLayoutParams margin = new ViewGroup.MarginLayoutParams(mFocusImage.getLayoutParams());
margin.setMargins(mRawX - width / 2, mRawY - height / 2, margin.rightMargin, margin.bottomMargin);
RelativeLayout.LayoutParams layoutParams = new RelativeLayout.LayoutParams(margin);
mFocusImage.setLayoutParams(layoutParams);
// Show it
if (!mFlagShowFocusImage) {
mFocusImage.startFocusing();
mFlagShowFocusImage = true;
}
}
private void focusSucceed() {
if (mFlagShowFocusImage) {
mFocusImage.focusSuccess();
mFlagShowFocusImage = false;
}
}
private void focusInactive() {
mFocusImage.stopFocus();
mFlagShowFocusImage = false;
}
private void focusFailed() {
if (mFlagShowFocusImage) {
mFocusImage.focusFailed();
mFlagShowFocusImage = false;
}
}
@Override
public void getPosition(MotionEvent event) {
mRawX = (int) event.getRawX();
mRawY = (int) event.getRawY();
}
}
Note:
(1) onCaptureCompleted is invoked whenever an image capture has fully completed and all result metadata is available. Because the request is a repeating one, this method is called continuously, which is what drives the continuous focusing and locking updates.
public class TextureViewTouchEvent implements AutoFitTextureView.MyTextureViewTouchEvent {
private CameraCharacteristics mCameraCharacteristics;
private TextureView mTextureView;
private CaptureRequest.Builder mPreviewBuilder;
private CameraCaptureSession mPreviewSession;
private CaptureRequest mPreviewRequest;
private Handler mBackgroundHandler;
private PreviewSessionCallback mPreviewSessionCallback;
private Size mPreviewSize;
private Integer mSensorOrientation;
public TextureViewTouchEvent(CameraCharacteristics mCameraCharacteristics, TextureView mTextureView, CaptureRequest.Builder mPreviewBuilder,
CameraCaptureSession mPreviewSession, CaptureRequest mPreviewRequest, Handler mBackgroundHandler,
PreviewSessionCallback mPreviewSessionCallback, Size mPreviewSize, Integer mSensorOrientation) {
this.mCameraCharacteristics = mCameraCharacteristics;
this.mTextureView = mTextureView;
this.mPreviewBuilder = mPreviewBuilder;
this.mPreviewSession = mPreviewSession;
this.mPreviewRequest = mPreviewRequest;
this.mBackgroundHandler = mBackgroundHandler;
this.mPreviewSessionCallback = mPreviewSessionCallback;
this.mPreviewSize = mPreviewSize;
this.mSensorOrientation = mSensorOrientation;
}
@Override
public boolean onAreaTouchEvent(MotionEvent event) {
switch (event.getAction()) {
case MotionEvent.ACTION_DOWN:
// Start with coordinates relative to the view
double x = event.getX(), y = event.getY(), tmp;
LogUtil.getInstance().d("shb", "raw: x--->" + x + ",,,y--->" + y);
// If the sensor output is rotated, swap the preview width and height
int realPreviewWidth;
int realPreviewHeight;
if (Camera2Helper.SENSOR_ORIENTATION_DEFAULT_DEGREES == mSensorOrientation || Camera2Helper.SENSOR_ORIENTATION_INVERSE_DEGREES == mSensorOrientation) {
realPreviewWidth = mPreviewSize.getHeight();
realPreviewHeight = mPreviewSize.getWidth();
} else {
realPreviewWidth = mPreviewSize.getWidth();
realPreviewHeight = mPreviewSize.getHeight();
}
// Work out how much the camera image is scaled relative to the view, and the resulting offset
double imgScale = 1.0, verticalOffset = 0, horizontalOffset = 0;
// mTextureView is the preview view
if (realPreviewHeight * mTextureView.getWidth() > realPreviewWidth * mTextureView.getHeight()) {
imgScale = mTextureView.getWidth() * 1.0 / realPreviewWidth;
verticalOffset = (realPreviewHeight - mTextureView.getHeight() / imgScale) / 2;
} else {
imgScale = mTextureView.getHeight() * 1.0 / realPreviewHeight;
horizontalOffset = (realPreviewWidth - mTextureView.getWidth() / imgScale) / 2;
}
// Convert the tap coordinates into coordinates on the camera image
x = x / imgScale + horizontalOffset;
y = y / imgScale + verticalOffset;
if (Camera2Helper.SENSOR_ORIENTATION_DEFAULT_DEGREES == mSensorOrientation) {
tmp = x;
x = y;
y = mPreviewSize.getHeight() - tmp;
} else if (Camera2Helper.SENSOR_ORIENTATION_INVERSE_DEGREES == mSensorOrientation) {
tmp = x;
x = mPreviewSize.getWidth() - y;
y = tmp;
}
// Compute the scale and offset of the captured image relative to the crop region
Rect cropRegion = mPreviewRequest.get(CaptureRequest.SCALER_CROP_REGION);
if (null == cropRegion) {
LogUtil.getInstance().e("TextureViewTouchEvent", "can't get crop region");
Size s = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_PIXEL_ARRAY_SIZE);
if (s == null) {
// Neither the crop region nor the sensor pixel array size is available; give up on this tap.
break;
}
cropRegion = new Rect(0, 0, s.getWidth(), s.getHeight());
}
int cropWidth = cropRegion.width(), cropHeight = cropRegion.height();
if (mPreviewSize.getHeight() * cropWidth > mPreviewSize.getWidth() * cropHeight) {
imgScale = cropHeight * 1.0 / mPreviewSize.getHeight();
verticalOffset = 0;
horizontalOffset = (cropWidth - imgScale * mPreviewSize.getWidth()) / 2;
} else {
imgScale = cropWidth * 1.0 / mPreviewSize.getWidth();
horizontalOffset = 0;
verticalOffset = (cropHeight - imgScale * mPreviewSize.getHeight()) / 2;
}
// Convert the tap area from image coordinates into crop-region (active sensor) coordinates
x = x * imgScale + horizontalOffset + cropRegion.left;
y = y * imgScale + verticalOffset + cropRegion.top;
double tapAreaRatio = 0.03;
Rect rect = new Rect();
rect.left = MathUtils.clamp((int) (x - tapAreaRatio / 2 * cropRegion.width()), 0, cropRegion.width() - 1);
rect.right = MathUtils.clamp((int) (x + tapAreaRatio / 2 * cropRegion.width()), 0, cropRegion.width() - 1);
rect.top = MathUtils.clamp((int) (y - tapAreaRatio / 2 * cropRegion.width()), 0, cropRegion.height() - 1);
rect.bottom = MathUtils.clamp((int) (y + tapAreaRatio / 2 * cropRegion.width()), 0, cropRegion.height() - 1);
mPreviewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_CANCEL);
mPreviewBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_CANCEL);
mPreviewBuilder.set(CaptureRequest.CONTROL_AE_LOCK, Boolean.FALSE);
mPreviewBuilder.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[]{new MeteringRectangle(rect, 999)});
mPreviewBuilder.set(CaptureRequest.CONTROL_AE_REGIONS, new MeteringRectangle[]{new MeteringRectangle(rect, 999)});
mPreviewBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER_START);
mPreviewBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CaptureRequest.CONTROL_AF_TRIGGER_START);
try {
mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), mPreviewSessionCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
LogUtil.getInstance().e("TextureViewTouchEvent", "setRepeatingRequest failed, " + e.getMessage());
}
break;
case MotionEvent.ACTION_UP:
break;
}
return true;
}
}
private Semaphore mCameraOpenCloseLock = new Semaphore(1);
/**
 * Handler on the UI thread.
 */
private Handler mMainHandler = new Handler(Looper.getMainLooper()) {
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
switch (msg.what) {
case FOCUS_DISAPPEAR:
if (msg.obj == null) {
mFocusImage.stopFocus();
break;
}
Integer valueTimes = (Integer) msg.obj;
if (mFocusImage.mTimes == valueTimes) {
mFocusImage.stopFocus();
}
break;
}
}
};
/**
 * An additional thread for running tasks that shouldn't block the UI.
 */
private HandlerThread mBackgroundThread;
/**
 * A {@link Handler} for running tasks on the background thread.
 */
private Handler mBackgroundHandler;
/**
 * Starts a background thread and its {@link Handler}.
 */
private void startBackgroundThread() {
mBackgroundThread = new HandlerThread("CameraBackground");
mBackgroundThread.start();
mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
}
/**
 * Stops the background thread and its {@link Handler}.
 */
private void stopBackgroundThread() {
if (mBackgroundThread != null) {
mBackgroundThread.quitSafely();
try {
mBackgroundThread.join();
mBackgroundThread = null;
mBackgroundHandler = null;
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
/**
 * @param activity            the calling Activity
 * @param mAutoFitTextureView the preview view used for recording
 * @param videoAbsolutePath   absolute path where the video is saved
 */
public Camera2Helper(Activity activity, AutoFitTextureView mAutoFitTextureView, String videoAbsolutePath) {
this.mActivity = activity;
this.mAutoFitTextureView = mAutoFitTextureView;
this.videoAbsolutePath = videoAbsolutePath;
// Thread pool
mExecutorService = Executors.newFixedThreadPool(3);
// [Optional] initialize the focus animation
mScaleFocusAnimation = new ScaleAnimation(2.0f, 1.0f, 2.0f, 1.0f, Animation.RELATIVE_TO_SELF, 0.5f, Animation.RELATIVE_TO_SELF, 0.5f);
mScaleFocusAnimation.setDuration(200);
// Initialize the focus view (step 1)
mFocusImage = activity.findViewById(R.id.videoRecord_focus);
mFocusImage.setVisibility(View.INVISIBLE);
mFocusImage.setmMainHandler(mMainHandler);
mFocusImage.setmAnimation(mScaleFocusAnimation);
// Initialize the focus view's layout params (step 2)
RelativeLayout.LayoutParams layoutParams = new RelativeLayout.LayoutParams(
(int) TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, mActivity.getResources().getDisplayMetrics().widthPixels * 0.03f, activity.getResources().getDisplayMetrics()),
(int) TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP, mActivity.getResources().getDisplayMetrics().widthPixels * 0.03f, activity.getResources().getDisplayMetrics()));
layoutParams.addRule(RelativeLayout.CENTER_IN_PARENT);
mFocusImage.setLayoutParams(layoutParams);
mFocusImage.initFocus();
// Initialize the camera parameters, etc.
initParameter();
}
/**
 * Initialize parameters and acquire the camera open/close lock.
 */
private void initParameter() {
if (!hasPermissionsGranted(VIDEO_PERMISSIONS)) {
LogUtil.getInstance().d(TAG, "required permissions not granted");
return;
}
final Activity activity = mActivity;
if (null == activity || activity.isFinishing()) {
return;
}
mCameraManager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
try {
LogUtil.getInstance().d(TAG, "tryAcquire");
// Acquire the camera lock
if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
throw new RuntimeException("Time out waiting to lock camera opening.");
}
// The rear camera is usually index 0; with multiple cameras, test which index maps to which camera
cameraId = mCameraManager.getCameraIdList()[0];
// Choose the sizes for camera preview and video recording
mCameraCharacteristics = mCameraManager.getCameraCharacteristics(cameraId);
StreamConfigurationMap map = mCameraCharacteristics
.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
// Clockwise angle the sensor output must be rotated to appear upright on the screen: 0, 90, 180 or 270
mSensorOrientation = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
if (map == null) {
throw new RuntimeException("Cannot get available preview/video sizes");
}
// Load the resolution from the config file
mVideoSize = new Size(Integer.valueOf(Config.getInstance().getProperties().getProperty("VideoWidth")), Integer.valueOf(Config.getInstance().getProperties().getProperty("VideoHeight")));
// Keep the preview resolution the same as the recording resolution
mPreviewSize = mVideoSize;
// Alternatively, pick a suitable video size automatically; not every size works for recording, hence the fixed configuration here
// mVideoSize = chooseVideoSize(map.getOutputSizes(SurfaceTexture.class));
// mPreviewSize = chooseOptimalSize(map.getOutputSizes(MediaRecorder.class),screenWidth,screenHeight,mVideoSize);
mOrientation = mActivity.getResources().getConfiguration().orientation;
// Give the AutoFitTextureView an aspect ratio so the preview image is not distorted
if (mOrientation == Configuration.ORIENTATION_LANDSCAPE) {
mAutoFitTextureView.setAspectRatio(mPreviewSize.getWidth(), mPreviewSize.getHeight());
} else {
mAutoFitTextureView.setAspectRatio(mPreviewSize.getHeight(), mPreviewSize.getWidth());
}
mPreviewSessionCallback = new PreviewSessionCallback(mFocusImage, mMainHandler, mAutoFitTextureView);
} catch (CameraAccessException e) {
LogUtil.getInstance().e(activity, "Cannot access the camera.");
activity.finish();
} catch (NullPointerException e) {
new ErrorDialog(mActivity, mActivity.getString(R.string.camera_error)).onCreateDialog().show();
} catch (InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera opening.");
}
}
Notes:
(1) mCameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP) returns the stream configurations this camera device supports, including the minimum frame duration and stall duration for each format/size combination. A sketch of picking sizes from it follows these notes.
(2) The width/height passed to AutoFitTextureView only matter as a ratio; the actual values are irrelevant (the TextureView code was given above).
(3) PreviewSessionCallback is the preview session's capture callback and captures the focus-state events.
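A minimal sketch of choosing the recording size from the StreamConfigurationMap instead of the config file; the helper method name and the 1920-wide 16:9 selection policy are only illustrations, and map.getOutputSizes(SurfaceTexture.class) would give the preview-side candidates in the same way:
// A sketch only: pick a recording size from what the device reports, rather than a config file.
private Size chooseRecordingSize(CameraCharacteristics characteristics) {
    StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    if (map == null) {
        throw new RuntimeException("Cannot get available preview/video sizes");
    }
    Size[] recorderSizes = map.getOutputSizes(MediaRecorder.class); // sizes the MediaRecorder Surface can use
    Size chosen = recorderSizes[0];
    for (Size s : recorderSizes) {
        // Example policy only: the largest 16:9 size no wider than 1920 px.
        if (s.getWidth() <= 1920 && s.getWidth() * 9 == s.getHeight() * 16
                && s.getWidth() * s.getHeight() > chosen.getWidth() * chosen.getHeight()) {
            chosen = s;
        }
    }
    return chosen;
}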
The camera device's state callback.
private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice cameraDevice) {
mCameraDevice = cameraDevice;
if (mMediaRecorder == null) {
mMediaRecorder = new MediaRecorder();
}
// Start the preview, outputting to both the preview Surface and the MediaRecorder Surface
readyToPreview();
// Release the camera open/close lock
mCameraOpenCloseLock.release();
if (null != mAutoFitTextureView) {
configureTransform(mAutoFitTextureView.getWidth(), mAutoFitTextureView.getHeight());
}
}
@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {
mCameraOpenCloseLock.release();
cameraDevice.close();
mCameraDevice = null;
}
@Override
public void onError(@NonNull CameraDevice cameraDevice, int error) {
mCameraOpenCloseLock.release();
cameraDevice.close();
mCameraDevice = null;
Activity activity = mActivity;
if (null != activity) {
activity.finish();
}
}
};
Notes:
(1) When onOpened fires, the camera is open and ready; a CaptureSession can be created to start capturing frames.
(2) On disconnect or error, the camera lock must be released so the camera can be properly released and reopened.
/**
 * Configure the preview request and the MediaRecorder parameters.
 */
private void readyToPreview() {
if (null == mCameraDevice || !mAutoFitTextureView.isAvailable() || null == mPreviewSize) {
return;
}
try {
// Close any previous preview session
closePreviewSession();
// Configure and prepare the MediaRecorder
setupMediaRecorder();
SurfaceTexture surfaceTexture = mAutoFitTextureView.getSurfaceTexture();
assert surfaceTexture != null;
// Set the default buffer size; this is what actually determines the preview resolution
surfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
// Create a builder from the record template; the lines below override specific parameters
mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
mPreviewBuilder.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CameraMetadata.CONTROL_CAPTURE_INTENT_VIDEO_SNAPSHOT);
// Reset the control mode (overridden to AUTO just below)
mPreviewBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_OFF);
// Edge enhancement, high quality
mPreviewBuilder.set(CaptureRequest.EDGE_MODE, CameraMetadata.EDGE_MODE_HIGH_QUALITY);
// 3A -> auto
mPreviewBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// 3A modes
mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE, CameraMetadata.CONTROL_AF_MODE_AUTO);
mPreviewBuilder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_ON);
mPreviewBuilder.set(CaptureRequest.CONTROL_AWB_MODE, CameraMetadata.CONTROL_AWB_MODE_AUTO);
// Surfaces that will receive the camera output
List<Surface> surfaces = new ArrayList<>();
// Set up Surface for the camera preview
Surface previewSurface = new Surface(surfaceTexture);
surfaces.add(previewSurface);
mPreviewBuilder.addTarget(previewSurface);
// Set up Surface for the MediaRecorder
Surface recorderSurface = mMediaRecorder.getSurface();
surfaces.add(recorderSurface);
mPreviewBuilder.addTarget(recorderSurface);
// Start a capture session
// Once the session starts, we can update the UI and start recording
mCameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
Camera2Helper.this.mCameraCaptureSession = cameraCaptureSession;
// Send the preview request; output goes to both Surfaces added above
startPreview();
// Set the touch-to-focus listener: it computes the focus regions and re-sends the preview request
mAutoFitTextureView.setMyTextureViewTouchEvent(new TextureViewTouchEvent(mCameraCharacteristics, mAutoFitTextureView,
mPreviewBuilder, Camera2Helper.this.mCameraCaptureSession, mPreviewRequest, mBackgroundHandler, mPreviewSessionCallback, mPreviewSize, mSensorOrientation));
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Activity activity = mActivity;
if (null != activity) {
LogUtil.getInstance().e(activity, "Failed");
}
}
}, mBackgroundHandler);
} catch (CameraAccessException | IOException e) {
e.printStackTrace();
}
}
/**
 * Close the preview session.
 */
private void closePreviewSession() {
if (mCameraCaptureSession != null) {
mCameraCaptureSession.close();
mCameraCaptureSession = null;
}
}
/**
 * Start the preview.
 */
private void startPreview() {
if (null == mCameraDevice) {
return;
}
try {
// Set the remaining preview parameters (this could live elsewhere; it is a bit scattered here)
setUpCaptureRequestBuilder(mPreviewBuilder);
mPreviewRequest = mPreviewBuilder.build();
// Issue a repeating request to produce the live video preview
mCameraCaptureSession.setRepeatingRequest(mPreviewRequest, mPreviewSessionCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
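setUpCaptureRequestBuilder is referenced above but not shown in this post; a minimal sketch, assuming it only turns on auto 3A control as in Google's Camera2Video sample:
// Assumed implementation (mirrors the Camera2Video sample): just enable auto control mode.
private void setUpCaptureRequestBuilder(CaptureRequest.Builder builder) {
    builder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
}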
The crucial method that configures the MediaRecorder.
private void setupMediaRecorder() throws IOException {
final Activity activity = mActivity;
if (null == activity) {
return;
}
// For a silent recording, simply do not set an audio source
// mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
// Video source
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
// Container format
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
// Output path
mMediaRecorder.setOutputFile(videoAbsolutePathTemp);
mMediaRecorder.setVideoSize(mVideoSize.getWidth(), mVideoSize.getHeight());
mMediaRecorder.setVideoEncodingBitRate(10 * mVideoSize.getWidth() * mVideoSize.getHeight());
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
// I-frame interval (the GOP N value) rather than the actual FPS!!!
mMediaRecorder.setVideoFrameRate(20);
// mMediaRecorder.setMaxDuration(1000 * 10);
mMediaRecorder.setOnErrorListener(new MediaRecorder.OnErrorListener() {
@Override
public void onError(MediaRecorder mr, int what, int extra) {
LogUtil.getInstance().e(TAG, "what:" + what + ", extra:" + extra);
}
});
mMediaRecorder.setOnInfoListener(new MediaRecorder.OnInfoListener() {
@Override
public void onInfo(MediaRecorder mr, int what, int extra) {
LogUtil.getInstance().d(TAG, "what:" + what + ", extra:" + extra);
if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
activity.findViewById(R.id.videoRecord_btn_start).performClick();
}
}
});
// Set the orientation hint
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
switch (mSensorOrientation) {
case SENSOR_ORIENTATION_DEFAULT_DEGREES:
mMediaRecorder.setOrientationHint(DEFAULT_ORIENTATIONS.get(rotation));
break;
case SENSOR_ORIENTATION_INVERSE_DEGREES:
mMediaRecorder.setOrientationHint(INVERSE_ORIENTATIONS.get(rotation));
break;
}
// prepare() is mandatory
mMediaRecorder.prepare();
}
The SurfaceTexture state callbacks of the preview view.
/**
 * The preview view's SurfaceTexture listener.
 */
private TextureView.SurfaceTextureListener mSurfaceTextureListener
= new TextureView.SurfaceTextureListener() {
/**
 * The camera preview can only be opened once the SurfaceTexture is available.
 * @param width  the SurfaceTexture's width
 * @param height the SurfaceTexture's height
 */
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture,
int width, int height) {
openCamera(width, height);
}
/**
 * When the size changes, the TextureView's transform Matrix must be reconfigured.
 * @param width  the SurfaceTexture's width
 * @param height the SurfaceTexture's height
 */
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture,
int width, int height) {
configureTransform(width, height);
}
/**
 * Called when the SurfaceTexture is destroyed.
 */
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
LogUtil.getInstance().d(TAG, "SurfaceTexture destroyed");
return true;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
}
};
private void openCamera(int width, int height) {
final Activity activity = mActivity;
if (null == activity || activity.isFinishing()) {
return;
}
try {
// Configure the scale/rotation transform Matrix
configureTransform(width, height);
// Open the camera
mCameraManager.openCamera(cameraId, mStateCallback, null);
} catch (CameraAccessException e) {
LogUtil.getInstance().e(activity, "Cannot access the camera.");
activity.finish();
} catch (NullPointerException e) {
new ErrorDialog(mActivity, mActivity.getString(R.string.camera_error)).onCreateDialog().show();
}
}
// This method only takes effect once the AutoFitTextureView's size is fixed and the preview size is determined; calling it earlier does nothing!
private void configureTransform(int viewWidth, int viewHeight) {
Activity activity = mActivity;
if (null == mAutoFitTextureView || null == mPreviewSize || null == activity) {
return;
}
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
Matrix matrix = new Matrix();
RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
float centerX = viewRect.centerX();
float centerY = viewRect.centerY();
if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {
bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
float scale = Math.max(
(float) viewHeight / mPreviewSize.getHeight(),
(float) viewWidth / mPreviewSize.getWidth());
matrix.postScale(scale, scale, centerX, centerY);
matrix.postRotate(90 * (rotation - 2), centerX, centerY);
}
mAutoFitTextureView.setTransform(matrix);
}
public void startRecordingVideo() {
mIsRecordingVideo = true;
mExecutorService.submit(new Runnable() {
@Override
public void run() {
// Start recording
mMediaRecorder.start();
}
});
}
Note: recording is started on a thread from the pool so that MediaRecorder.start() does not block the UI thread.
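Step (5), stopping the recording, is not shown in this post; a minimal, hypothetical sketch of what it might look like, reusing the mIsRecordingVideo, mExecutorService and mMediaRecorder fields from above:
// Hypothetical counterpart to startRecordingVideo(); not part of the original post.
public void stopRecordingVideo() {
    mIsRecordingVideo = false;
    mExecutorService.submit(new Runnable() {
        @Override
        public void run() {
            try {
                // Stop and reset so the recorder can be configured again for the next clip.
                mMediaRecorder.stop();
                mMediaRecorder.reset();
            } catch (RuntimeException e) {
                // stop() throws if no valid data was recorded (e.g. stopped immediately after start).
                e.printStackTrace();
            }
        }
    });
}
In practice the preview session usually has to be rebuilt afterwards (for example by calling readyToPreview() again), because the recorder's Surface is no longer valid after stop()/reset().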
/**
 * Close the camera device.
 */
private void closeCamera() {
try {
mCameraOpenCloseLock.acquire();
closePreviewSession();
if (null != mCameraDevice) {
mCameraDevice.close();
mCameraDevice = null;
}
if (null != mMediaRecorder) {
mExecutorService.submit(new Runnable() {
@Override
public void run() {
mMediaRecorder.release();
mMediaRecorder = null;
}
});
}
} catch (InterruptedException e) {
throw new RuntimeException("Interrupted while trying to lock camera closing.");
} finally {
mCameraOpenCloseLock.release();
}
}
/**
 * To be called from the Activity's onResume.
 */
public void onResume() {
startBackgroundThread();
if (mAutoFitTextureView.isAvailable()) {
openCamera(mAutoFitTextureView.getWidth(), mAutoFitTextureView.getHeight());
} else {
mAutoFitTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
}
}
/**
 * To be called from the Activity's onPause.
 */
public void onPause() {
closeCamera();
stopBackgroundThread();
}
/**
 * To be called from the Activity's onDestroy.
 */
public void onDestroy() {
closeCamera();
stopBackgroundThread();
}
(1) Video recording can use Camera2 or Camera1. Camera1 only supports low preview resolutions and is deprecated; Camera2 supports more features and higher resolutions, though it does not hand you the raw frames directly (they can be obtained through an ImageReader).
(2) Recording here is done with MediaRecorder; it could also be done with ImageReader + MediaCodec (hardware encoding), which I skipped as more trouble and seemingly not efficient enough.
(3) One could also record from Surface-buffered data with a hardware encoder and wrap more flexible, fine-grained parameters yourself.
(4) The recording state needs to be tied to the Activity's lifecycle.
(5) For focusing, you need to understand the AF state transitions and the template used to build the preview request (CameraDevice.TEMPLATE_RECORD can trigger continuous focusing, while CameraDevice.TEMPLATE_PREVIEW cannot), as well as mPreviewBuilder.set(CaptureRequest.CONTROL_CAPTURE_INTENT, CameraMetadata.CONTROL_CAPTURE_INTENT_VIDEO_SNAPSHOT) and what CONTROL_CAPTURE_INTENT_VIDEO_SNAPSHOT means (see the comments in the framework source). This interacts with the template defaults; the main difference is the INTENT itself, and I don't know exactly what it configures internally. A sketch for inspecting those defaults follows.
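Since it is not obvious what a template or capture intent pre-fills, a small sketch for logging the defaults a template actually produces on a given device; the method name, log tag, and the choice of keys to dump are only illustrative:
// Log what TEMPLATE_RECORD and TEMPLATE_PREVIEW pre-fill on this device, e.g. called from onOpened().
private void dumpTemplateDefaults(CameraDevice device) throws CameraAccessException {
    for (int template : new int[]{CameraDevice.TEMPLATE_RECORD, CameraDevice.TEMPLATE_PREVIEW}) {
        CaptureRequest.Builder b = device.createCaptureRequest(template);
        Log.d("TemplateDump", "template=" + template
                + " afMode=" + b.get(CaptureRequest.CONTROL_AF_MODE)
                + " captureIntent=" + b.get(CaptureRequest.CONTROL_CAPTURE_INTENT));
    }
}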
Related: Android MediaCodec hardware-encoding implementation
Source code: https://github.com/shen511460468/MediaRecorderOnCamera2