I have already covered the legacy Camera API; since Android 5.0 the Camera2 API has been the recommended replacement, so this series now moves on to Camera2. As before, we start with SurfaceView.
If you are not yet familiar with the Camera2 classes and interfaces, the walkthrough below should help:
Why SurfaceView?
SurfaceView draws on its own dedicated thread, so rendering does not block the main thread, and its internal double buffering keeps the picture smooth. Compared with TextureView, it uses less memory and renders with lower latency, but it does not support animations or screenshots.
We separate the camera from the view: all camera operations are handled by a Camera2Proxy class, and the view simply holds a Camera2Proxy instance. This also makes Camera2Proxy reusable.
Note: to keep this post from getting too long, each module below is illustrated with only a short sketch; the complete sample code for all of them is given together at the end.
The camera is opened with CameraManager's openCamera() method, which needs the cameraId of the camera to open; the CameraDevice object is then delivered in the CameraDevice.StateCallback callback.
Note: when its value is used as a cameraId, CameraCharacteristics.LENS_FACING_FRONT (value 0) usually refers to the back camera, and CameraCharacteristics.LENS_FACING_BACK (value 1) usually refers to the front camera.
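A minimal sketch of this step (assuming the CAMERA permission is already granted, and using the mCameraManager, mCameraId and mBackgroundHandler fields that appear in Camera2Proxy below):
try {
    mCameraManager.openCamera(Integer.toString(mCameraId), new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            mCameraDevice = camera; // keep the device; the capture session is created next
        }
        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            camera.close();
        }
        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            camera.close();
        }
    }, mBackgroundHandler);
} catch (CameraAccessException e) {
    e.printStackTrace();
}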
In the Camera2 API, a camera's fixed, device-level properties are queried through the CameraCharacteristics class, while settings for individual requests (preview, still capture, and so on) are configured through the CaptureRequest class.
We can set the flash mode, focus mode, exposure compensation, preview format and size, picture format and size, and more.
Only when the preview orientation and size are set correctly will the preview be free of stretching and similar distortion.
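A sketch of both halves (cameraId and builder here stand in for the corresponding variables in Camera2Proxy):
CameraCharacteristics cc = mCameraManager.getCameraCharacteristics(cameraId);
StreamConfigurationMap map = cc.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size[] previewSizes = map.getOutputSizes(SurfaceTexture.class); // supported preview sizes
Size[] pictureSizes = map.getOutputSizes(ImageFormat.JPEG);     // supported JPEG sizes
// per-request settings go on the CaptureRequest.Builder:
builder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);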
Preview is implemented by repeatedly sending the preview request with CameraCaptureSession's setRepeatingRequest() method, and stopped with its stopRepeating() method.
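In sketch form (previewSurface, imageReaderSurface and previewRequest are placeholders for the objects prepared in the surrounding code):
mCameraDevice.createCaptureSession(Arrays.asList(previewSurface, imageReaderSurface),
        new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(@NonNull CameraCaptureSession session) {
                try {
                    // send the preview request over and over
                    session.setRepeatingRequest(previewRequest, null, mBackgroundHandler);
                } catch (CameraAccessException e) {
                    e.printStackTrace();
                }
            }
            @Override
            public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                Log.e(TAG, "onConfigureFailed");
            }
        }, mBackgroundHandler);
// later, to pause the preview:
// session.stopRepeating();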
The camera is a heavy consumer of system resources; always release it when you are done.
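Releasing mirrors opening; a minimal sketch (the same sequence appears in releaseCamera() below):
if (mCaptureSession != null) { mCaptureSession.close(); mCaptureSession = null; }
if (mCameraDevice != null) { mCameraDevice.close(); mCameraDevice = null; }
if (mImageReader != null) { mImageReader.close(); mImageReader = null; }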
Put simply, tap-to-focus maps the point the user touches on the view to the corresponding point in the camera's coordinate system, then sets the focus area through the CaptureRequest.CONTROL_AF_REGIONS field on the CaptureRequest.Builder.
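Once the tap has been mapped to a Rect in sensor coordinates (the mapping itself is done step by step in focusOnPoint() below), the request looks roughly like this; builder, session and afCallback stand in for the real objects:
MeteringRectangle focusArea = new MeteringRectangle(rect, MeteringRectangle.METERING_WEIGHT_MAX);
builder.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[]{focusArea});
builder.set(CaptureRequest.CONTROL_AE_REGIONS, new MeteringRectangle[]{focusArea});
builder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
builder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
session.capture(builder.build(), afCallback, mBackgroundHandler); // a single request to trigger AF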
For pinch zoom, we use the view's touch events to track the distance between two fingers, and set the zoom through the CaptureRequest.SCALER_CROP_REGION field on the CaptureRequest.Builder.
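Conceptually, zooming by a factor means cropping a centered region of the sensor's active array. A sketch of that idea (handleZoom() below uses a step-based variant; the 2.0f factor is just an example and must not exceed SCALER_AVAILABLE_MAX_DIGITAL_ZOOM):
Rect active = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
float zoom = 2.0f; // desired zoom factor
int cropW = (int) (active.width() / zoom);
int cropH = (int) (active.height() / zoom);
int dx = (active.width() - cropW) / 2;
int dy = (active.height() - cropH) / 2;
builder.set(CaptureRequest.SCALER_CROP_REGION, new Rect(dx, dy, dx + cropW, dy + cropH));
session.setRepeatingRequest(builder.build(), null, mBackgroundHandler); // resend so it takes effect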
Create an ImageReader as the output target for still capture, build a still-capture CaptureRequest, and send it as a single request with CameraCaptureSession's capture() method.
Remember that preview uses CameraCaptureSession's setRepeatingRequest() to send a repeating request; take care to distinguish the two.
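In sketch form (width, height and listener are placeholders; note that the ImageReader's Surface must already have been included in the output list when the session was created):
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 2);
reader.setOnImageAvailableListener(listener, mBackgroundHandler); // receives the JPEG data
CaptureRequest.Builder captureBuilder =
        mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
session.capture(captureBuilder.build(), null, mBackgroundHandler); // a single request, not repeating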
The code below also uses OrientationEventListener, which has not come up before: it uses the device sensors to obtain the phone's current orientation, which is used to set the JPEG rotation when taking a picture; more on that below.
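Its basic usage looks like this; the stored value later feeds getJpegOrientation() in the code below:
mOrientationEventListener = new OrientationEventListener(activity) {
    @Override
    public void onOrientationChanged(int orientation) {
        // 0..359 degrees, or ORIENTATION_UNKNOWN when the device is lying flat
        mDeviceOrientation = orientation;
    }
};
mOrientationEventListener.enable(); // call disable() when releasing the camera
// when taking a picture:
// captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getJpegOrientation(mDeviceOrientation));
The complete Camera2Proxy code follows: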
package com.afei.camerademo.camera;
import android.annotation.SuppressLint;
import android.annotation.TargetApi;
import android.app.Activity;
import android.content.Context;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.MeteringRectangle;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.ImageReader;
import android.os.Build;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.util.Log;
import android.util.Size;
import android.view.OrientationEventListener;
import android.view.Surface;
import android.view.SurfaceHolder;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
public class Camera2Proxy {
private static final String TAG = "Camera2Proxy";
private Activity mActivity;
private int mCameraId = CameraCharacteristics.LENS_FACING_FRONT; // ID of the camera to open; this constant's value (0) is usually the back camera
private Size mPreviewSize; // preview size
private CameraManager mCameraManager; // camera manager
private CameraCharacteristics mCameraCharacteristics; // camera characteristics
private CameraDevice mCameraDevice; // camera device
private CameraCaptureSession mCaptureSession;
private CaptureRequest.Builder mPreviewRequestBuilder; // builder for the preview capture request
private CaptureRequest mPreviewRequest;
private Handler mBackgroundHandler;
private HandlerThread mBackgroundThread;
private ImageReader mImageReader;
private Surface mPreviewSurface;
private OrientationEventListener mOrientationEventListener;
private int mDisplayRotate = 0;
private int mDeviceOrientation = 0; // device orientation, reported by the OrientationEventListener
private int mZoom = 1; // current zoom step
/**
* Callback for opening the camera
*/
private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice camera) {
Log.d(TAG, "onOpened");
mCameraDevice = camera;
initPreviewRequest();
}
@Override
public void onDisconnected(@NonNull CameraDevice camera) {
Log.d(TAG, "onDisconnected");
releaseCamera();
}
@Override
public void onError(@NonNull CameraDevice camera, int error) {
Log.e(TAG, "Camera Open failed, error: " + error);
releaseCamera();
}
};
@TargetApi(Build.VERSION_CODES.M)
public Camera2Proxy(Activity activity) {
mActivity = activity;
mCameraManager = (CameraManager) mActivity.getSystemService(Context.CAMERA_SERVICE);
mOrientationEventListener = new OrientationEventListener(mActivity) {
@Override
public void onOrientationChanged(int orientation) {
mDeviceOrientation = orientation;
}
};
}
@SuppressLint("MissingPermission")
public void openCamera(int width, int height) {
Log.v(TAG, "openCamera");
startBackgroundThread(); // paired with stopBackgroundThread() in releaseCamera()
mOrientationEventListener.enable();
try {
mCameraCharacteristics = mCameraManager.getCameraCharacteristics(Integer.toString(mCameraId));
StreamConfigurationMap map = mCameraCharacteristics.get(CameraCharacteristics
.SCALER_STREAM_CONFIGURATION_MAP);
// picture size: pick the largest supported JPEG size
Size largest = Collections.max(Arrays.asList(map.getOutputSizes(ImageFormat.JPEG)), new
CompareSizesByArea());
Log.d(TAG, "picture size: " + largest.getWidth() + "*" + largest.getHeight());
mImageReader = ImageReader.newInstance(largest.getWidth(), largest.getHeight(), ImageFormat.JPEG, 2);
// preview size: based on the aspect ratio of the picture size chosen above, pick a size close to the view's dimensions
mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class), width, height, largest);
Log.d(TAG, "preview size: " + mPreviewSize.getWidth() + "*" + mPreviewSize.getHeight());
// open the camera
mCameraManager.openCamera(Integer.toString(mCameraId), mStateCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
public void releaseCamera() {
Log.v(TAG, "releaseCamera");
if (null != mCaptureSession) {
mCaptureSession.close();
mCaptureSession = null;
}
if (mCameraDevice != null) {
mCameraDevice.close();
mCameraDevice = null;
}
if (mImageReader != null) {
mImageReader.close();
mImageReader = null;
}
mOrientationEventListener.disable();
stopBackgroundThread(); // paired with startBackgroundThread() in openCamera()
}
public void setImageAvailableListener(ImageReader.OnImageAvailableListener onImageAvailableListener) {
if (mImageReader == null) {
Log.w(TAG, "setImageAvailableListener: mImageReader is null");
return;
}
mImageReader.setOnImageAvailableListener(onImageAvailableListener, null);
}
public void setPreviewSurface(SurfaceHolder holder) {
mPreviewSurface = holder.getSurface();
}
public void setPreviewSurface(SurfaceTexture surfaceTexture) {
surfaceTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
mPreviewSurface = new Surface(surfaceTexture);
}
private void initPreviewRequest() {
try {
mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(mPreviewSurface); // the Surface that receives the preview output
mCameraDevice.createCaptureSession(Arrays.asList(mPreviewSurface, mImageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
mCaptureSession = session;
// continuous auto focus
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest
.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
// auto exposure with auto flash
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest
.CONTROL_AE_MODE_ON_AUTO_FLASH);
// start the preview automatically once configured
mPreviewRequest = mPreviewRequestBuilder.build();
startPreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession session) {
Log.e(TAG, "ConfigureFailed. session: mCaptureSession");
}
}, mBackgroundHandler); // handle 传入 null 表示使用当前线程的 Looper
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
public void startPreview() {
if (mCaptureSession == null || mPreviewRequestBuilder == null) {
Log.w(TAG, "startPreview: mCaptureSession or mPreviewRequestBuilder is null");
return;
}
try {
// start the preview, i.e. keep sending the preview request
mCaptureSession.setRepeatingRequest(mPreviewRequest, null, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
public void stopPreview() {
if (mCaptureSession == null || mPreviewRequestBuilder == null) {
Log.w(TAG, "stopPreview: mCaptureSession or mPreviewRequestBuilder is null");
return;
}
try {
mCaptureSession.stopRepeating();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
public void captureStillPicture() {
try {
CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice
.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getJpegOrientation(mDeviceOrientation));
// if the preview is zoomed in, the still capture should keep the same zoom
Rect zoomRect = mPreviewRequestBuilder.get(CaptureRequest.SCALER_CROP_REGION);
if (zoomRect != null) {
captureBuilder.set(CaptureRequest.SCALER_CROP_REGION, zoomRect);
}
mCaptureSession.stopRepeating();
mCaptureSession.abortCaptures();
final long time = System.currentTimeMillis();
mCaptureSession.capture(captureBuilder.build(), new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
Log.w(TAG, "onCaptureCompleted, time: " + (System.currentTimeMillis() - time));
try {
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata
.CONTROL_AF_TRIGGER_CANCEL);
mCaptureSession.capture(mPreviewRequestBuilder.build(), null, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
startPreview();
}
}, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private int getJpegOrientation(int deviceOrientation) {
if (deviceOrientation == android.view.OrientationEventListener.ORIENTATION_UNKNOWN) return 0;
int sensorOrientation = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
// Round device orientation to a multiple of 90
deviceOrientation = (deviceOrientation + 45) / 90 * 90;
// Reverse device orientation for front-facing cameras
boolean facingFront = mCameraCharacteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics
.LENS_FACING_FRONT;
if (facingFront) deviceOrientation = -deviceOrientation;
// Calculate desired JPEG orientation relative to camera orientation to make
// the image upright relative to the device orientation
int jpegOrientation = (sensorOrientation + deviceOrientation + 360) % 360;
return jpegOrientation;
}
public boolean isFrontCamera() {
return mCameraId == CameraCharacteristics.LENS_FACING_BACK; // camera ID 1 is usually the front camera
}
public Size getPreviewSize() {
return mPreviewSize;
}
public void switchCamera(int width, int height) {
mCameraId ^= 1; // toggle between camera IDs 0 and 1
releaseCamera();
openCamera(width, height);
}
private Size chooseOptimalSize(Size[] sizes, int viewWidth, int viewHeight, Size pictureSize) {
int totalRotation = getRotation();
boolean swapRotation = totalRotation == 90 || totalRotation == 270;
int width = swapRotation ? viewHeight : viewWidth;
int height = swapRotation ? viewWidth : viewHeight;
return getSuitableSize(sizes, width, height, pictureSize);
}
private int getRotation() {
int displayRotation = mActivity.getWindowManager().getDefaultDisplay().getRotation();
switch (displayRotation) {
case Surface.ROTATION_0:
displayRotation = 90;
break;
case Surface.ROTATION_90:
displayRotation = 0;
break;
case Surface.ROTATION_180:
displayRotation = 270;
break;
case Surface.ROTATION_270:
displayRotation = 180;
break;
}
int sensorOrientation = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
mDisplayRotate = (displayRotation + sensorOrientation + 270) % 360;
return mDisplayRotate;
}
private Size getSuitableSize(Size[] sizes, int width, int height, Size pictureSize) {
int minDelta = Integer.MAX_VALUE; // smallest difference so far; start large so it gets overwritten in the loop
int index = 0; // index of the size with the smallest difference
float aspectRatio = pictureSize.getHeight() * 1.0f / pictureSize.getWidth();
Log.d(TAG, "getSuitableSize. aspectRatio: " + aspectRatio);
for (int i = 0; i < sizes.length; i++) {
Size size = sizes[i];
// first check whether the aspect ratio matches
if (size.getWidth() * aspectRatio == size.getHeight()) {
int delta = Math.abs(width - size.getWidth());
if (delta == 0) {
return size;
}
if (minDelta > delta) {
minDelta = delta;
index = i;
}
}
}
return sizes[index];
}
public void handleZoom(boolean isZoomIn) {
if (mCameraDevice == null || mCameraCharacteristics == null || mPreviewRequestBuilder == null) {
return;
}
int maxZoom = mCameraCharacteristics.get(CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM).intValue() * 10; // zoom in steps: 10 steps per 1x of digital zoom
Rect rect = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
if (isZoomIn && mZoom < maxZoom) {
mZoom++;
} else if (mZoom > 1) {
mZoom--;
}
int minW = rect.width() / maxZoom;
int minH = rect.height() / maxZoom;
int difW = rect.width() - minW;
int difH = rect.height() - minH;
int cropW = difW * mZoom / 100;
int cropH = difH * mZoom / 100;
cropW -= cropW & 3;
cropH -= cropH & 3;
Rect zoomRect = new Rect(cropW, cropH, rect.width() - cropW, rect.height() - cropH);
mPreviewRequestBuilder.set(CaptureRequest.SCALER_CROP_REGION, zoomRect);
mPreviewRequest = mPreviewRequestBuilder.build();
startPreview(); // the request must be re-sent for the zoom to take effect
}
public void focusOnPoint(double x, double y, int width, int height) {
if (mCameraDevice == null || mPreviewRequestBuilder == null) {
return;
}
// 1. take the coordinates relative to the view
int previewWidth = mPreviewSize.getWidth();
int previewHeight = mPreviewSize.getHeight();
if (mDisplayRotate == 90 || mDisplayRotate == 270) {
previewWidth = mPreviewSize.getHeight();
previewHeight = mPreviewSize.getWidth();
}
// 2. compute how much the camera image is scaled relative to the view, and the offset
double tmp;
double imgScale;
double verticalOffset = 0;
double horizontalOffset = 0;
if (previewHeight * width > previewWidth * height) {
imgScale = width * 1.0 / previewWidth;
verticalOffset = (previewHeight - height / imgScale) / 2;
} else {
imgScale = height * 1.0 / previewHeight;
horizontalOffset = (previewWidth - width / imgScale) / 2;
}
// 3. convert the tapped coordinates into image coordinates
x = x / imgScale + horizontalOffset;
y = y / imgScale + verticalOffset;
if (90 == mDisplayRotate) {
tmp = x;
x = y;
y = mPreviewSize.getHeight() - tmp;
} else if (270 == mDisplayRotate) {
tmp = x;
x = mPreviewSize.getWidth() - y;
y = tmp;
}
// 4. compute the scale factor and offset of the image relative to the crop region
Rect cropRegion = mPreviewRequestBuilder.get(CaptureRequest.SCALER_CROP_REGION);
if (cropRegion == null) {
Log.w(TAG, "can't get crop region");
cropRegion = mCameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
}
int cropWidth = cropRegion.width();
int cropHeight = cropRegion.height();
if (mPreviewSize.getHeight() * cropWidth > mPreviewSize.getWidth() * cropHeight) {
imgScale = cropHeight * 1.0 / mPreviewSize.getHeight();
verticalOffset = 0;
horizontalOffset = (cropWidth - imgScale * mPreviewSize.getWidth()) / 2;
} else {
imgScale = cropWidth * 1.0 / mPreviewSize.getWidth();
horizontalOffset = 0;
verticalOffset = (cropHeight - imgScale * mPreviewSize.getHeight()) / 2;
}
// 5. convert the tapped point from image coordinates to active-array (sensor) coordinates
x = x * imgScale + horizontalOffset + cropRegion.left;
y = y * imgScale + verticalOffset + cropRegion.top;
double tapAreaRatio = 0.1;
Rect rect = new Rect();
rect.left = clamp((int) (x - tapAreaRatio / 2 * cropRegion.width()), 0, cropRegion.width());
rect.right = clamp((int) (x + tapAreaRatio / 2 * cropRegion.width()), 0, cropRegion.width());
rect.top = clamp((int) (y - tapAreaRatio / 2 * cropRegion.height()), 0, cropRegion.height());
rect.bottom = clamp((int) (y + tapAreaRatio / 2 * cropRegion.height()), 0, cropRegion.height());
// 6. set the AF/AE metering regions to the rect computed above
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_REGIONS, new MeteringRectangle[]{new MeteringRectangle
(rect, 1000)});
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_REGIONS, new MeteringRectangle[]{new MeteringRectangle
(rect, 1000)});
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_START);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_PRECAPTURE_TRIGGER, CameraMetadata
.CONTROL_AE_PRECAPTURE_TRIGGER_START);
try {
// 7. send the focus request and listen for the result
mCaptureSession.capture(mPreviewRequestBuilder.build(), mAfCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private final CameraCaptureSession.CaptureCallback mAfCaptureCallback = new CameraCaptureSession.CaptureCallback() {
private void process(CaptureResult result) {
Integer state = result.get(CaptureResult.CONTROL_AF_STATE);
if (null == state) {
return;
}
if (state == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED || state == CaptureResult
.CONTROL_AF_STATE_NOT_FOCUSED_LOCKED) {
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER, CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest
.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH); // restore auto exposure with auto flash
startPreview();
}
}
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull CaptureResult partialResult) {
process(partialResult);
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
process(result);
}
};
private void startBackgroundThread() {
if (mBackgroundThread == null || mBackgroundHandler == null) {
mBackgroundThread = new HandlerThread("CameraBackground");
mBackgroundThread.start();
mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
}
}
private void stopBackgroundThread() {
mBackgroundThread.quitSafely();
try {
mBackgroundThread.join();
mBackgroundThread = null;
mBackgroundHandler = null;
} catch (InterruptedException e) {
e.printStackTrace();
}
}
private int clamp(int x, int min, int max) {
if (x > max) return max;
if (x < min) return min;
return x;
}
/**
* Compares two {@code Size}s based on their areas.
*/
static class CompareSizesByArea implements Comparator<Size> {
@Override
public int compare(Size lhs, Size rhs) {
// We cast here to ensure the multiplications won't overflow
return Long.signum((long) lhs.getWidth() * lhs.getHeight() -
(long) rhs.getWidth() * rhs.getHeight());
}
}
}
With the walkthrough above you should have a fair grasp of the camera operations, so next we finish the View side. Camera2SurfaceView extends SurfaceView. Overriding onMeasure lets Camera2SurfaceView match its width and height to the camera preview size, so the picture does not look stretched. Camera2SurfaceView also opens and closes the camera itself, which, thankfully, the Camera2Proxy above makes easy: the camera is opened and released in the SurfaceHolder.Callback methods. Finally, overriding onTouchEvent implements tap-to-focus and pinch-to-zoom.
package com.afei.camerademo.surfaceview;
import android.app.Activity;
import android.content.Context;
import android.util.AttributeSet;
import android.util.Log;
import android.view.MotionEvent;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import com.afei.camerademo.camera.Camera2Proxy;
public class Camera2SurfaceView extends SurfaceView {
private static final String TAG = "Camera2SurfaceView";
private Camera2Proxy mCameraProxy;
private int mRatioWidth = 0;
private int mRatioHeight = 0;
private float mOldDistance;
public Camera2SurfaceView(Context context) {
this(context, null);
}
public Camera2SurfaceView(Context context, AttributeSet attrs) {
this(context, attrs, 0);
}
public Camera2SurfaceView(Context context, AttributeSet attrs, int defStyleAttr) {
this(context, attrs, defStyleAttr, 0);
}
public Camera2SurfaceView(Context context, AttributeSet attrs, int defStyleAttr, int defStyleRes) {
super(context, attrs, defStyleAttr, defStyleRes);
init(context);
}
private void init(Context context) {
getHolder().addCallback(mSurfaceHolderCallback);
mCameraProxy = new Camera2Proxy((Activity) context);
}
private final SurfaceHolder.Callback mSurfaceHolderCallback = new SurfaceHolder.Callback() {
@Override
public void surfaceCreated(SurfaceHolder holder) {
mCameraProxy.setPreviewSurface(holder);
mCameraProxy.openCamera(getWidth(), getHeight());
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
Log.d(TAG, "surfaceChanged: width: " + width + ", height: " + height);
int previewWidth = mCameraProxy.getPreviewSize().getWidth();
int previewHeight = mCameraProxy.getPreviewSize().getHeight();
if (width > height) {
setAspectRatio(previewWidth, previewHeight);
} else {
setAspectRatio(previewHeight, previewWidth);
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
mCameraProxy.releaseCamera();
}
};
public void setAspectRatio(int width, int height) {
if (width < 0 || height < 0) {
throw new IllegalArgumentException("Size cannot be negative.");
}
mRatioWidth = width;
mRatioHeight = height;
requestLayout();
}
public Camera2Proxy getCameraProxy() {
return mCameraProxy;
}
@Override
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
super.onMeasure(widthMeasureSpec, heightMeasureSpec);
int width = MeasureSpec.getSize(widthMeasureSpec);
int height = MeasureSpec.getSize(heightMeasureSpec);
if (0 == mRatioWidth || 0 == mRatioHeight) {
setMeasuredDimension(width, height);
} else {
if (width < height * mRatioWidth / mRatioHeight) {
setMeasuredDimension(width, width * mRatioHeight / mRatioWidth);
} else {
setMeasuredDimension(height * mRatioWidth / mRatioHeight, height);
}
}
}
@Override
public boolean onTouchEvent(MotionEvent event) {
if (event.getPointerCount() == 1) {
mCameraProxy.focusOnPoint(event.getX(), event.getY(), getWidth(), getHeight());
return true;
}
switch (event.getAction() & MotionEvent.ACTION_MASK) {
case MotionEvent.ACTION_POINTER_DOWN:
mOldDistance = getFingerSpacing(event);
break;
case MotionEvent.ACTION_MOVE:
float newDistance = getFingerSpacing(event);
if (newDistance > mOldDistance) {
mCameraProxy.handleZoom(true);
} else if (newDistance < mOldDistance) {
mCameraProxy.handleZoom(false);
}
mOldDistance = newDistance;
break;
default:
break;
}
return super.onTouchEvent(event);
}
private static float getFingerSpacing(MotionEvent event) {
float x = event.getX(0) - event.getX(1);
float y = event.getY(0) - event.getY(1);
return (float) Math.sqrt(x * x + y * y);
}
}
Next, we simply use the finished Camera2SurfaceView in an Activity or Fragment.
Note that before using the camera you must declare the relevant permissions, and also request them dynamically at runtime.
The camera-related permissions are shown below. Requesting permissions at runtime takes quite a bit of code and is not covered in detail here; if you are unsure about it, see this post: Android动态权限申请
<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera"/>
<uses-feature android:name="android.hardware.camera.autofocus"/>
Note that the front camera output is horizontally mirrored, so for the front camera we need to mirror the image back manually.
Below is the complete SurfaceCamera2Activity code:
package com.afei.camerademo.surfaceview;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.media.Image;
import android.media.ImageReader;
import android.os.AsyncTask;
import android.os.Bundle;
import android.provider.MediaStore;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.View;
import android.widget.ImageView;
import com.afei.camerademo.ImageUtils;
import com.afei.camerademo.R;
import com.afei.camerademo.camera.Camera2Proxy;
import java.nio.ByteBuffer;
public class SurfaceCamera2Activity extends AppCompatActivity implements View.OnClickListener {
private static final String TAG = "SurfaceCamera2Activity";
private ImageView mCloseIv;
private ImageView mSwitchCameraIv;
private ImageView mTakePictureIv;
private ImageView mPictureIv;
private Camera2SurfaceView mCameraView;
private Camera2Proxy mCameraProxy;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_surface_camera2);
initView();
}
private void initView() {
mCloseIv = findViewById(R.id.toolbar_close_iv);
mCloseIv.setOnClickListener(this);
mSwitchCameraIv = findViewById(R.id.toolbar_switch_iv);
mSwitchCameraIv.setOnClickListener(this);
mTakePictureIv = findViewById(R.id.take_picture_iv);
mTakePictureIv.setOnClickListener(this);
mPictureIv = findViewById(R.id.picture_iv);
mPictureIv.setOnClickListener(this);
mPictureIv.setImageBitmap(ImageUtils.getLatestThumbBitmap());
mCameraView = findViewById(R.id.camera_view);
mCameraProxy = mCameraView.getCameraProxy();
}
@Override
public void onClick(View v) {
switch (v.getId()) {
case R.id.toolbar_close_iv:
finish();
break;
case R.id.toolbar_switch_iv:
mCameraProxy.switchCamera(mCameraView.getWidth(), mCameraView.getHeight());
mCameraProxy.startPreview();
break;
case R.id.take_picture_iv:
mCameraProxy.setImageAvailableListener(mOnImageAvailableListener);
mCameraProxy.captureStillPicture(); // take a picture
break;
case R.id.picture_iv:
Intent intent = new Intent(Intent.ACTION_PICK, MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivity(intent);
break;
}
}
private ImageReader.OnImageAvailableListener mOnImageAvailableListener =
new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
new ImageSaveTask().execute(reader.acquireNextImage()); // save the picture
}
};
private class ImageSaveTask extends AsyncTask<Image, Void, Void> {
@Override
protected Void doInBackground(Image... images) {
ByteBuffer buffer = images[0].getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
long time = System.currentTimeMillis();
if (mCameraProxy.isFrontCamera()) {
Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
Log.d(TAG, "BitmapFactory.decodeByteArray time: " + (System.currentTimeMillis() - time));
time = System.currentTimeMillis();
// the front camera output needs horizontal mirroring
Bitmap rotateBitmap = ImageUtils.rotateBitmap(bitmap, 0, true, true);
Log.d(TAG, "rotateBitmap time: " + (System.currentTimeMillis() - time));
time = System.currentTimeMillis();
ImageUtils.saveBitmap(rotateBitmap);
Log.d(TAG, "saveBitmap time: " + (System.currentTimeMillis() - time));
rotateBitmap.recycle();
} else {
ImageUtils.saveImage(bytes);
Log.d(TAG, "saveBitmap time: " + (System.currentTimeMillis() - time));
}
images[0].close();
return null;
}
@Override
protected void onPostExecute(Void aVoid) {
mPictureIv.setImageBitmap(ImageUtils.getLatestThumbBitmap());
}
}
}
The ImageUtils code is attached:
package com.afei.camerademo;
import android.content.ContentResolver;
import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.Matrix;
import android.os.Environment;
import android.provider.MediaStore;
import android.util.Log;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
public class ImageUtils {
private static final String TAG = "ImageUtils";
private static final String GALLERY_PATH = Environment.getExternalStoragePublicDirectory(Environment
.DIRECTORY_DCIM) + File.separator + "Camera";
private static final SimpleDateFormat DATE_FORMAT = new SimpleDateFormat("yyyyMMdd_HHmmss");
public static Bitmap rotateBitmap(Bitmap source, int degree, boolean flipHorizontal, boolean recycle) {
if (degree == 0) {
return source;
}
Matrix matrix = new Matrix();
matrix.postRotate(degree);
if (flipHorizontal) {
matrix.postScale(-1, 1); // the front camera output is horizontally mirrored; flip it back when needed
}
Bitmap rotateBitmap = Bitmap.createBitmap(source, 0, 0, source.getWidth(), source.getHeight(), matrix, false);
if (recycle) {
source.recycle();
}
return rotateBitmap;
}
public static void saveBitmap(Bitmap bitmap) {
String fileName = DATE_FORMAT.format(new Date(System.currentTimeMillis())) + ".jpg";
File outFile = new File(GALLERY_PATH, fileName);
Log.d(TAG, "saveImage. filepath: " + outFile.getAbsolutePath());
FileOutputStream os = null;
try {
os = new FileOutputStream(outFile);
boolean success = bitmap.compress(Bitmap.CompressFormat.JPEG, 100, os);
if (success) {
insertToDB(outFile.getAbsolutePath());
}
} catch (IOException e) {
e.printStackTrace();
} finally {
if (os != null) {
try {
os.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
public static void insertToDB(String picturePath) {
ContentValues values = new ContentValues();
ContentResolver resolver = MyApp.getInstance().getContentResolver();
values.put(MediaStore.Images.ImageColumns.DATA, picturePath);
values.put(MediaStore.Images.ImageColumns.TITLE, picturePath.substring(picturePath.lastIndexOf("/") + 1));
values.put(MediaStore.Images.ImageColumns.DATE_TAKEN, System.currentTimeMillis());
values.put(MediaStore.Images.ImageColumns.MIME_TYPE, "image/jpeg");
resolver.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
}
}
Any code not shown in this post can be found at:
https://github.com/afei-cn/CameraDemo/tree/master/app/src/main/java/com/afei/camerademo/surfaceview
Related posts:
Custom Camera series: SurfaceView + Camera
Custom Camera series: TextureView + Camera
Custom Camera series: GLSurfaceView + Camera