Our company recently needed to add QR-code scanning to its app for outdoor field work: reading QR codes printed on A4 paper and pasted on shipping containers at the docks. Normally, once an app can scan a code at all, integration is considered done. After we shipped, however, users reported that outdoors many codes either could not be recognized or were recognized very slowly, and our own testing confirmed it.
We summarized the main causes of the failed recognitions into a few points.
Roughly those are the causes, yet WeChat, whose scanner is based on QBar (an optimized layer on top of ZXing), reads codes in all of those situations quickly; the Alipay and DingTalk scanners, built on the libqrencode library, read them too; and so does iOS using the system scanner. So why can't our Android app?
The boss didn't care about any of that. His take was simply: theirs works, yours doesn't, so it's your problem.
We tried plenty of third-party scanners from the web, including ones with thousands of stars, and none of them met the requirements above; at that point we honestly didn't know what to do.
The only one that came close was https://github.com/vondear/RxTool, but it still failed on slightly damaged codes; the hugely popular one by 一片枫叶 didn't even come close under these conditions.
There is also the library recommended by 郭林: https://github.com/al4fun/SimpleScanner. Despite the recommendation, it performed noticeably worse than the one above. The recommendation post is here: https://mp.weixin.qq.com/s/aPqSK1FlsPiENzSE48BVUA
With nothing suitable to be found online, we ended up modifying the scanner ourselves. The current version can read everything except badly damaged codes; it is sometimes a bit slow and needs an extra focus pass, so it is only marginally better than the libraries above.
The code follows. (Many of these classes are modified from the originals: the class names are the same, but some of the methods have changed!)
build.gradle
dependencies{
api fileTree(include: ['*.jar'], dir: 'libs')
api files('libs/core-3.3.0.jar')
// provided 'com.android.support:appcompat-v7:26.1.0'
compileOnly 'com.android.support:design:26.1.0'
compileOnly 'com.android.support:support-vector-drawable:26.1.0'
}
The file structure is as follows:
ZxingConfig.java
public class ZxingConfig implements Serializable {
/* whether to play a beep sound */
private boolean isPlayBeep = true;
/* whether to vibrate */
private boolean isShake = false;
/* whether to show the bottom function bar */
private boolean isShowbottomLayout = true;
/* whether to show the flashlight button */
private boolean isShowFlashLight = true;
/* whether to show the album (gallery) button */
private boolean isShowAlbum = true;
public boolean isPlayBeep() {
return isPlayBeep;
}
public void setPlayBeep(boolean playBeep) {
isPlayBeep = playBeep;
}
public boolean isShake() {
return isShake;
}
public void setShake(boolean shake) {
isShake = shake;
}
public boolean isShowbottomLayout() {
return isShowbottomLayout;
}
public void setShowbottomLayout(boolean showbottomLayout) {
isShowbottomLayout = showbottomLayout;
}
public boolean isShowFlashLight() {
return isShowFlashLight;
}
public void setShowFlashLight(boolean showFlashLight) {
isShowFlashLight = showFlashLight;
}
public boolean isShowAlbum() {
return isShowAlbum;
}
public void setShowAlbum(boolean showAlbum) {
isShowAlbum = showAlbum;
}
}
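For context, this is roughly how a caller would build a ZxingConfig and hand it to the scan activity through the intent extra defined later in Constant.java. The startScan helper and the REQUEST_SCAN request code are just illustrative names chosen for this sketch, not part of the code below.
private static final int REQUEST_SCAN = 1001; // hypothetical request code chosen by the caller

private void startScan(Activity activity) {
    ZxingConfig config = new ZxingConfig();
    config.setPlayBeep(true);        // beep on a successful scan
    config.setShake(false);          // no vibration
    config.setShowFlashLight(true);  // show the torch button
    config.setShowAlbum(true);       // allow picking an image from the gallery

    Intent intent = new Intent(activity, CaptureActivity.class);
    intent.putExtra(Constant.INTENT_ZXING_CONFIG, config); // ZxingConfig is Serializable
    activity.startActivityForResult(intent, REQUEST_SCAN);
}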
CameraFacing.java
public enum CameraFacing {
BACK, // must be value 0!
FRONT, // must be value 1!
}
OpenCamera.java
public final class OpenCamera {
private final int index;
private final Camera camera;
private final CameraFacing facing;
private final int orientation;
public OpenCamera(int index, Camera camera, CameraFacing facing, int orientation) {
this.index = index;
this.camera = camera;
this.facing = facing;
this.orientation = orientation;
}
public Camera getCamera() {
return camera;
}
public CameraFacing getFacing() {
return facing;
}
public int getOrientation() {
return orientation;
}
@Override
public String toString() {
return "Camera #" + index + " : " + facing + ',' + orientation;
}
}
OpenCameraInterface.java
public final class OpenCameraInterface {
private static final String TAG = OpenCameraInterface.class.getName();
private OpenCameraInterface() {
}
/**
* For {@link #open(int)}, means no preference for which camera to open.
*/
public static final int NO_REQUESTED_CAMERA = -1;
/**
* Opens the requested camera with {@link Camera#open(int)}, if one exists.
*
* @param cameraId camera ID of the camera to use. A negative value
* or {@link #NO_REQUESTED_CAMERA} means "no preference", in which case a rear-facing
* camera is returned if possible or else any camera
* @return handle to {@link OpenCamera} that was opened
*/
public static OpenCamera open(int cameraId) {
int numCameras = Camera.getNumberOfCameras();
if (numCameras == 0) {
Log.w(TAG, "No cameras!");
return null;
}
boolean explicitRequest = cameraId >= 0;
Camera.CameraInfo selectedCameraInfo = null;
int index;
if (explicitRequest) {
index = cameraId;
selectedCameraInfo = new Camera.CameraInfo();
Camera.getCameraInfo(index, selectedCameraInfo);
} else {
index = 0;
while (index < numCameras) {
Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
Camera.getCameraInfo(index, cameraInfo);
CameraFacing reportedFacing = CameraFacing.values()[cameraInfo.facing];
if (reportedFacing == CameraFacing.BACK) {
selectedCameraInfo = cameraInfo;
break;
}
index++;
}
}
Camera camera;
if (index < numCameras) {
Log.i(TAG, "Opening camera #" + index);
camera = Camera.open(index);
} else {
if (explicitRequest) {
Log.w(TAG, "Requested camera does not exist: " + cameraId);
camera = null;
} else {
Log.i(TAG, "No camera facing " + CameraFacing.BACK + "; returning camera #0");
camera = Camera.open(0);
selectedCameraInfo = new Camera.CameraInfo();
Camera.getCameraInfo(0, selectedCameraInfo);
}
}
if (camera == null) {
return null;
}
return new OpenCamera(index,
camera,
CameraFacing.values()[selectedCameraInfo.facing],
selectedCameraInfo.orientation);
}
}
AutoFocusManager.java
final class AutoFocusManager implements Camera.AutoFocusCallback {
private static final String TAG = AutoFocusManager.class.getSimpleName();
private static final long AUTO_FOCUS_INTERVAL_MS = 2000L;
private static final Collection<String> FOCUS_MODES_CALLING_AF;
static {
FOCUS_MODES_CALLING_AF = new ArrayList<>(2);
FOCUS_MODES_CALLING_AF.add(Camera.Parameters.FOCUS_MODE_AUTO);
FOCUS_MODES_CALLING_AF.add(Camera.Parameters.FOCUS_MODE_MACRO);
}
private boolean stopped;
private boolean focusing;
private final boolean useAutoFocus;
private final Camera camera;
private AsyncTask<?, ?, ?> outstandingTask;
AutoFocusManager(Context context, Camera camera) {
this.camera = camera;
SharedPreferences sharedPrefs = PreferenceManager.getDefaultSharedPreferences(context);
String currentFocusMode = camera.getParameters().getFocusMode();
useAutoFocus =
sharedPrefs.getBoolean(QRConstants.KEY_AUTO_FOCUS, true) &&
FOCUS_MODES_CALLING_AF.contains(currentFocusMode);
Log.i(TAG, "Current focus mode '" + currentFocusMode + "'; use auto focus? " + useAutoFocus);
start();
}
@Override
public synchronized void onAutoFocus(boolean success, Camera theCamera) {
focusing = false;
autoFocusAgainLater();
}
private synchronized void autoFocusAgainLater() {
if (!stopped && outstandingTask == null) {
AutoFocusTask newTask = new AutoFocusTask();
try {
newTask.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR);
outstandingTask = newTask;
} catch (RejectedExecutionException ree) {
Log.w(TAG, "Could not request auto focus", ree);
}
}
}
synchronized void start() {
// if (useAutoFocus) {
outstandingTask = null;
if (!stopped && !focusing) {
try {
camera.autoFocus(this);
focusing = true;
} catch (RuntimeException re) {
// Have heard RuntimeException reported in Android 4.0.x+; continue?
Log.w(TAG, "Unexpected exception while focusing", re);
// Try again later to keep cycle going
autoFocusAgainLater();
}
}
// }
}
private synchronized void cancelOutstandingTask() {
if (outstandingTask != null) {
if (outstandingTask.getStatus() != AsyncTask.Status.FINISHED) {
outstandingTask.cancel(true);
}
outstandingTask = null;
}
}
synchronized void stop() {
stopped = true;
// if (useAutoFocus) {
cancelOutstandingTask();
// Doesn't hurt to call this even if not focusing
try {
camera.cancelAutoFocus();
} catch (RuntimeException re) {
// Have heard RuntimeException reported in Android 4.0.x+; continue?
Log.w(TAG, "Unexpected exception while cancelling focusing", re);
}
// }
}
private final class AutoFocusTask extends AsyncTask<Object, Object, Object> {
    @Override
    protected Object doInBackground(Object... voids) {
        try {
            // Wait before requesting the next auto-focus cycle.
            Thread.sleep(AUTO_FOCUS_INTERVAL_MS);
        } catch (InterruptedException e) {
            // continue
        }
        start();
        return null;
    }
}
}
CameraConfigurationManager.java
final class CameraConfigurationManager {
private static final String TAG = "CameraConfiguration";
private final Context context;
private int cwNeededRotation;
private int cwRotationFromDisplayToCamera;
private Point screenResolution;
private Point cameraResolution;
private Point bestPreviewSize;
private Point previewSizeOnScreen;
CameraConfigurationManager(Context context) {
this.context = context;
}
/**
* Reads, one time, values from the camera that are needed by the app.
*/
void initFromCameraParameters(OpenCamera camera) {
Camera.Parameters parameters = camera.getCamera().getParameters();
WindowManager manager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
Display display = manager.getDefaultDisplay();
int displayRotation = display.getRotation();
int cwRotationFromNaturalToDisplay;
switch (displayRotation) {
case Surface.ROTATION_0:
cwRotationFromNaturalToDisplay = 0;
break;
case Surface.ROTATION_90:
cwRotationFromNaturalToDisplay = 90;
break;
case Surface.ROTATION_180:
cwRotationFromNaturalToDisplay = 180;
break;
case Surface.ROTATION_270:
cwRotationFromNaturalToDisplay = 270;
break;
default:
// Have seen this return incorrect values like -90
if (displayRotation % 90 == 0) {
cwRotationFromNaturalToDisplay = (360 + displayRotation) % 360;
} else {
throw new IllegalArgumentException("Bad rotation: " + displayRotation);
}
}
Log.i(TAG, "Display at: " + cwRotationFromNaturalToDisplay);
int cwRotationFromNaturalToCamera = camera.getOrientation();
Log.i(TAG, "Camera at: " + cwRotationFromNaturalToCamera);
// Still not 100% sure about this. But acts like we need to flip this:
if (camera.getFacing() == CameraFacing.FRONT) {
cwRotationFromNaturalToCamera = (360 - cwRotationFromNaturalToCamera) % 360;
Log.i(TAG, "Front camera overriden to: " + cwRotationFromNaturalToCamera);
}
/*
SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
String overrideRotationString;
if (camera.getFacing() == CameraFacing.FRONT) {
overrideRotationString = prefs.getString(PreferencesActivity.KEY_FORCE_CAMERA_ORIENTATION_FRONT, null);
} else {
overrideRotationString = prefs.getString(PreferencesActivity.KEY_FORCE_CAMERA_ORIENTATION, null);
}
if (overrideRotationString != null && !"-".equals(overrideRotationString)) {
Log.i(TAG, "Overriding camera manually to " + overrideRotationString);
cwRotationFromNaturalToCamera = Integer.parseInt(overrideRotationString);
}
*/
cwRotationFromDisplayToCamera =
(360 + cwRotationFromNaturalToCamera - cwRotationFromNaturalToDisplay) % 360;
Log.i(TAG, "Final display orientation: " + cwRotationFromDisplayToCamera);
if (camera.getFacing() == CameraFacing.FRONT) {
Log.i(TAG, "Compensating rotation for front camera");
cwNeededRotation = (360 - cwRotationFromDisplayToCamera) % 360;
} else {
cwNeededRotation = cwRotationFromDisplayToCamera;
}
Log.i(TAG, "Clockwise rotation from display to camera: " + cwNeededRotation);
Point theScreenResolution = new Point();
display.getSize(theScreenResolution);
screenResolution = theScreenResolution;
Log.i(TAG, "Screen resolution in current orientation: " + screenResolution);
cameraResolution = CameraConfigurationUtils.findBestPreviewSizeValue(parameters, screenResolution);
Log.i(TAG, "Camera resolution: " + cameraResolution);
bestPreviewSize = CameraConfigurationUtils.findBestPreviewSizeValue(parameters, screenResolution);
Log.i(TAG, "Best available preview size: " + bestPreviewSize);
boolean isScreenPortrait = screenResolution.x < screenResolution.y;
boolean isPreviewSizePortrait = bestPreviewSize.x < bestPreviewSize.y;
if (isScreenPortrait == isPreviewSizePortrait) {
previewSizeOnScreen = bestPreviewSize;
} else {
previewSizeOnScreen = new Point(bestPreviewSize.y, bestPreviewSize.x);
}
Log.i(TAG, "Preview size on screen: " + previewSizeOnScreen);
}
void setDesiredCameraParameters(OpenCamera camera, boolean safeMode) {
Camera theCamera = camera.getCamera();
Camera.Parameters parameters = theCamera.getParameters();
if (parameters == null) {
Log.w(TAG, "Device error: no camera parameters are available. Proceeding without configuration.");
return;
}
Log.i(TAG, "Initial camera parameters: " + parameters.flatten());
if (safeMode) {
Log.w(TAG, "In camera config safe mode -- most settings will not be honored");
}
// SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
initializeTorch(parameters, safeMode, QRConstants.disableExposure);
CameraConfigurationUtils.setFocus(
parameters,
// whether to use auto focus
QRConstants.autoFocus,
true,
safeMode);
if (!safeMode) {
//
//CameraConfigurationUtils.setInvertColor(parameters);
CameraConfigurationUtils.setBarcodeSceneMode(parameters);
CameraConfigurationUtils.setVideoStabilization(parameters);
CameraConfigurationUtils.setFocusArea(parameters);
CameraConfigurationUtils.setMetering(parameters);
}
parameters.setPreviewSize(bestPreviewSize.x, bestPreviewSize.y);
theCamera.setParameters(parameters);
theCamera.setDisplayOrientation(cwRotationFromDisplayToCamera);
Camera.Parameters afterParameters = theCamera.getParameters();
Camera.Size afterSize = afterParameters.getPreviewSize();
if (afterSize != null && (bestPreviewSize.x != afterSize.width || bestPreviewSize.y != afterSize.height)) {
Log.w(TAG, "Camera said it supported preview size " + bestPreviewSize.x + 'x' + bestPreviewSize.y +
", but after setting it, preview size is " + afterSize.width + 'x' + afterSize.height);
bestPreviewSize.x = afterSize.width;
bestPreviewSize.y = afterSize.height;
}
}
Point getBestPreviewSize() {
return bestPreviewSize;
}
Point getPreviewSizeOnScreen() {
return previewSizeOnScreen;
}
Point getCameraResolution() {
return cameraResolution;
}
Point getScreenResolution() {
return screenResolution;
}
int getCWNeededRotation() {
return cwNeededRotation;
}
boolean getTorchState(Camera camera) {
if (camera != null) {
Camera.Parameters parameters = camera.getParameters();
if (parameters != null) {
String flashMode = camera.getParameters().getFlashMode();
return flashMode != null &&
(Camera.Parameters.FLASH_MODE_ON.equals(flashMode) ||
Camera.Parameters.FLASH_MODE_TORCH.equals(flashMode));
}
}
return false;
}
void setTorch(Camera camera, boolean newSetting) {
Camera.Parameters parameters = camera.getParameters();
doSetTorch(parameters, newSetting, false, QRConstants.disableExposure);
camera.setParameters(parameters);
}
private void initializeTorch(Camera.Parameters parameters, boolean safeMode, boolean disableExposure) {
boolean currentSetting = QRConstants.frontLightMode == FrontLightMode.ON;
doSetTorch(parameters, currentSetting, safeMode, disableExposure);
}
private void doSetTorch(Camera.Parameters parameters, boolean newSetting, boolean safeMode, boolean disableExposure) {
CameraConfigurationUtils.setTorch(parameters, newSetting);
if (!safeMode && !disableExposure) {
CameraConfigurationUtils.setBestExposure(parameters, newSetting);
}
}
}
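To make the rotation arithmetic in initFromCameraParameters concrete with typical values (an assumption, since devices vary): on a portrait phone whose display reports Surface.ROTATION_0 and whose back camera reports an orientation of 90, cwRotationFromDisplayToCamera = (360 + 90 - 0) % 360 = 90, which is the value passed to setDisplayOrientation(), and for the back camera cwNeededRotation is the same 90.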
CameraConfigurationUtils.java
@TargetApi(Build.VERSION_CODES.ICE_CREAM_SANDWICH_MR1)
public final class CameraConfigurationUtils {
private static final String TAG = "CameraConfiguration";
private static final Pattern SEMICOLON = Pattern.compile(";");
private static final int MIN_PREVIEW_PIXELS = 480 * 320; // normal screen
private static final float MAX_EXPOSURE_COMPENSATION = 1.5f;
private static final float MIN_EXPOSURE_COMPENSATION = 0.0f;
private static final double MAX_ASPECT_DISTORTION = 0.15;
private static final int MIN_FPS = 10;
private static final int MAX_FPS = 20;
private static final int AREA_PER_1000 = 400;
private CameraConfigurationUtils() {
}
public static void setFocus(Camera.Parameters parameters,
boolean autoFocus,
boolean disableContinuous,
boolean safeMode) {
List<String> supportedFocusModes = parameters.getSupportedFocusModes();
String focusMode = null;
if (autoFocus) {
if (safeMode || disableContinuous) {
focusMode = findSettableValue("focus mode",
supportedFocusModes,
Camera.Parameters.FOCUS_MODE_AUTO);
} else {
focusMode = findSettableValue("focus mode",
supportedFocusModes,
Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE,
Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO,
Camera.Parameters.FOCUS_MODE_AUTO);
}
}
// Maybe selected auto-focus but not available, so fall through here:
if (!safeMode && focusMode == null) {
focusMode = findSettableValue("focus mode",
supportedFocusModes,
Camera.Parameters.FOCUS_MODE_MACRO,
Camera.Parameters.FOCUS_MODE_EDOF);
}
if (focusMode != null) {
if (focusMode.equals(parameters.getFocusMode())) {
Log.i(TAG, "Focus mode already set to " + focusMode);
} else {
parameters.setFocusMode(focusMode);
}
}
}
public static void setTorch(Camera.Parameters parameters, boolean on) {
List<String> supportedFlashModes = parameters.getSupportedFlashModes();
String flashMode;
if (on) {
flashMode = findSettableValue("flash mode",
supportedFlashModes,
Camera.Parameters.FLASH_MODE_TORCH,
Camera.Parameters.FLASH_MODE_ON);
} else {
flashMode = findSettableValue("flash mode",
supportedFlashModes,
Camera.Parameters.FLASH_MODE_OFF);
}
if (flashMode != null) {
if (flashMode.equals(parameters.getFlashMode())) {
Log.i(TAG, "Flash mode already set to " + flashMode);
} else {
Log.i(TAG, "Setting flash mode to " + flashMode);
parameters.setFlashMode(flashMode);
}
}
}
public static void setBestExposure(Camera.Parameters parameters, boolean lightOn) {
int minExposure = parameters.getMinExposureCompensation();
int maxExposure = parameters.getMaxExposureCompensation();
float step = parameters.getExposureCompensationStep();
if ((minExposure != 0 || maxExposure != 0) && step > 0.0f) {
// Set low when light is on
float targetCompensation = lightOn ? MIN_EXPOSURE_COMPENSATION : MAX_EXPOSURE_COMPENSATION;
int compensationSteps = Math.round(targetCompensation / step);
float actualCompensation = step * compensationSteps;
// Clamp value:
compensationSteps = Math.max(Math.min(compensationSteps, maxExposure), minExposure);
if (parameters.getExposureCompensation() == compensationSteps) {
Log.i(TAG, "Exposure compensation already set to " + compensationSteps + " / " + actualCompensation);
} else {
Log.i(TAG, "Setting exposure compensation to " + compensationSteps + " / " + actualCompensation);
parameters.setExposureCompensation(compensationSteps);
}
} else {
Log.i(TAG, "Camera does not support exposure compensation");
}
}
public static void setBestPreviewFPS(Camera.Parameters parameters) {
setBestPreviewFPS(parameters, MIN_FPS, MAX_FPS);
}
public static void setBestPreviewFPS(Camera.Parameters parameters, int minFPS, int maxFPS) {
List<int[]> supportedPreviewFpsRanges = parameters.getSupportedPreviewFpsRange();
Log.i(TAG, "Supported FPS ranges: " + toString(supportedPreviewFpsRanges));
if (supportedPreviewFpsRanges != null && !supportedPreviewFpsRanges.isEmpty()) {
int[] suitableFPSRange = null;
for (int[] fpsRange : supportedPreviewFpsRanges) {
int thisMin = fpsRange[Camera.Parameters.PREVIEW_FPS_MIN_INDEX];
int thisMax = fpsRange[Camera.Parameters.PREVIEW_FPS_MAX_INDEX];
if (thisMin >= minFPS * 1000 && thisMax <= maxFPS * 1000) {
suitableFPSRange = fpsRange;
break;
}
}
if (suitableFPSRange == null) {
Log.i(TAG, "No suitable FPS range?");
} else {
int[] currentFpsRange = new int[2];
parameters.getPreviewFpsRange(currentFpsRange);
if (Arrays.equals(currentFpsRange, suitableFPSRange)) {
Log.i(TAG, "FPS range already set to " + Arrays.toString(suitableFPSRange));
} else {
Log.i(TAG, "Setting FPS range to " + Arrays.toString(suitableFPSRange));
parameters.setPreviewFpsRange(suitableFPSRange[Camera.Parameters.PREVIEW_FPS_MIN_INDEX],
suitableFPSRange[Camera.Parameters.PREVIEW_FPS_MAX_INDEX]);
}
}
}
}
public static void setFocusArea(Camera.Parameters parameters) {
if (parameters.getMaxNumFocusAreas() > 0) {
Log.i(TAG, "Old focus areas: " + toString(parameters.getFocusAreas()));
List<Camera.Area> middleArea = buildMiddleArea(AREA_PER_1000);
Log.i(TAG, "Setting focus area to : " + toString(middleArea));
parameters.setFocusAreas(middleArea);
} else {
Log.i(TAG, "Device does not support focus areas");
}
}
public static void setMetering(Camera.Parameters parameters) {
if (parameters.getMaxNumMeteringAreas() > 0) {
Log.i(TAG, "Old metering areas: " + parameters.getMeteringAreas());
List<Camera.Area> middleArea = buildMiddleArea(AREA_PER_1000);
Log.i(TAG, "Setting metering area to : " + toString(middleArea));
parameters.setMeteringAreas(middleArea);
} else {
Log.i(TAG, "Device does not support metering areas");
}
}
private static List<Camera.Area> buildMiddleArea(int areaPer1000) {
return Collections.singletonList(
new Camera.Area(new Rect(-areaPer1000, -areaPer1000, areaPer1000, areaPer1000), 1));
}
public static void setVideoStabilization(Camera.Parameters parameters) {
if (parameters.isVideoStabilizationSupported()) {
if (parameters.getVideoStabilization()) {
Log.i(TAG, "Video stabilization already enabled");
} else {
Log.i(TAG, "Enabling video stabilization...");
parameters.setVideoStabilization(true);
}
} else {
Log.i(TAG, "This device does not support video stabilization");
}
}
public static void setBarcodeSceneMode(Camera.Parameters parameters) {
if (Camera.Parameters.SCENE_MODE_BARCODE.equals(parameters.getSceneMode())) {
Log.i(TAG, "Barcode scene mode already set");
return;
}
String sceneMode = findSettableValue("scene mode",
parameters.getSupportedSceneModes(),
Camera.Parameters.SCENE_MODE_BARCODE);
if (sceneMode != null) {
parameters.setSceneMode(sceneMode);
}
}
public static void setZoom(Camera.Parameters parameters, double targetZoomRatio) {
if (parameters.isZoomSupported()) {
Integer zoom = indexOfClosestZoom(parameters, targetZoomRatio);
if (zoom == null) {
return;
}
if (parameters.getZoom() == zoom) {
Log.i(TAG, "Zoom is already set to " + zoom);
} else {
Log.i(TAG, "Setting zoom to " + zoom);
parameters.setZoom(zoom);
}
} else {
Log.i(TAG, "Zoom is not supported");
}
}
private static Integer indexOfClosestZoom(Camera.Parameters parameters, double targetZoomRatio) {
List<Integer> ratios = parameters.getZoomRatios();
Log.i(TAG, "Zoom ratios: " + ratios);
int maxZoom = parameters.getMaxZoom();
if (ratios == null || ratios.isEmpty() || ratios.size() != maxZoom + 1) {
Log.w(TAG, "Invalid zoom ratios!");
return null;
}
double target100 = 100.0 * targetZoomRatio;
double smallestDiff = Double.POSITIVE_INFINITY;
int closestIndex = 0;
for (int i = 0; i < ratios.size(); i++) {
double diff = Math.abs(ratios.get(i) - target100);
if (diff < smallestDiff) {
smallestDiff = diff;
closestIndex = i;
}
}
Log.i(TAG, "Chose zoom ratio of " + (ratios.get(closestIndex) / 100.0));
return closestIndex;
}
public static void setInvertColor(Camera.Parameters parameters) {
if (Camera.Parameters.EFFECT_NEGATIVE.equals(parameters.getColorEffect())) {
Log.i(TAG, "Negative effect already set");
return;
}
String colorMode = findSettableValue("color effect",
parameters.getSupportedColorEffects(),
Camera.Parameters.EFFECT_NEGATIVE);
if (colorMode != null) {
parameters.setColorEffect(colorMode);
}
}
public static Point findBestPreviewSizeValue(Camera.Parameters parameters, Point screenResolution) {
List<Camera.Size> rawSupportedSizes = parameters.getSupportedPreviewSizes();
if (rawSupportedSizes == null) {
Log.w(TAG, "Device returned no supported preview sizes; using default");
Camera.Size defaultSize = parameters.getPreviewSize();
if (defaultSize == null) {
throw new IllegalStateException("Parameters contained no preview size!");
}
return new Point(defaultSize.width, defaultSize.height);
}
// Sort by size, descending
List<Camera.Size> supportedPreviewSizes = new ArrayList<>(rawSupportedSizes);
Collections.sort(supportedPreviewSizes, new Comparator<Camera.Size>() {
@Override
public int compare(Camera.Size a, Camera.Size b) {
int aPixels = a.height * a.width;
int bPixels = b.height * b.width;
if (bPixels < aPixels) {
return -1;
}
if (bPixels > aPixels) {
return 1;
}
return 0;
}
});
if (Log.isLoggable(TAG, Log.INFO)) {
StringBuilder previewSizesString = new StringBuilder();
for (Camera.Size supportedPreviewSize : supportedPreviewSizes) {
previewSizesString.append(supportedPreviewSize.width).append('x')
.append(supportedPreviewSize.height).append(' ');
}
Log.i(TAG, "Supported preview sizes: " + previewSizesString);
}
double screenAspectRatio = (double) screenResolution.x / (double) screenResolution.y;
// Remove sizes that are unsuitable
Iterator<Camera.Size> it = supportedPreviewSizes.iterator();
while (it.hasNext()) {
Camera.Size supportedPreviewSize = it.next();
int realWidth = supportedPreviewSize.width;
int realHeight = supportedPreviewSize.height;
if (realWidth * realHeight < MIN_PREVIEW_PIXELS) {
it.remove();
continue;
}
boolean isCandidatePortrait = realWidth < realHeight;
int maybeFlippedWidth = isCandidatePortrait ? realHeight : realWidth;
int maybeFlippedHeight = isCandidatePortrait ? realWidth : realHeight;
double aspectRatio = (double) maybeFlippedWidth / (double) maybeFlippedHeight;
double distortion = Math.abs(aspectRatio - screenAspectRatio);
if (distortion > MAX_ASPECT_DISTORTION) {
it.remove();
continue;
}
if (maybeFlippedWidth == screenResolution.x && maybeFlippedHeight == screenResolution.y) {
Point exactPoint = new Point(realWidth, realHeight);
Log.i(TAG, "Found preview size exactly matching screen size: " + exactPoint);
return exactPoint;
}
}
// If no exact match, use largest preview size. This was not a great idea on older devices because
// of the additional computation needed. We're likely to get here on newer Android 4+ devices, where
// the CPU is much more powerful.
if (!supportedPreviewSizes.isEmpty()) {
Camera.Size largestPreview = supportedPreviewSizes.get(0);
Point largestSize = new Point(largestPreview.width, largestPreview.height);
Log.i(TAG, "Using largest suitable preview size: " + largestSize);
return largestSize;
}
// If there is nothing at all suitable, return current preview size
Camera.Size defaultPreview = parameters.getPreviewSize();
if (defaultPreview == null) {
throw new IllegalStateException("Parameters contained no preview size!");
}
Point defaultSize = new Point(defaultPreview.width, defaultPreview.height);
Log.i(TAG, "No suitable preview sizes, using default: " + defaultSize);
return defaultSize;
}
private static String findSettableValue(String name,
Collection<String> supportedValues,
String... desiredValues) {
Log.i(TAG, "Requesting " + name + " value from among: " + Arrays.toString(desiredValues));
Log.i(TAG, "Supported " + name + " values: " + supportedValues);
if (supportedValues != null) {
for (String desiredValue : desiredValues) {
if (supportedValues.contains(desiredValue)) {
Log.i(TAG, "Can set " + name + " to: " + desiredValue);
return desiredValue;
}
}
}
Log.i(TAG, "No supported values match");
return null;
}
private static String toString(Collection<int[]> arrays) {
if (arrays == null || arrays.isEmpty()) {
return "[]";
}
StringBuilder buffer = new StringBuilder();
buffer.append('[');
Iterator<int[]> it = arrays.iterator();
while (it.hasNext()) {
buffer.append(Arrays.toString(it.next()));
if (it.hasNext()) {
buffer.append(", ");
}
}
buffer.append(']');
return buffer.toString();
}
private static String toString(Iterable<Camera.Area> areas) {
if (areas == null) {
return null;
}
StringBuilder result = new StringBuilder();
for (Camera.Area area : areas) {
result.append(area.rect).append(':').append(area.weight).append(' ');
}
return result.toString();
}
public static String collectStats(Camera.Parameters parameters) {
return collectStats(parameters.flatten());
}
public static String collectStats(CharSequence flattenedParams) {
StringBuilder result = new StringBuilder(1000);
result.append("BOARD=").append(Build.BOARD).append('\n');
result.append("BRAND=").append(Build.BRAND).append('\n');
result.append("CPU_ABI=").append(Build.CPU_ABI).append('\n');
result.append("DEVICE=").append(Build.DEVICE).append('\n');
result.append("DISPLAY=").append(Build.DISPLAY).append('\n');
result.append("FINGERPRINT=").append(Build.FINGERPRINT).append('\n');
result.append("HOST=").append(Build.HOST).append('\n');
result.append("ID=").append(Build.ID).append('\n');
result.append("MANUFACTURER=").append(Build.MANUFACTURER).append('\n');
result.append("MODEL=").append(Build.MODEL).append('\n');
result.append("PRODUCT=").append(Build.PRODUCT).append('\n');
result.append("TAGS=").append(Build.TAGS).append('\n');
result.append("TIME=").append(Build.TIME).append('\n');
result.append("TYPE=").append(Build.TYPE).append('\n');
result.append("USER=").append(Build.USER).append('\n');
result.append("VERSION.CODENAME=").append(Build.VERSION.CODENAME).append('\n');
result.append("VERSION.INCREMENTAL=").append(Build.VERSION.INCREMENTAL).append('\n');
result.append("VERSION.RELEASE=").append(Build.VERSION.RELEASE).append('\n');
result.append("VERSION.SDK_INT=").append(Build.VERSION.SDK_INT).append('\n');
if (flattenedParams != null) {
String[] params = SEMICOLON.split(flattenedParams);
Arrays.sort(params);
for (String param : params) {
result.append(param).append('\n');
}
}
return result.toString();
}
}
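One note on the utilities above: setZoom is never called by CameraConfigurationManager in this project. If you want to experiment with a mild digital zoom for small or distant codes (the A4 printouts on containers, for example), something like the following could be added inside setDesiredCameraParameters; the 1.5 ratio is only an illustrative guess, not a tuned value.
// Experimental (not enabled above): apply a mild digital zoom outside safe mode.
if (!safeMode && parameters.isZoomSupported()) {
    CameraConfigurationUtils.setZoom(parameters, 1.5); // 1.5x is an arbitrary example ratio
}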
CameraManager.java
public final class CameraManager {
private static final String TAG = CameraManager.class.getSimpleName();
private static final int MIN_FRAME_WIDTH = 240;
private static final int MIN_FRAME_HEIGHT = 240;
private static final int MAX_FRAME_WIDTH = 1200; // = 5/8 * 1920
private static final int MAX_FRAME_HEIGHT = 675; // = 5/8 * 1080
private final Context context;
private final CameraConfigurationManager configManager;
private OpenCamera camera;
private AutoFocusManager autoFocusManager;
private Rect framingRect;
private Rect framingRectInPreview;
private boolean initialized;
private boolean previewing;
private int requestedCameraId = OpenCameraInterface.NO_REQUESTED_CAMERA;
private int requestedFramingRectWidth;
private int requestedFramingRectHeight;
/**
* Preview frames are delivered here, which we pass on to the registered handler. Make sure to
* clear the handler so it will only receive one message.
*/
private final PreviewCallback previewCallback;
public CameraManager(Context context) {
this.context = context;
this.configManager = new CameraConfigurationManager(context);
previewCallback = new PreviewCallback(configManager);
}
/**
* Opens the camera driver and initializes the hardware parameters.
*
* @param holder The surface object which the camera will draw preview frames into.
* @throws IOException Indicates the camera driver failed to open.
*/
public synchronized void openDriver(SurfaceHolder holder) throws IOException {
OpenCamera theCamera = camera;
if (theCamera == null) {
theCamera = OpenCameraInterface.open(requestedCameraId);
if (theCamera == null) {
throw new IOException("Camera.open() failed to return object from driver");
}
camera = theCamera;
}
if (!initialized) {
initialized = true;
configManager.initFromCameraParameters(theCamera);
if (requestedFramingRectWidth > 0 && requestedFramingRectHeight > 0) {
setManualFramingRect(requestedFramingRectWidth, requestedFramingRectHeight);
requestedFramingRectWidth = 0;
requestedFramingRectHeight = 0;
}
}
Camera cameraObject = theCamera.getCamera();
Camera.Parameters parameters = cameraObject.getParameters();
String parametersFlattened = parameters == null ? null : parameters.flatten(); // Save these, temporarily
try {
configManager.setDesiredCameraParameters(theCamera, false);
} catch (RuntimeException re) {
// Driver failed
Log.w(TAG, "Camera rejected parameters. Setting only minimal safe-mode parameters");
Log.i(TAG, "Resetting to saved camera params: " + parametersFlattened);
// Reset:
if (parametersFlattened != null) {
parameters = cameraObject.getParameters();
parameters.unflatten(parametersFlattened);
try {
cameraObject.setParameters(parameters);
configManager.setDesiredCameraParameters(theCamera, true);
} catch (RuntimeException re2) {
// Well, darn. Give up
Log.w(TAG, "Camera rejected even safe-mode parameters! No configuration");
}
}
}
cameraObject.setPreviewDisplay(holder);
}
public synchronized boolean isOpen() {
return camera != null;
}
/**
* Closes the camera driver if still in use.
*/
public synchronized void closeDriver() {
if (camera != null) {
camera.getCamera().release();
camera = null;
// Make sure to clear these each time we close the camera, so that any scanning rect
// requested by intent is forgotten.
framingRect = null;
framingRectInPreview = null;
}
}
/**
* Asks the camera hardware to begin drawing preview frames to the screen.
*/
public synchronized void startPreview() {
OpenCamera theCamera = camera;
if (theCamera != null && !previewing) {
theCamera.getCamera().startPreview();
previewing = true;
autoFocusManager = new AutoFocusManager(context, theCamera.getCamera());
}
}
/**
* Tells the camera to stop drawing preview frames.
*/
public synchronized void stopPreview() {
if (autoFocusManager != null) {
autoFocusManager.stop();
autoFocusManager = null;
}
if (camera != null && previewing) {
camera.getCamera().stopPreview();
previewCallback.setHandler(null, 0);
previewing = false;
}
}
/**
* Convenience method for {@link CaptureActivity}
*
* @param newSetting if {@code true}, light should be turned on if currently off. And vice versa.
*/
public synchronized void setTorch(boolean newSetting) {
OpenCamera theCamera = camera;
if (theCamera != null) {
if (newSetting != configManager.getTorchState(theCamera.getCamera())) {
boolean wasAutoFocusManager = autoFocusManager != null;
if (wasAutoFocusManager) {
autoFocusManager.stop();
autoFocusManager = null;
}
configManager.setTorch(theCamera.getCamera(), newSetting);
if (wasAutoFocusManager) {
autoFocusManager = new AutoFocusManager(context, theCamera.getCamera());
autoFocusManager.start();
}
}
}
}
/**
* A single preview frame will be returned to the handler supplied. The data will arrive as byte[]
* in the message.obj field, with width and height encoded as message.arg1 and message.arg2,
* respectively.
*
* @param handler The handler to send the message to.
* @param message The what field of the message to be sent.
*/
public synchronized void requestPreviewFrame(Handler handler, int message) {
OpenCamera theCamera = camera;
if (theCamera != null && previewing) {
previewCallback.setHandler(handler, message);
theCamera.getCamera().setOneShotPreviewCallback(previewCallback);
}
}
/**
* Calculates the framing rect which the UI should draw to show the user where to place the
* barcode. This target helps with alignment as well as forces the user to hold the device
* far enough away to ensure the image will be in focus.
*
* @return The rectangle to draw on screen in window coordinates.
*/
public synchronized Rect getFramingRect() {
if (framingRect == null) {
if (camera == null) {
return null;
}
Point screenResolution = configManager.getScreenResolution();
if (screenResolution == null) {
// Called early, before init even finished
return null;
}
int width = findDesiredDimensionInRange(screenResolution.x, MIN_FRAME_WIDTH, MAX_FRAME_WIDTH);
int height = findDesiredDimensionInRange(screenResolution.y, MIN_FRAME_HEIGHT, MAX_FRAME_HEIGHT);
// keep the scan frame square (equal width and height)
int finalSize = height;
if (height > width) {
finalSize = width;
}
int leftOffset = (screenResolution.x - finalSize) / 2;
int topOffset = (screenResolution.y - finalSize) / 2;
framingRect = new Rect(leftOffset, topOffset, leftOffset + finalSize, topOffset + finalSize);
Log.d(TAG, "Calculated framing rect: " + framingRect + " width =" + width + " height = " + height +
" screenResolution.x = " + screenResolution.x + " screenResolution.y = " + screenResolution.y);
}
return framingRect;
}
private static int findDesiredDimensionInRange(int resolution, int hardMin, int hardMax) {
int dim = 5 * resolution / 8; // Target 5/8 of each dimension
if (dim < hardMin) {
return hardMin;
}
if (dim > hardMax) {
return hardMax;
}
return dim;
}
/**
* Like {@link #getFramingRect} but coordinates are in terms of the preview frame,
* not UI / screen.
*
* @return {@link Rect} expressing barcode scan area in terms of the preview size
*/
public synchronized Rect getFramingRectInPreview() {
if (framingRectInPreview == null) {
Rect framingRect = getFramingRect();
if (framingRect == null) {
return null;
}
Rect rect = new Rect(framingRect);
Point cameraResolution = configManager.getCameraResolution();
Point screenResolution = configManager.getScreenResolution();
if (cameraResolution == null || screenResolution == null) {
// Called early, before init even finished
return null;
}
rect.left = rect.left * cameraResolution.x / screenResolution.x;
rect.right = rect.right * cameraResolution.x / screenResolution.x;
rect.top = rect.top * cameraResolution.y / screenResolution.y;
rect.bottom = rect.bottom * cameraResolution.y / screenResolution.y;
framingRectInPreview = rect;
}
return framingRectInPreview;
}
/**
* Allows third party apps to specify the camera ID, rather than determine
* it automatically based on available cameras and their orientation.
*
* @param cameraId camera ID of the camera to use. A negative value means "no preference".
*/
public synchronized void setManualCameraId(int cameraId) {
requestedCameraId = cameraId;
}
/**
* Allows third party apps to specify the scanning rectangle dimensions, rather than determine
* them automatically based on screen resolution.
*
* @param width The width in pixels to scan.
* @param height The height in pixels to scan.
*/
public synchronized void setManualFramingRect(int width, int height) {
if (initialized) {
Point screenResolution = configManager.getScreenResolution();
if (width > screenResolution.x) {
width = screenResolution.x;
}
if (height > screenResolution.y) {
height = screenResolution.y;
}
int leftOffset = (screenResolution.x - width) / 2;
int topOffset = (screenResolution.y - height) / 2;
framingRect = new Rect(leftOffset, topOffset, leftOffset + width, topOffset + height);
Log.d(TAG, "Calculated manual framing rect: " + framingRect);
framingRectInPreview = null;
} else {
requestedFramingRectWidth = width;
requestedFramingRectHeight = height;
}
}
/**
* A factory method to build the appropriate LuminanceSource object based on the format
* of the preview buffers, as described by Camera.Parameters.
*
* @param data A preview frame.
* @param width The width of the image.
* @param height The height of the image.
* @return A PlanarYUVLuminanceSource instance.
*/
public PlanarYUVLuminanceSource buildLuminanceSource(byte[] data, int width, int height) {
Rect rect = getFramingRectInPreview();
if (rect == null) {
return null;
}
// Go ahead and assume it's YUV rather than die.
return new PlanarYUVLuminanceSource(data, width, height, rect.left, rect.top,
rect.width(), rect.height(), false);
}
public void openLight() {
if (camera != null && camera.getCamera() != null) {
Camera.Parameters parameter = camera.getCamera().getParameters();
parameter.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
camera.getCamera().setParameters(parameter);
}
}
public void offLight() {
if (camera != null && camera.getCamera() != null) {
Camera.Parameters parameter = camera.getCamera().getParameters();
parameter.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
camera.getCamera().setParameters(parameter);
}
}
}
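The DecodeHandler and DecodeThread classes are not listed in this post, but to show how a frame delivered by requestPreviewFrame is normally consumed, here is a simplified sketch (not the project's exact handler code): buildLuminanceSource crops the YUV frame to the framing rect, and a ZXing MultiFormatReader does the decoding.
// Simplified sketch of decoding one preview frame; width and height come from the message.
private Result decodeFrame(CameraManager cameraManager, byte[] data, int width, int height) {
    PlanarYUVLuminanceSource source = cameraManager.buildLuminanceSource(data, width, height);
    if (source == null) {
        return null; // framing rect not ready yet
    }
    BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
    try {
        return new MultiFormatReader().decode(bitmap);
    } catch (NotFoundException e) {
        // No barcode in this frame; the real handler asks for another preview frame.
        return null;
    }
}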
FrontLightMode.java
public enum FrontLightMode {
/** Always on. */
ON,
/** On only when ambient light is low. */
AUTO,
/** Always off. */
OFF
/* private static FrontLightMode parse(String modeString) {
return modeString == null ? OFF : valueOf(modeString);
}
public static FrontLightMode readPref(SharedPreferences sharedPrefs) {
return parse(sharedPrefs.getString(PreferencesActivity.KEY_FRONT_LIGHT_MODE, OFF.toString()));
}*/
}
PreviewCallback.java
final class PreviewCallback implements Camera.PreviewCallback {
private static final String TAG = PreviewCallback.class.getSimpleName();
private final CameraConfigurationManager configManager;
private Handler previewHandler;
private int previewMessage;
PreviewCallback(CameraConfigurationManager configManager) {
this.configManager = configManager;
}
void setHandler(Handler previewHandler, int previewMessage) {
this.previewHandler = previewHandler;
this.previewMessage = previewMessage;
}
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
Point cameraResolution = configManager.getCameraResolution();
Handler thePreviewHandler = previewHandler;
if (cameraResolution != null && thePreviewHandler != null) {
// post the frame to the DecodeHandler
Message message = thePreviewHandler.obtainMessage(previewMessage, cameraResolution.x,
cameraResolution.y, data);
message.sendToTarget();
previewHandler = null;
} else {
Log.d(TAG, "Got preview callback, but no handler or resolution available");
}
}
}
QRConstants.java
public class QRConstants {
public static boolean vibrateEnable = true;
public static boolean beepEnable = true;
public static FrontLightMode frontLightMode = FrontLightMode.OFF;
public static boolean disableExposure = true;
public static boolean autoFocus = true;
public static final String KEY_AUTO_FOCUS = "preferences_auto_focus";
}
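Because these are plain static fields, they can be adjusted before CaptureActivity is launched. For example, to let AmbientLightManager (shown below) drive the torch from the light sensor in dark environments, a caller could do something like this:
// Configure global scan behaviour before starting CaptureActivity.
QRConstants.frontLightMode = FrontLightMode.AUTO; // torch follows the ambient light sensor
QRConstants.autoFocus = true;                     // keep auto focus on
QRConstants.disableExposure = true;               // skip exposure compensation tweaks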
Constant.java
public class Constant {
public static final int DECODE = 1;
public static final int DECODE_FAILED = 2;
public static final int DECODE_SUCCEEDED = 3;
public static final int LAUNCH_PRODUCT_QUERY = 4;
public static final int QUIT = 5;
public static final int RESTART_PREVIEW = 6;
public static final int RETURN_SCAN_RESULT = 7;
public static final int FLASH_OPEN = 8;
public static final int FLASH_CLOSE = 9;
public static final int REQUEST_IMAGE = 10;
public static final String CODED_CONTENT = "codedContent";
public static final String CODED_BITMAP = "codedBitmap";
/* key for the ZxingConfig passed in the launch intent */
public static final String INTENT_ZXING_CONFIG = "zxingConfig";
}
AmbientLightManager.java
public final class AmbientLightManager implements SensorEventListener {
private static final float TOO_DARK_LUX = 45.0f;
private static final float BRIGHT_ENOUGH_LUX = 450.0f;
private final Context context;
private CameraManager cameraManager;
private Sensor lightSensor;
public AmbientLightManager(Context context) {
this.context = context;
}
public void start(CameraManager cameraManager) {
this.cameraManager = cameraManager;
SharedPreferences sharedPrefs = PreferenceManager.getDefaultSharedPreferences(context);
if (QRConstants.frontLightMode == FrontLightMode.AUTO) {
SensorManager sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
lightSensor = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
if (lightSensor != null) {
sensorManager.registerListener(this, lightSensor, SensorManager.SENSOR_DELAY_NORMAL);
}
}
}
public void stop() {
if (lightSensor != null) {
SensorManager sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
sensorManager.unregisterListener(this);
cameraManager = null;
lightSensor = null;
}
}
@Override
public void onSensorChanged(SensorEvent sensorEvent) {
float ambientLightLux = sensorEvent.values[0];
if (cameraManager != null) {
if (ambientLightLux <= TOO_DARK_LUX) {
cameraManager.setTorch(true);
} else if (ambientLightLux >= BRIGHT_ENOUGH_LUX) {
cameraManager.setTorch(false);
}
}
}
@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {
// do nothing
}
}
BeepManager.java
public final class BeepManager implements MediaPlayer.OnCompletionListener,
MediaPlayer.OnErrorListener, Closeable {
private static final String TAG = BeepManager.class.getSimpleName();
private static final float BEEP_VOLUME = 0.10f;
private static final long VIBRATE_DURATION = 200L;
private final Activity activity;
private MediaPlayer mediaPlayer;
private boolean playBeep;
private boolean vibrate;
public BeepManager(Activity activity) {
this.activity = activity;
this.mediaPlayer = null;
updatePrefs();
}
public boolean isPlayBeep() {
return playBeep;
}
public void setPlayBeep(boolean playBeep) {
this.playBeep = playBeep;
}
public boolean isVibrate() {
return vibrate;
}
public void setVibrate(boolean vibrate) {
this.vibrate = vibrate;
}
public synchronized void updatePrefs() {
if (playBeep && mediaPlayer == null) {
// The volume on STREAM_SYSTEM is not adjustable, and users found it
// too loud,
// so we now play on the music stream.
// make the activity's volume keys control the music stream
activity.setVolumeControlStream(AudioManager.STREAM_MUSIC);
mediaPlayer = buildMediaPlayer(activity);
}
}
/**
* Play the beep sound and vibrate.
*/
@SuppressLint("MissingPermission")
public synchronized void playBeepSoundAndVibrate() {
if (playBeep && mediaPlayer != null) {
mediaPlayer.start();
}
if (vibrate) {
Vibrator vibrator = (Vibrator) activity
.getSystemService(Context.VIBRATOR_SERVICE);
vibrator.vibrate(VIBRATE_DURATION);
}
}
/**
* Builds the MediaPlayer used for the scan beep.
*
* @param activity context used to open the raw beep resource
* @return the prepared MediaPlayer, or null if it could not be created
*/
private MediaPlayer buildMediaPlayer(Context activity) {
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
// listen for playback completion
mediaPlayer.setOnCompletionListener(this);
mediaPlayer.setOnErrorListener(this);
// configure the audio source
try {
AssetFileDescriptor file = activity.getResources()
.openRawResourceFd(R.raw.beep);
try {
mediaPlayer.setDataSource(file.getFileDescriptor(),
file.getStartOffset(), file.getLength());
} finally {
file.close();
}
// set the volume
mediaPlayer.setVolume(BEEP_VOLUME, BEEP_VOLUME);
mediaPlayer.prepare();
return mediaPlayer;
} catch (IOException ioe) {
Log.w(TAG, ioe);
mediaPlayer.release();
return null;
}
}
@Override
public void onCompletion(MediaPlayer mp) {
// When the beep has finished playing, rewind to queue up another one.
mp.seekTo(0);
}
@Override
public synchronized boolean onError(MediaPlayer mp, int what, int extra) {
if (what == MediaPlayer.MEDIA_ERROR_SERVER_DIED) {
// we are finished, so put up an appropriate error toast if required
// and finish
activity.finish();
} else {
// possibly media player error, so release and recreate
mp.release();
mediaPlayer = null;
updatePrefs();
}
return true;
}
@Override
public synchronized void close() {
if (mediaPlayer != null) {
mediaPlayer.release();
mediaPlayer = null;
}
}
}
CaptureActivity.java
public class CaptureActivity extends Activity implements SurfaceHolder.Callback, View.OnClickListener {
static {
AppCompatDelegate.setCompatVectorFromResourcesEnabled(true); // vector drawable compatibility on devices below Android 5.1
}
private static final String TAG = CaptureActivity.class.getSimpleName();
private ImageView mBackmImg;
private TextView mTitle;
public int REQ_ID_GALLERY = 0;
public static boolean isLightOn = false;
private IMResUtil mImResUtil;
public static void startAction(Activity activity, Bundle bundle, int requestCode) {
Intent intent = new Intent(activity, CaptureActivity.class);
intent.putExtras(bundle);
activity.startActivityForResult(intent, requestCode);
}
private CameraManager cameraManager;
private CaptureActivityHandler handler;
private ViewfinderView viewfinderView;
private boolean hasSurface;
private Collection<BarcodeFormat> decodeFormats;
private String characterSet;
private InactivityTimer inactivityTimer;
private BeepManager beepManager;
private AmbientLightManager ambientLightManager;
private LinearLayout bottomLayout;
private TextView flashLightTv;
private ImageView flashLightIv;
private LinearLayout flashLightLayout;
private LinearLayout albumLayout;
private ZxingConfig config;
private ImageView img_phone;
public ViewfinderView getViewfinderView() {
return viewfinderView;
}
public Handler getHandler() {
return handler;
}
public CameraManager getCameraManager() {
return cameraManager;
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
mImResUtil = new IMResUtil(this);
setContentView(mImResUtil.getLayout("activity_device_qrcode_capture"));
// set up the translucent (immersive) status bar
setColor(this, Color.BLACK);
hasSurface = false;
inactivityTimer = new InactivityTimer(this);
ambientLightManager = new AmbientLightManager(this);
/* read the scan configuration first */
try {
config = (ZxingConfig) getIntent().getExtras().get(Constant.INTENT_ZXING_CONFIG);
} catch (Exception e) {
Log.i("config", e.toString());
}
if (config == null) {
config = new ZxingConfig();
}
beepManager = new BeepManager(this);
beepManager.setPlayBeep(config.isPlayBeep());
beepManager.setVibrate(config.isShake());
initView(getIntent().getExtras());
onEvent(getIntent().getExtras());
}
private void initView(Bundle bundle) {
// check whether landscape orientation was requested
if (bundle != null && "landscape".equals(bundle.getString("portraitOrLandscape"))) {
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
}
mBackmImg = (ImageView) findViewById(mImResUtil.getId("iv_qr_back"));
mTitle = (TextView) findViewById(mImResUtil.getId("tv_qr_title"));
viewfinderView = (ViewfinderView) findViewById(mImResUtil.getId("vv_qr_viewfinderView"));
flashLightTv = (TextView) findViewById(mImResUtil.getId("flashLightTv"));
bottomLayout = (LinearLayout) findViewById(mImResUtil.getId("bottomLayout"));
flashLightIv = (ImageView) findViewById(mImResUtil.getId("flashLightIv"));
img_phone = (ImageView) findViewById(mImResUtil.getId("img_phone"));
flashLightLayout = (LinearLayout) findViewById(mImResUtil.getId("flashLightLayout"));
flashLightLayout.setOnClickListener(this);
albumLayout = (LinearLayout) findViewById(mImResUtil.getId("albumLayout"));
albumLayout.setOnClickListener(this);
}
private void onEvent(Bundle bundle) {
switchVisibility(bottomLayout, config.isShowbottomLayout());
switchVisibility(flashLightLayout, config.isShowFlashLight());
switchVisibility(albumLayout, config.isShowAlbum());
flashLightIv.setImageResource(R.drawable.device_qrcode_scan_flash_off);
img_phone.setImageResource(R.drawable.ic_photo);
/* show the torch button only if the device has a flash, otherwise hide it */
if (isSupportCameraLedFlash(getPackageManager())) {
flashLightLayout.setVisibility(View.VISIBLE);
} else {
flashLightLayout.setVisibility(View.GONE);
}
/******************** added code: END *****************************/
mBackmImg.setOnClickListener(this);
if (bundle == null) {
return;
}
String titileText = bundle.getString("titileText");
if (titileText != null && !titileText.isEmpty()) {
mTitle.setText(titileText);
}
String headColor = bundle.getString("headColor");
if (headColor != null && !headColor.isEmpty()) {
    mTitle.setTextColor(Color.parseColor(headColor));
}
float headSize = bundle.getFloat("headSize");
if (headSize > 0) {
mTitle.setTextSize(headSize);
}
}
/**
* Translucent (immersive) status bar.
*
* @param activity
* @param color
*/
public static void setColor(Activity activity, int color) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
// make the status bar translucent
activity.getWindow().addFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_STATUS);
// create a rectangle the size of the status bar
View statusView = createStatusView(activity, color);
// add statusView to the decor view
ViewGroup decorView = (ViewGroup) activity.getWindow().getDecorView();
decorView.addView(statusView);
// adjust the root layout parameters
ViewGroup rootView = (ViewGroup) ((ViewGroup) activity.findViewById(android.R.id.content)).getChildAt(0);
rootView.setFitsSystemWindows(true);
rootView.setClipToPadding(true);
}
}
/**
* Creates a view the same height as the status bar, filled with the given color.
*
* @param activity
* @param color
*/
private static View createStatusView(Activity activity, int color) {
// get the status bar height
int resourceId = activity.getResources().getIdentifier("status_bar_height", "dimen", "android");
int statusBarHeight = activity.getResources().getDimensionPixelSize(resourceId);
View statusView = new View(activity);
LinearLayout.LayoutParams params = new LinearLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, statusBarHeight);
statusView.setLayoutParams(params);
statusView.setBackgroundColor(color);
return statusView;
}
@Override
public void onClick(View v) {
int id = v.getId();
if (id == mImResUtil.getId("iv_qr_back")) {
this.finish();
} else if (id == mImResUtil.getId("albumLayout")) {//打开相册
// Intent intent = new Intent();
// intent.setType("image/*");
// intent.setAction(Intent.ACTION_GET_CONTENT);
// intent.addCategory(Intent.CATEGORY_OPENABLE);
// startActivityForResult(intent, REQ_ID_GALLERY);
/* open the gallery */
Intent intent = new Intent();
intent.setAction(Intent.ACTION_PICK);
intent.setType("image/*");
startActivityForResult(intent, REQ_ID_GALLERY);
} else if (id == mImResUtil.getId("flashLightLayout")) {//打开手电筒感应部分
if (isLightOn) {
isLightOn = false;
cameraManager.offLight();
switchFlashImg(9);
} else {
isLightOn = true;
cameraManager.openLight();
switchFlashImg(8);
}
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQ_ID_GALLERY) {
if (resultCode == RESULT_OK) {
Uri uri = data.getData();
if (uri != null) {
String path = FileUtil.checkPicturePath(CaptureActivity.this, uri);//Device.getActivity()
BitmapFactory.Options bmOptions = new BitmapFactory.Options();
bmOptions.inJustDecodeBounds = false;
bmOptions.inPurgeable = true;
Bitmap bmp = BitmapFactory.decodeFile(path, bmOptions);
decodeQRCode(bmp, this);
}
}
}
//-----------------------------------------------------
if (requestCode == Constant.REQUEST_IMAGE && resultCode == RESULT_OK) {
String path = ImageUtil.getImageAbsolutePath(this, data.getData());
Log.e(TAG, "onActivityResult: -------二维码:path" + path);
BitmapFactory.Options bmOptions = new BitmapFactory.Options();
bmOptions.inJustDecodeBounds = false;
bmOptions.inPurgeable = true;
Bitmap bmp = BitmapFactory.decodeFile(path, bmOptions);
decodeQRCode(bmp, this);
}
}
public final Map<DecodeHintType, Object> HINTS = new EnumMap<>(DecodeHintType.class);
/**
 * Decodes a QR code from a bitmap (used for images picked from the gallery).
 *
 * @param bitmap   the QR code image to decode
 * @param activity the activity that receives the result
 */
@SuppressLint("StaticFieldLeak")
public void decodeQRCode(final Bitmap bitmap, final Activity activity) {
new AsyncTask<Void, Void, String>() {
@Override
protected String doInBackground(Void... params) {
try {
int width = bitmap.getWidth();
int height = bitmap.getHeight();
int[] pixels = new int[width * height];
bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
RGBLuminanceSource source = new RGBLuminanceSource(width, height, pixels);
Result result = new MultiFormatReader().decode(new BinaryBitmap(new HybridBinarizer(source)), HINTS);
String url = "解析失败";
if (result != null && result.getText() != null) {
url = result.getText();
}
Intent resultIntent = new Intent();
Bundle bundle = new Bundle();
bundle.putString(Constant.CODED_CONTENT, url);
resultIntent.putExtras(bundle);
activity.setResult(RESULT_OK, resultIntent);
CaptureActivity.this.finish();
return url;
} catch (Exception e) {
return null;
}
}
@Override
protected void onPostExecute(String result) {
    Log.d("CaptureActivity", "result=" + result);
    if (result == null) {
        Toast.makeText(CaptureActivity.this, "解析失败,换个图片试一下", Toast.LENGTH_LONG).show();
    }
}
}.execute();
}
@Override
protected void onResume() {
super.onResume();
cameraManager = new CameraManager(getApplication());
viewfinderView.setCameraManager(cameraManager);
handler = null;
beepManager.updatePrefs();
ambientLightManager.start(cameraManager);
inactivityTimer.onResume();
decodeFormats = null;
characterSet = null;
SurfaceView surfaceView = (SurfaceView) findViewById(mImResUtil.getId("device_qrcode_preview_view"));
SurfaceHolder surfaceHolder = surfaceView.getHolder();
if (hasSurface) {
initCamera(surfaceHolder);
} else {
surfaceHolder.addCallback(this);
}
}
@Override
protected void onPause() {
if (handler != null) {
handler.quitSynchronously();
handler = null;
}
inactivityTimer.onPause();
ambientLightManager.stop();
beepManager.close();
cameraManager.closeDriver();
if (!hasSurface) {
SurfaceView surfaceView = (SurfaceView) findViewById(mImResUtil.getId("device_qrcode_preview_view"));
SurfaceHolder surfaceHolder = surfaceView.getHolder();
surfaceHolder.removeCallback(this);
}
super.onPause();
}
@Override
protected void onDestroy() {
inactivityTimer.shutdown();
super.onDestroy();
}
private void initCamera(SurfaceHolder surfaceHolder) {
if (surfaceHolder == null) {
throw new IllegalStateException("No SurfaceHolder provided");
}
if (cameraManager.isOpen()) {
Log.w(TAG, "initCamera() while already open -- late SurfaceView callback?");
return;
}
try {
cameraManager.openDriver(surfaceHolder);
// Creating the handler starts the preview, which can also throw a RuntimeException.
if (handler == null) {
handler = new CaptureActivityHandler(this, decodeFormats, characterSet, cameraManager);
}
} catch (IOException ioe) {
Log.w(TAG, ioe);
} catch (RuntimeException e) {
// Barcode Scanner has seen crashes in the wild of this variety:
// java.?lang.?RuntimeException: Fail to connect to camera service
Log.e(TAG, "Unexpected error initializing camera", e);
}
}
public void drawViewfinder() {
viewfinderView.drawViewfinder();
}
public void handleDecode(Result rawResult, Bitmap barcode, float scaleFactor) {
Log.d("wxl", "rawResult=" + rawResult);
boolean fromLiveScan = barcode != null;
if (fromLiveScan) {
String resultString = rawResult.getText();
Log.e("wxl", "rawResult=" + rawResult.getText());
beepManager.playBeepSoundAndVibrate();
Intent resultIntent = new Intent();
Bundle bundle = new Bundle();
bundle.putString(Constant.CODED_CONTENT, resultString);
resultIntent.putExtras(bundle);
this.setResult(RESULT_OK, resultIntent);
} else {
// this.setResult(RESULT_OK, resultIntent);
Toast.makeText(CaptureActivity.this, "扫描失败", Toast.LENGTH_SHORT).show();
}
CaptureActivity.this.finish();
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
if (holder == null) {
Log.e(TAG, "*** WARNING *** surfaceCreated() gave us a null surface!");
}
if (!hasSurface) {
hasSurface = true;
initCamera(holder);
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
hasSurface = false;
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
}
/******************** added code: start **************************/
private void switchVisibility(View view, boolean b) {
if (b) {
view.setVisibility(View.VISIBLE);
} else {
view.setVisibility(View.GONE);
}
}
/**
* @param pm the package manager
*
* @return whether the device has a camera LED flash
*/
public static boolean isSupportCameraLedFlash(PackageManager pm) {
if (pm != null) {
FeatureInfo[] features = pm.getSystemAvailableFeatures();
if (features != null) {
for (FeatureInfo f : features) {
if (f != null && PackageManager.FEATURE_CAMERA_FLASH.equals(f.name)) {
return true;
}
}
}
}
return false;
}
/**
* @param flashState flash state constant used to switch the flashlight icon and label
*/
public void switchFlashImg(int flashState) {
if (flashState == Constant.FLASH_OPEN) {
flashLightIv.setImageResource(R.drawable.device_qrcode_scan_flash_on);//ic_open
flashLightTv.setText("关闭闪光灯");
} else {
flashLightIv.setImageResource(R.drawable.device_qrcode_scan_flash_off);//ic_close
flashLightTv.setText("打开闪光灯");
}
}
/******************** Newly added code: end ********************/
}
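For reference, a caller can launch CaptureActivity and read the decoded text roughly as in the sketch below. This is not part of the library code above; the request code is arbitrary, and Constant.CODED_CONTENT is the extra key written by handleDecode() (imports omitted, as elsewhere in this post).
// Hypothetical caller-side sketch.
public class ScanDemoActivity extends Activity {
    private static final int REQUEST_SCAN = 0x01; // arbitrary request code

    private void startScan() {
        startActivityForResult(new Intent(this, CaptureActivity.class), REQUEST_SCAN);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SCAN && resultCode == RESULT_OK && data != null) {
            // CODED_CONTENT is the key put into the result Intent by handleDecode() above.
            String content = data.getStringExtra(Constant.CODED_CONTENT);
            Toast.makeText(this, content, Toast.LENGTH_SHORT).show();
        }
    }
}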
CaptureActivityHandler.java
public final class CaptureActivityHandler extends Handler {
private static final String TAG = CaptureActivityHandler.class.getSimpleName();
private final CaptureActivity activity;
private final DecodeThread decodeThread;
private State state;
private final CameraManager cameraManager;
private final IMResUtil mImResUtil;
// private RelativeLayout mLightLinearLay; // layout holding the flashlight controls
private enum State {
PREVIEW, SUCCESS, DONE
}
public CaptureActivityHandler(CaptureActivity activity, Collection<BarcodeFormat> decodeFormats, String characterSet, CameraManager cameraManager) {
this.activity = activity;
mImResUtil = new IMResUtil(activity);
decodeThread = new DecodeThread(activity, decodeFormats, characterSet, new ViewfinderResultPointCallback(activity.getViewfinderView()));
decodeThread.start();
state = State.SUCCESS;
// Start ourselves capturing previews and decoding.
this.cameraManager = cameraManager;
cameraManager.startPreview();
Log.d(TAG, "CaptureActivityHandler " + CaptureActivityHandler.class.toString());
restartPreviewAndDecode();
}
@Override
public void handleMessage(Message message) {
int what = message.what;
if (what == mImResUtil.getId("device_qrcode_restart_preview")) {
restartPreviewAndDecode();
} else if (what == mImResUtil.getId("device_qrcode_decode_succeeded")) {
state = State.SUCCESS;
Bundle bundle = message.getData();
Bitmap barcode = null;
float scaleFactor = 1.0f;
if (bundle != null) {
byte[] compressedBitmap = bundle.getByteArray(DecodeThread.BARCODE_BITMAP);
if (compressedBitmap != null) {
barcode = BitmapFactory.decodeByteArray(compressedBitmap, 0, compressedBitmap.length, null);
// Mutable copy:
barcode = barcode.copy(Bitmap.Config.ARGB_8888, true);
}
scaleFactor = bundle.getFloat(DecodeThread.BARCODE_SCALED_FACTOR);
}
activity.handleDecode((Result) message.obj, barcode, scaleFactor);
} else if (what == mImResUtil.getId("device_qrcode_decode_failed")) {
state = State.PREVIEW;
cameraManager.requestPreviewFrame(decodeThread.getHandler(), mImResUtil.getId("device_qrcode_decode"));
} else if (what == mImResUtil.getId("device_qrcode_return_scan_result")) {
activity.setResult(Activity.RESULT_OK, (Intent) message.obj);
activity.finish();
} else if (what == mImResUtil.getId("device_qrcode_launch_product_query")) {
String url = (String) message.obj;
Intent intent = new Intent(Intent.ACTION_VIEW);
intent.addFlags(Intent.FLAG_ACTIVITY_CLEAR_WHEN_TASK_RESET);
intent.setData(Uri.parse(url));
ResolveInfo resolveInfo = activity.getPackageManager().resolveActivity(intent, PackageManager.MATCH_DEFAULT_ONLY);
String browserPackageName = null;
if (resolveInfo != null && resolveInfo.activityInfo != null) {
browserPackageName = resolveInfo.activityInfo.packageName;
Log.d(TAG, "Using browser in package " + browserPackageName);
}
// Needed for default Android browser / Chrome only apparently
if ("com.android.browser".equals(browserPackageName) || "com.android.chrome".equals(browserPackageName)) {
intent.setPackage(browserPackageName);
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
intent.putExtra(Browser.EXTRA_APPLICATION_ID, browserPackageName);
}
try {
activity.startActivity(intent);
} catch (ActivityNotFoundException ignored) {
Log.w(TAG, "Can't find anything to handle VIEW of URI " + url);
}
}
}
public void quitSynchronously() {
state = State.DONE;
cameraManager.stopPreview();
Message quit = Message.obtain(decodeThread.getHandler(), mImResUtil.getId("device_qrcode_quit"));
quit.sendToTarget();
try {
// Wait at most half a second; should be enough time, and onPause() will timeout quickly
decodeThread.join(500L);
} catch (InterruptedException e) {
// continue
}
// Be absolutely sure we don't send any queued up messages
removeMessages(mImResUtil.getId("device_qrcode_decode_succeeded"));
removeMessages(mImResUtil.getId("device_qrcode_decode_failed"));
}
private void restartPreviewAndDecode() {
if (state == State.SUCCESS) {
state = State.PREVIEW;
// decodeThread.getHandler() returns the DecodeHandler
cameraManager.requestPreviewFrame(decodeThread.getHandler(), mImResUtil.getId("device_qrcode_decode"));
activity.drawViewfinder();
}
}
}
DecodeFormatManager.java
final class DecodeFormatManager {
private static final Pattern COMMA_PATTERN = Pattern.compile(",");
// static final Set PRODUCT_FORMATS;
// static final Set INDUSTRIAL_FORMATS;
// private static final Set ONE_D_FORMATS;
static final Set<BarcodeFormat> QR_CODE_FORMATS = EnumSet.of(BarcodeFormat.QR_CODE);
// static final Set DATA_MATRIX_FORMATS = EnumSet.of(BarcodeFormat.DATA_MATRIX);
// static final Set AZTEC_FORMATS = EnumSet.of(BarcodeFormat.AZTEC);
// static final Set PDF417_FORMATS = EnumSet.of(BarcodeFormat.PDF_417);
/*static {
PRODUCT_FORMATS = EnumSet.of(BarcodeFormat.UPC_A,
BarcodeFormat.UPC_E,
BarcodeFormat.EAN_13,
BarcodeFormat.EAN_8,
BarcodeFormat.RSS_14,
BarcodeFormat.RSS_EXPANDED);
INDUSTRIAL_FORMATS = EnumSet.of(BarcodeFormat.CODE_39,
BarcodeFormat.CODE_93,
BarcodeFormat.CODE_128,
BarcodeFormat.ITF,
BarcodeFormat.CODABAR);
ONE_D_FORMATS = EnumSet.copyOf(PRODUCT_FORMATS);
ONE_D_FORMATS.addAll(INDUSTRIAL_FORMATS);
}*/
private static final Map<String, Set<BarcodeFormat>> FORMATS_FOR_MODE;
static {
FORMATS_FOR_MODE = new HashMap<>();
// FORMATS_FOR_MODE.put(Intents.Scan.ONE_D_MODE, ONE_D_FORMATS);
// FORMATS_FOR_MODE.put(Intents.Scan.PRODUCT_MODE, PRODUCT_FORMATS);
FORMATS_FOR_MODE.put(Intents.Scan.QR_CODE_MODE, QR_CODE_FORMATS);
// FORMATS_FOR_MODE.put(Intents.Scan.DATA_MATRIX_MODE, DATA_MATRIX_FORMATS);
// FORMATS_FOR_MODE.put(Intents.Scan.AZTEC_MODE, AZTEC_FORMATS);
// FORMATS_FOR_MODE.put(Intents.Scan.PDF417_MODE, PDF417_FORMATS);
}
private DecodeFormatManager() {}
static Set<BarcodeFormat> parseDecodeFormats(Intent intent) {
Iterable<String> scanFormats = null;
CharSequence scanFormatsString = intent.getStringExtra(Intents.Scan.FORMATS);
if (scanFormatsString != null) {
scanFormats = Arrays.asList(COMMA_PATTERN.split(scanFormatsString));
}
return parseDecodeFormats(scanFormats, intent.getStringExtra(Intents.Scan.MODE));
}
static Set<BarcodeFormat> parseDecodeFormats(Uri inputUri) {
List<String> formats = inputUri.getQueryParameters(Intents.Scan.FORMATS);
if (formats != null && formats.size() == 1 && formats.get(0) != null){
formats = Arrays.asList(COMMA_PATTERN.split(formats.get(0)));
}
return parseDecodeFormats(formats, inputUri.getQueryParameter(Intents.Scan.MODE));
}
private static Set<BarcodeFormat> parseDecodeFormats(Iterable<String> scanFormats, String decodeMode) {
if (scanFormats != null) {
Set<BarcodeFormat> formats = EnumSet.noneOf(BarcodeFormat.class);
try {
for (String format : scanFormats) {
formats.add(BarcodeFormat.valueOf(format));
}
return formats;
} catch (IllegalArgumentException iae) {
// ignore it then
}
}
if (decodeMode != null) {
return FORMATS_FOR_MODE.get(decodeMode);
}
return null;
}
}
DecodeHandler.java
final class DecodeHandler extends Handler {
private static final String TAG = DecodeHandler.class.getSimpleName();
public static boolean isWeakLight = false;
private final CaptureActivity activity;
private final MultiFormatReader multiFormatReader;
private boolean running = true;
private final IMResUtil mImResUtil;
DecodeHandler(CaptureActivity activity, Map<DecodeHintType, Object> hints) {
multiFormatReader = new MultiFormatReader();
multiFormatReader.setHints(hints);
this.activity = activity;
mImResUtil = new IMResUtil(activity);
}
@Override
public void handleMessage(Message message) {
if (!running) {
return;
}
int what = message.what;
if (what == mImResUtil.getId("device_qrcode_decode")) {
decode((byte[]) message.obj, message.arg1, message.arg2);
} else if (what == mImResUtil.getId("device_qrcode_quit")) {
isWeakLight = false;
CaptureActivity.isLightOn = false;
running = false;
Looper.myLooper().quit();
}
}
/**
* Decode the data within the viewfinder rectangle, and time how long it took. For efficiency,
* reuse the same reader objects from one decode to the next.
*
* @param data The YUV preview frame.
* @param width The width of the preview frame.
* @param height The height of the preview frame.
*/
private void decode(byte[] data, int width, int height) {
Log.i(TAG, "decode");
// weak-light detection
analysisColor(data, width, height);
long start = System.currentTimeMillis();
Result rawResult = null;
PlanarYUVLuminanceSource source = activity.getCameraManager().buildLuminanceSource(data, width, height);
if (source != null) {
BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
try {
rawResult = multiFormatReader.decodeWithState(bitmap);
} catch (ReaderException re) {
// continue
} finally {
multiFormatReader.reset();
}
}
Handler handler = activity.getHandler(); // the CaptureActivityHandler
Log.d(TAG, "Found handler " + handler);
if (rawResult != null) {
// Don't log the barcode contents for security.
long end = System.currentTimeMillis();
Log.d(TAG, "Found barcode in " + (end - start) + " ms");
if (handler != null) {
// notify the CaptureActivityHandler of the successful decode
Message message = Message.obtain(handler, mImResUtil.getId("device_qrcode_decode_succeeded"), rawResult);
Bundle bundle = new Bundle();
bundleThumbnail(source, bundle);
message.setData(bundle);
message.sendToTarget();
}
} else {
if (handler != null) {
Message message = Message.obtain(handler, mImResUtil.getId("device_qrcode_decode_failed"));
message.sendToTarget();
}
}
}
private static void bundleThumbnail(PlanarYUVLuminanceSource source, Bundle bundle) {
int[] pixels = source.renderThumbnail();
int width = source.getThumbnailWidth();
int height = source.getThumbnailHeight();
Bitmap bitmap = Bitmap.createBitmap(pixels, 0, width, width, height, Bitmap.Config.ARGB_8888);
ByteArrayOutputStream out = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 50, out);
bundle.putByteArray(DecodeThread.BARCODE_BITMAP, out.toByteArray());
bundle.putFloat(DecodeThread.BARCODE_SCALED_FACTOR, (float) width / source.getWidth());
}
private int[] decodeYUV420SP(byte[] yuv420sp, int width, int height) {
final int frameSize = width * height;
int rgb[] = new int[width * height];
for (int j = 0, yp = 0; j < height; j++) {
int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
for (int i = 0; i < width; i++, yp++) {
int y = (0xff & ((int) yuv420sp[yp])) - 16;
if (y < 0) y = 0;
if ((i & 1) == 0) {
v = (0xff & yuv420sp[uvp++]) - 128;
u = (0xff & yuv420sp[uvp++]) - 128;
}
int y1192 = 1192 * y;
int r = (y1192 + 1634 * v);
int g = (y1192 - 833 * v - 400 * u);
int b = (y1192 + 2066 * u);
if (r < 0) r = 0;
else if (r > 262143) r = 262143;
if (g < 0) g = 0;
else if (g > 262143) g = 262143;
if (b < 0) b = 0;
else if (b > 262143) b = 262143;
rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) &
0xff00) | ((b >> 10) & 0xff);
}
}
return rgb;
}
private int getAverageColor(Bitmap bitmap) {
int redBucket = 0;
int greenBucket = 0;
int blueBucket = 0;
int pixelCount = 0;
for (int y = 0; y < bitmap.getHeight(); y++) {
for (int x = 0; x < bitmap.getWidth(); x++) {
int c = bitmap.getPixel(x, y);
pixelCount++;
redBucket += Color.red(c);
greenBucket += Color.green(c);
blueBucket += Color.blue(c);
}
}
int averageColor = Color.rgb(redBucket / pixelCount, greenBucket
/ pixelCount, blueBucket / pixelCount);
return averageColor;
}
// Analyse the ARGB values of the preview frame and take the average (used for weak-light detection)
public void analysisColor(byte[] data, int width, int height) {
int[] rgb = decodeYUV420SP(data, width / 8, height / 8);
Bitmap bmp = Bitmap.createBitmap(rgb, width / 8, height / 8, Bitmap.Config.ARGB_8888); // note: this call has been seen to throw here
if (bmp != null) {
// analyse a 10x10 region taken from the centre of the image
Bitmap resizeBitmap = Bitmap.createBitmap(bmp, bmp.getWidth() / 2, bmp.getHeight() / 2, 10, 10);
float color = (float) getAverageColor(resizeBitmap);
DecimalFormat decimalFormat1 = new DecimalFormat("0.00");
String percent = decimalFormat1.format(color / -16777216);
float floatPercent = Float.parseFloat(percent);
isWeakLight = floatPercent >= 0.99 && floatPercent <= 1.00;
// Log.i(TAG,"isWeakLight "+isWeakLight);
if (null != resizeBitmap) {
resizeBitmap.recycle();
}
bmp.recycle();
}
}
}
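Since the preview frame is already in YUV (NV21) format, the weak-light check above could also be done without the RGB conversion by averaging the Y (luma) plane directly. A minimal alternative sketch, not the code used in this project; the threshold is a guess to be tuned:
// Hypothetical alternative to analysisColor(): average the Y (luma) plane of the NV21 frame.
// The first width*height bytes of an NV21 buffer are luminance; low values mean a dark scene.
private static boolean isFrameDark(byte[] yuvData, int width, int height, int threshold) {
    long sum = 0;
    int frameSize = width * height;
    int step = 8; // sample every 8th pixel to keep the check cheap
    int samples = 0;
    for (int i = 0; i < frameSize; i += step) {
        sum += (yuvData[i] & 0xff);
        samples++;
    }
    int averageLuma = (int) (sum / Math.max(samples, 1));
    return averageLuma < threshold; // e.g. a threshold around 40
}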
DecodeHintManager.java
final class DecodeHintManager {
private static final String TAG = DecodeHintManager.class.getSimpleName();
// This pattern is used in decoding integer arrays.
private static final Pattern COMMA = Pattern.compile(",");
private DecodeHintManager() {}
/**
* Split a query string into a list of name-value pairs.
*
* This is an alternative to the {@link Uri#getQueryParameterNames()} and
* {@link Uri#getQueryParameters(String)}, which are quirky and not suitable
* for exist-only Uri parameters.
*
* This method ignores multiple parameters with the same name and returns the
* first one only. This is technically incorrect, but should be acceptable due
* to the method of processing Hints: no multiple values for a hint.
*
* @param query query to split
* @return name-value pairs
*/
private static Map<String, String> splitQuery(String query) {
Map<String, String> map = new HashMap<>();
int pos = 0;
while (pos < query.length()) {
if (query.charAt(pos) == '&') {
// Skip consecutive ampersand separators.
pos ++;
continue;
}
int amp = query.indexOf('&', pos);
int equ = query.indexOf('=', pos);
if (amp < 0) {
// This is the last element in the query, no more ampersand elements.
String name;
String text;
if (equ < 0) {
// No equal sign
name = query.substring(pos);
name = name.replace('+', ' '); // Preemptively decode +
name = Uri.decode(name);
text = "";
} else {
// Split name and text.
name = query.substring(pos, equ);
name = name.replace('+', ' '); // Preemptively decode +
name = Uri.decode(name);
text = query.substring(equ + 1);
text = text.replace('+', ' '); // Preemptively decode +
text = Uri.decode(text);
}
if (!map.containsKey(name)) {
map.put(name, text);
}
break;
}
if (equ < 0 || equ > amp) {
// No equal sign until the &: this is a simple parameter with no value.
String name = query.substring(pos, amp);
name = name.replace('+', ' '); // Preemptively decode +
name = Uri.decode(name);
if (!map.containsKey(name)) {
map.put(name, "");
}
pos = amp + 1;
continue;
}
String name = query.substring(pos, equ);
name = name.replace('+', ' '); // Preemptively decode +
name = Uri.decode(name);
String text = query.substring(equ+1, amp);
text = text.replace('+', ' '); // Preemptively decode +
text = Uri.decode(text);
if (!map.containsKey(name)) {
map.put(name, text);
}
pos = amp + 1;
}
return map;
}
static Map<DecodeHintType, Object> parseDecodeHints(Uri inputUri) {
String query = inputUri.getEncodedQuery();
if (query == null || query.isEmpty()) {
return null;
}
// Extract parameters
Map<String, String> parameters = splitQuery(query);
Map<DecodeHintType, Object> hints = new EnumMap<>(DecodeHintType.class);
for (DecodeHintType hintType: DecodeHintType.values()) {
if (hintType == DecodeHintType.CHARACTER_SET ||
hintType == DecodeHintType.NEED_RESULT_POINT_CALLBACK ||
hintType == DecodeHintType.POSSIBLE_FORMATS) {
continue; // This hint is specified in another way
}
String parameterName = hintType.name();
String parameterText = parameters.get(parameterName);
if (parameterText == null) {
continue;
}
if (hintType.getValueType().equals(Object.class)) {
// This is an unspecified type of hint content. Use the value as is.
// TODO: Can we make a different assumption on this?
hints.put(hintType, parameterText);
continue;
}
if (hintType.getValueType().equals(Void.class)) {
// Void hints are just flags: use the constant specified by DecodeHintType
hints.put(hintType, Boolean.TRUE);
continue;
}
if (hintType.getValueType().equals(String.class)) {
// A string hint: use the decoded value.
hints.put(hintType, parameterText);
continue;
}
if (hintType.getValueType().equals(Boolean.class)) {
// A boolean hint: a few values for false, everything else is true.
// An empty parameter is simply a flag-style parameter, assuming true
if (parameterText.isEmpty()) {
hints.put(hintType, Boolean.TRUE);
} else if ("0".equals(parameterText) ||
"false".equalsIgnoreCase(parameterText) ||
"no".equalsIgnoreCase(parameterText)) {
hints.put(hintType, Boolean.FALSE);
} else {
hints.put(hintType, Boolean.TRUE);
}
continue;
}
if (hintType.getValueType().equals(int[].class)) {
// An integer array. Used to specify valid lengths.
// Strip a trailing comma as in Java style array initialisers.
if (!parameterText.isEmpty() && parameterText.charAt(parameterText.length() - 1) == ',') {
parameterText = parameterText.substring(0, parameterText.length() - 1);
}
String[] values = COMMA.split(parameterText);
int[] array = new int[values.length];
for (int i = 0; i < values.length; i++) {
try {
array[i] = Integer.parseInt(values[i]);
} catch (NumberFormatException ignored) {
Log.w(TAG, "Skipping array of integers hint " + hintType + " due to invalid numeric value: '" + values[i] + '\'');
array = null;
break;
}
}
if (array != null) {
hints.put(hintType, array);
}
continue;
}
Log.w(TAG, "Unsupported hint type '" + hintType + "' of type " + hintType.getValueType());
}
Log.i(TAG, "Hints from the URI: " + hints);
return hints;
}
static Map<DecodeHintType, Object> parseDecodeHints(Intent intent) {
Bundle extras = intent.getExtras();
if (extras == null || extras.isEmpty()) {
return null;
}
Map<DecodeHintType, Object> hints = new EnumMap<>(DecodeHintType.class);
for (DecodeHintType hintType: DecodeHintType.values()) {
if (hintType == DecodeHintType.CHARACTER_SET ||
hintType == DecodeHintType.NEED_RESULT_POINT_CALLBACK ||
hintType == DecodeHintType.POSSIBLE_FORMATS) {
continue; // This hint is specified in another way
}
String hintName = hintType.name();
if (extras.containsKey(hintName)) {
if (hintType.getValueType().equals(Void.class)) {
// Void hints are just flags: use the constant specified by the DecodeHintType
hints.put(hintType, Boolean.TRUE);
} else {
Object hintData = extras.get(hintName);
if (hintType.getValueType().isInstance(hintData)) {
hints.put(hintType, hintData);
} else {
Log.w(TAG, "Ignoring hint " + hintType + " because it is not assignable from " + hintData);
}
}
}
}
Log.i(TAG, "Hints from the Intent: " + hints);
return hints;
}
}
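As an example, a decode hint can be passed through the launch intent and parseDecodeHints(Intent) above will pick it up, provided your CaptureActivity is actually wired to call it (the CaptureActivity in this post builds its hints inside DecodeThread instead). A hypothetical sketch:
// Hypothetical: ask ZXing to try harder on damaged or blurry codes via an intent extra.
// TRY_HARDER has a Void value type, so the mere presence of the extra turns the hint on.
Intent intent = new Intent(this, CaptureActivity.class);
intent.putExtra(DecodeHintType.TRY_HARDER.name(), Boolean.TRUE);
startActivityForResult(intent, 0x01); // arbitrary request code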
DecodeThread.java
final class DecodeThread extends Thread {
public static final String BARCODE_BITMAP = "barcode_bitmap";
public static final String BARCODE_SCALED_FACTOR = "barcode_scaled_factor";
private final CaptureActivity activity;
private final Map<DecodeHintType, Object> hints;
private Handler handler;
private final CountDownLatch handlerInitLatch;
public DecodeThread(CaptureActivity activity,
Collection<BarcodeFormat> decodeFormats,
String characterSet,
ResultPointCallback resultPointCallback) {
this.activity = activity;
handlerInitLatch = new CountDownLatch(1);
hints = new EnumMap<>(DecodeHintType.class);
// The prefs can't change while the thread is running, so pick them up once here.
if (decodeFormats == null || decodeFormats.isEmpty()) {
// SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(activity);
decodeFormats = EnumSet.noneOf(BarcodeFormat.class);
/*if (prefs.getBoolean(PreferencesActivity.KEY_DECODE_1D_PRODUCT, true)) {
decodeFormats.addAll(DecodeFormatManager.PRODUCT_FORMATS);
}
if (prefs.getBoolean(PreferencesActivity.KEY_DECODE_1D_INDUSTRIAL, true)) {
decodeFormats.addAll(DecodeFormatManager.INDUSTRIAL_FORMATS);
}*/
// if (prefs.getBoolean(PreferencesActivity.KEY_DECODE_QR, true)) {
decodeFormats.addAll(DecodeFormatManager.QR_CODE_FORMATS);
// }
/* if (prefs.getBoolean(PreferencesActivity.KEY_DECODE_DATA_MATRIX, true)) {
decodeFormats.addAll(DecodeFormatManager.DATA_MATRIX_FORMATS);
}
if (prefs.getBoolean(PreferencesActivity.KEY_DECODE_AZTEC, false)) {
decodeFormats.addAll(DecodeFormatManager.AZTEC_FORMATS);
}
if (prefs.getBoolean(PreferencesActivity.KEY_DECODE_PDF417, false)) {
decodeFormats.addAll(DecodeFormatManager.PDF417_FORMATS);
}*/
}
hints.put(DecodeHintType.POSSIBLE_FORMATS, DecodeFormatManager.QR_CODE_FORMATS);
if (characterSet != null) {
hints.put(DecodeHintType.CHARACTER_SET, characterSet);
}
hints.put(DecodeHintType.NEED_RESULT_POINT_CALLBACK, resultPointCallback);
Log.i("DecodeThread", "Hints: " + hints);
}
Handler getHandler() {
try {
handlerInitLatch.await();
} catch (InterruptedException ie) {
// continue?
}
return handler;
}
@Override
public void run() {
Looper.prepare();
handler = new DecodeHandler(activity, hints);
handlerInitLatch.countDown();
Looper.loop();
}
}
ImageUtil.java
public class ImageUtil {
@TargetApi(19)
public static String getImageAbsolutePath(Context context, Uri imageUri) {
if (context == null || imageUri == null)
return null;
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.KITKAT && DocumentsContract.isDocumentUri(context, imageUri)) {
if (isExternalStorageDocument(imageUri)) {
String docId = DocumentsContract.getDocumentId(imageUri);
String[] split = docId.split(":");
String type = split[0];
if ("primary".equalsIgnoreCase(type)) {
return Environment.getExternalStorageDirectory() + "/" + split[1];
}
} else if (isDownloadsDocument(imageUri)) {
String id = DocumentsContract.getDocumentId(imageUri);
Uri contentUri = ContentUris.withAppendedId(Uri.parse("content://downloads/public_downloads"), Long.valueOf(id));
return getDataColumn(context, contentUri, null, null);
} else if (isMediaDocument(imageUri)) {
String docId = DocumentsContract.getDocumentId(imageUri);
String[] split = docId.split(":");
String type = split[0];
Uri contentUri = null;
if ("image".equals(type)) {
contentUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
} else if ("video".equals(type)) {
contentUri = MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
} else if ("audio".equals(type)) {
contentUri = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
}
String selection = MediaStore.Images.Media._ID + "=?";
String[] selectionArgs = new String[]{split[1]};
return getDataColumn(context, contentUri, selection, selectionArgs);
}
} // MediaStore (and general)
else if ("content".equalsIgnoreCase(imageUri.getScheme())) {
// Return the remote address
if (isGooglePhotosUri(imageUri))
return imageUri.getLastPathSegment();
return getDataColumn(context, imageUri, null, null);
}
// File
else if ("file".equalsIgnoreCase(imageUri.getScheme())) {
return imageUri.getPath();
}
return null;
}
public static String getDataColumn(Context context, Uri uri, String selection, String[] selectionArgs) {
Cursor cursor = null;
String column = MediaStore.Images.Media.DATA;
String[] projection = {column};
try {
cursor = context.getContentResolver().query(uri, projection, selection, selectionArgs, null);
if (cursor != null && cursor.moveToFirst()) {
int index = cursor.getColumnIndexOrThrow(column);
return cursor.getString(index);
}
} finally {
if (cursor != null)
cursor.close();
}
return null;
}
/**
* @param uri The Uri to check.
* @return Whether the Uri authority is ExternalStorageProvider.
*/
public static boolean isExternalStorageDocument(Uri uri) {
return "com.android.externalstorage.documents".equals(uri.getAuthority());
}
/**
* @param uri The Uri to check.
* @return Whether the Uri authority is DownloadsProvider.
*/
public static boolean isDownloadsDocument(Uri uri) {
return "com.android.providers.downloads.documents".equals(uri.getAuthority());
}
/**
* @param uri The Uri to check.
* @return Whether the Uri authority is MediaProvider.
*/
public static boolean isMediaDocument(Uri uri) {
return "com.android.providers.media.documents".equals(uri.getAuthority());
}
/**
* @param uri The Uri to check.
* @return Whether the Uri authority is Google Photos.
*/
public static boolean isGooglePhotosUri(Uri uri) {
return "com.google.android.apps.photos.content".equals(uri.getAuthority());
}
}
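With ImageUtil in place, decoding a QR code from a picture chosen in the album can be done roughly as follows. This is a hypothetical helper, not code from the project above: it resolves the content Uri with getImageAbsolutePath() and then feeds the bitmap to ZXing through RGBLuminanceSource.
// Hypothetical helper (could live next to ImageUtil): decode a QR code from a gallery Uri.
public class QRImageDecoder {
    public static String decodeFromUri(Context context, Uri imageUri) {
        String path = ImageUtil.getImageAbsolutePath(context, imageUri);
        if (path == null) {
            return null;
        }
        Bitmap bitmap = BitmapFactory.decodeFile(path);
        if (bitmap == null) {
            return null;
        }
        int width = bitmap.getWidth();
        int height = bitmap.getHeight();
        int[] pixels = new int[width * height];
        bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
        RGBLuminanceSource source = new RGBLuminanceSource(width, height, pixels);
        BinaryBitmap binaryBitmap = new BinaryBitmap(new HybridBinarizer(source));
        Map<DecodeHintType, Object> hints = new EnumMap<>(DecodeHintType.class);
        hints.put(DecodeHintType.CHARACTER_SET, "utf-8");
        try {
            Result result = new QRCodeReader().decode(binaryBitmap, hints);
            return result.getText();
        } catch (ReaderException e) {
            return null; // no QR code found in the image
        } finally {
            bitmap.recycle();
        }
    }
}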
InactivityTimer.java
public final class InactivityTimer {
private static final String TAG = InactivityTimer.class.getSimpleName();
private static final long INACTIVITY_DELAY_MS = 5 * 60 * 1000L;
private final Activity activity;
private final BroadcastReceiver powerStatusReceiver;
private boolean registered;
private AsyncTask<Object, Object, Object> inactivityTask; // generic parameters and field name restored from the stock ZXing InactivityTimer
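The InactivityTimer listing is cut off at this point. For reference, the remainder of the stock ZXing implementation looks roughly like the sketch below (it finishes the activity after five minutes of inactivity while on battery); the version actually used in this project may differ.
public InactivityTimer(Activity activity) {
    this.activity = activity;
    powerStatusReceiver = new PowerStatusReceiver();
    registered = false;
    onActivity();
}
public synchronized void onActivity() {
    cancel();
    inactivityTask = new InactivityAsyncTask();
    inactivityTask.execute();
}
public synchronized void onPause() {
    cancel();
    if (registered) {
        activity.unregisterReceiver(powerStatusReceiver);
        registered = false;
    } else {
        Log.w(TAG, "PowerStatusReceiver was never registered?");
    }
}
public synchronized void onResume() {
    if (registered) {
        Log.w(TAG, "PowerStatusReceiver was already registered?");
    } else {
        activity.registerReceiver(powerStatusReceiver, new IntentFilter(Intent.ACTION_BATTERY_CHANGED));
        registered = true;
    }
    onActivity();
}
private synchronized void cancel() {
    AsyncTask<?, ?, ?> task = inactivityTask;
    if (task != null) {
        task.cancel(true);
        inactivityTask = null;
    }
}
public void shutdown() {
    cancel();
}
private final class PowerStatusReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_BATTERY_CHANGED.equals(intent.getAction())) {
            // A plugged value <= 0 means we are running on battery.
            boolean onBattery = intent.getIntExtra(BatteryManager.EXTRA_PLUGGED, -1) <= 0;
            if (onBattery) {
                InactivityTimer.this.onActivity();
            } else {
                InactivityTimer.this.cancel();
            }
        }
    }
}
private final class InactivityAsyncTask extends AsyncTask<Object, Object, Object> {
    @Override
    protected Object doInBackground(Object... objects) {
        try {
            Thread.sleep(INACTIVITY_DELAY_MS);
            Log.i(TAG, "Finishing activity due to inactivity");
            activity.finish();
        } catch (InterruptedException e) {
            // interrupted: continue without killing the activity
        }
        return null;
    }
}
}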
Intents.java
public final class Intents {
private Intents() {
}
public static final class Scan {
/**
* Send this intent to open the Barcodes app in scanning mode, find a barcode, and return
* the results.
*/
public static final String ACTION = "com.google.zxing.client.android.SCAN";
/**
* By default, sending this will decode all barcodes that we understand. However it
* may be useful to limit scanning to certain formats. Use
* {@link android.content.Intent#putExtra(String, String)} with one of the values below.
*
* Setting this is effectively shorthand for setting explicit formats with {@link #FORMATS}.
* It is overridden by that setting.
*/
public static final String MODE = "SCAN_MODE";
/**
* Decode only UPC and EAN barcodes. This is the right choice for shopping apps which get
* prices, reviews, etc. for products.
*/
public static final String PRODUCT_MODE = "PRODUCT_MODE";
/**
* Decode only 1D barcodes.
*/
public static final String ONE_D_MODE = "ONE_D_MODE";
/**
* Decode only QR codes.
*/
public static final String QR_CODE_MODE = "QR_CODE_MODE";
/**
* Decode only Data Matrix codes.
*/
public static final String DATA_MATRIX_MODE = "DATA_MATRIX_MODE";
/**
* Decode only Aztec.
*/
public static final String AZTEC_MODE = "AZTEC_MODE";
/**
* Decode only PDF417.
*/
public static final String PDF417_MODE = "PDF417_MODE";
/**
* Comma-separated list of formats to scan for. The values must match the names of
* {@link com.google.zxing.BarcodeFormat}s, e.g. {@link com.google.zxing.BarcodeFormat#EAN_13}.
* Example: "EAN_13,EAN_8,QR_CODE". This overrides {@link #MODE}.
*/
public static final String FORMATS = "SCAN_FORMATS";
/**
* Optional parameter to specify the id of the camera from which to recognize barcodes.
* Overrides the default camera that would otherwise would have been selected.
* If provided, should be an int.
*/
public static final String CAMERA_ID = "SCAN_CAMERA_ID";
/**
* @see com.google.zxing.DecodeHintType#CHARACTER_SET
*/
public static final String CHARACTER_SET = "CHARACTER_SET";
/**
* Optional parameters to specify the width and height of the scanning rectangle in pixels.
* The app will try to honor these, but will clamp them to the size of the preview frame.
* You should specify both or neither, and pass the size as an int.
*/
public static final String WIDTH = "SCAN_WIDTH";
public static final String HEIGHT = "SCAN_HEIGHT";
/**
* Desired duration in milliseconds for which to pause after a successful scan before
* returning to the calling intent. Specified as a long, not an integer!
* For example: 1000L, not 1000.
*/
public static final String RESULT_DISPLAY_DURATION_MS = "RESULT_DISPLAY_DURATION_MS";
/**
* Prompt to show on-screen when scanning by intent. Specified as a {@link String}.
*/
public static final String PROMPT_MESSAGE = "PROMPT_MESSAGE";
/**
* If a barcode is found, Barcodes returns {@link android.app.Activity#RESULT_OK} to
* {@link android.app.Activity#onActivityResult(int, int, android.content.Intent)}
* of the app which requested the scan via
* {@link android.app.Activity#startActivityForResult(android.content.Intent, int)}
* The barcodes contents can be retrieved with
* {@link android.content.Intent#getStringExtra(String)}.
* If the user presses Back, the result code will be {@link android.app.Activity#RESULT_CANCELED}.
*/
public static final String RESULT = "SCAN_RESULT";
/**
* Call {@link android.content.Intent#getStringExtra(String)} with {@link #RESULT_FORMAT}
* to determine which barcode format was found.
* See {@link com.google.zxing.BarcodeFormat} for possible values.
*/
public static final String RESULT_FORMAT = "SCAN_RESULT_FORMAT";
/**
* Call {@link android.content.Intent#getStringExtra(String)} with {@link #RESULT_UPC_EAN_EXTENSION}
* to return the content of any UPC extension barcode that was also found. Only applicable
* to {@link com.google.zxing.BarcodeFormat#UPC_A} and {@link com.google.zxing.BarcodeFormat#EAN_13}
* formats.
*/
public static final String RESULT_UPC_EAN_EXTENSION = "SCAN_RESULT_UPC_EAN_EXTENSION";
/**
* Call {@link android.content.Intent#getByteArrayExtra(String)} with {@link #RESULT_BYTES}
* to get a {@code byte[]} of raw bytes in the barcode, if available.
*/
public static final String RESULT_BYTES = "SCAN_RESULT_BYTES";
/**
* Key for the value of {@link com.google.zxing.ResultMetadataType#ORIENTATION}, if available.
* Call {@link android.content.Intent#getIntArrayExtra(String)} with {@link #RESULT_ORIENTATION}.
*/
public static final String RESULT_ORIENTATION = "SCAN_RESULT_ORIENTATION";
/**
* Key for the value of {@link com.google.zxing.ResultMetadataType#ERROR_CORRECTION_LEVEL}, if available.
* Call {@link android.content.Intent#getStringExtra(String)} with {@link #RESULT_ERROR_CORRECTION_LEVEL}.
*/
public static final String RESULT_ERROR_CORRECTION_LEVEL = "SCAN_RESULT_ERROR_CORRECTION_LEVEL";
/**
* Prefix for keys that map to the values of {@link com.google.zxing.ResultMetadataType#BYTE_SEGMENTS},
* if available. The actual values will be set under a series of keys formed by adding 0, 1, 2, ...
* to this prefix. So the first byte segment is under key "SCAN_RESULT_BYTE_SEGMENTS_0" for example.
* Call {@link android.content.Intent#getByteArrayExtra(String)} with these keys.
*/
public static final String RESULT_BYTE_SEGMENTS_PREFIX = "SCAN_RESULT_BYTE_SEGMENTS_";
/**
* Setting this to false will not save scanned codes in the history. Specified as a {@code boolean}.
*/
public static final String SAVE_HISTORY = "SAVE_HISTORY";
private Scan() {
}
}
public static final class History {
public static final String ITEM_NUMBER = "ITEM_NUMBER";
private History() {
}
}
public static final class Encode {
/**
* Send this intent to encode a piece of data as a QR code and display it full screen, so
* that another person can scan the barcode from your screen.
*/
public static final String ACTION = "com.google.zxing.client.android.ENCODE";
/**
* The data to encode. Use {@link android.content.Intent#putExtra(String, String)} or
* {@link android.content.Intent#putExtra(String, android.os.Bundle)},
* depending on the type and format specified. Non-QR Code formats should
* just use a String here. For QR Code, see Contents for details.
*/
public static final String DATA = "ENCODE_DATA";
/**
* The type of data being supplied if the format is QR Code. Use
* {@link android.content.Intent#putExtra(String, String)} with one of {@link Contents.Type}.
*/
public static final String TYPE = "ENCODE_TYPE";
/**
* The barcode format to be displayed. If this isn't specified or is blank,
* it defaults to QR Code. Use {@link android.content.Intent#putExtra(String, String)}, where
* format is one of {@link com.google.zxing.BarcodeFormat}.
*/
public static final String FORMAT = "ENCODE_FORMAT";
/**
* Normally the contents of the barcode are displayed to the user in a TextView. Setting this
* boolean to false will hide that TextView, showing only the encode barcode.
*/
public static final String SHOW_CONTENTS = "ENCODE_SHOW_CONTENTS";
private Encode() {
}
}
public static final class SearchBookContents {
/**
* Use Google Book Search to search the contents of the book provided.
*/
public static final String ACTION = "com.google.zxing.client.android.SEARCH_BOOK_CONTENTS";
/**
* The book to search, identified by ISBN number.
*/
public static final String ISBN = "ISBN";
/**
* An optional field which is the text to search for.
*/
public static final String QUERY = "QUERY";
private SearchBookContents() {
}
}
public static final class WifiConnect {
/**
* Internal intent used to trigger connection to a wi-fi network.
*/
public static final String ACTION = "com.google.zxing.client.android.WIFI_CONNECT";
/**
* The network to connect to, all the configuration provided here.
*/
public static final String SSID = "SSID";
/**
* The network to connect to, all the configuration provided here.
*/
public static final String TYPE = "TYPE";
/**
* The network to connect to, all the configuration provided here.
*/
public static final String PASSWORD = "PASSWORD";
private WifiConnect() {
}
}
public static final class Share {
/**
* Give the user a choice of items to encode as a barcode, then render it as a QR Code and
* display onscreen for a friend to scan with their phone.
*/
public static final String ACTION = "com.google.zxing.client.android.SHARE";
private Share() {
}
}
}
ViewfinderResultPointCallback.java
final class ViewfinderResultPointCallback implements ResultPointCallback {
private final ViewfinderView viewfinderView;
ViewfinderResultPointCallback(ViewfinderView viewfinderView) {
this.viewfinderView = viewfinderView;
}
@Override
public void foundPossibleResultPoint(ResultPoint point) {
viewfinderView.addPossibleResultPoint(point);
}
}
CodeCreator.java
public class CodeCreator {
/* logo */
private static Bitmap logoBitmap;
/* Generate a QR code, optionally with a logo drawn in the centre */
public static Bitmap createQRCode(String content, int w, int h, Bitmap logo) throws WriterException {
if (TextUtils.isEmpty(content)) {
return null;
}
/* offsets of the logo inside the code */
int offsetX = w / 2;
int offsetY = h / 2;
/* scale the logo to one fifth of the code size */
if (logo != null) {
Matrix matrix = new Matrix();
float scaleFactor = Math.min(w * 1.0f / 5 / logo.getWidth(), h * 1.0f / 5 / logo.getHeight());
matrix.postScale(scaleFactor, scaleFactor);
logoBitmap = Bitmap.createBitmap(logo, 0, 0, logo.getWidth(), logo.getHeight(), matrix, true);
}
/* If the logo is not null, recompute the offsets */
int logoW = 0;
int logoH = 0;
if (logoBitmap != null) {
logoW = logoBitmap.getWidth();
logoH = logoBitmap.getHeight();
offsetX = (w - logoW) / 2;
offsetY = (h - logoH) / 2;
}
/* Encode as UTF-8 */
Hashtable<EncodeHintType, Object> hints = new Hashtable<>();
hints.put(EncodeHintType.CHARACTER_SET, "utf-8");
// error correction level
hints.put(EncodeHintType.ERROR_CORRECTION, ErrorCorrectionLevel.H);
// width of the quiet-zone margin
hints.put(EncodeHintType.MARGIN, 0);
// Generate the bit matrix at the target size; do not scale the bitmap afterwards, as scaling blurs it and breaks recognition
BitMatrix matrix = new MultiFormatWriter().encode(content,
BarcodeFormat.QR_CODE, w, h, hints);
// Flatten the 2D bit matrix into a 1D pixel array, row by row
int[] pixels = new int[w * h];
for (int y = 0; y < h; y++) {
for (int x = 0; x < w; x++) {
if(x >= offsetX && x < offsetX + logoW && y>= offsetY && y < offsetY + logoH){
int pixel = logoBitmap.getPixel(x-offsetX,y-offsetY);
if(pixel == 0){
if(matrix.get(x, y)){
pixel = 0xff000000;
}else{
pixel = 0xffffffff;
}
}
pixels[y * w + x] = pixel;
}else{
if (matrix.get(x, y)) {
pixels[y * w + x] = 0xff000000;
} else {
pixels[y * w + x] = 0xffffffff;
}
}
}
}
Bitmap bitmap = Bitmap.createBitmap(w, h,
Bitmap.Config.ARGB_8888);
bitmap.setPixels(pixels, 0, w, 0, 0, w, h);
return bitmap;
}
}
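A hypothetical usage of CodeCreator.createQRCode(); imageView and the logo resource are placeholders, and logo may simply be null if no centre logo is wanted:
// Hypothetical usage sketch: generate a 400x400 QR code with an optional centre logo.
Bitmap logo = BitmapFactory.decodeResource(getResources(), R.drawable.ic_launcher); // placeholder logo, may be null
try {
    Bitmap qr = CodeCreator.createQRCode("ABCD1234567", 400, 400, logo);
    if (qr != null) {
        imageView.setImageBitmap(qr); // imageView is a placeholder ImageView
    }
} catch (WriterException e) {
    e.printStackTrace();
}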
QRCodeEncoder.java
public final class QRCodeEncoder {
private static final String TAG = QRCodeEncoder.class.getSimpleName();
private static final int WHITE = 0xFFFFFFFF;
private static final int BLACK = 0xFF000000;
public static Bitmap encodeAsBitmap(String contents, int dimension) throws WriterException {
String contentsToEncode = contents;
if (contentsToEncode == null) {
return null;
}
Map<EncodeHintType, Object> hints = null;
String encoding = guessAppropriateEncoding(contentsToEncode);
if (encoding != null) {
hints = new EnumMap<>(EncodeHintType.class);
hints.put(EncodeHintType.CHARACTER_SET, encoding);
}
BitMatrix result;
try {
result = new MultiFormatWriter().encode(contentsToEncode, BarcodeFormat.QR_CODE, dimension, dimension, hints);
} catch (IllegalArgumentException iae) {
// Unsupported format
return null;
}
int width = result.getWidth();
int height = result.getHeight();
int[] pixels = new int[width * height];
for (int y = 0; y < height; y++) {
int offset = y * width;
for (int x = 0; x < width; x++) {
pixels[offset + x] = result.get(x, y) ? BLACK : WHITE;
}
}
Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bitmap.setPixels(pixels, 0, width, 0, 0, width, height);
return bitmap;
}
private static String guessAppropriateEncoding(CharSequence contents) {
// Very crude at the moment
for (int i = 0; i < contents.length(); i++) {
if (contents.charAt(i) > 0xFF) {
return "UTF-8";
}
}
return null;
}
}
ViewfinderView.java
public final class ViewfinderView extends View{
private static final long ANIMATION_DELAY = 10;
private static final int CORNER_RECT_HEIGHT = 40;
private static final int CORNER_RECT_WIDTH = 8;
private static final int OPAQUE = 255;
private static final int[] SCANNER_ALPHA = new int[]{0, 64, 128, 192, OPAQUE, 192, 128, 64};
private static final int SCANNER_LINE_HEIGHT = 10;
private static final int SCANNER_LINE_MOVE_DISTANCE = 5;
public static int scannerEnd = 0;
public static int scannerStart = 0;
private CameraManager cameraManager;
private final int cornerColor;
private final int frameColor;
private String labelText;
private int labelTextColor;
private float labelTextSize;
private final int laserColor;
private Collection<ResultPoint> lastPossibleResultPoints;
private final IMResUtil mImResUtil;
private final int maskColor;
private final Paint paint = new Paint();
private Collection<ResultPoint> possibleResultPoints;
private Bitmap resultBitmap;
private final int resultColor;
private final int resultPointColor;
private int scannerAlpha;
public ViewfinderView(Context context, AttributeSet attrs) {
super(context, attrs);
this.mImResUtil = new IMResUtil(context);
TypedArray array = context.obtainStyledAttributes(attrs, this.mImResUtil.getStyleableArray("ViewfinderView"));
this.laserColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_laser_color"), 65280);
this.cornerColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_corner_color"), 65280);
this.frameColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_frame_color"), 16777215);
this.resultPointColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_result_point_color"), -1056964864);
this.maskColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_mask_color"), 1610612736);
this.resultColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_result_color"), -1342177280);
this.labelTextColor = array.getColor(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_label_text_color"), -1862270977);
this.labelText = array.getString(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_label_text"));
this.labelTextSize = (float) array.getDimensionPixelSize(this.mImResUtil.getStyleable("ViewfinderView_device_qrcode_label_text_size"), (int) TypedValue.applyDimension(2, 16.0f, getResources().getDisplayMetrics()));
this.paint.setAntiAlias(true);
this.scannerAlpha = 0;
this.possibleResultPoints = new HashSet<>(5);
}
public void setCameraManager(CameraManager cameraManager) {
this.cameraManager = cameraManager;
}
public void onDraw(Canvas canvas) {
if (this.cameraManager == null) {
return; // not ready yet: onDraw() may run before setCameraManager() is called
}
Rect frame = this.cameraManager.getFramingRect();
if (frame != null) {
if (scannerStart == 0 || scannerEnd == 0) {
scannerStart = frame.top;
scannerEnd = frame.bottom;
}
drawExterior(canvas, frame, canvas.getWidth(), canvas.getHeight());
if (this.resultBitmap != null) {
this.paint.setAlpha(OPAQUE);
canvas.drawBitmap(this.resultBitmap, (float) frame.left, (float) frame.top, this.paint);
return;
}
drawFrame(canvas, frame);
drawCorner(canvas, frame);
drawLaserScanner(canvas, frame);
drawTextInfo(canvas, frame);
Collection<ResultPoint> currentPossible = this.possibleResultPoints;
Collection<ResultPoint> currentLast = this.lastPossibleResultPoints;
if (currentPossible.isEmpty()) {
this.lastPossibleResultPoints = null;
} else {
this.possibleResultPoints = new HashSet<>(5);
this.lastPossibleResultPoints = currentPossible;
this.paint.setAlpha(OPAQUE);
this.paint.setColor(this.resultPointColor);
for (ResultPoint point : currentPossible) {
canvas.drawCircle(((float) frame.left) + point.getX(), ((float) frame.top) + point.getY(), 6.0f, this.paint);
}
}
if (currentLast != null) {
this.paint.setAlpha(127);
this.paint.setColor(this.resultPointColor);
for (ResultPoint point2 : currentLast) {
canvas.drawCircle(((float) frame.left) + point2.getX(), ((float) frame.top) + point2.getY(), 3.0f, this.paint);
}
}
postInvalidateDelayed(ANIMATION_DELAY, frame.left, frame.top, frame.right, frame.bottom);
}
}
private void drawTextInfo(Canvas canvas, Rect frame) {
this.paint.setColor(this.labelTextColor);
this.paint.setTextSize(TypedValue.applyDimension(2, this.labelTextSize, getResources().getDisplayMetrics()));
this.paint.setTextAlign(Align.CENTER);
// draw the label text centred below the framing rect (placement assumed; the draw call was missing)
if (!TextUtils.isEmpty(this.labelText)) {
canvas.drawText(this.labelText, (float) (frame.left + frame.width() / 2), (float) (frame.bottom + CORNER_RECT_HEIGHT), this.paint);
}
}
private void drawCorner(Canvas canvas, Rect frame) {
this.paint.setColor(this.cornerColor);
canvas.drawRect((float) frame.left, (float) frame.top, (float) (frame.left + CORNER_RECT_WIDTH), (float) (frame.top + CORNER_RECT_HEIGHT), this.paint);
canvas.drawRect((float) frame.left, (float) frame.top, (float) (frame.left + CORNER_RECT_HEIGHT), (float) (frame.top + CORNER_RECT_WIDTH), this.paint);
canvas.drawRect((float) (frame.right - 8), (float) frame.top, (float) frame.right, (float) (frame.top + CORNER_RECT_HEIGHT), this.paint);
canvas.drawRect((float) (frame.right - 40), (float) frame.top, (float) frame.right, (float) (frame.top + CORNER_RECT_WIDTH), this.paint);
canvas.drawRect((float) frame.left, (float) (frame.bottom - 8), (float) (frame.left + CORNER_RECT_HEIGHT), (float) frame.bottom, this.paint);
canvas.drawRect((float) frame.left, (float) (frame.bottom - 40), (float) (frame.left + CORNER_RECT_WIDTH), (float) frame.bottom, this.paint);
canvas.drawRect((float) (frame.right - 8), (float) (frame.bottom - 40), (float) frame.right, (float) frame.bottom, this.paint);
canvas.drawRect((float) (frame.right - 40), (float) (frame.bottom - 8), (float) frame.right, (float) frame.bottom, this.paint);
}
private void drawLaserScanner(Canvas canvas, Rect frame) {
this.paint.setColor(this.laserColor);
LinearGradient linearGradient = new LinearGradient((float) frame.left, (float) scannerStart, (float) frame.left, (float) (scannerStart + 10), shadeColor(this.laserColor), this.laserColor, TileMode.MIRROR);
RadialGradient radialGradient = new RadialGradient((float) (frame.left + (frame.width() / 2)), (float) (scannerStart + 5), 360.0f, this.laserColor, shadeColor(this.laserColor), TileMode.MIRROR);
SweepGradient sweepGradient = new SweepGradient((float) (frame.left + (frame.width() / 2)), (float) (scannerStart + 10), shadeColor(this.laserColor), this.laserColor);
ComposeShader composeShader = new ComposeShader(radialGradient, linearGradient, Mode.ADD);
this.paint.setShader(radialGradient);
if (scannerStart <= scannerEnd) {
canvas.drawOval(new RectF((float) (frame.left + 20), (float) scannerStart, (float) (frame.right - 20), (float) (scannerStart + 10)), this.paint);
scannerStart += 5;
} else {
scannerStart = frame.top;
}
this.paint.setShader(null);
}
public int shadeColor(int color) {
return Integer.valueOf("20" + Integer.toHexString(color).substring(2), 16).intValue();
}
private void drawFrame(Canvas canvas, Rect frame) {
this.paint.setColor(this.frameColor);
canvas.drawRect((float) frame.left, (float) frame.top, (float) (frame.right + 1), (float) (frame.top + 2), this.paint);
canvas.drawRect((float) frame.left, (float) (frame.top + 2), (float) (frame.left + 2), (float) (frame.bottom - 1), this.paint);
canvas.drawRect((float) (frame.right - 1), (float) frame.top, (float) (frame.right + 1), (float) (frame.bottom - 1), this.paint);
canvas.drawRect((float) frame.left, (float) (frame.bottom - 1), (float) (frame.right + 1), (float) (frame.bottom + 1), this.paint);
}
private void drawExterior(Canvas canvas, Rect frame, int width, int height) {
this.paint.setColor(this.resultBitmap != null ? this.resultColor : this.maskColor);
canvas.drawRect(0.0f, 0.0f, (float) width, (float) frame.top, this.paint);
canvas.drawRect(0.0f, (float) frame.top, (float) frame.left, (float) (frame.bottom + 1), this.paint);
canvas.drawRect((float) (frame.right + 1), (float) frame.top, (float) width, (float) (frame.bottom + 1), this.paint);
canvas.drawRect(0.0f, (float) (frame.bottom + 1), (float) width, (float) height, this.paint);
}
public void drawViewfinder() {
this.resultBitmap = null;
invalidate();
}
public void drawResultBitmap(Bitmap barcode) {
this.resultBitmap = barcode;
invalidate();
}
public void addPossibleResultPoint(ResultPoint point) {
this.possibleResultPoints.add(point);
}
public void setLabelText(String labelText) {
this.labelText = labelText;
}
public void setLabelTextColor(int labelTextColor) {
this.labelTextColor = labelTextColor;
}
public void setLabelTextSize(float labelTextSize) {
this.labelTextSize = labelTextSize;
}
}
FileUtil.java
public class FileUtil {
private static String TAG = "FileUtil";
public static String randomFileName(String ext) {
return Long.toString(System.currentTimeMillis()) + ext;
}
public static boolean deleteFile(String name) {
File file = new File(name);
if (file.exists()) {
return file.delete();
}
return false;
}
@SuppressWarnings("deprecation")
public static String rotateAndSaveBitmap(File file, int outW, int outH, String outPath, Bitmap.CompressFormat format, int quality) {
int originW;
int originH;
int orientation = ExifInterface.ORIENTATION_NORMAL;
try {
ExifInterface exif = new ExifInterface(file.getAbsolutePath());
orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);
int w = exif.getAttributeInt(ExifInterface.TAG_IMAGE_WIDTH, 0);
int h = exif.getAttributeInt(ExifInterface.TAG_IMAGE_LENGTH, 0);
switch (orientation) {
case ExifInterface.ORIENTATION_UNDEFINED:
return null;
case ExifInterface.ORIENTATION_ROTATE_90:
case ExifInterface.ORIENTATION_ROTATE_270:
originW = h;
originH = w;
break;
default:
originW = w;
originH = h;
break;
}
} catch (Exception e) {
return null;
}
// boost decode bitmap performance
int sampleSize;
if (outW <= 0 && outH <= 0) {
sampleSize = 1;
outW = originW;
outH = originH;
} else if (outW <= 0 || outH <= 0) {
if (outW <= 0) {
sampleSize = originH / outH;
outW = (int) (originW * (outH / (float) originH));
} else {
sampleSize = originW / outW;
outH = (int) (originH * (outW / (float) originW));
}
} else {
sampleSize = Math.min(originW / outW, originH / outH);
}
BitmapFactory.Options bmOptions = new BitmapFactory.Options();
bmOptions.inJustDecodeBounds = false;
bmOptions.inSampleSize = sampleSize;
bmOptions.inPurgeable = true;
Bitmap bmp = BitmapFactory.decodeFile(file.getAbsolutePath(), bmOptions);
float scaleX = outW;
float scaleY = outH;
Matrix matrix = new Matrix();
switch (orientation) {
case ExifInterface.ORIENTATION_NORMAL:
scaleX /= bmp.getWidth();
scaleY /= bmp.getHeight();
matrix.postScale(scaleX, scaleY);
break;
case ExifInterface.ORIENTATION_FLIP_HORIZONTAL:
scaleX /= bmp.getWidth();
scaleY /= bmp.getHeight();
matrix.setScale(-1, 1);
matrix.postScale(scaleX, scaleY);
break;
case ExifInterface.ORIENTATION_ROTATE_180:
scaleX /= bmp.getWidth();
scaleY /= bmp.getHeight();
matrix.setRotate(180);
matrix.postScale(scaleX, scaleY);
break;
case ExifInterface.ORIENTATION_FLIP_VERTICAL:
scaleX /= bmp.getWidth();
scaleY /= bmp.getHeight();
matrix.setRotate(180);
matrix.postScale(-1, 1);
matrix.postScale(scaleX, scaleY);
break;
case ExifInterface.ORIENTATION_TRANSPOSE:
scaleX /= bmp.getWidth();
scaleY /= bmp.getHeight();
matrix.setRotate(90);
matrix.postScale(-1, 1);
matrix.postScale(scaleX, scaleY);
break;
case ExifInterface.ORIENTATION_ROTATE_90:
scaleX /= bmp.getHeight();
scaleY /= bmp.getWidth();
matrix.setRotate(90);
matrix.postScale(scaleX, scaleY);
break;
case ExifInterface.ORIENTATION_TRANSVERSE:
scaleX /= bmp.getWidth();
scaleY /= bmp.getHeight();
matrix.setRotate(-90);
matrix.postScale(-1, 1);
matrix.postScale(scaleX, scaleY);
break;
case ExifInterface.ORIENTATION_ROTATE_270:
scaleX /= bmp.getHeight();
scaleY /= bmp.getWidth();
matrix.setRotate(-90);
matrix.postScale(scaleX, scaleY);
break;
default:
break;
}
Bitmap bmpNew = Bitmap.createBitmap(bmp, 0, 0, bmp.getWidth(), bmp.getHeight(), matrix, true);
String newFile = bmp2File(bmpNew, outPath, format, quality);
bmpNew.recycle();
return newFile;
}
private static Point getBitmapSize(String file) {
BitmapFactory.Options bmOptions = new BitmapFactory.Options();
bmOptions.inJustDecodeBounds = true;
BitmapFactory.decodeFile(file, bmOptions);
int originW = bmOptions.outWidth;
int originH = bmOptions.outHeight;
return new Point(originW, originH);
}
@SuppressWarnings("deprecation")
public static String saveBitmap(File f, int outW, int outH, String outPath, Bitmap.CompressFormat format, int quality) {
String rotateFile = rotateAndSaveBitmap(f, outW, outH, outPath, format, quality);
if (null != rotateFile)
return rotateFile;
if (f == null) return "";
String file = f.getAbsolutePath();
// Get the dimensions of the bitmap
Point size = getBitmapSize(file);
int originW = size.x;
int originH = size.y;
// Determine how much to scale down the image
int scaleFactor;
if (outW <= 0 && outH <= 0) {
scaleFactor = 1;
outW = originW;
outH = originH;
} else if (outW <= 0 || outH <= 0) {
if (outW <= 0) {
scaleFactor = originH / outH;
outW = (int) (originW * (outH / (float) originH));
} else {
scaleFactor = originW / outW;
outH = (int) (originH * (outW / (float) originW));
}
} else {
scaleFactor = Math.min(originW / outW, originH / outH);
}
// Decode the image file into a Bitmap sized to fit out size
BitmapFactory.Options bmOptions = new BitmapFactory.Options();
bmOptions.inJustDecodeBounds = false;
bmOptions.inSampleSize = scaleFactor;
bmOptions.inPurgeable = true;
Bitmap bmp = BitmapFactory.decodeFile(file, bmOptions);
String newFile;
if (bmp == null) {
return "";
}
if (bmp.getWidth() != outW && bmp.getHeight() != outH) {
Bitmap bmpNew = Bitmap.createScaledBitmap(bmp, outW, outH, true);
newFile = bmp2File(bmpNew, outPath, format, quality);
bmpNew.recycle();
} else {
newFile = bmp2File(bmp, outPath, format, quality);
}
bmp.recycle();
return newFile;
}
public static Point calcScaleSize(String file, int outW, int outH) {
int originW = 0;
int originH = 0;
try {
ExifInterface exif = new ExifInterface(file);
int orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);
int w = exif.getAttributeInt(ExifInterface.TAG_IMAGE_WIDTH, 0);
int h = exif.getAttributeInt(ExifInterface.TAG_IMAGE_LENGTH, 0);
switch (orientation) {
case ExifInterface.ORIENTATION_UNDEFINED:
break;
case ExifInterface.ORIENTATION_ROTATE_90:
case ExifInterface.ORIENTATION_ROTATE_270:
originW = h;
originH = w;
break;
default:
originW = w;
originH = h;
break;
}
} catch (Exception e) {
}
if (0 == originW || 0 == originH) {
Point size = getBitmapSize(file);
originW = size.x;
originH = size.y;
}
if (outW <= 0 && outH <= 0) {
outW = originW;
outH = originH;
} else if (outW <= 0 || outH <= 0) {
if (outW <= 0) {
outW = (int) (originW * (outH / (float) originH));
} else {
outH = (int) (originH * (outW / (float) originW));
}
} else {
}
return new Point(outW, outH);
}
public static String bmp2File(Bitmap bmp, String filePath, Bitmap.CompressFormat format, int quality) {
String fileName = FileUtil.randomFileName(Bitmap.CompressFormat.PNG == format ? ".png" : ".jpg");
String fullPath = filePath + fileName;
deleteFile(fullPath); // delete any existing file at the full output path, not just the bare file name
File file = new File(fullPath);
try {
file.createNewFile();
} catch (IOException e) {
e.printStackTrace();
return null;
}
FileOutputStream fOut = null;
try {
fOut = new FileOutputStream(file);
} catch (FileNotFoundException e) {
e.printStackTrace();
return null;
}
bmp.compress(format, quality, fOut);
try {
fOut.flush();
fOut.close();
fOut = null;
} catch (IOException e) {
e.printStackTrace();
return null;
} finally {
try {
if (null != fOut) {
fOut.close();
fOut = null;
}
} catch (Exception e) {
e.printStackTrace();
return null;
}
}
return fileName;
}
public static final String insertImage(ContentResolver cr, String imagePath, String name, String description) throws FileNotFoundException {
FileInputStream inStream = new FileInputStream(imagePath);
return insertImage(cr, inStream, name, description);
}
/**
* A copy of the Android internals insertImage method, this method populates
* the meta data with DATE_ADDED and DATE_TAKEN. This fixes a common problem
* where media that is inserted manually gets saved at the end of the
* gallery (because date is not populated).
*
* @see Images.Media#insertImage(ContentResolver,
* Bitmap, String, String)
*/
public static final String insertImage(ContentResolver cr, FileInputStream inStream, String title, String description) {
ContentValues values = new ContentValues();
values.put(Images.Media.TITLE, title);
values.put(Images.Media.DISPLAY_NAME, title);
values.put(Images.Media.DESCRIPTION, description);
values.put(Images.Media.MIME_TYPE, "image/jpeg");
// Add the date meta data to ensure the image is added at the front of
// the gallery
values.put(Images.Media.DATE_ADDED, System.currentTimeMillis());
values.put(Images.Media.DATE_TAKEN, System.currentTimeMillis());
Uri url = null;
String stringUrl = null; /* value to be returned */
try {
url = cr.insert(Images.Media.EXTERNAL_CONTENT_URI, values);
if (inStream != null) {
OutputStream imageOut = cr.openOutputStream(url);
byte[] buffer = new byte[1024];
int count = 0;
while ((count = inStream.read(buffer)) >= 0) {
imageOut.write(buffer, 0, count);
}
imageOut.flush();
inStream.close();
long id = ContentUris.parseId(url);
// Wait until MINI_KIND thumbnail is generated.
Bitmap miniThumb = Images.Thumbnails.getThumbnail(cr, id, Images.Thumbnails.MINI_KIND, null);
// This is for backward compatibility.
storeThumbnail(cr, miniThumb, id, 50F, 50F, Images.Thumbnails.MICRO_KIND);
} else {
cr.delete(url, null, null);
url = null;
}
} catch (Exception e) {
e.printStackTrace();
if (url != null) {
cr.delete(url, null, null);
url = null;
}
}
if (url != null) {
stringUrl = url.toString();
}
return stringUrl;
}
/**
* A copy of the Android internals StoreThumbnail method, it used with the
* insertImage to populate the
* android.provider.MediaStore.Images.Media#insertImage with all the correct
* meta data. The StoreThumbnail method is private so it must be duplicated
* here.
*
* @see Images.Media (StoreThumbnail private
* method)
*/
private static final Bitmap storeThumbnail(ContentResolver cr, Bitmap source, long id, float width, float height, int kind) {
// create the matrix to scale it
Matrix matrix = new Matrix();
float scaleX = width / source.getWidth();
float scaleY = height / source.getHeight();
matrix.setScale(scaleX, scaleY);
Bitmap thumb = Bitmap.createBitmap(source, 0, 0, source.getWidth(), source.getHeight(), matrix, true);
ContentValues values = new ContentValues(4);
values.put(Images.Thumbnails.KIND, kind);
values.put(Images.Thumbnails.IMAGE_ID, (int) id);
values.put(Images.Thumbnails.HEIGHT, thumb.getHeight());
values.put(Images.Thumbnails.WIDTH, thumb.getWidth());
Uri url = cr.insert(Images.Thumbnails.EXTERNAL_CONTENT_URI, values);
try {
OutputStream thumbOut = cr.openOutputStream(url);
thumb.compress(Bitmap.CompressFormat.JPEG, 100, thumbOut);
thumbOut.close();
return thumb;
} catch (FileNotFoundException ex) {
return null;
} catch (IOException ex) {
return null;
}
}
// /////////////////////////////////////////////////
@SuppressLint("NewApi")
public static String checkPicturePath(final Context context, final Uri uri) {
final boolean isKitKat = Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT;
// DocumentProvider
if (isKitKat && DocumentsContract.isDocumentUri(context, uri)) {
// ExternalStorageProvider
if (isExternalStorageDocument(uri)) {
final String docId = DocumentsContract.getDocumentId(uri);
final String[] split = docId.split(":");
final String type = split[0];
if ("primary".equalsIgnoreCase(type)) {
return Environment.getExternalStorageDirectory() + "/" + split[1];
}
}
// DownloadsProvider
else if (isDownloadsDocument(uri)) {
final String id = DocumentsContract.getDocumentId(uri);
final Uri contentUri = ContentUris.withAppendedId(Uri.parse("content://downloads/public_downloads"), Long.valueOf(id));
return getDataColumn(context, contentUri, null, null);
}
// MediaProvider
else if (isMediaDocument(uri)) {
final String docId = DocumentsContract.getDocumentId(uri);
final String[] split = docId.split(":");
final String type = split[0];
Uri contentUri = null;
if ("image".equals(type)) {
contentUri = Images.Media.EXTERNAL_CONTENT_URI;
} else if ("video".equals(type)) {
contentUri = MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
} else if ("audio".equals(type)) {
contentUri = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI;
}
final String selection = "_id=?";
final String[] selectionArgs = new String[]{split[1]};
return getDataColumn(context, contentUri, selection, selectionArgs);
}
}
// MediaStore (and general)
else if ("content".equalsIgnoreCase(uri.getScheme())) {
// Return the remote address
if (isGooglePhotosUri(uri))
return uri.getLastPathSegment();
return getDataColumn(context, uri, null, null);
}
// File
else if ("file".equalsIgnoreCase(uri.getScheme())) {
return uri.getPath();
}
return null;
}
/**
* Get the value of the data column for this Uri. This is useful for
* MediaStore Uris, and other file-based ContentProviders.
*
* @param context The context.
* @param uri The Uri to query.
* @param selection (Optional) Filter used in the query.
* @param selectionArgs (Optional) Selection arguments used in the query.
* @return The value of the _data column, which is typically a file path.
*/
private static String getDataColumn(Context context, Uri uri, String selection, String[] selectionArgs) {
Cursor cursor = null;
final String column = "_data";
final String[] projection = {column};
try {
cursor = context.getContentResolver().query(uri, projection, selection, selectionArgs, null);
if (cursor != null && cursor.moveToFirst()) {
final int index = cursor.getColumnIndexOrThrow(column);
return cursor.getString(index);
}
} catch (Exception e) {
e.printStackTrace();
} finally {
if (cursor != null)
cursor.close();
}
return null;
}
/**
* @param uri The Uri to check.
* @return Whether the Uri authority is ExternalStorageProvider.
*/
private static boolean isExternalStorageDocument(Uri uri) {
return "com.android.externalstorage.documents".equals(uri.getAuthority());
}
/**
* @param uri The Uri to check.
* @return Whether the Uri authority is DownloadsProvider.
*/
private static boolean isDownloadsDocument(Uri uri) {
return "com.android.providers.downloads.documents".equals(uri.getAuthority());
}
/**
* @param uri The Uri to check.
* @return Whether the Uri authority is MediaProvider.
*/
private static boolean isMediaDocument(Uri uri) {
return "com.android.providers.media.documents".equals(uri.getAuthority());
}
/**
* @param uri The Uri to check.
* @return Whether the Uri authority is Google Photos.
*/
private static boolean isGooglePhotosUri(Uri uri) {
return "com.google.android.apps.photos.content".equals(uri.getAuthority());
}
}
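For reference, below is a hedged sketch of how checkPicturePath could be wired into the "choose from album" flow; the class name ImageUtil, the request code, and the decode step are assumptions and not part of the code above:
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.provider.MediaStore;

public class AlbumPickExample extends Activity {
private static final int REQUEST_CODE_ALBUM = 0x101; // assumed request code

/** Open the system gallery so the user can pick a QR-code picture. */
private void pickFromAlbum() {
Intent intent = new Intent(Intent.ACTION_PICK, MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(intent, REQUEST_CODE_ALBUM);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQUEST_CODE_ALBUM && resultCode == RESULT_OK && data != null) {
Uri uri = data.getData();
// "ImageUtil" is an assumed name for the utility class listed above
String path = ImageUtil.checkPicturePath(this, uri);
if (path != null) {
// hand the file path to the Zxing bitmap decoder here
}
}
}
}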
IMResUtil.java
public class IMResUtil {
private static final String TAG = "IMResUtil";
private static IMResUtil instance;
private Context context;
private static Class<?> id = null;
private static Class<?> layout = null;
private static Class<?> style = null;
private static Class<?> attr = null;
private static Class<?> styleable = null;
public IMResUtil(Context paramContext) {
this.context = paramContext.getApplicationContext();
String packageName = this.context.getPackageName();
try {
layout = Class.forName(packageName + ".R$layout");
} catch (ClassNotFoundException e) {
Log.e(TAG, "R$layout not found: " + e.toString());
}
try {
id = Class.forName(packageName + ".R$id");
} catch (ClassNotFoundException e) {
Log.e(TAG, "R$id not found: " + e.toString());
}
try {
style = Class.forName(packageName + ".R$style");
} catch (ClassNotFoundException e) {
Log.e(TAG, "R$style not found: " + e.toString());
}
try {
attr = Class.forName(packageName + ".R$attr");
} catch (ClassNotFoundException e) {
Log.e(TAG, "R$attr not found: " + e.toString());
}
try {
styleable = Class.forName(packageName + ".R$styleable");
} catch (ClassNotFoundException e) {
Log.e(TAG, "R$styleable not found: " + e.toString());
}
}
public static IMResUtil getResofR(Context paramContext) {
if (instance == null)
instance = new IMResUtil(paramContext);
return instance;
}
public int getId(String paramString) {
return getResofR(id, paramString);
}
public int getLayout(String paramString) {
return getResofR(layout, paramString);
}
public int getStyle(String paramString) {
return getResofR(style, paramString);
}
public int getAttr(String paramString) {
return getResofR(attr, paramString);
}
public int getStyleable(String paramString) {
return getResofR(styleable, paramString);
}
public int[] getStyleableArray(String paramString) {
try {
Class<?> clz = Class.forName(context.getPackageName() + ".R$styleable");
Field field = clz.getField(paramString);
// R$styleable arrays are static, so the instance argument is ignored
int[] styleableArray = (int[]) field.get(null);
for (int i = 0; i < styleableArray.length; i++) {
Log.d(TAG, styleableArray[i] + "");
}
return styleableArray;
} catch (Exception e) {
Log.d(TAG, String.valueOf(e.getMessage()));
e.printStackTrace();
}
return null;
}
private int getResofR(Class<?> paramClass, String paramString) {
if (paramClass == null) {
Log.d(TAG,"getRes(null," + paramString + ")");
Log.d(TAG,"Class is null ,ResClass is not initialized.");
return -1;
// throw new IllegalArgumentException("ResClass is not initialized.");
}
try {
Field localField = paramClass.getField(paramString);
// R fields are static, so the instance argument to getInt() is ignored
return localField.getInt(null);
} catch (Exception localException) {
Log.e(TAG, "getRes(" + paramClass.getName() + ", " + paramString + ")");
Log.e(TAG, "Error getting resource. Make sure you have copied all resources (res/) into your project.");
Log.e(TAG, String.valueOf(localException.getMessage()));
}
return -1;
}
}
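A quick usage sketch for IMResUtil, for the case where the scanner is packaged as a library and resources have to be looked up by name; the resource names activity_capture and preview_view are only illustrative:
import android.app.Activity;
import android.os.Bundle;
import android.view.View;

public class ResLookupExample extends Activity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
IMResUtil res = IMResUtil.getResofR(this);
// "activity_capture" and "preview_view" are assumed resource names
int layoutId = res.getLayout("activity_capture");
int previewId = res.getId("preview_view");
if (layoutId != -1) { // getResofR returns -1 when the field cannot be resolved
setContentView(layoutId);
}
if (previewId != -1) {
View previewView = findViewById(previewId);
// configure the camera preview view here
}
}
}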
XML resources:
Manifest file (AndroidManifest.xml):
If the summary above helped you, we'd appreciate a like. If it didn't, or if you know of something better, feel free to leave a comment with a link, a change proposal, or a better optimization suggestion, whether it's criticism or anything else. We will gladly take it on board. Thank you!