I. Preface
I have been meaning to write this Camera series for a long time and kept putting it off; time to start filling in the gaps. First up is the SurfaceView + Camera combination. Although Camera2 has been the recommended API since Android 5.0, the old Camera API is still needed in some legacy projects or when you have to support older versions.
Why choose SurfaceView?
SurfaceView is rendered on its own separate thread, so it does not block the main thread, and it uses double buffering internally, which makes the picture smoother. Compared with TextureView, it uses less memory and renders more promptly with lower overhead, but it does not support animations or screenshots.
II. Camera Development Steps
We separate the Camera logic from the View: all camera operations are handled by a CameraProxy class, and the View simply holds a CameraProxy instance. This also makes CameraProxy reusable.
1. Opening the camera
Opening the camera requires a cameraId. The old API offers only two values, CAMERA_FACING_BACK and CAMERA_FACING_FRONT, and Camera.open() returns a Camera object.
public void openCamera() {
    mCamera = Camera.open(mCameraId); // open the camera
    Camera.getCameraInfo(mCameraId, mCameraInfo); // get the camera info
    initConfig(); // initialize the camera configuration
    setDisplayOrientation(); // set the camera display orientation
}
2. Initializing the camera configuration
In the old Camera API, all camera configuration goes through the Parameters class.
We can set the flash mode, focus mode, exposure compensation, preview format and size, picture format and size, and so on.
private void initConfig() {
    try {
        mParameters = mCamera.getParameters();
        // Setting a parameter the camera does not support will throw, so always check support before setting it
        List<String> supportedFlashModes = mParameters.getSupportedFlashModes();
        if (supportedFlashModes != null && supportedFlashModes.contains(Parameters.FLASH_MODE_OFF)) {
            mParameters.setFlashMode(Parameters.FLASH_MODE_OFF); // flash mode (off)
        }
        List<String> supportedFocusModes = mParameters.getSupportedFocusModes();
        if (supportedFocusModes != null && supportedFocusModes.contains(Parameters.FOCUS_MODE_AUTO)) {
            mParameters.setFocusMode(Parameters.FOCUS_MODE_AUTO); // focus mode (auto)
        }
        mParameters.setPreviewFormat(ImageFormat.NV21); // preview frame format
        mParameters.setPictureFormat(ImageFormat.JPEG); // picture format
        mParameters.setExposureCompensation(0); // exposure compensation
        Size previewSize = getSuitableSize(mParameters.getSupportedPreviewSizes());
        mPreviewWidth = previewSize.width;
        mPreviewHeight = previewSize.height;
        mParameters.setPreviewSize(mPreviewWidth, mPreviewHeight); // preview size
        Log.d(TAG, "previewWidth: " + mPreviewWidth + ", previewHeight: " + mPreviewHeight);
        Size pictureSize = getSuitableSize(mParameters.getSupportedPictureSizes());
        mParameters.setPictureSize(pictureSize.width, pictureSize.height);
        Log.d(TAG, "pictureWidth: " + pictureSize.width + ", pictureHeight: " + pictureSize.height);
        mCamera.setParameters(mParameters); // apply the configured parameters to the camera
    } catch (Exception e) {
        e.printStackTrace();
    }
}
The getSuitableSize method used here picks a suitable preview/picture size (the full code is shown later).
3. Setting the preview display orientation
This method is important: it determines whether your preview is displayed correctly. The camera's raw preview frames are always wider than they are tall, so for the picture to look upright in portrait mode the display orientation has to be set through this method. For example, with a typical back camera sensor orientation of 90 on a portrait phone (display rotation 0), the result is (90 - 0 + 360) % 360 = 90.
private void setDisplayOrientation() {
    int rotation = mActivity.getWindowManager().getDefaultDisplay().getRotation();
    int degrees = 0;
    switch (rotation) {
        case Surface.ROTATION_0:
            degrees = 0;
            break;
        case Surface.ROTATION_90:
            degrees = 90;
            break;
        case Surface.ROTATION_180:
            degrees = 180;
            break;
        case Surface.ROTATION_270:
            degrees = 270;
            break;
    }
    int result;
    if (mCameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
        result = (mCameraInfo.orientation + degrees) % 360;
        result = (360 - result) % 360; // compensate the mirror
    } else { // back-facing
        result = (mCameraInfo.orientation - degrees + 360) % 360;
    }
    mCamera.setDisplayOrientation(result);
}
4. Starting and stopping the preview
You can get a SurfaceHolder from SurfaceView's getHolder() method; hand it to the Camera and the system takes care of the rather involved binding for you.
public void startPreview(SurfaceHolder holder) {
    if (mCamera != null) {
        try {
            mCamera.setPreviewDisplay(holder); // bind the display surface first
        } catch (IOException e) {
            e.printStackTrace();
        }
        mCamera.startPreview(); // this is what actually starts the preview
    }
}

public void stopPreview() {
    if (mCamera != null) {
        mCamera.stopPreview(); // stop the preview
    }
}
5. Releasing the camera
The camera is a resource-heavy system component, so always release it when you are done. This is the counterpart of openCamera.
public void releaseCamera() {
    if (mCamera != null) {
        mCamera.setPreviewCallback(null);
        mCamera.stopPreview();
        mCamera.release();
        mCamera = null;
    }
}
6. Tap to focus
In short, we take the point the user touches on the view and ask the camera to run a focus pass on that point. The core idea is to map the touch point from view coordinates into the camera's focus-area coordinate system, which runs from (-1000, -1000) to (1000, 1000); for example, a touch at the exact center of the view maps to (0, 0).
For the details, see my earlier post on tap-to-focus for a custom Android camera (Android自定义相机定点聚焦). The full code is included later.
7. Pinch to zoom
Here we only implement the zoom logic itself; handling the touch gesture is left to the View class.
public void handleZoom(boolean isZoomIn) {
    if (mParameters.isZoomSupported()) { // again, check support first
        int maxZoom = mParameters.getMaxZoom();
        int zoom = mParameters.getZoom();
        if (isZoomIn && zoom < maxZoom) {
            zoom++;
        } else if (zoom > 0) {
            zoom--;
        }
        mParameters.setZoom(zoom); // this is the call that zooms in/out
        mCamera.setParameters(mParameters);
    } else {
        Log.w(TAG, "zoom not supported");
    }
}
8. Taking a picture
I leave the picture-taking logic to the caller and will cover it in more detail later. Here we only wrap the original API thinly; the callback you usually care about is Camera.PictureCallback, which hands us ready-to-use JPEG data.
public void takePicture(Camera.PictureCallback pictureCallback) {
    mCamera.takePicture(null, null, pictureCallback);
}
9. Other operations
Operations such as setting a preview callback or switching between the front and back cameras can be read directly from the implementation below (a small usage sketch of the preview callback follows). Focus modes, flash modes and the like are not covered in detail here; look them up separately if you are interested, since covering the camera exhaustively would be a huge topic on its own.
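As a quick illustration (not part of the original demo code), here is a rough sketch of how a caller might hook up the preview callback exposed by the CameraProxy below; processFrame() is a hypothetical placeholder. Because CameraProxy uses setPreviewCallbackWithBuffer, the buffer has to be handed back with addCallbackBuffer inside the callback, otherwise no further frames are delivered.
// Sketch: consuming NV21 preview frames through CameraProxy (processFrame is hypothetical)
mCameraProxy.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // data holds one NV21 frame: previewWidth * previewHeight * 3 / 2 bytes
        processFrame(data, mCameraProxy.getPreviewWidth(), mCameraProxy.getPreviewHeight());
        camera.addCallbackBuffer(data); // return the buffer so the next frame can be delivered
    }
});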
10. The CameraProxy class
The code below also uses an OrientationEventListener, which was not mentioned above. It tracks the phone's current orientation via the sensors; that value is used to work out the picture rotation when taking a photo, which is covered later.
package com.afei.camerademo.camera;
import android.app.Activity;
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.hardware.Camera.CameraInfo;
import android.hardware.Camera.Parameters;
import android.hardware.Camera.PreviewCallback;
import android.hardware.Camera.Size;
import android.util.Log;
import android.view.OrientationEventListener;
import android.view.Surface;
import android.view.SurfaceHolder;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
@SuppressWarnings("deprecation")
public class CameraProxy implements Camera.AutoFocusCallback {

    private static final String TAG = "CameraProxy";

    private Activity mActivity;
    private Camera mCamera;
    private Parameters mParameters;
    private CameraInfo mCameraInfo = new CameraInfo();
    private int mCameraId = CameraInfo.CAMERA_FACING_BACK;
    private int mPreviewWidth = 1440; // default 1440
    private int mPreviewHeight = 1080; // default 1080
    private float mPreviewScale = mPreviewHeight * 1f / mPreviewWidth;
    private PreviewCallback mPreviewCallback; // preview frame callback
    private OrientationEventListener mOrientationEventListener;
    private int mLatestRotation = 0;
    public byte[] mPreviewBuffer;

    public CameraProxy(Activity activity) {
        mActivity = activity;
        mOrientationEventListener = new OrientationEventListener(mActivity) {
            @Override
            public void onOrientationChanged(int orientation) {
                Log.d(TAG, "onOrientationChanged: orientation: " + orientation);
                setPictureRotate(orientation);
            }
        };
    }
    public void openCamera() {
        Log.d(TAG, "openCamera cameraId: " + mCameraId);
        mCamera = Camera.open(mCameraId);
        Camera.getCameraInfo(mCameraId, mCameraInfo);
        initConfig();
        setDisplayOrientation();
        Log.d(TAG, "openCamera enable mOrientationEventListener");
        mOrientationEventListener.enable();
    }

    public void releaseCamera() {
        if (mCamera != null) {
            Log.v(TAG, "releaseCamera");
            mCamera.setPreviewCallback(null);
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
        }
        mOrientationEventListener.disable();
    }

    public void startPreview(SurfaceHolder holder) {
        if (mCamera != null) {
            Log.v(TAG, "startPreview");
            try {
                mCamera.setPreviewDisplay(holder);
            } catch (IOException e) {
                e.printStackTrace();
            }
            mCamera.startPreview();
        }
    }

    public void startPreview(SurfaceTexture surface) {
        if (mCamera != null) {
            Log.v(TAG, "startPreview");
            try {
                mCamera.setPreviewTexture(surface);
            } catch (IOException e) {
                e.printStackTrace();
            }
            mCamera.startPreview();
        }
    }

    public void stopPreview() {
        if (mCamera != null) {
            Log.v(TAG, "stopPreview");
            mCamera.stopPreview();
        }
    }

    public boolean isFrontCamera() {
        return mCameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT;
    }
    private void initConfig() {
        Log.v(TAG, "initConfig");
        try {
            mParameters = mCamera.getParameters();
            // Setting an unsupported parameter will throw, so always check support before setting it
            List<String> supportedFlashModes = mParameters.getSupportedFlashModes();
            if (supportedFlashModes != null && supportedFlashModes.contains(Parameters.FLASH_MODE_OFF)) {
                mParameters.setFlashMode(Parameters.FLASH_MODE_OFF); // flash mode
            }
            List<String> supportedFocusModes = mParameters.getSupportedFocusModes();
            if (supportedFocusModes != null && supportedFocusModes.contains(Parameters.FOCUS_MODE_AUTO)) {
                mParameters.setFocusMode(Parameters.FOCUS_MODE_AUTO); // focus mode
            }
            mParameters.setPreviewFormat(ImageFormat.NV21); // preview frame format
            mParameters.setPictureFormat(ImageFormat.JPEG); // picture format
            mParameters.setExposureCompensation(0); // exposure compensation
            Size previewSize = getSuitableSize(mParameters.getSupportedPreviewSizes());
            mPreviewWidth = previewSize.width;
            mPreviewHeight = previewSize.height;
            mParameters.setPreviewSize(mPreviewWidth, mPreviewHeight); // preview size
            Log.d(TAG, "previewWidth: " + mPreviewWidth + ", previewHeight: " + mPreviewHeight);
            Size pictureSize = getSuitableSize(mParameters.getSupportedPictureSizes());
            mParameters.setPictureSize(pictureSize.width, pictureSize.height);
            Log.d(TAG, "pictureWidth: " + pictureSize.width + ", pictureHeight: " + pictureSize.height);
            mCamera.setParameters(mParameters); // apply the configured parameters to the camera
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    private Size getSuitableSize(List<Size> sizes) {
        int minDelta = Integer.MAX_VALUE; // smallest difference so far; start large so it gets replaced
        int index = 0; // index of the size with the smallest difference
        for (int i = 0; i < sizes.size(); i++) {
            Size previewSize = sizes.get(i);
            Log.v(TAG, "SupportedPreviewSize, width: " + previewSize.width + ", height: " + previewSize.height);
            // among the supported sizes with the desired aspect ratio, find the one closest to the requested resolution
            if (previewSize.width * mPreviewScale == previewSize.height) {
                int delta = Math.abs(mPreviewWidth - previewSize.width);
                if (delta == 0) {
                    return previewSize;
                }
                if (minDelta > delta) {
                    minDelta = delta;
                    index = i;
                }
            }
        }
        return sizes.get(index); // by default return the supported size closest to the requested one
    }
    /**
     * Set the camera display orientation. This must be set, otherwise the displayed image will be rotated incorrectly.
     */
    private void setDisplayOrientation() {
        int rotation = mActivity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:
                degrees = 0;
                break;
            case Surface.ROTATION_90:
                degrees = 90;
                break;
            case Surface.ROTATION_180:
                degrees = 180;
                break;
            case Surface.ROTATION_270:
                degrees = 270;
                break;
        }
        int result;
        if (mCameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (mCameraInfo.orientation + degrees) % 360;
            result = (360 - result) % 360; // compensate the mirror
        } else { // back-facing
            result = (mCameraInfo.orientation - degrees + 360) % 360;
        }
        mCamera.setDisplayOrientation(result);
    }
    private void setPictureRotate(int orientation) {
        if (orientation == OrientationEventListener.ORIENTATION_UNKNOWN) return;
        orientation = (orientation + 45) / 90 * 90;
        int rotation = 0;
        if (mCameraInfo.facing == CameraInfo.CAMERA_FACING_FRONT) {
            rotation = (mCameraInfo.orientation - orientation + 360) % 360;
        } else { // back-facing camera
            rotation = (mCameraInfo.orientation + orientation) % 360;
        }
        Log.d(TAG, "picture rotation: " + rotation);
        mLatestRotation = rotation;
    }

    public int getLatestRotation() {
        return mLatestRotation;
    }
    public void setPreviewCallback(PreviewCallback previewCallback) {
        mPreviewCallback = previewCallback;
        if (mPreviewBuffer == null) {
            mPreviewBuffer = new byte[mPreviewWidth * mPreviewHeight * 3 / 2]; // one NV21 frame
        }
        mCamera.addCallbackBuffer(mPreviewBuffer);
        mCamera.setPreviewCallbackWithBuffer(mPreviewCallback); // set the preview callback
    }

    public void takePicture(Camera.PictureCallback pictureCallback) {
        mCamera.takePicture(null, null, pictureCallback);
    }
    public void switchCamera() {
        mCameraId ^= 1; // flip the camera facing first (0 <-> 1)
        releaseCamera();
        openCamera();
    }
    public void focusOnPoint(int x, int y, int width, int height) {
        Log.v(TAG, "touch point (" + x + ", " + y + ")");
        if (mCamera == null) {
            return;
        }
        Parameters parameters = mCamera.getParameters();
        // 1. first check whether setting focus areas is supported at all
        if (parameters.getMaxNumFocusAreas() > 0) {
            // 2. build a square focus area centered on the touch point; its side defaults to 1/4 of the view's shorter edge
            int length = Math.min(width, height) >> 3; // half the side, i.e. 1/8 of the shorter edge
            int left = x - length;
            int top = y - length;
            int right = x + length;
            int bottom = y + length;
            // 3. map view coordinates into the camera's focus coordinate system, which runs from (-1000, -1000) to (1000, 1000)
            left = left * 2000 / width - 1000;
            top = top * 2000 / height - 1000;
            right = right * 2000 / width - 1000;
            bottom = bottom * 2000 / height - 1000;
            // 4. clamp the rectangle to the valid range
            left = left < -1000 ? -1000 : left;
            top = top < -1000 ? -1000 : top;
            right = right > 1000 ? 1000 : right;
            bottom = bottom > 1000 ? 1000 : bottom;
            Log.d(TAG, "focus area (" + left + ", " + top + ", " + right + ", " + bottom + ")");
            ArrayList<Camera.Area> areas = new ArrayList<>();
            areas.add(new Camera.Area(new Rect(left, top, right, bottom), 600));
            parameters.setFocusAreas(areas);
        }
        try {
            mCamera.cancelAutoFocus(); // cancel any autofocus currently in progress
            mCamera.setParameters(parameters);
            mCamera.autoFocus(this); // trigger the focus pass
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    public void handleZoom(boolean isZoomIn) {
        if (mParameters.isZoomSupported()) {
            int maxZoom = mParameters.getMaxZoom();
            int zoom = mParameters.getZoom();
            if (isZoomIn && zoom < maxZoom) {
                zoom++;
            } else if (zoom > 0) {
                zoom--;
            }
            Log.d(TAG, "handleZoom: zoom: " + zoom);
            mParameters.setZoom(zoom);
            mCamera.setParameters(mParameters);
        } else {
            Log.i(TAG, "zoom not supported");
        }
    }

    public Camera getCamera() {
        return mCamera;
    }

    public int getPreviewWidth() {
        return mPreviewWidth;
    }

    public int getPreviewHeight() {
        return mPreviewHeight;
    }

    @Override
    public void onAutoFocus(boolean success, Camera camera) {
        Log.d(TAG, "onAutoFocus: " + success);
    }
}
III. CameraSurfaceView
With the camera operations covered above, you should now have a reasonable grasp of the camera side; next we build the View part.
Requirements:
- CameraSurfaceView extends SurfaceView.
- We need to override onMeasure so that CameraSurfaceView's width and height match the camera preview size; this way the picture never looks stretched.
- We need to open and close the camera from within CameraSurfaceView; fortunately this is easy to do with the CameraProxy above.
- We need to override onTouchEvent to implement tap-to-focus and pinch-to-zoom.
Implementation:
The camera is mainly opened and released in the SurfaceHolder.Callback methods; beyond that it is just a matter of overriding onMeasure and onTouchEvent.
package com.afei.camerademo.surfaceview;
import android.app.Activity;
import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import com.afei.camerademo.camera.CameraProxy;
public class CameraSurfaceView extends SurfaceView {

    private CameraProxy mCameraProxy;
    private int mRatioWidth = 0;
    private int mRatioHeight = 0;
    private float mOldDistance;

    public CameraSurfaceView(Context context) {
        this(context, null);
    }

    public CameraSurfaceView(Context context, AttributeSet attrs) {
        this(context, attrs, 0);
    }

    public CameraSurfaceView(Context context, AttributeSet attrs, int defStyleAttr) {
        this(context, attrs, defStyleAttr, 0);
    }

    public CameraSurfaceView(Context context, AttributeSet attrs, int defStyleAttr, int defStyleRes) {
        super(context, attrs, defStyleAttr, defStyleRes);
        init(context);
    }

    private void init(Context context) {
        getHolder().addCallback(mSurfaceHolderCallback);
        mCameraProxy = new CameraProxy((Activity) context);
    }

    private final SurfaceHolder.Callback mSurfaceHolderCallback = new SurfaceHolder.Callback() {

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            mCameraProxy.openCamera();
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            int previewWidth = mCameraProxy.getPreviewWidth();
            int previewHeight = mCameraProxy.getPreviewHeight();
            if (width > height) {
                setAspectRatio(previewWidth, previewHeight);
            } else {
                setAspectRatio(previewHeight, previewWidth);
            }
            mCameraProxy.startPreview(holder);
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            mCameraProxy.releaseCamera();
        }
    };

    public void setAspectRatio(int width, int height) {
        if (width < 0 || height < 0) {
            throw new IllegalArgumentException("Size cannot be negative.");
        }
        mRatioWidth = width;
        mRatioHeight = height;
        requestLayout();
    }

    public CameraProxy getCameraProxy() {
        return mCameraProxy;
    }

    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
        super.onMeasure(widthMeasureSpec, heightMeasureSpec);
        int width = MeasureSpec.getSize(widthMeasureSpec);
        int height = MeasureSpec.getSize(heightMeasureSpec);
        if (0 == mRatioWidth || 0 == mRatioHeight) {
            setMeasuredDimension(width, height);
        } else {
            if (width < height * mRatioWidth / mRatioHeight) {
                setMeasuredDimension(width, width * mRatioHeight / mRatioWidth);
            } else {
                setMeasuredDimension(height * mRatioWidth / mRatioHeight, height);
            }
        }
    }
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getPointerCount() == 1) {
            // single finger: tap to focus
            mCameraProxy.focusOnPoint((int) event.getX(), (int) event.getY(), getWidth(), getHeight());
            return true;
        }
        switch (event.getAction() & MotionEvent.ACTION_MASK) {
            case MotionEvent.ACTION_POINTER_DOWN:
                mOldDistance = getFingerSpacing(event);
                break;
            case MotionEvent.ACTION_MOVE:
                float newDistance = getFingerSpacing(event);
                if (newDistance > mOldDistance) {
                    mCameraProxy.handleZoom(true);
                } else if (newDistance < mOldDistance) {
                    mCameraProxy.handleZoom(false);
                }
                mOldDistance = newDistance;
                break;
            default:
                break;
        }
        return super.onTouchEvent(event);
    }

    private static float getFingerSpacing(MotionEvent event) {
        float x = event.getX(0) - event.getX(1);
        float y = event.getY(0) - event.getY(1);
        return (float) Math.sqrt(x * x + y * y);
    }
}
IV. SurfaceCameraActivity
Next we just use the CameraSurfaceView we wrote in an Activity or Fragment.
Note that before using the camera you must declare the relevant permissions and request them at runtime.
1. AndroidManifest.xml
The camera-related declarations are listed below. The runtime-permission code is fairly long, so it is not covered in detail here; if you are unfamiliar with it, see this post on requesting Android permissions at runtime (Android动态权限申请). A minimal sketch of the runtime request also follows the manifest snippet.
<uses-permission android:name="android.permission.CAMERA"/>
<uses-feature android:name="android.hardware.camera"/>
<uses-feature android:name="android.hardware.camera.autofocus"/>
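For completeness, here is a minimal sketch of requesting the CAMERA permission at runtime, assuming it lives inside the Activity (it is not part of the demo's code); it uses ContextCompat/ActivityCompat from the support library, and REQUEST_CAMERA_PERMISSION is an arbitrary request code chosen here. Note that saving pictures into DCIM, as ImageUtils does below, additionally needs WRITE_EXTERNAL_STORAGE on older Android versions.
// Sketch only: a minimal runtime request for the CAMERA permission.
// Assumes android.Manifest, android.content.pm.PackageManager,
// android.support.v4.app.ActivityCompat and android.support.v4.content.ContextCompat are imported.
private static final int REQUEST_CAMERA_PERMISSION = 1; // arbitrary request code

private void requestCameraPermissionIfNeeded() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA},
                REQUEST_CAMERA_PERMISSION);
    }
}

@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode == REQUEST_CAMERA_PERMISSION
            && (grantResults.length == 0 || grantResults[0] != PackageManager.PERMISSION_GRANTED)) {
        finish(); // cannot work without the camera permission
    }
}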
2. Taking pictures
Earlier we only exposed the take-picture entry point and left it to the caller to pass in a callback; here we simply trigger the capture when the shutter button is pressed.
When the capture finishes, onPictureTaken is invoked automatically, and that is where we save the image. Saving the image is time-consuming, so it must not run on the main thread.
Full SurfaceCameraActivity code:
package com.afei.camerademo.surfaceview;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.hardware.Camera;
import android.os.AsyncTask;
import android.os.Bundle;
import android.provider.MediaStore;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.View;
import android.widget.ImageView;
import com.afei.camerademo.ImageUtils;
import com.afei.camerademo.R;
import com.afei.camerademo.camera.CameraProxy;
public class SurfaceCameraActivity extends AppCompatActivity implements View.OnClickListener {

    private static final String TAG = "SurfaceCameraActivity";

    private ImageView mCloseIv;
    private ImageView mSwitchCameraIv;
    private ImageView mTakePictureIv;
    private ImageView mPictureIv;
    private CameraSurfaceView mCameraView;
    private CameraProxy mCameraProxy;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_surface_camera);
        initView();
    }

    private void initView() {
        mCloseIv = findViewById(R.id.toolbar_close_iv);
        mCloseIv.setOnClickListener(this);
        mSwitchCameraIv = findViewById(R.id.toolbar_switch_iv);
        mSwitchCameraIv.setOnClickListener(this);
        mTakePictureIv = findViewById(R.id.take_picture_iv);
        mTakePictureIv.setOnClickListener(this);
        mPictureIv = findViewById(R.id.picture_iv);
        mPictureIv.setOnClickListener(this);
        mPictureIv.setImageBitmap(ImageUtils.getLatestThumbBitmap());
        mCameraView = findViewById(R.id.camera_view);
        mCameraProxy = mCameraView.getCameraProxy();
    }

    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.toolbar_close_iv:
                finish();
                break;
            case R.id.toolbar_switch_iv:
                mCameraProxy.switchCamera();
                mCameraProxy.startPreview(mCameraView.getHolder());
                break;
            case R.id.take_picture_iv:
                mCameraProxy.takePicture(mPictureCallback);
                break;
            case R.id.picture_iv:
                Intent intent = new Intent(Intent.ACTION_PICK, MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
                startActivity(intent);
                break;
        }
    }
    private final Camera.PictureCallback mPictureCallback = new Camera.PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera camera) {
            mCameraProxy.startPreview(mCameraView.getHolder()); // resume the preview after the capture
            new ImageSaveTask().execute(data); // save the picture off the main thread
        }
    };

    private class ImageSaveTask extends AsyncTask<byte[], Void, Void> {

        @Override
        protected Void doInBackground(byte[]... bytes) {
            long time = System.currentTimeMillis();
            Bitmap bitmap = BitmapFactory.decodeByteArray(bytes[0], 0, bytes[0].length);
            Log.d(TAG, "BitmapFactory.decodeByteArray time: " + (System.currentTimeMillis() - time));
            int rotation = mCameraProxy.getLatestRotation();
            time = System.currentTimeMillis();
            Bitmap rotateBitmap = ImageUtils.rotateBitmap(bitmap, rotation, mCameraProxy.isFrontCamera(), true);
            Log.d(TAG, "rotateBitmap time: " + (System.currentTimeMillis() - time));
            time = System.currentTimeMillis();
            ImageUtils.saveBitmap(rotateBitmap);
            Log.d(TAG, "saveBitmap time: " + (System.currentTimeMillis() - time));
            return null;
        }

        @Override
        protected void onPostExecute(Void aVoid) {
            mPictureIv.setImageBitmap(ImageUtils.getLatestThumbBitmap());
        }
    }
}
As an aside, I also tried setting the capture rotation through Camera.Parameters' setRotation() method, but I saw no effect and am not sure where the problem lies (a sketch of how it is normally used follows).
In the end I went with rotating the Bitmap manually before saving the picture.
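For reference, below is a rough sketch (not what this demo ends up using) of how setRotation() is typically applied, reusing the mLatestRotation value computed in setPictureRotate(). The framework documentation notes that a camera driver may honor setRotation() either by rotating the JPEG pixels or by only writing the EXIF orientation tag, so a viewer that ignores EXIF can make it look as if the call had no effect.
// Sketch: setting the capture rotation via Camera.Parameters before takePicture()
public void takePicture(Camera.PictureCallback pictureCallback) {
    Parameters parameters = mCamera.getParameters();
    parameters.setRotation(mLatestRotation); // 0, 90, 180 or 270, from setPictureRotate()
    mCamera.setParameters(parameters);
    mCamera.takePicture(null, null, pictureCallback); // the driver may rotate the pixels or only set EXIF orientation
}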
The ImageUtils code is attached below:
package com.afei.camerademo;
import android.content.ContentResolver;
import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.Matrix;
import android.os.Environment;
import android.provider.MediaStore;
import android.util.Log;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
public class ImageUtils {

    private static final String TAG = "ImageUtils";
    private static final String GALLERY_PATH = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM) + File.separator + "Camera";
    private static final SimpleDateFormat DATE_FORMAT = new SimpleDateFormat("yyyyMMdd_HHmmss");

    public static Bitmap rotateBitmap(Bitmap source, int degree, boolean flipHorizontal, boolean recycle) {
        if (degree == 0 && !flipHorizontal) { // nothing to do
            return source;
        }
        Matrix matrix = new Matrix();
        matrix.postRotate(degree);
        if (flipHorizontal) {
            matrix.postScale(-1, 1); // the front camera image is horizontally mirrored, so flip it when needed
        }
        Bitmap rotateBitmap = Bitmap.createBitmap(source, 0, 0, source.getWidth(), source.getHeight(), matrix, false);
        if (recycle) {
            source.recycle();
        }
        return rotateBitmap;
    }
    public static void saveBitmap(Bitmap bitmap) {
        String fileName = DATE_FORMAT.format(new Date(System.currentTimeMillis())) + ".jpg";
        File outFile = new File(GALLERY_PATH, fileName);
        Log.d(TAG, "saveImage. filepath: " + outFile.getAbsolutePath());
        FileOutputStream os = null;
        try {
            os = new FileOutputStream(outFile);
            boolean success = bitmap.compress(Bitmap.CompressFormat.JPEG, 100, os);
            if (success) {
                insertToDB(outFile.getAbsolutePath());
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (os != null) {
                try {
                    os.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }

    public static void insertToDB(String picturePath) {
        ContentValues values = new ContentValues();
        ContentResolver resolver = MyApp.getInstance().getContentResolver();
        values.put(MediaStore.Images.ImageColumns.DATA, picturePath);
        values.put(MediaStore.Images.ImageColumns.TITLE, picturePath.substring(picturePath.lastIndexOf("/") + 1));
        values.put(MediaStore.Images.ImageColumns.DATE_TAKEN, System.currentTimeMillis());
        values.put(MediaStore.Images.ImageColumns.MIME_TYPE, "image/jpeg");
        resolver.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
    }
}
V. Project Address
Any code not shown above can be found at the address below.
Address:
https://github.com/afei-cn/CameraDemo/tree/master/app/src/main/java/com/afei/camerademo/surfaceview
Other posts in this series:
Custom Camera series: TextureView + Camera
Custom Camera series: GLSurfaceView + Camera
Custom Camera series: SurfaceView + Camera2