1 Background
Unity normally renders RGB directly, but there are special cases where you need to render YUV data. A typical one: Unity reads YUV frames from the Android Camera and has to display them. That is the scenario this article walks through.
Fetching a byte array from Android into Unity is already expensive. If the YUV-to-RGB conversion is then also done in script, that is, on the CPU, the result is very laggy.
One way out is to move the conversion to the GPU, in other words, to implement it in a shader!
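For reference, the conversion itself is just a linear transform per pixel, which is exactly why it maps so well onto a fragment shader. Below is a minimal Java sketch of one common variant (BT.601, full range); the exact constants depend on the color standard your camera outputs, so treat the numbers as an assumption rather than this article's exact shader math.

// BT.601 full-range YUV -> RGB: the per-pixel math the shader will perform.
// y, u and v are unsigned byte values (0..255); U and V are centered on 128.
static int yuvToRgb(int y, int u, int v) {
    int r = Math.round(y + 1.402f * (v - 128));
    int g = Math.round(y - 0.344f * (u - 128) - 0.714f * (v - 128));
    int b = Math.round(y + 1.772f * (u - 128));
    r = Math.max(0, Math.min(255, r));   // clamp to the valid byte range
    g = Math.max(0, Math.min(255, g));
    b = Math.max(0, Math.min(255, b));
    return 0xFF000000 | (r << 16) | (g << 8) | b; // pack as ARGB
}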
Below, the source code and implementation are presented in two parts.
2 YUV data source ---- the Android side
Camera data on Android is generally YUV, and the most common format is NV21. Its pixel layout is a full-resolution Y plane followed by a half-resolution plane of interleaved V and U samples, so the byte order is YYYY...VUVU...
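Concretely, for a width x height frame, the samples for pixel (x, y) sit at the following offsets (a sketch for illustration; the helper name is mine, not from the original):

// Hypothetical helper showing NV21 indexing.
// Returns {yIndex, vIndex, uIndex} for pixel (x, y) in a width x height frame.
static int[] nv21Indices(int width, int height, int x, int y) {
    int ySize = width * height;                         // the Y plane comes first
    int yIndex = y * width + x;                         // one Y sample per pixel
    int vIndex = ySize + (y / 2) * width + (x / 2) * 2; // one VU pair per 2x2 block
    int uIndex = vIndex + 1;                            // U follows its V ("VU" order)
    return new int[] {yIndex, vIndex, uIndex};
}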
The Android side now has a single job: open the Camera, render the preview into a dummy, never-displayed texture, and grab the preview frame data.
A SimpleCameraPlugin class acts as the interface that Unity calls into on the Android side.
The code is as follows:
package com.chenxf.unitycamerasdk;
import android.app.Activity;
import android.content.Context;
import android.graphics.Point;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.os.Handler;
import android.os.Looper;
import android.os.Message;
import android.util.Log;
import android.util.Size;
/**
 * A plugin that reads the camera's YUV data directly and hands it to Unity
 * for rendering.
 */
public class SimpleCameraPlugin implements SurfaceTexture.OnFrameAvailableListener, Camera.PreviewCallback {
    private static final String TAG = "qymv#CameraPlugin";
    private final static int REQUEST_CODE = 1;
    private final static int MSG_START_PREVIEW = 1;
    private final static int MSG_SWITCH_CAMERA = 2;
    private final static int MSG_RELEASE_PREVIEW = 3;
    private final static int MSG_MANUAL_FOCUS = 4;
    private final static int MSG_ROCK = 5;
    private SurfaceTexture mSurfaceTexture;
    private boolean mIsUpdateFrame;
    private Context mContext;
    private Handler mCameraHandler;
    private Size mExpectedPreviewSize;
    private Size mPreviewSize;
    private boolean isFocusing;
    private int mWidth;
    private int mHeight;
    private byte[] yBuffer = null;
    private byte[] uvBuffer = null;

    public SimpleCameraPlugin() {
        Log.i(TAG, " create");
        initCameraHandler();
    }
    private void initCameraHandler() {
        mCameraHandler = new Handler(Looper.getMainLooper()) {
            @Override
            public void handleMessage(Message msg) {
                switch (msg.what) {
                    case MSG_START_PREVIEW:
                        startPreview();
                        break;
                    case MSG_RELEASE_PREVIEW:
                        releasePreview();
                        break;
                    case MSG_SWITCH_CAMERA:
                        //switchCamera();
                        break;
                    case MSG_MANUAL_FOCUS:
                        //manualFocus(msg.arg1, msg.arg2);
                        break;
                    case MSG_ROCK:
                        autoFocus();
                        break;
                    default:
                        break;
                }
            }
        };
    }
    public void releasePreview() {
        CameraUtil.releaseCamera();
        // mCameraSensor.stop();
        // mFocusView.cancelFocus();
        Log.e(TAG, "releasePreview releaseCamera");
    }

    public void startPreview() {
        //if (mPreviewSize != null && requestPermission() ) {
        if (mExpectedPreviewSize != null) {
            if (CameraUtil.getCamera() == null) {
                CameraUtil.openCamera();
                Log.e(TAG, "openCamera");
                //CameraUtil.setDisplay(mSurfaceTexture);
            }
            Camera.Size previewSize = CameraUtil.startPreview((Activity) mContext, mExpectedPreviewSize.getWidth(), mExpectedPreviewSize.getHeight());
            CameraUtil.setCallback(this);
            if (previewSize != null) {
                mWidth = previewSize.width;
                mHeight = previewSize.height;
                mPreviewSize = new Size(previewSize.width, previewSize.height);
                initSurfaceTexture(previewSize.width, previewSize.height);
                initBuffer(previewSize.width, previewSize.height);
                CameraUtil.setDisplay(mSurfaceTexture);
            }
        }
    }
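    // CameraUtil, used throughout this class, is the project's own wrapper
    // around android.hardware.Camera (open/release/startPreview/focus); its
    // source is not included in this excerpt.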
    private void initBuffer(int width, int height) {
        // NV21: a full-size Y plane plus a half-size interleaved VU plane
        yBuffer = new byte[width * height];
        uvBuffer = new byte[width * height / 2];
    }
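    // Not shown before the listing's cut-off below: the class implements
    // Camera.PreviewCallback and SurfaceTexture.OnFrameAvailableListener, so
    // callbacks along these lines are implied. This is a sketch of what they
    // plausibly do, not the article's exact code.
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if (data == null || yBuffer == null) {
            return;
        }
        int ySize = mWidth * mHeight;
        // NV21: the first width*height bytes are the Y plane ...
        System.arraycopy(data, 0, yBuffer, 0, ySize);
        // ... followed by width*height/2 bytes of interleaved VU samples
        System.arraycopy(data, ySize, uvBuffer, 0, ySize / 2);
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        // Mark that a fresh frame arrived; the Unity side can poll this flag
        // before re-uploading its textures.
        mIsUpdateFrame = true;
    }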
    public void autoFocus() {
        if (CameraUtil.isBackCamera() && CameraUtil.getCamera() != null) {
            focus(mWidth / 2, mHeight / 2, true);
        }
    }

    private void focus(final int x, final int y, final boolean isAutoFocus) {
        Log.i(TAG, "focus, position: " + x + " " + y);
        if (CameraUtil.getCamera() == null || !CameraUtil.isBackCamera()) {
            return;
        }
        if (isFocusing && isAutoFocus) {
            return;
        }
        if (mWidth == 0 || mHeight == 0) {
            return;
        }
        isFocusing = true;
        Point focusPoint = new Point(x, y);
        Size screenSize = new Size(mWidth, mHeight);
        if (!isAutoFocus) {
            //mFocusView.beginFocus(x, y);
        }
        CameraUtil.newCameraFocus(focusPoint, screenSize, new Camera.AutoFocusCallback() {
            @Override
            public void onAutoFocus(boolean success, Camera camera) {
                isFocusing = false;
                if (!isAutoFocus) {
                    //mFocusView.endFocus(success);
                }
            }
        });
    }
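    // initSurfaceTexture() is called from startPreview() above but falls after
    // the excerpt's cut-off. Given the GLES imports and the EGL-thread warning
    // below, it plausibly looks like the following sketch (an assumption, not
    // the article's exact code): create an OES texture and wrap it in the
    // SurfaceTexture that receives the camera frames.
    private void initSurfaceTexture(int width, int height) {
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[0]);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        mSurfaceTexture = new SurfaceTexture(textures[0]);
        mSurfaceTexture.setDefaultBufferSize(width, height);
        mSurfaceTexture.setOnFrameAvailableListener(this);
    }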
    /**
     * Initialization.
     * This method must be called from an EGL thread (one with a current GL
     * context); otherwise you will get the following error:
     *   libEGL : call to OpenGL ES API with no current context
     *
     * @param context the Android Context; an Activity context is preferred
     * @param width   texture width
     * @param height  texture height
     */
    public void start(Context context, int width, int height) {
        Log.w(TAG, "Start context " + context);
        // The source listing is cut off here; the remainder is the completion
        // implied by the fields and methods above.
        mContext = context;
        mExpectedPreviewSize = new Size(width, height);
        // startPreview() creates a GL texture, so it runs directly on this
        // (EGL) thread rather than being posted to mCameraHandler.
        startPreview();
    }
}
To summarize: this article has shown how to take YUV data from the Android Camera into Unity and convert it to RGB on the GPU, in a shader, for real-time rendering. The Android side receives Camera preview frames through a SurfaceTexture; the Unity side obtains the YUV data through the Android Java plugin, updates two textures with it, and finally displays the result on a RawImage. The detailed steps cover opening the Camera and handling preview frames on the Android side, plus texture loading and rendering on the Unity side. This approach avoids the CPU conversion bottleneck and keeps rendering real-time.