The overall call chain, from the application down to the kernel:

Application layer (Java app)
  <-----> Framework-layer Java class android.hardware.Camera
  <-----> libandroid_runtime.so (android_hardware_Camera.cpp, the JNI glue)
  <-----> libcamera_client.so (native camera client classes)
  <-----> CameraService in libcameraservice.so (reached over Binder)
  <-----> either the stub camera libcamerastub.so or the HAL libcamera.so
  <-----> Linux kernel camera driver (V4L2)

Here the stub camera libcamerastub.so is the camera device the system emulates when no camera hardware is present; see the framework diagram.
How preview data gets displayed in the Android framework:
1. Open the kernel device file (/dev/video0) and create and initialize the related classes.
2. Set the camera's working parameters with setParameters(): which camera to use and the concrete size and format, and allocate the buffers that hold the preview data in the HAL layer.
3. Set the display target: create a Surface in the Java app and pass it down to CameraService.
A minimal app-level sketch of these three steps follows.
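The sketch below walks the three steps above through the legacy android.hardware.Camera API; the class name, resolution and format values are illustrative, and error handling and threading are omitted.

import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.view.SurfaceHolder;

public class LegacyPreviewHelper {
    private Camera mCamera;

    public void startPreview(SurfaceHolder holder) throws java.io.IOException {
        mCamera = Camera.open(0);                          // step 1: opens the device (reaches /dev/videoN via the HAL)
        Camera.Parameters params = mCamera.getParameters();
        params.setPreviewSize(1280, 720);                  // step 2: size/format; preview buffers are set up below the HAL
        params.setPreviewFormat(ImageFormat.NV21);
        mCamera.setParameters(params);
        mCamera.setPreviewDisplay(holder);                 // step 3: hands the Surface down toward CameraService
        mCamera.startPreview();
    }

    public void stopPreview() {
        if (mCamera != null) {
            mCamera.stopPreview();
            mCamera.release();
            mCamera = null;
        }
    }
}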
Next, the Camera-related classes are analyzed in detail.
1. CameraManager.java
It contains three inner classes:
public static abstract class AvailabilityCallback
public static abstract class TorchCallback
private static final class CameraManagerGlobal extends ICameraServiceListener.Stub
implements IBinder.DeathRecipient
Judging from the names alone, the first two are callback-related and the third is used for Binder communication.
Focus on CameraManagerGlobal first. Reading the code shows that CameraManager serves calls through a singleton: most calls are forwarded to the corresponding methods of CameraManagerGlobal, which is one manifestation of the singleton pattern.
IBinder cameraServiceBinder = ServiceManager.getService(CAMERA_SERVICE_BINDER_NAME);
if (cameraServiceBinder == null) {
    // Camera service is now down, leave mCameraService as null
    return;
}
try {
    cameraServiceBinder.linkToDeath(this, /*flags*/ 0);
} catch (RemoteException e) {
    // Camera service is now down, leave mCameraService as null
    return;
}
ICameraService cameraService = ICameraService.Stub.asInterface(cameraServiceBinder);
This is the key code through which CameraManager reaches CameraService's functionality. CameraService is the server side and CameraManager is the client side, so the client has to go through a proxy object to invoke the service.
Many of the callbacks in CameraManager are registered by the caller; CameraService invokes them at the right moments so that the camera pipeline keeps running smoothly.
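For example, one such callback that an app can register is CameraManager.AvailabilityCallback; a minimal sketch (the method, context and handler names are illustrative):

import android.content.Context;
import android.hardware.camera2.CameraManager;
import android.os.Handler;

void watchCameraAvailability(Context context, Handler handler) {
    CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
    manager.registerAvailabilityCallback(new CameraManager.AvailabilityCallback() {
        @Override
        public void onCameraAvailable(String cameraId) {
            // invoked when CameraService reports this camera as available again
        }

        @Override
        public void onCameraUnavailable(String cameraId) {
            // invoked when another client has opened this camera
        }
    }, handler);
}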
CameraMetadataNative info = cameraService.getCameraCharacteristics(cameraId);
try {
    info.setCameraId(Integer.parseInt(cameraId));
} catch (NumberFormatException e) {
    Log.e(TAG, "Failed to parse camera Id " + cameraId + " to integer");
}
info.setDisplaySize(displaySize);
characteristics = new CameraCharacteristics(info);
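From the application side this path is reached through CameraManager.getCameraCharacteristics(); a short usage sketch (the helper method name is illustrative):

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;

static Integer queryLensFacing(CameraManager manager, String cameraId) throws CameraAccessException {
    // ends up in CameraService.getCameraCharacteristics() shown above
    CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);
    return chars.get(CameraCharacteristics.LENS_FACING);  // LENS_FACING_BACK / LENS_FACING_FRONT / LENS_FACING_EXTERNAL
}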
Next, the Camera.java class (frameworks/base/core/java/android/hardware/Camera.java).
First look at its inner classes:
EventHandler
IAppOpsCallbackWrapper (extends IAppOpsCallback.Stub)
Face, Size, Area
CameraInfo, Parameters
When reading Camera.java, pay particular attention to setParameters(): it calls the native function native_setParameters to apply the camera parameter settings.
At the same time, look at these native methods:
native_autoFocus native_cancelAutoFocus
native_setParameters native_getParameters
native_release native_setup
native_takePicture
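These methods are declared as native in Camera.java and registered from the JNI side. The sketch below only illustrates the declaration pattern; the exact parameter lists differ between Android releases and are not copied from the source.

// Illustrative only: the real declarations live in android.hardware.Camera and are
// bound to android_hardware_Camera.cpp through its JNINativeMethod registration table.
public class CameraNativeSketch {
    private native int native_setup(Object cameraThis, int cameraId, String packageName);
    private native void native_release();
    private native void native_autoFocus();
    private native void native_cancelAutoFocus();
    private native void native_setParameters(String params);
    private native String native_getParameters();
    private native void native_takePicture(int msgType);
}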
So we need to look at the code in frameworks/base/core/jni/android_hardware_Camera.cpp.
native_setup corresponds to android_hardware_Camera_native_setup there.
The following files are all involved in the native_setup path:
frameworks/base/core/jni/android_hardware_Camera.cpp
frameworks/av/services/camera/libcameraservice/api1/CameraClient.cpp
frameworks/av/services/camera/libcameraservice/api1/Camera2Client.cpp
frameworks/av/camera/ICamera.cpp
frameworks/av/camera/ICameraClient.cpp
frameworks/av/camera/aidl/android/hardware/ICameraService.aidl: compiling this AIDL file generates the ICameraService.java interface (and the corresponding native Binder classes).
IBinder cameraServiceBinder = getBinderService(CAMERA_SERVICE_BINDER_NAME);
if (cameraServiceBinder == null) {
    Slog.w(TAG, "Could not notify cameraserver, camera service not available.");
    return false; // Camera service not active, cannot evict user clients.
}
try {
    cameraServiceBinder.linkToDeath(this, /*flags*/ 0);
} catch (RemoteException e) {
    Slog.w(TAG, "Could not link to death of native camera service");
    return false;
}
mCameraServiceRaw = ICameraService.Stub.asInterface(cameraServiceBinder);
From the code above, ICamera, ICameraClient and ICameraService are the Binder interfaces used for IPC.
A natural question: which process provides the original camera service? How is it started, and how are its functions invoked?
The original camera service is cameraserver.
frameworks/av/camera/cameraserver/main_cameraserver.cpp contains just one function, main().
It calls CameraService::instantiate(); see frameworks/av/services/camera/libcameraservice/CameraService.cpp:
class CameraService :
public BinderService<CameraService>,
public virtual ::android::hardware::BnCameraService,
public virtual IBinder::DeathRecipient,
public virtual CameraProviderManager::StatusListener
Clearly CameraService implements the service-side Binder operations, i.e. it acts as the Bn end.
In frameworks/av/camera/Camera.cpp:
status_t Camera::startPreview()
{
    ALOGV("startPreview");
    sp<::android::hardware::ICamera> c = mCamera;
    if (c == 0) return NO_INIT;
    int permission = cameraDoJudge();
    if (permission < 0) {
        return INVALID_OPERATION;
    }
    return c->startPreview();
}
This calls into frameworks/av/camera/ICamera.cpp (the Bp side):
status_t startPreview()
{
    ALOGV("startPreview");
    Parcel data, reply;
    data.writeInterfaceToken(ICamera::getInterfaceDescriptor());
    remote()->transact(START_PREVIEW, data, &reply);
    return reply.readInt32();
}
On the service (Bn) side the transaction is dispatched in BnCamera::onTransact():
    case START_PREVIEW: {
        ALOGV("START_PREVIEW");
        CHECK_INTERFACE(ICamera, data, reply);
        reply->writeInt32(startPreview());
        return NO_ERROR;
    } break;
Where does the call go next?
(Figure 3)
Look at Camera.h and Camera.cpp (frameworks/av/camera/Camera.cpp):
setPreviewTarget(), startPreview(), stopPreview(), previewEnabled();
class Camera :
public CameraBase<Camera>,
public ::android::hardware::BnCameraClient
So the Camera class in Camera.cpp implements the Bn (service) side of CameraClient.
It is worth looking at the relationship between ICameraClient, BnCameraClient and BpCameraClient.
Clearly ICameraClient establishes the Binder communication mechanism: the Bp end acts as the proxy and the Bn end as the service. A client calls functions on the Bp end to reach the real functionality on the Bn end; the only difference is that this Binder communication happens in the native layer rather than the Java layer. Looking at ICameraClient.cpp, the Bp end does nothing more than issue the remote transaction, e.g. remote()->transact(NOTIFY_CALLBACK, data, &reply, IBinder::FLAG_ONEWAY); where remote() refers to the Bn end. The Bn end is implemented in Camera.cpp; see the declaration in Camera.h:
class Camera : public CameraBase<Camera>, public ::android::hardware::BnCameraClient
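The same proxy/stub split exists on the Java side of Binder. As a rough analogy only (hand-written and simplified; AIDL normally generates this kind of code, and none of the names below are real framework classes), an interface with a Stub (the Bn equivalent) and a Proxy (the Bp equivalent) looks like this:

import android.os.Binder;
import android.os.IBinder;
import android.os.IInterface;
import android.os.Parcel;
import android.os.RemoteException;

// Hand-rolled analogy of the Bp/Bn pattern; names are illustrative.
interface IDemoCallback extends IInterface {
    int TRANSACTION_notify = IBinder.FIRST_CALL_TRANSACTION;
    String DESCRIPTOR = "com.example.IDemoCallback";

    void notifyCallback(int msgType) throws RemoteException;

    // "Bn" side: lives in the process that implements the callback.
    abstract class Stub extends Binder implements IDemoCallback {
        public Stub() { attachInterface(this, DESCRIPTOR); }

        @Override
        public IBinder asBinder() { return this; }

        @Override
        protected boolean onTransact(int code, Parcel data, Parcel reply, int flags)
                throws RemoteException {
            if (code == TRANSACTION_notify) {
                data.enforceInterface(DESCRIPTOR);
                notifyCallback(data.readInt());   // dispatch to the real implementation
                return true;
            }
            return super.onTransact(code, data, reply, flags);
        }
    }

    // "Bp" side: lives in the remote process; only marshals arguments and calls transact().
    class Proxy implements IDemoCallback {
        private final IBinder mRemote;
        Proxy(IBinder remote) { mRemote = remote; }

        @Override
        public IBinder asBinder() { return mRemote; }

        @Override
        public void notifyCallback(int msgType) throws RemoteException {
            Parcel data = Parcel.obtain();
            try {
                data.writeInterfaceToken(DESCRIPTOR);
                data.writeInt(msgType);
                mRemote.transact(TRANSACTION_notify, data, null, IBinder.FLAG_ONEWAY);
            } finally {
                data.recycle();
            }
        }
    }
}

The Proxy only packs arguments into a Parcel and calls transact(); the Stub's onTransact() unpacks them and dispatches to the real implementation, which is exactly the relationship between BpCameraClient and the Camera class described here, just expressed in Java.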
Now look at ICamera.h and ICamera.cpp.
First, class ICamera : public android::IInterface shows that ICamera is just an interface for inter-process (Binder) communication, split into a Bn end (BnCamera) and a Bp end (BpCamera). Reading ICamera.cpp shows that it implements the Bp end, while the Bn end is the Client class in CameraService.h: class Client : public hardware::BnCamera, public BasicClient. Note that the concrete ICamera server is not CameraService.cpp itself; what CameraService additionally holds is the Bp end of ICameraClient, and the server side of ICameraClient is Camera.cpp. Combined with Figure 3, the analysis is as follows:
There are now three groups of Binder objects: CameraService, Camera and CameraClient.
CameraService's Bn object is in CameraService.cpp, and its Bp end is obtained in Camera.cpp (in its base class CameraBase, as shown below).
Look at CameraService's inheritance:
class CameraService :
public BinderService<CameraService>,
public virtual ::android::hardware::BnCameraService,
public virtual IBinder::DeathRecipient,
public virtual CameraProviderManager::StatusListener
There is no ICameraService.h checked into the source tree; instead there is ICameraService.aidl. Look at the form of the functions:
virtual binder::Status getNumberOfCameras(int32_t type, int32_t* numCameras);
virtual binder::Status getCameraVendorTagDescriptor(
        /*out*/
        hardware::camera2::params::VendorTagDescriptor* desc);
They all use binder::Status, which shows that these interfaces come from the AIDL definition (the same one that produces the Java-side ICameraService) rather than from a hand-written native interface. Take getNumberOfCameras() as an example:
Status CameraService::getNumberOfCameras(int32_t type, int32_t* numCameras) {
    ATRACE_CALL();
    Mutex::Autolock l(mServiceLock);
    switch (type) {
        case CAMERA_TYPE_BACKWARD_COMPATIBLE:
            *numCameras = static_cast<int>(mNormalDeviceIds.size());
            break;
        case CAMERA_TYPE_ALL:
            *numCameras = mNumberOfCameras;
            break;
        default:
            ALOGW("%s: Unknown camera type %d",
                    __FUNCTION__, type);
            return STATUS_ERROR_FMT(ERROR_ILLEGAL_ARGUMENT,
                    "Unknown camera type %d", type);
    }
    return Status::ok();
}
Following the code further: this getNumberOfCameras() takes two parameters, yet the Java-layer call takes none. The trick is hidden back in Camera.cpp, or rather in its base class!
class Camera :
public CameraBase<Camera>,
public ::android::hardware::BnCameraClient
The Camera class derives from CameraBase<Camera>, and CameraBase is a template class whose .cpp file is what actually calls CameraService's getNumberOfCameras():
template <typename TCam, typename TCamTraits>
int CameraBase<TCam, TCamTraits>::getNumberOfCameras() {
    const sp<::android::hardware::ICameraService> cs = getCameraService();
    if (!cs.get()) {
        // as required by the public Java APIs
        return 0;
    }
    int32_t count;
    binder::Status res = cs->getNumberOfCameras(
            ::android::hardware::ICameraService::CAMERA_TYPE_BACKWARD_COMPATIBLE,
            &count);
    if (!res.isOk()) {
        ALOGE("Error reading number of cameras: %s",
                res.toString8().string());
        count = 0;
    }
    return count;
}
CameraBase exposes this interface without any parameters, and Camera inherits from CameraBase, so when the JNI layer calls getNumberOfCameras() it is actually the CameraBase implementation that does the work; there is one extra level of indirection here. Looking back at Figure 3, BnCamera and BpCamera are also used for Binder communication. BnCamera is inherited by CameraService's inner Client class, and Client itself is essentially a base class: CameraClient and Camera2Client are the classes that really implement the Client functionality. The Bp end lives in Camera.cpp. In fact, most of the camera features an application uses show up in Camera.cpp: preview, recording, autofocus, taking pictures, parameter setting and so on.
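At the Java layer the corresponding entry point is the parameterless legacy call; for example (the helper class below is illustrative):

import android.hardware.Camera;

public final class CameraEnumerator {
    // Legacy API: the parameterless Java call; the extra 'type' argument is supplied
    // by CameraBase::getNumberOfCameras() on the native side, as shown above.
    public static int countBackFacingCameras() {
        int backFacing = 0;
        int count = Camera.getNumberOfCameras();
        for (int i = 0; i < count; i++) {
            Camera.CameraInfo info = new Camera.CameraInfo();
            Camera.getCameraInfo(i, info);  // facing / orientation for camera i
            if (info.facing == Camera.CameraInfo.CAMERA_FACING_BACK) {
                backFacing++;
            }
        }
        return backFacing;
    }
}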
Now look at the Binder communication for CameraClient: BpCameraClient and BnCameraClient.
BpCameraClient is in ICameraClient.cpp, and CameraService holds a variable of this type. My guess is that it is passed from the Camera side to CameraService, because CameraService sometimes needs to call back into functionality implemented on the Camera side. See this function in CameraService:
Status CameraService::makeClient(const sp<CameraService>& cameraService,
        const sp<IInterface>& cameraCb, const String16& packageName, const String8& cameraId,
        int api1CameraId, int facing, int clientPid, uid_t clientUid, int servicePid,
        int halVersion, int deviceVersion, apiLevel effectiveApiLevel,
        /*out*/sp<BasicClient>* client) {
    // (excerpt: the cases below sit inside this function's switch over the HAL device version)
    case CAMERA_DEVICE_API_VERSION_1_0:
        if (effectiveApiLevel == API_1) {  // Camera1 API route
            sp<ICameraClient> tmp = static_cast<ICameraClient*>(cameraCb.get());
            *client = new CameraClient(cameraService, tmp, packageName,
                    api1CameraId, facing, clientPid, clientUid,
                    getpid());
        } else { // Camera2 API route
            ALOGW("Camera using old HAL version: %d", deviceVersion);
            return STATUS_ERROR_FMT(ERROR_DEPRECATED_HAL,
                    "Camera device \"%s\" HAL version %d does not support camera2 API",
                    cameraId.string(), deviceVersion);
        }
        break;
    case CAMERA_DEVICE_API_VERSION_3_0:
    case CAMERA_DEVICE_API_VERSION_3_1:
    case CAMERA_DEVICE_API_VERSION_3_2:
    case CAMERA_DEVICE_API_VERSION_3_3:
    case CAMERA_DEVICE_API_VERSION_3_4:
    case CAMERA_DEVICE_API_VERSION_3_5:
        if (effectiveApiLevel == API_1) { // Camera1 API route
            sp<ICameraClient> tmp = static_cast<ICameraClient*>(cameraCb.get());
            *client = new Camera2Client(cameraService, tmp, packageName,
                    cameraId, api1CameraId,
                    facing, clientPid, clientUid,
                    servicePid);
        } else { // Camera2 API route
            sp<hardware::camera2::ICameraDeviceCallbacks> tmp =
                    static_cast<hardware::camera2::ICameraDeviceCallbacks*>(cameraCb.get());
            *client = new CameraDeviceClient(cameraService, tmp, packageName, cameraId,
                    facing, clientPid, clientUid, servicePid);
        }
        break;
Having already seen Camera.cpp, it is clear that for CameraClient the Camera class is the Bn end. Take a look at which Binder interfaces ICameraClient defines:
class ICameraClient: public android::IInterface
{
public:
    DECLARE_META_INTERFACE(CameraClient);

    virtual void notifyCallback(int32_t msgType, int32_t ext1, int32_t ext2) = 0;
    virtual void dataCallback(int32_t msgType, const sp<IMemory>& data,
            camera_frame_metadata_t *metadata) = 0;
    virtual void dataCallbackTimestamp(nsecs_t timestamp, int32_t msgType,
            const sp<IMemory>& data) = 0;
    virtual void recordingFrameHandleCallbackTimestamp(nsecs_t timestamp,
            native_handle_t* handle) = 0;
    virtual void recordingFrameHandleCallbackTimestampBatch(
            const std::vector<nsecs_t>& timestamps,
            const std::vector<native_handle_t*>& handles) = 0;
};
Almost all of them carry the word Callback; they are clearly callbacks for CameraService to invoke at the right moments to get its work done.
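On the application side these native callbacks eventually surface through the legacy android.hardware.Camera callback interfaces; a minimal sketch of receiving preview frames (method name illustrative, buffer handling omitted):

import android.hardware.Camera;

// Each preview frame delivered through the native dataCallback path ends up here
// on the app's event thread.
void attachPreviewCallback(Camera camera) {
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera cam) {
            // 'data' holds one preview frame in the format set via Parameters.setPreviewFormat()
        }
    });
}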
One could say that a major reason for the complexity of the Camera framework is that it implements Binder communication in the native layer, which takes some getting used to compared with the Java layer. We have also skipped an important service, "media.camera"; I would like to look at it in the last stage of studying the Camera framework, because it is a native-layer service whose startup and invocation flow differ from the other Java services. If you want to trace the call from CameraService into the HAL layer, look specifically at the two classes Camera2Client and CameraClient.
CameraService::connect() is the function invoked when a client connects.
CameraService is the layer that talks to the HAL.
Camera API part:
frameworks/base/core/java/android/hardware/camera2
Camera JNI part:
frameworks/base/core/jni/android_hardware_Camera.cpp
The build options are in the Android.bp in that directory; build with: make libandroid_runtime -j1
Camera UI library part:
frameworks/av/camera/
The build options are in the Android.bp in that directory; build with: make libcamera_client -j1
Camera service part:
frameworks/av/services/camera/libcameraservice/
The build options are in the Android.mk in that directory; build with: make libcameraservice -j1
Camera HAL part:
hardware/qcom/camera/
Pay attention to the following AIDL files: after compilation some of them produce .h/.cpp files and some produce .java files:
ICameraDeviceUser.aidl,ICameraDeviceCallbacks.aidl
ICameraService.aidl,ICameraServiceProxy.aidl,ICameraServiceListener.aidl
The cameraUser here is the ICameraDeviceUser.Stub object created on the cameraservice side.
mRemoteDevice is the bridge between the application process and the Android camera service process: the upper-layer camera operations go through mRemoteDevice into the camera service, which in turn drives the underlying camera driver.
Reference: https://www.jianshu.com/p/1332d3864f7c
CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
manager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);

private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {

    @Override
    public void onOpened(@NonNull CameraDevice cameraDevice) {
        mCameraOpenCloseLock.release();
        mCameraDevice = cameraDevice;
        createCameraPreviewSession();
    }

    @Override
    public void onDisconnected(@NonNull CameraDevice cameraDevice) {
        mCameraOpenCloseLock.release();
        cameraDevice.close();
        mCameraDevice = null;
    }

    @Override
    public void onError(@NonNull CameraDevice cameraDevice, int error) {
        mCameraOpenCloseLock.release();
        cameraDevice.close();
        mCameraDevice = null;
        Activity activity = getActivity();
        if (null != activity) {
            activity.finish();
        }
    }
};
CameraManager --> openCamera ---> open the camera
CameraDeviceImpl --> createCaptureSession ---> create a capture session
CameraCaptureSession --> setRepeatingRequest ---> start repeating preview requests
CameraDeviceImpl --> capture ---> capture a still image
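A minimal app-side sketch of the middle two steps (assuming a preview Surface called previewSurface and a background Handler; the method name and error handling are illustrative):

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.os.Handler;
import android.view.Surface;
import java.util.Arrays;

void startPreview(final CameraDevice device, final Surface previewSurface, final Handler handler)
        throws CameraAccessException {
    final CaptureRequest.Builder builder =
            device.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    builder.addTarget(previewSurface);

    // createCaptureSession leads to CameraDeviceImpl.configureStreamsChecked on the framework side
    device.createCaptureSession(Arrays.asList(previewSurface),
            new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(CameraCaptureSession session) {
                    try {
                        // setRepeatingRequest keeps preview frames flowing until stopped
                        session.setRepeatingRequest(builder.build(), /*callback*/ null, handler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(CameraCaptureSession session) {
                    // stream configuration was rejected
                }
            }, handler);
}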
CameraDeviceImpl, ICameraDevice,
Camera3Device.cpp, CameraDeviceClient.cpp
Camera3Device.cpp is where the interaction with the HAL layer happens: mInterface->configureStreams(sessionBuffer, &config, bufferSizes);
1. CameraDeviceImpl->createCaptureSession (the related OnImageAvailableListener interface is shown here; an ImageReader usage sketch follows this list)
public interface OnImageAvailableListener {
    void onImageAvailable(ImageReader reader);
}
2. CameraDeviceImpl->createCaptureSessionInternal
3. CameraDeviceImpl->configureStreamsChecked
3.4 mDevice->createStream
3.5 mRemoteDevice.endConfigure
    createSurfaceFromGbp(mStreamInfoMap[streamId], true, surface, bufferProducer, physicalId); // creates the surface
4. The CameraCaptureSessionImpl constructor
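As referenced in step 1 above, OnImageAvailableListener is used together with ImageReader; a minimal wiring sketch (size, format and method name are illustrative):

import android.graphics.ImageFormat;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;

ImageReader createJpegReader(Handler handler) {
    ImageReader reader = ImageReader.newInstance(1920, 1080, ImageFormat.JPEG, /*maxImages*/ 2);
    reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader r) {
            try (Image image = r.acquireNextImage()) {
                // image.getPlanes()[0].getBuffer() holds the JPEG bytes
            }
        }
    }, handler);
    // reader.getSurface() is what gets passed to createCaptureSession() / addTarget()
    return reader;
}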
Android Camera internals: the setRepeatingRequest and capture modules
mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);
1.1 Send the CaptureRequest down to the lower layers.
1.2 Bind the returned request information to the CaptureCallback.
ICameraDeviceCallbacks.aidl is the callback interface through which the camera service process calls back into the application process; once the call arrives there, the CaptureCallback bound to the CaptureRequest is looked up and invoked, so the developer can consume the result directly.
mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
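For completeness, a minimal sketch of issuing a single still capture on an existing session (method and parameter names are illustrative and mirror the snippet above):

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.TotalCaptureResult;
import android.os.Handler;
import android.view.Surface;

void captureStill(CameraDevice device, CameraCaptureSession session,
        Surface jpegSurface, Handler handler) throws CameraAccessException {
    CaptureRequest.Builder builder =
            device.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
    builder.addTarget(jpegSurface);  // e.g. the ImageReader surface from the sketch above

    session.capture(builder.build(), new CameraCaptureSession.CaptureCallback() {
        @Override
        public void onCaptureCompleted(CameraCaptureSession s, CaptureRequest request,
                TotalCaptureResult result) {
            // reached via ICameraDeviceCallbacks.aidl and the bound CaptureCallback
        }
    }, handler);
}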
class AndroidCamera2AgentImpl extends CameraAgent
frameworks/ex/camera2/portability/src/com/android/ex/camera2/portability/AndroidCamera2AgentImpl.java
public void handleMessage(final Message msg)
CameraProvider
The diagram below shows that CameraProvider sits between CameraService and the camera driver.
hardware/interfaces/camera/common/1.0/default/CameraModule.cpp contains getNumberOfCameras(), which calls the corresponding function in hal3_2v4/SprdCamera3Factory.cpp under vendor/sprd/modules/libcamera/; this is where the flow connects to libcamera!
vendor/sprd/modules/libcamera/hal3_2v4/SprdCamera3Hal.cpp
camera_module_t HAL_MODULE_INFO_SYM = {}
Registration of the CameraProvider:
hardware/interfaces/camera/provider/2.4/default/service.cpp
hardware/interfaces/camera/provider/2.4/default/external-service.cpp
system/libhidl/transport/include/hidl/LegacySupport.h
hardware/interfaces/camera/device/3.5/default/CameraDevice.cpp
hardware/interfaces/camera/common/1.0/default/CameraModule.cpp
All of the files above are involved when the HAL layer interacts with CameraService.
camera service: frameworks/av/services/camera/
The discussion is split into three modules: Camera3Device, CameraProviderManager and Camera3Stream.
frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp
The origin of the calls and the starting point of the developer's control logic; it can also be seen as the control source.
frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.cpp
Where the interaction between camera service and camera provider is managed, keeping the session between the two healthy.
frameworks/av/services/camera/libcameraservice/device3/Camera3Stream.cpp
Manages the input/output streams: it fetches output buffers from the HAL layer and hands them back up for the upper layers to consume.
Camera3Device
CameraProviderManager
Camera3Stream
Session setup and CaptureRequest circulation between camera service and camera provider:
CameraProviderManager::openSession
CameraDevice::open
The main work is to install the upper layer's callbacks into the lower layer and to hand a usable camera session back up, enabling two-way communication between the layers.
1. What is the session that gets returned, and why does it matter?
The session is an ICameraDeviceSession object: the session established between camera provider and camera service for operating the camera device, which guarantees that camera service can make IPC calls into code running in the camera provider process.
1.1 Get the session's current request metadata queue.
1.2 Get the session's current result metadata queue.
2. Start running the capture request thread:
Camera3Device::RequestThread::threadLoop()