When a third-party app calls the Camera API1 startPreview interface, the Framework-side sequence diagram is as follows. A few points worth noting:
- the related instantiation logic happens earlier, at openCamera time;
- startPreview opens only a single stream, which maps to StreamingProcessor; within this flow, the logic corresponding to the configureStream step is carried out during the startStream call.
The main classes involved, and their source locations, are:
/frameworks/av/services/camera/libcameraservice/api1/Camera2Client.cpp
/frameworks/av/services/camera/libcameraservice/api1/client2/StreamingProcessor.cpp
/frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp
Next, I will follow the sequence diagram above and explain the flow in more depth against the actual code.
In the sections below I only cover the parts I paid attention to, not every call path and all of its details; I recommend walking through the flow yourself when you have time, as the impression and understanding you gain will be much deeper.
Judging from the sequence diagram, the flow can be split into three key parts:
- startPreview;
- updatePreviewStream;
- setStreamingRequest.
Before getting into the analysis, it is worth mentioning that the preview stream and the preview callback stream are two different streams: the preview stream is created and managed mainly by StreamingProcessor, while the preview callback stream corresponds mainly to CallbackProcessor.
First, note that after the app calls the Camera API1 startPreview interface, the call first goes through the JNI layer into android_hardware_Camera_startPreview, and only then reaches startPreview on the Camera2Client side.
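For reference, the JNI layer is only a thin forwarding stub: it resolves the native android::Camera proxy stored on the Java Camera object and calls its startPreview, which is what eventually lands in Camera2Client. A simplified sketch, paraphrased from frameworks/base/core/jni/android_hardware_Camera.cpp rather than quoted verbatim, with error handling trimmed:
// Simplified sketch of the JNI bridge; not verbatim AOSP code.
static void android_hardware_Camera_startPreview(JNIEnv *env, jobject thiz)
{
    ALOGV("startPreview");
    // Recover the native android::Camera proxy attached to the Java Camera object.
    sp<Camera> camera = get_native_camera(env, thiz, NULL);
    if (camera == 0) return;

    // Binder call into cameraserver; for an API1 client on a HAL3 device this
    // ends up in Camera2Client::startPreview().
    if (camera->startPreview() != NO_ERROR) {
        jniThrowRuntimeException(env, "startPreview failed");
    }
}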
Camera2Client::startPreview itself needs little explanation; it takes the locks, checks the calling PID, and hands off to startPreviewL:
status_t Camera2Client::startPreview() {
ATRACE_CALL();
ALOGV("%s: E", __FUNCTION__);
Mutex::Autolock icl(mBinderSerializationLock);
status_t res;
if ( (res = checkPid(__FUNCTION__) ) != OK) return res;
SharedParameters::Lock l(mParameters);
return startPreviewL(l.mParameters, false);
}
startPreviewL deserves more attention. It contains quite a lot, so I omit some statements and pick out the key points (the logic reflected in the sequence diagram).
NOTE: wherever code is omitted, I substitute the comment "N Lines are omitted here".
Here is the flow inside it:
- it calls updatePreviewStream, a method analyzed separately below; its main job here is to create the preview stream, which yields a corresponding streamID;
- mZslProcessor->deleteStream() is called on the branch where ZSL is not in use;
- updatePreviewRequest mainly converts the API1-style Parameters into API2-style Metadata (a rough sketch of it appears right after the code below);
- startStream, whose logic is more involved, is analyzed in detail later; note that this is where the flow hooks into configureStreams and interacts with the HAL.
status_t Camera2Client::startPreviewL(Parameters &params, bool restart) {
// NOTE: N Lines are omitted here
params.state = Parameters::STOPPED;
int lastPreviewStreamId = mStreamingProcessor->getPreviewStreamId();
res = mStreamingProcessor->updatePreviewStream(params);
if (res != OK) {
ALOGE("%s: Camera %d: Unable to update preview stream: %s (%d)",
__FUNCTION__, mCameraId, strerror(-res), res);
return res;
}
bool previewStreamChanged = mStreamingProcessor->getPreviewStreamId() != lastPreviewStreamId;
// NOTE: N Lines are omitted here
Vector<int32_t> outputStreams;
bool callbacksEnabled = (params.previewCallbackFlags &
CAMERA_FRAME_CALLBACK_FLAG_ENABLE_MASK) ||
params.previewCallbackSurface;
// NOTE: N Lines are omitted here
if (params.useZeroShutterLag() &&
getRecordingStreamId() == NO_STREAM) {
// NOTE: N Lines are omitted here
} else {
mZslProcessor->deleteStream();
}
outputStreams.push(getPreviewStreamId());
// NOTE: N Lines are omitted here
if (!params.recordingHint) {
if (!restart) {
res = mStreamingProcessor->updatePreviewRequest(params);
if (res != OK) {
ALOGE("%s: Camera %d: Can't set up preview request: "
"%s (%d)", __FUNCTION__, mCameraId,
strerror(-res), res);
return res;
}
}
res = mStreamingProcessor->startStream(StreamingProcessor::PREVIEW,
outputStreams);
} else {
// NOTE: N Lines are omitted here
}
params.state = Parameters::PREVIEW;
return OK;
}
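As promised in the list above, here is a rough idea of updatePreviewRequest, which is not quoted in this article: it is where the API1-style Parameters get folded into an API2-style CameraMetadata request. The sketch below is written from memory of the Android P sources, so the exact template constant and ordering may differ:
// Rough sketch of StreamingProcessor::updatePreviewRequest(); not verbatim AOSP code.
status_t StreamingProcessor::updatePreviewRequest(const Parameters &params) {
    status_t res;
    sp<CameraDeviceBase> device = mDevice.promote();
    if (device == 0) return INVALID_OPERATION;

    // Lazily ask the device for a PREVIEW template request the first time through.
    if (mPreviewRequest.entryCount() == 0) {
        res = device->createDefaultRequest(CAMERA3_TEMPLATE_PREVIEW, &mPreviewRequest);
        if (res != OK) return res;
    }

    // Translate the legacy API1 Parameters into API2 metadata tags.
    res = params.updateRequest(&mPreviewRequest);
    if (res != OK) return res;

    // Stamp a request ID so results can be matched back to the preview request.
    return mPreviewRequest.update(ANDROID_REQUEST_ID, &mPreviewRequestId, 1);
}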
The next part, StreamingProcessor::updatePreviewStream, is mainly responsible for creating the preview stream.
About this code: when no preview stream exists yet, it calls the device's createStream method to create an OutputStream; note that device here is an instance of Camera3Device.
status_t StreamingProcessor::updatePreviewStream(const Parameters &params) {
// NOTE: N Lines are omitted here
if (mPreviewStreamId == NO_STREAM) {
res = device->createStream(mPreviewWindow,
params.previewWidth, params.previewHeight,
CAMERA2_HAL_PIXEL_FORMAT_OPAQUE, HAL_DATASPACE_UNKNOWN,
CAMERA3_STREAM_ROTATION_0, &mPreviewStreamId, String8());
if (res != OK) {
ALOGE("%s: Camera %d: Unable to create preview stream: %s (%d)",
__FUNCTION__, mId, strerror(-res), res);
return res;
}
}
res = device->setStreamTransform(mPreviewStreamId,
params.previewTransform);
if (res != OK) {
ALOGE("%s: Camera %d: Unable to set preview stream transform: "
"%s (%d)", __FUNCTION__, mId, strerror(-res), res);
return res;
}
return OK;
}
Note that there are two createStream overloads here:
- the first one, taking a single Surface, is the one invoked initially; it forwards to the createStream overload that actually implements stream creation;
- the second one sets mNeedConfig = true, which steers how configureStreamsLocked is reached later, during startStream.
status_t Camera3Device::createStream(sp<Surface> consumer,
uint32_t width, uint32_t height, int format,
android_dataspace dataSpace, camera3_stream_rotation_t rotation, int *id,
const String8& physicalCameraId,
std::vector<int> *surfaceIds, int streamSetId, bool isShared, uint64_t consumerUsage) {
ATRACE_CALL();
if (consumer == nullptr) {
ALOGE("%s: consumer must not be null", __FUNCTION__);
return BAD_VALUE;
}
std::vector<sp<Surface>> consumers;
consumers.push_back(consumer);
return createStream(consumers, /*hasDeferredConsumer*/ false, width, height,
format, dataSpace, rotation, id, physicalCameraId, surfaceIds, streamSetId,
isShared, consumerUsage);
}
status_t Camera3Device::createStream(const std::vector<sp<Surface>>& consumers,
bool hasDeferredConsumer, uint32_t width, uint32_t height, int format,
android_dataspace dataSpace, camera3_stream_rotation_t rotation, int *id,
const String8& physicalCameraId,
std::vector<int> *surfaceIds, int streamSetId, bool isShared, uint64_t consumerUsage) {
// NOTE: N Lines are omitted here
status_t res;
bool wasActive = false;
switch (mStatus) {
case STATUS_ERROR:
CLOGE("Device has encountered a serious error");
return INVALID_OPERATION;
case STATUS_UNINITIALIZED:
CLOGE("Device not initialized");
return INVALID_OPERATION;
case STATUS_UNCONFIGURED:
case STATUS_CONFIGURED:
// OK
break;
case STATUS_ACTIVE:
ALOGV("%s: Stopping activity to reconfigure streams", __FUNCTION__);
res = internalPauseAndWaitLocked(maxExpectedDuration);
if (res != OK) {
SET_ERR_L("Can't pause captures to reconfigure streams!");
return res;
}
wasActive = true;
break;
default:
SET_ERR_L("Unexpected status: %d", mStatus);
return INVALID_OPERATION;
}
assert(mStatus != STATUS_ACTIVE);
sp<Camera3OutputStream> newStream;
// NOTE: N Lines are omitted here
if (format == HAL_PIXEL_FORMAT_BLOB) {
// NOTE: N Lines are omitted here
} else if (format == HAL_PIXEL_FORMAT_RAW_OPAQUE) {
ssize_t rawOpaqueBufferSize = getRawOpaqueBufferSize(width, height);
if (rawOpaqueBufferSize <= 0) {
SET_ERR_L("Invalid RAW opaque buffer size %zd", rawOpaqueBufferSize);
return BAD_VALUE;
}
newStream = new Camera3OutputStream(mNextStreamId, consumers[0],
width, height, rawOpaqueBufferSize, format, dataSpace, rotation,
mTimestampOffset, physicalCameraId, streamSetId);
} else if (isShared) {
// NOTE: N Lines are omitted here
} else if (consumers.size() == 0 && hasDeferredConsumer) {
// NOTE: N Lines are omitted here
} else {
// NOTE: N Lines are omitted here
}
// NOTE: N Lines are omitted here
newStream->setStatusTracker(mStatusTracker);
newStream->setBufferManager(mBufferManager);
res = mOutputStreams.add(mNextStreamId, newStream);
if (res < 0) {
SET_ERR_L("Can't add new stream to set: %s (%d)", strerror(-res), res);
return res;
}
*id = mNextStreamId++;
mNeedConfig = true;
// NOTE: N Lines are omitted here
return OK;
}
This part is triggered by startStream. Its job is to lead into the ConfigureStream-related logic and to start continuous preview via setRepeatingRequests.
The logic of this function is fairly simple:
- it updates ANDROID_REQUEST_OUTPUT_STREAMS in the request; these lines refresh the set of output streams currently in use, which is what captureRequests->mOutputStreams.size() in Camera3Device::RequestThread::prepareHalRequests later reflects;
- it calls setStreamingRequest, which is where the real subject of this part begins.
status_t StreamingProcessor::startStream(StreamType type,
        const Vector<int32_t> &outputStreams) {
// NOTE: N Lines are omitted here
CameraMetadata &request = (type == PREVIEW) ?
mPreviewRequest : mRecordingRequest;
res = request.update(
ANDROID_REQUEST_OUTPUT_STREAMS,
outputStreams);
// NOTE: N Lines are omitted here
res = device->setStreamingRequest(request);
if (res != OK) {
ALOGE("%s: Camera %d: Unable to set preview request to start preview: "
"%s (%d)",
__FUNCTION__, mId, strerror(-res), res);
return res;
}
mActiveRequest = type;
mPaused = false;
mActiveStreamIds = outputStreams;
return OK;
}
This one is just an entry point; it converts the metadata into request lists and forwards to setStreamingRequestList:
status_t Camera3Device::setStreamingRequest(const CameraMetadata &request,
int64_t* /*lastFrameNumber*/) {
ATRACE_CALL();
List<const PhysicalCameraSettingsList> requestsList;
std::list<const SurfaceMap> surfaceMaps;
convertToRequestList(requestsList, surfaceMaps, request);
return setStreamingRequestList(requestsList, /*surfaceMap*/surfaceMaps,
/*lastFrameNumber*/NULL);
}
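The convertToRequestList call above simply wraps the single repeating CameraMetadata into the list types the later code expects, and builds a stream-id to surface-index map from ANDROID_REQUEST_OUTPUT_STREAMS. A rough sketch of what it does, based on my reading of the P source (details may differ slightly):
// Rough sketch of Camera3Device::convertToRequestList(); not verbatim AOSP code.
void Camera3Device::convertToRequestList(List<const PhysicalCameraSettingsList>& requestsList,
        std::list<const SurfaceMap>& surfaceMaps, const CameraMetadata& request) {
    // One logical-camera settings entry wrapping the single repeating request.
    PhysicalCameraSettingsList requestList;
    requestList.push_back({std::string(getId().string()), request});
    requestsList.push_back(requestList);

    // API1 attaches exactly one Surface per stream, so every stream id maps to
    // surface index 0 in the SurfaceMap.
    SurfaceMap surfaceMap;
    camera_metadata_ro_entry streams = request.find(ANDROID_REQUEST_OUTPUT_STREAMS);
    for (size_t i = 0; i < streams.count; i++) {
        surfaceMap[streams.data.i32[i]].push_back(0);
    }
    surfaceMaps.push_back(surfaceMap);
}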
This function is not very interesting by itself; it just continues down into submitRequestsHelper.
status_t Camera3Device::setStreamingRequestList(
const List<const PhysicalCameraSettingsList> &requestsList,
const std::list<const SurfaceMap> &surfaceMaps, int64_t *lastFrameNumber) {
ATRACE_CALL();
return submitRequestsHelper(requestsList, surfaceMaps, /*repeating*/true, lastFrameNumber);
}
There are a few key actions in here:
status_t Camera3Device::submitRequestsHelper(
const List<const PhysicalCameraSettingsList> &requests,
const std::list<const SurfaceMap> &surfaceMaps,
bool repeating,
/*out*/
int64_t *lastFrameNumber) {
// NOTE: N Lines are omitted here
RequestList requestList;
res = convertMetadataListToRequestListLocked(requests, surfaceMaps,
repeating, /*out*/&requestList);
if (res != OK) {
// error logged by previous call
return res;
}
if (repeating) {
res = mRequestThread->setRepeatingRequests(requestList, lastFrameNumber);
} else {
res = mRequestThread->queueRequestList(requestList, lastFrameNumber);
}
if (res == OK) {
waitUntilStateThenRelock(/*active*/true, kActiveTimeout);
if (res != OK) {
SET_ERR_L("Can't transition to active in %f seconds!",
kActiveTimeout/1e9);
}
ALOGV("Camera %s: Capture request %" PRId32 " enqueued", mId.string(),
(*(requestList.begin()))->mResultExtras.requestId);
} else {
CLOGE("Cannot queue request. Impossible.");
return BAD_VALUE;
}
return res;
}
The main thing to focus on here is:
status_t Camera3Device::convertMetadataListToRequestListLocked(
const List<const PhysicalCameraSettingsList> &metadataList,
const std::list<const SurfaceMap> &surfaceMaps,
bool repeating,
RequestList *requestList) {
// NOTE: N Lines are omitted here
for (; metadataIt != metadataList.end() && surfaceMapIt != surfaceMaps.end();
++metadataIt, ++surfaceMapIt) {
sp<CaptureRequest> newRequest = setUpRequestLocked(*metadataIt, *surfaceMapIt);
if (newRequest == 0) {
CLOGE("Can't create capture request");
return BAD_VALUE;
}
newRequest->mRepeating = repeating;
// Setup burst Id and request Id
// NOTE: N Lines are omitted here
requestList->push_back(newRequest);
ALOGV("%s: requestId = %" PRId32, __FUNCTION__, newRequest->mResultExtras.requestId);
}
// NOTE: N Lines are omitted here
return OK;
}
Two things happen in here:
sp<Camera3Device::CaptureRequest> Camera3Device::setUpRequestLocked(
const PhysicalCameraSettingsList &request, const SurfaceMap &surfaceMap) {
status_t res;
if (mStatus == STATUS_UNCONFIGURED || mNeedConfig) {
// This point should only be reached via API1 (API2 must explicitly call configureStreams)
// so unilaterally select normal operating mode.
res = filterParamsAndConfigureLocked(request.begin()->metadata,
CAMERA3_STREAM_CONFIGURATION_NORMAL_MODE);
// Stream configuration failed. Client might try other configuraitons.
if (res != OK) {
CLOGE("Can't set up streams: %s (%d)", strerror(-res), res);
return NULL;
} else if (mStatus == STATUS_UNCONFIGURED) {
// Stream configuration successfully configure to empty stream configuration.
CLOGE("No streams configured");
return NULL;
}
}
sp<CaptureRequest> newRequest = createCaptureRequest(request, surfaceMap);
return newRequest;
}
This one is fairly simple:
status_t Camera3Device::filterParamsAndConfigureLocked(const CameraMetadata& sessionParams,
int operatingMode) {
//Filter out any incoming session parameters
const CameraMetadata params(sessionParams);
camera_metadata_entry_t availableSessionKeys = mDeviceInfo.find(
ANDROID_REQUEST_AVAILABLE_SESSION_KEYS);
CameraMetadata filteredParams(availableSessionKeys.count);
camera_metadata_t *meta = const_cast<camera_metadata_t *>(
filteredParams.getAndLock());
set_camera_metadata_vendor_id(meta, mVendorTagId);
filteredParams.unlock(meta);
if (availableSessionKeys.count > 0) {
for (size_t i = 0; i < availableSessionKeys.count; i++) {
camera_metadata_ro_entry entry = params.find(
availableSessionKeys.data.i32[i]);
if (entry.count > 0) {
filteredParams.update(entry);
}
}
}
return configureStreamsLocked(operatingMode, filteredParams);
}
Breaking this function down:
- startConfiguration is called on each output stream; I will not go further into its internals here;
- Camera3Device::HalInterface::configureStreams is then invoked; it calls down to mHidlSession_3_4->configureStreams_3_4, which enters the HAL-side configure-stream step;
- matching the earlier startConfiguration, finishConfiguration is then called on each output stream;
- the request thread is notified through its configurationComplete action;
- finally, the preparer thread's resume function is called so that it can keep running.
status_t Camera3Device::configureStreamsLocked(int operatingMode,
const CameraMetadata& sessionParams, bool notifyRequestThread) {
// NOTE: N Lines are omitted here
// Workaround for device HALv3.2 or older spec bug - zero streams requires
// adding a dummy stream instead.
// TODO: Bug: 17321404 for fixing the HAL spec and removing this workaround.
if (mOutputStreams.size() == 0) {
addDummyStreamLocked();
} else {
tryRemoveDummyStreamLocked();
}
// Start configuring the streams
ALOGV("%s: Camera %s: Starting stream configuration", __FUNCTION__, mId.string());
mPreparerThread->pause();
// NOTE: N Lines are omitted here
for (size_t i = 0; i < mOutputStreams.size(); i++) {
// Don't configure bidi streams twice, nor add them twice to the list
if (mOutputStreams[i].get() ==
static_cast<Camera3StreamInterface*>(mInputStream.get())) {
config.num_streams--;
continue;
}
camera3_stream_t *outputStream;
outputStream = mOutputStreams.editValueAt(i)->startConfiguration();
if (outputStream == NULL) {
CLOGE("Can't start output stream configuration");
cancelStreamsConfigurationLocked();
return INVALID_OPERATION;
}
streams.add(outputStream);
// NOTE: N Lines are omitted here
}
config.streams = streams.editArray();
// Do the HAL configuration; will potentially touch stream
// max_buffers, usage, priv fields.
const camera_metadata_t *sessionBuffer = sessionParams.getAndLock();
res = mInterface->configureStreams(sessionBuffer, &config, bufferSizes);
sessionParams.unlock(sessionBuffer);
// NOTE: N Lines are omitted here
for (size_t i = 0; i < mOutputStreams.size(); i++) {
sp<Camera3OutputStreamInterface> outputStream =
mOutputStreams.editValueAt(i);
if (outputStream->isConfiguring() && !outputStream->isConsumerConfigurationDeferred()) {
res = outputStream->finishConfiguration();
if (res != OK) {
CLOGE("Can't finish configuring output stream %d: %s (%d)",
outputStream->getId(), strerror(-res), res);
cancelStreamsConfigurationLocked();
return BAD_VALUE;
}
}
}
// Request thread needs to know to avoid using repeat-last-settings protocol
// across configure_streams() calls
if (notifyRequestThread) {
mRequestThread->configurationComplete(mIsConstrainedHighSpeedConfiguration, sessionParams);
}
char value[PROPERTY_VALUE_MAX];
property_get("camera.fifo.disable", value, "0");
int32_t disableFifo = atoi(value);
if (disableFifo != 1) {
// Boost priority of request thread to SCHED_FIFO.
pid_t requestThreadTid = mRequestThread->getTid();
res = requestPriority(getpid(), requestThreadTid,
kRequestThreadPriority, /*isForApp*/ false, /*asynchronous*/ false);
if (res != OK) {
ALOGW("Can't set realtime priority for request processing thread: %s (%d)",
strerror(-res), res);
} else {
ALOGD("Set real time priority for request queue thread (tid %d)", requestThreadTid);
}
}
// NOTE: N Lines are omitted here
mNeedConfig = false;
internalUpdateStatusLocked((mDummyStreamId == NO_STREAM) ?
STATUS_CONFIGURED : STATUS_UNCONFIGURED);
ALOGV("%s: Camera %s: Stream configuration complete", __FUNCTION__, mId.string());
// tear down the deleted streams after configure streams.
mDeletedStreams.clear();
auto rc = mPreparerThread->resume();
if (rc != OK) {
SET_ERR_L("%s: Camera %s: Preparer thread failed to resume!", __FUNCTION__, mId.string());
return rc;
}
return OK;
}
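What the HAL receives through mInterface->configureStreams() is, conceptually, still the legacy camera3_stream_configuration: each stream carries the negotiated geometry, format and dataspace, while the HAL fills in fields such as max_buffers and usage. As a reminder, here is an abridged view of the legacy types from hardware/libhardware/include/hardware/camera3.h, trimmed to the relevant fields (see the header for the authoritative definition):
// Abridged from hardware/camera3.h; only the fields relevant to configure_streams are kept.
typedef struct camera3_stream {
    int stream_type;            // CAMERA3_STREAM_OUTPUT for the preview stream
    uint32_t width;
    uint32_t height;
    int format;                 // e.g. HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED
    uint32_t usage;             // gralloc usage bits, negotiated with the HAL
    uint32_t max_buffers;       // filled in by the HAL during configure_streams
    void *priv;                 // HAL-private per-stream cookie
    android_dataspace_t data_space;
    int rotation;
    // later HAL versions append further fields (e.g. physical_camera_id)
} camera3_stream_t;

typedef struct camera3_stream_configuration {
    uint32_t num_streams;
    camera3_stream_t **streams;
    uint32_t operation_mode;    // e.g. CAMERA3_STREAM_CONFIGURATION_NORMAL_MODE
    // HAL 3.4 adds: const camera_metadata_t *session_parameters;
} camera3_stream_configuration_t;
On Android P the binderized path then converts this into the HIDL StreamConfiguration and calls configureStreams_3_4, as noted in the list above.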
Once ConfigureStream is done, the CaptureRequest instance is created and its stream information is filled in:
sp<Camera3Device::CaptureRequest> Camera3Device::createCaptureRequest(
const PhysicalCameraSettingsList &request, const SurfaceMap &surfaceMap) {
ATRACE_CALL();
status_t res;
sp<CaptureRequest> newRequest = new CaptureRequest;
newRequest->mSettingsList = request;
// NOTE: N Lines are omitted here
camera_metadata_entry_t streams =
newRequest->mSettingsList.begin()->metadata.find(ANDROID_REQUEST_OUTPUT_STREAMS);
if (streams.count == 0) {
CLOGE("Zero output streams specified!");
return NULL;
}
for (size_t i = 0; i < streams.count; i++) {
int idx = mOutputStreams.indexOfKey(streams.data.i32[i]);
if (idx == NAME_NOT_FOUND) {
CLOGE("Request references unknown stream %d",
streams.data.u8[i]);
return NULL;
}
sp<Camera3OutputStreamInterface> stream =
mOutputStreams.editValueAt(idx);
// It is illegal to include a deferred consumer output stream into a request
auto iter = surfaceMap.find(streams.data.i32[i]);
if (iter != surfaceMap.end()) {
const std::vector<size_t>& surfaces = iter->second;
for (const auto& surface : surfaces) {
if (stream->isConsumerConfigurationDeferred(surface)) {
CLOGE("Stream %d surface %zu hasn't finished configuration yet "
"due to deferred consumer", stream->getId(), surface);
return NULL;
}
}
newRequest->mOutputSurfaces[i] = surfaces;
}
// NOTE: N Lines are omitted here
newRequest->mOutputStreams.push(stream);
}
newRequest->mSettingsList.begin()->metadata.erase(ANDROID_REQUEST_OUTPUT_STREAMS);
newRequest->mBatchSize = 1;
return newRequest;
}
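To make "its stream information" concrete: a CaptureRequest is essentially the settings metadata plus the resolved output stream objects that the request should fill. The following is an abridged sketch of the nested Camera3Device::CaptureRequest class, written from memory of Camera3Device.h in P, so the exact member set may differ slightly:
// Abridged sketch of Camera3Device::CaptureRequest; not the full definition.
class CaptureRequest : public LightRefBase<CaptureRequest> {
  public:
    PhysicalCameraSettingsList                          mSettingsList;   // per-camera metadata settings
    sp<camera3::Camera3Stream>                          mInputStream;    // only used for reprocessing
    Vector<sp<camera3::Camera3OutputStreamInterface> >  mOutputStreams;  // resolved from ANDROID_REQUEST_OUTPUT_STREAMS
    SurfaceMap                                          mOutputSurfaces; // per-stream surface indices
    CaptureResultExtras                                 mResultExtras;   // request/frame bookkeeping for results
    int                                                 mBatchSize;      // >1 for high-speed recording bursts
    bool                                                mRepeating;      // true for the repeating preview request
};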
Once the CaptureRequest is created, control returns to submitRequestsHelper, which then calls this function; it stores the repeating request list and calls unpauseForNewRequests to signal the relevant thread:
status_t Camera3Device::RequestThread::setRepeatingRequests(
const RequestList &requests,
/*out*/
int64_t *lastFrameNumber) {
ATRACE_CALL();
Mutex::Autolock l(mRequestLock);
if (lastFrameNumber != NULL) {
*lastFrameNumber = mRepeatingLastFrameNumber;
}
mRepeatingRequests.clear();
mRepeatingRequests.insert(mRepeatingRequests.begin(),
requests.begin(), requests.end());
unpauseForNewRequests();
mRepeatingLastFrameNumber = hardware::camera2::ICameraDeviceUser::NO_IN_FLIGHT_REPEATING_FRAMES;
return OK;
}
What to note here is the mRequestSignal.signal() call: it fires the signal that the wait inside Camera3Device::RequestThread::waitForNextRequestLocked is blocked on.
void Camera3Device::RequestThread::unpauseForNewRequests() {
ATRACE_CALL();
// With work to do, mark thread as unpaused.
// If paused by request (setPaused), don't resume, to avoid
// extra signaling/waiting overhead to waitUntilPaused
mRequestSignal.signal();
Mutex::Autolock p(mPauseLock);
if (!mDoPause) {
ALOGV("%s: RequestThread: Going active", __FUNCTION__);
if (mPaused) {
sp<StatusTracker> statusTracker = mStatusTracker.promote();
if (statusTracker != 0) {
statusTracker->markComponentActive(mStatusId);
}
}
mPaused = false;
}
}
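On the consuming side, RequestThread::threadLoop() blocks in waitForNextRequestLocked() until a request shows up or this signal fires. When only a repeating request is set, the whole repeating list is re-enqueued each time the queue drains, which is what keeps preview capture requests flowing continuously. A heavily abridged sketch, written from memory of the P sources rather than quoted verbatim:
// Heavily abridged sketch of Camera3Device::RequestThread::waitForNextRequestLocked();
// the real function also handles pause/exit state and frame-number bookkeeping.
sp<Camera3Device::CaptureRequest>
        Camera3Device::RequestThread::waitForNextRequestLocked() {
    sp<CaptureRequest> nextRequest;
    while (mRequestQueue.empty()) {
        if (!mRepeatingRequests.empty()) {
            // Repeating (preview) mode: atomically refill the queue from the
            // repeating list so a complete in-sequence set is always submitted.
            const RequestList &requests = mRepeatingRequests;
            RequestList::const_iterator firstRequest = requests.begin();
            nextRequest = *firstRequest;
            mRequestQueue.insert(mRequestQueue.end(), ++firstRequest, requests.end());
            break;
        }
        // One-shot mode: sleep until setRepeatingRequests()/queueRequestList()
        // fires mRequestSignal via unpauseForNewRequests().
        status_t res = mRequestSignal.waitRelative(mRequestLock, kRequestTimeout);
        if (res == TIMED_OUT) {
            return NULL; // the real code re-checks exit/pause conditions here
        }
    }
    if (nextRequest == NULL && !mRequestQueue.empty()) {
        nextRequest = *(mRequestQueue.begin());
        mRequestQueue.erase(mRequestQueue.begin());
    }
    return nextRequest;
}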
This article has looked at how, on Android P, the Framework translates a camera app's Camera API1 startPreview call into HAL3 logic to keep the old API working.
The next article will cover the sequence that follows when, after startPreview, the app goes on to call setPreviewCallbackWithBuffer (which triggers setPreviewCallbackFlag).