Question
This is regarding Android's Camera2 API. Since the capture result and the output frame are produced asynchronously, one could receive the capture result well before the actual frame. Is there a good way to associate a produced frame with its corresponding capture result?
Answer 1:
Assuming you are talking about a frame that is sent to an ImageReader or SurfaceTexture upon capture (as in the ubiquitous camera2basic example), the trick is to compare the unique timestamps identifying the images.
- Save the TotalCaptureResult somewhere accessible when it is delivered to your CameraCaptureSession.CaptureCallback's onCaptureCompleted(...) call.
- Then, when the actual image is available via your ImageReader.OnImageAvailableListener or SurfaceTexture.OnFrameAvailableListener, get the image's timestamp:
imageTimestamp = Long.valueOf(reader.acquireNextImage().getTimestamp());
or
imageTimestamp = Long.valueOf(surfaceTexture.getTimestamp());
respectively.
- Compare timestamps with:
imageTimestamp.equals(totalCaptureResult.get(CaptureResult.SENSOR_TIMESTAMP));
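A minimal sketch of how those three steps can fit together, assuming an ImageReader target and, for brevity, only one capture in flight at a time; the class and field names here (FrameResultMatcher, lastResult) are illustrative rather than part of the original answer, and both callbacks are assumed to run on the same Handler thread:

    import android.hardware.camera2.CameraCaptureSession;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.CaptureResult;
    import android.hardware.camera2.TotalCaptureResult;
    import android.media.Image;
    import android.media.ImageReader;

    class FrameResultMatcher {
        // Most recently completed capture result (simplification: one capture in flight).
        private TotalCaptureResult lastResult;

        // Step 1: save the TotalCaptureResult when the capture completes.
        final CameraCaptureSession.CaptureCallback captureCallback =
                new CameraCaptureSession.CaptureCallback() {
                    @Override
                    public void onCaptureCompleted(CameraCaptureSession session,
                                                   CaptureRequest request,
                                                   TotalCaptureResult result) {
                        lastResult = result;
                    }
                };

        // Steps 2 and 3: read the image's timestamp and compare it to SENSOR_TIMESTAMP.
        final ImageReader.OnImageAvailableListener imageListener = reader -> {
            Image image = reader.acquireNextImage();
            if (image == null) {
                return;
            }
            Long imageTimestamp = image.getTimestamp();
            if (lastResult != null
                    && imageTimestamp.equals(lastResult.get(CaptureResult.SENSOR_TIMESTAMP))) {
                // image and lastResult describe the same frame; process them together here.
            }
            image.close();
        };
    }

With several captures in flight, a single lastResult field is not enough; see the last note below about keeping a set of results.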
Notes:
- The timestamp may not be a true system timestamp for your device, but it is guaranteed to be unique and monotonically increasing, so it works as an ID.
- If you are sending the image to a SurfaceHolder or something else instead, you're out of luck, as only the pixel information gets sent, not the timestamp present in the Image object. I'm not sure about the other places you can send a frame, e.g., MediaRecorder or Allocation, but I think not.
- You probably need to add each new TotalCaptureResult to a growing set as they are generated, and then compare an incoming image's timestamp against all of these, because of the asynchronous nature you noted. I'll let you figure out how to do that as you see fit (one possible approach is sketched after these notes).
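One way to implement that bookkeeping is to keep pending results and pending images keyed by timestamp and match whichever side arrives second. The sketch below is only an illustration of that idea (handlePair is a hypothetical consumer), and it assumes both callbacks are delivered on the same Handler thread; otherwise, add synchronization around the maps:

    import android.hardware.camera2.CaptureResult;
    import android.hardware.camera2.TotalCaptureResult;
    import android.media.Image;
    import java.util.HashMap;
    import java.util.Map;

    class PendingFrameMatcher {
        private final Map<Long, TotalCaptureResult> pendingResults = new HashMap<>();
        private final Map<Long, Image> pendingImages = new HashMap<>();

        // Call from onCaptureCompleted(...).
        void addResult(TotalCaptureResult result) {
            Long timestamp = result.get(CaptureResult.SENSOR_TIMESTAMP);
            Image image = pendingImages.remove(timestamp);
            if (image != null) {
                handlePair(image, result);
            } else {
                pendingResults.put(timestamp, result);
            }
        }

        // Call from onImageAvailable(...) with the acquired Image.
        void addImage(Image image) {
            long timestamp = image.getTimestamp();
            TotalCaptureResult result = pendingResults.remove(timestamp);
            if (result != null) {
                handlePair(image, result);
            } else {
                pendingImages.put(timestamp, image);
            }
        }

        // Hypothetical consumer; close the Image when done with it.
        private void handlePair(Image image, TotalCaptureResult result) {
            // ... process the matched frame together with its metadata ...
            image.close();
        }
    }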
Answer 2:
I had to solve a similar situation (syncing frames across surfaces); Sumner's solution (.getTimestamp() of the respective received Image object) did the trick for me for SurfaceTexture and ImageReader.
Just a quick note on other surfaces (which, as pointed out, don't give you an Image object): at least for MediaCodec, the BufferInfo object received by the onOutputBufferAvailable callback has a presentationTimeUs, which is "derived from the presentation timestamp passed in with the corresponding input buffer" and, at least for me, appears to match the timestamps from the other surfaces. (Note the different unit, though.)
Source: https://stackoverflow.com/questions/28960172/android-camera2-associate-totalcaptureresult-with-frame