Question
I've been looking at different ways of grabbing a YUV frame from a video stream, but most of what I've seen relies on getting the width and height from previewSize. However, a cell phone can shoot video at 720p while many phones can only display the preview at a lower resolution (e.g. 800x480), so is it possible to grab a screenshot that's closer to 1920x1080 (if video is being shot at 720p)? Or am I forced to use the preview resolution (800x400 on some phones)?
Thanks
Answer 1:
Yes, you can.*
*Conditions apply:
- You need access to the middle layer, the media framework to be more precise.
- No, it cannot be done through the application alone.
Now, if you want to do it at the media-framework level, here are the steps:
- Assuming you are using Froyo or above, the default media framework is StageFright.
- In StageFright, go to the method onVideoEvent; after a buffer is read from the mVideoSource, use the mVideoBuffer to access the video frame at its original resolution (a sketch of such a frame grab follows below).
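A minimal sketch of that frame grab is shown below, assuming the Froyo/Gingerbread libstagefright headers. Only onVideoEvent, mVideoSource and mVideoBuffer come from the answer; captureYuvFrame and its out-parameters are hypothetical names added for illustration.

```cpp
#include <stdlib.h>
#include <string.h>

#include <media/stagefright/MediaBuffer.h>
#include <media/stagefright/MediaSource.h>
#include <media/stagefright/MetaData.h>

using namespace android;

// Copies the decoded frame out of mVideoBuffer at the stream's real resolution,
// independent of the preview/display size. Intended to be called from
// AwesomePlayer::onVideoEvent() right after mVideoSource->read(&mVideoBuffer, ...).
static bool captureYuvFrame(const sp<MediaSource> &videoSource,
                            MediaBuffer *videoBuffer,
                            uint8_t **outData, size_t *outSize,
                            int32_t *outWidth, int32_t *outHeight) {
    if (videoBuffer == NULL || videoBuffer->range_length() == 0) {
        return false;
    }

    // The decoder advertises the real frame dimensions in its output format.
    sp<MetaData> format = videoSource->getFormat();
    if (!format->findInt32(kKeyWidth, outWidth) ||
        !format->findInt32(kKeyHeight, outHeight)) {
        return false;
    }

    const uint8_t *src =
            (const uint8_t *)videoBuffer->data() + videoBuffer->range_offset();
    *outSize = videoBuffer->range_length();

    // Copy before the buffer is passed on to the renderer and released.
    *outData = (uint8_t *)malloc(*outSize);
    if (*outData == NULL) {
        return false;
    }
    memcpy(*outData, src, *outSize);
    return true;
}
```

This ignores decoder-specific details; a production version would also account for the decoder's stride/alignment padding and color format.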
Linking this with your application:
- You will need a button in the application to trigger the screen capture.
- Once the user presses this button, read the video frame from the location mentioned above and return the buffer to the Java layer (see the JNI sketch after this list).
- From there you can use a JPEG encoder to convert the raw video frame to an image.
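For the "return this buffer to the Java layer" step, one way is a small JNI bridge like the sketch below; the class and method names and the gCapturedData/gCapturedSize globals are hypothetical placeholders for wherever the framework hook stored its copy.

```cpp
#include <jni.h>
#include <stdlib.h>

static uint8_t *gCapturedData = NULL;   // filled by the framework-side hook (sketch)
static size_t   gCapturedSize = 0;

// Hypothetical static native method on com.example.capture.FrameBridge.
extern "C" JNIEXPORT jbyteArray JNICALL
Java_com_example_capture_FrameBridge_nativeGetCapturedFrame(JNIEnv *env, jclass) {
    if (gCapturedData == NULL || gCapturedSize == 0) {
        return NULL;
    }

    // Wrap the raw YUV bytes in a Java byte[] so the app can encode them.
    jbyteArray result = env->NewByteArray((jsize)gCapturedSize);
    if (result == NULL) {
        return NULL;   // OutOfMemoryError has already been thrown
    }
    env->SetByteArrayRegion(result, 0, (jsize)gCapturedSize,
                            (const jbyte *)gCapturedData);

    free(gCapturedData);
    gCapturedData = NULL;
    gCapturedSize = 0;
    return result;
}
```

On the Java side, android.graphics.YuvImage.compressToJpeg() is one readily available JPEG encoder, provided the bytes are in a format it accepts (e.g. NV21).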
EDIT:
Re-reading your question, you were asking about a screen capture during recording, i.e. on the camera path. Even for this there is no way to achieve it in the application alone; you will have to do something similar, but you will need access to CameraSource in the StageFright framework.
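The shape of that change would be similar to the onVideoEvent sketch above. Very roughly, and with mCaptureRequested and storeCapturedFrame as hypothetical additions (the existing CameraSource logic is elided):

```cpp
// Hypothetical hook in frameworks/base/media/libstagefright/CameraSource.cpp.
// CameraSource::read() hands raw camera frames, at the recording resolution
// rather than the preview resolution, to the encoder.
status_t CameraSource::read(MediaBuffer **buffer, const ReadOptions *options) {
    // ... existing CameraSource code that waits for and dequeues the next frame ...

    if (mCaptureRequested && *buffer != NULL) {   // mCaptureRequested: hypothetical flag
        const uint8_t *src =
                (const uint8_t *)(*buffer)->data() + (*buffer)->range_offset();
        storeCapturedFrame(src, (*buffer)->range_length());   // hypothetical helper
        mCaptureRequested = false;
    }

    return OK;
}
```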
Source: https://stackoverflow.com/questions/9426372/grab-frame-from-video-in-android