Android Face Detection API - Stored video file


Alternatively, I can see that Frame.Builder has the methods setImageData and setTimestampMillis. If I were able to read the video in as a ByteBuffer, how would I pass that to the FaceDetector API?

Simply call SparseArray<Face> faces = detector.detect(frame);, where the detector is created like this:

FaceDetector detector = new FaceDetector.Builder(context)
        .setProminentFaceOnly(true) // only detect the single most prominent face in each frame
        .build();
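
To cover the ByteBuffer part of the question: Frame.Builder.setImageData takes the raw pixel data together with the frame dimensions and an image format (NV16, NV21 or YV12). Below is a minimal sketch, assuming you already have one NV21-encoded frame from your video decoder; frameData, width, height and timestampMs are placeholder names, not something from the original post:

import java.nio.ByteBuffer;
import android.graphics.ImageFormat;
import android.util.SparseArray;
import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;

// Sketch: run the detector on one raw NV21 frame instead of a Bitmap.
SparseArray<Face> detectInRawFrame(FaceDetector detector, ByteBuffer frameData,
                                   int width, int height, long timestampMs) {
    Frame frame = new Frame.Builder()
            .setImageData(frameData, width, height, ImageFormat.NV21) // NV16 and YV12 are also accepted
            .setTimestampMillis(timestampMs) // position of this frame within the video
            .build();
    return detector.detect(frame);
}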

If processing time is not an issue, MediaMetadataRetriever.getFrameAtTime solves the problem. As Anton suggested, you can then use FaceDetector.detect on each extracted frame:

Bitmap bitmap;
Frame frame;
SparseArray<Face> faces;
int deltaT = 250000; // sampling step in us, = 1000000/fps (here 4 frames per second, see below)
MediaMetadataRetriever mMMR = new MediaMetadataRetriever();
mMMR.setDataSource(videoPath);
String durationMs = mMMR.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION); // video duration, in ms
int totalVideoTime = 1000 * Integer.valueOf(durationMs); // total video time, in us
for (int time_us = 1; time_us < totalVideoTime; time_us += deltaT) {
    bitmap = mMMR.getFrameAtTime(time_us, MediaMetadataRetriever.OPTION_CLOSEST_SYNC); // bitmap of the key frame closest to time_us
    if (bitmap == null) break;
    frame = new Frame.Builder().setBitmap(bitmap).build(); // wrap the bitmap in a Frame that can be fed to a face detector
    faces = detector.detect(frame); // detect the faces (detector is the FaceDetector built above)
    // TODO ... do something with "faces"
}
// remember to release mMMR when you are done with it

where deltaT = 1000000/fps and fps is the desired number of frames to analyze per second. For example, to extract 4 frames every second, use deltaT = 250000. (Note that faces is overwritten on every iteration, so you should do something with the results, e.g. store or report them, inside the loop.)
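
As an illustration of what could go inside the loop, here is a minimal sketch (not part of the original answer) of a hypothetical helper that just logs the bounding box of every detected face, iterating the SparseArray with keyAt/valueAt and using android.util.Log:

// Hypothetical helper for the "do something with faces" step: log each face's bounding box.
void logFaces(SparseArray<Face> faces, int timeUs) {
    for (int i = 0; i < faces.size(); i++) {
        Face face = faces.valueAt(i);
        Log.d("FaceDetection", "t=" + timeUs + "us face#" + faces.keyAt(i)
                + " at (" + face.getPosition().x + ", " + face.getPosition().y + ")"
                + " size " + face.getWidth() + "x" + face.getHeight());
    }
}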
