I have extracted frames from a video using MediaMetadataRetriever and stored all the images in an ArrayList. I want to save all of them to the SD card.
Just convert your milliseconds to microseconds, because getFrameAtTime takes its time argument in microseconds, not milliseconds.
1 millisecond = 1000 microseconds.
for (long i = 1000000; i < (long) millis * 1000; i += 1000000) {
    Bitmap bitmap = retriever.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
    rev.add(bitmap);
}
That should solve your problem. I wrote the following according to my own needs.
public class MainActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        File videoFile = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/screenshots/", "myvideo.mp4");
        Uri videoFileUri = Uri.parse(videoFile.toString());
        MediaMetadataRetriever retriever = new MediaMetadataRetriever();
        retriever.setDataSource(videoFile.getAbsolutePath());
        ArrayList<Bitmap> rev = new ArrayList<Bitmap>();

        // Create a MediaPlayer just to read the duration (in milliseconds)
        MediaPlayer mp = MediaPlayer.create(getBaseContext(), videoFileUri);
        int millis = mp.getDuration();
        mp.release();

        // getFrameAtTime expects microseconds; grab one sync frame per second
        for (long i = 1000000; i < (long) millis * 1000; i += 1000000) {
            Bitmap bitmap = retriever.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
            rev.add(bitmap);
        }
        retriever.release();
    }
}
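The snippet above collects the frames but does not write them out, which is what the question actually asks for. Below is a minimal sketch of the saving step; the helper name `FrameSaver` and the placeholder JPEG bytes are illustrative only. On Android the bytes would come from `Bitmap.compress(Bitmap.CompressFormat.JPEG, quality, stream)` on each bitmap in `rev`; a plain `byte[]` stands in here so the file-naming and stream-handling logic can run anywhere.

```java
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.text.DecimalFormat;

public class FrameSaver {
    // Writes one frame's bytes to outDir as frame_00000.jpg, frame_00001.jpg, ...
    // On Android, jpegBytes would be produced by Bitmap.compress(...) into a
    // ByteArrayOutputStream; here it is whatever the caller supplies.
    public static File saveFrame(File outDir, int index, byte[] jpegBytes) throws IOException {
        DecimalFormat counter = new DecimalFormat("00000"); // zero-padded so files sort correctly
        File out = new File(outDir, "frame_" + counter.format(index) + ".jpg");
        BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(out));
        try {
            bos.write(jpegBytes);
            bos.flush();
        } finally {
            bos.close();
        }
        return out;
    }

    public static void main(String[] args) throws IOException {
        File dir = new File(System.getProperty("java.io.tmpdir"), "frames");
        dir.mkdirs();
        byte[] fakeJpeg = new byte[] {(byte) 0xFF, (byte) 0xD8}; // placeholder, not a real frame
        File f = FrameSaver.saveFrame(dir, 7, fakeJpeg);
        System.out.println(f.getName()); // frame_00007.jpg
    }
}
```

On a real device you would also need the `WRITE_EXTERNAL_STORAGE` permission for the directory used above.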
@Navneet Krishna
Videos have key frames (sometimes called "sync" frames). The issue might be that, with OPTION_CLOSEST_SYNC, every requested time resolves to the same nearby sync frame, so you keep getting the first frame back.
...
Try replacing this line:
Bitmap bitmap=retriever.getFrameAtTime(i,OPTION_CLOSEST_SYNC);
with this (which gets closest-available frame to the given time):
Bitmap bitmap=retriever.getFrameAtTime(i,OPTION_CLOSEST);
Read the documentation on OPTION_CLOSEST and OPTION_CLOSEST_SYNC.
The code below is untested, but it shows the general idea of how to do it, so let me know if it helps you.
Result should be one picture per second of video duration. Test with a short video.
// Create a MediaPlayer just to read the duration (in milliseconds)
MediaPlayer mp = MediaPlayer.create(getBaseContext(), videoFileUri);
int millis = mp.getDuration();
mp.release();

for (int i = 0; i < millis; i += 1000) {
    // getFrameAtTime takes microseconds, so convert (long avoids int overflow)
    Bitmap bmp = retriever.getFrameAtTime((long) i * 1000, MediaMetadataRetriever.OPTION_CLOSEST);
    if (bmp != null) { rev.add(bmp); }
}
retriever.release();
long time;
String formattedFileCount;
FileOutputStream fos;
BufferedOutputStream bos;
NumberFormat fileCountFormatter = new DecimalFormat("00000");
int fileCount = 0;
File jpegFile;
ArrayList<Bitmap> bArray = new ArrayList<Bitmap>();
Bitmap lastbitmap = null;
I take the time in microseconds, getting the duration from the MediaPlayer like this:
time = mp.getDuration() * 1000L;
Log.e("Time", String.valueOf(time));
bArray.clear();
MediaMetadataRetriever mRetriever = new MediaMetadataRetriever();
mRetriever.setDataSource(path);
int j=0;
My frame rate is 12 frames/sec, so one frame lasts 1/12 = 0.083333 seconds; converting that to microseconds gives 83333.
for (long i = 83333; i <= time; i += 83333) {
    Bitmap frame = mRetriever.getFrameAtTime(i, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
    if (frame == null) continue; // getFrameAtTime can return null
    bArray.add(frame);
    formattedFileCount = fileCountFormatter.format(fileCount);
    lastbitmap = bArray.get(j);
    j++;
    // lastbitmap is the frame just extracted
    jpegFile = new File(Environment.getExternalStorageDirectory().getPath() + "/frames/frame_" + formattedFileCount + ".jpg");
    fileCount++;
    try {
        fos = new FileOutputStream(jpegFile);
        bos = new BufferedOutputStream(fos);
        lastbitmap.compress(Bitmap.CompressFormat.JPEG, 15, bos);
        bos.flush();
        bos.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
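The frame-interval arithmetic above (12 fps, so 1/12 s = 83333 µs per frame) can be checked with plain integer math:

```java
public class FrameInterval {
    public static void main(String[] args) {
        int fps = 12;
        // Integer division truncates 83333.33 to 83333; the lost 0.33 µs per
        // frame is negligible over a typical clip.
        long intervalMicros = 1_000_000L / fps;
        System.out.println(intervalMicros); // 83333
    }
}
```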
Using MediaMetadataRetriever I was able to get screenshots of the video.
https://code.google.com/p/android/issues/detail?id=35794
Try this in your for loop:
Bitmap bitmap = retriever.getFrameAtTime(
        TimeUnit.MICROSECONDS.convert(i, TimeUnit.MILLISECONDS),
        MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
MediaMetadataRetriever's getFrameAtTime method takes its time argument in microseconds (1/1,000,000th of a second) rather than milliseconds, so in your case it was always rounding down to the first frame.
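A quick check of the milliseconds-to-microseconds conversion used above, with java.util.concurrent.TimeUnit:

```java
import java.util.concurrent.TimeUnit;

public class UnitCheck {
    public static void main(String[] args) {
        long millis = 1500;
        // Converting 1500 ms to µs multiplies by 1000
        long micros = TimeUnit.MICROSECONDS.convert(millis, TimeUnit.MILLISECONDS);
        System.out.println(micros); // 1500000
    }
}
```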