Question
From FFmpeg's GitHub, I use encode_video.c to generate a 1-second video. Here is the example in question: https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/encode_video.c
I compile with: gcc -Wall -o ffencode encode_video.c -lavcodec -lavutil -lz -lm
Clean compile, zero warnings.
I test the program by running: ./ffencode video.mp4 libx264
Lots of stats are printed out (expected from the source code), as well as ffmpeg logs, but ultimately no errors or warnings.
However, the generated output video.mp4 can only be played by ffplay; VLC (as well as Google Chrome) fails to play the video.
Playing it via the vlc command line actually prints:
[00007ffd3550fec0] main libvlc: Running vlc with the default interface. Use 'cvlc' to use vlc without interface.
TagLib: MP4: Invalid atom size
TagLib: MP4: Invalid atom size
TagLib: MP4: Invalid atom size
Looking at the ffprobe output, the bitrate and duration fields are empty:
Input #0, h264, from 'video.mp4':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: h264 (High), yuv420p(progressive), 352x288, 25 fps, 25 tbr, 1200k tbn, 50 tbc
I am using ffmpeg 4.1 with the following configuration:
ffprobe version 4.1 Copyright (c) 2007-2018 the FFmpeg developers
built with Apple LLVM version 10.0.0 (clang-1000.11.45.5)
configuration: --prefix=/usr/local/Cellar/ffmpeg/4.1 --enable-shared --enable-pthreads --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gpl --enable-libmp3lame --enable-libopus --enable-libsnappy --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-opencl --enable-videotoolbox
libavutil 56. 22.100 / 56. 22.100
libavcodec 58. 35.100 / 58. 35.100
libavformat 58. 20.100 / 58. 20.100
libavdevice 58. 5.100 / 58. 5.100
libavfilter 7. 40.101 / 7. 40.101
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 3.100 / 5. 3.100
libswresample 3. 3.100 / 3. 3.100
libpostproc 55. 3.100 / 55. 3.100
Any ideas how to fix this? It is pretty surprising to see an API's official example lacking such basic information.
Answer 1:
You will need to mux your video stream into a video container such as .mp4. The muxing machinery lives in libavformat. The algorithm goes roughly like this (a minimal sketch follows the list):
- Initialize the format library by invoking av_register_all, or register only the formats of interest manually.
- Create a muxing context by invoking avformat_alloc_context.
- Create one or more media streams by invoking avformat_new_stream.
- Write the header by invoking avformat_write_header.
- Write the media data by invoking av_write_frame.
- Write the trailer by invoking av_write_trailer.
- Destroy the muxing context by invoking avformat_free_context.
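A minimal sketch of how those steps could plug into encode_video.c, assuming "enc" is the already-opened AVCodecContext from the example; the helper names mux_open, mux_write and mux_close are made up here for illustration and are not libavformat functions. The sketch uses avformat_alloc_output_context2 rather than a bare avformat_alloc_context, since it also picks the MP4 muxer from the file name; otherwise the list above maps one-to-one onto the calls.

#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

static AVFormatContext *ofmt;
static AVStream *st;

static int mux_open(const AVCodecContext *enc, const char *filename)
{
    /* Allocates the muxing context and picks the container (here MP4)
     * from the output file name. */
    int ret = avformat_alloc_output_context2(&ofmt, NULL, NULL, filename);
    if (ret < 0)
        return ret;

    st = avformat_new_stream(ofmt, NULL);
    if (!st)
        return AVERROR(ENOMEM);
    st->time_base = enc->time_base;

    /* Copy codec id, dimensions, extradata, etc. into the stream.
     * Note: when the muxer sets AVFMT_GLOBALHEADER (MP4 does), the encoder
     * should be opened with AV_CODEC_FLAG_GLOBAL_HEADER as well. */
    ret = avcodec_parameters_from_context(st->codecpar, enc);
    if (ret < 0)
        return ret;

    if (!(ofmt->oformat->flags & AVFMT_NOFILE)) {
        ret = avio_open(&ofmt->pb, filename, AVIO_FLAG_WRITE);
        if (ret < 0)
            return ret;
    }
    return avformat_write_header(ofmt, NULL);
}

/* Call this instead of fwrite() for every packet the encoder produces. */
static int mux_write(AVPacket *pkt, const AVCodecContext *enc)
{
    av_packet_rescale_ts(pkt, enc->time_base, st->time_base);
    pkt->stream_index = st->index;
    return av_interleaved_write_frame(ofmt, pkt);
}

/* Call this once after the encoder has been flushed, instead of fclose(). */
static void mux_close(void)
{
    av_write_trailer(ofmt);
    if (!(ofmt->oformat->flags & AVFMT_NOFILE))
        avio_closep(&ofmt->pb);
    avformat_free_context(ofmt);
}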
Answer 2:
The example generates a raw bitstream - it is not an MP4. The example is only meant to demonstrate encoding, not muxing (the term for packaging one or more streams into a container file format like MP4).
Rename the extension to .h264 and test with VLC. No idea whether Chrome supports raw .h264 files.
To generate an MP4 from this output, run:
ffmpeg -i video.mp4 -c copy actually.mp4
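If the misleading .mp4 extension confuses probing, you can also force ffmpeg's raw H.264 demuxer explicitly with the standard -f option, e.g.:
ffmpeg -f h264 -i video.mp4 -c copy actually.mp4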
Answer 3:
"can only be played by ffplay, and VLC Player (as well as Google Chrome) fail to play the video." Since ffplay is able to play the video, other media players can play it too. You need to configure VLC to read the raw H.264 encoded video: go to Preferences, select All, and change the demuxer to H264.
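From the command line the same thing can be done by forcing the demuxer with VLC's --demux option, assuming the demuxer module is named h264 in your VLC build:
vlc --demux h264 video.mp4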
Source: https://stackoverflow.com/questions/53328946/cannot-play-video-output-of-libavcodec-ffmpeg-encoding-example