libavformat

ffmpeg: RGB to YUV conversion loses color and scale

巧了我就是萌, submitted on 2019-11-29 08:51:03
I am trying to convert RGB frames to YUV420P format in ffmpeg/libav. Below is the code for the conversion, along with the images before and after conversion. The converted image loses all color information and the scale also changes significantly. Does anybody have an idea how to handle this? I am completely new to ffmpeg/libav!

    // Did we get a video frame?
    if (frameFinished) {
        i++;
        sws_scale(img_convert_ctx, (const uint8_t * const *)pFrame->data,
                  pFrame->linesize, 0, pCodecCtx->height,
                  pFrameRGB->data, pFrameRGB->linesize);
        //==============================================================
        AVFrame

How can libavformat be used without using other libav libraries?

若如初见., submitted on 2019-11-29 04:03:44
I would like a simple working example of using just libavformat to mux video. There are nice examples (doc/examples/muxing.c) that show encoding with libavcodec, muxing with libavformat, and saving the data with libavio. However, there is no example I know of that uses libavformat by itself, feeding encoded data in from a buffer and getting muxed data out in a buffer. The difficulty is two-fold: first, adding a stream with avformat_new_stream(AVFormatContext *s, const AVCodec *c) requires a reference

Record RTSP stream with FFmpeg libavformat

耗尽温柔, submitted on 2019-11-28 16:54:33
I'm trying to record an RTSP stream from an Axis camera with FFmpeg libavformat. I can grab video from files and then save it to another file; this works fine. But the camera sends strange data: the FPS is 100 and the camera sends every 4th frame, so the resulting FPS is about 25. But libavformat sets the packet dts/pts for 90000 fps (the default?) and the new file's stream has 100 fps. The result is a one-hour video with only 100 frames. Here is my code:

    #include <stdio.h>
    #include <stdlib.h>
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libavformat/avio.h>

    int main(int argc, char** argv) {
        AVFormatContext* context

Reading a file located in memory with libavformat

若如初见., submitted on 2019-11-28 16:24:16
I'm currently trying to read small video files sent from a server. In order to read a file using libavformat, you are supposed to call

    av_open_input_file(&avFormatContext, "C:\\path\\to\\video.avi", 0, 0, 0);

The problem is that in this case the file is not on disk, but in memory. What I'm doing for the moment is downloading the file, writing it to disk under a temporary name, and then calling av_open_input_file with the temporary file name, which is not a very clean solution. In fact, what I want is a function like

    av_open_custom(&avFormatContext, &myReadFunction, &mySeekFunction);

but

Raw H264 frames in mpegts container using libavcodec

天大地大妈咪最大, submitted on 2019-11-27 10:28:10
I would really appreciate some help with the following issue: I have a gadget with a camera that produces H264-compressed video frames, and these frames are sent to my application. The frames are not in a container, just raw data. I want to use ffmpeg and libav functions to create a video file which can be used later. If I decode the frames and then encode them, everything works fine and I get a valid video file. (The decode/encode steps are the usual libav calls, nothing fancy; I took them from the almighty internet and they are rock solid.) However, I waste a lot of time by decoding and
