audio-video-sync

FFmpeg - What does non monotonically increasing dts mean?

不羁的心 submitted on 2020-08-04 04:03:30
Question: Observations - Part I. I saw a suggestion elsewhere to run the following command to see if there's something wrong with my .mp4:
ffmpeg -v error -i ~/Desktop/5_minute_sync_output_15mn.mp4 -f null - 2>error.log
When I run the above command, I see a whole bunch of log entries along the lines of what's shown below:
Application provided invalid, non monotonically increasing dts to muxer in stream 0: 15635 >= 15635
From searching and reading up quite a bit, I understand that the decoding …
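One commonly suggested workaround, when the underlying streams are intact, is to remux the file and let FFmpeg regenerate timestamps. This is a hedged sketch rather than anything from the original question, and the file names are placeholders:
# Copy the streams without re-encoding and regenerate missing presentation timestamps
ffmpeg -fflags +genpts -i input.mp4 -c copy remuxed.mp4
# Re-check the remuxed file for errors, using the same check as above
ffmpeg -v error -i remuxed.mp4 -f null - 2>remuxed_error.log
If the timestamps are genuinely out of order rather than merely duplicated, dropping -c copy and re-encoding may be necessary instead.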

Combining an audio and video stream using gstreamer [closed]

孤人 submitted on 2019-12-05 04:41:31
I am streaming an mp4 (MPEG-4) file from one device to another using GStreamer over an RTP stream. Basically, I am splitting the mp4 file into its audio and video streams and sending both to the other device, where they get streamed. Now I want to save the mp4 file to disk on the other device, but my problem is that I am only able to save the audio and video files separately, and they cannot be played individually. I am confused about how to combine both the audio and video RTP streams to form my mp4 file again and save it to a file on the other device. Here are the command lines: Sender (Server) …
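One way to end up with a single playable mp4 on the receiver is to run one receiving pipeline that depayloads both RTP streams and feeds them into a single mp4mux. The sketch below assumes H.264 video on UDP port 5000 and AAC audio on port 5002 with payload types 96 and 97; none of that comes from the question, and the caps on each udpsrc must match what the sender pipeline actually produces. The -e flag makes the muxer finalize the file cleanly on shutdown:
gst-launch-1.0 -e mp4mux name=mux ! filesink location=received.mp4 \
  udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" \
    ! rtph264depay ! h264parse ! mux. \
  udpsrc port=5002 caps="application/x-rtp,media=audio,clock-rate=44100,encoding-name=MP4A-LATM,payload=97" \
    ! rtpmp4adepay ! aacparse ! mux.
Because both branches go through the same muxer, the resulting file contains one audio track and one video track instead of two separate files that cannot be played on their own.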

iOS AVFoundation audio/video out of sync

你离开我真会死。 submitted on 2019-12-03 08:11:23
The Problem: During every playback, the audio is between 1 and 2 seconds behind the video.
The Setup: The assets are loaded with AVURLAssets from a media stream. To write the composition, I'm using AVMutableCompositions and AVMutableCompositionTracks with asymmetric timescales. The audio and video are both streamed to the device. The timescale for audio is 44100; the timescale for video is 600. Playback is done with AVPlayer.
Attempted Solutions: Using videoAssetTrack.timeRange for [composition insertTimeRange]. Using CMTimeRangeMake(kCMTimeZero, videoAssetTrack.duration). Using …
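For reference, here is a minimal Swift sketch of assembling such a composition. It is not the asker's code: the function name and error type are invented, the asset stands in for the AVURLAsset loaded from the stream, and inserting each track with its own timeRange (rather than reusing the video track's range for both insertions) is offered as one thing to try when the two source tracks' time ranges differ, not as the confirmed fix.

import AVFoundation

enum CompositionError: Error { case missingTrack }

// Hedged sketch of the setup described above.
func makeComposition(from asset: AVURLAsset) throws -> AVMutableComposition {
    let composition = AVMutableComposition()

    guard
        let videoTrack = asset.tracks(withMediaType: .video).first,
        let audioTrack = asset.tracks(withMediaType: .audio).first,
        let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid),
        let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
    else { throw CompositionError.missingTrack }

    // Insert each source track using its own timeRange; when the audio and video
    // tracks of a streamed asset have different time ranges, reusing one track's
    // range for both insertions is a common source of a constant A/V offset.
    try compVideo.insertTimeRange(videoTrack.timeRange, of: videoTrack, at: .zero)
    try compAudio.insertTimeRange(audioTrack.timeRange, of: audioTrack, at: .zero)

    return composition
}

// Playback, as in the question:
// let player = AVPlayer(playerItem: AVPlayerItem(asset: try makeComposition(from: asset)))

The composition tracks keep their native timescales (600 for video, 44100 for audio); per-track timescale differences are normal in AVFoundation and CMTime converts between them, so the asymmetric timescales by themselves are usually not the cause of the offset.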