CMSampleBuffer

How to convert CMSampleBuffer to Data in Swift?

Submitted by 只谈情不闲聊 on 2019-12-17 20:37:16
Question: I need to convert a CMSampleBuffer to Data. I am using a third-party framework for an audio-related task; it delivers streaming (i.e. real-time) audio as CMSampleBuffer objects, like this:

func didAudioStreaming(audioSample: CMSampleBuffer!) {
    // Here I need to convert this to Data,
    // because I am using the gRPC framework for audio recognition.
}

Please provide the steps to convert the CMSampleBuffer to Data. FYI:

let formatDesc:CMFormatDescription? =
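A minimal sketch of one common answer to this question: copy the bytes out of the sample buffer's backing CMBlockBuffer. This assumes the buffer actually carries a data buffer (true for the compressed/PCM audio delivered by most capture and streaming callbacks); it is not the asker's framework-specific code.

```swift
import CoreMedia
import Foundation

/// Copies the payload of an audio CMSampleBuffer into a Data value.
/// Returns nil if the buffer has no block buffer or the copy fails.
func data(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else {
        return nil
    }
    let length = CMBlockBufferGetDataLength(blockBuffer)
    var bytes = [UInt8](repeating: 0, count: length)
    // Copy out of the (possibly non-contiguous) block buffer.
    let status = CMBlockBufferCopyDataBytes(blockBuffer,
                                            atOffset: 0,
                                            dataLength: length,
                                            destination: &bytes)
    guard status == kCMBlockBufferNoErr else { return nil }
    return Data(bytes)
}
```

The resulting Data can then be handed to the gRPC stream; note that it contains raw samples in whatever format the CMFormatDescription describes, so the recognizer's expected sample rate and encoding still need to match.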

AVAssetWriter rotate buffer for video orientation

Submitted by 北慕城南 on 2019-12-11 15:40:40
Question: I'm working on a live recording app in Swift using AVFoundation, and I have an issue with video orientation. I use AVAssetWriter rather than AVCaptureMovieFileOutput because I need to record in a square format (correct me if I'm wrong). I tried videoInput.transform, but I've heard it is not supported by all video players. I can't set AVCaptureConnection.videoOrientation based on the device orientation because of some "main UI thread stops". I read that the best solution is to rotate
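For context, the transform approach the asker mentions looks roughly like the sketch below (the square 480×480 settings are illustrative assumptions, not from the question). It only writes a rotation matrix into the track metadata, which is why some players ignore it; physically rotating the pixels is the heavier alternative.

```swift
import AVFoundation
import CoreGraphics

// Sketch: a square H.264 writer input with a 90° rotation hint in the
// track metadata. Players that honor the transform display it rotated;
// some third-party players do not.
func makeVideoInput() -> AVAssetWriterInput {
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 480,
        AVVideoHeightKey: 480
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = true
    // Metadata-only rotation: cheap, applied at playback time.
    input.transform = CGAffineTransform(rotationAngle: .pi / 2)
    return input
}
```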

Convert CMSampleBuffer to UIImage

Submitted by 六月ゝ 毕业季﹏ on 2019-12-07 05:30:10
Question: I am trying to convert a sampleBuffer to a UIImage and display it in an image view with a gray color space, but it displays like the following image. I think there is a problem with the conversion. How can I convert the CMSampleBuffer?

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    print("buffered")
    let imageBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
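A hedged sketch of the conversion that usually avoids color-space mistakes: let Core Image interpret the pixel buffer instead of building a CGContext by hand. This assumes the capture output delivers a pixel format CIImage understands (e.g. BGRA); it is not necessarily what was wrong in the asker's code.

```swift
import AVFoundation
import UIKit

/// Converts a video CMSampleBuffer to a UIImage via Core Image.
func image(from sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return nil
    }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    // CIContext handles the pixel-format and color-space interpretation.
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

In practice the CIContext should be created once and reused, not per frame, and any UIImageView update must be dispatched to the main queue since the delegate runs on a capture queue.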

How to fill audio AVFrame (ffmpeg) with the data obtained from CMSampleBufferRef (AVFoundation)?

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-05 09:30:29
I am writing a program for streaming live audio and video from a webcam to an RTMP server. I work on Mac OS X 10.8, so I use the AVFoundation framework to obtain audio and video frames from the input devices. These frames arrive in the delegate method:

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

where sampleBuffer contains audio or video data. When I receive audio data in the sampleBuffer, I try to convert this data into an AVFrame and encode the AVFrame with libavcodec:

aframe = avcodec_alloc_frame();
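The bridge step this question needs is getting at the raw PCM behind the CMSampleBuffer; those bytes are what gets memcpy'd into the AVFrame's data planes on the ffmpeg side. A sketch of the extraction, using the AudioBufferList API rather than the asker's code (the ffmpeg half is omitted):

```swift
import CoreMedia
import CoreAudioTypes

/// Extracts the raw PCM bytes from an audio CMSampleBuffer,
/// one Data per audio buffer (one per channel for non-interleaved PCM).
func pcmBuffers(from sampleBuffer: CMSampleBuffer) -> [Data] {
    var audioBufferList = AudioBufferList()
    var blockBuffer: CMBlockBuffer?  // retained so the list's pointers stay valid
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: &audioBufferList,
        bufferListSize: MemoryLayout<AudioBufferList>.size,
        blockBufferAllocator: nil,
        blockBufferMemoryAllocator: nil,
        flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        blockBufferOut: &blockBuffer)
    let buffers = UnsafeMutableAudioBufferListPointer(&audioBufferList)
    return buffers.compactMap { buffer in
        guard let base = buffer.mData else { return nil }
        return Data(bytes: base, count: Int(buffer.mDataByteSize))
    }
}
```

The sample format reported by the CMFormatDescription (sample rate, bit depth, interleaving) must match the AVFrame's format and the encoder's expectations, which is where most glitches in this pipeline come from.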

Split CMSampleBufferRef containing Audio

Submitted by 大兔子大兔子 on 2019-12-03 22:25:03
Question: I'm splitting the recording into different files while recording. The problem is that the captureOutput video and audio sample buffers don't correspond 1:1 (which is logical):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

AUDIO START: 36796.833236847 | DURATION: 0.02321995464852608 | END: 36796.856456802
VIDEO START: 36796.842089239 | DURATION: nan | END: nan
AUDIO START: 36796
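Two CoreMedia pieces are relevant to this problem, sketched below under the assumption that the split point is decided in seconds: computing each buffer's time range (video buffers from the camera often carry an invalid duration, which is why the log above prints "DURATION: nan"), and cutting an audio buffer that straddles the split point at a sample boundary.

```swift
import CoreMedia

/// Start/end of a sample buffer in seconds; end falls back to start
/// when the duration is invalid (typical for camera video buffers).
func timeRange(of sampleBuffer: CMSampleBuffer) -> (start: Double, end: Double) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    let duration = CMSampleBufferGetDuration(sampleBuffer)
    let start = CMTimeGetSeconds(pts)
    let end = duration.isValid ? start + CMTimeGetSeconds(duration) : start
    return (start, end)
}

/// Keeps only the first `count` audio samples of a buffer, so the
/// remainder can be written to the next file.
func firstSamples(of sampleBuffer: CMSampleBuffer, count: Int) -> CMSampleBuffer? {
    var trimmed: CMSampleBuffer?
    CMSampleBufferCopySampleBufferForRange(
        allocator: kCFAllocatorDefault,
        sbuf: sampleBuffer,
        sampleRange: CFRange(location: 0, length: count),
        sampleBufferOut: &trimmed)
    return trimmed
}
```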

Pulling data from a CMSampleBuffer in order to create a deep copy

Submitted by ⅰ亾dé卋堺 on 2019-11-28 09:08:45
I am trying to create a copy of the CMSampleBuffer returned by captureOutput in an AVCaptureVideoDataOutputSampleBufferDelegate. Since the CMSampleBuffers come from a preallocated pool of (15) buffers, if I hold a reference to them they cannot be recycled, which causes all remaining frames to be dropped. Apple's documentation notes:

"To maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture where memory blocks are copied as little as possible. If
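A hedged sketch of the deep-copy idea for a video buffer: allocate a fresh CVPixelBuffer, copy the pixels into it, and wrap it in a new CMSampleBuffer with the original timing, so the pooled buffer can be released immediately. This assumes a non-planar format such as BGRA and matching row strides; production code should copy plane by plane, row by row.

```swift
import CoreMedia
import CoreVideo

/// Returns a CMSampleBuffer whose pixel storage is independent
/// of the capture pool, or nil on any failure.
func deepCopy(_ sampleBuffer: CMSampleBuffer) -> CMSampleBuffer? {
    guard let src = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    CVPixelBufferLockBaseAddress(src, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(src, .readOnly) }

    // Fresh pixel buffer with the same geometry and format.
    var dstOut: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(src),
                        CVPixelBufferGetHeight(src),
                        CVPixelBufferGetPixelFormatType(src),
                        CVBufferGetAttachments(src, .shouldPropagate),
                        &dstOut)
    guard let dst = dstOut else { return nil }

    CVPixelBufferLockBaseAddress(dst, [])
    // Assumes non-planar data and equal bytes-per-row (see lead-in).
    memcpy(CVPixelBufferGetBaseAddress(dst),
           CVPixelBufferGetBaseAddress(src),
           CVPixelBufferGetDataSize(src))
    CVPixelBufferUnlockBaseAddress(dst, [])

    var formatOut: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: dst,
                                                 formatDescriptionOut: &formatOut)
    guard let format = formatOut else { return nil }

    // Carry the original presentation timing over to the copy.
    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(sampleBuffer, at: 0, timingInfoOut: &timing)

    var copyOut: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: dst,
                                             formatDescription: format,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &copyOut)
    return copyOut
}
```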
