Question
I am trying to get a Media Foundation SourceReader to read an H.264 video file and render the frames to some Direct3D textures, so I can then render them as I wish with Direct3D.
I'm using SharpDX, but the principles are of course the same as in native code.
As I understand it, the general data flow should be like this (a rough sketch of the read loop follows the list):
- A SourceReader reads and decodes the video using SourceReader.ReadSample()
- A VideoProcessor (from IDirect3DDeviceManager9) transfers the frame to a Direct3D9 surface using VideoProcessor.VideoProcessBlt()
- My Direct3D9 renderer uses the surface to render the frame however I want onto the screen
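To make the first step concrete, here is a minimal sketch of the read loop I have in mind. It assumes the SharpDX convenience overload of SourceReader.ReadSample that returns a Sample, and the SourceReaderIndex / SourceReaderControlFlags / SourceReaderFlags names from the SharpDX.MediaFoundation bindings, so treat the exact names as approximate:
using SharpDX.MediaFoundation;

// Minimal sketch of step 1: pull decoded video samples from the SourceReader.
// Step 2 (VideoProcessBlt onto a Direct3D9 surface) would happen where indicated.
static void ReadLoop(SourceReader reader)
{
    while (true)
    {
        int actualStreamIndex;
        SourceReaderFlags flags;
        long timestamp;

        using (Sample sample = reader.ReadSample(
            SourceReaderIndex.FirstVideoStream,
            SourceReaderControlFlags.None,
            out actualStreamIndex, out flags, out timestamp))
        {
            if ((flags & SourceReaderFlags.Endofstream) != 0)
                break;              // no more frames
            if (sample == null)
                continue;           // stream tick or format change, no payload

            // Step 2 would go here: get the sample's buffer (a Direct3D9 surface
            // when DXVA is active) and hand it to VideoProcessor.VideoProcessBlt(),
            // then render the resulting surface with Direct3D.
        }
    }
}
The open question is what has to be configured so that the sample's buffer actually wraps a Direct3D9 surface, which is where things go wrong below.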
If this is how it's supposed to work, I've pretty much got it figured out and almost running.
I was able to decode frames from H.264 to X8R8G8B8 by setting EnableVideoProcessing when I created my SourceReader:
MediaFactory.CreateAttributes(mediaAttributes, 0);
mediaAttributes.Set(SourceReaderAttributeKeys.EnableVideoProcessing, 1);
MediaFactory.CreateSourceReaderFromURL(url, mediaAttributes, SourceReader);
And by setting the media type subtype to the X8R8G8B8 GUID for the video stream:
VideoSubType = currentMediaType.Get<Guid>(MediaTypeAttributeKeys.Subtype);
UnpackLong(currentMediaType.Get(MediaTypeAttributeKeys.FrameSize), out VideoWidth, out VideoHeight);
UnpackLong(currentMediaType.Get(MediaTypeAttributeKeys.FrameRate), out VideoFrameRateNumerator, out VideoFrameRateDenominator);
VideoInterlaceMode = (VideoInterlaceMode)(uint)(currentMediaType.Get(MediaTypeAttributeKeys.InterlaceMode));
MediaFactory.CreateMediaType(outputMediaType);
outputMediaType.Set(MediaTypeAttributeKeys.MajorType, MediaTypeGuids.Video);
outputMediaType.Set(MediaTypeAttributeKeys.Subtype, VideoSubType);
outputMediaType.Set(MediaTypeAttributeKeys.FrameSize.Guid, PackLong(VideoWidth, VideoHeight));
outputMediaType.Set(MediaTypeAttributeKeys.FrameRate.Guid, PackLong(VideoFrameRateNumerator, VideoFrameRateDenominator));
outputMediaType.Set(MediaTypeAttributeKeys.InterlaceMode.Guid, VideoInterlaceMode);
outputMediaType.Set(MediaTypeAttributeKeys.PixelAspectRatio.Guid, PackLong(1, 1));
SourceReader.SetCurrentMediaType(streamIndex, outputMediaType);
Later, when I call ReadSample(), I am getting video and audio samples, so I think the decoding process is working fine.
However, in order to get VideoProcessor-compatible frames, I need to create my SourceReader with the D3DManager set on the media attributes, and also make sure DXVA is not disabled:
mediaAttributes.Set(SourceReaderAttributeKeys.D3DManager, VideoConnector.GetManager());
mediaAttributes.Set(SourceReaderAttributeKeys.DisableDxva, 0);
The problem is that CreateSourceReaderFromURL fails (0x80070057, E_INVALIDARG) when I set both EnableVideoProcessing and D3DManager. This is implied by the documentation for the D3DManager attribute:
You would not set this attribute if [...] you are getting compressed video from the source reader. In that case, the source reader does not create a decoder.
Supposing I could decode the frames later (perhaps with a VideoDecoder?), I tried removing EnableVideoProcessing and leaving just D3DManager and DisableDxva, but in that case it's ReadSample that fails (0xC00D36B4, MF_E_INVALIDMEDIATYPE) before I even get a chance to use the data in the sample.
So how am I supposed to read, decode and send my frames to the Direct3D surface?
Answer 1:
Please take a look at the MFCaptureToFile sample project in the SDK; it configures the source reader to output frames in the desired format. The ConfigureSourceReader() method does that.
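To illustrate that idea, loosely translated to SharpDX rather than copied from the C++ sample (the parameterless MediaType constructor and VideoFormatGuids.Rgb32 are assumed names from the SharpDX bindings): set a partial media type with just the major type and the desired uncompressed subtype, and the reader loads whatever decoder or converter it needs.
using SharpDX.MediaFoundation;

// Sketch: ask the reader to deliver RGB32 for a stream by setting a partial
// media type (major type + subtype only); the reader inserts the H.264 decoder
// and any converter itself. SetCurrentMediaType throws (e.g. with
// MF_E_INVALIDMEDIATYPE) if it cannot produce that subtype for the stream.
static void RequestRgb32Output(SourceReader reader, int streamIndex)
{
    using (var outputType = new MediaType())
    {
        outputType.Set(MediaTypeAttributeKeys.MajorType, MediaTypeGuids.Video);
        outputType.Set(MediaTypeAttributeKeys.Subtype, VideoFormatGuids.Rgb32);
        reader.SetCurrentMediaType(streamIndex, outputType);
    }
}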
Answer 2:
I haven't used SharpDX, but in native code, if you want to decode video with DXVA, you should set MF_SOURCE_READER_DISABLE_DXVA to FALSE and enable MF_SOURCE_READER_ENABLE_ADVANCED_VIDEO_PROCESSING instead of MF_SOURCE_READER_ENABLE_VIDEO_PROCESSING. Maybe you can try that.
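In SharpDX terms, and reusing the variables from the question's own snippets (mediaAttributes, url, SourceReader, VideoConnector), that suggestion would look roughly like the sketch below; SourceReaderAttributeKeys.EnableAdvancedVideoProcessing is assumed to be the binding's name for MF_SOURCE_READER_ENABLE_ADVANCED_VIDEO_PROCESSING.
// Sketch of Answer 2 applied to the question's setup: keep the D3D manager,
// leave DXVA enabled, and request *advanced* video processing instead of
// EnableVideoProcessing (do not set both).
MediaFactory.CreateAttributes(mediaAttributes, 0);
mediaAttributes.Set(SourceReaderAttributeKeys.D3DManager, VideoConnector.GetManager());
mediaAttributes.Set(SourceReaderAttributeKeys.DisableDxva, 0);
mediaAttributes.Set(SourceReaderAttributeKeys.EnableAdvancedVideoProcessing, 1);
MediaFactory.CreateSourceReaderFromURL(url, mediaAttributes, SourceReader);

// Then request an uncompressed subtype (e.g. RGB32) with SetCurrentMediaType as
// in the question, and ReadSample should return samples whose buffers wrap
// Direct3D9 surfaces.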
Source: https://stackoverflow.com/questions/21624688/how-to-get-a-sourcereader-to-decompress-frames-and-send-them-to-a-direct3d9-text