I have a DirectShow filter that reads an M2TS stream and demuxes it. I can render the video and audio in GraphStudio and see it. My question is: how can I use this filter as a live source for an Expression Encoder job? I saw some recommendations saying to use subgraphs, but I am not familiar enough with DirectShow to understand what that means. Thank you in advance for any help.
Expression Encoder looks for DirectShow video source devices, which are typically cameras. You might want to implement your own virtual camera so that EE4 would pick it up and start receiving video from it.
A popular sample for this is Vivek's virtual camera project, available from http://tmhare.mvps.org/downloads.htm (4th link).
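For reference, the core of such a virtual camera is a DirectShow source filter (typically built on the base-class `CSource`/`CSourceStream`) that is registered under `CLSID_VideoInputDeviceCategory`, which is what makes Expression Encoder enumerate it as a camera. Below is a minimal registration sketch, not a complete filter; the CLSID `CLSID_MyVirtualCam` and the filter name are hypothetical placeholders:

```cpp
#include <streams.h>   // DirectShow base classes (CSource, CSourceStream, etc.)
#include <initguid.h>
#include <uuids.h>

// Hypothetical CLSID for the virtual camera filter -- generate your own GUID.
DEFINE_GUID(CLSID_MyVirtualCam,
    0x12345678, 0x1234, 0x1234, 0x12, 0x34, 0x12, 0x34, 0x56, 0x78, 0x9a, 0xbc);

// Register the filter under the "Video Capture Sources" category so that
// applications enumerating capture devices (such as Expression Encoder)
// can see and instantiate it like a real camera.
STDAPI DllRegisterServer()
{
    // Standard COM/DirectShow self-registration of the DLL and its filters.
    HRESULT hr = AMovieDllRegisterServer2(TRUE);
    if (FAILED(hr))
        return hr;

    IFilterMapper2 *pFM2 = NULL;
    hr = CoCreateInstance(CLSID_FilterMapper2, NULL, CLSCTX_INPROC_SERVER,
                          IID_IFilterMapper2, (void **)&pFM2);
    if (FAILED(hr))
        return hr;

    REGFILTER2 rf2;
    rf2.dwVersion = 1;
    rf2.dwMerit   = MERIT_DO_NOT_USE;  // not picked up by intelligent connect
    rf2.cPins     = 0;
    rf2.rgPins    = NULL;

    // The key step: register under the video input device category.
    hr = pFM2->RegisterFilter(CLSID_MyVirtualCam, L"My Virtual Camera", NULL,
                              &CLSID_VideoInputDeviceCategory, NULL, &rf2);
    pFM2->Release();
    return hr;
}
```

The video data itself is then produced by overriding `CSourceStream::FillBuffer` on the filter's output pin (in your case, feeding it the frames coming out of your M2TS demux); Vivek's sample linked above shows a complete working implementation of that part.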
Source: https://stackoverflow.com/questions/7870843/how-to-use-directshow-filter-as-a-live-input-for-expression-encoder-4