openh264

Using OpenH264 DLL in C# Project

Submitted by 非 Y 不嫁゛ on 2021-02-20 00:42:40
Question: I'm receiving an H.264 stream over UDP. I'd like to decode the stream so I can send frames to OpenCV or whatever. I came across Cisco's open-source H.264 decoder here: https://github.com/cisco/openh264 With a little effort I got the decoder solution to build in Visual Studio 2019 and tested it from the command line with a file I created from the raw UDP datagrams. It works. Now I want to figure out how to use the decoder DLL (welsdec.dll) in a C# project. The last time I did anything serious…

How to properly embed 3rd party .dylib files in iOS app project for App Store release?

Submitted by 亡梦爱人 on 2020-05-15 01:59:26
Question: I am building an iOS app using the PJSIP library with H.264 support. When building H264, I get one .a file and two .dylib files. I tried to use the .dylibs in my project by adding them as "Embedded Libraries", and also by creating a separate framework and then adding that to "Embedded Libraries". But when uploading the build to the App Store, I get the errors "ERROR ITMS-90206:..." and "ERROR ITMS-90171:..", both of which point to the use of external dynamic libraries in the project. I followed https://developer.apple.com/library…

ParamValidationExt error with WelsInitEncoderExt failed while setting up OpenH264 encoder

Submitted by 倖福魔咒の on 2019-12-23 02:36:20
Question: Scenario: I am using OpenH264 in my app to encode into a video_file.mp4. Environment: Platform: macOS Sierra. Compiler: Clang++. The following is the crux of the code I have:

    void EncodeVideoFile() {
        ISVCEncoder * encoder_;
        std::string video_file_name = "/Path/to/some/folder/video_file.mp4";
        EncodeFileParam * pEncFileParam;
        SEncParamExt * pEnxParamExt;
        float frameRate = 1000;
        EUsageType usageType = EUsageType::CAMERA_VIDEO_REAL_TIME;
        bool denoise = false;
        bool lossless = true;
        bool …

openh264 - bEnableFrameSkip=0, bitrate can't be controlled

Submitted by 旧城冷巷雨未停 on 2019-12-22 10:35:33
Question: There are a lot of questions about OpenCV + H.264, but none of them give a detailed explanation. I am using openh264 (openh264-1.4.0-win32msvc.dll) along with OpenCV 3.1 (a custom build with CMake, with ffmpeg enabled) in Visual Studio. I wanted to save video coming from a webcam in MP4 format with H.264 compression:

    VideoWriter write = VideoWriter("D:/movie.mp4", CV_FOURCC('H', '2', '6', '4'), 10.0, cv::Size(192, 144), true);

Before using openh264, in the console window I was seeing a warning…

WebRTC: What is RTPFragmentationHeader in encoder implementation?

Submitted by 笑着哭i on 2019-12-21 21:24:53
Question: I have modified h264_encoder_impl to use an Nvidia GRID based hardware encoder. This is done by replacing the OpenH264-specific calls with Nvidia API calls. The encoded stream can be written to file successfully, but writing the _buffer and _size of encoded_image_ is not enough; RTPFragmentationHeader also needs to be filled.

    // RtpFragmentize(EncodedImage* encoded_image,
    //                std::unique_ptr<uint8_t[]>* encoded_image_buffer,
    //                const VideoFrameBuffer& frame_buffer,
    //                SFrameBSInfo* info,
    // …

Openh264 compiling using PJSIP

Submitted by 偶尔善良 on 2019-12-11 12:16:24
Question: I am trying to build the pjsip project with the openh264 library. Everything works fine, except that openh264 is not being detected by pjsip's ./configure-android. This is my config_site.h:

    /* Activate Android specific settings in the 'config_site_sample.h' */
    #define PJ_CONFIG_ANDROID 1
    #include <pj/config_site_sample.h>
    #define PJMEDIA_HAS_VIDEO 1
    #define PJMEDIA_HAS_OPENH264_CODEC 1

I am getting the following log:

    Using OpenH264 prefix... /home/user_name/PJSIPTOOLS/openh264-1.0.0/openlib/
    checking OpenH264 usability…

How to fix black screen when call is answered and loss packs on H264

Submitted by 允我心安 on 2019-12-11 11:53:58
Question: I am developing a VOIP app using the Linphone iOS/Android library, and I am using the OpenH264 video codec. When a call is answered, both sides see a black screen; only after a long time can the two sides see each other's video. The logs show heavy packet loss, so the first frame cannot be decoded. At LinphoneCallStreamsRunning I sent an FIR request (linphone_call_send_vfu_request), but it did not help. Is there any config for the OpenH264 video codec? I want to see the video as soon as the call is accepted. Thank…

building openh264 for android platform in x86

Submitted by 你。 on 2019-12-10 23:16:55
Question: I'm trying to build openh264 for Android with the following command:

    $ make OS=android NDKROOT=/Users/nazmulhasan/android-ndk-r10d TARGET=android-17 ARCH=x86

And I get the following error:

    /Users/nazmulhasan/android-ndk-r10d/toolchains/x86-4.8/prebuilt/darwin-x86/bin/../lib/gcc/i686-linux-android/4.8/../../../../i686-linux-android/bin/ld: error: codec/common/cpu-features.o: incompatible target
    codec/common/src/WelsThreadLib.o:WelsThreadLib.cpp:function WelsQueryLogicalProcessInfo: error: …