Question
I am using the libstreaming library and trying to stream with the RtspClient and the MediaCodec API. I am testing on a Galaxy S3 with Android 4.4.
The problem is that no matter whether I use buffer-to-buffer or surface-to-buffer mode, I get these errors: java.lang.IllegalStateException: The decoder input buffer is not big enough (nal=181322, capacity=65536).
and java.lang.RuntimeException: The decoder did not decode anything.
The MediaRecorder API works fine, but the quality is so low I can't tell if I have a cat or a dog in front of me.
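For context, here is a minimal sketch (my own illustration, not libstreaming's actual source) of the situation behind the first exception: a hypothetical helper that feeds one H.264 NAL unit to an already configured and started MediaCodec decoder, and fails when the unit is larger than the decoder's input buffer.
import java.nio.ByteBuffer;
import android.media.MediaCodec;

// Illustration only (hypothetical helper, not libstreaming code): feeding one
// H.264 NAL unit to a running decoder fails in exactly this way when the unit
// does not fit into the decoder's input buffer.
static void feedNal(MediaCodec decoder, byte[] nal, long ptsUs) {
    ByteBuffer[] inputBuffers = decoder.getInputBuffers(); // pre-API-21 buffer access
    int index = decoder.dequeueInputBuffer(500000);        // wait up to 500 ms
    if (index < 0) return;                                 // no input buffer free yet
    ByteBuffer buffer = inputBuffers[index];
    if (nal.length > buffer.capacity()) {
        // This is the reported case: nal=181322 bytes, capacity=65536 bytes.
        throw new IllegalStateException("The decoder input buffer is not big enough ("
                + "nal=" + nal.length + ", capacity=" + buffer.capacity() + ").");
    }
    buffer.clear();
    buffer.put(nal, 0, nal.length);
    decoder.queueInputBuffer(index, 0, nal.length, ptsUs, 0);
}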
Here is my code:
import android.support.v4.app.Fragment;
import android.os.Bundle;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.view.SurfaceHolder;

import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.audio.AudioQuality;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.rtsp.RtspClient;
import net.majorkernelpanic.streaming.video.VideoQuality;
import net.majorkernelpanic.streaming.MediaStream;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

/**
 * A placeholder fragment containing a simple view.
 */
public class MainActivityFragment extends Fragment implements RtspClient.Callback,
        Session.Callback, SurfaceHolder.Callback {

    // SurfaceView used for the camera preview
    private static SurfaceView mSurfaceView;

    // RTSP session and client
    private Session mSession;
    private static RtspClient mClient;

    public MainActivityFragment() {
    }

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        View view = inflater.inflate(R.layout.fragment_main, container, false);
        mSurfaceView = (SurfaceView) view.findViewById(R.id.surface);

        // Configures the SessionBuilder
        mSession = SessionBuilder.getInstance()
                .setContext(getActivity().getApplicationContext())
                .setAudioEncoder(SessionBuilder.AUDIO_AAC)
                .setAudioQuality(new AudioQuality(8000, 16000))
                .setVideoEncoder(SessionBuilder.VIDEO_H264)
                .setVideoQuality(new VideoQuality(960, 720, 20, 500000))
                .setSurfaceView(mSurfaceView)
                .setPreviewOrientation(0)
                .setCallback(this)
                .build();

        // Configures the RTSP client
        mClient = new RtspClient();

        String ip, port, path;
        // We parse the URI written in the EditText
        Pattern uri = Pattern.compile("rtsp://(.+):(\\d+)/(.+)");
        Matcher m = uri.matcher(AppConfig.STREAM_URL);
        m.find();
        ip = m.group(1);
        port = m.group(2);
        path = m.group(3);

        mClient.setCredentials(AppConfig.PUBLISHER_USERNAME,
                AppConfig.PUBLISHER_PASSWORD);
        mClient.setServerAddress(ip, Integer.parseInt(port));
        mClient.setStreamPath("/" + path);
        mClient.setSession(mSession);
        mClient.setCallback(this);

        // Force streaming with the MediaCodec API (surface-to-buffer mode)
        mSession.getVideoTrack().setStreamingMethod(MediaStream.MODE_MEDIACODEC_API_2);

        mSurfaceView.getHolder().addCallback(this);
        return view;
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        mClient.release();
        mSession.release();
        mSurfaceView.getHolder().removeCallback(this);
    }

    @Override
    public void onRtspUpdate(int message, Exception exception) {
        switch (message) {
            case RtspClient.ERROR_CONNECTION_FAILED:
            case RtspClient.ERROR_WRONG_CREDENTIALS:
                System.out.println(exception.getMessage());
                exception.printStackTrace();
                break;
        }
    }

    @Override
    public void onSessionError(int reason, int streamType, Exception e) {
        switch (reason) {
            case Session.ERROR_CAMERA_ALREADY_IN_USE:
                break;
            case Session.ERROR_CAMERA_HAS_NO_FLASH:
                break;
            case Session.ERROR_INVALID_SURFACE:
                break;
            case Session.ERROR_STORAGE_NOT_READY:
                break;
            case Session.ERROR_CONFIGURATION_NOT_SUPPORTED:
                VideoQuality quality = mSession.getVideoTrack().getVideoQuality();
                System.out.println("APPERROR: The following settings are not supported on this phone: "
                        + quality.toString() + " (" + e.getMessage() + ")");
                e.printStackTrace();
                break;
            case Session.ERROR_OTHER:
                break;
        }
        if (e != null) {
            System.out.println(e.getMessage());
            e.printStackTrace();
        }
    }

    @Override
    public void onPreviewStarted() {
        mClient.startStream();
    }

    @Override
    public void onSessionConfigured() {
    }

    @Override
    public void onSessionStopped() {
    }

    @Override
    public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2, int arg3) {
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mSession.startPreview();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mClient.stopStream();
    }

    @Override
    public void onBitrateUpdate(long bitrate) {
    }

    @Override
    public void onSessionStarted() {
    }
}
Please Help! I'm desperate!
Answer 1:
The request for more details came late, but I found a solution. As many posts suggest, I had to modify the libstreaming library. I changed:
public MediaStream() {
    // code change: force the MediaCodec mode instead of the default
    mRequestedMode = MODE_MEDIACODEC_API_2;
    mMode = MODE_MEDIACODEC_API_2;
}

public synchronized void start() throws IllegalStateException, IOException {
    if (mDestination == null)
        throw new IllegalStateException("No destination ip address set for the stream !");
    if (mRtpPort <= 0 || mRtcpPort <= 0)
        throw new IllegalStateException("No destination ports set for the stream !");
    mPacketizer.setTimeToLive(mTTL);
    // code change: always encode with MediaCodec, never fall back to MediaRecorder
    encodeWithMediaCodec();
}
I also had to select the surface-to-buffer method by calling mSession.getVideoTrack().setStreamingMethod(MediaStream.MODE_MEDIACODEC_API_2);.
And I had to limit my video configuration to the values recommended here: http://developer.android.com/guide/appendix/media-formats.html#recommendations
Otherwise it would crash or show a green screen.
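As an illustration (the exact values depend on the device, so treat this as a sketch rather than a known-good configuration), the SessionBuilder from the question restricted to the "SD (High quality)" row of that table would look like this:
// Sketch only: same SessionBuilder as in the question, but with the video
// quality limited to the "SD (High quality)" recommendation (480x360, 30 fps,
// 500 kbps) instead of 960x720. Adjust to whatever your device supports.
mSession = SessionBuilder.getInstance()
        .setContext(getActivity().getApplicationContext())
        .setAudioEncoder(SessionBuilder.AUDIO_AAC)
        .setAudioQuality(new AudioQuality(8000, 16000))
        .setVideoEncoder(SessionBuilder.VIDEO_H264)
        .setVideoQuality(new VideoQuality(480, 360, 30, 500000))
        .setSurfaceView(mSurfaceView)
        .setPreviewOrientation(0)
        .setCallback(this)
        .build();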
Source: https://stackoverflow.com/questions/31267641/android-libstreaming-with-medicacodec-api-buffer-size-not-big-enough-error