Streaming audio and video from Android to PC/web.

别那么骄傲 2021-02-04 18:39

I am a recent beginner to the Android SDK, and the overall goal of this project is to create an app very similar to Ustream's or Qik's (yeah, I know, not the best idea for a beginner).

2 Answers
  • 2021-02-04 19:02

    Doing so is not simple but possible.

    The MediaRecorder API assumes that the output is a random-access file, meaning it can seek back and forth while writing the mp4 (or other) file container. As you can see in ipcamera-for-android, the output is directed to a socket, which is not random access. This makes it hard to parse the outgoing stream, since the MediaRecorder API will write some data, like the fps and the sps/pps (for h264), only when the recording is done: at that point the API tries to seek back to the beginning of the stream (where the file header lives), but it fails, because the stream was sent to a socket and not to a file.

    ipcamera-for-android is a good reference here. If I recall correctly, before streaming it records a video to a file, opens the header and takes what it needs from there; then it starts recording to the socket and uses the data it took from the header in order to parse the stream.

    You will also need some basic understanding of parsing mp4 (or whatever file container you want to use) in order to capture the frames. You can do that either on the device or on the server side.
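    To give a feel for what that parsing involves, here is a minimal sketch (my own illustration, not code from ipcamera-for-android) of walking the top-level boxes of an MP4 stream: per the ISO base media file format, each box starts with a 4-byte big-endian size (which includes the 8-byte header) followed by a 4-byte ASCII type. The sample bytes in main are made up for illustration:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class Mp4Boxes {
    // Walk the top-level MP4 boxes in a byte array and collect their types.
    static List<String> topLevelBoxes(byte[] data) {
        List<String> types = new ArrayList<>();
        int pos = 0;
        while (pos + 8 <= data.length) {
            // 4-byte big-endian box size, including the 8-byte header itself
            long size = ((data[pos] & 0xFFL) << 24) | ((data[pos + 1] & 0xFFL) << 16)
                      | ((data[pos + 2] & 0xFFL) << 8) | (data[pos + 3] & 0xFFL);
            // 4-byte ASCII box type ("ftyp", "moov", "mdat", ...)
            String type = new String(data, pos + 4, 4, StandardCharsets.US_ASCII);
            types.add(type);
            if (size < 8) break; // size == 1 means a 64-bit size; not handled in this sketch
            pos += size;
        }
        return types;
    }

    public static void main(String[] args) {
        // A fake stream: an 8-byte "ftyp" box followed by a 16-byte "mdat" box.
        byte[] sample = {
            0, 0, 0, 8,  'f', 't', 'y', 'p',
            0, 0, 0, 16, 'm', 'd', 'a', 't',
            1, 2, 3, 4, 5, 6, 7, 8
        };
        System.out.println(topLevelBoxes(sample)); // [ftyp, mdat]
    }
}
```

    A real parser would then recurse into the `moov` box to reach the track headers (and, for h264, the sps/pps), but the box walk above is the core loop.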

    Here is a good start for writing the stream to a socket: Tutorial

    I hope this was helpful. There is no good tutorial for parsing and decoding the outgoing stream, since it is not so simple... but again, it is possible with some effort.

    Also take a look here to see how to direct the output stream to a stream that can be sent to the server: MediaRecorder Question

  • 2021-02-04 19:20

    SipDroid does exactly what you need.

    It involves a hack to circumvent a limitation of the MediaRecorder class, which requires a file descriptor: it writes the MediaRecorder video stream to a local socket (used as a kind of pipe), re-reads from the other end of that socket (in the same application but on another thread), creates RTP packets out of the received data, and finally sends the RTP packets to the network (in broadcast or unicast mode, as you wish).

    Basically it boils down to the following (simplified code):

    // Create a MediaRecorder
    MediaRecorder mr = new MediaRecorder();
    // (Initialize mr as usual)
    // Create a LocalServerSocket
    LocalServerSocket lss = new LocalServerSocket("foobar");
    // Connect both ends of this socket
    LocalSocket receiver = new LocalSocket();
    receiver.connect(new LocalSocketAddress("foobar"));
    LocalSocket sender = lss.accept();
    // Set the output of the MediaRecorder to the sender socket's file descriptor
    mr.setOutputFile(sender.getFileDescriptor());
    // Start the video recording:
    mr.start();
    // Launch a background thread that loops,
    // reading from the receiver socket
    // and creating an RTP packet out of the data read
    RtpSocket rtpSocket = new RtpSocket();
    InputStream in = receiver.getInputStream();
    byte[] buffer = new byte[1500];
    while (true) {
        int len = in.read(buffer);
        // Some manipulation of the received data here ...
        RtpPacket rtp = new RtpPacket(buffer, len);
        rtpSocket.send(rtp);
    }
    

    The implementations of the RtpPacket and RtpSocket classes (rather simple), and the exact code that manipulates the video stream content, can be found in the SipDroid project (especially VideoCamera.java).
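    For a rough idea of what such an RtpPacket class prepends to each payload, here is a minimal sketch of building the fixed 12-byte RTP header defined in RFC 3550. The field values chosen below (version 2, no padding, extension, CSRC, or marker bit) are illustrative assumptions, not SipDroid's actual defaults:

```java
public class RtpHeader {
    // Build the fixed 12-byte RTP header (RFC 3550, section 5.1).
    static byte[] build(int payloadType, int sequence, long timestamp, long ssrc) {
        byte[] h = new byte[12];
        h[0] = (byte) 0x80;                  // V=2, P=0, X=0, CC=0
        h[1] = (byte) (payloadType & 0x7F);  // M=0, 7-bit payload type
        h[2] = (byte) (sequence >> 8);       // 16-bit sequence number
        h[3] = (byte) sequence;
        h[4] = (byte) (timestamp >> 24);     // 32-bit timestamp
        h[5] = (byte) (timestamp >> 16);
        h[6] = (byte) (timestamp >> 8);
        h[7] = (byte) timestamp;
        h[8] = (byte) (ssrc >> 24);          // 32-bit synchronization source id
        h[9] = (byte) (ssrc >> 16);
        h[10] = (byte) (ssrc >> 8);
        h[11] = (byte) ssrc;
        return h;
    }

    public static void main(String[] args) {
        // 96 is the first dynamic payload type number
        byte[] h = build(96, 1, 0, 0x12345678L);
        System.out.printf("%02x %02x%n", h[0], h[1]); // 80 60
    }
}
```

    The sequence number increments per packet and the timestamp advances in the payload's clock rate; the receiver uses both to reorder and pace the incoming frames.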
