How to decode H.264 video frames in a Java environment

Asked by 迷失自我 on 2020-12-23 22:21

Does anyone know how to decode an H.264 video frame in a Java environment?

My network camera products support RTP/RTSP streaming.

The camera serves standard RTP/RTSP, and it also supports RTP/RTSP over HTTP.

5 Answers
  • 2020-12-23 22:46

    I found a very simple and straightforward solution based on JavaCV's FFmpegFrameGrabber class. This library lets you play streaming media by wrapping ffmpeg in Java.

    How to use it?

    First, download and install the library using Maven or Gradle.
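
    For example, with Maven (javacv-platform bundles the Java wrappers plus the native ffmpeg binaries; the version below is only an example, use the latest release):

    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>javacv-platform</artifactId>
        <version>1.4.4</version>
    </dependency>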

    Below is a StreamingClient class that calls a SimplePlayer class, which uses a thread to play the video.

    import javafx.application.Application;
    import javafx.application.Platform;
    import javafx.scene.Scene;
    import javafx.scene.image.Image;
    import javafx.scene.image.ImageView;
    import javafx.scene.layout.StackPane;
    import javafx.stage.Stage;

    import javax.sound.sampled.FloatControl;

    public class StreamingClient extends Application implements GrabberListener
    {
        public static void main(String[] args)
        {
            launch(args);
        }
    
        private Stage primaryStage;
        private ImageView imageView;
    
        private SimplePlayer simplePlayer;
    
        @Override
        public void start(Stage stage) throws Exception
        {
            String source = "rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov"; // the video is weird for 1 minute then becomes stable
    
            primaryStage = stage;
            imageView = new ImageView();
    
            StackPane root = new StackPane();
    
            root.getChildren().add(imageView);
            imageView.fitWidthProperty().bind(primaryStage.widthProperty());
            imageView.fitHeightProperty().bind(primaryStage.heightProperty());
    
            Scene scene = new Scene(root, 640, 480);
    
            primaryStage.setTitle("Streaming Player");
            primaryStage.setScene(scene);
            primaryStage.show();
    
            simplePlayer = new SimplePlayer(source, this);
        }
    
        @Override
        public void onMediaGrabbed(int width, int height)
        {
            primaryStage.setWidth(width);
            primaryStage.setHeight(height);
        }
    
        @Override
        public void onImageProcessed(Image image)
        {
            System.out.println("image: " + image);
    
            Platform.runLater(() -> {
                imageView.setImage(image);
            });
        }
    
        @Override
        public void onPlaying() {}
    
        @Override
        public void onGainControl(FloatControl gainControl) {}
    
        @Override
        public void stop() throws Exception
        {
            simplePlayer.stop();
        }
    }
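
    The answer doesn't show the GrabberListener interface, but its shape follows directly from the callbacks StreamingClient implements; a minimal sketch:

    import javafx.scene.image.Image;

    import javax.sound.sampled.FloatControl;

    // Inferred from the callbacks used above; not shown in the original answer.
    public interface GrabberListener
    {
        void onMediaGrabbed(int width, int height);

        void onImageProcessed(Image image);

        void onPlaying();

        void onGainControl(FloatControl gainControl);
    }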
    

    The SimplePlayer class uses FFmpegFrameGrabber to decode each frame, which is converted into an image and displayed in your Stage:

    import java.nio.ByteBuffer;
    import java.nio.ShortBuffer;

    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.DataLine;
    import javax.sound.sampled.SourceDataLine;

    import javafx.application.Platform;
    import javafx.embed.swing.SwingFXUtils;
    import javafx.scene.image.Image;

    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.javacv.Java2DFrameConverter;

    public class SimplePlayer
    {
        private static volatile Thread playThread;
    
        private SourceDataLine soundLine;
    
        private int counter;
    
        public SimplePlayer(String source, GrabberListener grabberListener)
        {
            if (grabberListener == null) return;
            if (source.isEmpty()) return;
    
            counter = 0;
    
            playThread = new Thread(() -> {
                try {
                    FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(source);
                    grabber.start();
    
                    grabberListener.onMediaGrabbed(grabber.getImageWidth(), grabber.getImageHeight());
    
                    if (grabber.getSampleRate() > 0 && grabber.getAudioChannels() > 0) {
                        AudioFormat audioFormat = new AudioFormat(grabber.getSampleRate(), 16, grabber.getAudioChannels(), true, true);
    
                        DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
                        soundLine = (SourceDataLine) AudioSystem.getLine(info);
                        soundLine.open(audioFormat);
                        soundLine.start();
                    }
    
                    Java2DFrameConverter converter = new Java2DFrameConverter();
    
                    while (!Thread.interrupted()) {
                        Frame frame = grabber.grab();
                        if (frame == null) {
                            break;
                        }
                        if (frame.image != null) {
    
                            Image image = SwingFXUtils.toFXImage(converter.convert(frame), null);
                            Platform.runLater(() -> {
                                grabberListener.onImageProcessed(image);
                            });
                        } else if (frame.samples != null) {
                            // frame.samples[0] holds the interleaved 16-bit PCM samples
                            ShortBuffer samples = (ShortBuffer) frame.samples[0];
                            samples.rewind();

                            ByteBuffer outBuffer = ByteBuffer.allocate(samples.capacity() * 2);

                            for (int i = 0; i < samples.capacity(); i++) {
                                outBuffer.putShort(samples.get(i));
                            }

                            if (soundLine != null) {
                                // the original answer filled outBuffer but never played it;
                                // write the bytes to the audio line so they are audible
                                soundLine.write(outBuffer.array(), 0, outBuffer.capacity());
                            }
                        }
                    }
                    grabber.stop();
                    grabber.release();
                    Platform.exit();
                } catch (Exception exception) {
                    exception.printStackTrace();
                    System.exit(1);
                }
            });
            playThread.start();
        }
    
        public void stop()
        {
            playThread.interrupt();
        }
    }
    
  • 2020-12-23 22:46

    You can use a pure Java library called JCodec (http://jcodec.org).
    Decoding one H.264 frame is as easy as:

    ByteBuffer bb = ... // Your frame data is stored in this buffer
    H264Decoder decoder = new H264Decoder();
    Picture out = Picture.create(1920, 1088, ColorSpace.YUV420); // Allocate output frame of max size
    Picture real = decoder.decodeFrame(bb, out.getData());
    BufferedImage bi = JCodecUtil.toBufferedImage(real); // If you prefer an AWT image
    

    If you want to read a frame from a container (like MP4), you can use the handy helper class FrameGrab:

    int frameNumber = 150;
    BufferedImage frame = FrameGrab.getFrame(new File("filename.mp4"), frameNumber);
    ImageIO.write(frame, "png", new File("frame_150.png"));
    

    Finally, here is a complete, more sophisticated sample (the helpers readableFileChannel, writableFileChannel, and splitMOVPacket are presumably statically imported from JCodec's NIOUtils and H264Utils, and format builds the numbered output file name):

    private static void avc2png(String in, String out) throws IOException {
        SeekableByteChannel sink = null;
        SeekableByteChannel source = null;
        try {
            source = readableFileChannel(in);
            sink = writableFileChannel(out);
    
            MP4Demuxer demux = new MP4Demuxer(source);
    
            H264Decoder decoder = new H264Decoder();
    
            Transform transform = new Yuv420pToRgb(0, 0);
    
            MP4DemuxerTrack inTrack = demux.getVideoTrack();
    
            VideoSampleEntry ine = (VideoSampleEntry) inTrack.getSampleEntries()[0];
            Picture target1 = Picture.create((ine.getWidth() + 15) & ~0xf, (ine.getHeight() + 15) & ~0xf,
                    ColorSpace.YUV420);
            Picture rgb = Picture.create(ine.getWidth(), ine.getHeight(), ColorSpace.RGB);
            ByteBuffer _out = ByteBuffer.allocate(ine.getWidth() * ine.getHeight() * 6);
            BufferedImage bi = new BufferedImage(ine.getWidth(), ine.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
            AvcCBox avcC = Box.as(AvcCBox.class, Box.findFirst(ine, LeafBox.class, "avcC"));
    
            decoder.addSps(avcC.getSpsList());
            decoder.addPps(avcC.getPpsList());
    
            Packet inFrame;
            int totalFrames = (int) inTrack.getFrameCount();
            for (int i = 0; (inFrame = inTrack.getFrames(1)) != null; i++) {
                ByteBuffer data = inFrame.getData();
    
                Picture dec = decoder.decodeFrame(splitMOVPacket(data, avcC), target1.getData());
                transform.transform(dec, rgb);
                _out.clear();
    
                AWTUtil.toBufferedImage(rgb, bi);
                ImageIO.write(bi, "png", new File(format(out, i)));
                if (i % 100 == 0)
                    System.out.println((i * 100 / totalFrames) + "%");
            }
        } finally {
            if (sink != null)
                sink.close();
            if (source != null)
                source.close();
        }
    }
    
  • 2020-12-23 22:47

    Or use Xuggler. It works with RTP, RTMP, HTTP, and other protocols, can decode and encode H.264 and most other codecs, and is actively maintained, free, and open-source (LGPL).
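
    A rough sketch of what that decoding loop looks like with Xuggler's IContainer/IStreamCoder API; treat the details as approximate, and the URL is a placeholder:

    import com.xuggle.xuggler.ICodec;
    import com.xuggle.xuggler.IContainer;
    import com.xuggle.xuggler.IPacket;
    import com.xuggle.xuggler.IStreamCoder;
    import com.xuggle.xuggler.IVideoPicture;

    public class XugglerDecodeSketch
    {
        public static void main(String[] args)
        {
            String url = "rtsp://example.com/stream"; // placeholder source

            IContainer container = IContainer.make();
            if (container.open(url, IContainer.Type.READ, null) < 0)
                throw new RuntimeException("could not open " + url);

            // find the first video stream and open its decoder
            int videoStream = -1;
            IStreamCoder coder = null;
            for (int i = 0; i < container.getNumStreams(); i++) {
                IStreamCoder c = container.getStream(i).getStreamCoder();
                if (c.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
                    videoStream = i;
                    coder = c;
                    break;
                }
            }
            if (coder == null || coder.open(null, null) < 0)
                throw new RuntimeException("no decodable video stream");

            // pull packets and decode them into pictures
            IPacket packet = IPacket.make();
            while (container.readNextPacket(packet) >= 0) {
                if (packet.getStreamIndex() != videoStream)
                    continue;
                IVideoPicture picture = IVideoPicture.make(
                        coder.getPixelType(), coder.getWidth(), coder.getHeight());
                int offset = 0;
                while (offset < packet.getSize()) {
                    int bytesDecoded = coder.decodeVideo(picture, packet, offset);
                    if (bytesDecoded < 0)
                        break;
                    offset += bytesDecoded;
                    if (picture.isComplete()) {
                        // picture now holds one decoded frame; convert/render it here
                    }
                }
            }
            coder.close();
            container.close();
        }
    }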

  • 2020-12-23 22:55

    Take a look at the Java Media Framework (JMF) - http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/formats.html

    I used it a while back and it was a bit immature, but they may have beefed it up since then.
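
    For reference, opening an RTP/RTSP source with JMF is roughly the sketch below; note that stock JMF ships no H.264 decoder, so an H.264 stream needs an additional codec plugin (the URL is a placeholder):

    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.Player;

    public class JmfSketch
    {
        public static void main(String[] args) throws Exception
        {
            // placeholder URL; JMF resolves rtsp:// locators through its RTP stack
            Player player = Manager.createRealizedPlayer(
                    new MediaLocator("rtsp://example.com/stream"));
            player.start(); // getVisualComponent() can be embedded in an AWT/Swing UI
        }
    }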

  • 2020-12-23 23:00

    I think the best solution is "JNI + ffmpeg". In my current project, I need to play several full-screen videos at the same time in a Java OpenGL game based on libgdx. I tried almost all the free libraries, but none of them had acceptable performance, so I finally decided to write my own JNI C code to work with ffmpeg (a hypothetical sketch of the Java side follows the list below). Here is the final performance on my laptop:

    • Environment: CPU: Core i7 Q740 @ 1.73 GHz, video: nVidia GeForce GT 435M, OS: Windows 7 64-bit, Java: Java 7u60 64-bit
    • Video: h264rgb / h264 encoded, no sound, resolution 1366 × 768
    • Solution: decode with JNI + ffmpeg v2.2.2, upload to GPU by updating an OpenGL texture through lwjgl
    • Performance: decoding speed 700-800 FPS, texture upload about 1 ms per frame
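
    The author didn't publish his code, but the Java side of such a JNI bridge typically looks something like this sketch; every name below (library, class, methods) is hypothetical, not the author's actual API:

    import java.nio.ByteBuffer;

    public class NativeVideoDecoder
    {
        static {
            // hypothetical native library wrapping avformat/avcodec
            System.loadLibrary("videodecoder");
        }

        private long handle; // opaque pointer to the native decoder context

        public void open(String path) {
            handle = nativeOpen(path);
        }

        // Decodes the next frame into rgba, a direct ByteBuffer of
        // width * height * 4 bytes filled by the C side; returns false at
        // end of stream. The buffer can then be uploaded with lwjgl
        // (e.g. glTexSubImage2D) to update an OpenGL texture.
        public boolean decodeNextFrame(ByteBuffer rgba) {
            return nativeDecode(handle, rgba);
        }

        public void close() {
            nativeClose(handle);
        }

        private static native long nativeOpen(String path);
        private static native boolean nativeDecode(long handle, ByteBuffer rgba);
        private static native void nativeClose(long handle);
    }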

    It only took a few days to complete the first version, but its decoding speed was only about 120 FPS and uploading took about 5 ms per frame. After several months of optimization, I reached the final performance above and added some extra features. Now I can play several HD videos at the same time without any slowdown.

    Most videos in my game have a transparent background. Such a transparent video is an mp4 file with two video streams: one stores h264rgb-encoded RGB data, the other stores h264-encoded alpha data. So to play an alpha video, I decode both streams, merge them, and upload the result to the GPU. As a result, I can play several transparent HD videos above an opaque HD video at the same time in my game. A minimal sketch of the merge step follows.
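
    The merge step could look like this minimal sketch, assuming the two decoders hand back a tightly packed RGB plane and a same-sized grayscale alpha plane (names and layout are illustrative):

    // combine an RGB frame and a grayscale alpha frame into one RGBA buffer
    static void mergeRgbAlpha(byte[] rgb, byte[] alpha, byte[] rgba)
    {
        int pixels = alpha.length; // one alpha byte per pixel
        for (int i = 0; i < pixels; i++) {
            rgba[i * 4]     = rgb[i * 3];     // R
            rgba[i * 4 + 1] = rgb[i * 3 + 1]; // G
            rgba[i * 4 + 2] = rgb[i * 3 + 2]; // B
            rgba[i * 4 + 3] = alpha[i];       // A from the second stream
        }
    }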
