android-camera2

Recording video using MediaCodec with Camera2 API

家住魔仙堡 submitted on 2019-12-05 02:23:20
I am trying to use MediaCodec to record raw frames from an ImageReader in the onImageAvailable callback, but I am unable to get working code. Most of the examples use the Camera 1 API or MediaRecorder. My aim is to capture individual frames, process them, and create an MP4 out of them.

Raw YUV frames:

    @Override
    public void onImageAvailable(ImageReader reader) {
        Image i = reader.acquireLatestImage();
        processImage(i);
        i.close();
        Log.d("hehe", "onImageAvailable");
    }
    };

MediaCodec:

    MediaCodec codec = MediaCodec.createByCodecName(name);
    MediaFormat mOutputFormat; // member variable
    codec.setCallback(new MediaCodec
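
As a rough sketch (not the question's own solution), one workable arrangement is to let the encoder pull frames from an input Surface and drain its output into a MediaMuxer; the class name, sizes and rates below are placeholders:

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.media.MediaMuxer;
    import android.view.Surface;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    // Sketch: an H.264 encoder whose input Surface can be added as a camera2 output
    // target; encoded buffers are written to an MP4 file through MediaMuxer.
    class EncoderSketch {
        private MediaCodec codec;
        private MediaMuxer muxer;
        private int track = -1;
        private boolean muxing = false;

        Surface start(String outputPath, int width, int height) throws IOException {
            MediaFormat format = MediaFormat.createVideoFormat(
                    MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 6_000_000); // placeholder
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);      // placeholder
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

            muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
            codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            codec.setCallback(new MediaCodec.Callback() {
                @Override public void onInputBufferAvailable(MediaCodec mc, int index) {
                    // unused: frames arrive through the input Surface, not ByteBuffers
                }
                @Override public void onOutputBufferAvailable(MediaCodec mc, int index,
                                                              MediaCodec.BufferInfo info) {
                    ByteBuffer encoded = mc.getOutputBuffer(index);
                    boolean configData = (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
                    if (muxing && encoded != null && info.size > 0 && !configData) {
                        muxer.writeSampleData(track, encoded, info);
                    }
                    mc.releaseOutputBuffer(index, false);
                }
                @Override public void onOutputFormatChanged(MediaCodec mc, MediaFormat newFormat) {
                    track = muxer.addTrack(newFormat); // codec-specific data is available here
                    muxer.start();
                    muxing = true;
                }
                @Override public void onError(MediaCodec mc, MediaCodec.CodecException e) { }
            });
            codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            Surface inputSurface = codec.createInputSurface(); // add to the capture session
            codec.start();
            return inputSurface;
        }
    }

The trade-off is that per-frame processing then has to happen before the frames reach the Surface; feeding the ImageReader's YUV buffers into the codec directly would instead require a YUV input color format rather than COLOR_FormatSurface.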

Android camera2 api galaxy s7

寵の児 submitted on 2019-12-05 00:01:03
Question: I am writing an app that records video from the phone and uploads it to a server. It works fine on every device except the Galaxy S7. On the Galaxy S7, recording produces a video file with audio only and either no video or a single video frame. This is true of the temporary file created on the phone, not just the one uploaded to the server. I am using the Camera2 API, and I have tried both the front and back cameras. I have tried with my own code and with these two example applications: https://developer
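
Not taken from the thread, but one device-specific check worth making in a case like this is whether the requested recording size is actually listed for MediaRecorder output on the S7; a minimal sketch:

    import android.hardware.camera2.CameraCharacteristics;
    import android.hardware.camera2.CameraManager;
    import android.hardware.camera2.params.StreamConfigurationMap;
    import android.media.MediaRecorder;
    import android.util.Log;
    import android.util.Size;

    // Sketch: log the sizes the device reports as valid MediaRecorder outputs so the
    // recording size the app requests can be compared against them.
    class RecorderSizeCheck {
        static void logSupportedSizes(CameraManager manager, String cameraId) throws Exception {
            CameraCharacteristics chars = manager.getCameraCharacteristics(cameraId);
            StreamConfigurationMap map = chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) return;
            for (Size size : map.getOutputSizes(MediaRecorder.class)) {
                Log.d("RecorderSizes", "camera " + cameraId + " supports " + size);
            }
        }
    }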

Get Video and Audio buffer separately while recording video using front camera

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-04 23:03:12
Question: I dug around a lot on SO and some nice blog posts, but it seems I have an unusual requirement: reading the video and audio buffers separately for further processing while recording is going on. My use case: when the user starts video recording, I need to continuously process the video frames using ML-Face-Detection-Kit, and also continuously process the audio frames to make sure the user is saying something and to detect the noise level as well. For this, I think I need both video and audio in a
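
A sketch of the kind of setup described above, assuming an ImageReader supplies the frames for face detection while a separate AudioRecord supplies PCM buffers for the noise check (sizes and sample rate are placeholders, and some devices may not allow AudioRecord to run alongside another audio capture):

    import android.graphics.ImageFormat;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.Image;
    import android.media.ImageReader;
    import android.media.MediaRecorder;
    import android.os.Handler;

    // Sketch: two independent taps, one for video frames and one for audio samples,
    // each handed to its own processing step while recording is handled elsewhere.
    class FrameAndAudioTap {
        ImageReader imageReader;   // add imageReader.getSurface() as a camera2 output target
        AudioRecord audioRecord;
        volatile boolean running = true;

        void start(Handler backgroundHandler) {
            imageReader = ImageReader.newInstance(640, 480, ImageFormat.YUV_420_888, 2); // placeholder size
            imageReader.setOnImageAvailableListener(reader -> {
                Image image = reader.acquireLatestImage();
                if (image != null) {
                    // hand the YUV planes to the face-detection pipeline here
                    image.close();
                }
            }, backgroundHandler);

            int sampleRate = 44100; // placeholder
            int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
            audioRecord.startRecording();
            new Thread(() -> {
                short[] pcm = new short[bufferSize / 2];
                while (running) {
                    int read = audioRecord.read(pcm, 0, pcm.length);
                    // estimate the noise level from the amplitude of the first `read` samples here
                }
            }).start();
        }
    }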

android textureview full screen preview with correct aspect ratio

家住魔仙堡 submitted on 2019-12-04 18:01:26
Question: I have been working with the camera2 API demo from Google, and unfortunately the sample application is built to display the TextureView preview at approximately 70% of the screen height. After looking around, I was able to determine that this is caused by AutoFitTextureView overriding the onMeasure() method as shown below:

    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
        super.onMeasure(widthMeasureSpec, heightMeasureSpec);
        int width = MeasureSpec
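
One common workaround (sketched here, not quoted from an accepted answer) is to flip the comparison in onMeasure() so the view scales up to cover the screen and lets the excess be cropped, instead of shrinking to fit inside it, which is what produces the roughly 70%-tall preview:

    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
        super.onMeasure(widthMeasureSpec, heightMeasureSpec);
        int width = MeasureSpec.getSize(widthMeasureSpec);
        int height = MeasureSpec.getSize(heightMeasureSpec);
        if (0 == mRatioWidth || 0 == mRatioHeight) {
            setMeasuredDimension(width, height);
        } else {
            // Cover the whole view: pick the larger of the two candidate sizes so both
            // dimensions are filled and the overflow is cropped by the parent layout.
            if (width < height * mRatioWidth / mRatioHeight) {
                setMeasuredDimension(height * mRatioWidth / mRatioHeight, height);
            } else {
                setMeasuredDimension(width, width * mRatioHeight / mRatioWidth);
            }
        }
    }

How evenly the overflow is cropped depends on the surrounding layout; centering the TextureView in its parent is the usual companion change.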

Get full screen preview with Android camera2

穿精又带淫゛_ submitted on 2019-12-04 17:57:50
Question: I'm building a custom camera using the new camera2 API. My code is based on the code sample provided by Google here. I can't find a way to get the camera preview in full screen. In the code sample, they use ratio optimization to adapt to all screens, but it only takes up around 3/4 of the screen's height. Here is my code for AutoFitTextureView:

    public class AutoFitTextureView extends TextureView {
        private int mRatioWidth = 0;
        private int mRatioHeight = 0;

        public AutoFitTextureView(Context
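
An alternative sketch that keeps the view itself at full screen size and instead center-crops the content with a Matrix passed to setTransform(); it assumes the preview size is already expressed in the view's orientation and ignores any extra rotation:

    import android.graphics.Matrix;
    import android.view.TextureView;

    // Sketch: scale the camera buffer so it covers the whole TextureView (center-crop)
    // instead of resizing the view to match the buffer's aspect ratio.
    class PreviewTransform {
        static void applyCenterCrop(TextureView textureView, int previewWidth, int previewHeight) {
            int viewWidth = textureView.getWidth();
            int viewHeight = textureView.getHeight();
            if (viewWidth == 0 || viewHeight == 0) return;

            // Uniform scale needed for the preview to cover both view dimensions.
            float scale = Math.max((float) viewWidth / previewWidth,
                                   (float) viewHeight / previewHeight);

            // The identity transform stretches the buffer to the view (fit-XY), so undo
            // that stretch and apply the covering scale, pivoted at the view's center.
            Matrix matrix = new Matrix();
            matrix.setScale(scale * previewWidth / viewWidth,
                            scale * previewHeight / viewHeight,
                            viewWidth / 2f, viewHeight / 2f);
            textureView.setTransform(matrix);
        }
    }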

Android Camera2 preview occasionally rotated by 90 degrees

前提是你 submitted on 2019-12-04 16:51:47
I'm working on an app using Android's Camera2 API. So far I've been able to get a preview displayed within a TextureView. The app is in landscape mode by default. On the emulator the preview appears upside-down. On my physical Nexus 5 the preview is usually displayed correctly (landscape, not upside-down), but occasionally it is rotated by 90 degrees, yet stretched to the dimensions of the screen. I thought this should be easy and that the following code would return the necessary information about the current orientation:

    // display rotation
    getActivity().getWindowManager()
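
The display rotation by itself is not enough; it has to be combined with the camera's SENSOR_ORIENTATION (typically 90 or 270 on phones). A sketch along the lines of the check in the Camera2Basic sample, deciding whether width and height need to be swapped for the current rotation:

    import android.hardware.camera2.CameraCharacteristics;
    import android.view.Surface;

    // Sketch: combine the display rotation with SENSOR_ORIENTATION to find out
    // whether the preview width and height must be swapped for this orientation.
    class OrientationCheck {
        static boolean needsDimensionSwap(CameraCharacteristics characteristics, int displayRotation) {
            int sensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
            switch (displayRotation) {
                case Surface.ROTATION_0:
                case Surface.ROTATION_180:
                    return sensorOrientation == 90 || sensorOrientation == 270;
                case Surface.ROTATION_90:
                case Surface.ROTATION_270:
                    return sensorOrientation == 0 || sensorOrientation == 180;
                default:
                    return false;
            }
        }
    }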

Android Camera2 Video Playback Video and Audio Out of Sync

戏子无情 submitted on 2019-12-04 13:39:31
I've been having an issue using the Camera2 API for Android. I'm able to record videos, but during playback only the audio plays. After the video is done playing, the time jumps ahead anywhere from 10 minutes to 2 hours and then the video plays back. I've never heard of an issue like this. I pretty much followed this. Here is the code for setting up my MediaRecorder:

    mMediaRecorder = new MediaRecorder();
    // Step 2: Set sources
    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
    mMediaRecorder.setOutputFormat
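
For comparison, a minimal sketch of a complete MediaRecorder configuration in the order the class expects (sources, then output format and file, then parameters and encoders); the bit rate, frame rate and size are placeholders rather than values from the question:

    import android.media.MediaRecorder;

    // Sketch: MediaRecorder setup for camera2 (Surface video source), with the calls
    // in the order MediaRecorder requires them.
    class RecorderSetup {
        static MediaRecorder build(String outputPath) throws java.io.IOException {
            MediaRecorder recorder = new MediaRecorder();
            recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
            recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setOutputFile(outputPath);
            recorder.setVideoEncodingBitRate(10_000_000); // placeholder
            recorder.setVideoFrameRate(30);               // placeholder
            recorder.setVideoSize(1920, 1080);            // placeholder, must be a supported size
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
            recorder.prepare(); // after this, getSurface() can be added to the capture session
            return recorder;
        }
    }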

Android Camera2 API - Set AE-regions not working

孤街浪徒 submitted on 2019-12-04 12:45:55
In my Camera2 API project for Android, I want to set a region for the exposure calculation. Unfortunately it doesn't work, while the focus region works without any problems. Devices: Samsung S7 / Nexus 5.

1.) Initial values for CONTROL_AF_MODE & CONTROL_AE_MODE:

    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_AUTO);
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);

2.) Create the MeteringRectangle list:

    meteringFocusRectangleList = new MeteringRectangle[]{new MeteringRectangle(0, 0, 500, 500, 1000)};

3.)
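
A common cause, offered here as a guess rather than the thread's confirmed answer, is a metering rectangle expressed in view coordinates instead of sensor active-array coordinates, or a device that reports zero supported AE regions. A sketch of both checks, reusing mPreviewRequestBuilder from the question and assuming a characteristics object is available:

    import android.graphics.Rect;
    import android.hardware.camera2.CameraCharacteristics;
    import android.hardware.camera2.CaptureRequest;
    import android.hardware.camera2.params.MeteringRectangle;

    // Sketch: only set AE regions when the device supports them, and express the
    // rectangle in sensor active-array coordinates (here a centered region of
    // roughly a quarter of the array, as a placeholder).
    class AeRegionSketch {
        static void applyCenteredAeRegion(CameraCharacteristics characteristics,
                                          CaptureRequest.Builder mPreviewRequestBuilder) {
            Integer maxAeRegions = characteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AE);
            if (maxAeRegions == null || maxAeRegions == 0) {
                return; // the device ignores AE regions entirely
            }
            Rect active = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
            int regionWidth = active.width() / 4;
            int regionHeight = active.height() / 4;
            MeteringRectangle region = new MeteringRectangle(
                    active.centerX() - regionWidth / 2,
                    active.centerY() - regionHeight / 2,
                    regionWidth, regionHeight,
                    MeteringRectangle.METERING_WEIGHT_MAX);
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_REGIONS,
                    new MeteringRectangle[]{region});
        }
    }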

Manual Focus using android camera2 API

核能气质少年 submitted on 2019-12-04 11:27:20
I want to develop an Android camera app for myself (I can share it if people are interested) that has manual focus while recording video. I've added a SeekBar to the Google sample Camera2 app, but I can't find a way to implement the manual focus. I found "Manual focus in camera2, android" but it doesn't work on my LG G4. The stock camera app is almost perfect, except that it doesn't allow manual focus in video mode. Does anyone have an idea?

EDIT: here's the code of the SeekBar listener:

    @Override
    public void onStopTrackingTouch(SeekBar seekBar) {}

    @Override
    public void
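
A sketch of the usual approach, with the SeekBar driving LENS_FOCUS_DISTANCE (in diopters) while auto-focus is switched off; mCharacteristics, mCaptureSession and mBackgroundHandler are assumed to exist in the surrounding fragment, and how precisely the value is honored depends on the device's focus-distance calibration:

    @Override
    public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
        Float minFocusDistance =
                mCharacteristics.get(CameraCharacteristics.LENS_INFO_MINIMUM_FOCUS_DISTANCE);
        if (minFocusDistance == null || minFocusDistance == 0f) {
            return; // fixed-focus lens: manual focus is not available
        }
        // Map the SeekBar range onto [0, minFocusDistance] diopters
        // (0 = infinity, minFocusDistance = closest focusable distance).
        float diopters = minFocusDistance * progress / (float) seekBar.getMax();

        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                CaptureRequest.CONTROL_AF_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, diopters);
        try {
            mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(),
                    null, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }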

Video inverts 180 after recording in front camera 2 api

帅比萌擦擦* submitted on 2019-12-04 05:27:35
Question: I am using the Camera2 API for video recording in my app. I am following this demo: https://github.com/googlesamples/android-Camera2Basic. After recording, the video preview is inverted 180 degrees. How can I manage this for both the front and back camera?

Answer 1: I did the following, in Kotlin, in setUpCameraOutputs():

    // Find out if we need to swap dimension to get the preview size relative to sensor
    // coordinate.
    val displayRotation = activity.windowManager.defaultDisplay.rotation
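
For comparison, a sketch (kept in Java to match the other snippets on this page) of the orientation-hint calculation based on the formula the camera2 documentation gives for JPEG_ORIENTATION; setOrientationHint() takes the same kind of clockwise rotate-to-upright value, and the device orientation has to be reversed for front-facing cameras:

    import android.hardware.camera2.CameraCharacteristics;
    import android.media.MediaRecorder;
    import android.view.OrientationEventListener;

    // Sketch: compute the clockwise rotation that makes the recorded frames upright,
    // flipping the device orientation's sign for front-facing cameras, and pass it
    // to MediaRecorder before prepare().
    class OrientationHint {
        static void apply(MediaRecorder recorder, CameraCharacteristics characteristics,
                          int deviceOrientation /* from an OrientationEventListener */) {
            if (deviceOrientation == OrientationEventListener.ORIENTATION_UNKNOWN) return;
            int sensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);

            // Round the device orientation to a multiple of 90 degrees.
            deviceOrientation = (deviceOrientation + 45) / 90 * 90;

            // Front-facing cameras need the device orientation reversed.
            boolean facingFront = characteristics.get(CameraCharacteristics.LENS_FACING)
                    == CameraCharacteristics.LENS_FACING_FRONT;
            if (facingFront) deviceOrientation = -deviceOrientation;

            int rotation = (sensorOrientation + deviceOrientation + 360) % 360;
            recorder.setOrientationHint(rotation); // must be set before prepare()
        }
    }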