I developed video recording using image_picker. We need to limit video recording time.
pubspec.yaml dependencies: image_picker: ^0.4.10
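For context, a minimal sketch of how the recording is presumably started with the old static image_picker API (the function name funRecordWithImagePicker is only illustrative); note that the 0.4.x pickVideo call exposes no duration or limit parameter, which is exactly the problem:
// Sketch only: assumes the pre-0.6 static image_picker API pinned above.
import 'dart:io';
import 'package:image_picker/image_picker.dart';

Future<File> funRecordWithImagePicker() async {
  // Opens the platform camera UI (ACTION_VIDEO_CAPTURE on Android,
  // the camera UIModal on iOS) and returns the recorded file.
  // There is no parameter here to cap the recording duration.
  File videoFile = await ImagePicker.pickVideo(source: ImageSource.camera);
  return videoFile;
}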
@Coding24h:
The following is the Dart file that captures and plays video. Explanation:
Dart files written by me:
(1) 'GlobalVariables.dart' - contains a class 'gv' with static variables that can be accessed by all 'Pages/Widgets'
(2) 'LangStrings.dart' - this app is multi-language; this Dart file contains the strings in the different languages (English, Chinese ......)
(3) 'ScreenVariables.dart' - contains all screen-related variables such as orientation, height, width, physical height, devicePixelRatio ......
(4) 'Utilities.dart' - contains utility functions that can be used by all 'Pages', for example displaying a toast message whenever and wherever you want.
The initState() method: this is where the camera controller is declared and initialized.
The dispose() method: video recording is stopped manually when (1) the user clicks the 'Stop' button, or (2) the user leaves this page. Recording is also stopped automatically when the user switches to another app or turns off the phone's screen. In either case, the camera controller object should be disposed inside dispose().
The didChangeAppLifecycleState() method: pauses video playback if the app goes to the background.
The funTimerVideo() method: a timer that updates the position of the 'Slider' every second. (You can create another timer to stop the video recording if the user has a recording limit; a minimal sketch of such a timer appears after the source code below.)
The funSelectVideo() method: selects a video from the gallery.
The funCameraStart() method: captures a picture 'for future preview', then starts capturing video.
The funCameraStop() method: stops capturing video. Call this method when (1) the user presses the 'Video Recording Stop' button, or (2) the 'other timer' mentioned in funTimerVideo() calls it once the recording limit is exceeded.
Screen Capture (Video Player and Capture Screen):
Program Source Code (For the Video Player and Capture Page ONLY):
// Import Flutter and third-party packages
import 'dart:io';
import 'package:camera/camera.dart';
import 'package:file_picker/file_picker.dart';
import 'package:flutter/cupertino.dart';
import 'package:flutter/material.dart';
import 'package:font_awesome_flutter/font_awesome_flutter.dart';
import 'package:intl/intl.dart';
import "package:threading/threading.dart";
import 'package:video_player/video_player.dart';
// Import self-written Dart files
import 'GlobalVariables.dart';
import 'LangStrings.dart';
import 'ScreenVariables.dart';
import 'Utilities.dart';
// Import Pages
import 'BottomBar.dart';
// Home Page
class ClsHome extends StatefulWidget {
@override
_ClsHomeState createState() => _ClsHomeState();
}
class _ClsHomeState extends State<ClsHome> with WidgetsBindingObserver {
AppLifecycleState _lastLifecycleState;
// Declare Camera
CameraController ctlCamera;
// Var for Video
bool bolVideoPaused = true;
@override
void initState() {
super.initState();
print("Init State Started");
if (gv.bolHomeFirstInit) {
// This page is called by navigator.push twice, do nothing on the first call
gv.bolHomeFirstInit = false;
} else {
// Not the first time call of Init, do Init
WidgetsBinding.instance.addObserver(this);
try {
// Try to dispose old Camera Control
ctlCamera.dispose();
print("Camera Disposed 1");
} catch (err) {
print("Camera Dispose Error: " + err.toString());
}
try {
// Declare New Camera Control
ctlCamera = CameraController(gv.cameras[1], ResolutionPreset.high);
ctlCamera.initialize().then((_) {
if (!mounted) {
ut.showToast('1:' + ls.gs('SystemErrorOpenAgain'), true);
return;
}
setState(() {});
print('Controller Inited');
});
} catch (err) {
ut.showToast('2:' + ls.gs('SystemErrorOpenAgain'), true);
print("Camera Init Error: " + err.toString());
}
try {
gv.threadHomeVideo = new Thread(funTimerVideo);
gv.threadHomeVideo.start();
print('New Video Timer Started');
} catch (err) {
ut.showToast('3:' + ls.gs('SystemErrorOpenAgain'), true);
print('New Video Timer Error: ' + err.toString());
}
}
print("Init State Ended");
}
@override
void dispose() async {
super.dispose();
print("Dispose Started");
if (gv.bolHomeFirstDispose) {
gv.bolHomeFirstDispose = false;
} else {
WidgetsBinding.instance.removeObserver(this);
try {
await funCameraStop();
ctlCamera?.dispose();
print("Camera Disposed");
} catch (err) {
//
print("Play Video Dispose Error 1: " + err.toString());
}
try {
// gv.ctlVideo?.dispose();
gv.ctlVideo.pause();
// gv.threadPageHomeVideo.abort();
// print('Thread Video Aborted');
} catch (err) {
//
print("Play Video Dispose Error 2: " + err.toString());
}
// print('Controller dispose');
}
print("Dispose Ended");
}
@override
void didChangeAppLifecycleState(AppLifecycleState state) {
_lastLifecycleState = state;
print('***** Life Cycle State: ' + _lastLifecycleState.toString() + ' *****');
if (state == AppLifecycleState.paused) {
try {
if (gv.ctlVideo.value.isPlaying) {
bolVideoPaused = true;
gv.ctlVideo.pause();
}
setState(() {});
} catch (err) {
//
}
} else if (state == AppLifecycleState.resumed) {
}
}
// Timer to setState Video Play Position
void funTimerVideo() async {
while (true) {
await Thread.sleep(1000);
try {
if (gv.ctlVideo.value.isPlaying) {
gv.dblHomeVDSliderValueMS =
gv.ctlVideo.value.position.inMilliseconds.toDouble();
setState(() {});
}
} catch (err) {
// Video Not Yet Ready, Do Nothing
}
}
}
// Select Video from External Storage
funSelectVideo() async {
print('Select Video Started');
String filePath = '';
filePath = await FilePicker.getFilePath(type: FileType.VIDEO);
if (filePath != null && filePath != '') {
try {
// Declare Video if a video file is selected
gv.ctlVideo = VideoPlayerController.file(File(filePath))
..initialize().then((_) {
// Set Video Looping
gv.ctlVideo.setLooping(true);
// Get Video Duration in Milliseconds
gv.intHomeVDMS = gv.ctlVideo.value.duration.inMilliseconds;
setState(() {});
print('Video Inited');
});
} catch (err) {
print('Video Init Error: ' + err.toString());
gv.intHomeVDMS = 0;
ut.showToast(ls.gs('VideoErrorUnsupport'), true);
}
} else {
print('No Video Selected');
setState(() {});
}
print('Select Video Ended');
}
// The Widget that show the Video Player
Widget ctnVideoPlayer() {
try {
double dblHeight = sv.dblBodyHeight / 1.8;
print('Before Check Video Init');
if (gv.ctlVideo.value.initialized) {
print('Before Check Video AspectRatio');
if (gv.ctlVideo.value.aspectRatio < 1) {
dblHeight = sv.dblBodyHeight / 1.25;
}
print('Before Return ctnVideoPlayer');
return Column(
mainAxisAlignment: MainAxisAlignment.center,
crossAxisAlignment: CrossAxisAlignment.center,
children: <Widget>[
Container(
padding: EdgeInsets.fromLTRB(0, 10, 0, 10),
height: dblHeight,
width: sv.dblScreenWidth,
child: Center(
child: AspectRatio(
aspectRatio: gv.ctlVideo.value.aspectRatio,
child: VideoPlayer(gv.ctlVideo),
),
),
),
objVideoSlider(),
],
);
} else {
print('Before Return EMPTY ctnVideoPlayer');
return Container(
// color: Colors.white,
height: dblHeight,
width: sv.dblScreenWidth,
child: Center(
child: Text(ls.gs('SelectVideo')),
),
);
}
} catch (err) {
print('Page Home ctnVideoPlayer() : ' + err.toString());
return Container(
// color: Colors.white,
height: sv.dblBodyHeight / 1.8,
width: sv.dblScreenWidth,
child: Center(
child: Text(ls.gs('SelectVideo')),
),
);
}
}
// function when Play or Pause clicked
funPlayVideo() async {
try {
if (gv.ctlVideo.value.initialized) {
if (gv.ctlVideo.value.isPlaying) {
bolVideoPaused = true;
gv.ctlVideo.pause();
// Stop Camera Recording
funCameraStop();
} else {
bolVideoPaused = false;
gv.ctlVideo.play();
// Start Camera Recording
funCameraStart();
}
setState(() {});
} else {
// Do Nothing
}
} catch (err) {
// Do Nothing
}
}
// function when Forward 15 seconds clicked
funForwardVideo() async {
try {
if (gv.ctlVideo.value.initialized) {
gv.ctlVideo.seekTo(gv.ctlVideo.value.position + Duration(seconds: 15));
setState(() {});
} else {
// Do Nothing
}
} catch (err) {
// Do Nothing
}
}
// function when Backward 15 seconds clicked
funBackwardVideo() async {
try {
if (gv.ctlVideo.value.initialized) {
gv.ctlVideo.seekTo(gv.ctlVideo.value.position - Duration(seconds: 15));
setState(() {});
} else {
// Do Nothing
}
} catch (err) {
// Do Nothing
}
}
// Widget to show the Slider of the playing position of Video
Widget objVideoSlider() {
try {
if (gv.ctlVideo.value.initialized) {
return Row(
mainAxisAlignment: MainAxisAlignment.center,
crossAxisAlignment: CrossAxisAlignment.center,
children: <Widget>[
Text(' '),
Text(gv.ctlVideo.value.position.inHours.toString() +
":" +
(gv.ctlVideo.value.position.inMinutes % 60)
.toString()
.padLeft(2, '0') +
":" +
(gv.ctlVideo.value.position.inSeconds % 60)
.toString()
.padLeft(2, '0')),
Expanded(
child: CupertinoSlider(
min: 0.0,
max: gv.intHomeVDMS.toDouble(),
divisions: (gv.intHomeVDMS / 1000).toInt(),
value: gv.dblHomeVDSliderValueMS,
onChanged: (double dblNewValue) {
objVideoSliderChanged(dblNewValue);
},
),
),
Text(gv.ctlVideo.value.duration.inHours.toString() +
":" +
(gv.ctlVideo.value.duration.inMinutes % 60)
.toString()
.padLeft(2, '0') +
":" +
(gv.ctlVideo.value.duration.inSeconds % 60)
.toString()
.padLeft(2, '0')),
Text(' '),
],
);
} else {
return Container();
}
} catch (err) {
return Container();
}
}
// Function when Slider Changed Manually
objVideoSliderChanged(dblNewValue) {
gv.dblHomeVDSliderValueMS = dblNewValue;
gv.ctlVideo
.seekTo(Duration(milliseconds: gv.dblHomeVDSliderValueMS.toInt()));
setState(() {});
}
// Function Start Camera
Future<void> funCameraStart() async {
// Declare File Name
DateTime dtTimeStamp() => DateTime.now();
String strTimeStamp = DateFormat('yyyyMMdd_kkmmss').format(dtTimeStamp());
String strMovieFile = gv.strMoviePath + '/' + strTimeStamp + '.mp4';
gv.strImageFile = gv.strImagePath + '/' + strTimeStamp;
print('File Path: ' + strMovieFile);
try {
await ctlCamera.takePicture(gv.strImageFile + '_01.jpg');
await ctlCamera.startVideoRecording(strMovieFile);
} catch(err) {
ut.showToast('4:' + ls.gs('SystemErrorOpenAgain'), true);
}
}
// Function Stop Camera
Future<void> funCameraStop() async {
try {
await ctlCamera.stopVideoRecording();
} catch(err) {
// ut.showToast('5:' + ls.gs('SystemErrorOpenAgain'), true);
}
try {
await ctlCamera.takePicture(gv.strImageFile + '_02.jpg');
} catch(err) {
// ut.showToast('5:' + ls.gs('SystemErrorOpenAgain'), true);
}
}
// Main Widget
@override
Widget build(BuildContext context) {
try {
if (ctlCamera != null) {
if (!ctlCamera.value.isInitialized) {
print('return Container');
return Container();
}
print('Before Return');
return Scaffold(
appBar: PreferredSize(
child: AppBar(
title: Text(
ls.gs('Player'),
style: TextStyle(fontSize: sv.dblDefaultFontSize),
),
),
preferredSize: new Size.fromHeight(sv.dblTopHeight),
),
body: Column(
mainAxisAlignment: MainAxisAlignment.center,
crossAxisAlignment: CrossAxisAlignment.center,
children: <Widget>[
ctnVideoPlayer(),
Stack(children: <Widget>[
Container(
// color: Colors.white,
height: sv.dblBodyHeight / 25,
width: sv.dblScreenWidth,
child: Center(
child: Row(
mainAxisAlignment: MainAxisAlignment.center,
crossAxisAlignment: CrossAxisAlignment.center,
children: <Widget>[
Text(' '),
AspectRatio(
aspectRatio: ctlCamera.value.aspectRatio,
child: CameraPreview(ctlCamera),
),
]),
),
),
Container(
// color: Colors.white,
height: sv.dblBodyHeight / 25,
width: sv.dblScreenWidth,
child: Center(
child: Row(
mainAxisAlignment: MainAxisAlignment.center,
crossAxisAlignment: CrossAxisAlignment.center,
children: <Widget>[
RaisedButton(
onPressed: () => funSelectVideo(),
child: Icon(Icons.folder_open),
),
Text(' '),
RaisedButton(
onPressed: () => funBackwardVideo(),
child: Icon(FontAwesomeIcons.angleDoubleLeft),
),
Text(' '),
RaisedButton(
onPressed: () => funPlayVideo(),
child: bolVideoPaused
? Icon(Icons.play_arrow)
: Icon(Icons.pause),
),
Text(' '),
RaisedButton(
onPressed: () => funForwardVideo(),
child: Icon(FontAwesomeIcons.angleDoubleRight),
),
],
),
),
),
])
]),
bottomNavigationBar: ClsBottom(),
);
} else {
return Container();
}
} catch (err) {
print('PageHome Error build: ' + err.toString());
return Container();
}
}
}
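To enforce a recording limit with the camera-plugin approach above, a second timer can call funCameraStop() automatically. A minimal sketch, intended to live inside _ClsHomeState; it uses Dart's built-in Timer from dart:async instead of the threading package, and gv.intRecordLimitSeconds is an assumed global holding the user's limit:
// Requires: import 'dart:async';
// Hypothetical recording-limit timer (gv.intRecordLimitSeconds is assumed).
Timer tmrRecordLimit;

void funCameraStartWithLimit() {
  funCameraStart();
  // Cancel any previous limit timer before scheduling a new one.
  tmrRecordLimit?.cancel();
  tmrRecordLimit = Timer(Duration(seconds: gv.intRecordLimitSeconds), () {
    // Recording limit reached: stop the camera and refresh the UI.
    funCameraStop();
    if (mounted) setState(() {});
  });
}

void funCameraStopWithLimit() {
  // Stop manually and cancel the pending auto-stop.
  tmrRecordLimit?.cancel();
  funCameraStop();
}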
With image_picker, the camera recording is launched using an intent on Android (ACTION_VIDEO_CAPTURE) and a UIModal on iOS. It's not a plugin that's customizable to your needs; it only does what it's intended to do. The activity and the UIModal UI are built for that purpose. The best ways to do what you want are:
(1) Create your own plugin and add a timer in the class that triggers after recording starts. When the timer finishes, just stop and save the recording and return to the previous view. You can see how it's done here on Android for Flutter, then look at iOS and do the same. It should only be a few additional lines of code.
(2) Modify the current plugin to fit your needs; this shouldn't be too difficult. It won't be something they would want to merge back into the code, because it's not a feature most users of this library would need, so no pull request would be made for it, but you can keep it on your own GitHub and share it. The code is here. Just keep in mind you'd have to maintain it yourself, even though there won't be much change in it.
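For completeness, if upgrading the plugin is an option: newer releases of image_picker (well past the 0.4.10 pinned above) added a maxDuration argument to pickVideo, which passes the limit down to the platform camera UI. A minimal sketch assuming a current, null-safe image_picker release (the function name is illustrative):
// Sketch only: assumes a recent image_picker release with the instance API.
import 'package:image_picker/image_picker.dart';

Future<XFile?> funRecordLimitedVideo() {
  // maxDuration asks the platform recorder to stop after 30 seconds.
  return ImagePicker().pickVideo(
    source: ImageSource.camera,
    maxDuration: const Duration(seconds: 30),
  );
}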