Creating a Flutter App to Play Audio and Video from the Network and Assets
What is Flutter?
Flutter is an open-source UI development kit created by Google, used to develop applications for Android, iOS, Windows, and more, all from a single codebase. This means we write the code once and the Flutter framework compiles it to native code for each platform.
Why Flutter?
The use case of Flutter is simple: create beautiful applications with just a few lines of code. Flutter also offers comparatively fast development, thanks to its built-in Hot Reload feature, and an expressive, flexible UI. Flutter uses the Dart language, which is optimized for UI development.
Any Prerequisites?
To install and run the application we need the following:
- Flutter
- Dart
- Android Studio to download the necessary SDKs
- Android Emulator created using Android Studio or your personal mobile phone
- Any IDE that suits you. In this article we will be using Visual Studio Code
What is the Task?
The task includes the following:
- Create a Flutter app.
- Add assets (e.g. audio and video files) to the project.
- Play this audio and video from the assets.
- Also play audio and video from the Internet.
- Add play, pause, and stop controls for both audio and video.
How to accomplish the Task?
To fulfill the task just follow the steps shown below:
Step 1: Create a basic test app by running the flutter create command from the command line, for example: flutter create test_app.
Step 2: In the main.dart file, clear the contents and write the code shown below.
import 'package:flutter/material.dart';
import 'package:test_app/ui/home.dart';

void main() {
  runApp(ProApp());
}

class ProApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MyApp();
  }
}
A Dart program always starts execution from the main function, so we call Flutter's built-in runApp method there to launch our application. Here we created a ProApp class that extends StatelessWidget; its build method returns our main application interface. Wrapping the UI in a widget class like this lets us take advantage of the Hot Reload feature.
Step 3: Now we will create the user interface of the app. We use Material Design, created by Google, which provides us with rich user interface widgets.
import 'package:audioplayers/audio_cache.dart';
import 'package:flutter/material.dart';
import 'package:test_app/ui/vidnet.dart';

Widget MyApp() {
  var audioPlayer = AudioCache();
  play() async => await audioPlayer.play("audio/uclring.mp3");
  var videonetwork = Container(
    padding: EdgeInsets.only(top: 40),
    alignment: Alignment.bottomCenter,
    child: VideoApp(),
  );
  var colrowtext2 = Text(
    "Player: Scholes",
    style: TextStyle(fontWeight: FontWeight.bold, fontSize: 15, color: Colors.red),
    textAlign: TextAlign.justify,
  );
  var colrowtext3 = Text(
    "Team Against: Barcelona",
    style: TextStyle(fontWeight: FontWeight.bold, fontSize: 15, color: Colors.red),
    textAlign: TextAlign.justify,
  );
  var colrowtext4 = Text(
    "Year: 2008",
    style: TextStyle(fontWeight: FontWeight.bold, fontSize: 15, color: Colors.red),
    textAlign: TextAlign.justify,
  );
  var colrowtext5 = Text(
    "Stage: SemiFinal 2nd Leg",
    style: TextStyle(fontWeight: FontWeight.bold, fontSize: 15, color: Colors.red),
    textAlign: TextAlign.justify,
  );
  var colrowtext6 = Text(
    "Stadium: Old Trafford",
    style: TextStyle(fontWeight: FontWeight.bold, fontSize: 15, color: Colors.red),
    textAlign: TextAlign.justify,
  );
  var colrowtext1 = Text(
    'Team: Manchester United',
    style: TextStyle(fontWeight: FontWeight.bold, fontSize: 15, color: Colors.red),
    textAlign: TextAlign.justify,
  );
  var colrow = Row(
    mainAxisAlignment: MainAxisAlignment.center,
    children: <Widget>[colrowtext1],
  );
  var cont1col = Column(
    mainAxisAlignment: MainAxisAlignment.center,
    children: <Widget>[colrow, colrowtext2, colrowtext3, colrowtext4, colrowtext5, colrowtext6],
  );
  var cont1 = Container(
    alignment: Alignment.center,
    width: 300,
    height: 200,
    decoration: BoxDecoration(
      color: Colors.red.shade100,
      borderRadius: BorderRadius.circular(50),
    ),
    child: cont1col,
  );
  var imgurl1 = "https://upload.wikimedia.org/wikipedia/en/thumb/7/7a/Manchester_United_FC_crest.svg/200px-Manchester_United_FC_crest.svg.png";
  var cont2 = GestureDetector(
    onTap: play,
    child: Container(
      margin: EdgeInsets.only(bottom: 200),
      width: 100,
      height: 100,
      decoration: BoxDecoration(
        borderRadius: BorderRadius.circular(50),
        image: DecorationImage(
          image: NetworkImage(imgurl1),
        ),
      ),
    ),
  );
  var contcolumn = Stack(
    alignment: Alignment.center,
    children: <Widget>[cont1, cont2, videonetwork],
  );
  var imgurl2 = "https://wallpapercave.com/wp/YyfyCXK.jpg";
  var mycontainer = Container(
    alignment: Alignment.center,
    width: double.infinity,
    height: double.infinity,
    decoration: BoxDecoration(
      image: DecorationImage(
        image: NetworkImage(imgurl2),
        fit: BoxFit.cover,
      ),
    ),
    child: contcolumn,
  );
  var ucllogourl = "https://upload.wikimedia.org/wikipedia/en/thumb/b/bf/UEFA_Champions_League_logo_2.svg/1200px-UEFA_Champions_League_logo_2.svg.png";
  var ucllogo = Image.network(ucllogourl);
  var myAppBar = AppBar(
    title: Text("Manchester United"),
    leading: ucllogo,
  );
  var myhome = Scaffold(
    appBar: myAppBar,
    body: mycontainer,
  );
  return MaterialApp(
    debugShowCheckedModeBanner: false,
    home: myhome,
  );
}
Step 4: After creating the user interface, which includes the audio and the video, we need to define a class that sets the state, initializes the video, and plays it. This class returns the video inside a video player widget.
import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

class VideoApp extends StatefulWidget {
  @override
  _VideoAppState createState() => _VideoAppState();
}

class _VideoAppState extends State<VideoApp> {
  VideoPlayerController _controller;

  @override
  void initState() {
    super.initState();
    _controller = VideoPlayerController.network(
        'https://github.com/abhi-jeet589/flutter/raw/master/ucl_Trim.mp4')
      ..initialize().then((_) {
        // Ensure the first frame is shown after the video is initialized,
        // even before the play button has been pressed.
        setState(() {
          _controller.play();
        });
      });
  }

  @override
  Widget build(BuildContext context) {
    return SingleChildScrollView(
      child: Column(
        children: <Widget>[
          Container(
            padding: const EdgeInsets.all(20),
            child: AspectRatio(
              aspectRatio: _controller.value.aspectRatio,
              child: Stack(
                alignment: Alignment.bottomCenter,
                children: <Widget>[
                  VideoPlayer(_controller),
                  VideoProgressIndicator(_controller, allowScrubbing: true),
                  IconButton(
                    icon: Icon(Icons.video_library),
                    onPressed: () {
                      if (_controller.value.isPlaying) {
                        _controller.pause();
                      } else {
                        _controller.play();
                      }
                    },
                  ),
                ],
              ),
            ),
          ),
        ],
      ),
    );
  }

  @override
  void dispose() {
    // Release the controller's resources before disposing of the state.
    _controller.dispose();
    super.dispose();
  }
}
Step 5: To use the audio and video players in our task we need to import the packages from pub.dev, Flutter's public package repository. To do this we simply declare the packages in the pubspec.yaml file. Since we also want to use assets in this app, we declare them under the assets section of pubspec.yaml.
name: test_app
description: A new Flutter project.
# The following line prevents the package from being accidentally published to
# pub.dev using `pub publish`. This is preferred for private packages.
publish_to: 'none'

# Version number followed by an optional build number separated by a +.
version: 1.0.0+1

environment:
  sdk: ">=2.7.0 <3.0.0"

dependencies:
  flutter:
    sdk: flutter
  audioplayers: ^0.15.1
  video_player: ^0.10.11+2
  flick_video_player: ^0.1.1
  # The following adds the Cupertino Icons font to your application.
  # Use with the CupertinoIcons class for iOS style icons.
  cupertino_icons: ^0.1.3

dev_dependencies:
  flutter_test:
    sdk: flutter

# The following section is specific to Flutter.
flutter:
  # Ensures that the Material Icons font is included with your application,
  # so that you can use the icons in the material Icons class.
  uses-material-design: true
  # Declare the asset folders and files bundled with the app.
  assets:
    - assets/
    - assets/audio/uclring.mp3
Step 6: To run the app on a phone we need to enable Developer Mode, which can be done in the Settings menu by tapping the build number seven times. After activating Developer Mode, enable USB debugging so that the app can be built and installed on the phone. After completing these steps, run the app using the Run Without Debugging option under the Run menu in Visual Studio Code, or with flutter run from the command line.
After the app has been built and installed the interface will look like this.
Future Scope: The app right now only plays audio from an asset and video from the network, due to restrictions on page size and my knowledge at the time of creating the app. We can add network audio and asset video by changing the constructor or method used in the home.dart and vidnet.dart files.
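As a sketch of that change, the two swaps only require different constructors: in audioplayers 0.15.x the AudioPlayer class accepts a URL, and video_player provides VideoPlayerController.asset for bundled files. The URL and asset path below are placeholders, not files from this project:

```dart
import 'package:audioplayers/audioplayers.dart';
import 'package:video_player/video_player.dart';

// Play audio from the network instead of an asset.
// AudioPlayer.play accepts a URL in audioplayers 0.15.x.
final audioPlayer = AudioPlayer();
playNetworkAudio() async =>
    await audioPlayer.play("https://example.com/ring.mp3"); // placeholder URL

// Play video from an asset instead of the network.
// The file must also be declared under assets: in pubspec.yaml.
final assetController =
    VideoPlayerController.asset("assets/video/clip.mp4"); // placeholder path
```

The rest of the UI stays the same; only the source of the media changes.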