Music Player with the Flutter Framework
I completed my first Flutter task, given by Vimal Daga Sir, by creating a music player.
The main goal of this task is to use the Flutter framework to play audio from local files, built-in tones, and the internet. The same applies to the video player.
The UI of my app is:
Dart is the programming language used in this application.
Now, let's understand each concept:
First, create a separate folder for all of your projects or tasks, then create a basic Flutter application from the command line (cmd).
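The scaffolding step above is a single Flutter CLI command (it assumes the Flutter SDK is installed and on your PATH):

```shell
# Create a new Flutter project named task_app inside your projects folder
flutter create task_app
cd task_app
```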
My project is named task_app (highlighted in yellow in the screenshot).
I used Visual Studio Code as my IDE; Android Studio can be used as well.
For design, audio, and video, libraries (packages) are used to integrate their functionality into the code.
I got these dependencies from pub.dev. Every dependency should be declared in the pubspec.yaml file; once the file is saved, the dependencies are downloaded automatically.
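The dependencies section of pubspec.yaml would look roughly like this (the version numbers here are assumptions, not the exact ones I used — take the current versions from pub.dev):

```yaml
dependencies:
  flutter:
    sdk: flutter
  audioplayers: ^0.17.0   # audio playback (version is an assumption)
  video_player: ^1.0.1    # video playback (version is an assumption)
  file_picker: ^2.1.7     # picking files from local storage (assumption)
```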
For the design of the application, a package called material.dart is used. It ships with Flutter itself, so you don't need to install it separately.
Just like in Java, execution starts from the main() function; here, main() calls runApp().
runApp() is a pre-built function responsible for running the application on your phone.
It inflates the widget that is to be displayed; in my case, that is Task().
Task is a stateless widget that returns a MaterialApp(), which provides the base canvas for your application — a white screen. Flutter's hot reload is a superpower here: any change to the code is reflected instantly without recompiling the whole app. MaterialApp sets Homepage() as the widget displayed on that white screen.
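The structure described so far can be sketched like this (Homepage is the stateful widget introduced next):

```dart
import 'package:flutter/material.dart';

void main() {
  // runApp() inflates the given widget and attaches it to the screen.
  runApp(Task());
}

// A stateless widget: its configuration never changes after it is built.
class Task extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Homepage(), // the stateful widget shown on the white screen
    );
  }
}
```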
Homepage() is a stateful widget: it creates a State object whose data can be initialized and changed throughout the app's lifetime. The difference between stateful and stateless widgets is exactly this mutable state, which makes stateful widgets more powerful.
In the figure above, audioPlayer is an object of AudioPlayer() that is used later to call its functions/methods. The video player works similarly, but instead of a plain object, the video package provides its own type, VideoPlayerController. The video is picked up from a server using the VideoPlayerController.network(URL) constructor, and _controller.initialize() is used to initialize it.
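A sketch of what that figure shows, assuming the audioplayers and video_player packages (the video URL here is a placeholder, not the real one):

```dart
class Homepage extends StatefulWidget {
  @override
  _HomePageState createState() => _HomePageState();
}

class _HomePageState extends State<Homepage> {
  // Object of AudioPlayer(); its methods (play, pause, stop) are used later.
  AudioPlayer audioPlayer = AudioPlayer();

  // The video package provides its own type instead of a plain object.
  VideoPlayerController _controller;
  Future<void> _initVideo;

  @override
  void initState() {
    super.initState();
    // Pick up the video from a server (placeholder URL).
    _controller =
        VideoPlayerController.network('https://example.com/sample.mp4');
    // initialize() returns a Future, consumed later by a FutureBuilder.
    _initVideo = _controller.initialize();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(); // the full UI is described in the sections below
  }
}
```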
"Remember: you can't just take any URL (YouTube, Dailymotion, and so on) to play in your app, because those URLs don't point to raw media files. To use such videos, one should understand the concept of APIs. The same principle applies to audio."
_HomePageState returns a Scaffold widget, which provides a pre-defined layout framework that includes an AppBar, a body, floating action buttons, and so on.
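A skeleton of that build() method (the comments mark where the rows described below are placed):

```dart
@override
Widget build(BuildContext context) {
  return Scaffold(
    // Pre-defined framework: app bar on top, body below,
    // floating action buttons at the bottom.
    appBar: AppBar(
      title: Text('Music Player'),
    ),
    body: Column(
      children: <Widget>[
        // Row 1: raised buttons that play asset / network audio
        // Row 2: audio player controls (play, pause, stop)
        // Row 3: container holding the video
      ],
    ),
  );
}
```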
In the user interface (UI), at the body's top-left, a raised button is placed which, on press, plays a tone from the application's assets. For this functionality, AudioCache() from the audioplayers package is used; it plays a file stored in the app's assets folder. In my case, pressing it plays the Nokia.mp3 file. Next to it is another raised button that plays the Krishna.mp3 file.
The third raised button plays a file that comes from a server. I uploaded the file to my GitHub because of API restrictions on gaana.com, Spotify, and many other services.
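These buttons live inside the Scaffold's body; a sketch, assuming an older audioplayers version where AudioCache.play() takes an asset path (its default prefix is 'assets/') and RaisedButton is still available (the song URL is a placeholder):

```dart
AudioCache audioCache = AudioCache();

// Button 1: play a tone bundled in the app's assets.
RaisedButton(
  child: Text('Nokia tune'),
  onPressed: () {
    audioCache.play('Nokia.mp3'); // resolves to assets/Nokia.mp3
  },
),

// Button 3: play a file hosted on a server, e.g. a raw GitHub URL.
RaisedButton(
  child: Text('From internet'),
  onPressed: () {
    audioPlayer.play('https://example.com/song.mp3'); // placeholder URL
  },
),
```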
I also used images to beautify the design; they are placed in the pic/images folder and are assets as well.
Assets are declared in the pubspec.yaml file:
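The declaration would look roughly like this (exact paths are assumptions based on the file names mentioned above):

```yaml
flutter:
  assets:
    - assets/Nokia.mp3
    - assets/Krishna.mp3
    - pic/images/
```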
The second row, the audio player controls, is placed next:
First a container is created, then a row to place the buttons horizontally. The buttons' functionality comes from the audioPlayer object: play, pause, stop, and so on. Pressing the play button plays the song via the audioPlayer.play() function and flips a boolean flag so the pause button appears; pressing the pause button pauses the song and flips the flag back so the play button appears. The flag is updated inside setState((){}), which tells the program that the state of the variable has changed so the UI rebuilds.
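A sketch of that play/pause toggle, assuming the audioplayers package (where resume() continues a loaded song) and a boolean field named `played` in _HomePageState:

```dart
bool played = false; // whether a song is currently playing

IconButton(
  icon: Icon(played ? Icons.pause : Icons.play_arrow),
  onPressed: () {
    if (played) {
      audioPlayer.pause();  // pause the current song
    } else {
      audioPlayer.resume(); // continue the loaded song
    }
    // setState() tells Flutter the state changed, so the widget
    // rebuilds and the correct icon is shown.
    setState(() {
      played = !played;
    });
  },
),
```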
The third column has a container that holds the video. Here the FutureBuilder() widget is used because _controller.initialize() fetches from the server and returns a value of type Future, which is stored in a variable. The builder tells Flutter what to build once that Future completes: an AspectRatio with a GestureDetector as its child, so that tapping the video reloads it from the start once the connection is established. If the connection is not yet established, a buffering indicator is shown instead.
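A sketch of that FutureBuilder, assuming the video_player package and the `_initVideo` Future stored in initState:

```dart
FutureBuilder(
  // The Future returned by _controller.initialize().
  future: _initVideo,
  builder: (context, snapshot) {
    if (snapshot.connectionState == ConnectionState.done) {
      return AspectRatio(
        aspectRatio: _controller.value.aspectRatio,
        child: GestureDetector(
          // Tapping the video reloads it from the beginning.
          onTap: () {
            _controller.seekTo(Duration.zero);
            _controller.play();
          },
          child: VideoPlayer(_controller),
        ),
      );
    }
    // Connection not established yet: show a buffering indicator.
    return Center(child: CircularProgressIndicator());
  },
),
```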
Last are the floating buttons:
The floating buttons are placed in a row, so a Row widget is their parent. The first one opens local storage with the help of the file_picker package. Picking a file returns a Future, which is why the "await" keyword is used; await can only be used inside a function marked async. After navigating to storage and selecting a file, the music starts to play and can be controlled with the audio player controls.
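A sketch of that button; the exact picker call depends on the file_picker version, so FilePicker.platform.pickFiles() here is an assumption based on recent releases:

```dart
FloatingActionButton(
  child: Icon(Icons.folder_open),
  // async on the callback is what allows await inside it.
  onPressed: () async {
    // pickFiles() returns a Future, so we await the user's choice.
    FilePickerResult result = await FilePicker.platform.pickFiles();
    if (result != null) {
      // Play the selected local file; the audio player controls
      // (play/pause/stop) then work on it as usual.
      audioPlayer.play(result.files.single.path, isLocal: true);
    }
  },
),
```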
The second floating button plays and pauses the video: if the video is playing, a pause icon appears; otherwise, a play icon.
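That toggle can be sketched with the same _controller from before; setState() makes the icon switch when the playing state changes:

```dart
FloatingActionButton(
  onPressed: () {
    setState(() {
      if (_controller.value.isPlaying) {
        _controller.pause();
      } else {
        _controller.play();
      }
    });
  },
  // If the video is playing, show a pause icon; otherwise a play icon.
  child: Icon(
    _controller.value.isPlaying ? Icons.pause : Icons.play_arrow,
  ),
),
```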
Working model of music player
{{ Thank you for reading }}
GitHub Link