Video Streaming Applications Testing
Introduction
-What is Video Streaming?
Video streaming is the delivery of video/media that is received continuously and played back by a multimedia player while it arrives. Live streaming is the process of broadcasting real-time video to people who are watching it over the internet. You see many examples of media streaming in the real world, such as YouTube, Livestream, Vimeo, Hulu, Ustream and Netflix.
-How it fits in IT?
We all love to watch live events, especially sports and concerts by our favourite entertainers, but when it is impossible or difficult to attend a live event, we watch these programs in real time on our TV or over the internet through live streaming apps available on platforms such as Android, iOS and desktop.
-Who uses Video Streaming?
Broadcasting companies that want their content to be available everywhere use streaming to deliver video over the internet to their consumers.
Streaming Protocols (Basics)
-Types of Streaming
Progressive Download: In this process the client/player asks the server for a video file, and the server sends the whole file to the client over HTTP. Playback can start only once enough of the file has been downloaded. Since the content is downloaded to your local machine, the content is not secure. Users cannot skip ahead in the timeline because the content is downloaded linearly. There is no monitoring of the video file; it is simply downloaded and played back. If the network speed is slow, you'll see buffering/stalling in the video. YouTube and Vimeo are often cited as examples of progressive download.
Adaptive Bitrate Streaming: This is a technique for streaming multimedia over the internet at fluctuating bandwidth. Adaptive bitrate streaming works by detecting a user's bandwidth and CPU capacity and adjusting the quality of the video stream accordingly. You'll get a clearer picture later when we explain it with a figure and an example. Looking at the technologies used in the past, most streaming was done over the RTSP protocol; now, for adaptive bitrate streaming, we use HTTP-based streaming, which is designed to work efficiently in distributed HTTP networks such as the internet.
This streaming requires an encoder which encodes the raw video from a single stream into multiple bitrates. The client application or player switches between these multiple bitrate streams according to the available bandwidth and plays the stream accordingly; as a result we do not see much buffering and get a good experience on both high-bandwidth and low-bandwidth connections.
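The switching decision described above can be sketched as a simple selection rule. This is only an illustration of the idea, not any real player's algorithm; the variant list and the safety margin are assumptions made for the example:

```python
# Minimal sketch of adaptive bitrate selection (illustrative only).
# Each variant is (bitrate_bps, resolution); the values are hypothetical.
VARIANTS = [
    (450_000, "448x252"),
    (800_000, "624x352"),
    (1_200_000, "640x360"),
    (2_600_000, "1280x720"),
]

def pick_variant(measured_bandwidth_bps, safety=0.8):
    """Pick the highest-bitrate variant that fits within a safety
    margin of the measured bandwidth; fall back to the lowest."""
    budget = measured_bandwidth_bps * safety
    eligible = [v for v in VARIANTS if v[0] <= budget]
    return max(eligible) if eligible else min(VARIANTS)

print(pick_variant(1_000_000))  # mid bandwidth -> 624x352 variant
print(pick_variant(4_000_000))  # ample bandwidth -> 1280x720 variant
print(pick_variant(100_000))    # starved -> lowest variant
```

Real players also factor in buffer occupancy and download history, but the core idea is the same: pick the best stream that the current conditions can sustain.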
Adaptive Bitrate Streaming Technologies:
1. Apple HTTP Live Streaming
2. Adobe HTTP Dynamic Streaming
3. Microsoft Smooth Streaming
I will explain HTTP Live Streaming (HLS) as it is widely used in the media streaming industry.
What is HTTP Live Streaming?
HLS lets you send audio and video over HTTP from a web server for playback on Android, iOS, Desktop and other platform applications that support HLS playback. HLS supports both Live and Video on Demand content.
HLS Architecture:
Fig 1 – HLS Architecture
HLS consists of three major components:
1. Server Component
2. Distribution Component
3. Client Software
Server Component: The server component takes input streams of media, encodes them digitally, creates multiple bitrate streams suitable for delivery, and sends them to the segmenter, which breaks them into segments or chunks.
Fig 2 – Server Component
Distribution Component:
The distribution component is responsible for content distribution. When a client/player sends a request, it reaches the origin web server, and the response is sent back to the client in the form of index files. The player reads an index file and requests the content again; the content is then sent to the client in the form of chunks or segments.
All requests and responses pass through the CDN over HTTP. Once content is served to a client, a cached copy of that content is created on the CDN, so if another client requests the same data it is served directly from the CDN, which reduces the load on the origin web server.
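The caching behaviour just described can be modelled in a few lines. This is a toy sketch of the idea, not a real CDN API; the class and function names are invented for the example:

```python
# Toy model of CDN edge caching (illustrative only).
class EdgeCache:
    def __init__(self, origin_fetch):
        self.origin_fetch = origin_fetch  # callable standing in for the origin server
        self.cache = {}                   # cached copies held at the edge
        self.origin_hits = 0              # how often the origin was contacted

    def get(self, url):
        if url not in self.cache:         # cache miss: fetch from origin once
            self.cache[url] = self.origin_fetch(url)
            self.origin_hits += 1
        return self.cache[url]            # cache hit: served from the edge

cdn = EdgeCache(lambda url: f"<segment data for {url}>")
cdn.get("seg0.ts")      # first client: fetched from the origin
cdn.get("seg0.ts")      # second client: served from the CDN cache
print(cdn.origin_hits)  # the origin was contacted only once
```

However many clients request the same segment, the origin serves it once; that is the load reduction the text refers to.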
Wowza Streaming Engine is an example of a media server, while Akamai and Amazon CloudFront are CDNs commonly used for the distribution component.
Fig 3 – Distribution Component
Client Software:
Client software is a player capable of playing an HLS stream in a native application or in an HLS-capable browser. Any player which supports HLS can be embedded into the application for live and on-demand playback.
A few examples are QuickTime Player, JW Player, ffplay/avplay, the Safari browser and many more.
How Player gets the stream from server:
When the player requests the stream, the server sends a playlist/manifest (master playlist) file which contains the details of all the streams. The master playlist also contains the information of the sub playlists/streams, which are encoded at different bitrates and resolutions. If you open these playlist (.m3u8) files in a text editor you'll be able to see all the details. Below are samples which will give you a complete picture of these playlists.
Master Playlist:
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000,RESOLUTION=624x352
624x352_800.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2600000,RESOLUTION=1280x720
1280x720_2600.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1200000,RESOLUTION=640x360
640x360_1200.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=450000,RESOLUTION=448x252
448x252_450.m3u8
The master playlist provides the address of each individual playlist/stream. In this master playlist you can see there are 4 streams present, encoded at different bitrates/bandwidths and resolutions. Here:
- EXTM3U: Extended M3U File/Playlist
- EXT-X-STREAM-INF: Indicates that the next URL in the playlist file identifies another playlist file.
- PROGRAM-ID: A decimal integer that uniquely identifies a particular presentation
- BANDWIDTH: A decimal integer in bits per second (the bitrate at which the video is encoded)
- RESOLUTION: Indicates the resolution of the video
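A master playlist like the one above can be parsed with a few lines of code. The sketch below handles only the attributes shown in this sample; a real parser should follow the full HLS/M3U8 specification (quoted attribute values, additional tags, and so on):

```python
# Simplified master-playlist parser (covers only the attributes in the
# sample above; not a complete M3U8 implementation).
MASTER = """#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000,RESOLUTION=624x352
624x352_800.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2600000,RESOLUTION=1280x720
1280x720_2600.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1200000,RESOLUTION=640x360
640x360_1200.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=450000,RESOLUTION=448x252
448x252_450.m3u8"""

def parse_master(text):
    streams, attrs = [], None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-STREAM-INF:"):
            # Split "KEY=VALUE,KEY=VALUE,..." into a dict of attributes.
            attrs = dict(kv.split("=", 1)
                         for kv in line.split(":", 1)[1].split(","))
        elif line and not line.startswith("#") and attrs is not None:
            # The URI line that follows the tag names the sub playlist.
            streams.append({"uri": line,
                            "bandwidth": int(attrs["BANDWIDTH"]),
                            "resolution": attrs.get("RESOLUTION")})
            attrs = None
    return streams

for s in parse_master(MASTER):
    print(s["bandwidth"], s["resolution"], s["uri"])
```

A tester can use a parser like this to assert that every advertised variant is reachable and that the BANDWIDTH values match what the encoder was configured to produce.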
Sub Playlist:
When a player chooses a playlist based on the available bandwidth, it fetches the sub playlist, which contains the details of the stream's segments. Below is an example of such a sub playlist.
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10, no desc
fileSequence0.ts
#EXTINF:10, no desc
fileSequence1.ts
#EXTINF:10, no desc
fileSequence2.ts
#EXTINF:10, no desc
fileSequence3.ts
#EXTINF:10, no desc
fileSequence4.ts
#EXTINF:10, no desc
fileSequence5.ts
#EXTINF:10, no desc
fileSequence6.ts
#EXTINF:10, no desc
fileSequence7.ts
#EXTINF:10, no desc
fileSequence8.ts
#EXTINF:10, no desc
fileSequence9.ts
#EXTINF:10, no desc
fileSequence10.ts
Fig 4 - Play List Relationship
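From a sub playlist like the one above, a tester can sanity-check the segment count and total advertised duration. This is a rough sketch over a reconstructed copy of the sample playlist:

```python
# Sanity check of a media (sub) playlist: count the segments and sum
# the EXTINF durations (the sample above has 11 segments of 10 s each).
MEDIA = "\n".join(["#EXTM3U",
                   "#EXT-X-TARGETDURATION:10",
                   "#EXT-X-MEDIA-SEQUENCE:0"] +
                  [f"#EXTINF:10, no desc\nfileSequence{i}.ts"
                   for i in range(11)])

def playlist_summary(text):
    durations, segments = [], []
    for line in text.splitlines():
        if line.startswith("#EXTINF:"):
            # Duration is the number between the colon and the comma.
            durations.append(float(line.split(":", 1)[1].split(",")[0]))
        elif line and not line.startswith("#"):
            segments.append(line)
    return len(segments), sum(durations)

count, total = playlist_summary(MEDIA)
print(count, total)  # 11 segments totalling 110.0 seconds
```

Checks like "no EXTINF exceeds EXT-X-TARGETDURATION" and "every EXTINF is followed by a segment URI" are easy to add on top of this.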
Testing Live Streaming Applications:
Testing live streaming or multimedia applications requires a proper lab setup. This includes a bandwidth controller; network sniffing tools such as Wireshark and Fiddler; application debugging tools such as adb and monitor for Android-based applications, and iTools, iTunes and Xcode for iOS-based applications; physical devices to test the application on; and an internet connection with the required bandwidth.
Testing Approach:
1. Functional Testing: We should first test the functionality of the application. The points below should be tested with respect to the player.
· Ability to Launch the application
· Ability to play/pause the video stream
· Ability to increase/decrease the volume
· Ability to see Closed Captions if implemented
· Ability to forward/rewind the stream
2. Testing Video Stream: Stream testing should be done to check the bitrate, buffer length and lag in video playback. One should test the points below for video streaming:
· How much time it takes to start the playback
· Lag of the video behind the live content
· What is the buffer fill
· Ability to play back the stream at variable bandwidth, if adaptive streaming is implemented
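Startup time, the first metric above, can be estimated offline from segment size and measured bandwidth. This is a back-of-the-envelope model under an assumed buffering policy (the player waits for three segments before starting), not a substitute for on-device measurement:

```python
# Back-of-the-envelope model of playback startup time. The policy of
# buffering N segments before playing is an assumption for this sketch.
def startup_time_s(segment_bytes, bandwidth_bps, segments_to_buffer=3):
    """Seconds to download the initial buffer at a steady bandwidth."""
    bits = segment_bytes * 8 * segments_to_buffer
    return bits / bandwidth_bps

# Example: 10-second segments of a 500 kbps stream are ~625 KB each.
seg = 500_000 * 10 // 8                 # bytes per segment at 500 kbps
print(startup_time_s(seg, 2_000_000))   # ~7.5 s on a steady 2 Mbps link
```

Comparing such an estimate with the measured time-to-first-frame helps separate network delay from player-side overhead.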
3. AV Sync: Audio-video sync, or lip sync, refers to the timing alignment between the audio (sound) and the video (images) during playback. It must be tested to deliver a quality streaming application; whenever we talk about a streaming application and video playback, AV sync comes into the picture.
You can test AV sync simply by watching and observing the playback, or you can use AV test videos, several of which are available on the internet. However, you cannot find exact out-of-sync conditions just by observing; to pinpoint issues you need to analyse your video with one of the available stream analysers.
4. CC: Closed captions (CC) are the subtitles you see during playback of a video. They need to be tested if they are implemented in your stream. To test this functionality you need to verify that the CC is in sync with the audio.
5. Profile Switching: This is the main thing to test if your stream supports adaptive bitrate. You will require a bandwidth controller capable of throttling your network bandwidth as required (you may use open source firmware such as OpenWrt or Tomato on Netgear/Buffalo routers; note that when flashing custom firmware you must follow the exact steps, as any mistake can brick your router), a network sniffing tool to check the switching of profiles, and debugging tools to capture any error conditions.
Below is an illustration of how adaptive bitrate works under variable bandwidth conditions. Suppose that when your player starts playback it measures a bandwidth of 500 kbps, so it plays the first chunk from the (B-500 kbps, R-480x270) stream.
Then your bandwidth suddenly increases to 2000 kbps, so the player switches to the second chunk of the (B-2000, R-1280x720) stream. The bandwidth does not change from 2000 kbps for the third chunk, so the player plays the third chunk from the same stream.
Then your bandwidth decreases to 300 kbps, so the player switches to the fourth chunk of the (B-300, R-240x144) stream.
This continues in the same manner: the stream is switched between profiles according to the available bandwidth.
Fig 5 – Stream Switching
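The walkthrough above can be replayed in a few lines. This toy simulation uses the same hypothetical profiles and bandwidth trace as the text; it is a sketch of the switching behaviour, not a real player:

```python
# Toy replay of the profile-switching walkthrough. The profiles and the
# per-chunk bandwidth trace are the hypothetical numbers from the text.
PROFILES = {300: "240x144", 500: "480x270", 2000: "1280x720"}  # kbps -> resolution

def choose_profile(bandwidth_kbps):
    """Highest profile whose bitrate does not exceed the bandwidth."""
    fits = [b for b in PROFILES if b <= bandwidth_kbps]
    return max(fits) if fits else min(PROFILES)

bandwidth_trace = [500, 2000, 2000, 300]  # measured kbps before each chunk
for chunk, bw in enumerate(bandwidth_trace, start=1):
    b = choose_profile(bw)
    print(f"chunk {chunk}: bandwidth {bw} kbps -> B-{b} R-{PROFILES[b]}")
```

During testing, the profile actually fetched for each chunk (visible in a network sniffer as the segment URI) should match what a rule like this predicts for the throttled bandwidth.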