Video streaming with Azure and .NET
Here are the lessons learned from using Azure Media Services for implementing the video streaming feature in one of our projects: what we really enjoyed, and what we were puzzled with.
The Jisp project, developed by WaveAccess, has a feature for uploading and streaming videos. Video streaming is a common task, but in our case it has some tricky parts, such as preprocessing and maintaining streaming quality and format. On top of that, the video must play properly on all popular platforms, namely Android, iOS, and desktop web.
What is Jisp
Jisp is an award-winning application that connects buyers with online and offline retailers. When installed on a customer’s mobile device, it informs them about discounts and special offers, stores their browsing history, and builds a customer profile for retailers. The app utilises standard iBeacon and NFC technologies as well as a proprietary solution for accelerometer-enabled beacons, which lets offline stores track customers’ product interests and shopping behavior and use that data to increase sales and store throughput. From the customer’s perspective, it is a great way to quickly get product details on their phone screen, see whether there are any special offers on an item, and even complete the shopping journey with self-checkout via JispPAY. Jisp also offers a kind of social network for shopping lovers to share videos, interesting goods, and tips.
In 2017, Jisp won the Microsoft Partner Award in Business Analytics for implementing a machine learning module that builds a customer profile from their online and offline shopping choices and generates personalised offers.
About Azure Media Services
To solve this task, we used Azure Media Services (AMS). The backend uses .NET. In addition to usual video streaming specifications, there were business-specific app requirements:
- Users must be able to upload their videos and tag them with a specific geographical location
- The full video must be available only in that specific location; elsewhere, viewers see only a short preview defined by the uploader
- Besides that, the video must be cropped to a square screen format
Accessibility setup
To start uploading videos, the storage must be set up and configured. There are two types of accounts: a Media Services account and an Azure Storage account.
Media service account
This account is responsible for video metadata storage. When creating this account, the following parameters must be defined:
- Resource group — to set up groups to monitor and control access to assets
- Location — physical location of the data center where the resource is hosted
- Storage account — for asset files storage
Storage account
This account type is responsible for file storage — source video file storage, in our case.
Authentication
After creating the accounts, authentication must be set up. Azure Media Services uses authentication via Azure Active Directory, and there are two options:
- User authentication
- Service principal authentication
The first option, user authentication, makes sense if there are no intermediate authentication services — for example, if there is a desktop app that connects directly to Azure Media Services (AMS). In this case we will request the user’s login and password for Azure Active Directory.
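For illustration, user authentication with the AMS .NET SDK looks roughly like this — the SDK prompts the user interactively for their Azure AD credentials. The tenant domain and API endpoint below are placeholders, not values from our project:

```csharp
// User (interactive) authentication: the SDK prompts for AAD credentials.
// "yourtenant.onmicrosoft.com" and the API URL are placeholders.
var credentials = new AzureAdTokenCredentials(
    "yourtenant.onmicrosoft.com",
    AzureEnvironments.AzureCloudEnvironment);
var tokenProvider = new AzureAdTokenProvider(credentials);
var context = new CloudMediaContext(
    new Uri("https://youraccount.restv2.westeurope.media.azure.net/api/"),
    tokenProvider);
```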
User account authentication
The second option, service principal authentication, is convenient if we have our own service for user authentication in the app — for example, if we have a mobile or web app with its own backend service. In this case, the AMS keys are stored on the server, and user authentication becomes part of the application’s business logic.
Authentication via Service Principal
Jisp uses the second option.
Having accounts and authentication all set, we can start integrating AMS with the .NET SDK. The main object for working with AMS entities is CloudMediaContext. Below is the code to initialize CloudMediaContext:
// Service principal authentication using the AAD application's ID and key
var azureKey = new AzureAdClientSymmetricKey(clientId, clientKey);
var credentials = new AzureAdTokenCredentials(tenant, azureKey, AzureEnvironments.AzureCloudEnvironment);
var tokenProvider = new AzureAdTokenProvider(credentials);
// url is the AMS REST API endpoint shown in the Azure portal
var context = new CloudMediaContext(new Uri(url), tokenProvider);
Assets
With CloudMediaContext initialized, we can start uploading videos. In AMS, an Asset is an entity that contains digital files (video, audio, images, thumbnail collections, text tracks, and closed caption files) and the metadata about these files. First, we need to create an asset.
var inputAsset = context.Assets.Create(assetName, storageName, AssetCreationOptions.None);
Then we add our source video to an asset.
var assetFile = inputAsset.AssetFiles.Create(fileName);
assetFile.Upload(filePath); // creating the AssetFile only registers it; Upload moves the actual bytes into storage
Then we need to modify it for adaptive streaming:
var job = context.Jobs.CreateWithSingleTask(
    "Media Encoder Standard",
    "Adaptive Streaming",
    inputAsset,
    "Adaptive Bitrate MP4",
    AssetCreationOptions.None);
job.Submit();
job = job.StartExecutionProgressTask(
    j =>
    {
        Console.WriteLine("Job state: {0}", j.State);
        Console.WriteLine("Job progress: {0:0.##}%", j.GetOverallProgress());
    },
    CancellationToken.None).Result;
Console.WriteLine("Transcoding job finished.");
var outputAsset = job.OutputMediaAssets[0];
After the video is ready, we can publish it using a Locator:
context.Locators.Create(LocatorType.OnDemandOrigin, outputAsset, AccessPermissions.Read, TimeSpan.FromDays(365));
Then we can play our video.
var videoUri = outputAsset.GetMpegDashUri();
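Since the same asset must play on Android, iOS, and the web, it can be useful to retrieve several protocol variants from the same published locator; these extension methods ship with the same SDK:

```csharp
// One dynamically packaged asset can be served over several protocols:
var dashUri = outputAsset.GetMpegDashUri();          // MPEG-DASH, e.g. for web players
var hlsUri = outputAsset.GetHlsUri();                // HLS, e.g. for iOS devices
var smoothUri = outputAsset.GetSmoothStreamingUri(); // Smooth Streaming
```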
Filters
To modify videos on the fly, AMS offers a filtering mechanism. Filters let clients select video and audio quality, resolution, and codecs, and trim playback to a time range.
There are two types of filters:
- Local filters are filters that can be applied only to the video they were created for. In other words, a separate local filter can be created for each video.
- Global filters are those that can be applied to any video in the given account. They conveniently enable video streaming in a number of formats for different devices.
Below is an example of a filter that limits video playing to a certain interval. It is a way to “crop” a video without having to modify the original file.
var filterName = $"{length}seconds";
outputAsset.AssetFilters.Create(
    filterName,
    // PresentationTimeRange uses 100-ns units, hence 10,000,000 ticks per second
    new PresentationTimeRange(start: 0, end: Convert.ToUInt64(10000000 * length)),
    new List<FilterTrackSelectStatement>());
To apply the created filters, we need to modify the video URL in the following way:
Source URL:
testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf)
URL after applying the 10seconds filter:
testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf,filter=10seconds)
If several filters are applied, the URL looks like this:
testendpoint-testaccount.streaming.mediaservices.windows.net/fecebb23-46f6-490d-8b70-203e86b0df58/BigBuckBunny.ism/Manifest(format=mpd-time-csf,filter=10seconds;square)
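As a sketch, applying filters is plain string manipulation on the manifest URL; AppendFilters below is a hypothetical helper of ours, not part of the AMS SDK:

```csharp
// Hypothetical helper: rewrites ".../Manifest(format=mpd-time-csf)" into
// ".../Manifest(format=mpd-time-csf,filter=a;b)". Not an SDK method.
static string AppendFilters(string manifestUrl, params string[] filters)
{
    if (filters.Length == 0) return manifestUrl;
    return manifestUrl.TrimEnd(')') + ",filter=" + string.Join(";", filters) + ")";
}

// Usage: AppendFilters(dashUrl, "10seconds", "square")
```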
Scalability
Like any cloud service, Azure Media Services offers great scalability.
As for video preprocessing, AMS has the following options:
- Scaling up, which in this case means faster video processing. There are three rates with 1x, 2x, and 4x capacities respectively. If video processing time is not important, the minimal 1x rate can be chosen. If processing time is crucial and has to be minimized, the 4x rate is the best option.
- Scaling out, meaning the parallel processing of several videos. This is easily achieved by setting the necessary number of Media processing units.
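Both kinds of scaling can also be driven from code via the SDK's encoding reserved units; the values below are illustrative, not our production settings:

```csharp
// Scale encoding: pick the reserved unit type (the rate) and the number
// of units, i.e. how many jobs can run in parallel. Illustrative values.
var reservedUnit = context.EncodingReservedUnits.FirstOrDefault();
reservedUnit.ReservedUnitType = ReservedUnitType.Premium;
reservedUnit.CurrentReservedUnits = 4; // up to 4 jobs in parallel
reservedUnit.Update();
```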
As for video streaming, AMS can adjust output quality automatically if necessary. Standard and premium tiers are also available: the standard tier has a maximum capacity of 600 Mbps, while the premium tier allows up to 200 Mbps per streaming unit.
Speed
As an example, let’s compare the processing times for 30-second, 1-minute, and 5-minute videos in 720p.
| Rate | 30 sec (4.4 MB) | 1 min (8.1 MB) | 5 min (44.9 MB) |
|------|-----------------|----------------|-----------------|
| x1 | 1:32 (49 KB/s) | 2:14 (62 KB/s) | 8:38 (89 KB/s) |
| x2 | 1:07 (68 KB/s) | 1:31 (91 KB/s) | 5:07 (150 KB/s) |
| x4 | 0:33 (137 KB/s) | 0:45 (185 KB/s) | 1:49 (422 KB/s) |
For shorter videos, the effect of the capacity increase is less significant, especially at the x2 rate. But as video length grows, the x2 and x4 rates save a lot of processing time.
Pitfalls
All technologies have their drawbacks, and Azure Media Services is no exception. During development, we had to overcome some limitations to deliver the project on schedule.
The first difficulty we faced was the delay between uploading a video and it being ready for streaming. Ideally, the video would be ready for streaming right after upload, but in practice it rarely is. While AMS allows getting a URL for the source video and streaming it, processing (which in our case also means preparing asset files) takes considerably longer than simply uploading the video to Azure Storage, bypassing AMS. For this reason, we decided to upload the source video to Azure Storage and stream it directly while it is being processed, then switch to streaming the processed asset once it is ready.
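That fallback can be sketched as follows; Video, IsEncodingFinished, and GetSourceBlobUrl are hypothetical helpers around our own persistence, not SDK members:

```csharp
// Hypothetical sketch: stream the raw blob until the AMS job completes.
string GetStreamingUrl(Video video)
{
    if (!IsEncodingFinished(video))
        return GetSourceBlobUrl(video); // progressive playback of the source file

    // adaptive streaming of the processed asset
    return video.OutputAsset.GetMpegDashUri().ToString();
}
```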
The second challenge was changing the aspect ratio. In our case, AMS did not change the aspect ratio to 1:1 correctly: although the video itself was converted to a square, the metadata said otherwise, so the thumbnail image was broken. We fixed this by using FFmpeg in conjunction with AMS.
Another problem we had to deal with was controlling the length of the played sample. By default, the minimum trimming step in AMS was 6 seconds. Through advanced configuration, we managed to enable a 2-second step, which was acceptable.
These issues aside, Azure Media Services is a fully functional video streaming solution, serving many clients.
Let us tell you more about our projects!
Contact us:
hello@wave-access.com