Creating a streaming video solution with AWS

Getting streaming video right is hard. Really hard! In hindsight, if I had to do this over again I wouldn’t try to roll my own; I’d jump straight onto a service like Vimeo.

But I have a phobia of taking the easy route for anything, so of course I decided to roll my own. Well, not 100% my own: I did at least utilise existing AWS services.

If you’re crazy like me, this will fill in some of the gaps in the AWS documentation. If you’re sane, and don’t want to be bombarded with users’ playback problems, then utilise an existing service.

What I needed to provide for this project was a secure way to stream videos. It needed to cater for my users sharing one or more individual videos with one or more of their clients (or viewers).

Step 1: Setup the S3 Buckets

First, jump into the S3 console and set up the buckets you’re going to use. I created one bucket with two folders: one for uploads that would contain the source video files (conveniently called uploads) and another for the outputs from MediaConvert, which I called outputs.

The whole bucket was set to Block all public access; one of the main requirements was to protect the video sources. I was going to use CloudFront to serve these final files up.
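The same bucket setup can be sketched with the AWS CLI — the bucket name, region, and file name below are placeholders, not values from my setup:

```shell
# Create the bucket (name and region are placeholders)
aws s3 mb s3://my-video-bucket --region us-east-1

# Block all public access at the bucket level
aws s3api put-public-access-block \
  --bucket my-video-bucket \
  --public-access-block-configuration \
  "BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"

# "Folders" in S3 are really just key prefixes; they appear when the
# first object is uploaded under them
aws s3 cp source.mp4 s3://my-video-bucket/uploads/source.mp4
```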

I also needed to set a CORS policy to allow access only from the domains that I would be serving video content from. There’s a convenient CORS policy section in the bucket config for this:


[
    {
        "AllowedHeaders": [
            "*"
        ],
        "AllowedMethods": [
            "GET"
        ],
        "AllowedOrigins": [
            "https://alloweddomain1.com",
            "https://alloweddomain2.com"
        ],
        "ExposeHeaders": []
    }
]

I blew a bunch of time fighting with CORS.
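The policy can also be applied from the CLI — this assumes the JSON above is saved as a file I’m calling cors.json, with the bucket name again a placeholder:

```shell
aws s3api put-bucket-cors \
  --bucket my-video-bucket \
  --cors-configuration file://cors.json
```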

Step 2: Setup the CloudFront Distribution

Since the S3 bucket is private, CloudFront will be used to serve the final video resources to viewers.

Set up a new CloudFront Distribution from the AWS console. As part of the setup, CloudFront will provide a *.cloudfront.net domain name you can use, or you can set up your own domain. I went down the latter path.

You can also set the Origin that CloudFront will use to load resources from. In this case, point it at the outputs folder you created in the S3 bucket at the start.

The area that took me the most time was configuring the behaviours. I needed to set the allowed methods to GET, HEAD and OPTIONS, set the Origin Request Policy to Managed-CORS-S3Origin, and set Restrict Viewer Access to Yes, restricting access based on signed cookies. This was another area that caused a lot of trial and error, so I strongly suggest browsing in incognito mode to stop old cookies making it harder to test.
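For reference, the relevant parts of the cache behaviour can be sketched as a fragment of the distribution config. The field names follow the CloudFront API; the key group ID and the managed Origin Request Policy ID are placeholders you’d look up in your own account:

```json
{
  "AllowedMethods": {
    "Quantity": 3,
    "Items": ["GET", "HEAD", "OPTIONS"],
    "CachedMethods": { "Quantity": 2, "Items": ["GET", "HEAD"] }
  },
  "TrustedKeyGroups": {
    "Enabled": true,
    "Quantity": 1,
    "Items": ["KEYGROUPID"]
  },
  "OriginRequestPolicyId": "ID-of-Managed-CORS-S3Origin"
}
```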

Step 3: Setup the Signed Cookies

The next step was to implement signed cookies to allow access to the transcoded videos. I used signed cookies because I wanted to:

- restrict access to the video content.
- have some control over the device that was accessing the content.
- allow easy access to multiple resources once the viewer was authenticated.
- prevent signed URLs from being shared directly.
- stop allowing access after 3 months.

To set up a signed cookie, the public key first needs to be added to CloudFront using the CloudFront Key Management section.
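CloudFront signed cookies use a 2048-bit RSA key pair. Generating one with OpenSSL can be sketched as follows — the file names are my own choice:

```shell
# Generate a 2048-bit RSA private key (keep this out of version control)
openssl genrsa -out private_key.pem 2048

# Extract the matching public key; this is what gets pasted into the
# CloudFront public key / key group configuration
openssl rsa -pubout -in private_key.pem -out public_key.pem
```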

Once that was set up, I used the AWS SDK for PHP to create the signed cookie using the private key:

use Aws\CloudFront\CloudFrontClient;

// Create the CloudFront client
$c = new CloudFrontClient([
    'profile' => 'default',
    'version' => '2014-11-06',
    'region'  => 'us-east-1'
]);

// Expire access 3 months from now, as a UTC epoch timestamp
$expires = date_create()->modify("+3 months")->setTimezone(new DateTimeZone("UTC"))->format("U");

// The CloudFront resource prefix this viewer may access
$k = 'the resource to allow access to';

$p = <<<POLICY
{
    "Statement": [
        {
            "Resource": "{$k}*",
            "Condition": {
                "DateLessThan": {"AWS:EpochTime": {$expires}}
            }
        }
    ]
}
POLICY;

$cookies = $c->getSignedCookie([
    "policy"      => $p,
    "private_key" => "pathtoprivatekey.pem",
    "key_pair_id" => "KEYPAIRID",
]);

I had an intermediate page that an authenticated viewer visited after logging in and before accessing the videos. On this page a cookie was created and added to the viewer’s browser that covered any resources they were able to access. This also enabled me to ensure the cookie did not include videos the viewer no longer had access to.
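getSignedCookie() returns an array of name/value pairs (CloudFront-Policy, CloudFront-Signature and CloudFront-Key-Pair-Id) that need to be sent to the browser. A sketch of turning them into Set-Cookie headers — the helper name and domain are hypothetical, and calling setcookie() with an options array would work equally well:

```php
// Build Set-Cookie header strings for the signed-cookie components.
// Scoping Domain to the CloudFront domain (or a shared parent domain)
// is what lets the browser send the cookies back with video requests.
function buildSignedCookieHeaders(array $cookies, string $domain): array
{
    $headers = [];
    foreach ($cookies as $name => $value) {
        $headers[] = sprintf(
            '%s=%s; Domain=%s; Path=/; Secure; HttpOnly',
            $name,
            $value,
            $domain
        );
    }
    return $headers;
}

// Usage on the intermediate page:
// foreach (buildSignedCookieHeaders($cookies, '.example.com') as $h) {
//     header('Set-Cookie: ' . $h, false);
// }
```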

Step 4: The Viewing Page

Each time a viewer accessed a video I redirected them to a media viewing page. This just included some information about the video and then linked to the m3u8 playlist file provided as part of the transcoding process.

Since HLS is not yet supported natively across all browsers, I used the hls.js library from GitHub.

This was surprisingly straightforward to implement. The biggest mistake I made was trying hls.js first and only falling back to the HTML video element. This broke Safari on iPhone, which supports HLS natively but isn’t supported by hls.js (iOS Safari lacks the Media Source Extensions that hls.js relies on). The fix is to check for native HLS support first. There’s more information about this on the hls.js site.
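A sketch of getting the order of checks right — the helper name is mine, and the commented browser usage assumes hls.js is loaded on the page:

```javascript
// Prefer native HLS (Safari/iOS), then fall back to hls.js for browsers
// with Media Source Extensions; otherwise the stream is unplayable.
function choosePlayback(canPlayNativeHls, hlsJsSupported) {
  if (canPlayNativeHls) return 'native';
  if (hlsJsSupported) return 'hlsjs';
  return 'unsupported';
}

// Browser usage (assumes a <video id="video"> element and hls.js):
// const video = document.getElementById('video');
// const strategy = choosePlayback(
//   video.canPlayType('application/vnd.apple.mpegurl') !== '',
//   window.Hls && Hls.isSupported()
// );
// if (strategy === 'native') {
//   video.src = playlistUrl;
// } else if (strategy === 'hlsjs') {
//   const hls = new Hls();
//   hls.loadSource(playlistUrl);
//   hls.attachMedia(video);
// }
```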

In Summary

AWS provides a great suite of tools to stream media to a viewer’s browser in a secure and controlled manner, and they can easily be built on to implement custom access logic. Browser support for streaming media is patchy, but the hls.js project makes this easy to cover.

The biggest problem I’ve had since implementing this solution is dealing with playback issues caused by viewers’ internet connections and devices. Ideally, the transcoding step should create specific renditions for HD, SD, and mobile devices for better playback. It would also be nice to have the playback bitrate adjust automatically based on the connection.