Generally, I’ve found that the most common way to upload files to Microsoft Azure Storage is to make a request to the web server we’ve developed and, once the server has the file, start the upload to the Storage account. If the file is small, this can be enough. However, if you want to upload items of considerable size, you can improve the process by uploading files directly from the client side to the Storage account. To do this we must follow a few steps: enable CORS for our Storage account, generate a token to get access and, finally, upload the file through an AJAX request.
Enable CORS for Blobs
Some time ago, enabling CORS for an Azure Storage account was not possible, and the workaround was to place the HTML code in the Storage account itself, so that both were under the same origin. Now it’s possible to enable CORS for Blobs and Tables, as follows:
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Linq;
using System.Web;

namespace UploadBlobsJavaScript
{
    public class StorageCORSConfig
    {
        public static void RegisterDomains()
        {
            //1. Install-Package WindowsAzure.Storage
            //2. Get Storage context
            var account = CloudStorageAccount.Parse(
                ConfigurationManager.ConnectionStrings["StorageAccount"].ConnectionString);

            //3. Create a blob client
            var blobClient = account.CreateCloudBlobClient();

            //4. Get the current service properties
            ServiceProperties blobServiceProperties = blobClient.GetServiceProperties();

            //5. Create a new CORS properties configuration
            blobServiceProperties.Cors = new CorsProperties();

            //6. Add CORS rules
            blobServiceProperties.Cors.CorsRules.Add(new CorsRule()
            {
                AllowedHeaders = new List<string>() { "*" },
                AllowedMethods = CorsHttpMethods.Put | CorsHttpMethods.Get
                               | CorsHttpMethods.Head | CorsHttpMethods.Post,
                AllowedOrigins = new List<string>() { "http://corsazurestorage.azurewebsites.net" },
                ExposedHeaders = new List<string>() { "*" },
                MaxAgeInSeconds = 1800 // 30 minutes
            });

            //7. Save the new service properties
            blobClient.SetServiceProperties(blobServiceProperties);
        }
    }
}
Basically, what we do is retrieve the service properties for our blobs and create a new configuration for the CORS property. It is based on rules with which we can determine which origins are allowed to make requests and which HTTP verbs are available. In this example I only allow access from http://corsazurestorage.azurewebsites.net.
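Rules are additive, so you can register more than one. As a rough sketch (the second origin below is hypothetical, not part of my configuration), a stricter rule could pin down the allowed headers and verbs instead of using wildcards:

// A second, stricter rule: only the headers the upload actually sends
// are allowed, and only PUT is permitted. The origin is a hypothetical
// example for illustration.
blobServiceProperties.Cors.CorsRules.Add(new CorsRule()
{
    AllowedHeaders = new List<string>() { "x-ms-blob-type", "content-type" },
    AllowedMethods = CorsHttpMethods.Put,
    AllowedOrigins = new List<string>() { "https://example.contoso.com" },
    ExposedHeaders = new List<string>() { "x-ms-request-id" },
    MaxAgeInSeconds = 600 // 10 minutes
});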
This method is called from Global.asax so that the rules are applied from the start:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Http;
using System.Web.Mvc;
using System.Web.Optimization;
using System.Web.Routing;

namespace UploadBlobsJavaScript
{
    public class MvcApplication : System.Web.HttpApplication
    {
        protected void Application_Start()
        {
            AreaRegistration.RegisterAllAreas();
            GlobalConfiguration.Configure(WebApiConfig.Register);
            FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
            RouteConfig.RegisterRoutes(RouteTable.Routes);
            BundleConfig.RegisterBundles(BundleTable.Bundles);

            // Register the CORS rules for the Storage account at startup
            StorageCORSConfig.RegisterDomains();
        }
    }
}
A quick way to check that CORS is enabled and that the rule you’ve created has been applied is to use Azure Storage Explorer:
Select the Blob Containers (N) node and the CORS option will appear with a check mark. If we click on it we can see, edit and delete the rules:
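If you prefer to verify the rules from code rather than from Azure Storage Explorer, a minimal sketch (reusing the same connection string as before) can read the service properties back and print each rule:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;
using System;
using System.Configuration;

public static class StorageCORSCheck
{
    public static void PrintRules()
    {
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.ConnectionStrings["StorageAccount"].ConnectionString);
        var blobClient = account.CreateCloudBlobClient();

        // Read back the current service properties, including the CORS rules
        ServiceProperties properties = blobClient.GetServiceProperties();

        foreach (var rule in properties.Cors.CorsRules)
        {
            Console.WriteLine("Origins: {0}", string.Join(",", rule.AllowedOrigins));
            Console.WriteLine("Methods: {0}", rule.AllowedMethods);
            Console.WriteLine("Max age: {0}s", rule.MaxAgeInSeconds);
        }
    }
}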
Retrieve a Shared Access Signature for the new blob
In order to access the account resources, it is necessary to generate a token that will be attached to the request:
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using System;
using System.Configuration;
using System.Globalization;
using System.Web.Http;

namespace UploadBlobsJavaScript.Controllers
{
    public class StorageController : ApiController
    {
        public string GetSaS(string containerName, string blobName)
        {
            //1. Nuget: Install-Package WindowsAzure.Storage
            //2. Get context account
            var account = CloudStorageAccount.Parse(
                ConfigurationManager.ConnectionStrings["StorageAccount"].ConnectionString);

            //3. Create a blob client
            var blobClient = account.CreateCloudBlobClient();

            //4. Get a container and create it if it does not exist
            var container = blobClient.GetContainerReference(containerName);
            container.CreateIfNotExists();

            //5. Get a blob reference
            CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

            //6. Create a Shared Access Signature for the blob
            var SaS = blob.GetSharedAccessSignature(
                new SharedAccessBlobPolicy()
                {
                    Permissions = SharedAccessBlobPermissions.Write,
                    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(30),
                });

            return string.Format(CultureInfo.InvariantCulture, "{0}{1}", blob.Uri, SaS);
        }
    }
}
In this code we have a method called GetSaS, which receives a container name and the name of the blob we want to create. Once we have the blob reference, we generate the token. In this case it only allows writing during the next 30 minutes.
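The value returned by GetSharedAccessSignature is just a query string, which is why the method concatenates it to the blob URI; the result looks roughly like https://&lt;account&gt;.blob.core.windows.net/&lt;container&gt;/&lt;blob&gt;?sv=...&amp;sig=.... One detail worth knowing: if the clocks of the client and the Storage service drift, a token can be rejected as not yet valid. A common sketch (not part of the original code) is to backdate the start time slightly:

// A slightly more defensive policy: backdating the start time by a few
// minutes tolerates clock skew between this server and the Storage service.
var SaS = blob.GetSharedAccessSignature(
    new SharedAccessBlobPolicy()
    {
        Permissions = SharedAccessBlobPermissions.Write,
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5),
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(30),
    });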
Upload files from JavaScript
To check the configuration with an example, I used the following page:
@{
    ViewBag.Title = "Home Page";
}

<div class="jumbotron">
    <h2>Upload files to Microsoft Azure Storage using JavaScript</h2>
</div>
<div class="container">
    <div class="row">
        <div class="form-group">
            <label for="ContainerName">Container name:</label>
            <input type="text" class="form-control" id="ContainerName" placeholder="Enter a container name">
        </div>
        <div class="form-group">
            <label for="Files"></label>
            <input type="file" id="fileControl" multiple />
            <progress id="uploadProgress" class="form-control" value="0" max="100"></progress>
        </div>
        <div class="form-group">
            <input type="button" id="btnUpload" value="Upload files" />
        </div>
    </div>
</div>

@section scripts{
    <script>
        function upload(file, type, url) {
            var ajaxRequest = new XMLHttpRequest();

            ajaxRequest.onreadystatechange = function (aEvt) {
                console.log(ajaxRequest.readyState);
                if (ajaxRequest.readyState == 4)
                    console.log(ajaxRequest.responseText);
            };

            ajaxRequest.upload.onprogress = function (e) {
                var percentComplete = (e.loaded / e.total) * 100;
                console.log(percentComplete + "% completed");
                uploadProgress.value = percentComplete;
            };

            ajaxRequest.onerror = function () {
                alert("ajaxRequest error");
            };

            ajaxRequest.open('PUT', url, true);
            ajaxRequest.setRequestHeader('Content-Type', type);
            ajaxRequest.setRequestHeader('x-ms-blob-type', 'BlockBlob');
            ajaxRequest.send(file);
        }

        $("#btnUpload").click(function () {
            var files = fileControl.files;
            for (var i = 0, file; file = files[i]; i++) {
                var reader = new FileReader();
                reader.onloadend = (function (theFile) {
                    return function (e) {
                        $.ajax({
                            type: 'GET',
                            url: '/api/storage/getsas?containerName=' + $("#ContainerName").val() + '&blobName=' + theFile.name,
                            success: function (res, status, xhr) {
                                upload(e.target.result, theFile.type, res);
                            },
                            error: function (res, status, xhr) {
                                alert("Can't get the Shared Access Signature");
                            }
                        });
                    };
                })(file);
                reader.readAsArrayBuffer(file);
            }
        });
    </script>
}
Here I get the name of the container, the files to upload (several can be selected at once, if I want to) and a button to start the process. In the JavaScript code I read each of the selected files and make a call to /api/storage/getsas for each one, with the name of the container and the blob. Once I get the URL, I call the upload method, which handles several XMLHttpRequest events and performs the request through the PUT verb with the Content-Type and x-ms-blob-type headers and the file.
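If you want to test the SAS URL without going through the browser, a minimal C# sketch can reproduce the same PUT request that the JavaScript code makes (the file path is whatever local file you want to try; the class and method names here are just for illustration):

using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class SasUploadTest
{
    // sasUrl is the value returned by /api/storage/getsas
    public static async Task UploadAsync(string sasUrl, string filePath)
    {
        using (var client = new HttpClient())
        {
            var content = new ByteArrayContent(File.ReadAllBytes(filePath));
            content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

            var request = new HttpRequestMessage(HttpMethod.Put, sasUrl)
            {
                Content = content
            };
            // Same header the JavaScript upload sets
            request.Headers.Add("x-ms-blob-type", "BlockBlob");

            HttpResponseMessage response = await client.SendAsync(request);
            Console.WriteLine(response.StatusCode); // 201 Created on success
        }
    }
}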
Hope this helps.
Cheers!