John Patrick Dandison for The 425 Show


Connecting to Azure blob storage from React using Azure.Identity!

take me straight to the code!

On stream, after our great talk with Simon Brown, we decided to dig into building a fully client-side app that connects to Azure Blob Storage.

What is blob storage?

Just that - storage for blobs of data, big and small. Historically it stood for 'Binary Large OBject', although that term was mostly used in SQL circles for data stored in databases. Regardless of the origin, blob storage (the AWS equivalent is S3) is a staple of modern apps.

Azure Blob storage has some unique features that make designing apps even easier. For example - a standard storage account supports egress of up to 50 Gb/s! That's 50 Gb/s that your app server/platform doesn't have to handle on its own.

This works for upload too - standard storage in the US has a max ingress of 10 Gb/s. Clients uploading or downloading directly to and from storage accounts can have a massive impact on your app's design, cost and scalability.

We've seen customers leverage this over the years - for example, streaming large media assets (think videos, pictures, datasets) from blob storage directly to clients instead of proxying through your app server.

Take this scenario - I want to share videos and pictures with people I work with, or with the internet as a whole. Previously, I would have had some storage - a network share, NAS device - and my app server would have some sort of API exposed to access that data. My app would have to send and receive data from clients, which meant my app servers would need enough bandwidth for pushing and pulling all that data around.

By using storage directly, my servers and APIs can direct clients to upload and download directly from storage, significantly reducing compute bandwidth requirements, with the benefit of a worldwide footprint of storage locations.

But how do we ensure secure access?

Historically, we used shared access signatures (SAS) with storage: time- and operation-limited URLs with a signature for validation. For example - I'd like Read access to https://storageaccount.blob.core.windows.net/container/blob1.mp4 for the next 60 seconds - this generates a URL with some parameters, which is signed with the master storage account key; the signature is then tacked onto the end of the URL. We share that URL with whatever client needs to perform the operation.

This was cool, except it meant we needed some server-side API or web server to store and manage the master account key, since we can't send it directly to the client.
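
For illustration, here's roughly what minting that SAS looks like on the server with the v12 SDK - a minimal Node/TypeScript sketch with placeholder account, container and blob names, not code from the original post. It's exactly the piece that has to live behind an API, because it needs the account key:

import { BlobSASPermissions, generateBlobSASQueryParameters, StorageSharedKeyCredential } from "@azure/storage-blob";

// the account key never leaves the server - that's the whole point
const credential = new StorageSharedKeyCredential("storageaccount", "<account key from the portal>");

// "Read access to blob1.mp4 for the next 60 seconds"
const sas = generateBlobSASQueryParameters({
    containerName: "container",
    blobName: "blob1.mp4",
    permissions: BlobSASPermissions.parse("r"),
    expiresOn: new Date(Date.now() + 60 * 1000)
}, credential).toString();

// this is the URL we'd hand to the client
const sasUrl = `https://storageaccount.blob.core.windows.net/container/blob1.mp4?${sas}`;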

Enter Azure AD & Storage Blob Data RBAC

If you're familiar with Azure, you know there are two distinct 'planes' - the control plane (the management interface) and the data plane (the actual resource data). I like to think of it as the difference between being able to deploy a VM vs actually having credentials to RDP or SSH into it.

role assignment (IAM) blade for an Azure resource
If you've seen this screen before, you've used Azure RBAC

All Azure resources have some degree of control plane role-based access control - things like 'Resource group owner' or 'Resource group reader' - that allow management operations on those resources. Over time, more and more data plane operations have been added, so we can use Azure RBAC to control both who can manage the resource and who has access to the resource or its data itself. The advantage here is furthering the 'least privilege' mantra - a storage account key is the key to the proverbial castle, so if we can limit which operations an identity can perform on an ongoing basis, we limit the blast radius of any bad actors.

Storage has roles specifically for connecting to the account's data plane - to blobs, for example. In the IAM/role assignments blade for the storage account, note the 'Storage Blob Data...' roles. These give Azure AD accounts (users and service principals) access to the blobs directly.

the storage data roles in the role assignment pane
The different storage-related roles

We're going to use this to build our client-side blob reader app.

Bill of Materials

We're going to:

  • deploy a storage account to Azure
  • add a user to the Storage Blob Data Reader role
  • register an app in Azure AD to represent our React app
  • create a quick-and-dirty React app
  • add the Azure Identity dependencies
  • authenticate the user and list out our blobs

Setting up our blob storage account

Want to use the CLI but don't have it set up yet? Try Azure Cloud Shell straight from your browser, or read here on getting it installed for your platform.

CLI for a standard, LRS, v2 storage account:



az storage account create --name somednssafename --resource-group some-resource-group-name --kind StorageV2 --sku Standard_LRS --location eastus



First, create a blob storage account in Azure. General Purpose v2 is fine for what we're building. I use Locally-redundant storage (LRS) for my account, but pick what's best based on your requirements.

azure storage creation pane 1
Storage account names are pretty particular

Once it's created (it may take a moment or two), go to the IAM blade of your storage account. Here we need to add a role assignment of Storage Blob Data Reader for the user you're going to sign in with - this could be yourself or a test account. Start by clicking 'Add role assignment', which should open a side pane. There we'll choose 'Storage Blob Data Reader' and the user to whom you're allowing access. Make sure to click Save at the bottom. (If you'd rather script this step, there's a CLI equivalent just after the screenshot below.)

add role assignment pane for blob storage
Choose Storage Blob Data Reader
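
If you prefer the CLI, a role assignment can be created with a command along these lines - a sketch, not from the original post; substitute your own subscription ID, user, and the resource group/account names used earlier:

az role assignment create \
  --role "Storage Blob Data Reader" \
  --assignee someone@yourtenant.onmicrosoft.com \
  --scope "/subscriptions/<subscription-id>/resourceGroups/some-resource-group-name/providers/Microsoft.Storage/storageAccounts/somednssafename"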

Now let's add some test data. We used some images, but you can use whatever files you want. First, under Containers in the side menu, add a new container, making sure to leave it as Private. Public will open that container to the internet with no authentication, so be careful here!

create container - name, private

Once you've created your container, click it and upload a few files directly from the web interface - it doesn't really matter what they are.
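
If you'd rather seed the test data from code instead of the portal, here's a minimal Node/TypeScript sketch - my own illustration, not part of the post's React app. Note that writing blobs needs the Storage Blob Data Contributor role (Reader isn't enough), and the hello.txt name is just an example:

import { DefaultAzureCredential } from "@azure/identity";
import { BlobServiceClient } from "@azure/storage-blob";

async function seed() {
    const blobServiceClient = new BlobServiceClient(
        // same blob endpoint format as before
        "https://<your storage account name>.blob.core.windows.net/",
        new DefaultAzureCredential()
    );
    const containerClient = blobServiceClient.getContainerClient("private");
    // create the container if it isn't there yet
    if (!(await containerClient.exists())) {
        await containerClient.create();
    }
    // upload a trivial text blob
    const content = "hello from blob storage";
    await containerClient.getBlockBlobClient("hello.txt").upload(content, Buffer.byteLength(content));
}

seed().catch(console.error);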

Great! Now we're finished with our storage account. You can also download Azure Storage Explorer, a desktop app for viewing, uploading and downloading to and from your storage accounts.

On to Azure AD!

Azure AD setup

In Azure AD, we need to register an application. This is essentially telling Azure AD "hey, here's an app, at a specific set of URLs, that needs permission to do things - either sign in users, and/or access resources protected by Azure AD."

CLI to register a new app:



az ad app create --reply-urls "http://localhost:3000/" \
--oauth2-allow-implicit-flow "true" \
--display-name msaljs-to-blobs \
--required-resource-access "[{\"resourceAppId\": \"00000003-0000-0000-c000-000000000000\",\"resourceAccess\": [{\"id\": \"e1fe6dd8-ba31-4d61-89e7-88639da4683d\",\"type\": \"Scope\"}]},{\"resourceAppId\": \"e406a681-f3d4-42a8-90b6-c2b029497af1\",\"resourceAccess\": [{\"id\": \"03e0da56-190b-40ad-a80c-ea378c433f7f\",\"type\": \"Scope\"}]}]"



To register a new app in the portal, head over to the Azure Active Directory blade; alternatively, go to the AAD portal - then App Registrations.

We're going to register a new app - give it a name, choose an audience and a platform. For us, we only want users in our directory to log in, so we'll stick with single tenant. More on multitenancy in a different post :). Then we need our platform - ours is a client app running in the browser, so choose that and use http://localhost:3000/ as the redirect URI (the same value as --reply-urls in the CLI command above).

azure ad app registration page
Yours should look something like this

Now we have our app registered! Almost done - we just need to grab a couple of extra pieces of info. Once the app is registered, from the Overview blade, grab the Application (client) ID and Directory (tenant) ID and stash them somewhere, like Notepad or sticky notes.

app-registration-overview

If you used the CLI, the appId will be in the returned data from the az ad app create command:

az ad app create output

We need to give our app permission to call the storage service. We could do this in code when we need it, but we'll do it now since we're already here. Under the API permissions menu, add a new permission and choose Azure Storage. There will only be one delegated permission, user_impersonation. Add it, and make sure to click Save at the bottom.

choose an api
Choose Azure Storage

storage scope selection
There's only one delegated permission, user_impersonation

If you're using the CLI, you're already done - we added those permissions with the --required-resource-access parameter of our command (the GUIDs refer to Microsoft Graph's User.Read scope and Azure Storage's user_impersonation scope).

CLI or portal, by the end, under the 'API permissions' blade you should see something like this:

configured permissions

Now we can write some code!

We've made it! We're ready to build our app. Let's start with creating a new React app. I'm using create-react-app because I'm not a React pro - use what you're comfortable with.



npx create-react-app msaljs-to-blobs --typescript
cd msaljs-to-blobs



Now that we've got our React app, let's add a few dependencies. We're using the Azure Identity library for authentication, since that's what the storage library uses for credentials.

We can add these two to our dependencies in package.json and do an npm i to install.



"dependencies: {
"@azure/identity": "1.0.3",
"@azure/storage-blob": "^12.2.0-preview.1"
}



Next we're going to create a new component. I've got a new one called blobView.tsx:



import React from 'react';
// we'll need InteractiveBrowserCredential here to force a user to sign-in through the browser
import { InteractiveBrowserCredential } from "@azure/identity";
// we're using these objects from the storage sdk - there are others for different needs
import { BlobServiceClient, BlobItem } from "@azure/storage-blob";

interface Props {}
interface State {
    // a place to store our blob item metadata after we query them from the service
    blobsWeFound: BlobItem[];
    containerUrl: string;
}

export class BlobView extends React.Component<Props, State> {
    state: State;

    constructor(props: Props) {
        super(props);
        this.state = { blobsWeFound: [], containerUrl: "" }
    }

    // here's our azure identity config
    async componentDidMount() {
        const signInOptions = {
            // the client id is the application id, from your earlier app registration
            clientId: "01dd2ae0-4a39-43a6-b3e4-742d2bd41822",
            // this is your tenant id - the id of your azure ad tenant. available from your app registration overview
            tenantId: "98a34a88-7940-40e8-af71-913452037f31"
        }

        const blobStorageClient = new BlobServiceClient(
            // this is the blob endpoint of your storage account. Available from the portal
            // they follow this format: <accountname>.blob.core.windows.net for Azure global
            // the endpoints may be slightly different from national clouds like US Gov or Azure China
            "https://<your storage account name>.blob.core.windows.net/",
            new InteractiveBrowserCredential(signInOptions)
        )

        // this uses our container we created earlier - I named mine "private"
        const containerClient = blobStorageClient.getContainerClient("private");
        const localBlobList: BlobItem[] = [];
        // now let's query our container for some blobs!
        for await (const blob of containerClient.listBlobsFlat()) {
            // and plunk them in a local array...
            localBlobList.push(blob);
        }
        // ...that we push into our state
        this.setState({ blobsWeFound: localBlobList, containerUrl: containerClient.url });
    }

    render() {
        return (
            <div>
                <table>
                    <thead>
                        <tr>
                            <th>blob name</th>
                            <th>blob size</th>
                            <th>download url</th>
                        </tr>
                    </thead>
                    <tbody>{
                        this.state.blobsWeFound.map((x, i) => {
                            return <tr key={i}>
                                <td>{x.name}</td>
                                <td>{x.properties.contentLength}</td>
                                <td>
                                    <img src={this.state.containerUrl + "/" + x.name} alt={x.name} />
                                </td>
                            </tr>
                        })
                    }
                    </tbody>
                </table>
            </div>
        )
    }
}



And that's it! Our App.tsx just includes a reference to this component. The Azure Identity library handles logging you in, asking for consent and putting tokens in the correct headers, absolving the developer from having to worry about token storage.
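
For reference, a minimal App.tsx could look something like this (assuming the component above was saved as src/blobView.tsx):

import React from 'react';
// the component we just built
import { BlobView } from './blobView';

function App() {
    return (
        <div className="App">
            <BlobView />
        </div>
    );
}

export default App;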

Run the app and you should see the blobs listed in your storage account.
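
With create-react-app, that's:

npm start

Your browser should open at http://localhost:3000/ - the redirect URI we registered earlier - and you'll be prompted to sign in before the blob list renders.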

Stay connected!

We stream live twice a week at twitch.tv/425Show! Join us:

  • 11a - 1p eastern US time Tuesdays
  • 11a - 12n eastern US time Fridays for Community Hour

Be sure to send your questions to us here, on twitter or email: iddevsoc@service.microsoft.com!

Until next time,
JPD

Top comments (5)

bmasotta-lcg

Hi!
what about just putting URLs to images stored in a container?
Can I skip using a client for downloading the image URLs? Suppose I have an API for that / the image URLs are stored in another place.
The frontend dev is already using MSAL, so the app user is already logged in with MSAL.
Thanks!

Yaroslav

why the annoying TS?? how do I make it in JS React??

Shelby Anne

just remove the type declarations and replace interface with class

Shelby Anne

Could you please share the link to the repo for this?

Nidhi Mohan

Did you release a post for a multitenant application?