Using SAS Keys to Upload Files in PowerShell

You may find yourself in need of a cheap yet efficient solution to store your files at some point, but where do you find that solution? Look into Microsoft Azure's Binary Large Object (blob) storage! Blob storage is one of the Azure storage services and lets you store large amounts of text and binary data files, streaming content, or even static content for distribution.

In this tutorial, you'll learn how to work with blob storage in Azure by walking through some common examples.

Read on to jump in!

Prerequisites

This tutorial will be a hands-on demonstration. If you'd like to follow along, be sure you have the following installed and available.

  • PowerShell 7 installed.
  • Az PowerShell module installed into your PowerShell 7 environment.
  • AzCopy executable downloaded and accessible.
  • Azure subscription – there are multiple ways to use specific Azure resources for little to no cost.

Building an Azure Environment

Before using blob storage to store your files, you'll first need to import PowerShell modules, connect to your Azure subscription, and build an Azure environment.

1. Launch PowerShell 7 and run the following command to import the modules you'll be using to store files in blob storage.

Importing the necessary modules
# Az.Accounts - Provides credential management cmdlets
# Az.Resources - Provides cmdlets to work with the top-level Azure resource providers,
#                like subscriptions
# Az.Storage - Provides the cmdlets that will help you work with
#              different storage resources, like blobs
Import-Module Az.Accounts, Az.Resources, Az.Storage

2. Next, log in to your Azure Active Directory (AD) tenant, running the command to complete an interactive authentication in your web browser, as shown below.
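The interactive login is a one-liner with the Az module's standard sign-in cmdlet. A minimal sketch (this requires a live Azure tenant to complete):

# Start an interactive login; a browser window or device-code prompt opens
Connect-AzAccount

Once authentication succeeds, the cmdlet prints the account, tenant, and subscription that your PowerShell session now targets.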

Although beyond the scope of this tutorial, there are other authentication methods, such as a Service Principal or using an access token.

Azure Portal interactive login.

Always make certain the tenant and subscription shown after logging in are the ones you intend to use. If needed, you may change your context.
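Checking and switching context uses the Az context cmdlets. A quick sketch (the subscription name below is a placeholder for one of your own):

# Show the tenant and subscription this PowerShell session currently targets
Get-AzContext

# Switch to a different subscription by name or id (placeholder name)
Set-AzContext -Subscription 'My-Other-Subscription'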

3. Now run the command below to create a new resource group called demo, appended with five random numbers (Get-Random -Maximum 99999). Resource groups sit hierarchically below subscriptions and contain resources that allow for more granular management.

Notice the -Location of the resource group is set to Central US for this example. When the command completes, it stores the result in the $resourceGroup variable.

$resourceGroup = New-AzResourceGroup "demo$(Get-Random -Maximum 99999)" -Location 'Central US'
Creating a new resource group

4. Run the command below to perform the following tasks and create a new Azure storage account. For this example, the storage account is named storage, appended with five random numbers (Get-Random -Maximum 99999). The $storageAccount variable will hold the returned object after the command completes.

# Pass -ResourceGroupName the ResourceGroupName property
# of the $resourceGroup variable you created in step 3.
# Append random numbers to the storage account -Name,
# similar to the resource group; the name must be globally unique within Azure.
# Set the same -Location as the $resourceGroup variable's Location property.
# Placing resources in the same region as the parent resource group is a good practice.
# Specify the storage account SKU (-SkuName) as
# locally redundant storage (Standard_LRS).
$storageAccount = New-AzStorageAccount `
    -ResourceGroupName $resourceGroup.ResourceGroupName `
    -Name storage$(Get-Random -Maximum 99999) `
    -Location $resourceGroup.Location `
    -SkuName Standard_LRS
Creating a new storage account

5. Execute the command below to run a couple of tasks for the Azure AD role assignment:

  • The -SignInName value uses the account you're currently logged in with, via the UserId property returned by the Get-AzAccessToken cmdlet.
  • The value of -RoleDefinitionName is the Storage Blob Data Contributor built-in role you are assigning.
  • The -Scope value sets the scope of the role assignment for the storage account you created (storage10029 shown below) via the $storageAccount variable's Id property.

You can always provide more granular role assignments to individual containers as necessary.

New-AzRoleAssignment `
    -SignInName (Get-AzAccessToken).UserId `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope $storageAccount.Id
Creating a new role assignment

6. Finally, run the series of commands below to create a file called temp.dat on your local system. You'll be uploading and downloading this file from the storage account in the following sections to demonstrate how blob storage works.

# Create a FileStream .NET object for the new file
$file = New-Object System.IO.FileStream .\temp.dat,Create,ReadWrite
# Set the size of the file
$file.SetLength(10MB)
# Close the handle
$file.Close()
# Look up the file to confirm the size
(Get-ChildItem $file.Name).Length
Creating and verifying a new file

Uploading Files via PowerShell

Now that you have built an Azure environment and created a sample file, let's start uploading the file to blob storage. Blob storage works differently than standard filesystems. Each file in blob storage is an object kept within containers.

The core functionality of blobs is similar to other filesystems, but there are use cases where either could be the better solution. Blobs can even back virtual filesystems (e.g., BlobFuse).

Microsoft offers multiple methods to upload files to your storage accounts via PowerShell, AzCopy, and the Azure Portal. But let's upload the sample file (temp.dat) to blob storage via PowerShell for a start. PowerShell gives you a consistent experience for working with your Azure storage accounts.

The actions required to perform this demo will incur costs. Monitor your consumption and delete resources when you no longer intend to use them.

Run the commands below to create a new container and upload the temp.dat file ($file) as an object. The container is named demo for this example, but you can name it differently as you prefer.

# Creates a container inside $storageAccount via the Context property of the storage account
# The returned object is then passed to the $container variable
$container = New-AzStorageContainer -Name demo -Context $storageAccount.Context
# Uploads the temp.dat file ($file) to the demo container ($container)
# The blob name (-Blob) will use the same name as the file you're uploading (Get-ChildItem $file.Name)
Set-AzStorageBlobContent -File $file.Name -Container $container.Name -Blob (Get-ChildItem $file.Name).Name -Context $storageAccount.Context
Uploading a file to Azure Storage Account

Uploading Files via AzCopy

Perhaps you have more complex use cases, such as synchronizing content or copying content between different accounts at scale. If so, the AzCopy command-line tool is what you need.

Run the commands below to log in to your Azure tenant and copy your local file ($file) to the URL endpoint of your container. You're logging in to the Azure tenant since AzCopy is not aware of the credentials you are using with PowerShell.

# Log in to the Azure tenant
& .\azcopy.exe login
# Copy the local $file to the full URI of the destination $container
& .\azcopy.exe copy $file.Name $container.CloudBlobContainer.Uri.AbsoluteUri
Uploading to Azure Storage Account using AzCopy
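For the synchronization use case mentioned earlier, AzCopy can mirror a whole local folder into the container rather than copying file by file. A hedged sketch (the .\data folder is a hypothetical example path):

# Mirror a local folder into the container, recursing into subfolders
& .\azcopy.exe sync .\data "$($container.CloudBlobContainer.Uri.AbsoluteUri)" --recursive

Unlike copy, sync compares source and destination and only transfers files that are new or changed.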

Instead of uploading, perhaps you want to download files via AzCopy. If so, run the command below to copy the specified file (temp.dat) from your container to the current local directory:

            & .\azcopy.exe copy "$($container.CloudBlobContainer.Uri.AbsoluteUri)/temp.dat" .\temp.dat

Uploading Files via Azure Portal

If you prefer a GUI method of uploading your files, then Azure Storage Explorer is your friend. Azure Storage Explorer is one of the best graphical methods to manage your blob storage. You can access the storage explorer from your storage account resource in the Azure Portal.

1. Open your favorite web browser, and navigate to your Storage Explorer in the Azure Portal.

2. Click on the demo container under BLOB CONTAINERS, as shown below, then click on Upload to access the Upload blob blade (right panel).

3. Now click on the folder icon in the Upload blob panel to select which files to upload (temp.dat).

4. Finally, click Upload (blue button) to upload your file.

Using the Upload blob blade in Azure Storage Explorer

Once the upload completes, you can close the Upload blob blade and see your uploaded blob, like the image below.

Viewing contents of a blob container in Azure Storage Explorer

Downloading Files via Azure Portal

Similar to uploading content to blob storage, Azure supports downloading content in many ways. But since you just uploaded a file (temp.dat) via the Azure Portal, let's download the same file using Azure Storage Explorer in the Azure Portal.

Select the file (temp.dat) to download and click on the Download button in the Azure Storage Explorer, as shown below. Doing so opens a new dialog box to confirm the download, which you'll see in the next step.

Selecting Files to Download

Now click on the Click here to begin download button to download the files you selected.

Downloading Selected Files from the Blob Storage

Downloading Files via PowerShell

Like uploading files, you also have the option to download files from blob storage by running commands in PowerShell. With PowerShell, you can list the objects within a container, then download them.

Run the commands below to list all objects in your container and download temp.dat to your local directory.

# List all the objects within the $container
Get-AzStorageBlob -Container $container.Name -Context $storageAccount.Context
# Download the temp.dat object from the $container
Get-AzStorageBlobContent -Blob temp.dat -Container $container.Name -Context $storageAccount.Context
Downloading files from Azure Storage Account

If you prefer to use short-lived unique links to download files, you can use Shared Access Signature (SAS) tokens to create a preauthorized download link. These tokens are unique and private authentication tokens you can use to verify your access.

Run the commands below to create a new download link for the file (temp.dat) you want to download. The generated download link expires after 10 seconds, and Invoke-WebRequest $uri downloads the content using that link to the $temp variable.

# Generate a new download link valid for 10 seconds
$uri = New-AzStorageBlobSASToken -Context $storageAccount.Context -Container $container.Name -Blob temp.dat -Permission r -ExpiryTime (Get-Date).AddSeconds(10) -FullUri
# Use the link to download the file to the $temp variable
$temp = Invoke-WebRequest $uri
# Alternatively, write the file to the current directory
Invoke-WebRequest $uri -OutFile .\temp.dat
Download from Azure Storage Account using a SAS token

Hosting a Web Page on Public Internet from Blob Storage

Up to this point, you've seen use cases of downloading files by authenticated users. But did you know that blob storage can provide an excellent option for public content as well? One example is using a blob to host your web page content, which you'll accomplish in this demo.

Even if your web page contents are encrypted both in transit and at rest, anyone can access those contents if public access is set.

Since you are setting up a different use case, you'll take advantage of one of the major benefits of the public cloud: scale and elasticity. You can provision a new storage account for a specific use case and limit the risk of using public containers.

1. Run the command below to create a new storage account as you did in step four of the "Building an Azure Environment" section. But this time, you'll pass the returned object to the $publicStorageAccount variable.

$publicStorageAccount = New-AzStorageAccount `
    -ResourceGroupName $resourceGroup.ResourceGroupName `
    -Name storage$(Get-Random -Maximum 99999) `
    -Location $resourceGroup.Location `
    -SkuName Standard_LRS
Creating a storage account

You now have a dedicated storage account for your public content, and you can configure it to host static web content with the following command.

2. Next, run the Enable-AzStorageStaticWebsite cmdlet to configure the storage account ($publicStorageAccount) for your new use case. The -IndexDocument parameter sets the default web page you want to present to users. The -Context will be the new storage account you just created.

# Create the $web container and configure the storage account
Enable-AzStorageStaticWebsite -IndexDocument index.html -Context $publicStorageAccount.Context
Enable storage account for website hosting

3. Run the commands below to create a new HTML document in your current directory, and upload that document to the container specifically for hosting web content. The content type is set to HTML (ContentType="text/html"), so web browsers can properly interpret the document.

Accessing the document in a web browser prints the Hello from <storage account name> message.

# Create a simple HTML file
"<body><h1>Hello from $($publicStorageAccount.StorageAccountName)!</h1></body>" | Out-File .\index.html
# Upload the HTML file to the static web hosting container and set the ContentType to text/html
Set-AzStorageBlobContent -File .\index.html -Container "`$web" -Properties @{ContentType="text/html"} -Context $publicStorageAccount.Context
Create and upload an HTML document

4. Now run the following command to get the URL where users can access your content.

            $publicStorageAccount.PrimaryEndpoints.Web          
Get the URI of the endpoint

5. Finally, open the URL in your browser, and you'll see something similar to the following screenshot.

Accessing HTML Document from Blob Storage

Cleaning up Resources

Now that you've gone through testing these new concepts in using blob storage, you will want to clean up your resources. Why? Doing so helps keep your subscription clean. More importantly, you stop incurring additional charges.

Since all resources you used in this tutorial are in a single resource group, you can clean up all resources by deleting the resource group.

Resources won't always be contained within a single resource group, which illustrates why liberal use of logical segmentation can be beneficial, especially when testing or iterating often.

Run the Remove-AzResourceGroup cmdlet below, specifying the ResourceGroupName property of the $resourceGroup variable to delete the resource group and all resources within.

            Remove-AzResourceGroup -Name $resourceGroup.ResourceGroupName          
Delete resource group and contents

Conclusion

In this tutorial, you've touched on uploading and downloading files to and from blobs in cloud storage with different tools. You've also learned it's possible to host a web page from blob storage that users can publicly access.

You can do much more with blob storage and other storage types, so how would you build on these concepts? Perhaps work with file storage accounts, provide serverless file systems, or use page blobs for virtual hard disks with Azure virtual machines?


Source: https://adamtheautomator.com/blob-storage/
