Azure Blob CLI: Create, Manage, and Secure Blobs

The Azure Blob CLI is a powerful tool for managing and securing your Azure Blob Storage resources. You can create a new blob container using the `az storage container create` command.

To create a new container, you'll need to specify the container's name and the storage account it belongs to. The `az storage container create` command takes these values as arguments.

With the Azure Blob CLI, you can easily manage your blob containers and their contents. This includes deleting containers and blobs, as well as listing the contents of a container.

You can also use the `az storage blob upload` command to upload a local file to a blob container. This command takes the name of the container, the name of the blob, and the path to the local file as arguments.
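
For example, a minimal sequence might look like the following sketch; the account, container, and file names are placeholders you would replace with your own:

```bash
# Create a container in an existing storage account (names are placeholders)
az storage container create \
    --account-name mystorageaccount \
    --name demo-container \
    --auth-mode login

# Upload a local file into the new container as a block blob
az storage blob upload \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name hello.txt \
    --file ./hello.txt \
    --auth-mode login
```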

Authorization

Authorization is a crucial step in working with Azure Blob CLI. You can authorize access to Blob storage using Microsoft Entra credentials or a storage account access key, with Microsoft Entra credentials being the recommended option.

To authorize with Microsoft Entra credentials, you need to sign in to your Azure account with the az login command. Azure role assignments may take a few minutes to propagate. The --auth-mode parameter in Azure CLI commands for data operations against Blob storage allows you to specify how to authorize a given operation, and setting it to login enables authorization with Microsoft Entra credentials.

Only Blob storage data operations support the --auth-mode parameter, while management operations automatically use Microsoft Entra credentials for authorization.

Authorize Access

You can authorize access to Blob storage from the Azure CLI either with Microsoft Entra credentials or by using a storage account access key. Using Microsoft Entra credentials is recommended.

Azure CLI commands for data operations against Blob storage support the --auth-mode parameter, which enables you to specify how to authorize a given operation.

Set the --auth-mode parameter to login to authorize with Microsoft Entra credentials. Only Blob storage data operations support the --auth-mode parameter.

Management operations, such as creating a resource group or storage account, automatically use Microsoft Entra credentials for authorization.

To authorize access, sign in to your Azure account with the az login command. Azure role assignments may take a few minutes to propagate.

Run the login command to open a browser and connect to your Azure subscription.
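
As a rough sketch, signing in and then running a data operation with Microsoft Entra authorization might look like this (the storage account and container names are placeholders):

```bash
# Sign in; this opens a browser to authenticate with your Azure subscription
az login

# Run a Blob storage data operation using Entra credentials
az storage blob list \
    --account-name mystorageaccount \
    --container-name demo-container \
    --auth-mode login \
    --output table
```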

Create an Account

To get started, you'll need a storage account. Create a general-purpose storage account with the az storage account create command.

This type of account can be used for all four services: blobs, files, tables, and queues.
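
A minimal sketch, assuming a placeholder resource group named demo-rg in the eastus region and a placeholder account name:

```bash
# Create a resource group to hold the storage account
az group create --name demo-rg --location eastus

# Create a general-purpose v2 account usable for blobs, files, tables, and queues
az storage account create \
    --name mystorageaccount \
    --resource-group demo-rg \
    --location eastus \
    --sku Standard_LRS \
    --kind StorageV2
```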

Blob Management

Blob Management is a crucial aspect of Azure Blob CLI. You can manage blob properties and metadata using the CLI.

System properties exist on each Blob Storage resource; some are read-only, while others can be read or set. Under the covers, some of these properties map to certain standard HTTP headers.

User-defined metadata is a useful feature that allows you to store additional values with the resource. You can specify one or more name-value pairs for a Blob Storage resource.

Metadata values are for your own purposes and don't affect how the resource behaves. For more information on metadata, see the az storage blob metadata reference.

Blob Operations

You can copy blobs between storage containers using the az storage blob copy start command, which supports various source and destination options, including URIs, shares, and SAS URLs.

The az storage blob copy start-batch command allows you to recursively copy multiple blobs between storage containers within the same storage account. You can use Unix filename pattern matching with the --pattern parameter to specify a range of files to copy.
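
The sketch below illustrates both commands; the account, container, and blob names are placeholders, and copying within a single storage account is assumed:

```bash
# Copy a single blob between containers
az storage blob copy start \
    --account-name mystorageaccount \
    --destination-container backup-container \
    --destination-blob report.txt \
    --source-container demo-container \
    --source-blob report.txt \
    --auth-mode login

# Recursively copy every .txt blob between containers in the same account
az storage blob copy start-batch \
    --account-name mystorageaccount \
    --destination-container backup-container \
    --source-container demo-container \
    --pattern "*.txt" \
    --auth-mode login
```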

AzCopy is a command-line utility that you can use to copy blobs or files to or from a storage account, offering high-performance and scriptable data transfer for Azure Storage.

Create and Upload

If you haven't already, create a storage account by using the az storage account create command. This creates a general-purpose account that can be used for all four services: blobs, files, tables, and queues.

Create a container for storing blobs with the az storage container create command. This command requires authorization, which can be achieved by assigning the Storage Blob Data Contributor role to yourself or using the storage account key.

To create a container with Azure CLI, call the az storage container create command. You can create a single container this way, or wrap the command in a Bash script to automate the creation of multiple containers, as shown in the sketch below.
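
A small sketch of both approaches, with placeholder account and container names:

```bash
# Create a single container using Microsoft Entra authorization
az storage container create \
    --account-name mystorageaccount \
    --name demo-container \
    --auth-mode login

# Automate the creation of several containers with a Bash loop
for container in logs images backups; do
    az storage container create \
        --account-name mystorageaccount \
        --name "$container" \
        --auth-mode login
done
```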

Next, create a file to upload as a block blob; a single command in Azure Cloud Shell is enough. You'll upload this file to the container you created earlier using the az storage blob upload command.

Upload a blob to the container you created earlier using the az storage blob upload command. This command can be used to create a new blob or to overwrite an existing blob if it already exists.

Upload multiple files at the same time using the az storage blob upload-batch command. This command can be used to upload files recursively and supports Unix filename pattern matching.

The az storage blob upload command creates a new blob from a file path or updates the content of an existing blob, with automatic chunking and progress notifications.

Upload blobs recursively to a storage blob container using the az storage blob sync command. This command can be used to sync blobs recursively and supports various options for customization.
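
Pulled together, the upload workflow might look like the sketch below; the account, container, and directory names are placeholders, and exact flags can vary slightly between CLI versions:

```bash
# Create a small local file to upload (for example, in Azure Cloud Shell)
echo "Hello, Azure Blob Storage" > demo-file.txt

# Upload the file as a block blob, overwriting it if it already exists
az storage blob upload \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name demo-file.txt \
    --file demo-file.txt \
    --overwrite \
    --auth-mode login

# Upload every .txt file from a local directory in one batch
az storage blob upload-batch \
    --account-name mystorageaccount \
    --destination demo-container \
    --source ./local-files \
    --pattern "*.txt" \
    --auth-mode login

# Keep a local directory and the container in sync, recursively
az storage blob sync \
    --account-name mystorageaccount \
    --container demo-container \
    --source ./local-files
```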

List

You can list blobs in a container using the az storage blob list command. This command returns a list of all blobs stored within the container.

To refine the scope of your search, you can use the --prefix parameter to select either a single known file or a range of files whose names begin with a defined string. You can specify a virtual directory as part of the --prefix parameter.

The az storage blob list command also allows you to include additional types of objects in the response, such as soft-deleted blobs, snapshots, and versions, by specifying a value for the --include parameter. You can combine multiple values to return multiple object types.

You can limit the number of blobs returned from a container using the --num-results parameter, which has a service limit of 5,000. If the number of blobs returned exceeds either the --num-results value or the service limit, a continuation token is returned, allowing you to use multiple requests to retrieve any number of blobs.

To list containers, you can use the az storage container list command, which returns a list of all containers in the storage account. You can use the --prefix parameter to select containers whose names begin with a given character string.

The --num-results parameter can be used to limit the number of containers returned by the request, with a service limit of 5,000. If the number of containers returned exceeds either the --num-results value or the service limit, a continuation token is returned, allowing you to use multiple requests to retrieve any number of containers.
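
The following sketch shows these options together, using placeholder names; the prefix and result limit are arbitrary examples:

```bash
# List all blobs in a container
az storage blob list \
    --account-name mystorageaccount \
    --container-name demo-container \
    --auth-mode login \
    --output table

# Narrow the scope to a virtual directory prefix, include soft-deleted
# blobs and snapshots, and cap the number of results returned
az storage blob list \
    --account-name mystorageaccount \
    --container-name demo-container \
    --prefix reports/2024 \
    --include ds \
    --num-results 100 \
    --auth-mode login

# List only the containers whose names begin with a given prefix
az storage container list \
    --account-name mystorageaccount \
    --prefix demo \
    --auth-mode login
```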

Delete-Batch

Delete-Batch is a powerful command that allows you to delete multiple blobs at once. It's a great time-saver, especially when dealing with large numbers of files.

You can use the az storage blob delete-batch command to delete blobs from a container recursively. This means you can delete all files in a directory or all files with a specific name.

One example of using delete-batch is to delete all blobs ending with ".py" in a container that have not been modified for 10 days. This can be a useful way to clean up old files that are no longer needed.

Delete-batch also supports name patterns, so you can delete blobs whose names match a pattern such as 'cli-2018-xx-xx.txt' or 'cli-2019-xx-xx.txt' in a container, or delete all blobs matching 'cli-201x-xx-xx.txt' except for 'cli-2018-xx-xx.txt' and 'cli-2019-xx-xx.txt'.

It's worth noting that listing soft-deleted items is handled by the list commands rather than by delete-batch: pass the deleted value to the --include parameter of az storage blob list (optionally combined with --prefix) to return deleted blobs that are still within the retention period.

You can also use delete-batch to remove blobs that have associated snapshots; in that case, specify the --delete-snapshots parameter to indicate that the snapshots should be deleted along with the blobs.
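
A sketch of these scenarios is shown below; the account and container names are placeholders, and the date arithmetic assumes GNU date as found in Azure Cloud Shell:

```bash
# Delete every blob ending in .py that hasn't been modified for 10 days
az storage blob delete-batch \
    --account-name mystorageaccount \
    --source demo-container \
    --pattern "*.py" \
    --if-unmodified-since $(date -d "10 days ago" '+%Y-%m-%dT%H:%MZ') \
    --auth-mode login

# Delete blobs matching a date-stamped name pattern, along with their snapshots
az storage blob delete-batch \
    --account-name mystorageaccount \
    --source demo-container \
    --pattern "cli-2018-??-??.txt" \
    --delete-snapshots include \
    --auth-mode login
```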

Undelete

Undelete is a crucial feature in Azure Storage that allows you to recover deleted blobs and containers. You can restore soft-deleted blobs or snapshots with the az storage blob undelete command, but this will only be successful if used within the specified number of days set in the delete retention policy.

To undelete a blob, you'll need to enable soft delete on your storage account and configure it to retain deleted blobs for a certain period. This will give you a window of time to recover deleted blobs before they are permanently deleted.

If you're using versioning, modifying, overwriting, deleting, or restoring a blob will automatically create a new version, making it easier to recover previous versions of your blobs. The az storage blob undelete command can restore the latest version of a blob if versioning is enabled.

You can also restore a soft-deleted container within the associated retention period by using the az storage container restore command, supplying values for the --name and --deleted-version parameters to ensure that the correct version of the container is restored.

Restoring deleted blobs can be a lifesaver, especially if you've accidentally deleted important data. By enabling soft delete and configuring it on your storage account, you can recover deleted blobs and containers with ease.
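
A minimal sketch, assuming placeholder resource group, account, container, and blob names:

```bash
# Enable blob soft delete on the account with a 7-day retention window
az storage account blob-service-properties update \
    --account-name mystorageaccount \
    --resource-group demo-rg \
    --enable-delete-retention true \
    --delete-retention-days 7

# Restore a soft-deleted blob while it is still within the retention period
az storage blob undelete \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name demo-file.txt \
    --auth-mode login
```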

Blob Properties

Blob properties can be read and updated using the Azure CLI. You can use the az storage blob show command to retrieve a blob's properties and metadata, but not its content.

System properties exist on each blob storage resource, and some are read-only while others can be read or set. User-defined metadata consists of one or more name-value pairs that you specify for a blob storage resource.

You can use the az storage blob metadata show command to return all user-defined metadata for a specified blob or snapshot. If the blob has an active lease, you'll also need to supply the --lease-id parameter.

Properties

A container exposes both system properties and user-defined metadata, which can be read or set.

System properties are read-only, while user-defined metadata is for your own purposes only and doesn't affect how the resource behaves.

To display the properties of a container, you can call the az storage container show command in Azure CLI.

This command displays the properties of a single named container; to work with all containers that share a specific prefix, use az storage container list with the --prefix parameter instead.

Some system properties map to certain standard HTTP headers under the covers.

User-defined metadata consists of one or more name-value pairs that you specify for a blob storage resource.

You can use metadata to store additional values with the resource, but metadata values are for your own purposes only.

To read blob properties or metadata, you must first retrieve the blob from the service.

Use the az storage blob show command to retrieve a blob's properties and metadata, but not its content.

Blob metadata is an optional set of name/value pairs associated with a blob, which can be added when necessary.

To read and write blob metadata, use the az storage blob metadata show and az storage blob metadata update commands, respectively.

The az storage blob metadata update command sets user-defined metadata for the blob as one or more name-value pairs.

This command overwrites any existing metadata, so be careful when updating blob metadata.
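
The sketch below runs through these commands with placeholder names; the metadata name-value pairs are arbitrary examples:

```bash
# Display a container's system properties
az storage container show \
    --account-name mystorageaccount \
    --name demo-container \
    --auth-mode login

# Retrieve a blob's properties and metadata (but not its content)
az storage blob show \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name demo-file.txt \
    --auth-mode login

# Read the blob's user-defined metadata
az storage blob metadata show \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name demo-file.txt \
    --auth-mode login

# Overwrite the blob's metadata with new name-value pairs
az storage blob metadata update \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name demo-file.txt \
    --metadata project=docs reviewed=true \
    --auth-mode login
```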

Set Tier

You can change a blob's tier to move it and its data to the target tier, and you can do this with the az storage blob set-tier command.

You may also use the Copy Blob operation to copy a blob from one tier to another, creating a new blob in the desired tier while leaving the source blob in the original tier.

Changing tiers from cool or hot to archive takes place almost immediately, and after a blob is moved to the archive tier, it's considered offline and can't be read or modified.

You'll need to rehydrate an archived blob to an online tier before you can read or modify its data; see the Azure documentation on blob rehydration from the archive tier for more detail.

The az storage blob set-tier command allows you to set the tier to hot for a single, named blob within a container.
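
As a sketch, with placeholder account, container, and blob names:

```bash
# Move a single named blob to the hot tier
az storage blob set-tier \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name demo-file.txt \
    --tier Hot \
    --auth-mode login

# Archive the same blob; it becomes offline until it is rehydrated
az storage blob set-tier \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name demo-file.txt \
    --tier Archive \
    --auth-mode login
```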

Blob Security

Azure Blob CLI provides a robust set of features to ensure the security of your blob data. You can set a legal hold on a blob using the `az storage blob set-legal-hold` command, which specifies if a legal hold should be set on the blob.

To generate a shared access signature (SAS) for a blob, you can use the `az storage blob generate-sas` command. This command can be used to generate a SAS token with various permissions, including read-only, and can be restricted to specific IP addresses.

You can also specify the permissions the SAS grants, such as add, create, delete, execute, and read, which can be combined for more flexibility. Additionally, you can specify the UTC datetime at which the SAS becomes invalid or valid, which is useful for controlling access to your blob data.

Setting a legal hold on a blob is a crucial step in ensuring data integrity and compliance.

The same az storage blob set-legal-hold command is used to set or clear a hold, depending on your specific needs; you simply specify whether the legal hold should be applied to the blob.
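
A sketch of setting and clearing a hold follows; the names are placeholders, and the exact parameters may vary slightly between CLI versions:

```bash
# Apply a legal hold to a blob
az storage blob set-legal-hold \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name demo-file.txt \
    --legal-hold true \
    --auth-mode login

# Clear the legal hold with the same command
az storage blob set-legal-hold \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name demo-file.txt \
    --legal-hold false \
    --auth-mode login
```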

Get SAS

To get a shared access signature (SAS) for a blob, you can use the az storage blob generate-sas command. This command can be used to generate a SAS token for a blob with read-only permissions.

A SAS token can be generated with a specified IP address or range of IP addresses from which to accept requests, which is useful for restricting access to a specific network. For example, you can specify ip=168.1.5.65 or ip=168.1.5.60-168.1.5.70 on the SAS token.

You can also specify the permissions the SAS grants, such as read, write, or delete permissions. The allowed values for permissions include (r)ead, (w)rite, (d)elete, and (x)delete_previous_version. You can combine these values to grant multiple permissions.

It's essential to protect a SAS from malicious or unintended use, so use discretion in distributing a SAS and have a plan in place for revoking a compromised SAS. A SAS is commonly used to provide temporary and secure access to a client who wouldn't normally have permissions.
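
A sketch of generating a read-only, IP-restricted, HTTPS-only SAS is shown below; the names, expiry time, and IP address are placeholders, and signing with your Entra credentials (a user delegation SAS) is assumed:

```bash
# Generate a read-only SAS, valid until the given UTC time, restricted
# to a single IP address, and returned as a full URI
az storage blob generate-sas \
    --account-name mystorageaccount \
    --container-name demo-container \
    --name demo-file.txt \
    --permissions r \
    --expiry 2025-12-31T23:59Z \
    --ip 168.1.5.65 \
    --https-only \
    --full-uri \
    --as-user \
    --auth-mode login
```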

Frequently Asked Questions

How to access storage account using Azure CLI?

To access a storage account with the Azure CLI, sign in with az login and pass the account name to storage commands with the --account-name parameter, or set it as the default account in the AZURE_STORAGE_ACCOUNT environment variable. You can then use Azure CLI commands to manage the storage account and its contents.
