Mastering the AWS CLI for Amazon S3: Essential commands for Cloud Storage
Amazon Simple Storage Service, commonly known as S3, is a versatile and scalable cloud storage service offered by Amazon Web Services (AWS). Think of it like Dropbox, Google Drive or Microsoft OneDrive. It provides a secure, durable, and highly available storage solution that allows you to store and retrieve data, making it a fundamental building block for many cloud-based applications and services.
While S3 can be managed through the AWS Management Console, it also offers a powerful command-line interface (CLI) for those who prefer to manage S3 resources programmatically. The AWS CLI provides a set of commands that enable users to interact with S3 buckets and objects, making it a valuable tool for developers, system administrators, and DevOps professionals.
In this guide, I will dive into using AWS CLI commands to perform common tasks in S3 that will help to build your confidence so that you don’t have to open up the AWS console every time. Whether you’re new to S3 or looking to enhance your cloud storage skills, mastering AWS CLI commands can significantly streamline your S3 workflows and help you leverage the full potential of the S3 cloud storage service. Let’s get started!
System Setup
Before you can start using any CLI commands with S3, you will need to make sure you have the AWS CLI installed on your computer and a set of user access keys configured (for example, by running aws configure).
I won't be covering how to do these things here; however, you can reference the links below regarding installing the AWS CLI and creating your user access keys:
Once you have completed the above steps, you will be ready to start using the AWS CLI with S3 on your laptop/computer.
AWS CLI S3 Commands
With S3, it is possible to use either the high level aws s3 commands or the low level aws s3api commands.
The high level aws s3 commands are typically used to execute common S3 tasks with simplified commands. The low level aws s3api commands are used for more advanced or custom interactions with S3 through API calls. Depending on the task you are trying to run, you may need to use the low level commands where a high level aws s3 command is not sufficient. With this knowledge, we can now start to execute some commands.
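To see the difference, it helps to compare the two styles side by side. The sketch below only builds the two command strings into variables and prints them (the bucket name is a made-up placeholder), so nothing is actually sent to AWS:

```shell
#!/bin/bash
# Hypothetical bucket name, used only for illustration
BUCKET="my-example-bucket"

# High-level command: short and human-friendly
HIGH_LEVEL="aws s3 ls s3://$BUCKET"

# Low-level equivalent: direct API access, with extra options
# such as pagination via --max-items
LOW_LEVEL="aws s3api list-objects-v2 --bucket $BUCKET --max-items 10"

echo "$HIGH_LEVEL"
echo "$LOW_LEVEL"
```

Both list the contents of a bucket, but the s3api form exposes the full set of API parameters when you need them.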
Creating a new bucket
A bucket can be created using:
aws s3 mb s3://your_bucket_name
aws s3 mb s3://your_bucket_name --region your_aws_region
Or:
aws s3api create-bucket --bucket your_bucket_name
aws s3api create-bucket --bucket your_bucket_name --region your_aws_region
In S3, a bucket name has to be globally unique across all AWS accounts. If you try to create a new bucket with a name that is already in use, the CLI will return an error. If you do not specify a region, your bucket will be created in the default region that you set when you executed the aws configure command. Whilst S3 presents a global namespace, your buckets are still located in the region that you specify at the moment of creation.
To check if your bucket has been created, run the command:
aws s3 ls
This will list all of the buckets in your AWS account.
Creating an object inside of a bucket
You can add a new object directly to a bucket by running:
aws s3api put-object --bucket your_bucket_name --key your_object_name
For example, running the command with --bucket music2023 --key topsecretfile.txt adds an object called topsecretfile.txt inside the bucket called music2023. Note that without a --body argument, put-object creates an empty (zero-byte) object.
To check the contents of your bucket, run:
aws s3 ls s3://your_bucket_name
This command will list any objects inside the specified bucket.
If your bucket is empty, nothing will be displayed and you will be returned back to the command prompt.
Creating a new folder inside of a bucket
In S3, a folder is known as a prefix, which is a way to group objects together. Whilst a prefix might look like a folder when you view your buckets using the AWS console, it is not a folder as such; it is essentially a way to create a hierarchy, or what you would consider to be a folder structure, within an S3 bucket. For the rest of this article, I will refer to a prefix as a folder to make it easier to compare with a regular file system.
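The flat nature of keys can be illustrated locally. The key below is a single string; the slashes are just characters in it. The shell snippet splits the key at the last slash, which is exactly how the console renders it as nested folders (the key name is a made-up example):

```shell
#!/bin/bash
# One flat S3 key; the slashes do not create real directories
KEY="videos/charlieparker/concert1.mp4"

# Shell parameter expansion splits the key the way the console displays it
PREFIX="${KEY%/*}/"       # everything up to and including the last slash
FILENAME="${KEY##*/}"     # everything after the last slash

echo "prefix:   $PREFIX"
echo "filename: $FILENAME"
```

The console would show this object as concert1.mp4 inside a charlieparker folder inside a videos folder, but S3 itself only stores the one key.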
To create a folder, run:
aws s3api put-object --bucket your_bucket_name --key your_folder_name/
For example, using keys such as music/, videos/, images/ and videos/charlieparker/ creates folders called music, videos and images inside the music2023 bucket, with a folder called charlieparker nested inside the videos folder. Note the trailing slash: it is what makes the key behave like a folder.
Creating an object inside of a folder (prefix)
To add an object inside of a folder, run:
aws s3api put-object --bucket your_bucket_name --key your_folder_name/your_object_name
For example, a key of music/concert1.txt creates the object concert1.txt inside the music folder. Likewise, keys of videos/concert1.mp4 and images/concert1.jpg create objects inside the videos and images folders.
Deleting objects
A single object can be deleted using:
aws s3 rm s3://your_bucket_name/name_of_object_to_delete
or:
aws s3api delete-object --bucket your_bucket_name --key name_of_object_to_delete
For example, using the key images/concert1.jpg deletes the object concert1.jpg inside the images folder. You will not see any output or confirmation in your terminal when an object has been successfully deleted using the "aws s3api delete-object" command.
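Because a successful delete is silent, scripts should branch on the command's exit status rather than its output. Below is a sketch of the pattern, with a hypothetical stand-in function so it runs without AWS credentials; in real use you would swap the stand-in for the actual aws s3api delete-object call:

```shell
#!/bin/bash
# Stand-in so this sketch runs anywhere; in real use, replace the body with:
#   aws s3api delete-object --bucket your_bucket_name --key your_object_key
delete_object() { return 0; }

# Exit status 0 means success, even though the command printed nothing
if delete_object; then
    RESULT="deleted"
else
    RESULT="delete failed"
fi
echo "$RESULT"
```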
Deleting all objects and folders (prefixes) inside of a bucket
To delete more than one object at once, run the command:
aws s3 rm s3://name_of_bucket --recursive
You must include the --recursive flag to delete everything.
Deleting a bucket
To delete a bucket, run:
aws s3api delete-bucket --bucket your_bucket_name
As with the "aws s3api delete-object" command, there is no output when the command has been successful; you will just be returned to the command prompt. You will need to ensure that the bucket is empty before you try to delete it, otherwise the command will fail with a BucketNotEmpty error.
Copying local files to your S3 Bucket
To copy files from your local computer to your S3 bucket, run:
aws s3 cp your_local_file_path s3://your_bucket_name/your_folder_name/
For example, copying a local file called jazz_funk_concerts.txt to s3://music2024/concerts/ places it in the concerts folder inside the music2024 bucket. If the concerts folder does not already exist, it is created as part of the upload.
If you want to rename your file whilst it is being uploaded, simply specify the new file in your command:
aws s3 cp your_local_file_path s3://your_bucket_name/your_folder_name/new_file_name
For example, uploading original_named_file.txt with a destination key of concerts/renamed_file.txt stores it in the concerts folder of the music2024 bucket under the new name renamed_file.txt.
If you need to upload multiple files which are inside the same folder, you can speed up the process by using the --recursive flag. It is also worth including --exclude ".*" in the same command, which prevents uploading any hidden files that may be present in the folder:
aws s3 cp your_local_folder_path s3://your_bucket_name/your_folder_name/ --recursive --exclude ".*"
Here the contents of the Downloads folder have been uploaded into a folder called concerts inside a bucket called music2024. The contents of the concerts folder can then be listed to verify that all of the files have been uploaded.
You could also achieve the same result with the use of a script. You would need to save the below code as a script file (e.g. multifileupload.sh) adding in the folder path on your computer as well as your S3 bucket name. You will also need to change the execute permission on the script file itself so that it can be executed (e.g. running the command chmod 744 multifileupload.sh).
#!/bin/bash

# ADD YOUR S3 BUCKET NAME AND YOUR LOCAL FOLDER PATH
S3_BUCKET_NAME="YOUR_BUCKET_NAME"
LOCAL_FOLDER_PATH="/path/to/local/folder"

# Loop through each file in the folder on your computer and upload it to your S3 bucket
for file in "$LOCAL_FOLDER_PATH"/*; do
    if [ -f "$file" ]; then
        # Get the file name without the local folder path
        file_name=$(basename "$file")
        # Upload the file to the S3 bucket
        aws s3 cp "$file" "s3://$S3_BUCKET_NAME/$file_name"
        echo "Uploaded $file_name to S3 bucket $S3_BUCKET_NAME"
    fi
done

echo "Upload complete!"
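An alternative worth knowing about is aws s3 sync, which compares the local folder with the bucket and only transfers files that are new or have changed, so re-running it is cheap. The sketch below just builds the command into a variable using the same placeholder names as the script above, so nothing is actually uploaded; in practice you would run the command directly:

```shell
#!/bin/bash
S3_BUCKET_NAME="YOUR_BUCKET_NAME"            # replace with your bucket name
LOCAL_FOLDER_PATH="/path/to/local/folder"    # replace with your folder path

# sync only uploads files that are new or changed locally;
# --exclude ".*" skips hidden files, as in the cp example
SYNC_CMD="aws s3 sync $LOCAL_FOLDER_PATH s3://$S3_BUCKET_NAME/ --exclude .*"
echo "$SYNC_CMD"
```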
In conclusion, utilizing the AWS CLI for managing AWS S3 offers several distinct advantages over using the AWS console. While the console provides a user-friendly interface for interacting with S3, the CLI empowers users with automation, scripting, and batch processing capabilities. Moreover, for DevOps teams and developers working in continuous integration and deployment environments, the CLI becomes an indispensable tool for automating tasks and ensuring efficiency. By mastering AWS CLI commands, you gain greater control, scalability, and flexibility in managing your S3 resources, making it the preferred choice for those seeking to harness the full potential of AWS S3 for their cloud storage needs.
Hopefully the commands I have covered in this article will give you the confidence to use the AWS CLI and to explore using it with other AWS services.
If you have read this far, thanks for sticking with it to the end.
Additional CLI commands for S3 can be found in the official AWS CLI documentation.
MITCHEL ANTONIO NEYRA ESTEBAN