
boto3 is the official AWS SDK for Python; in this context it is used to work with S3-compatible object storage. It allows you to manage objects in storage: upload and download files, list objects, work with metadata, perform multipart uploads, and generate pre-signed URLs.

Installation

To install boto3, use pip:

pip install boto3

After installation, the library is ready to use in your Python code.
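
To quickly verify the installation, you can import the library and print its version; the exact number will depend on the release you installed:

import boto3

print(boto3.__version__)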

Configuring Access

When using boto3, access credentials can be configured in several ways.

Using ~/.aws/config and ~/.aws/credentials

If you have configured the AWS CLI (for example, by running aws configure), a .aws directory containing these configuration files already exists in your home directory. If not, you can create the directory and the files manually without installing the AWS CLI.

The ~/.aws/credentials file should contain the following:

[default]
aws_access_key_id = <ACCESS_KEY>
aws_secret_access_key = <SECRET_KEY>

And ~/.aws/config:

[default]
output = json
endpoint_url = https://s3.hmstorage.net
region = us-2

If you use the default profile, no additional configuration is required in the code; boto3 will automatically load these settings.
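
For example, with the default profile in place, a client can be created without any arguments. This is a minimal sketch; if your boto3 version does not read endpoint_url from ~/.aws/config, pass it explicitly as shown in the sections below:

import boto3

# Credentials, region, and endpoint are loaded from the default profile
s3 = boto3.client("s3")

# Quick check: list the buckets available to these credentials
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])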

Using Environment Variables

You can also provide access credentials via environment variables:

export AWS_ACCESS_KEY_ID=<ACCESS_KEY>
export AWS_SECRET_ACCESS_KEY=<SECRET_KEY>
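
With these variables exported, boto3 picks up the keys automatically; the endpoint and region can still be passed when the client is created. A minimal sketch, assuming the variables above are set in the same shell:

import boto3

# Access keys are read from AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.hmstorage.net",
    region_name="us-2"
)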

Passing Credentials in Code

Credentials can be passed directly when creating the S3 client:

import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="<ACCESS_KEY>",
    aws_secret_access_key="<SECRET_KEY>",
    endpoint_url="https://s3.hmstorage.net",
    region_name="us-2"
)

This approach is not recommended for production, as access keys are stored directly in the source code.

Using Named Profiles

If multiple profiles are configured in ~/.aws/config and ~/.aws/credentials, you can explicitly specify the required profile when creating a session:

import boto3

session = boto3.Session(profile_name="myprofile")
s3 = session.client("s3")

This is useful when the same codebase is used to work with multiple buckets or environments.
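
As a sketch, two hypothetical profiles (here named "production" and "staging") could each get their own client within one script:

import boto3

# Profile names are examples; each must exist in ~/.aws/credentials
prod_s3 = boto3.Session(profile_name="production").client("s3")
staging_s3 = boto3.Session(profile_name="staging").client("s3")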

Example

Below is an example demonstrating basic object operations using the SDK. The script performs the following actions:

  • uploads a file to a bucket;
  • lists all objects in the bucket;
  • downloads the uploaded file under a different name;
  • deletes the object from storage.

Before running the script, create a file named example.txt in the same directory as the script.

Example code:

import boto3
import os

bucket_name = "bucket_name"  # replace with the name of your bucket
endpoint_url = "https://s3.hmstorage.net"
local_upload_path = "example.txt"
s3_key = "example.txt"
local_download_path = "downloaded_example.txt"

# Create S3 client
s3 = boto3.client("s3", endpoint_url=endpoint_url)

# 1. Upload file to the bucket
if os.path.exists(local_upload_path):
    print(f"Uploading {local_upload_path} to the bucket...")
    s3.upload_file(local_upload_path, bucket_name, s3_key)
    print("Upload completed.")
else:
    print(f"File {local_upload_path} not found. Skipping upload.")

# 2. List objects in the bucket
print("\nBucket contents:")
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket_name):
    for obj in page.get("Contents", []):
        print(f"- {obj['Key']}")

# 3. Download file from the bucket
print(f"\nDownloading {s3_key} to {local_download_path}...")
s3.download_file(bucket_name, s3_key, local_download_path)
print("Download completed.")

# 4. Delete file from the bucket
print(f"\nDeleting {s3_key} from the bucket...")
s3.delete_object(Bucket=bucket_name, Key=s3_key)
print("Deletion completed.")