
boto3

You must generate an Access Key before getting started. All examples use access_key_id and access_key_secret variables, which represent the Access Key ID and Secret Access Key values you generated.


You must configure boto3 to use your account-specific R2 endpoint_url value. This can be done through any boto3 usage that accepts connection arguments; for example:

import boto3

s3 = boto3.resource('s3',
    endpoint_url = 'https://<accountid>.r2.cloudflarestorage.com',
    aws_access_key_id = '<access_key_id>',
    aws_secret_access_key = '<access_key_secret>'
)

You may, however, omit the aws_access_key_id and aws_secret_access_key arguments and allow boto3 to rely on the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables instead.
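As a rough sketch of that pattern, the client below passes no credentials and lets boto3 read them from the environment (the exported values shown in the comments are placeholders, not real keys):

import boto3

# Assumes the shell has already exported the credentials, for example:
#   export AWS_ACCESS_KEY_ID=<access_key_id>
#   export AWS_SECRET_ACCESS_KEY=<access_key_secret>
s3 = boto3.resource('s3',
    endpoint_url = 'https://<accountid>.r2.cloudflarestorage.com'
)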

An example script may look like the following:

import io

import boto3

s3 = boto3.client(
    service_name="s3",
    endpoint_url = 'https://<accountid>.r2.cloudflarestorage.com',
    aws_access_key_id = '<access_key_id>',
    aws_secret_access_key = '<access_key_secret>',
    region_name="<location>", # Must be one of: wnam, enam, weur, eeur, apac, auto
)

# Get object information
object_information = s3.head_object(Bucket="<R2_BUCKET_NAME>", Key="<FILE_KEY_NAME>")

# Upload/Update single file; file_content holds the bytes to upload
s3.upload_fileobj(io.BytesIO(file_content), "<R2_BUCKET_NAME>", "<FILE_KEY_NAME>")

# Delete object
s3.delete_object(Bucket="<R2_BUCKET_NAME>", Key="<FILE_KEY_NAME>")
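The terminal output below lists buckets and their objects. A minimal sketch of code that would produce output in that shape, appended to main.py and reusing the s3 client created above (the bucket placeholder is an assumption), might be:

# List buckets in the account
print("Buckets:")
for bucket in s3.list_buckets()["Buckets"]:
    print(f"- {bucket['Name']}")

# List objects in one bucket
print("Objects:")
for obj in s3.list_objects_v2(Bucket="<R2_BUCKET_NAME>").get("Contents", []):
    print(f"- {obj['Key']}")

Running the script then prints something like: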
python main.py
Buckets:
- user-uploads
- my-bucket-name
Objects:
- cat.png
- todos.txt

Use SHA-1/SHA-256 checksum algorithms

You can also use the SHA-1 and SHA-256 algorithms to attach checksums to uploaded objects.

import boto3
import hashlib
import base64

def calculate_sha1_base64(data):
    """Calculate SHA-1 hash and return base64 encoded string."""
    sha1_hash = hashlib.sha1(data).digest()
    return base64.b64encode(sha1_hash).decode('utf-8')

def calculate_sha256_base64(data):
    """Calculate SHA-256 hash and return base64 encoded string."""
    sha256_hash = hashlib.sha256(data).digest()
    return base64.b64encode(sha256_hash).decode('utf-8')

s3 = boto3.client(
    service_name="s3",
    endpoint_url = 'https://<accountid>.r2.cloudflarestorage.com',
    aws_access_key_id = '<access_key_id>',
    aws_secret_access_key = '<access_key_secret>',
    region_name="<location>", # Must be one of: wnam, enam, weur, eeur, apac, auto
)

# file_data holds the bytes to upload; bucket_name is the target R2 bucket

# Calculate SHA-1
sha1 = calculate_sha1_base64(file_data)

# Upload file to R2 with SHA-1 checksum
response = s3.put_object(
    Body=file_data,
    Bucket=bucket_name,
    Key='sha1.txt',
    ChecksumSHA1=sha1)

# Calculate SHA-256
sha256 = calculate_sha256_base64(file_data)

# Upload file to R2 with SHA-256 checksum
response = s3.put_object(
    Body=file_data,
    Bucket=bucket_name,
    Key='sha256.txt',
    ChecksumSHA256=sha256)
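To confirm an upload was not corrupted, you can download the object and recompute the checksum locally. This is a minimal sketch that reuses the s3 client, bucket_name, and helper function defined above:

# Download the object just uploaded with a SHA-256 checksum
downloaded = s3.get_object(Bucket=bucket_name, Key='sha256.txt')
body = downloaded['Body'].read()

# Recompute the SHA-256 locally and compare it with the value sent at upload time
assert calculate_sha256_base64(body) == sha256, "SHA-256 mismatch"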