[python] How to write a file or data to an S3 object using boto3

In boto 2, you can write to an S3 object using methods such as Key.set_contents_from_string(), Key.set_contents_from_file(), Key.set_contents_from_filename(), and Key.set_contents_from_stream().

Is there a boto 3 equivalent? What is the boto3 method for saving data to an object stored on S3?

This question is related to: python, amazon-web-services, amazon-s3, boto, boto3

The answer is


You can use the code below to write, for example, an image to S3 (as of 2019). To be able to connect to S3, install the AWS CLI with pip install awscli, then enter your credentials with aws configure:

import boto3
import urllib3
import uuid
from pathlib import Path
from io import BytesIO
from errors import custom_exceptions as cex  # project-specific exception classes

BUCKET_NAME = "xxx.yyy.zzz"
POSTERS_BASE_PATH = "assets/wallcontent"
CLOUDFRONT_BASE_URL = "https://xxx.cloudfront.net/"


class S3(object):
    def __init__(self):
        self.client = boto3.client('s3')
        self.bucket_name = BUCKET_NAME
        self.posters_base_path = POSTERS_BASE_PATH

    def __download_image(self, url):
        manager = urllib3.PoolManager()
        try:
            res = manager.request('GET', url)
        except Exception:
            print("Could not download the image from URL: ", url)
            raise cex.ImageDownloadFailed
        return BytesIO(res.data)  # any file-like object that implements read()

    def upload_image(self, url):
        try:
            image_file = self.__download_image(url)
        except cex.ImageDownloadFailed:
            raise cex.ImageUploadFailed

        extension = Path(url).suffix
        id = uuid.uuid1().hex + extension
        final_path = self.posters_base_path + "/" + id
        try:
            self.client.upload_fileobj(image_file,
                                       self.bucket_name,
                                       final_path
                                       )
        except Exception:
            print("Image Upload Error for URL: ", url)
            raise cex.ImageUploadFailed

        return CLOUDFRONT_BASE_URL + id
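
For completeness, here is a minimal usage sketch of the class above; the image URL below is only a placeholder:

uploader = S3()
cdn_url = uploader.upload_image("https://example.com/poster.jpg")  # placeholder URL
print(cdn_url)  # e.g. https://xxx.cloudfront.net/<uuid>.jpg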

You no longer have to convert the contents to binary before writing to the file in S3. The following example creates a new text file (called newfile.txt) in an S3 bucket with string contents:

import boto3

s3 = boto3.resource(
    's3',
    region_name='us-east-1',
    aws_access_key_id=KEY_ID,
    aws_secret_access_key=ACCESS_KEY
)
content = "String content to write to a new S3 file"
s3.Object('my-bucket-name', 'newfile.txt').put(Body=content)
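
If you prefer the lower-level client API, put_object is the equivalent call; a short sketch using the same placeholder bucket and key:

import boto3

s3_client = boto3.client('s3')
s3_client.put_object(
    Bucket='my-bucket-name',                          # placeholder bucket name
    Key='newfile.txt',
    Body='String content to write to a new S3 file'   # str or bytes both work
)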

boto3 also has a method for uploading a file directly:

s3 = boto3.resource('s3')
s3.Bucket('bucketname').upload_file('/local/file/here.txt', 'folder/sub/path/to/s3key')

http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Bucket.upload_file
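
The client object exposes the same upload helpers; a short sketch where the local path, bucket and key are placeholders:

import boto3

s3_client = boto3.client('s3')

# Upload a local file by path
s3_client.upload_file('/local/file/here.txt', 'bucketname', 'folder/sub/path/to/s3key')

# Or upload any binary file-like object
with open('/local/file/here.txt', 'rb') as f:
    s3_client.upload_fileobj(f, 'bucketname', 'folder/sub/path/to/s3key')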


A cleaner, more concise version, which I use to upload files on the fly to a given S3 bucket and sub-folder:

import boto3

BUCKET_NAME = 'sample_bucket_name'
PREFIX = 'sub-folder/'

s3 = boto3.resource('s3')

# Creating an empty file called "_DONE" and putting it in the S3 bucket
s3.Object(BUCKET_NAME, PREFIX + '_DONE').put(Body="")

Note: You should ALWAYS put your AWS credentials (aws_access_key_id and aws_secret_access_key) in a separate file, for example ~/.aws/credentials.
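
For example, a named profile in ~/.aws/credentials can be picked up with a boto3 Session; a minimal sketch with placeholder values:

import boto3

# ~/.aws/credentials (placeholder values):
# [my-profile]
# aws_access_key_id = AKIA...
# aws_secret_access_key = ...

session = boto3.Session(profile_name='my-profile')  # hypothetical profile name
s3 = session.resource('s3')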


It is worth mentioning smart-open, which uses boto3 as a back-end.

smart-open is a drop-in replacement for Python's open() that can open files from S3, as well as FTP, HTTP and many other protocols.

For example:

from smart_open import open
import json
with open("s3://your_bucket/your_key.json", 'r') as f:
    data = json.load(f)

The AWS credentials are loaded via boto3's credential chain, usually from a file in the ~/.aws/ directory or from environment variables.
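
smart-open can write to S3 in the same way; a short sketch, assuming the same placeholder bucket and key:

from smart_open import open
import json

data = {"test": 0}
with open("s3://your_bucket/your_key.json", 'w') as f:
    json.dump(data, f)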


Here's a nice trick to read and write JSON on S3:

import json, boto3
s3 = boto3.resource("s3").Bucket("bucket")
json.load_s3 = lambda f: json.load(s3.Object(key=f).get()["Body"])
json.dump_s3 = lambda obj, f: s3.Object(key=f).put(Body=json.dumps(obj))

Now you can use json.load_s3 and json.dump_s3 with the same API as json.load and json.dump:

data = {"test":0}
json.dump_s3(data, "key") # saves json to s3://bucket/key
data = json.load_s3("key") # read json from s3://bucket/key
