[python] Listing contents of a bucket with boto3

How can I see what's inside a bucket in S3 with boto3? (i.e. do an "ls")?

Doing the following:

import boto3
s3 = boto3.resource('s3')
my_bucket = s3.Bucket('some/path/')

returns:

s3.Bucket(name='some/path/')

How do I see its contents?

This question is related to python amazon-web-services amazon-s3 boto3 boto

The answer is


One way to see the contents would be:

for my_bucket_object in my_bucket.objects.all():
    print(my_bucket_object)

Here is the solution:

import boto3

s3 = boto3.resource('s3')
BUCKET_NAME = 'Your S3 Bucket Name'
allFiles = s3.Bucket(BUCKET_NAME).objects.all()
for file in allFiles:
    print(file.key)

If you want to pass the ACCESS and SECRET keys (which you should not do, because it is not secure):

from boto3.session import Session

ACCESS_KEY='your_access_key'
SECRET_KEY='your_secret_key'

session = Session(aws_access_key_id=ACCESS_KEY,
                  aws_secret_access_key=SECRET_KEY)
s3 = session.resource('s3')
your_bucket = s3.Bucket('your_bucket')

for s3_file in your_bucket.objects.all():
    print(s3_file.key)

I'm assuming you have configured authentication separately.

import boto3
s3 = boto3.resource('s3')

my_bucket = s3.Bucket('bucket_name')

for file in my_bucket.objects.all():
    print(file.key)

A more parsimonious way: rather than iterating through a for loop, you can also just print the original object containing all the files inside your S3 bucket:

from boto3.session import Session

session = Session(aws_access_key_id=aws_access_key_id,
                  aws_secret_access_key=aws_secret_access_key)
s3 = session.resource('s3')
bucket = s3.Bucket('bucket_name')

files_in_s3 = bucket.objects.all()
# you can print this iterable with print(list(files_in_s3))

# To print all filenames in a bucket
import boto3

s3 = boto3.client('s3')

def get_s3_keys(bucket):
    """Get a list of keys in an S3 bucket."""
    keys = []
    resp = s3.list_objects_v2(Bucket=bucket)
    for obj in resp['Contents']:
        keys.append(obj['Key'])
    return keys


filenames = get_s3_keys('your_bucket_name')

print(filenames)

# To print all filenames in a certain directory in a bucket
import boto3

s3 = boto3.client('s3')

def get_s3_keys(bucket, prefix):
    """Get a list of keys under a prefix in an S3 bucket."""
    keys = []
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    for obj in resp['Contents']:
        keys.append(obj['Key'])
        print(obj['Key'])
    return keys


filenames = get_s3_keys('your_bucket_name', 'folder_name/sub_folder_name/')

print(filenames)

One way that I used to do this:

import boto3
s3 = boto3.resource('s3')
bucket=s3.Bucket("bucket_name")
contents = [_.key for _ in bucket.objects.all() if "subfolders/ifany/" in _.key]

I just did it like this, including the authentication method (wrapped in a small helper function here so the return statements are valid):

import boto3

s3_client = boto3.client(
    's3',
    aws_access_key_id='access_key',
    aws_secret_access_key='access_key_secret',
    config=boto3.session.Config(signature_version='s3v4'),
    region_name='region'
)

def key_exists(bucket_name, key):
    response = s3_client.list_objects(Bucket=bucket_name, Prefix=key)
    if 'Contents' in response:
        # Object / key exists!
        return True
    else:
        # Object / key DOES NOT exist!
        return False

So you're asking for the equivalent of aws s3 ls in boto3. This would be listing all the top level folders and files. This is the closest I could get; it only lists all the top level folders. Surprising how difficult such a simple operation is.

import boto3

def s3_ls():
    s3 = boto3.resource('s3')
    bucket = s3.Bucket('example-bucket')
    result = bucket.meta.client.list_objects(Bucket=bucket.name,
                                             Delimiter='/')
    for o in result.get('CommonPrefixes', []):
        print(o.get('Prefix'))

In order to handle large key listings (i.e. when the directory list is greater than 1000 items), I used the following code to accumulate key values (i.e. filenames) with multiple listings (thanks to Amelio above for the first lines). The code is for Python 3 and is wrapped in a function here so the return statements are valid:

from boto3 import client

def get_s3_file_list(bucket_name="my_bucket", prefix="my_key/sub_key/lots_o_files"):
    s3_conn = client('s3')  # type: BaseClient  ## again assumes boto.cfg setup, assume AWS S3
    s3_result = s3_conn.list_objects_v2(Bucket=bucket_name, Prefix=prefix, Delimiter="/")

    if 'Contents' not in s3_result:
        # print(s3_result)
        return []

    file_list = []
    for key in s3_result['Contents']:
        file_list.append(key['Key'])
    print(f"List count = {len(file_list)}")

    while s3_result['IsTruncated']:
        continuation_key = s3_result['NextContinuationToken']
        s3_result = s3_conn.list_objects_v2(Bucket=bucket_name, Prefix=prefix, Delimiter="/", ContinuationToken=continuation_key)
        for key in s3_result['Contents']:
            file_list.append(key['Key'])
        print(f"List count = {len(file_list)}")

    return file_list
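
For example, with the function above (the bucket and prefix defaults are just placeholders to override):

all_keys = get_s3_file_list(bucket_name="my_bucket", prefix="my_key/sub_key/lots_o_files")
print(len(all_keys))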

Here is a simple function that returns the filenames of all files, or only the files with certain extensions such as 'json' or 'jpg':

def get_file_list_s3(bucket, prefix="", file_extension=None):
    """Return the list of all file paths (prefix + file name) with a certain type, or all of them.
    Parameters
    ----------
    bucket: str
        The name of the bucket. For example, if your bucket is "s3://my_bucket" then it should be "my_bucket"
    prefix: str
        The full path to the 'folder' of the files (objects). For example, if your files are in
        s3://my_bucket/recipes/deserts then it should be "recipes/deserts". Default: ""
    file_extension: str
        The type of the files. If you want all, just leave it None. If you only want "json" files then it
        should be "json". Default: None
    Return
    ------
    file_names: list
        The list of file names including the prefix
    """
    import boto3
    s3 = boto3.resource('s3')
    my_bucket = s3.Bucket(bucket)
    file_objs = my_bucket.objects.filter(Prefix=prefix).all()
    file_names = [file_obj.key for file_obj in file_objs
                  if file_extension is None or file_obj.key.split(".")[-1] == file_extension]
    return file_names

ObjectSummary:

There are two identifiers that are attached to the ObjectSummary:

  • bucket_name
  • key

boto3 S3: ObjectSummary

More on Object Keys from AWS S3 Documentation:

Object Keys:

When you create an object, you specify the key name, which uniquely identifies the object in the bucket. For example, in the Amazon S3 console (see AWS Management Console), when you highlight a bucket, a list of objects in your bucket appears. These names are the object keys. The name for a key is a sequence of Unicode characters whose UTF-8 encoding is at most 1024 bytes long.

The Amazon S3 data model is a flat structure: you create a bucket, and the bucket stores objects. There is no hierarchy of subbuckets or subfolders; however, you can infer logical hierarchy using key name prefixes and delimiters as the Amazon S3 console does. The Amazon S3 console supports a concept of folders. Suppose that your bucket (admin-created) has four objects with the following object keys:

Development/Projects1.xls

Finance/statement1.pdf

Private/taxdocument.pdf

s3-dg.pdf

Reference:

AWS S3: Object Keys
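
To make the "folder" idea above concrete, here is a minimal sketch (assuming a hypothetical bucket containing exactly the four example keys listed above) that uses the '/' delimiter to list the top-level prefixes and root-level objects:

import boto3

s3_client = boto3.client('s3')

# List only the top level of the bucket by treating '/' as a delimiter.
response = s3_client.list_objects_v2(Bucket='your_bucket_name', Delimiter='/')

# "Folders" inferred from key prefixes: Development/, Finance/, Private/
for prefix in response.get('CommonPrefixes', []):
    print(prefix['Prefix'])

# Objects at the root of the bucket: s3-dg.pdf
for obj in response.get('Contents', []):
    print(obj['Key'])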

Here is some example code that demonstrates how to get the bucket name and the object key.

Example:

import boto3
from pprint import pprint

def main():

    def enumerate_s3():
        s3 = boto3.resource('s3')
        for bucket in s3.buckets.all():
            print("Name: {}".format(bucket.name))
            print("Creation Date: {}".format(bucket.creation_date))
            for obj in bucket.objects.all():
                print("Object: {}".format(obj))
                print("Object bucket_name: {}".format(obj.bucket_name))
                print("Object key: {}".format(obj.key))

    enumerate_s3()


if __name__ == '__main__':
    main()

With a little modification to @Hephaestus's code in one of the above answers, I wrote the method below to list folders and objects (files) in a given path. It works similarly to the s3 ls command.

import boto3

def s3_ls(profile=None, bucket_name=None, folder_path=None):
    folders = []
    files = []
    result = dict()
    prefix = folder_path or ""
    session = boto3.Session(profile_name=profile)
    s3_conn = session.client('s3')
    s3_result = s3_conn.list_objects_v2(Bucket=bucket_name, Delimiter="/", Prefix=prefix)
    if 'Contents' not in s3_result and 'CommonPrefixes' not in s3_result:
        return []

    if s3_result.get('CommonPrefixes'):
        for folder in s3_result['CommonPrefixes']:
            folders.append(folder.get('Prefix'))

    if s3_result.get('Contents'):
        for key in s3_result['Contents']:
            files.append(key['Key'])

    while s3_result['IsTruncated']:
        continuation_key = s3_result['NextContinuationToken']
        s3_result = s3_conn.list_objects_v2(Bucket=bucket_name, Delimiter="/", ContinuationToken=continuation_key, Prefix=prefix)
        if s3_result.get('CommonPrefixes'):
            for folder in s3_result['CommonPrefixes']:
                folders.append(folder.get('Prefix'))
        if s3_result.get('Contents'):
            for key in s3_result['Contents']:
                files.append(key['Key'])

    if folders:
        result['folders']=sorted(folders)
    if files:
        result['files']=sorted(files)
    return result

This lists all objects/folders in a given path. folder_path can be left as None by default, and the method will list the immediate contents of the root of the bucket.
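
For example, a call like the following (with a hypothetical profile, bucket, and folder) returns a dict with 'folders' and 'files' keys:

result = s3_ls(profile='my-profile', bucket_name='my-bucket', folder_path='recipes/')
print(result)  # e.g. {'folders': [...], 'files': [...]}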


This is similar to an 'ls' but it does not take into account the prefix folder convention and will list the objects in the bucket. It's left up to the reader to filter out prefixes which are part of the Key name.

In Python 2:

from boto.s3.connection import S3Connection

conn = S3Connection() # assumes boto.cfg setup
bucket = conn.get_bucket('bucket_name')
for obj in bucket.get_all_keys():
    print(obj.key)

In Python 3:

from boto3 import client

conn = client('s3')  # again assumes boto.cfg setup, assume AWS S3
for key in conn.list_objects(Bucket='bucket_name')['Contents']:
    print(key['Key'])
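
If you do want to filter out the prefix "folders" yourself (as the note above leaves to the reader), a minimal sketch is to split each key on the delimiter and keep only the last part:

from boto3 import client

conn = client('s3')  # again assumes credentials are configured separately
for obj in conn.list_objects(Bucket='bucket_name')['Contents']:
    # e.g. 'Finance/statement1.pdf' -> 'statement1.pdf'
    print(obj['Key'].rsplit('/', 1)[-1])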

My s3 keys utility function is essentially an optimized version of @Hephaestus's answer:

import boto3


s3_paginator = boto3.client('s3').get_paginator('list_objects_v2')


def keys(bucket_name, prefix='/', delimiter='/', start_after=''):
    prefix = prefix[1:] if prefix.startswith(delimiter) else prefix
    start_after = (start_after or prefix) if prefix.endswith(delimiter) else start_after
    for page in s3_paginator.paginate(Bucket=bucket_name, Prefix=prefix, StartAfter=start_after):
        for content in page.get('Contents', ()):
            yield content['Key']

In my tests (boto3 1.9.84), it's significantly faster than the equivalent (but simpler) code:

import boto3


def keys(bucket_name, prefix='/', delimiter='/'):
    prefix = prefix[1:] if prefix.startswith(delimiter) else prefix
    bucket = boto3.resource('s3').Bucket(bucket_name)
    return (_.key for _ in bucket.objects.filter(Prefix=prefix))

As S3 guarantees UTF-8 binary sorted results, a start_after optimization has been added to the first function.
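
For example (with a hypothetical bucket and prefix), the keys generator above can be consumed like this:

for key in keys('my-bucket', prefix='some/prefix/'):
    print(key)

# or materialized into a list
all_keys = list(keys('my-bucket'))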


It can also be done as follows:

import boto3

s3 = boto3.client('s3')
csv_files = s3.list_objects_v2(Bucket='bucket_name')
for obj in csv_files['Contents']:
    print(obj['Key'])
