AWS S3 Bucket’s Necessary Functions.

Utshab Kumar Ghosh
Jan 21, 2021

Software engineers, machine learning practitioners, and developers at tech companies often need the Amazon Web Services (AWS) S3 bucket to store data. If you are brand new to S3, see the getting-started guide in the references below.

However, today’s content is not about creating buckets or other such basics. Beginners working with buckets run into a lot of small issues, and since time is valuable, I have collected some functions for accessing a bucket, uploading and downloading files, and a few related operations. So let’s dive in!

First, import the necessary libraries:

import logging
import boto3
from botocore.exceptions import ClientError

Functions:

1. Create client objects.

# Creating the low-level, client-style interface
def create_client(client_object, aws_access_key_id, aws_secret_access_key, region_name):
    client = boto3.client(
        client_object,
        aws_access_key_id=aws_access_key_id,
        aws_secret_access_key=aws_secret_access_key,
        region_name=region_name
    )
    return client

# client_object = 's3' for S3 Bucket.

2. Create resource objects.

# Creating the high-level, object-oriented interface
def create_resource(resource_object, aws_access_key_id, aws_secret_access_key, region_name):
    resource = boto3.resource(
        resource_object,
        aws_access_key_id=aws_access_key_id,
        aws_secret_access_key=aws_secret_access_key,
        region_name=region_name
    )
    return resource

3. Search in the bucket.

def bucket_search(client_name, bucket, folder_name):
    # Note: list_objects returns at most 1,000 keys per call
    contents = client_name.list_objects(Bucket=bucket).get('Contents')
    if contents is None:  # an empty bucket has no 'Contents' key
        return False
    for p in contents:
        if p['Key'] == folder_name:
            return True
    return False

4. Create a Folder in the bucket.

def bucket_create_folder(client_name, bucket, folder_name):
    if not bucket_search(client_name, bucket, folder_name + '/'):
        client_name.put_object(Bucket=bucket, Key=(folder_name + '/'))
        return True
    return False

5. Upload Files to the bucket.

def bucket_upload_file(client_name, filepath, bucket, object_name=None):
    """Upload a file to an S3 bucket.

    :param filepath: Path of the file to upload
    :param bucket: Bucket to upload to
    :param object_name: S3 object name. If not specified, filepath is used.
    :return: True if the file was uploaded, else False
    """
    # If the S3 object_name was not specified, use the file path
    if object_name is None:
        object_name = filepath
    try:
        client_name.upload_file(filepath, bucket, object_name)
    except ClientError as e:
        logging.error(e)
        return False
    return True

6. Download files from the bucket.

def bucket_download_file(client_name, bucket, filepath, download_name):
    if bucket_search(client_name, bucket, filepath):
        client_name.download_file(bucket, filepath, download_name)
        return True
    else:
        return False

7. Generate the URL of any file or folder of the bucket.

def bucket_url_generator(client_name, bucket, filepath):
    if bucket_search(client_name, bucket, filepath):
        url = client_name.generate_presigned_url(
            ClientMethod='get_object',
            Params={
                'Bucket': bucket,
                'Key': filepath
            }
        )
        return url
    else:
        return False

8. Delete files.

def bucket_delete(resource_name, bucket, filepath):
    bucket = resource_name.Bucket(bucket)
    # Delete every object whose key starts with filepath
    for obj in bucket.objects.filter(Prefix=filepath):
        resource_name.Object(bucket.name, obj.key).delete()

9. List the bucket names of your AWS account and their contents.

def bucket_names_and_contents(client_name):
    clientResponse = client_name.list_buckets()
    # Print the bucket names one by one
    print('Printing bucket names...')
    for bucket in clientResponse['Buckets']:
        print(f'Bucket Name: {bucket["Name"]}')
        # Contents of each bucket
        print(client_name.list_objects(Bucket=bucket['Name']).get('Contents'))

So these are the snippets I use most. Please share anything more interesting or useful that you know, and don't hesitate to let me know if something isn't working or to point out my mistakes. Be careful with the indentation when copying the code. Tada!

References:

  1. https://boto3.amazonaws.com/v1/documentation/api/latest/index.html
  2. https://pypi.org/project/boto3/
  3. https://www.sqlshack.com/getting-started-with-amazon-s3-and-python/
  4. Image is taken from: https://aws.amazon.com/blogs/storage/amazon-s3-consistently-raises-the-bar-in-data-security/
