Store a compressed file in an AWS S3 bucket using Python

Charith Prasanna
1 min read · Nov 2, 2021

AWS S3 stands for Amazon Simple Storage Service. Programmers can use Amazon S3 to store and retrieve any amount of data at any time. As a new member, Amazon's free tier gives you 5 GB of free storage, 2,000 PUT requests, 20,000 GET requests and 15 GB of data transfer.

Here we discuss how to compress an object using Python and put it into S3. Why do we need to compress objects before storing them in the S3 bucket? If we use a service like CloudFront to access objects in the S3 bucket, the payload cannot be larger than about 10 MB. Because of that, it is better to compress the object before putting it into S3.

How do we do this in Python?

import boto3 -> we need this library to access AWS resources.

s3client = boto3.client('s3') -> builds the S3 client.
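
Here is a minimal sketch of the whole flow. The bucket name and object key below are placeholders for illustration; the standard-library gzip module compresses the payload in memory before the upload.

import gzip
import json

import boto3

# Create the S3 client (credentials are resolved the usual boto3 way:
# environment variables, ~/.aws/credentials, or an IAM role).
s3client = boto3.client('s3')

# Example payload to store; replace with your own data.
data = json.dumps({"message": "hello from S3"}).encode('utf-8')

# Compress the payload in memory with gzip before uploading.
compressed_data = gzip.compress(data)

s3client.put_object(
    Bucket='my-example-bucket',      # placeholder: your bucket name
    Key='data/message.json.gz',      # placeholder: your object key
    Body=compressed_data,
    ContentType='application/json',  # set according to what you store
    ContentEncoding='gzip',          # tells clients the body is gzip-compressed
)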

In the put_object call above, ContentType needs to be defined correctly according to what you are going to store.

ContentEncoding is the important part: it must be defined as 'gzip'.
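
When the object is served through CloudFront or a browser, the ContentEncoding header usually lets the client decompress it automatically. Reading it back with boto3, however, returns the raw compressed bytes, so you decompress them yourself. A quick check, using the same placeholder bucket and key:

import gzip
import json

import boto3

s3client = boto3.client('s3')

# Fetch the compressed object and decompress it in memory.
response = s3client.get_object(Bucket='my-example-bucket', Key='data/message.json.gz')
compressed_body = response['Body'].read()
original = json.loads(gzip.decompress(compressed_body))

print(original)  # {'message': 'hello from S3'}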

I hope this article helps anyone interested in compressing objects and storing them in S3.


Charith Prasanna

Software Engineer | University Of Moratuwa | Intervest Software Technologies | Full Stack Developer