Compressing Python dictionary objects before storing them as JSON files in S3.

Here’s a quick script I wrote because I needed to test uploading files to S3. In this case the generated file is 78 bytes compressed, and 170 bytes when unzipped. I wrote this because I have to upload large amounts of JSON data to S3, and saving space there adds up to significant savings. Here is the code:
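A minimal sketch of the approach, assuming boto3 and configured AWS credentials; the function names (`compress_json`, `upload_compressed`) and the bucket/key values are placeholders, not part of any particular library:

```python
import gzip
import json


def compress_json(obj):
    """Serialize a dict to JSON and gzip-compress it, returning bytes."""
    return gzip.compress(json.dumps(obj).encode("utf-8"))


def upload_compressed(obj, bucket, key):
    """Upload the gzip-compressed JSON to s3://<bucket>/<key>.

    boto3 is imported lazily so the compression helper above can be
    used without AWS dependencies installed.
    """
    import boto3  # assumes boto3 is installed and credentials are configured

    s3 = boto3.client("s3")
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=compress_json(obj),
        ContentType="application/json",
        ContentEncoding="gzip",
    )


if __name__ == "__main__":
    payload = {"id": 1, "message": "hello s3"}
    upload_compressed(payload, "<your datalake bucket>", "test/test.gz")
```

Setting `ContentEncoding="gzip"` lets clients that honor the header (browsers, some SDK wrappers) decompress transparently on download.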

This code takes a Python dictionary, compresses it, and uploads it to S3 at s3://&lt;your datalake bucket&gt;/test/test.gz.
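Reading the object back is the reverse of the upload: fetch the bytes, gunzip, then parse the JSON. A small sketch, again assuming boto3; `decode_gz_json` and `read_compressed_json` are hypothetical helper names:

```python
import gzip
import json


def decode_gz_json(raw):
    """Decompress gzipped bytes and parse the JSON payload into a dict."""
    return json.loads(gzip.decompress(raw).decode("utf-8"))


def read_compressed_json(bucket, key):
    """Fetch a gzipped JSON object from S3 and return the decoded dict.

    Assumes boto3 is installed and AWS credentials are configured.
    """
    import boto3

    body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"].read()
    return decode_gz_json(body)
```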