In my last article, I talked about how I moved from using Cron on an EC2 instance to download stat files for my hockey pool website to using scheduled Lambda functions.
In this article, I’ll talk about how I removed another Cron job by enabling S3 event notifications that add jobs to the SQS queues I use to parse all the stats.
Enabling S3 event notifications is really simple. Select the S3 bucket that you want to enable notifications on, select “properties” and then expand the “Events” menu option.
Give the event a name and specify the Lambda function you want to invoke when an event occurs. Note that the event type matters: in this case I used “ObjectCreated” so that each time a new stat file was uploaded the function would be fired.
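The console steps above can also be expressed in code. This is a minimal sketch using Boto 3’s `put_bucket_notification_configuration`; the bucket name and Lambda ARN below are placeholders, not the real ones from my setup, and the final call assumes AWS credentials are configured.

```python
def build_notification_config(lambda_arn):
    """Invoke the given Lambda function whenever an object is created."""
    return {
        "LambdaFunctionConfigurations": [
            {
                "Id": "stat-file-uploaded",
                "LambdaFunctionArn": lambda_arn,
                # The event type matters: ObjectCreated fires on each upload.
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    }

# Attaching the notification to the bucket (placeholder names):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_notification_configuration(
#     Bucket="total-stats-bucket",
#     NotificationConfiguration=build_notification_config(
#         "arn:aws:lambda:us-east-1:123456789012:function:parse-total-stats"
#     ),
# )
```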
The Lambda function called by the S3 event is dependent on which S3 bucket is being used to store the stats. In total, I created two Lambda functions, one for the daily stats file, the other for the total stats file. In the screenshot above you can see that it’s calling the Lambda function for the total stats file.
Just like I did for the scheduled download, I copied my existing Python code into the new Lambda functions and updated it to use Boto 3. Each function adds jobs to one or more SQS queues based on which S3 bucket the stat file was stored in.
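A handler along these lines is sketched below. The bucket and queue names are hypothetical (the article doesn’t name them), and the bucket-to-queue mapping is my assumption about how one function per bucket might fan out to its queues; the SQS calls themselves assume the standard Boto 3 API available inside Lambda.

```python
import json

# Hypothetical mapping from stat bucket to the SQS queue(s) that should
# receive a parse job; the real bucket and queue names aren't these.
QUEUES_FOR_BUCKET = {
    "daily-stats-bucket": ["daily-stats-queue"],
    "total-stats-bucket": ["total-stats-queue", "standings-queue"],
}

def jobs_from_event(event):
    """Turn an S3 ObjectCreated event into (queue name, message body) pairs."""
    jobs = []
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        for queue in QUEUES_FOR_BUCKET.get(bucket, []):
            jobs.append((queue, json.dumps({"bucket": bucket, "key": key})))
    return jobs

def handler(event, context):
    import boto3  # available in the Lambda runtime
    sqs = boto3.client("sqs")
    for queue, body in jobs_from_event(event):
        url = sqs.get_queue_url(QueueName=queue)["QueueUrl"]
        sqs.send_message(QueueUrl=url, MessageBody=body)
```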
That’s all there was to it. Using these simple steps I removed another Cron job and replaced it with Lambda functions triggered by an S3 event.
Next up, dealing with the jobs in the SQS queues.
Like what you read? Why not subscribe to the weekly Orbit newsletter and get content before everyone else?