A Fitbit Low Battery SMS Notification using AWS – Part 1

A few weeks ago, my Fitbit (one of my new favorite devices) died.

Not died, as in bricked; my battery died.

That was a bit surprising, as I am signed up for low battery notifications. Turns out, I did get a notice – in an email. The problem was, I wasn't checking my email that day.

The next day, I went back to the Fitbit website to sign up for SMS notifications; unfortunately, Fitbit doesn't provide a low battery notification via SMS.

So, I built my own, using a handful of AWS services (Lambda, SNS, API Gateway, and DynamoDB) and exposed Fitbit APIs.

So far, it works well. After subscribing to the service, I got an introductory note telling me my battery status. A few hours later, I received a second notice – letting me know that my battery was low. Continue reading “A Fitbit Low Battery SMS Notification using AWS – Part 1”
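The core of the Lambda side can be sketched roughly as follows. The Fitbit devices endpoint is the real API, but the 25% threshold, the event fields, and the way the SNS topic ARN is passed in are my assumptions for illustration, not the actual implementation:

```python
import json
import urllib.request

LOW_BATTERY_THRESHOLD = 25  # percent; an assumed cutoff, not a Fitbit default

def battery_message(level):
    """Return an SMS body if the battery is low, else None."""
    if level <= LOW_BATTERY_THRESHOLD:
        return f"Fitbit battery low: {level}% remaining - time to charge!"
    return None

def get_battery_level(access_token):
    """Call Fitbit's devices endpoint (requires an OAuth2 access token)."""
    req = urllib.request.Request(
        "https://api.fitbit.com/1/user/-/devices.json",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        devices = json.load(resp)
    # "batteryLevel" is the numeric percentage on each device record
    return int(devices[0]["batteryLevel"])

def lambda_handler(event, context):
    # boto3 ships with the Lambda runtime; imported lazily here
    import boto3
    level = get_battery_level(event["access_token"])
    msg = battery_message(level)
    if msg:
        boto3.client("sns").publish(TopicArn=event["topic_arn"], Message=msg)
    return {"battery": level, "notified": msg is not None}
```

In the actual service, DynamoDB would hold the subscriber's phone number and tokens, and API Gateway would front the subscribe/unsubscribe calls.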

Optimizing Costs: AWS Object Storage Tiers and Lifecycle rules

I’m currently using the AWS free tier, so I’m allowed 5 GB of standard storage for the next 12 months. I’m not close to my limit, but at my rate of usage, I’ll reach that limit in the latter part of 2016. To reduce my costs, I’ll move my hosted mp3 files to a lower tier of storage (for those that haven’t been keeping track, I’m in the process of porting my old blog to AWS).

Currently, AWS offers four (really three, since S3 – Standard and S3 – Reduced Redundancy Storage are kind of the same) different tiers of object storage:

  • S3 – Standard
  • S3 – Reduced Redundancy Storage
  • S3 – Standard Infrequent Access
  • Glacier

For my purposes – hosting a low-traffic, mostly static website – S3 – IA is a perfectly suitable choice. Latency and throughput are the same as Standard, and it comes with a lower per-GB storage price (offset by a per-GB retrieval fee). The trade-off: lower availability (99.9% – still more than good enough for a personal blog).

I’m keeping my files on Standard for now, but will implement S3 Object Lifecycle rules in the next year (before I have to start paying for my storage). It’s pretty simple to implement – you just use the console to create a simple rule (designate a folder and a duration before migrating to S3 – IA).
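The console rule has a direct API equivalent. Here's a minimal boto3 sketch of the same thing – the bucket name and `media/` prefix are placeholders, not my actual setup:

```python
def lifecycle_rule(prefix, days, storage_class="STANDARD_IA"):
    """Build one lifecycle rule: transition objects under `prefix`
    to a cheaper storage tier after `days` days."""
    return {
        "ID": f"to-{storage_class.lower()}-{prefix.rstrip('/')}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [{"Days": days, "StorageClass": storage_class}],
    }

def apply_rules(bucket, rules):
    # lazy import: needs AWS credentials, only at actual runtime
    import boto3
    boto3.client("s3").put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration={"Rules": rules},
    )

# e.g. apply_rules("my-blog-bucket", [lifecycle_rule("media/", 30)])
```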


I don’t like that you can’t specify files by suffix (mp3) in the Lifecycle rules GUI – but I’m sure a simple script could be written to move only the mp3 files. And yes, I could put the files on S3 – IA when I first create them, but I think 30–60 days on Standard storage makes sense before moving the objects to a lower storage tier.
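A sketch of such a script, assuming the copy-in-place approach (re-copying an object over itself with a new storage class is how you change its tier outside of lifecycle rules):

```python
def mp3_keys(keys):
    """Pick out only the .mp3 object keys (case-insensitive)."""
    return [k for k in keys if k.lower().endswith(".mp3")]

def move_mp3s_to_ia(bucket):
    # lazy import: needs AWS credentials, only at actual runtime
    import boto3
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        keys = (obj["Key"] for obj in page.get("Contents", []))
        for key in mp3_keys(keys):
            # copy the object onto itself, changing only the storage class
            s3.copy_object(
                Bucket=bucket, Key=key,
                CopySource={"Bucket": bucket, "Key": key},
                StorageClass="STANDARD_IA",
                MetadataDirective="COPY",
            )
```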

Final note – S3 – RRS is a valid option, too. It costs less than Standard (2.4 cents/GB versus 3 cents/GB [US East pricing]) and comes with 99.99% availability. My only problem is that it is less durable than Standard and IA… so for my purposes, I’m OK sacrificing availability (rather than durability) for a lower cost.

Using Amazon Elastic Transcoder

I’m a big fan of music, and I manage my 5,000 songs using iTunes. iTunes has its share of problems, but overall it’s a pretty good media management application – especially if you’ve bought into the Apple ecosystem (which I have).

My problem is that most of my files are in the m4a format. That’s fine for my personal use, but it’s a problem when I want to share a file with someone else (who may not be part of the Apple ecosystem). iTunes allows you to convert a music file into mp3 format, but you have to remember to remove the duplicate file. Again, this really isn’t a problem… but there’s got to be a better way to do batch conversions of files.

This became a bigger issue when I started re-publishing old posts to my blog. Many of those posts have links to since-expired media files – which means I need to replace them with links to valid files. I’m using the “WP Offload S3” plugin – which uploads my media to an S3 bucket. I’m also using the “zbPlayer” plugin to make my files playable within the post. The problem is: zbPlayer only works with mp3 files.
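Batch m4a-to-mp3 conversion with Elastic Transcoder boils down to submitting one job per file against a pipeline. A minimal boto3 sketch – the pipeline and preset IDs are placeholders you would pull from the Elastic Transcoder console:

```python
def transcode_jobs(keys, output_ext=".mp3"):
    """Pair each input key with its mp3 output key."""
    return [(k, k.rsplit(".", 1)[0] + output_ext) for k in keys]

def submit_jobs(pipeline_id, preset_id, keys):
    # lazy import: needs AWS credentials, only at actual runtime
    import boto3
    et = boto3.client("elastictranscoder")
    for src, dst in transcode_jobs(keys):
        et.create_job(
            PipelineId=pipeline_id,   # pipeline maps input/output buckets
            Input={"Key": src},
            Output={"Key": dst, "PresetId": preset_id},  # an mp3 preset
        )
```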

Continue reading “Using Amazon Elastic Transcoder”