Wednesday, January 29, 2020

What Should a Runner Eat? A Rock Climber's Answer

What should a runner eat? The Internet, you won't be surprised to learn, has every conceivable answer: from eat whatever you want, to eat nothing at all (aka, intermittent fasting).

My latest take on diet is encapsulated by one of Dave MacLeod's vlogs*: 20 years of depression resolved.

On the surface, the very premise of this video seems like a click-bait mess. Dave's telling the world that he cured himself of depression by making one change: going keto! So there you go: if you want to cure your depression, stop eating carbs and the rest will take care of itself.

Thankfully, that message is the furthest thing from Dave's point. Yes, it's true he changed to a ketogenic diet and it appears to have positively impacted his mental health. But Dave goes to great lengths to make it clear that he lacks the evidence to definitively link the two. Also, Dave's choice of a ketogenic diet is optimized for his sport: rock climbing. His goal in starting a low carb diet was to cut his body fat percentage, which on its surface makes sense. Having less fat to haul up a rock face, and more muscle to help you do so, is an obvious advantage.

I'm not wrestling with depression, nor am I interested in optimizing my body to be in climbing shape. Yet, I still find quite a bit of wisdom in Dave's video. My main takeaway: the answer to what should a runner eat? is to turn to science.

I mean 'science' in at least two senses. First, Dave promotes the idea of running experiments on yourself. This, of course, is at the heart of science. Make a change, observe the outcome, document the results, repeat.

For Dave, the experiment was following a ketogenic diet. For me, it's been doing things like focusing on consuming more complex carbs, varying meal times before a run and starting the day with a protein shake. Some of these experiments have had promising results (carbs = good!), others not so much (I'm looking at you, morning protein shake). I strive not to think of these experiments in terms of success or failure; just as more data.

Of course, you need not rely solely on self-experimentation. That's the other way science saves the day: with a bit of digging, you can find papers and other resources that have tackled topics you're interested in. As a runner, what role do carbs play? A huge one. Which is better for runners, a low or high GI diet? From a performance perspective, it doesn't matter. Should I be avoiding gluten? No. Of course, no one study should be taken as gospel. But they're a far better option than taking some YouTuber's word for it. In short: look for evidence, make informed decisions.

I've been collecting notable articles and papers here:

It's been empowering to look at diet as a journey, rather than a set of must-do rules.

So what should a runner eat? I'm still not sure, but I'm having fun figuring it out.

Check out Dave's video for inspiration:


*Side note: Dave's vlogs are terrific. He frequently takes questions that you'd expect a flippant response to, and instead provides a meaningful and insightful answer. I'm not a rock climber, but I love all the wisdom he drops.

Tuesday, January 28, 2020

Google Photos API Giveth, and Then Promptly Taketh Away

Last week I was psyched: the Google Photos API had added the functionality I needed to allow for command line album management. No longer would I clumsily create and add photos to albums in a web interface. No, I'd only have to kick off a single shell script that would do all the work for me.

Using the API, I'd built out support for querying albums and images. And this week I added support for creating albums. It was all going so very well:

# Make a new album
$ gphoto_tools -a album-new -n "gphoto_tool test" 
AHnaBgvE-lqCduDAiPiMmDEIKyeEwcRALqCoK8bn-oOdZFMju06qrT8VbopDIGs-9Wh0XAmZs8ei

# Add heaps of photos to it
$ gphoto_tools -a media-list | sed 's/|.*//' | \
  gphoto_tools -a album-add -i 'AHnaBgvE-lqCduDAiPiMmDEIKyeEwcRALqCoK8bn-oOdZFMju06qrT8VbopDIGs-9Wh0XAmZs8ei'

Except, the above code doesn't work. It gives me a 400 error:

{
  "error": {
    "code": 400,
    "message": "Request contains an invalid media item id.",
    "status": "INVALID_ARGUMENT"
  }
}

I checked and re-checked the album and media IDs. Everything seemed right. I then tried using the API Explorer in the docs and got the same error. Uh, oh.

A bit of searching explained the problem. The Manage Albums Developer Guide includes this fine print (emphasis added):

Note that you can only add media items that have been uploaded by your application to albums that your application has created. Media items must also be in the user's library. For albums that are shared, they must either be owned by the user or the user must be a collaborator who has already joined the album.

In other words: adding previously uploaded photos to an album created by the API is a no-no. Why, Google?! Why?!

I suppose this is some sort of security protection, but I don't get it. I'm all for sandboxing apps to protect users, but this seems to go too far. Effectively, you can write an app that uploads and organizes photos, but you can't write an app whose primary job is just to organize. This seems short-sighted, especially because album creation and organization are totally separate from the act of uploading photos.

And really Google, you can't include this critical limitation in the API reference pages?

Below is the latest version of the gphoto_tools script I've been working on. It includes the new operations album-new and album-add, though, for the reasons above, album-add doesn't actually work.

Ugh. You're killing me Google Photos API, you're killing me.

#!/bin/bash

##
## command line tool for working with the google photos API.
## https://developers.google.com/photos/library/guides/overview
##

API_BASE=https://photoslibrary.googleapis.com/v1
AUTH_TOKEN=`gphoto_auth token`
ID_COL=.id

function usage {
  echo -n "Usage: `basename $0` "
  echo -n "-a {album-list|album-get|media-list|media-get|album-new|album-add|album-rm} [-q json-query] [-i id] [-u] [-n name]"
  echo ""
  exit
}

while getopts ":a:q:i:n:vu" opt; do
  case $opt in
    a) ACTION=$OPTARG ;;
    q) QUERY=$OPTARG  ;;
    i) ID=$OPTARG     ;;
    n) NAME=$OPTARG   ;;
    v) VERBOSE=yes    ;;
    u) ID_COL=.productUrl ;;
    \?) usage         ;;
  esac
done

function invoke {
  buffer=/tmp/gphoto.buffer.$$
  curl -s -H "Authorization: Bearer $AUTH_TOKEN" "$@" > $buffer
  cat $buffer
}

function filter {
  if [ -z "$VERBOSE" ] ; then
    jq "$@"
  else
    cat
  fi
}

case $ACTION in
  album-list)
    invoke -G $API_BASE/albums | filter -r ".albums[] | $ID_COL + \"|\" + .title"
    ;;
  album-get)
    if [ -z "$ID" ] ; then
      echo "Missing -i album-id"
      usage
    else
      invoke -G $API_BASE/albums/$ID | filter -r '.productUrl'
    fi
    ;;
  media-list)
    if [ -z "$QUERY" ] ; then
      invoke $API_BASE/mediaItems | filter -r ".mediaItems[] | $ID_COL + \"|\" + .filename"
    else
      invoke -X POST $API_BASE/mediaItems:search -H "Content-Type: application/json" -d "$QUERY" | filter -r ".mediaItems[] | $ID_COL + \"|\" + .filename"
    fi
    ;;
  album-new)
    if [ -z "$NAME" ] ; then
      echo "Missing -n album-name"
      usage
    else
      invoke -X POST $API_BASE/albums -H "Content-Type: application/json" -d "{\"album\" : { \"title\" : \"$NAME\" } }" | filter -r "$ID_COL"
    fi
    ;;
  album-add)
    if [ -z "$ID" ] ; then
      echo "Missing -i album-id"
      usage
    else
      echo '{ "mediaItemIds" : [' > /tmp/body.$$
      sep=''
      while read line ; do
        echo $line | sed -e "s/^/${sep}\"/" -e 's/$/"/' >> /tmp/body.$$
        sep=','
      done
      echo '] }' >> /tmp/body.$$
      invoke -X POST $API_BASE/albums/$ID:batchAddMediaItems -H "Content-Type: application/json" -d @/tmp/body.$$
      rm /tmp/body.$$
    fi
    ;;
  media-get)
    if [ -z "$ID" ] ; then
      echo "Missing -i <id> value"
      usage
    else
      invoke $API_BASE/mediaItems/$ID | filter -r '.productUrl'
    fi
    ;;
  *)
    usage
    ;;
esac
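
Side note on the album-add implementation: the sed loop that assembles the request body is brittle if an ID ever contains characters that need escaping. Since the script already depends on jq, the body could be built with it instead. A sketch (build_add_body is a hypothetical helper, not part of the script above):

```shell
# Build the batchAddMediaItems request body from media item IDs on stdin
# (one per line), letting jq handle all quoting and escaping.
function build_add_body {
  jq -R -s '{ mediaItemIds: (split("\n") | map(select(length > 0))) }'
}

# Usage sketch:
#   gphoto_tools -a media-list | sed 's/|.*//' | build_add_body > /tmp/body.$$
```

Not that it matters much until Google lifts the restriction, of course.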

Monday, January 27, 2020

Man-Bag Dump, January 2020 Edition

I think it's interesting to see how my gear choices have evolved (or not). So here it is, a bag dump as we head into 2020. Much of this gear hasn't changed since I last posted on the topic, back in January of 2018. Many details about the items below can be found in that snapshot.

I'm currently rocking a Finnish Gas Mask Bag to hold my stuff. It's large enough to fit a Chromebook and other extras when needed, but small enough to use as an every day carry bag. The goofy looking waist strap does a surprisingly good job of making the bag comfortable to wear for long periods. It's also been indestructible, and the snap-like closure is convenient to use.

Two new items I've been carrying are a cell phone telephoto lens and a backup phone. I've experimented with carrying a number of photography accessories, but the only keeper has been the lens. It lets me capture photos that would normally be way out of reach from my cell phone. The backup phone has also served me well, especially during oh crap, I thought I grabbed my cell phone but I don't have it moments.

When you look at the list of items below, you'll note that one of them is crossed out and a handful of them are in italics. That's because I made this list from memory and then went back and adjusted it to match the bag's actual contents. As you can see, I could mentally recall most of the contents, but forgot a few obvious items like my car keys and headphones. Also, in my mind's eye, I carry Dayquil; in reality, I don't.

This little game of concentration stresses a key bit of gear philosophy: it's massively useful to know what you're actually carrying. You may have the most tricked out collection of gear, but if you don't actually know what's in your bag (not to mention, have experience using it) then your kit is far less useful.

This perspective partially explains why I typically bring the same bag, with the same stuff, everywhere: I get to know what I'm carrying. I don't have to wonder if I've got a hair tie, a USB C cable or a backup credit card; if I have my bag, I've got those items. I also know that I'm not carrying a knife or other contraband that would get me stopped by TSA. Sure, it means that at times I'm carrying gear I almost certainly won't use: an SOL Heatsheet while grocery shopping, or a Bluetooth keyboard while hiking in the woods. For now, the extra weight and bulk is offset by the confidence of knowing what's in my bag.

Whether it's a photography kit, knitting kit, or any other collection of gear you depend on, I'd encourage you to try this memory exercise. If you don't think you're carrying an item, then you effectively aren't. And if you think you're carrying an item and you're not, that's an even bigger recipe for disaster.

Everyday

  • Carabiner
  • Sunglasses
  • Flip & Tumble Bag
  • Buff
  • TIP Flashlight
  • Kleenex
  • Sharpie
  • Everclear Spritzer
  • Hair tie
  • Binder clip
  • Cash
  • Credit Card
  • Mirror
  • Cell phone telephoto lens
  • Lens cloth
  • Clif Builder Bar
  • Fruit Strips
  • Electrolyte Tablets
  • Car keys and Res-q-Me
  • A-SPAN Street Guide
  • Extra 1 quart Ziploc bag to protect my phone from rain

Medical

  • Gloves
  • CPR Mask
  • KT Tape
  • Duct Tape
  • Gorilla Tape
  • Leukotape
  • Ibuprofen
  • Aspirin
  • Claritin D
  • Dayquil
  • Nyquil
  • Benadryl
  • Anti-diarrhea
  • Anti-motion sickness
  • Migraine Medications
  • Ear plugs
  • Dental floss
  • Bandaids
  • SWAT Tourniquet

Electronics

  • Bluetooth Keyboard
  • Battery Pack
  • USB C Cable
  • Micro USB Cable
  • Garmin Watch Cable
  • Micro USB to Audio Jack Cable
  • Micro SD Card and adapter
  • USB C Micro SD Card Reader
  • USB C Host on the Go adapter
  • Backup Phone
  • Earbud Headphones
  • USB Wall Charger

Outdoor

  • Heatsheet
  • Bic Lighter
  • Cordage
  • Tea
  • Water purification tabs
  • Fresnel Lens
  • Aluminum Foil
  • True Liberty Plastic Bag
  • 1x1 meter orange signal panel
  • Large sewing needle

Wednesday, January 22, 2020

Review: The Garmin Vivoactive 3

Shira's been a fan of the Garmin Vivoactive 3 watch for years, and with her recent upgrade to the Vivoactive 4, it meant that I could take her 3 for a spin. I've been using it for a little over a month and here's my take: I like it. A lot.

I've dabbled with a number of watches, from the full featured ASUS Zenwatch to the quirky and minimalist Martian Notifier. It's been interesting to experiment in the world of wearables, but ultimately none of the devices stuck. The Vivoactive 3 shows real promise for bucking this trend and looks to be sticking around. Here's why.

First, the Vivoactive nails being a wristwatch. It's comfortable to wear and sleep in. I find that if I plug the watch in while I'm showering, its battery remains topped up enough that I can forget about its power needs. The Direct Watchface lets me configure a clean and informative main display that shows time, date, daily steps, daily distance and the next sunrise/sunset. The watch does alarms, a stopwatch and a countdown timer, as one would expect.

Second, the watch is a joy to run with. Garmin has designed the flow of recording an activity well, including smart use of both on-screen controls and the physical button. I like that I can set up the real-time information screens to be as verbose or terse as I want. Currently, I've got two screens of four fields each. On screen one, I see elapsed time, time of day, distance and pace; on screen two, heart rate, ascent, battery percentage and calories burned. And when I inevitably decide that this is too much noise on my wrist, I can easily dial the information back to just time and distance.

The Vivoactive 3 has a pretty complete set of 3rd party addons such that any feature I've wanted that wasn't built in, I've been able to add. For example, Garmin neglected to offer battery percentage as an activity field (something that's critical for a full day of hiking or running), yet this was easily addressed by installing the battery data field.

The Garmin Connect software seamlessly integrates with 3rd party services. So while I've stopped using the Runkeeper app on my phone to track my runs, Garmin still keeps Runkeeper.com up to date with my activities.

Finally, the watch shows real promise for use while hiking and backpacking. Admittedly, I've yet to fully test this behavior. However, the dynamic.watch app and website allow you to push a GPX map to your wrist so you can visually track your progress. This, combined with the myABC widget, which gives you quick access to the watch's altimeter, barometer and compass, means the watch is useful for both tracking and navigating in the backcountry. While the Vivoactive 3's battery is great for day-to-day use, it doesn't pack enough juice to power the GPS for extended recording (say, 12+ hours). The watch does, however, continue to function while charging. So one should be able to plug the watch into a USB power pack in the field and continue to use it as both nav aid and tracker.

I'm still wrapping my head around the fact that the watch functions so independently from my phone. I can now hit the trail without my bulky Galaxy S9+ and yet record stats of my journey and leverage a GPS to find my way. More realistically, I could leave my Android back at the car and take my credit card sized M5 with me in a pocket. This gives me a way to place calls and send text messages without the heft of my main phone. The big catch: I like taking pics while I'm out. So while I could run with either a slimmed down phone or no phone at all, I don't see myself doing this any time soon.

Two features which are nice, but haven't yet proven their worth, are the integration with Tasker and the health stats functionality.

I was psyched to see that the watch does at least basic Tasker integration. Yet, I haven't found a scenario where I need this capability. Though it does make the programmer in me happy to know it's there.

Knowing that I took 16,135 steps yesterday, burned 3,307 calories and had a resting heart rate of 48 bpm is certainly interesting, but I'm not sure what practical value any of it has. Yes, the watch tracks my sleep. But it's no surprise that on nights I stay up late programming I feel zapped the next day, and when the weekend rolls around and I can log extra zzz's I feel better. I'm already motivated to run, so the gamification of my health data feels more like a sleazy trick than a useful life hack. Still, the watch collects an impressive amount of data and perhaps I'll think of a clever way to put it to use.

In short, the Vivoactive 3 has served me well over the last month and I can see why Shira's a fan of the device. Now I just need Garmin to come up with a Vivoactive 5, so I can get Shira's 4 as a hand me down.

Tuesday, January 21, 2020

A Google Photos API Command Line Toolkit

My usual post-trip photo workflow is this: (1) For every day of the trip, create an "All Photos" album, and add the appropriate day's photos to that album. (2) For every day of the trip, create a "Blog Photos" album. (3) Review each day's All Photos, copying the best pics into the Blog Photos. (4) For each day's blog post, include the contents of the "Blog Photos" album. Creating the 'All' and 'Blog' albums isn't hard, but it is tedious. The programmer in me doesn't do tedious.

A while back, I realized that if Google had a Photos API, I could script much of the above process. I could effortlessly create and copy each day's photos into the appropriate 'All' album and setup empty 'Blog' albums ready to fill. Alas, when I reviewed the available API I couldn't see any way to create a new album. My brilliant hack would have to wait.

I recently revisited this challenge, and to my surprise it appears Google has filled out its Photos API nicely. Most importantly, you can now create albums. I was psyched!

Using my youtube_tool as inspiration, I was able to throw together gphoto_auth and gphoto_tools. The former authenticates a command line session against your Google Photos account. The latter is the beginnings of a tool for working with the Google Photos API. Currently, gphoto_tools is limited: you can get a list of albums, as well as search for photos. To build out the above workflow, I'm going to need to add the ability to create albums and properly handle the nextPageToken. Still, even in its simple form, gphoto_tools is showing promise.
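
For what it's worth, the nextPageToken handling should look roughly like the sketch below. It leans on the pageToken/nextPageToken request and response fields described in the Photos API docs; fetch_page and list_all_media are hypothetical helpers, not yet part of gphoto_tools:

```shell
# Hypothetical sketch of nextPageToken handling when listing all media items.
# Assumes gphoto_auth (from this post) and jq are available.
function fetch_page {
  # $1 = page token (empty for the first page)
  curl -s -H "Authorization: Bearer `gphoto_auth token`" \
       -G https://photoslibrary.googleapis.com/v1/mediaItems \
       --data-urlencode pageSize=100 \
       ${1:+--data-urlencode pageToken=$1}
}

function list_all_media {
  token=''
  while : ; do
    page=`fetch_page "$token"`
    # Print id|filename for each item on this page
    echo "$page" | jq -r '.mediaItems[]? | .id + "|" + .filename'
    # Grab the token for the next page; an absent token means we're done
    token=`echo "$page" | jq -r '.nextPageToken // empty'`
    [ -z "$token" ] && break
  done
}
```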

Here's a sample session:


# Setup a command line auth context
$ gphoto_auth init
Visit:
https://accounts.google.com/o/oauth2/auth?client_id=278279054954-h2du6fu5qtk9jhh2euqn0o7kdesg819u.apps.googleusercontent.com&redirect_uri=urn:ietf:wg:oauth:2.0:oob&scope=https://www.googleapis.com/auth/photoslibrary.sharing https://www.googleapis.com/auth/photoslibrary&response_type=code
Code? #########################

# List out albums
$ gphoto_tools -a albums
AHnaBgvFACY2NHSg9kiIJ-tDfTGab2vOCa8aWeKmJovY0F-JhHbEL_0nuvII2w_wAJ7IT0Yk7IzP|Snap Circuits
AHnaBguW1U9UKnD5bzLS9ktWlxfgtOtyfCkSq0k4O2C0jurETHIJF4gXC2FsR3NkuiOCFN6fAhha|New Year's Day Hike 2020
...

# 'Search' for the images within a given album (by album ID)
$ gphoto_tools -a search -q "{'albumId' : 'AHnaBguW1U9UKnD5bzLS9ktWlxfgtOtyfCkSq0k4O2C0jurETHIJF4gXC2FsR3NkuiOCFN6fAhha' }"
AHnaBgtpDZNj7odMrv7gquGGcVV02k6ORDnzu5yBSEmWDdCYbvjjp49DfBFOa5TQzLJOrGOXO1xFPe22LChRUiaU3Bk_QnqOCw|20200101_130629.jpg
AHnaBguVuD5WJYJ8pO67lC2PL7QtS-JZ1nEJsti6S_-fD6S9Tn7TIj_t_NKLVAKaU-3awI3S8Oj4aiJxMBROXe2ESHLF47VEzw|20200101_132036.jpg
...

# -v spits out raw json, which you can then work with using jq
$ gphoto_tools -v -a search -q "{'albumId' : 'AHnaBguW1U9UKnD5bzLS9ktWlxfgtOtyfCkSq0k4O2C0jurETHIJF4gXC2FsR3NkuiOCFN6fAhha' }"  | \
 jq '.mediaItems[0]'
{
  "id": "AHnaBgtpDZNj7odMrv7gquGGcVV02k6ORDnzu5yBSEmWDdCYbvjjp49DfBFOa5TQzLJOrGOXO1xFPe22LChRUiaU3Bk_QnqOCw",
  "productUrl": "https://photos.google.com/lr/album/AHnaBguW1U9UKnD5bzLS9ktWlxfgtOtyfCkSq0k4O2C0jurETHIJF4gXC2FsR3NkuiOCFN6fAhha/photo/AHnaBgtpDZNj7odMrv7gquGGcVV02k6ORDnzu5yBSEmWDdCYbvjjp49DfBFOa5TQzLJOrGOXO1xFPe22LChRUiaU3Bk_QnqOCw",
  "baseUrl": "https://lh3.googleusercontent.com/lr/AGWb-e7_6Jmsl-REKgGEscC16xW5soNNF9TqcZbXcbpr9KQityWvGpk0H3gcgU-Sak5-qIqAmeDPNvLyZqVP0Y-y4mNGzQZHPmjNhpFQcEtSqHlas-9pJPNiuUWkKWp-4wovV-tfi8j3elDDJ7YmJLNrnTWtYkCEE-2rUrQlIS6oQdusTrQU9ZqlKilKK8Yxl3iavxU1uOoH1DBK7_a3LEV1K5ZzpqQh4cNClZ5e4ONf2j3NwUBmZU1-TM5gVPu-ymiCyWBImxa0E46c7FbeKduvueVm93j2Yg5ISigVeDxEyFX8Gz3GLMhgieA1CeqEYsrAoBJhGFZHuWN1haMWFIHuwIgsNiE-mDJ67lwICUf-_ldX84LDx_6q_RlkQkw3jKM6bTb4zmOtJ4iWZFTuuj2iGKch7P7vSF_f4b_gSYDjmVkRgDMBfANUEG5hiVp0bd086h3ZCFhzu79kY72eKIe_DEHUwe-Ykg9TvNBHD8xa-0TdbxplhB7tj5DA2skUVl4TdMiBeQp0nGyni7sVKow1J_JXVlHfQOAzoam0RsRADgd1Ki2MihrF1w5hL2JaEPJXY8aeUYHsZPYnC5WWVqO0KK-WVDwGztwQ7Qnh7BF0K8tvgAnGCUOGOAo2pk790eendwfcYeKFlwousUP-pJLFI-R7xiNzteC3U4e-U3HABLcrdAFmPyNYBkreP82qG7kpR3TsbCB0e3jDVE-_k50gyakxhqliy7agPTzVoNieI9acwTcEQoRzteuOBwUZqVkLMq8V5lKlku4ZgSgVkwlVzV-UzGE3C13WMzOooaxtKRZ_iu9TZBGC6ik",
  "mimeType": "image/jpeg",
  "mediaMetadata": {
    "creationTime": "2020-01-01T18:06:25Z",
    "width": "3264",
    "height": "2448",
    "photo": {
      "cameraMake": "samsung",
      "cameraModel": "SM-G965U",
      "focalLength": 2.92,
      "apertureFNumber": 1.7,
      "isoEquivalent": 40
    }
  },
  "filename": "20200101_130629.jpg"
}

# Process the raw json to get at image metadata
$ gphoto_tools -v -a search -q "{'albumId' : 'AHnaBguW1U9UKnD5bzLS9ktWlxfgtOtyfCkSq0k4O2C0jurETHIJF4gXC2FsR3NkuiOCFN6fAhha' }" | \
  jq -r '.mediaItems[] | .filename + ":" + .mediaMetadata.photo.cameraModel'
20200101_130629.jpg:SM-G965U
20200101_132036.jpg:SM-G965U

# Search your images by date range. Clunky, but powerful.
$ gphoto_tools -a search -q "{'filters': {'dateFilter' : { 'ranges' : [ {'startDate': { 'year':2018, 'month':4, 'day':1}, 'endDate': {'year': 2018, 'month':4, 'day':7} } ] } } }"
AHnaBgs8QzksmxLn9bGvkG0IrQaTJwm4uS1ViuljkbB35OeRGE8R9zrOp4AEn7a57QJ9xEUXZTC2hCUZ7ptzYs6GcUgd6af_6w|20180405_175606.jpg
AHnaBgshr_njKJygznaBLZSbAoaD-d_WYfifvzdHHninNWArHfrRqqnm0LGZ3_n0pt-Bl1OnWxoHHOkcRmKHIf5uEJekMbrJjw|20180405_102015.jpg
...

# The -u flag causes URLs to be returned instead of IDs
$ gphoto_tools -u -a search -q "{'filters': {'dateFilter' : { 'ranges' : [ {'startDate': { 'year':2018, 'month':4, 'day':1}, 'endDate': {'year': 2018, 'month':4, 'day':7} } ] } } }"
https://photos.google.com/lr/photo/AHnaBgs8QzksmxLn9bGvkG0IrQaTJwm4uS1ViuljkbB35OeRGE8R9zrOp4AEn7a57QJ9xEUXZTC2hCUZ7ptzYs6GcUgd6af_6w|20180405_175606.jpg
https://photos.google.com/lr/photo/AHnaBgshr_njKJygznaBLZSbAoaD-d_WYfifvzdHHninNWArHfrRqqnm0LGZ3_n0pt-Bl1OnWxoHHOkcRmKHIf5uEJekMbrJjw|20180405_102015.jpg
...

I'm on the fence as to whether I should make a simplified querying interface, say '-r' for date range, or whether it's best to leave the filter as plain JSON that corresponds to the API docs. The former makes the tool easier to use, while the latter maximizes flexibility. Time will tell which is the best route to go.
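
If I do go the simplified route, the wrapper itself is easy enough. Here's a sketch of a hypothetical date_range_filter helper that expands two YYYY-MM-DD arguments into the filter JSON shown above (illustrative only; not part of gphoto_tools):

```shell
# Hypothetical: turn two YYYY-MM-DD dates into the Photos API dateFilter JSON.
function date_range_filter {
  IFS=- read s_y s_m s_d <<< "$1"
  IFS=- read e_y e_m e_d <<< "$2"
  # 10# forces base-10 so leading zeros ('04') aren't read as octal
  printf "{'filters': {'dateFilter': {'ranges': [{'startDate': {'year':%d, 'month':%d, 'day':%d}, 'endDate': {'year':%d, 'month':%d, 'day':%d}}]}}}" \
    $((10#$s_y)) $((10#$s_m)) $((10#$s_d)) $((10#$e_y)) $((10#$e_m)) $((10#$e_d))
}

# Usage sketch:
#   gphoto_tools -a search -q "`date_range_filter 2018-04-01 2018-04-07`"
```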

Here's both scripts:

gphoto_auth

#!/bin/bash

##
## Authenticate with Google Photos API
##
USAGE="`basename $0` {auth|refresh|token} ctx"
CTX_DIR=$HOME/.gphotos_auth
CLIENT_ID=XXXXXXXXXXXXXXXXXXXX
CLIENT_SECRET=XXXXXXXXXXXXXXXXXXXX
SCOPE='https://www.googleapis.com/auth/photoslibrary.sharing https://www.googleapis.com/auth/photoslibrary'
ctx=default

function usage {
  echo "Usage: `basename $0` [-h] [-c context] {init|token}"
  exit
}

function age {
  if [ `uname` = 'Darwin' ] ; then
    modified=`stat -f "%a" $1`
  else
    modified=`stat -c %X $1`
  fi
  now=`date +%s`
  expr $now - $modified
}

function refresh {
  refresh_token=`cat $CTX_DIR/$ctx.refresh_token`
  curl -si \
       -d client_id=$CLIENT_ID \
       -d client_secret=$CLIENT_SECRET \
       -d refresh_token=$refresh_token \
       -d grant_type=refresh_token \
       https://accounts.google.com/o/oauth2/token > $CTX_DIR/$ctx.refresh
  grep access_token $CTX_DIR/$ctx.refresh | sed -e 's/.*: "//' -e 's/",//' > $CTX_DIR/$ctx.access_token
}

while getopts :hc: opt ; do
  case $opt in
    c) ctx=$OPTARG ;;
    h) usage ;;
  esac
done
shift $(($OPTIND - 1))

cmd=$1 ; shift

mkdir -p $CTX_DIR
case $cmd in
  init)
    echo "Visit:"
    echo "https://accounts.google.com/o/oauth2/auth?client_id=$CLIENT_ID&redirect_uri=urn:ietf:wg:oauth:2.0:oob&scope=$SCOPE&response_type=code"
    echo -n "Code? "
    read code
    curl -s \
         -d client_id=$CLIENT_ID \
         -d client_secret=$CLIENT_SECRET \
         -d code=$code \
         -d grant_type=authorization_code \
         -d redirect_uri=urn:ietf:wg:oauth:2.0:oob \
         https://www.googleapis.com/oauth2/v4/token > $CTX_DIR/$ctx.init
    grep access_token $CTX_DIR/$ctx.init | sed -e 's/.*: "//' -e 's/",//' > $CTX_DIR/$ctx.access_token
    grep refresh_token $CTX_DIR/$ctx.init | sed -e 's/.*: "//' -e 's/"//' > $CTX_DIR/$ctx.refresh_token
    echo "Done"
    ;;
  token)
    if [ ! -f $CTX_DIR/$ctx.access_token ] ; then
      echo "Unknown context: $ctx. Try initing first."
      exit
    fi
    age=`age $CTX_DIR/$ctx.access_token`
    if [ $age -gt 3600 ] ; then
      refresh
    fi
    cat $CTX_DIR/$ctx.access_token
    ;;
  *)
    usage
esac

gphoto_tools

#!/bin/bash

##
## command line tool for working with the google photos API.
## https://developers.google.com/photos/library/guides/overview
##

API_BASE=https://photoslibrary.googleapis.com/v1
AUTH_TOKEN=`gphoto_auth token`
ID_COL=.id

function usage {
  echo -n "Usage: `basename $0` "
  echo -n "-a {albums|search|get} [-q json-query] [-i id] [-u]"
  echo ""
  exit
}

while getopts ":a:q:i:vu" opt; do
  case $opt in
    a) ACTION=$OPTARG ;;
    q) QUERY=$OPTARG  ;;
    i) ID=$OPTARG     ;;
    v) VERBOSE=yes    ;;
    u) ID_COL=.productUrl ;;
    \?) usage         ;;
  esac
done

function invoke {
  buffer=/tmp/gphoto.buffer.$$
  curl -s -H "Authorization: Bearer $AUTH_TOKEN" "$@" > $buffer
  cat $buffer
}

function filter {
  if [ -z "$VERBOSE" ] ; then
    jq "$@"
  else
    cat
  fi
}

case $ACTION in
  albums)
    invoke -G $API_BASE/albums | filter -r ".albums[] | $ID_COL + \"|\" + .title"
    ;;
  search)
    if [ -z "$QUERY" ] ; then
      invoke $API_BASE/mediaItems | filter -r ".mediaItems[] | $ID_COL + \"|\" + .filename"
    else
      invoke -X POST $API_BASE/mediaItems:search -H "Content-Type: application/json" -d "$QUERY" | filter -r ".mediaItems[] | $ID_COL + \"|\" + .filename"
    fi
    ;;
  get)
    if [ -z "$ID" ] ; then
      echo "Missing -i <id> value"
      usage
    else
      invoke $API_BASE/mediaItems/$ID | filter -r '.productUrl'
    fi
    ;;
  *)
    usage
    ;;
esac

Friday, January 17, 2020

Happy Feet: Profession-Based Footwear Optimization

You can't overstate how important the right footwear is. My sturdy Salomon hiking boots have provided me with grippy traction and dry feet. My Brooks Ghost 11 running shoes helped me recover from plantar fasciitis and stay injury free. Even something as simple as the right pair of water shoes can help turn a painful trip to a rocky or scalding hot beach into a pleasant one.

So in hindsight, it's a little odd that I've never considered optimizing my work footwear.

For some professions, this means finding the right pair of steel toed work boots, or tactical combat boots. For other careers, it might be as simple as a pair of shoes you can stand in all day. But what's the ideal footwear for the work-from-home programmer?

For Chanukah, my Mother-in-Law gifted me what appears to be the answer to this question: a pair of Snoozie Slippers.

These bad boys are super soft, have anti-slip bottoms and feel terrific. They are like wrapping my feet in little warm hugs. Plus they're quite fashionable:

Oh, heck yeah my productivity has gone up! Thanks Mom!

Wednesday, January 15, 2020

Marathon Man

Lately I've been on an ultra-running podcast kick, listening to Science of Ultra, Ultra Runner Podcast and Training For Ultra. I've used insights from the interviews to tweak my running and eating, and I've been logging lessons learned.

My parents are in town for my Brother and Sister-in-Law's new addition, so I thought I'd use the opportunity to pick my Dad's brain about his running habits from back in the day. I knew my Dad ran marathons previously, but didn't know the details.

Turns out, he was a beast.

My Dad's first run was a 10k in Hornell, NY. From there, he moved up to marathons. The longest distance he covered at once was 30 miles on the Erie Canal towpath with a fellow running buddy.

What I found most impressive was my Dad's training. He run-commuted to work, varying his route from a direct 4 miles to an indirect and hilly 7. He'd close out the day with a 4 mile run home. On the weekends he'd log a long run, usually 15 miles or so. That's about 60 miles a week, the sort of volume I see mentioned on /r/ultrarunning for tackling ultra-marathons, and something that wasn't in vogue at the time.

He'd start the day by checking his resting heart rate, using it as an indication of how recovered he was from the previous day's effort. If it was higher than usual, he'd cut down on the miles. This idea of listening to your body and being disciplined enough to do less is a bit of wisdom often noted in the podcasts above.

This past week, I clocked in runs at about 9:30/mile for 6 miles or so. When I'm running with consistency, I can typically get that number down to about 8:30/mile. My Dad's marathon PR was in the mid-to-low 3 hours. That means he was logging 7-something per mile over 26.2 miles! Damn!
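
A quick sanity check of that math, picking a 3:15 finish as one illustrative point in the mid-to-low 3 hour range:

```shell
# Pace check: a 3:15:00 marathon over 26.2 miles (3:15 is an assumed,
# illustrative finish time, not my Dad's exact PR).
pace=$(awk 'BEGIN {
  total_min = 3 * 60 + 15                 # 3:15 finish, in minutes
  per_mile  = total_min / 26.2            # marathon distance, in miles
  printf "%d:%02d", int(per_mile), (per_mile - int(per_mile)) * 60 + 0.5
}')
echo "$pace per mile"                     # 7:27 per mile
```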

My Dad is all very humble about this. While run-commuting was (and still is!) a sensible training strategy, my Dad had an additional reason for embracing it: at the time, we were a one-car family.

Looking back at my Dad's running, I realized there was another benefit to it all: he passed the habit on to me. He did so essentially through osmosis. He didn't sit me down and say: Son, I'd like to give you a gift that will benefit your physical and mental health, build your confidence, allow you to solve wicked problems and even help you meet your future wife (true story!). Nope, he just went for a run. I have visions of him stepping out the back door in running clothes in all possible weather conditions, day after day. It was only natural that at some point I'd join him and we'd run together. And to this day, I'm still hooked. Thanks Dad!

These days my Dad is no longer a runner; he quit to save his knees. But old habits die hard. He can still be found at the gym cranking out intervals on a spinning bike. He was telling me just the other day about a hacking-your-health online class he took that suggested he could dramatically reduce his time spent exercising and still get nearly the same benefit if he followed the course's suggestions. But why on Earth would he want to do that? You'd give up the joy of pushing your body for an extended period of time. Spoken like a true marathoner!

Update: Pics, or it didn't happen, right? Here you go!

Thursday, January 09, 2020

Meeting Our Newest Nephew

Today we met our newest nephew! Such a handsome dude. Oh, the adventures we're going to have together. I can't wait!

Little Man: that Cubs outfit was bought by Shira at a Cubs game, the night she heard that you were on your way. And your Daddy and I are drinking a L'chaim in Grandpa Irv's shot glasses. He'd surely lecture us for not giving you a little nip. Welcome to the world, we're so glad you're here!

Wednesday, January 08, 2020

Auto Ring Tone Generation - Making Some Noise

A while back I heard not one, but two different iPhones use a morse-code sequence as a ringtone. I'm not sure if the tapped-out message was static or changed with the caller, but I knew it was a concept I wanted to play with. If I could set up the morse code to match the incoming caller ID, I'd have a custom ringtone for every contact in my address book with little effort. Plus, I'd earn serious geek points!

While I had a vague idea of how I could use Tasker to make this idea go, there were still countless unanswered questions. Do I pre-generate the sound files or create them on the fly? If Tasker can't associate a ringtone with a contact, what app can? Is Tasker reliable and efficient enough to run a sound-generation task as a call comes in and then kill that task when the call is answered? And on, and on.

However, before I started sorting all of this out I figured I'd start simpler: generate morse code tones from text. This would confirm that morse-code based ringtones are even worth pursuing and would let me get some important momentum with the idea.

When it comes to generating sound, it's hard to beat JavaScript's AudioContext for getting the job done. Because this code runs in any modern-day web browser, it's easy to write and test anywhere.

I've experimented with browser-based audio before, but that was a while ago. I found this tutorial helpful for getting back up to speed.

My approach to morse code generation was first to support playing a tone of 'units' length:

  // Inspired by:
  // https://modernweb.com/audio-synthesis-in-javascript/
  // Note: 'tick', the length of one unit in milliseconds, is defined
  // elsewhere in the full source.
  function tone(units) {
    var ctx = new (window.AudioContext || window.webkitAudioContext)();
    var osc = ctx.createOscillator();
    var duration = units * tick;
    var attack = 1; // milliseconds of fade-in, to avoid an audible click

    var gain = ctx.createGain();
    gain.connect(ctx.destination);
    gain.gain.setValueAtTime(0, ctx.currentTime);
    gain.gain.linearRampToValueAtTime(1, ctx.currentTime + (attack / 1000));
    gain.gain.linearRampToValueAtTime(0, ctx.currentTime + (duration / 1000));

    // Dots (1 unit) play at 440Hz, dashes (3 units) at 880Hz.
    osc.type = "sine";
    osc.frequency.value = units == 1 ? 440.0 : 880.0;
    osc.connect(gain);

    osc.start();
    setTimeout(() => {
      osc.stop();
      osc.disconnect(gain);
      gain.disconnect(ctx.destination);
    }, duration);
  }

I then convert the text to a series of dots and dashes, and loop through each letter playing either a 1 or 3 unit tone (1 for a dot, 3 for a dash). This happens more or less in real time: each letter is played, then a callback is registered to play the rest of the text 'snooze' milliseconds later.

  function play(code) {
    if(code.length > 0) {
      var snooze;
      var c = code.shift();
      if(c == ' ') {
        snooze = tick;
      } else {
        var units = (c == '.' ? 1 : 3);
        snooze = tick * units;
        tone(units);
      }
      snooze += tick;
      setTimeout(() => {
        play(code);
      }, snooze);
    }
  }
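The step that turns text into the dot/dash/space array that play() consumes isn't shown above; it lives in the linked source. As a hypothetical sketch (the MORSE table and textToCode function are my illustration, not the blog's actual code), it might look like:

```javascript
// Hypothetical sketch: map text to the '.', '-', ' ' array play() expects.
var MORSE = {
  a: ".-",   b: "-...", c: "-.-.", d: "-..",  e: ".",    f: "..-.",
  g: "--.",  h: "....", i: "..",   j: ".---", k: "-.-",  l: ".-..",
  m: "--",   n: "-.",   o: "---",  p: ".--.", q: "--.-", r: ".-.",
  s: "...",  t: "-",    u: "..-",  v: "...-", w: ".--",  x: "-..-",
  y: "-.--", z: "--.."
};

function textToCode(text) {
  return text.toLowerCase().split("").map(function(c) {
    // Unknown characters are dropped; a space in the input becomes an
    // extra gap, stretching the pause between words.
    return MORSE[c] || (c === " " ? " " : "");
  }).join(" ").split("");
}
```

With that in place, `play(textToCode("sos"))` would tap out the familiar dot-dot-dot, dash-dash-dash, dot-dot-dot.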

You can experiment with the result here: Auto-Ringtone. You can see the complete code here.

While I'm sure I could make the morse code sound better, I'm happy with the approach I've taken so far. Dots and dashes are not only different durations of sound, but also different frequencies. Next up: figuring out a ringtone strategy on my phone.

Monday, January 06, 2020

Snap Circuits - Adult Level Insights from a Kid's Take on Electronics

I was over at our friends' house playing with their little ones when the kids busted out a Snap Circuits kit. I quickly became enthralled. My enjoyment was so obvious that the parents turned around and got me my very own Snap Circuits kit for Chanukah. Whoo! (Thanks L&N!)

I've always had an interest in circuit building, but I've never mastered the nuances. I want to make the leap from assembling a circuit to appreciating what each component is doing and why it is selected. Snap Circuits seem to be the Duplo-Blocks of electronics: extra big and chunky, yet perfect for the beginner. Just as you can learn basic building concepts from Duplo, I was curious if I could up my electronics understanding through Snap Circuits.

I've built 36 of the 125 experiments described in the user guide that came with the kit. So the jury is still out as to how big an impact Snap's going to have on my overall understanding of electronics. With that said, I've noted two impressive nuances of the kit that suggest it may be the game changer I'm hoping for.

First off, consider this circuit (#17, Hi-Low Fan):

There's nothing especially complex about this circuit. Current flows from the battery to a motor in two different ways. If S2 is pressed, the current flows directly to the motor. If S2 is not pressed, the current flows through the lamp and then into the motor. In both cases the motor spins. One key takeaway: pressing S2 causes the motor to spin faster. That's because the lamp introduces resistance, and this resistance slows the motor. Pressing S2 bypasses this resistance.
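To put rough numbers on the lamp-as-resistor idea, here's a back-of-the-envelope Ohm's-law sketch. All of the resistance and voltage values are made-up illustrations, not actual Snap Circuits specs:

```javascript
// Back-of-the-envelope: series resistance lowers motor current (I = V / R).
// Hypothetical values, not actual Snap Circuits specs.
var battery = 4.5;   // volts (e.g., 3 x AA cells)
var motorR  = 10;    // ohms, motor winding (assumed)
var lampR   = 15;    // ohms, lamp filament (assumed)

// S2 pressed: the battery drives the motor directly.
var directCurrent = battery / motorR;          // 0.45 A

// S2 released: current must also pass through the lamp's resistance,
// so less current reaches the motor and it spins slower.
var throughLamp = battery / (motorR + lampR);  // 0.18 A
```

The exact numbers don't matter; the point is that adding the lamp in series always shrinks the current, which is why the motor slows down.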

While this is a clever experiment, what struck me as especially interesting is how the kit uses the lamp as a resistor. When I started diving into my Skill Builder Kit, one of my first thoughts was: this is fun, but I'm going to need to buy a bigger kit to get more components. It turns out, however, that having fewer components is a Feature, not a Bug. Just like in the experiment above, the kit uses all of its components in novel ways.

So far, I've used a lamp as a resistor; a motor, photoresistor and transistor as switches; a speaker to listen to current; an LED to see sound; and a paper clip and USB cable as conductors. I see this as far more than a frugal use of parts. I think it's key to stop seeing electronic components as specific puzzle pieces and start seeing them as general-purpose tools.

The other nuance of the kit that I'm thoroughly enjoying is just how easy it is to experiment with minor circuit variations. Take experiments #31 (Transistor Control) through #34 (Murky Motor). They are all variations on this circuit:

That's a transistor, a motor and a lamp. By varying the positions of the lamp and motor you can get entirely different results. From my understanding of transistors, I would have assumed that in the above circuit the motor would be spinning and the lamp would be lit. In fact, the lamp is lit but the motor doesn't spin. That's because the majority of the current goes through the 3-snap wire and then the lamp. There's enough current in the motor's branch to keep the transistor open, but not enough to get the motor to spin.

These kinds of subtle circuit differences are easy to gloss over, yet they're key to a deeper understanding of electronics.

Because it's trivial to re-snap the circuit in different configurations, you can experiment quite easily. While the user guide often takes you through specific variations of a circuit, it's easy to come up with your own variations.

Of course you can do this with a few dollars worth of electronics and a breadboard. However, Snap Circuits are even easier to work with than a breadboard.

As you can imagine, I'm excited to continue chugging away through all 125 experiments. But I've already had enough Ah-ha! moments to know that Snap Circuits are a winner. If you're willing to supply the curiosity, this kit will do wonders.

Thursday, January 02, 2020

Starting 2020 Off Right

Yesterday I had planned an epic 11-mile slog of a hike along the AT. Alas, the weather for the area didn't cooperate, so we had to bail. Fortunately, we were still able to log some miles outdoors. We did a delightful 4-mile hike around Burke Lake with my Brother and Sister-in-Law.

What the trail lacked in distance it made up for with fun geocaching and good conversation. I can't think of a better way to start 2020!