Friday, June 23, 2017

A Habit and a Hack: Adding a Heads Up Display to my Offline TODO List

The Habit

I've added another piece to my offline, index-card-based TODO list tracking system. On Sunday evening, I bust out a fresh card and a small ruler, and make myself a grid for the week:

(The last 'W' stands for weekend)

I then pore over my stack of TODO cards and come up with a high level schedule for the week, picking one or two large items to work on every day:

At the start of a given day, I usually fill in more items that I want to accomplish that day.

This little exercise, including the arts and crafts component of drawing out the grid, has been surprisingly effective. It's one of those habits that's so obvious that you can't imagine how you lived without it for so long.

The Hack

The large binder clips dangling off my standing desk are doing a good job of organizing some key essentials. On the other end of the spectrum, I've found a new use for the tiniest binder clips I own. This was strictly an accidental discovery. I picked up a few rare earth magnets the other day only to find out that they were *too* powerful for the task I wanted to use them for. At one point, I had the magnets on my desk and I was fiddling with them. Turns out, the tiny binder clips I use to keep my TODO stack together are ferromagnetic (that is, they stick to magnets). And the same was true for the outside border of my monitor.

It took some experimenting, but I eventually arrived at this arrangement:

Two small neodymium magnets stacked on top of each other firmly hold my entire stack of TODO list items, while one small magnet easily holds the weekly schedule card.

The result is a heads up display that shows me both the task I'm currently working on and my weekly overview. The system is surprisingly robust, with the cards staying firmly in place yet easily accessible. And besides, there's something magical about magnets that makes them a tactile joy to use.

My suggestion: pick up some neodymium magnets and see what sticks.

Thursday, June 22, 2017

Android Wear's WatchMaker: A Hacker's Delight

A Little Kvetching, A Little Inspiration

My ZenWatch 3 came with an impressive number of watch faces pre-installed. And many of them look quite cool. But alas, that's also the problem: they just look cool. On my list of needs from a smart watch, aesthetics is near the bottom while functionality and hackability are at the top.

While pondering this dearth of appealing watch faces, it hit me: why not build my own? I make use of Tasker and AutoWear to expand my watch's capabilities; perhaps I could do the same thing with a custom-built watch face?

I hit Google and found various articles that talked about getting into the custom watch face game. This article recommended WatchMaker, which seemed like a good fit. While WatchMaker has a massive gallery of watches, you can also build your own. Without having a clue, I created a blank watch face and started clicking around. As I browsed through the list of elements I could add to my creation, I noticed the option of adding an 'Expression':

Clicking on this option took me to a list of the various types of data I could add, such as the time or weather information. As I scrolled down further I saw there was Tasker support (whoo!) and ultimately, the list of expressions finished with a statement explaining that "WatchMaker uses Lua, a scripting language used in..."

They had me at scripting language. Building a watch face would be nice. Programming one, well, that would be awesome.

Learning Lua in 24 Hours

Off to the WatchMaker Wiki I went, where I learned that WatchMaker is fully powered by Lua. My entire knowledge of Lua is this: Lua is a programming language. But still, this was hugely promising. If I could run arbitrary scripts in WatchMaker, then I could effectively treat WatchMaker as an interpreter just waiting for me to write interesting code. Before I could get too excited, I needed to get the lowdown on Lua.

Getting oriented with Lua was surprisingly easy. First off, I installed QLua on my Android device. This gave me access to a Lua interpreter and editor. Then I visited the online version of Learn Lua and browsed through the first few sections. I was pleased to find that QLua executed all the sample code without complaint. My iClever keyboard, LG G6 and QLua made for a hassle free learning environment.

I still don't know much about Lua, but I know that I like it. It has a Scheme feel to it, with its small set of language constructs, reliance on a single data type for many data structures, and first class functions. The embedded nature of it reminds me of TinyScheme and its relationship with Gimp.
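
Here's a toy example (my own, not from any tutorial) that shows what I mean: a table does double duty as a list and a record, and functions are just values you can pass around.

-- a table works as a list, a map and a record, all at once
local todo = { "write post", "fix labels", priority = "high" }

-- functions are first class values, so you can pass them around Scheme-style
local function map(fn, xs)
  local out = {}
  for i, v in ipairs(xs) do out[i] = fn(v) end
  return out
end

print(todo.priority)                                --> high
print(table.concat(map(string.upper, todo), ", "))  --> WRITE POST, FIX LABELS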

After about 24 hours, I had enough Lua to jump back into WatchMaker to see if I could hack something together.

Let's Get Building!

For my first watch face I wanted to demonstrate the ability to invoke a function that I hand coded. Turns out, that's not a tall order. At the watch properties level, there's a chance to provide a 'script.' So I wrote up a trivial version of the factorial function and stuffed it into that field. QLua was handy in helping me verify my code outside of the WatchMaker environment.

-- a classic recursive factorial
function fact(x)
  if x == 0 then
    return 1
  else
    return x * fact(x-1)
  end
end

I then added a new layer to the watch and wrote this expression:

  {dmo} .. "!=" .. fact({dmo}) -- {dmo}

{dmo} is a magic variable that represents the ones digit of the current minutes. In other words, if the time is 3:06, {dmo} is 6. Why did I choose {dmo}? Because it's a value that changes relatively often. I know, all this talk of functionality, and my first watch face does nothing but calculate the factorial of a useless value.

The -- {dmo} is a workaround as prescribed here. I should note that the WatchMaker UI seems extremely polished. This is clearly an app with some serious development behind it.
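
If you want to sanity check the expression outside of WatchMaker, a quick QLua session does the trick. Something along these lines (the hard coded 6 just stands in for {dmo}, and it assumes the fact function above has been pasted in):

-- stand in for {dmo}, the ones digit of the current minutes
local dmo = 6
-- mirrors the watch face expression: {dmo} .. "!=" .. fact({dmo})
print(dmo .. "!=" .. fact(dmo))   --> 6!=720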

I figured I should make my watch face at least a little usable, so I also added another layer that showed the current time.

It took a few iterations to get all this working, but when I was done I was blown away: I had programmed my first watch face. It showed the time and the result of my factorial function. To some, this may seem like a silly exercise, but to me, this shows that I can use WatchMaker as a hacking environment for the ZenWatch. The possibilities are endless!

While I was on a roll, I thought I'd broaden my horizons just a bit. Browsing the WatchMaker wiki I came across a recipe for changing color dynamically. I decided I'd incorporate this functionality into my watch face. The idea: at the start of the hour (say, 3:02) the watch would be green; as the hour progressed, it would fade to purple (at, say, 3:55). A quick glance at the color of the watch would show me how far through the hour I am.

Programming this was surprisingly simple. First, using QLua I coded up a function that maps a value from 0 to 1 to a color. 0 would be green, 1 would be purple:

function percent_to_color(percent)
  -- 0 -> 00ff00 (green), 1 -> ff00ff (purple), with a fade in between
  return string.format('%.2x%.2x%.2x',
                       percent*255,
                       (1-percent)*255,
                       percent*255)
end

After adding the above to the scripts section, I changed the background color of my watch face to this expression:

  percent_to_color({dm}/60) -- {dm}

{dm} is, as you guessed, the number of minutes in the current time. It's quite impressive that fields like background can take arbitrary expressions. To me it shows that while WatchMaker has a slick UI, it's also got serious geek cred to back it up.
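
Again, QLua makes it easy to check a function like this before wiring it into the watch. Here's a minimal check of the endpoints of the fade (note that Lua 5.3 and up want a math.floor around the fractional channel values; older interpreters quietly truncate them):

-- assumes percent_to_color() from above has been loaded
print(percent_to_color(0))   --> 00ff00 (pure green, start of the hour)
print(percent_to_color(1))   --> ff00ff (pure purple, end of the hour)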

Finally, here's some shots of the watch face in action:

And here's some screenshots of WatchMaker:

It isn't often that you find an app that's a game changer, but I believe that's what WatchMaker is. For anyone who wants to experiment with the world of wearable computing, or quickly develop interesting watch apps, or is just looking for a fun way to learn to program, WatchMaker belongs in your toolbox. Simply put: it's a Lua interpreter on your wrist; what more could you ask for?

Wednesday, June 21, 2017

Eating Hardtack / Ships Biscuit - An Experiment A Year In The Making

Nearly a year ago I cooked up a batch of Ships Biscuit or, as landlubbers and Civil War soldiers would have called it: hardtack. In many respects, ships biscuit / hardtack was the original super food. Not because it contained impressive amounts of nutrition, or because it was only found in tiny, expensive grocery stores. No, it earns a special place in the food universe out of pure utility: it's super easy to prepare and can go months, perhaps years, without refrigeration. This made it the ideal food for ship crews and soldiers alike.

Sure, you can read about how soldiers and sailors subsisted on this food. Or better yet, you can bust out some flour and water and mix up a batch. That's what I did.

I marked the batch with the date and expected to come back to it in a few months to do a taste test. Instead, I totally forgot about it. That is, until a couple of weeks ago. And just like that, a year had passed! My cooking test was going to be more authentic than I'd imagined.

In preparation for chowing down on my little experiment I did some research. I found excellent accounts of how Civil War soldiers dealt with hardtack here, here and here.

As I read through tales of hardtack being tooth-cracking hard, and being infested with weevils, I grew more and more concerned. This was beginning to feel like I was on an episode of Survivor.

This morning I brewed up a cup of black tea and busted out my ships biscuits:

I was relieved to see no sign of bug infestation. But man, were those biscuits hard. We're talking little hockey pucks. I managed to crack off a piece and dip it in my tea. I took a bite. Not bad. Not great. But not bad. I dropped another piece of the 'bread' into my tea and left it. I then continued to nosh. Hardtack is nothing but flour and water, so it basically tastes like eating flour. But even after a year, it wasn't rancid flour, so that's a good thing.

Finally, I extracted the piece of 'bread' that I dropped into my tea and crumbled it up as best I could. I then dropped it into some scrambled eggs. I generously doused the eggs and hardtack with sriracha and ate up. Again, not bad. The hardtack is really pretty neutral, with the tea giving it a little flavor.

I can see why soldiers would have dunked their hardtack in coffee. Hot chocolate would have been even yummier. I can also imagine that dropping soaked hardtack into a frying pan with meat, or cooking it up in a soup would also work well. Basically, it's just calories.

I'm counting this culinary experiment as a success. I got to see hardtack / ships biscuit's super powers on display (just about a year in storage without refrigeration) and I got to connect with my ancestors, who no doubt cursed the stuff. Will this staple find itself included in my next backpacking trip? Probably not. But who knows, maybe I'll go super-old school, in which case I dare not step out into the woods without being properly equipped.

A special thanks to Jas. Townsend and Son for inspiring me to eat my way into understanding history.

Tuesday, June 20, 2017

Gotcha of the Day: Deploying a git non-master branch to WP Engine

I have a number of clients who host sites on WP Engine, and I've become a fan of the service. What you give up in the flexibility a vanilla hosting account provides, you more than gain back in performance and robust WordPress tools. The fact that I've been able to talk with a knowledgeable rep on live chat with almost no wait time has been a nice bonus.

WP Engine has git based deployment support. While I still prefer subversion over git, this solution does elegantly solve the problem of deploying content to the server with a minimum of confusion.

My standard practice is as follows: after a release I fork off a release branch that is used to host urgent bug fixes. The fixes are eventually merged back into master. At the next deployment, I fork off a new release branch, discarding the old one. And repeat. This approach has served me well for years now.

A week ago, I ran into a significant gotcha: I needed to deploy a fix that had been made on the release branch. However, WP Engine assumes that all deployments are associated with master. In other words, I could make changes to my branch release-2, and I could push that branch to WP Engine, but it wasn't going to actually deploy those files.

After much Googling I arrived at the following recipe:

  git push production release-2:master

That is, push release-2 to master on the remote named production.

This worked. Every once in a while, git surprises me by actually simplifying matters, and it did so in this case. It's still too early to tell how this work-around is going to play out for the next release. But at least I got my bug fix deployed in an orderly way. The alternative was to drop back to using sftp, and that's no fun at all.
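
One last note on the recipe above: if you're nervous about what a refspec like that is actually going to do, git push has a --dry-run flag that reports the outcome without touching the remote. Something along these lines:

  # preview the push without updating the remote
  git push --dry-run production release-2:master

  # looks right? run it for real
  git push production release-2:master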

Monday, June 19, 2017

Weekly Discoveries

Norah Jones's version of Black Hole Sun is definitive proof that her voice can turn any piece of music into something sultry and poetic. I haven't listened to Norah in some time, and now I'm reminded why she was so popular back in the day.

It turns out that OK Go, a band I'd never heard of until last week, is the definitive champion of music videos. For me, it all started with The One Moment Video, which is an impossibly timed creation. I then started poking around their account and learned that this amazing video is merely par for the course. A few other examples: Upside Down & Inside Out and I Won't Let You Down. Seriously, how do they do all that sans green screens and CGI? My only criticism of OK Go: their videos aren't ideal to work to. To appreciate them, you've got to give them your full attention. But man, is it worth it.

Bleachers' Don't Take The Money and Big Data's Dangerous are both way-over-the-top videos that are hilarious and remarkable creations in their own right. Dangerous is definitely leaning towards earning itself an R rating, but it's just too perfectly executed to not include it on the list.

Aaron Lee Tasjan's Hard Life contains the perfect line: They give you loose gravel and call it rocksteady. I'm not exactly sure what genre Tasjan falls into: country? folk? I don't suppose it matters; all that matters is that he's a fun listen.

Nikki Lane, for me, had two notable songs: Jackpot and Forever Lasts Forever. The first is a fun tune about winning at the gamble of love. The second tune, despite what the title suggests, is a sad song about divorce and the end of a relationship. While the latter may be a better song, I can't help but put the former on my discoveries this week. What with my wedding anniversary coming up, it seems to apply quite well.

Listen to all the discoveries here.

Friday, June 16, 2017

Weekly Discoveries

Snow Patrol's Called Out In The Dark is a great example of a song that works far better for me because of the video. Had I just heard it on the radio, I wouldn't have given it a second thought. But the video is so endearing, I can't help but appreciate the tune itself.

Last week we were out to dinner at our favorite Ethiopian place and I was taken by the music in the background. I asked the waiter about it, and apparently they had a Michael Belayneh CD running. Of course, Michael is on YouTube. He's got quite a few tunes up, and they seem to span genres. This one has an EDM vibe to it, while this one feels almost country. My favorite of the bunch so far is Ashenefe. My Amharic is a bit rusty, so I've got no idea what he's actually singing about. Let's hope it's good things.

Kudos to Michael Franti & Spearhead for their tune I'll Be Waiting. He managed to work the lyric the best things in life aren't things into his song. Truer words have never been spoken. When you need a little pick me up, I'll Be Waiting is exactly what you want to listen to.

On the non-music side, check out Peter Hochhauser's This is not a beautiful hiking video. If this doesn't make you want to grab a backpack and head out on a multi-month adventure, then you have no soul. He does a great job of compressing the highs and lows of walking 2,650 miles(!) into 10 minutes of video.

And for all you foodies out there, check out Jas. Townsend and Son's Parched Corn Videos. Watching a video on parching corn should be as exciting as watching paint dry, yet the folks at 17th Century Cooking found a way to make this terrifically interesting. Who knew there could be so much history behind one simple ingredient?

Watch all the videos here:

Thursday, June 15, 2017

Putting the Murse on a Diet - Applying Ultralight Principles to an Every Day Carry Bag

Despite how positively goofy it no doubt makes me look, I've definitely been a fan of regularly carrying a bag full of extra goodies. (As a side note, said bag needs a better name. Murse? Man Bag? EDC Bag? Who knows.) The problem with any gear setup is that carrying too much is almost as bad as carrying too little.

There's both a physical and mental penalty for overdoing it, whether we're talking about a man bag, your fishing tackle box or your Yoga kit. On the physical side, carrying extra stuff literally weighs you down and can limit mobility. On the mental side, carrying too much means it's hard to keep track of what you actually have. The result is that you can be thinking you're carrying X, when you aren't. Or worse, think you don't have X, but you do.

So while I like having some essentials on hand to solve problems and deal with the curve balls life throws my way, I also wanted to skinny it down as much as possible.

In my latest attempt to do this, I took a page from /r/Ultralight, the Reddit community obsessed with trimming weight from backpacking gear. While I'm not as hard core as many of the users over there, I do appreciate their methods and philosophy (to a point) and was glad to take a few pages out of their playbook to help me lighten my load.

In the world of ultralight backpacking there are two ways you can trim weight: carry less stuff and replace your stuff with a lighter version. I applied both of these ideas to my daily setup.

On the 'carry less stuff' front, I found it helpful to be disciplined about carrying only items that met one of two criteria: either the item had to be used frequently (like my little Flip & Tumble shopping bag I use all the time) or it had to have relatively high value (you may not use Imodium often, but when you need it, you absolutely positively need it). The less frequently I use something, the higher value it should have. So it only makes sense to carry a Res-q-Me that I plan never to use, because it's of such high value. Carrying stuff 'just in case,' where the case is likely never to happen, is a trap I wanted to avoid.

Like any gear related pursuit, the selection of what gets carried is always in flux. Still, I did some paring down and was happy with the results. I was pretty confident that I was carrying stuff I'd use regularly, or that would be worth its weight in gold should it be needed.

That brought me to step two of the process: replace heavy items with lighter versions. For this, I followed closely in the footsteps of /r/Ultralight, and busted out the food scale. I then made a spreadsheet of what each piece of gear weighs:

I then went through and noted what the heaviest items were. In my case, the portable keyboard, backup battery and bag itself were the chief weight offenders. I then cloned this spreadsheet and went to work searching Amazon for possible replacement items. By using weights mentioned on Amazon or elsewhere, I was able to get a sense of what my weight savings would be if I went ahead and made these purchases.

Here's what I arrived at:

This simulation step turned out to be surprisingly valuable. Considering replacements on an individual basis wasn't especially compelling. But simulating the changes showed just how big an impact I could have. At the end of the day, I realized I could drop nearly 1.3 pounds. That may not seem like much, but that's dropping more than a quarter of what I carry.

Here's a quick tour of the substitutions I made.

Our first stop: my beloved Perixx 8051 keyboard. For most folks, carrying a keyboard around everywhere is probably a mistake. It's exactly the kind of "just in case" item that weighs you down and doesn't have a lot of value. But in my case, because I'm a computer programmer who owns his own business, my keyboard is essential. Using it and my phone, I can fix customers' issues from nearly anywhere. With a browser and ssh, I can program my way out of quite a few problems. Ultimately, the keyboard may be heavy, but it's far lighter than carrying around a laptop, which would be the alternative.

Because of both the frequency and value of carrying a full size keyboard, I knew I wanted to keep it in my rotation. But, I supposed it made sense to look for a lighter version. I was pleased to discover this iClever ultralight version. Check out how it stacks up against the Perixx:

It's less than half the weight, nearly half the thickness and maintains the full size keys. It has an improved keyboard layout (finally, the arrow keys are in a sane location!). Time will tell if I like or hate the ergonomic angle on the keys, though my every-day keyboard has this style. The keyboard does everything I could ask for, including automatically turning on and off when you open and close it. The only issue: like my Perixx, it doesn't lock open. This means that using it to type on my lap is a pain. I need to find a solution to that.

Next on the chopping block, my Anker 10,000mAh portable battery. This thing is an absolute game changer. This item alone nearly justifies carrying a man-bag. Having it means being able to keep my phone alive. And with my phone, I practically have super-powers, from important tasks like using Google Maps or making an emergency phone call, to more mundane things like turning an hour long wait into a virtual trip to the movie theater.

While I knew the Anker was heavy, I appreciated the plush safety net it gave my phone. The 10,000mAh version of the battery charges my phone nearly four times and has two ports. With a little research I learned that I could drop down to a 5,000mAh battery and save nearly half the weight and nearly half the thickness:

Time will tell if cutting down to a 5,000mAh battery is the way to go. It still provides quite a bit of cushion for my phone, as I can charge it just about twice. But still, we'll see if it's the right trade off to make.

And finally, the biggest weight offender was the bag itself. I really like the iBagBar small messenger bag. It's big enough to hold essentials, yet small enough that you're not carrying a full size messenger bag *everywhere*. I also very much like the fabric flip lid - it's a simple design and totally quiet. I like the canvas styling because it's neutral and, in my imagination, gives off a bit of an Indiana Jones vibe.

But man, is it heavy. And with the rest of the downsizing, it was now too large for the items I was carrying.

Amazon has a billion bags to choose from, many of which have free return shipping for Prime members. In fact, I found that some colors of a given bag qualified for free return shipping while other colors didn't. For example, the lime green version of this bag has free returns, while the other (more appealing?) colors don't.

I went ahead and picked out a few bags on Amazon that claimed to be lighter weight and that had free returns, and pulled the trigger on them.

Out of the bunch, the winner was the Wsdear Sport Small Crossbody Bag. It checked nearly all the boxes. It's lightweight, coming in at 158 grams versus the 588 gram iBagBar and has good organization. I can use the main compartment to hold most of my items, there's a slot in the main compartment which nicely fits my keyboard and battery, and there's a front pocket which holds essentials like hand sanitizer and tissues. Best of all, it has a big slot in back that I can drop my phone into, rather than having to shove it in already full pockets. There's even a tiny pocket in front that fits my wallet, which is handy when I'm wearing gym shorts without any pockets.

The main shortcomings of the bag are quality and style. The zippers just seem so fragile. Part of that is probably a good thing, since it contributes to the low weight, but I won't be shocked if the bag doesn't hold up. At $16.00, I'm fine just calling this an experiment to try.

And then there's the style. I can't say it's ugly, but it does feel very, shall we say, functional? Gone is my Indy vibe, and now I'm a lot closer to the 80's fanny pack style. Scary, I know. Like I said, it's an experiment. Ultimately, a guy carrying a bag of any kind around here looks wrong, so I just need to accept that and move on.

Whew. So it's done. I've got a lighter bag. And the thing is, I'm happy with the result. The setup is noticeably lighter and didn't give up much in the way of functionality. Of course, this is always a work in progress, so I assume I'll look back at myself in 6 months and laugh at my naivete. Such is the joy of being a gear head.

Update: And for completeness, here's a snapshot of what's in the bag:

Wednesday, June 14, 2017

WP All Import and Google Sheets - Best Buddies, and Worth Mastering

One of my favorite WordPress plugins, if not my favorite, is WP All Import. Specifically, I love that it's able to slurp a Google Spreadsheet into a series of well formatted posts. WP All Import doesn't care if the data was hand entered into a Google Sheet, or if it was entered via a Google Form. The latter allows you to create a crude data entry interface and then publish that data on a WordPress site with minimal effort.

Here, let me show you what I mean.

WP All Import in Action

Suppose I wanted to publish the data collected in this Cool Tools survey. Getting the data into a Google Spreadsheet is trivial. I did that here. And publishing that data so that the world can see it is easy, too:

Jumping over to the WordPress side of things, you can import this data into posts with relative ease:

The above sequence shows entering the download URL, then selecting entry as the element that WP All Import should use to demarcate posts. And finally, you construct the post by dragging and dropping the available fields into the editor. There are tons of options during the process, so you can attach an image, add categories and tags, and choose whether posts should be drafted or published automatically. It takes some time to learn, but with a little practice you can do really clever things without having to write a line of code.

And Now, The Catch

If you were paying attention to the above process, you'll notice that there was one key step I glossed over. That is, where the heck does the download URL come from?

This URL is a special one, and not at all obvious. You can use command line tools to figure it out, but that puts this solution out of reach for many users. In short, the entire process of converting a spreadsheet to a series of posts is code-free, *except* for figuring out the download URL. Until now.

The Solution

This question of what a sheet's download URL is came up often enough that I developed a web page to answer it. Check it out here (and the source code is here, enjoy!). And here it is, in action:

This web page requires that you enter the public URL to the spreadsheet, in this case https://docs.google.com/spreadsheets/d/1eyDNkv5BfsgzTEQZA5Sprgwj5mjyBhviXX2IN590ux0/edit#gid=563082240, and it does the rest.

I can see from this that the data URL is: https://spreadsheets.google.com/feeds/list/1eyDNkv5BfsgzTEQZA5Sprgwj5mjyBhviXX2IN590ux0/o9b8tei/public/full.

If your spreadsheet has multiple tabs on it, then each tab will get its own download URL, which the tool will report.
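
If you're comfortable at the command line and want to skip the web page, my understanding is that the same information is exposed by the old Sheets v3 worksheets feed: each worksheet entry carries a list-feed link, which is the download URL WP All Import wants. Here's a rough sketch of that approach (assuming the spreadsheet is public, like the example above):

  # the sheet ID is the long token from the .../spreadsheets/d/<ID>/edit URL
  SHEET_ID=1eyDNkv5BfsgzTEQZA5Sprgwj5mjyBhviXX2IN590ux0

  # each worksheet entry in the public feed contains a 'listfeed' link
  curl -s "https://spreadsheets.google.com/feeds/worksheets/$SHEET_ID/public/full" |
    grep -o 'https://spreadsheets.google.com/feeds/list/[^"<]*'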

Really, This is Cool

If you haven't considered populating posts from Google Sheets, let me give you one more example. A while back I collected up various testimonials from clients. As they came in, I dropped them into a Google Spreadsheet as that was the natural place to track them. We're working on a website re-design, and I wanted to incorporate those testimonials into a custom post type. Could I have hand entered each testimonial into WordPress? Of course, but it was far faster to just slurp in the sheet full of testimonials. By establishing a unique identifier during the import process, it's even possible to re-import updated data later on. This makes sense for my testimonials, as I can edit these small bits of text far faster in a Google sheet than going one by one in WordPress.

The bottom line: WP All Import rocks; you just need to keep your eyes open for cases where managing data in a spreadsheet is faster than managing it directly in WordPress.

Tuesday, June 13, 2017

Lost in Google Space; Untangling the Mystery that is Google Drive and Google Photos Integration.

The Google Photos / Google Drive integration is both brilliant and tremendously frustrating. I've got Google Drive set up to expose Google Photos, and Google Photos set up to pull from Drive:

Yet the simple act of finding a photo in Google Photos that's stored in Google Drive is a hit-and-miss operation, with miss being the usual outcome.

Consider the latest conundrum I got myself into: I uploaded a whole heap of photos to Google Drive. They were nowhere to be found in my photo stream. Of course, my Google Photos stream is a mess, containing 300 photos from the last *weekend* alone. So I'm sure they were there, but they weren't obviously visible.

Ahhh, but this is a Google product, so surely the answer is to search. And search I did. I searched for the filename. Nothing. I searched for various dates associated with the photo (the create date, the upload date, etc.). Nothing.

Perhaps the photos didn't get imported? I tried to upload them again. The thing is, Google Photos is smart enough to not recreate photos that have already been uploaded. While I could upload photos, and Google would report success, I was still left with no idea where the photos were.

My first break came when I searched for 'pumpkin' - that is, an object in one of the photos. *That* Google could find. Sheesh. The filename wasn't supported, but searching by object was. I then looked at the date on that photo and found a few more. But where were the rest?

On a whim, I tried searching for the folder name. Oddly enough, Google Photos seemed to know about the folder name, but when I searched for it, nothing came up.

See, when I search for my test album, Adventures in Foo, Google Photos seems to know about it:

Yet running that search brought up photos of adventures, not the specific photos I was looking for.

And then I realized that there was a folder icon next to the auto-complete option. I clicked it:

And finally, success: I found myself at an album.

And that appears to be the magic that ties Google Photos and Google Drive together. All photos in Drive are implicitly put into albums named after their folder. You can select one of these albums from the auto-complete suggestions in the Google Photos search bar. You can't find it by running an ordinary search for the folder name, nor do these albums show up in the list of Google Photos albums you manually created. Nope, you need to search for and click on the album in the search bar.

That's a remarkably subtle UI choice, as the auto-complete option is typically there as a convenience. In this case, it's seemingly the only route to get to the album. Surely Google knows this is confusing?

At least I finally get it.

Not only could I now search for the album name of the big block of photos I uploaded, but I found that I could re-organize the files in Drive and have them show up in a new virtual album seamlessly. Ultimately, this is confusing but cool.

Monday, June 12, 2017

Football and chasing puppies, the perfect recipe for fun and feeling old

We had an awesome time hanging out with one of our college buddies yesterday. I can honestly say, Jared and Beth haven't aged a day since our glory days, nearly 20 years ago. The same, however, can't be said for their kids.

Liam trounced me in Bubble Hockey and wowed me with his mastery of Clash Royale.

And Maya is adorable, with an endless supply of energy and an infectious smile.

Such good times!

Friday, June 09, 2017

A Better Blurl: list posts by label, patch posts and other command line blogging goodness

I'm trying to avoid shaving the yak here, but having limited success.

The labels (or tags, as every other blogging platform calls them) on my blog are a mess and I'd like to clean this up. I can do this by hand, but the Blogger procedure for doing this is painfully clumsy to use. What I really want is a command line utility that lets me update labels in bulk. I can't seem to find such a utility, so I'll have to build my own. This isn't as hard as it sounds, as I've previously developed blurl, a lightweight Blogger command line tool.

blurl, however, lacked some key features needed to develop my label fixing utility. I've now addressed those shortcomings. The latest version of blurl, which you can grab here, has the following features added to it:

  • The tool now returns all results, not just the first page's worth. In the past, if you asked for a listing of posts, you got the last 30. Now, you'll get every single post on your blog.
  • The list function can optionally accept a label, which will narrow the results to just that label.
  • The tool can now run a search and return back all matching posts.
  • The tool now supports PATCH'ing, which allows you to update a post without having to re-specify every aspect of the post. For example, you can just change the title of a post and leave the content and labels unspecified and unmodified.

With these new changes in place, I should have no problem building my mass-label-changing utility. Stay tuned.

Here's a few examples of the new blurl functionality at work:

# Grant blurl access to my blog
$ blogger_auth -c blogbyben init
https://accounts.google.com/ServiceLogin?XXXXXXXXXXXXXXXXXXXXXXX
Code? XXX
Done

$ export BLURL_AUTH_CTX=blogbyben

# Find out the unique ID of my blog and store it
# in the environment for later commands to use.
$ blurl -a info  -u https://www.blogbyben.com/
12753102:Ben's Journal
$ export BLURL_BLOG_ID=12753102 

# Last 3 posts in 'reviews' (that's plural)
$ blurl -a list -l reviews | head -3 
2534412822419131921:Review: ASUS ZenWatch 3 - Top quality reviews, but a wait-and-see experience
8359106005748061573:Review: Personal: A Jack Reacher Novel
7209997068661197075:Review: The Boys in the Boat: Nine Americans and Their Epic Quest for Gold at the 1936 Berlin Olympics

# Last 3 posts in 'review' (that's singular)
$ blurl -a list -l review | head -3
5631052962590454401:Review: The Time Traveler's Wife
6292599222302015512:Review: A Partial History of Lost Causes
5153762744021255090:Review: Containment

# Move one post from 'review' to 'reviews'
$ blurl -a patch -i 5631052962590454401 -l 'reviews, recommendations'
http://www.blogbyben.com/2017/01/review-time-travelers-wife.html

# Confirm it worked
$ blurl -a list -l reviews | grep -i time
5631052962590454401:Review: The Time Traveler's Wife

$ blurl -a list -l review | grep -i Time
1390162071521669374:Review:  A Tale for the Time Being: A Novel
7280970208702975504:It's Summer Shoe Time -- Giving A New Minimalist Shoe A Chance
1389130645758623727:Review: Longitude: The True Story of a Lone Genius Who Solved the Greatest Scientific Problem of His Time
3086195861781100501:Review: 30 Things Every Woman Should Have and Should Know By The Time She's 30
7655086660726473861:Review: The Life and Times of the Thunderbolt Kid: A Memoir
1579868997134332395:Killing Time In Recovery

Wednesday, June 07, 2017

Review: ASUS ZenWatch 3 - Top quality reviews, but a wait-and-see experience

My roller coaster experience with the LG Style Smart Watch left me thinking I'd successfully acclimated to the world of modern smart watches. To recap: I started off hating the watch, as it failed to outperform my old standby, the Pebble Classic. However, after a week of use I found that I was growing attached and could see the potential of the device. Ultimately, I decided that the battery life really was too poor to be acceptable, and so back to Best Buy the device went.

I purchased the LG Style because it was relatively low cost, and while plenty of reviews I'd read said to pass on it, I decided not to listen.

For my second time around, I thought I'd give more credence to the reviews and ultimately opted to get an ASUS ZenWatch 3. It received plenty of positive reviews, and I figured with my LG Style experience behind me, it would be smooth sailing. Sure, the leap from a dumb phone to a smart phone is painful, but usually going from one Android device to a newer, more highly praised model is a breeze. Oh, how I wish it were so.

I picked up the ZenWatch 3 and very quickly found myself both unimpressed and confused. Unimpressed because it was significantly heavier and chunkier than the LG Style, and it replaced the functional and elegant 'crown' with three ugly buttons. Yes, the watch contained a speaker, which meant that I could finally take calls on my wrist. But the quality of the phone experience from the test calls I made just gave me more reasons to be unimpressed.

And then there was the confusion. While there were similarities between the Style and the ZenWatch, I found myself befuddled by the ASUS's user experience. I thought I had figured out how to navigate an Android Wear device, but now I wasn't so sure. Just the simple act of managing apps was totally different between the two devices. So much so that I had to turn to Google for help on this basic operation.

Ultimately, I owe my friend Nick a big thank you for clearing up my confusion. The ZenWatch 3 comes with Android Wear 1.5, while the LG Style had Android Wear 2.0 on it. I may have upgraded hardware, but I took a step backward in the world of software. I assumed that with the rave reviews on the ZenWatch 3, it must have had the latest watch OS. Apparently not true. From poking around on the web, I'm told that the ZenWatch 3 will get an upgrade to Android Wear 2.0 any day now.

Part of me wanted to ditch the ZenWatch 3 and go back to the svelte Style. However, the ZenWatch 3's battery does appear to significantly outperform the Style's. And while the buttons are ugly, they are convenient. At some point, the software upgrade should come through, and I'll be all set in that department. Ultimately, I've decided to stick with the ZenWatch. I'm hoping its upgraded hardware will pay off in the long run.

I am, however, refusing to learn Android Wear 1.5 in any real degree of detail. Once I get AW 2.0, I'll get back to focusing on figuring out clever ways to use the watch. For now, I'm just digitally treading water: using what works and leaving well enough alone.

Should you run out and buy a ZenWatch 3? Uh, give me a few more months on that question. For now, your guess is as good as mine.

Tuesday, June 06, 2017

Weekly Discoveries

Last week I was out on Wednesday and Thursday, which meant that I came back to quite the inbox on Friday. The solution: bagpipe music, obviously. That eventually led me to this song from the movie Black Hawk Down. Great movie, and great tune.

Morgxn's Home is a well done video that captures both the pain of not fitting in and the joy of finding like minds. It's a solid tune, too.

You know the old story of the two space suit wearing strangers on the street? Well you will after you watch Andrew McMahon in the Wilderness's So Close. I suppose it's sort of related to Morgxn's Home (both in theme and embracing dance), but it's quite a bit less intense. Dance off, anyone?

And to truly lighten things up, check out Emily Vaughn's MOOD. Next time life's trying to drag me down, I'm so blasting this tune.

Listen to all the discoveries here:

Monday, June 05, 2017

Location Based Notifications; Or Letting the wife know you're going to be late in the coolest possible way

I still get lots of mileage out of the Report Location Tasker Action I set up nearly 4 years ago. I use it to let Shira know where I'm at when I'm running late on a jog. Sometimes, I put the action in a loop and send her my location every few minutes. At other times, I'll be more selective and use my smartwatch to notify her of my location when I'm at a light or the like.

It occurred to me that I could automate some of this based on my location. That is, when I was getting close to my final destination I could kick off a report so Shira knew I was on my way. Thanks to AutoLocation, which allows you to set up a geofence and then react to it, setting this up in Tasker was a breeze.

I set up the following three actions: Tripwire Center, Tripwire Arm and Tripwire Trigger:

These three actions form a sort of simple state machine. At the start of my run I kick off Tripwire Center which grabs the current latitude and longitude and updates the specially named geofence Tripwire to be centered around it. It sets this geofence to have a 400 meter radius, or about 1/4 of a mile. Time will tell if this is too large or too small. Finally, the action turns on the Tripwire Arm profile.

Assuming my run is longer than 1/4 of a mile, at some point I'll step outside of the geofence. At this point, the Tripwire Arm profile will be triggered and the corresponding action run. This action turns on the Tripwire Trigger profile, which awaits my return, and also shuts down the Tripwire Arm profile so it's not triggered again.

Finally, as I'm making my way back to my start point I'll eventually cross back into the geofence and the Tripwire Trigger profile will be invoked. This is where the magic happens, as I invoke the Report Location Task here.
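
Tasker profiles aren't exactly code, but if it helps to see the shape of the thing, here's the little state machine those three actions implement, sketched in Lua (purely illustrative; Tasker doesn't run anything like this):

-- idle -> Tripwire Center runs -> inside (geofence set, Arm profile on)
-- inside -> I jog out of the geofence -> outside (Trigger profile armed)
-- outside -> I jog back in -> report my location, back to idle
local state = "idle"

local function report_location()
  print("texting Shira my location")  -- stand in for the Report Location task
end

function tripwire_center()            -- run at the start of the jog
  -- real version: center a 400 meter geofence on the current lat/lon
  state = "inside"
end

function on_geofence_exit()           -- the Tripwire Arm profile
  if state == "inside" then state = "outside" end
end

function on_geofence_enter()          -- the Tripwire Trigger profile
  if state == "outside" then
    report_location()
    state = "idle"
  end
end

-- simulate one jog: leave the start area, then return
tripwire_center(); on_geofence_exit(); on_geofence_enter()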

I'm going to have to log a few runs with this action before I can tell if it's useful. But I'm excited to be putting AutoLocation and geofences to use.
