Cookie Notice

As far as I know, and as far as I remember, nothing in this page does anything with Cookies.

2017/06/29

Three Little Words: The Perl Conference 2017 Day 2

I lost the bubble.

I wrote a post at the end of the first day of the Perl Conference, intending to do the same for the second and third, but I went to visit family and play with their puppies on Tuesday night, and puppies are better than blogging.

Puppies!
At the beginning of the third day, my laptop hit a weird issue where it couldn't enable its wireless NIC. I fixed it at home, once I had access to precision screwdrivers again, but it meant that I couldn't blog the last day, either.

But waiting affords me the opportunity to add videos of the talks I attended. So, yay.

I started Day 2 with Sam Batschelet's Dancing In The Cloud, which was not what I was expecting, being more a deep dive into another domain. I was hoping for an introduction to using the Dancer framework on cloud providers. It was interesting, and I now know what Kubernetes does and that etcd exists, but this is far enough away from where I live that I don't know if it'll give me much more than buzzword recognition at the next technical conference I attend.


Again, I would like to reiterate that it was deep for me. Good talk, good speaker, but I was the wrong audience. Read the talk descriptions, not just the titles!

This was followed by Chad Granum on Test2, which is not yet on YouTube. I came to TPC wanting to up my testing game, and I've been looking through Andy Lester's modules and brian d foy's Stack Overflow posts trying to get into the mindset. This talk is a long-form walk through the diffs between the old and new test frameworks, and by showing how to test using Test2, it showed a lot about how to test in general. Once the talk is up, I am sure I will go through it again and again, trying to get my head around it.


But just because we don't have this talk on YouTube, that doesn't mean we don't have Chad talking about testing.

My next talk was Adventures in Failure: Error handling culture across languages by Andrew Grangaard, which is another point of growth for me. All too often, I just let errors fall, so knowing how they're commonly done is a good thing, well presented.


Damian Conway was the keynote speaker, and his talk was called Three Little Words. Those are, of course, I ❤ Perl.

Also: class, has, method. Conway proceeded to implement Perl 6-style OOP with Dios, the Declarative Inside-Out Syntax, which needed Keyword::Declare to enable the creation of new keywords in Perl 5. To get that going, he needed a faster way to parse Perl, so along came PPR, the Pattern-based Perl Recognizer. This work is also where Chad's earlier Lightning Talk, including his module Test2::Plugin::Source, came from.

The Lightning Talks were good, but the one that comes up as a good one to put here is Len Budney's Manners for Perl Scripts. This makes me want to look into CLI::Startup.


I hope to get to Day 3 and maybe another post just on Lightning Talks soon. An hour's dive into one topic won't get you too deep, but an hour's worth of lightning talks will open up a number of possibilities for you to follow.

2017/06/27

Reverse Engineering Google Maps at Highway Speed



I was on I-70 in Maryland on Sunday, going to Alexandria, Virginia, along with a lot of others. I was using Google Maps for navigation. When I could look down, the route was looking red, indicating congestion and delay. Eventually, Google said, "We have an alternate route that will save you a half hour. Want to take it?"

Of course I said yes. So I took the next exit, went down some rural highways, through a small town and almost down someone's driveway, it seemed, and back onto I-70. I called it a win.

The Friday after, on my way home, it rained all the way through Maryland, West Virginia, Pennsylvania and Ohio, and well into Indiana. It occasionally rained hard enough that I started seeing other drivers turning on their hazard lights for others to see them, and I followed suit.

A truck had crashed in western Ohio, closing westbound lanes, and Google told me it would add an hour delay, but when it routed me through five miles of county roads to the next on-ramp, it was closed. I knew it -- I saw the chain across the road -- but Google didn't, so I alone, without a line of other vehicles, bird-dogged a route to the next on-ramp with Google urging me to make a u-turn every few hundred feet.

This made me think about what's actually going on inside the navigation feature of Google Maps. It starts in graph theory, establishing the US road system as a directed graph with weighted edges, showing Interstates as preferable to highways to county roads and side streets, and using Dijkstra's shortest-path algorithm to find the best route.

In graph theory, everything is either a node or an edge. In this case, a node could be the intersection of 6th and Main, or the fork where I-70 splits off to I-270.  The edges would be the road between 5th and 6th on Main St., or the long path between exits on the Interstate. In other uses of graph theory, the nodes are most important -- Facebook Users being nodes and their friendships being edges in Facebook's Social Graph -- but here, the information about the edges is the important part.
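As a sketch of that core idea, here is Dijkstra's algorithm over a tiny weighted, directed road graph in Perl. The node names and weights are toy values I made up; real Maps routing is obviously far more elaborate.

```perl
#!/usr/bin/env perl
# Toy weighted, directed road graph plus Dijkstra's shortest path.
# All node names and edge weights (minutes) are invented examples.
use strict;
use warnings;

my %graph = (
    'I-70 @ exit 42' => { 'I-70 @ exit 49' => 7,  'County Rd' => 9 },
    'County Rd'      => { 'I-70 @ exit 49' => 4 },
    'I-70 @ exit 49' => { 'Alexandria'     => 60 },
);

sub dijkstra {
    my ( $graph, $start, $end ) = @_;
    my %dist = ( $start => 0 );    # best known travel time to each node
    my %seen;                      # nodes whose best time is final
    while (1) {
        # visit the unvisited node with the smallest known distance
        my ($node) = sort { $dist{$a} <=> $dist{$b} }
                     grep { !$seen{$_} } keys %dist;
        return unless defined $node;           # no route exists
        return $dist{$node} if $node eq $end;  # arrived
        $seen{$node} = 1;
        for my $next ( keys %{ $graph->{$node} || {} } ) {
            my $d = $dist{$node} + $graph->{$node}{$next};
            $dist{$next} = $d
                if !exists $dist{$next} || $d < $dist{$next};
        }
    }
}

print dijkstra( \%graph, 'I-70 @ exit 42', 'Alexandria' ), "\n";    # 67
```

With these made-up weights, staying on the Interstate (7 + 60 = 67 minutes) beats the county-road detour (9 + 4 + 60 = 73), which is exactly the Interstates-preferable weighting described above.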

This is similar to how we used to do it, looking at a paper map and preferring Interstate for long-haul routes, judging this road or that road as preferable due to scenery or a hundred other criteria, dropping to surface streets only to get through the final few miles. But Google knows this road gets gridlocked at rush hour and that one is under construction, which you can't tell from your ten-year-old road atlas, and has a huge body of historical data that allows it to present alternate routes and the estimated time difference between them.

The increasing amount of data helps it properly weigh edges and make connections. Early in my time with Maps, it suggested I go from Lafayette to Ft Wayne through Indianapolis, which would actually add an hour to the trip, because it had weighted the Interstate so highly. Now, it properly suggests two east-west routes and avoids the southern detour entirely. Similarly, an intersection near work is right-turn only, and it took some time for Maps to not suggest a left turn.

Maps re-calculates the route often, which is why, after a time off its path, it still says "fastest way is behind you; turn around and go back". When you're just stopping for gas, it's a little annoying, but when you see that the highway department has blocked the suggested route, you wish it would just shut up and get with the program. I'd have to take more trips where I'm not the driver to really tell, but I would guess it re-runs shortest-path several times a minute. My guess is there are waypoints, places along the route between here and there, so Maps tries to find the shortest path between them, rather than rethinking every turn in your 600-mile journey, which makes this faster and more predictable.

Maps has a lot of data for each section of road, showing how many drivers use it and their average speed, as well as the posted speed limit. If the drivers are going slower than average and slower than the speed limit, that indicates there is a problem. In the Ohio case, the Department of Transportation must have reported that the reason was an accident, but to paraphrase Scream, causes are incidental. What's important is knowing where things get back to normal, and if cars aren't there, then Maps has no way of knowing where that is and what on-ramp is open. This is guesswork of the highest order, but in a week where my every movement was guided by Maps' algorithms, this was the only point of failure.
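The detection side of that can be sketched as a simple heuristic. The structure below (compare current speed against the lesser of historical average and posted limit) follows the paragraph above, but the 60% threshold is my invention, not anything Google has published.

```perl
#!/usr/bin/env perl
# Toy congestion check for a road segment. The 0.6 cutoff is an
# invented assumption for illustration, not a known Maps value.
use strict;
use warnings;
use List::Util qw{ min };

sub looks_congested {
    my ( $current_mph, $historical_mph, $limit_mph ) = @_;
    my $expected = min( $historical_mph, $limit_mph );
    return $current_mph < 0.6 * $expected;
}

# 25 mph where traffic normally does 60 in a 65 zone: a problem
print looks_congested( 25, 60, 65 ) ? "congested\n" : "flowing\n";
```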

When I took that detour in Maryland, I was not alone; I counted at least eight vehicles taking that route along with me. Google offered me the choice, but this can't have been an option for every driver on a backed-up highway. I think there must have been a process of A-B testing, where some were rerouted and some were kept on the main road, and it used this information to decide where to send drivers later.

I don't often take these long drives, so it may be a year or two before I'm so fully in the hands of Google Maps and on such a dynamic journey, but I expect the experience to be even better.

2017/06/19

Feeling "Tipsy": The Perl Conference 2017 Day 1

A few lessons learned before I dive into the talks I attended and what I gained from them:

  • I had figured that aspects of my Every Day Carry would not be appreciated in a government building, so I left my multitool off my belt, but, even after I had put everything I could think of into the tray, I had still made the machines go beep, causing the guards to pull out the wand. Tomorrow, pack lighter.
  • I take notes in a paper notebook, but I used my phone and tablet to connect with others via IRC and Twitter, as well as to keep track of my calendar, and I ran through my batteries before the end of the talks. Part of it was switching between the local WiFi and the cellular network and doing well with neither, but I'm not convinced they wouldn't have drained regardless. At lunch, I need to find a place to charge. I often bring a power strip for just this purpose, too.
  • I didn't break out the laptop once. If I don't use it more, I should just leave it and have less to carry.
Hopefully, I will remember this and come prepared for the next one.

Now, to the talks:

I started the day with MetaCPAN, the Grand Tour. I had been convinced earlier that MetaCPAN is better than CPAN for looking into modules, but there's more, including modules I am going to have to start using to search CPAN.

Graham Knop's Continuous Integration for CPAN followed. I had been aware of Travis-CI, and had become aware of AppVeyor recently, but the tooling available in Perl to work with these was less familiar to me. I was unaware that you can specify Linux or OSX in Travis, which was something I had been thinking about and asking other developers about. I have issues on FreeBSD, which I'm told is something that GitLab-CI can help me with, but somehow I doubt I can connect GitHub to GitLab. I could be wrong.

Steven Lembark had much more with Dockerizing CPAN Testers: Running an Isolated Test Site than I could fit into my head, and I think I'll have to go back to the tape once it's available, but I think it's a useful addition to the world.

After lunch, I went to Joel Berger's Variables, Scoping and Namespaces, which he set as a talk for beginners. He went so far as to suggest more established developers go elsewhere. Since I never thought I learned all of Perl when I was learning, it was mostly things I already knew, but with a little "So that's why we do that", some "Oooh, I forgot about that", and one weird trick that explains how to mock functions for tests.

(That, fundamentally, is my big item to work on as a developer. Testing.)

After this, I attended Matt S. Trout's ES6: Almost an acceptable Perl 5? and it gave me a sense of treating Javascript like Perl, but since I don't code Perl like Matt does, I probably won't code ES6 like Matt does. My notes peter out about halfway through, but they do include several gems, such as lodash, that might improve the code I do write.

Following this were the Lightning Talks, which had a bunch of interesting points, going from "Write for OpenSource.com" to "Try learning Dart" to "Creating Test JSON from JSON Schemas" to "Using Instapaper, IFTTT and Buffer to tweet what you read" to the Alien modules, which I almost understand now. Or maybe not. Certainly, though, I'd be installing and trying Dave Rolsky's RTx-toGitHub right now if I wasn't so tired.

Finally, Sawyer X talked about Perl 5.26 and the changes that came and the changes that are coming. The thing that comes to mind first is that things that have been deprecated since 5.0 are finally being pulled. I understand the critics who think that removing . from @INC is insignificant, but I am still for it. I also like that Perl now recognizes unresolved merge-conflict markers and dies.

Tomorrow, I will be learning about Dancer, Test2 and more with @INC, and visiting with family afterward.

2017/06/02

Tracking your Old Tabs in Chrome over Time

Starting with "Has this ever happened to you?" is a very infomercial way to open, but it's where my brain has left me: you're working on your computer and suddenly something happens. Blue screen. Kernel panic. Browser crash. Whatever. You restart things, open Chrome, click "Restore Tabs", and it ... does nothing.

Or does something, but not enough.

A user's list of open browser tabs and windows is a map of that user's interests. For me, there are usually 6-20 open windows with 3-12 tabs each, covering a number of topics that, while interesting, are not directly applicable right now.

And Chrome is of no help these days. As Neil Young said, "It's all the same song": Google believes, between phones, tablets, laptops, desktops — heck, the talking-donkey called Google Home, for all I know — that it's all the same browser. Pages you saw recently, no matter the device, are at the top of chrome://history; the tabs you kept around for when you got back to that topic have dropped way down; and, ironically, the tabs you looked at, rejected and closed are right at the top.

Last night, I killed lots of tabs to improve browser stability. Today, looking for an image, I screwed up my machine, so I rebooted. It's Linux, and needing to reboot is bad Linux admin work, but it wasn't a hill I wanted to die on. When I got back on, the six windows and maybe 20 tabs total were gone. There were certain givens (Tweetdeck, Chrome) and certain trashables (work tabs for tasks completed the day before), but the "I was gonna get back to that" windows are gone, and the pages they held are deep, deep, deep in the communal history behind today's web comics and headline links.


I cannot get those pages and the plans I built on them back. But, I can start to get my house in order to keep it from happening again. And that starts with storing them.

This page shows the default location for Chrome user data. Lifehacker has a how-to on restoring tabs, detailing which files hold what you need and how to force Chrome into recovery mode. The key things are in Current Session, Current Tabs, Last Session and Last Tabs. These files are in SNSS format, which could be friendlier (I've found a GitHub repo with an SNSS parser in Python, but haven't started working with it), but they respond to strings, so you have something to work with before the parser is ready.
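In fact, a rough strings-style pass in Perl can pull URLs back out of an SNSS file with no real parsing at all, just by grabbing printable http(s) runs out of the binary. The Chromium profile path below matches the backup script's source directory and is an assumption about your setup.

```perl
#!/usr/bin/env perl
# Pull URL-looking strings out of a binary SNSS file, strings(1)-style.
# No actual SNSS parsing: just printable runs that start with http(s).
use strict;
use warnings;

sub urls_in_snss {
    my ($file) = @_;
    open my $fh, '<:raw', $file or die "Can't open $file: $!";
    my $data = do { local $/; <$fh> };    # slurp the whole binary file
    close $fh;
    my %seen;
    # printable, non-space characters only; dedupe, keep first-seen order
    return grep { !$seen{$_}++ } $data =~ m{(https?://[\x21-\x7E]+)}g;
}

my $snss = "$ENV{HOME}/.config/chromium/Default/Current Tabs";
print "$_\n" for -e $snss ? urls_in_snss($snss) : ();
```

It's crude (you lose window and tab grouping, and long URLs can pick up trailing binary junk that happens to be printable), but it's enough to answer "what did I have open?"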

#!/usr/bin/env perl

use feature qw{ say state } ;
use strict ;
use warnings ;
use utf8 ;

use DateTime ;
use File::Copy ;
use File::Path qw{ make_path } ;
use IO::Interactive qw{ interactive } ;

# program to back up chrome tabs for easy restore should things go bad
# add to your crontab with something like
#       @hourly ~/bin/chrome_tab_backup.pl


my $now = DateTime->now()
                  ->set_time_zone('America/Indiana/Indianapolis')
                  ->set( second => 0 ) ;

my $date = $now->ymd('/') ;
my $time = $now->hms('-') ;

my $source = join '/', $ENV{HOME}, qw{ .config chromium Default } ;
my $target = join '/', $ENV{HOME}, '.chrome_backup', $date, $time ;
say $source ;
say $target ;

make_path($target) if !-d $target ;

chdir $source ;
for my $f ( grep {m{(Tabs|Session)$}n} glob '*' ) {
    say $f ;
    copy( $f, $target ) or die qq{Copy Failed: $! } ;
    }

There's a bit of generalization still to do here. I would prefer that it discern your time zone without hard-coding mine, or I could just use UTC and everyone would know to look for the most recent backup.

The Lifehacker page points to a Windows-specific (or maybe Windows-and-Mac, I don't know) Local State file as where you set exited_cleanly to false in order to force recovery; on Linux, it's in Preferences.
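Since the Preferences file is JSON, flipping that flag is scriptable. This is a sketch using core JSON::PP; the profile path and the exact key names (profile.exited_cleanly, profile.exit_type) are assumptions based on what I've seen referenced, so back the file up before pointing this at a real profile.

```perl
#!/usr/bin/env perl
# Sketch: mark a Chromium profile as not-cleanly-exited so the browser
# offers session recovery on next start. Path and key names are
# assumptions -- back up the Preferences file before trying this.
use strict;
use warnings;
use JSON::PP;

sub force_session_restore {
    my ($prefs_file) = @_;
    my $json = JSON::PP->new->utf8;
    open my $in, '<:raw', $prefs_file or die "Can't read $prefs_file: $!";
    my $prefs = $json->decode( do { local $/; <$in> } );
    close $in;
    $prefs->{profile}{exited_cleanly} = JSON::PP::false;
    $prefs->{profile}{exit_type}      = 'Crashed';
    open my $out, '>:raw', $prefs_file or die "Can't write $prefs_file: $!";
    print $out $json->encode($prefs);
    close $out;
}

my $prefs = "$ENV{HOME}/.config/chromium/Default/Preferences";
force_session_restore($prefs) if -e $prefs;
```

Run it (with Chrome closed) before restarting the browser, and the "Restore pages?" prompt should appear, assuming those keys are the right ones on your platform.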

Where this leaves us is with a bunch of files in a directory. I could figure out a way to parse the SNSS. I could create an HTML page showing that time's open tabs, which would obviate the need to force Chrome to open the correct ones. I could start grouping them by window and time, saying "You've been wanting to get back to MQTT for some time; fish or cut bait."

Plus, this example is very Linux-centric, meant to back up Chromium and run from crontab. Making it run on Win7 or Win10 with Strawberry Perl, scheduled via Task Scheduler, is important too, as really, my Windows machine is for browsing and testing.

If all I get is "Hey, give me my tabs back!", I'll be happy.