Cookie Notice

As far as I know, and as far as I remember, nothing on this page does anything with cookies.

2012/02/23

I Suppose I Have Been Trying QS Already

I suppose I should've said that the weight tracker is my first step toward Quantified Self that I've programmed myself, because I have used some apps written by others.

I have MyTracks and RunKeeper on my phone (not that I really ever use them), both of which can be used to keep track of what you do when you exercise. I have used MyTracks for a few walks, but the problem is more me and my lack of scheduled walks than anything with those tools. I can credit MyTracks with telling me that the walk from the bus stop to my building is 0.31 miles.

The coolest thing I've used so far is called Smart Alarm Clock. Key to this is the Android phone's capability to sense motion. If you tell it to wake you up at 6am, it starts trying to sense you moving around, which you do more in REM sleep, which is not when you want to wake up. When it senses you being more still, that is when you are sleeping the lightest, which is the best time to wake you up. It really is like magic.

The weight tracker is somewhat inspired by the Hacker's Diet, which I have not fully read.

Dipping My Toe into Quantified Self

Quantified Self. "Self knowledge through self-tracking". And weight is what I want to track.

It was a moment where it just came together. "I can do that!"

I haven't hacked my scale to wirelessly tell my computer my weight. That's a bit more than I want to do. Yet. (I've been looking at XBee stuff on Adafruit, just thinking about that.) Right now, I want to collect data. What do I need for that? I need a way to save this data that I can handle in the AM before I really shower and wake up. Sound like a simple web page back-ended by a database to you? It did to me.

I roughed it out last night and finished it up this morning. I did some gussying up with CSS this evening after work, but that's incomplete. It looks much better, though.

I have only one true entry into the database so far, so graphing that information isn't a thing yet. I figure ten days of data is a good point to start taking a running average. I know how to automate that, but I don't have that data on that machine....
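
Once there are rows to work with, the running average itself is simple. Here's a minimal sketch in Perl, with made-up numbers standing in for what would really come out of a SELECT against the weight table:

```perl
#!/usr/bin/perl

# Sketch: trailing ten-day running average over a list of daily weights.
# The @weights here are hypothetical; in practice they'd come from the DB.

use 5.010 ;
use strict ;
use warnings ;
use List::Util qw{ sum } ;

sub running_average {
    my ( $window, @values ) = @_ ;
    my @averages ;
    for my $i ( $window - 1 .. $#values ) {
        my @slice = @values[ $i - $window + 1 .. $i ] ;
        push @averages, sum( @slice ) / $window ;
        }
    return @averages ;
    }

my @weights = ( 200, 199.5, 199.8, 199.2, 198.9,
                199.0, 198.5, 198.7, 198.2, 198.0, 197.9 ) ;
say sprintf '%.2f', $_ for running_average( 10, @weights ) ;
```

Nothing fancy; each day past the tenth just averages the trailing window, which is what the graph would eventually plot.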

2012/02/18

Maker vs Developer

I've noticed I have a tendency. I solve things once.

Let's take a specific example. Google Plus does not have RSS feeds. It does have an API, for which you can get a key, from which you get JSON output that you can parse down and push into RSS. I didn't, at first, want to do my own stuff. A few people have tried to get into that game, but their services fall down due to the limits Google puts on G+.

So, I felt pushed to make it myself. In Perl, that's a little YAML to kick the API key out of the code so you can put it on Github, LWP to get the feed, JSON to parse it and XML::RSS to spit it into the desired format. I have it on a server somewhere. I want it specifically so I can put it into Google Reader, then plus the things I really care about, which ifttt then pushes out to my Twitter feed.
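
The pipeline is simple enough to sketch. The real version uses LWP to fetch and XML::RSS to emit; this standalone sketch skips the network, hand-rolls the XML using only core JSON::PP, and the sample payload is invented and far simpler than what the G+ API actually returns:

```perl
#!/usr/bin/perl

# Sketch of the JSON-to-RSS transform step. The sample JSON below is
# made up; the real G+ API output is much more involved, and the real
# script uses LWP to fetch it and XML::RSS to build the feed.

use 5.010 ;
use strict ;
use warnings ;
use JSON::PP qw{ decode_json } ;

sub json_to_rss {
    my ( $json ) = @_ ;
    my $feed = decode_json( $json ) ;
    my $rss  = qq{<rss version="2.0"><channel>} ;
    $rss .= '<title>' . $feed->{title} . '</title>' ;
    for my $item ( @{ $feed->{items} } ) {
        $rss .= '<item>'
              . '<title>' . $item->{title} . '</title>'
              . '<link>'  . $item->{url}   . '</link>'
              . '</item>' ;
        }
    $rss .= '</channel></rss>' ;
    return $rss ;
    }

my $sample = '{"title":"My G+ Posts","items":[{"title":"First post","url":"http://example.com/1"}]}' ;
say json_to_rss( $sample ) ;
```

The real thing mostly adds the LWP fetch, the YAML-stashed API key, and proper escaping on top of this shape.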

So, I'm the one who uses it.

Only me.

This could go on to Google+, Y U No RSS?, but that isn't really the point here. The point here is that this, and similar tools I write, are designed to be things I want and end at being things I want. I think this is a very Maker thought. It's Adam and Jamie in their workshop, saying "I need a rig to drop a ball off a table, so how do I do that?" I have this problem. I solved this problem. I no longer have the problem. If you can do anything with it, fine, but it pains me to say that it isn't really scalable. Which is to say, it isn't a way to turn scratching your own itch into a business.

I do want to think more like an entrepreneur, turning "what can I do?" more toward "what can I do for others?", which is how you make money off these things. This is why I liked the Greater Lafayette Startup Weekend. That mindset is something I could use more of.

2012/02/15

How to manage several MySQL accounts

Here is something that I wish I had known a while ago.

So, I had my .my.cnf filled with the information for the login that was mine, which I used for test. Eventually, I started to work with a production database, and was given one login for general CRUD (generally for the web interface, but useful for day-to-day work as well) and one for creating and modifying tables.

I needed to be able to use several of these accounts at a time, which meant I couldn't just copy one setup's config file to .my.cnf, then copy in another when I needed that one.

My initial take on the solution was to make a script and then an alias with all the correct login information (including the password), then call that when I wanted to get into the admin account. This puts the keys to the database into the process table, which is stupid and wrong and dangerous, but I didn't know a better way.

This morning, I thought to ask the DB StackExchange. Of course, I should've searched more thoroughly first. Here is what a sample .my.cnf would look like:

[clienttest]
host        = server.university.edu
user        = test
database    = test
password    = abc123

[clientprod]
host        = server.university.edu
user        = production
database    = production
password    = abc123

[clientadmin]
host        = localhost
user        = admin
database    = production
password    = abc123

[mysql]
prompt='mysql [\u][\d][\h]>'

You then start mysql by typing mysql --defaults-group-suffix=whatever, which reads the [clientwhatever] section of the config.

That can, of course, be a [client] setting, so just typing mysql gets you into a default space. Adding aliases for easier typing is also useful.

alias test_db="mysql --defaults-group-suffix=test "
alias prod_db="mysql --defaults-group-suffix=prod "
alias admin_db="mysql --defaults-group-suffix=admin "

I think that prompt is too verbose, so mine is cut down to mysql [\u]>, but knowing which account you're in keeps you from accidentally killing something important.

And, of course, in the tradition of other pessimization removers I've blogged, it is good to have an alias to get into your .my.cnf.

alias mycnf="vi ~/.my.cnf"

2012/02/14

"Bug" affecting Safari browsers

Strictly speaking, Safari doesn't have the bug.

Strictly speaking.

Consider this URL: http:/~foo/

It is ugly, isn't it? This showed up in our workflow, not from code I wrote, and when I objected to it, my co-worker said the browser renders it: if the page with this janky URL on it is on http://server.org/, the browsers we used (Chrome, to be specific) would figure out we meant http://server.org/~foo/ and act accordingly.

There are generally two ways of presenting URLs. Absolute: http://server.org/~foo/ , and relative: /~foo . Let me step back from that, because, beyond the server and protocol aspect, /~foo is far more absolute than ../foo. So I don't know what to call http:/~foo/. And I couldn't dig through the RFCs enough to say "RFC #### says no!", so I had to drop back to "It looks janky".

Because Chrome accepts it.

And Firefox accepts it.

And IE accepts it.

And Opera accepts it.

But one of our user labs is mostly Mac, browsing with Safari, and they had a problem with the page with the janky URLs. Safari doesn't know what to do with that sort of URL. I can argue that it shouldn't know what to do with that sort of URL, because nobody should use that janky sort of URL. In fact, I do argue that. The bug was ours, for being janky in what we send, defying Postel's law. That sort of URL is the reason XML hates the world. But Safari drops the ball where other browsers catch it, which bothered our users, who had nothing to do with the issue. That was reason enough for us to code URLs better.

And, it turns out, the default browser on pre-ICS Android phones also hates it.

I am now a registered Apple developer, only so I could get to the bug tracker and report this.

Simple Telephony Tool with Twilio

As long-time readers of this blog might know, I'm a command-line guy, and for any given technology I start to play with, I like to have a command-line tool. And once Scott W. pointed me to Twilio, I had to go that way.

The good thing about that methodology is that it pairs the part you want to learn with stuff you already know well. Because there's a lot of PHP in the Twilio documentation, I went with PHP for the project this weekend, but now that I've had a moment to take a step back, I returned to my preferred methods for this and came up with call_me.pl.

#!/usr/bin/perl

# A simple command-line tool for making a telephone call

use 5.010 ;
use strict ;
use warnings ;
use Carp ;
use IO::Interactive qw{ interactive } ;
use URI::Escape ;
use WWW::Twilio::API ;
use YAML qw{ LoadFile } ;

my $twilio_conf = get_twilio_conf() ;

my $status = join ' ', @ARGV ;
if ( length $status < 1 ) {
    while ( <STDIN> ) {
        $status .= $_ ;
        }
    chomp $status ;
    }

if ( length $status < 1 ) {
    say { interactive } 'No content' ;
    say { interactive } length $status ;
    exit ;
    }

my $twilio = WWW::Twilio::API->new(
    AccountSid => $twilio_conf->{account_sid} ,
    AuthToken  => $twilio_conf->{auth_token} ,
    ) ;

my $url = $twilio_conf->{url} . uri_escape( $status ) ;

my $from     = $twilio_conf->{from} ;
my $to       = $twilio_conf->{to} ;
my $response = $twilio->POST(
    'Calls',
    From => $from,
    To   => $to,
    Url  => $url,
    ) ;

say { interactive } $status ;

exit ;

#========= ========= ========= ========= ========= ========= =========

sub get_twilio_conf {
    my  $twilio_conf = $ENV{HOME} . '/.twilio.conf' ;
    my  $config ;
    if ( defined $twilio_conf && -f $twilio_conf ) {
        $config = LoadFile( $twilio_conf ) ;
        }
    else {
        croak 'No configuration file' ;
        }
    return $config ;
    }

Now, for a few notes of explanation.

First, the modules. I use Carp for error handling, and IO::Interactive to allow output when run interactively but hide it when run from a batch job or crontab. URI::Escape turns the message into something query-string-ready, because the way Twilio calls work is that this call interface points Twilio to a web page that handles the actual interaction of the call. And I'm trying YAML so I don't have to parse the configuration file myself.

To make a call, you need an auth token and account SID from Twilio, plus a number. You can use the sandbox number, but then a code has to be entered to make the call work, so you'll want to get a number from Twilio, too. I put the message URL and the number to call into the YAML as well, because I wanted the same connect-it-all-via-pipes method I use for twitter.pl and facebook.pl to be at work here. If your use case is calling all the server guys when there's a power outage in the server room, you will want to pull the numbers from somewhere else.
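
For reference, a ~/.twilio.conf for this script might look something like this. Every value is a placeholder; only the key names matter, since they have to match what the script pulls out of the loaded YAML:

```yaml
account_sid: ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
auth_token: your_auth_token_here
from: "+17655551234"
to: "+17655555678"
url: http://example.com/twiml.php?message=
```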

As I mentioned, you need a web-available TwiML application to define the interaction. In my case, it looks like this:

<?php
    header( "content-type: text/xml" ) ;
    echo "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n" ;
    ?>
<Response>
    <Pause length="3" />
    <Say language="en" voice="woman">
       <?php echo htmlspecialchars( $_GET['message'] ) ; ?>
    </Say>
</Response>

As mentioned, I did the work over the weekend overwhelmingly in PHP, and this is very much based on that work.

Twilio has an XML format that determines how phone conversations are handled, with enough hooks that I can easily see how you could implement a voicemail system, an extensive phone tree, or a song-of-the-day setup. I don't have that here. Right now, what I'm seeing is a tool to find my phone when I don't have another one to call it, or a way to have my phone tell me when a long-running process ends, or even a for-me wake-up call setup. Twilio can also do SMS, which is what most of my coding this weekend was around. SMS works over the phone network, not IP, so it should still be able to tell me things when I have my 3G shut down to save battery life. So, as a user and not a business, I guess I'm starting to see some practical uses for this stuff.

I will have this up on Github before too long. If you have ideas about what you could do with Twilio, I would be glad to hear them.

2012/02/12

Just got back from Startup Weekend and boy, is my brain tired!

This is not a rant. I entirely enjoyed myself and would do it again. But /var/log/rant is where I go to write about tech stuff, so here it is.

It was the Greater Lafayette Startup Weekend, held at LafayetteTechHUB. Concept works like this:

  • People pitch ideas on Friday. Lots of ideas.
  • Everyone votes about the ones they think are most doable.
  • They then group up and work on the Minimum Viable Product of the idea until about 6pm Sunday.
  • Like "Lion's Den", mentors judge who has the best idea, presentation and product.
This was me the whole weekend.

I don't want to say much, because the idea isn't mine, but I can say it involves SMS in a way where involving SMS makes sense. I used Twilio to do the SMS, plus a few phone calls. There exist Perl libraries for Twilio, which I have tried to poke at before, but the best documentation was for PHP, so I learned PHP with the better documentation rather than digging through an API I didn't understand with code I did.

But, I met some people, people I had previously talked to at GLITS, or seen present at BoilerWeb, or followed on Twitter, or just had never seen before. I learned about PhoneGap and a few other shim libraries you can use to develop mobile apps with JavaScript.

But that isn't the most valuable thing I got.

The absolutely most valuable thing I learned from the experience is how much you can get done with a little vision, a little knowledge and talent, a little money (not much, even) and a lot of dedication. It really was a wonder to see.

Even if I had to look up from the laptop and take my headphones off to notice.

2012/02/09

Thinking Aloud on an SQL Problem

This is an extension of this question from StackOverflow. I have a pile of Xs, each with an id. Some are allowed to be together, and I want a table to hold them. But...
  • I want to be able to group pairs and triads
  • I want a simple interface
StackOverflow users pointed me in the right direction, which is to an idea I had seen before. The table for X could look like this:

 id | string 
----+--------
 01 | aaabbb
 02 | aabbaa
 03 | bbaabb
 04 | bbbaaa
 05 | aaaaaa
 06 | bbbbbb
 07 | bababa
 08 | abbaab

While a grouping table could look like this:

 group_id | x_id 
----------+--------
 01       | 01
 01       | 02
 02       | 01
 02       | 04
 02       | 06

That looks fine enough, except there is no indication as to how many x_ids are allowed in any given group_id. I was thinking/hoping/praying-to-the-computing-gods that only two tables would be necessary, but I'm thinking I will have to make a third:

 id  | size 
-----+--------
 01  | 2
 02  | 3

So, that's the decision. I need to have a third table.
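
Sketched as DDL, with table and column names that are my guesses rather than a real schema, the three tables might be:

```sql
-- The Xs themselves.
CREATE TABLE x (
    id     INT PRIMARY KEY,
    string CHAR(6) NOT NULL
);

-- One row per group, holding the intended group size.
CREATE TABLE x_group (
    id   INT PRIMARY KEY,
    size INT NOT NULL
);

-- Membership: which Xs are in which group.
CREATE TABLE x_group_member (
    group_id INT NOT NULL REFERENCES x_group (id),
    x_id     INT NOT NULL REFERENCES x (id),
    PRIMARY KEY ( group_id, x_id )
);

-- Groups whose membership count matches their declared size.
SELECT g.id
  FROM x_group g
  JOIN x_group_member m ON m.group_id = g.id
 GROUP BY g.id, g.size
HAVING COUNT(*) = g.size;
```

The size column is the whole point of the third table: without it, nothing says whether a group of two is complete or a triad missing a member.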

2012/02/08

Talking Through A Problem

Current task is as follows: We have a series of barcodes. Barcodes are genetic markers that are put at the beginnings of samples to mark them as being of that sample, so you can do the same thing with several different samples and know what's what. My analogy is the different colored shirts schools give kids when going to the zoo or something, so the children from Turing Elementary don't get mixed up with those from Watson Elementary.

Specifically, we're looking at words of six letters, each being A, C, G or T. A and C are m, while G and T are k. We're looking for groups of two, three and four where none of the six positions is exclusively m or exclusively k. From a corpus of 19 barcodes of a possible 48, using the sequencers available, we have developed a set of 247,528 triplets of triplets where:

  • the barcodes within each triplet do not conflict on the m and k thing
  • a barcode used within triplet 1 is not used in triplet 2 or triplet 3, and so on 
Now, we want to cut that list down to sets where, if, for example, samples A and B from triplet 1 and sample C from triplet 3 fail, those samples with those barcodes can be used together. My coworker believes it should be possible to find groups where all 9 can be used interchangeably, but his code so far fails to select them.

I look at that and think that subdividing them into groups that work together, and giving up the ones with the longest conflict list, is probably the way to go.

Which is what I'm now about to implement.
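
The core compatibility test is small enough to sketch. Here's one way to write it in Perl; this is my minimal sketch, not my coworker's code:

```perl
#!/usr/bin/perl

# Sketch: check that a group of six-letter barcodes doesn't conflict,
# i.e. that no position is exclusively m (A or C) or exclusively k (G or T).

use 5.010 ;
use strict ;
use warnings ;

my %class = ( A => 'm', C => 'm', G => 'k', T => 'k' ) ;

sub group_ok {
    my @barcodes = @_ ;
    for my $pos ( 0 .. 5 ) {
        my %seen ;
        $seen{ $class{ substr( $_, $pos, 1 ) } }++ for @barcodes ;
        return 0 if keys %seen < 2 ;    # all m or all k at this position
        }
    return 1 ;
    }

say group_ok( 'ACGTAC', 'GTACGT' ) ;    # every position mixes m and k
say group_ok( 'AAAAAA', 'CCCCCC' ) ;    # every position is all-m, so it conflicts
```

A grouping pass would then call group_ok on candidate subsets and keep the ones that pass.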

2012/02/03

A little bit on OAuth and Net::Twitter on the Command Line

I have a Twitter application up on Github. I quote:

#!/usr/bin/perl

# largely taken verbatim from
# http://search.cpan.org/dist/Net-Twitter/lib/Net/Twitter/Role/OAuth.pm

# Next step is to get the keys and secrets to a config.

# you need to get your own consumer key and secret (which identify you
# as a developer or application) and access token and secret (which
# identify you as a Twitter user). You cannot use mine.

use 5.010 ;
use strict ;
use warnings ;
use IO::Interactive qw{ interactive } ;
use Net::Twitter ;
use Carp ;

my $status = join ' ', @ARGV ;
if ( length $status < 1 ) {
    while ( <STDIN> ) {
        $status .= $_ ;
        }
    chomp $status ;
    }

if ( length $status > 140 ) {
    say { interactive } 'Too long' ;
    say { interactive } length $status ;
    exit ;
    }
if ( length $status < 1 ) {
    say { interactive } 'No content' ;
    say { interactive } length $status ;
    exit ;
    }

say $status ;

# GET key and secret from http://twitter.com/apps
my $twit = Net::Twitter->new(
        traits          => [ 'API::REST', 'OAuth' ],
        consumer_key    => 'consumer_key' ,   #GET YOUR OWN
        consumer_secret => 'consumer_secret', #GET YOUR OWN
        ) ;

# You'll save the token and secret in cookie, config file or session database
my ( $access_token, $access_token_secret ) = restore_tokens() ;

if ( $access_token && $access_token_secret ) {
    $twit->access_token( $access_token ) ;
    $twit->access_token_secret( $access_token_secret ) ;
    }

unless ( $twit->authorized ) {

    # You have no auth token
    # go to the auth website.
    # they'll ask you if you wanna do this, then give you a PIN
    # input it here and it'll register you.
    # then save your token vals.

    say "Authorize this app at ", $twit->get_authorization_url, ' and enter the PIN#' ;
    my $pin = <STDIN> ;    # wait for input
    chomp $pin ;
    my ( $access_token, $access_token_secret, $user_id, $screen_name ) =
      $twit->request_access_token( verifier => $pin ) ;
    save_tokens( $access_token, $access_token_secret ) ;    # if necessary
    }

if ( $twit->update( $status ) ) {
    say { interactive } 'OK' ;
    }
else {
    say { interactive } 'FAIL' ;
    }

#========= ========= ========= ========= ========= ========= =========

# Docs-suggested
sub restore_tokens {
    my $access_token = 'token' ;            #GET YOUR OWN
    my $access_token_secret = 'secret' ;    #GET YOUR OWN
    return $access_token, $access_token_secret ;
    }

sub save_tokens {
    my ( $access_token, $access_token_secret ) = @_ ;
    say 'access_token: ' . $access_token ;
    say 'access_token_secret: ' . $access_token_secret ;
    return 1 ;
    }

I mention this, in part, because I'm going back to the code so I can automate tweets for @PurduePM, the Twitter feed of the Purdue Perl Mongers, which I am taking over.

I remember that, when I wrote this and first tried to use it, it frustrated and confused me. When I tried to dust it off this afternoon, I found it downright easy. I wiped the access tokens and ran the code. It hit the unless block, gave me the authentication URL and waited for a PIN. Given that, it "saved" the tokens (by which I mean "wrote them to STDOUT"). I then put the access token and secret back into the code. The means I have for saving and restoring tokens, which is less than minimal, could easily be improved. Thing is, I have no means within the code to distinguish between users. The usage is twitter.pl This is a tweet or echo This is another tweet | twitter.pl, with the entirety of @ARGV being concatenated into the message, so which account gets used is determined entirely within the code.

But, it strikes me I could make aliases. Alias twitter to "twitter.pl jacobydave " and shift the username from @ARGV. (BTW, I'm on Twitter as @JacobyDave. Follow me.) Then, alias twit_ppm or something to "twitter.pl purduepm " and so on.
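
In the same style as the mysql aliases from the other week, and assuming twitter.pl learns to shift the account name off @ARGV first, that would look like:

```shell
# Hypothetical aliases; twitter.pl would treat its first argument
# as the account name and the rest as the tweet.
alias twitter='twitter.pl jacobydave '
alias twit_ppm='twitter.pl purduepm '
```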

I really got to the "it works, so can I please get to the next thing?" stage and stopped playing with it before I got to the really interesting part. I'm certainly beginning to appreciate how useful and cool OAuth is. I'll attempt to come up with the next better thing, as well as put together the means to store the keys in an external file. Remember, you'll have to go to the Twitter Developer page and create an app to get the crucial consumer key if you want to use this code. If I let you have mine, and someone started using it to spam a lot, Twitter might revoke the key to protect itself.

2012/02/02

Notes on the proper use of UPSes

We had a power outage yesterday.

It only lasted a little bit, two to five minutes, I'd say. My desktop boxen powered down, but that's OK; I'd been wanting to reboot them anyway. The important thing isn't my desktop machines, but rather the instrument machines.

I work in a laboratory, and the field has progressed such that just about anything that could be done without instruments, computers and analysis was done sometime in the last few hundred years. Computers have enabled new ground to be broken, and there are lots of instruments connected to computers via USB or CAT5 or eHDMI cables.

One of these instruments has a UPS, or Uninterruptible Power Supply, connected to it. Consider it a surge protector with 20 lbs of lead-oxide battery behind it. Thing is, the instrument uses a twist-to-lock power connector, and the UPS for that instrument only has twist-to-lock outlets. The computer connected to the instrument did not have a twist-to-lock power cable, so it was not on the UPS, and it lost power.

Lesson: If you have an instrument on a UPS and the PC that controls it is not on a UPS, you do not have an instrument on a UPS.

We actually had an APC Smart-UPS 1500 floating around the office, so I was able to close that barn door after the livestock escaped. Outages occur only once every four years or so here, so this is more threat than menace. But now I have a sense of what needs to be done.