Ships Through the Night

Last week I participated in a Ship-It event at work.

The concept: the engineering staff form teams around concepts and ideas. Innovations that have been on people’s minds. Things that they think might be useful. Wow an audience.

Then at 2pm on Thursday, coding starts, and at 2pm on Friday, pencils down.

The aim is to put together a presentation or demo for an audience of everyone in the company. After all the presentations we vote. After the vote, one team wins: the esteem of victory, their names on a trophy to be put somewhere in the office, and a custom-designed, one-run-only t-shirt.

Most of us are in it for the t-shirt… we have some kick-ass designers in our company.

The team I was part of started the event with a great idea for an improvement to some internal tooling. It was never going to win because it doesn’t cater to a broad enough audience, but it will make our own lives easier.

First lesson: 24 hours is not a lot of time.

We took some time designing a solution prior to the event. We had a fairly good idea of all the moving parts. Some databases, a message queue, some background processes and a Web UI. As I type that list up, I’m already wondering what kind of crack we were smoking to think it’d fit. But it’s good to be ambitious. I love hard problems.

By 6pm though it was time for dinner. Some of the chefs had stayed behind to make us something they called “chips”, but which so far surpassed the food item I have come to associate with that label we might as well pretend it was called “caviar” instead.

It didn’t feel like we had made anywhere near enough progress, and the day was getting late.

By 9pm, the team was starting to itch to go home and get some sleep. I foolishly stayed till about 10:30pm thinking I might make some more progress through the night, but my environment was exhibiting issues I couldn’t solve with a tired brain by myself.

Second lesson: stay or stay-not.

I probably should have headed home earlier than I did. A good night’s sleep is a great way to restore productivity and get a fresh perspective on all the problems from the previous night. I guess this is why working weeks have nights at home scattered throughout them as well.

At 10:30pm I briefly considered sleeping in the office somewhere. I had prepared for staying and showering at work the next day. But I also had a kink in my neck and as comfy as the lounges are, I wouldn’t have been able to sit up straight in the morning. Maybe I could have won a sympathy vote or two with a Hunchback routine, but it didn’t seem worth it.

I got home at about midnight, slept 5 hours, and headed back into the office early to get a head-start on the day. I was back at my desk by 7am feeling much fresher, albeit a little worn out.

If all this sounds exhausting, it is.
But it is also so much fun that it just doesn’t matter.

Third lesson: redefine the problem.

We came to the conclusion early on Friday that, although only about half the available time had been spent, we weren’t going to have a complete working system by 2pm.

So we did the smart thing.
We pivoted.

Rather than trying to produce a full working solution, we started to work on proving the feasibility of as many of the parts of the full solution as possible. We proved a migration path. We proved throughput. We proved the result we were aiming for.

And then we presented our findings with a demo of a Web UI over some batch-processed data rather than the live feed we would have liked to have had.

Fourth lesson: 24 hours is more time than you can imagine.

And then I presented for my team. Did an accidental mike-drop at the end (that’s my story and I’m sticking to it).

And we didn’t win, of course. 8th out of 11.

But what struck me is the amazing things that the teams came up with in a mere 24 hours. Some teams stayed far later than I did; perhaps because they didn’t have to travel far to find their own beds. Next time I hope to find the strength to go around the clock myself.

There is something magical about dropping all process and standards, and just racing for the best you can do in a limited amount of time. It is refreshing. And at the same time it gives a good appreciation of why some level of process is necessary.

I shudder to think of the code I wrote over 16 hours actually being the final version. It is atrocious by production standards. But it also worked brilliantly to address a tangible problem.

And then we rest

From what I understand we’ll be doing these every 6 months with fresh ideas to pursue.

It is a brilliant way to blow off steam. To try stupid things in stupid ways. And to work with staff outside of my own direct team. Mix-and-match.

I just need to prepare a palatable place to sleep in the office.
Just a few hours will do.
And I already have an idea.

2015-04-24 - Fireworks

Ship-It… Ship-It-Good

The day that I shall call Thriday is going to be an interesting one.

Tomorrow at noon, pretty much all the Engineering staff at Campaign Monitor will be challenging each other to build the best product they can from scratch in a 24 hour time period.


Because, Why Not!
Also, it is presumably fun. And it sounds like there is a prize for the winner. Who knows.

There is a minimal amount of prep that gets done in the lead-up to the day; trying to woo other staff to join your project, brainstorming, some designing and planning. But all the heavy lifting gets done in just one day.

It’s an interesting exercise in discipline. It’s very easy to design something that is just too complicated to fit within the allotted time with a small team. Everything has to be agile. Everything has to be shippable in minimal increments, because there are no extensions to the deadline. And then it has to be as impressive to all our colleagues as it can possibly be, because we are also all judges.

I think the team I’m on has got an interesting problem ahead of us.

There’s a bit of intense data processing involved, a bit of web UI, a legacy system we need to suck data from, and then on Friday a presentation to do.

It really is going to be a little microcosm of IT: planning, managing, designing, developing, testing, selling.

I’m looking forward to finding out who will still be in the office by midnight.

Regular Like Clock-work

That is to say: with a whole bunch of wobbly and spinny bits that nobody quite understands the need for, but without which the mechanism just suddenly fails in spectacularly unpredictable ways.

You guessed it… Regular Expressions.

The greatest, most terrible tool ever invented for quick-and-dirty validation in web front-ends around the Interspaces.

I’ve been working to improve first-pass URL validation logic in the web front-end. I started by trying to read the existing regex, but it looked like a cat had just mashed some random symbol keys to a length of about 200 characters. And I knew it wasn’t allowing all the URLs we’d like to accept.

I decided to go back to first principles; RFC 3986 – URI Generic Syntax. The first shock was learning that the following is a perfectly legal URL:


And I haven’t even used Unicode characters anywhere in that example yet.
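For a flavour of just how permissive that grammar is, here is a hypothetical example of my own (not the one from the RFC): every character in it is legal per RFC 3986 without any percent-encoding, and Node’s WHATWG URL parser accepts it without complaint.

```javascript
// A hypothetical example of my own: every character below is legal in a
// URL per RFC 3986 -- no percent-encoding required. The sub-delims set
// !$&'()*+,;= plus ":" and "@" are all valid inside a path segment.
const odd = "http://user:pa$$word@example.com:8080/!$&'()*+,;=:@segment?q=a?b?c#frag";
const parsed = new URL(odd); // does not throw
console.log(parsed.hostname); // "example.com"
console.log(parsed.port);     // "8080"
```

Note even the query may contain further question marks; only “#” ends it.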

First, the temptation is to go to the back of the RFC, and just translate the BNF notation into a Regex and be done with it. Alas, I didn’t think I could accurately transcribe that many permutations without slipping up… and regexes are hard enough when you have a clear idea of what exactly you are parsing.

Second, the important realisation that it doesn’t have to disallow everything that isn’t a valid URL. This is about helping the users by catching the most important mistakes they might make. If anyone decides to actually use an IPv6 literal as a host identifier, then it really isn’t important to check whether the exact right number of hex words were used.
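That trade-off can be sketched in a few lines; the check below is a hypothetical, deliberately lax IPv6 test in the same spirit as the one used later on: it accepts some invalid-but-harmless input, while still rejecting obvious nonsense.

```javascript
// Hypothetical sketch of "catch the important mistakes, don't fully
// validate": any bracketed blob of hex digits, colons and dots passes.
const laxIpv6 = /^\[[a-fA-F0-9:.]*\]$/;
console.log(laxIpv6.test("[2001:db8::1]"));       // true
console.log(laxIpv6.test("[1:2:3:4:5:6:7:8:9]")); // true: too many words, accepted anyway
console.log(laxIpv6.test("[not hex]"));           // false: clearly wrong characters
```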

So, when squinting just-right at the RFC, it is easy enough to come to the following right-to-almost-right rules:

  • The group [\w\$-\.!;=@~] is a great approximation for the permissible alphabet for most of the textual parts of a URL; in some places that might allow slightly too much, but it restricts all the characters that really do not belong.
  • “#” is only permitted exactly once, after which all further text is considered the fragment identifier.
  • “?” is not permitted until the query portion at the end of the URL, but can occur as many times after that as you want.
  • Allowing excess square brackets makes capturing the part between the “//” and the first following “/” easier. Making the expression more specific helps break down the results into more logical parts.
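A quick sketch of that first character class in action; the cryptic-looking range \$-\. spans the code points from “$” (0x24) to “.” (0x2E), i.e. the characters $ % & ' ( ) * + , - . in one go (the sample strings are my own):

```javascript
// [\w\$-\.!;=@~]: word characters, the range $ through . (which covers
// $ % & ' ( ) * + , - .), plus ! ; = @ ~ -- a close approximation of
// the "textual parts" alphabet described above.
const textual = /^[\w\$-\.!;=@~]+$/;
console.log(textual.test("blog-post_1.html")); // true
console.log(textual.test("has spaces"));       // false: space is not allowed
console.log(textual.test("no#hash"));          // false: "#" starts the fragment
```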

What I have landed on for now is the following (finessed so that the capturing groups try to catch the logical parts of a URL):

  (?:(https?|ftp):)? # URL Scheme Identifier: http, https, ftp
  (\/\/ # Literal //, opening the authority-or-path group
    ([\w\$-\.!:;=~]*@)? # Followed by optional username:password@
    ([\w\$-\.!;=~]* # Followed by hostname
    |\[[a-fA-F0-9\:\.]*\]) # Or IPv6 address
    (\:\d*)? # Followed by optional :port

    |\/[\w\$-\.!;=@~]+ # Or literal / and a path segment

    |[\w\$-\.!;=@~]+ # Or no slashes and a path segment

    |) # Or... nothing at all!

  ((?:\/[\w\$-\.!:;=@~]*)*) # Rest of the URL path

  (\?[^#]*)? # Optional query: ?...

  (#.*)? # Optional fragment: #...

I’m a little sad that named groups are not available in JavaScript; remove all comments, whitespace and line-breaks from the above, and you can expect the capturing groups to contain the following:

  1. The scheme: http, https or ftp
  2. Either “//” followed by a host (authority), or otherwise the first part of the path
  3. The username:password@ of the authority, or nothing if absent
  4. The hostname from the authority
  5. The :port of the authority
  6. All of the URL path if there was a host (authority), or otherwise the remainder of the path after the first level
  7. The ?query portion of the URL
  8. The #fragment portion of the URL

Clearly some more post-processing is needed to extract the actual values, if you want to. I strongly recommend using a proper Uri class if you really want to process the content, rather than just getting a quick yes/no on whether a URL seems plausibly valid.
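Squashed into one line, the expression can be used like this; a minimal sketch, where the anchoring with ^ and $ and the parseUrl helper are my own additions for whole-string validation:

```javascript
// The free-spacing regex from above with comments, whitespace and
// line-breaks removed, plus ^/$ anchors (my addition) so the whole
// candidate string must match.
const urlRegex = /^(?:(https?|ftp):)?(\/\/([\w\$-\.!:;=~]*@)?([\w\$-\.!;=~]*|\[[a-fA-F0-9\:\.]*\])(\:\d*)?|\/[\w\$-\.!;=@~]+|[\w\$-\.!;=@~]+|)((?:\/[\w\$-\.!:;=@~]*)*)(\?[^#]*)?(#.*)?$/;

// Hypothetical helper: the match array if plausible, otherwise null.
function parseUrl(candidate) {
  return candidate.match(urlRegex);
}

const m = parseUrl("http://user:pass@example.com:8080/path?q=1#frag");
console.log(m[1]); // "http"                         -- scheme
console.log(m[2]); // "//user:pass@example.com:8080" -- authority
console.log(m[4]); // "example.com"                  -- hostname
console.log(m[8]); // "#frag"                        -- fragment

console.log(parseUrl("ftp://[::1]/file") !== null); // true: the lax IPv6 check at work
console.log(parseUrl("not a url"));                 // null: spaces are rejected
```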

Next stop… email addresses – RFC 5322.

As agonising as all this sounds, even to me, I am actually having a great deal of fun right now.