My Startup Lesson Learned

A few years ago, I quit my job to build a bootstrapped startup called SandwichBoard–a content management system for restaurants. Patrick Joyce and I put $5,000 into the business and made a leap of faith to live off savings and occasional consulting gigs.

After we got the sales routine down, we were growing at a steady pace, but not early or fast enough for our personal lives. We both really wanted to marry our girlfriends. We needed money, and ultimately had to ditch the bootstrapping salaries. I was elated to be marrying the woman I had been waiting for my whole life, but throwing in the startup towel felt like a huge loss. It was something I had been working towards for a long time, but giving it up to marry my wife was the best decision I ever made.

Now that I’ve had time to reflect on what went well and what didn’t, I have my own list of lessons learned–things I would do again or make sure never to go near when building a business or product. But you’ve already seen that sort of list before. Instead of dumping a bunch of do’s and don’ts on people, I would rather leave them with an encouragement–one that will give someone else the courage to take that leap of faith.

My lesson learned: even though the startup plane may come to a fiery end, the pilots have ejection seats.

Our efforts were not wasted. While building SandwichBoard, we became proficient in Ruby on Rails, resulting in a great product and more than two significant open source contributions. This helped land Patrick a position at a company called Hungry Machine and me a job at Razoo. Taking that risk propelled us from working typical IT jobs into the startup world. I’ve had fun just about every day since I took that leap of faith, become a better engineer at an accelerated rate, been able to work with incredible people, and turned that savings loss into a short-term investment.

Patrick’s employer renamed itself LivingSocial, and his first major responsibility was to develop a small site called LivingSocial Daily Deals. (Maybe you’ve heard of it.) I rejoined Patrick this July and am leading the technical team responsible for building LivingSocial’s latest local-commerce product: Takeout & Delivery–restaurant takeout and delivery ordering (currently only in Washington, DC). It’s a fitting end to the SandwichBoard story.

Results may vary, but I still think it’s worth a try. Give it a go; jump off that cliff!

strip_attributes, Rails 3, shoulda 2.11 hack

I have never understood why Rails doesn’t strip whitespace from model attributes by default. I know at least one person who tried committing it to core, only to have it rejected. I always end up installing the strip_attributes plugin.

I’m ramping up a new Rails 3 project with Shoulda 2.11. I installed strip_attributes. It works, but the strip_attributes Shoulda macros don’t work anymore. I could take the route of upgrading the plugin, but then I would “have to” also refactor the Shoulda macros, since macros have been ditched for matchers. Then I’d have to rewrite the tests. At that point, I might as well make it a gem and add that feature I’ve always wanted. But I’m just not ready for that sort of commitment right now (#gtd_maybe_someday).

So, here’s a short line to add to the end of strip_attributes/init.rb to get the old strip_attributes Shoulda macros working again:

    # Load the plugin's bundled Shoulda macros, but only in the test environment
    require File.expand_path(File.join(File.dirname(__FILE__), 'shoulda_macros', 'macros.rb')) if Rails.env.test?

JavaScript Style Sheets

One of the privileges of my professional life has been working with Jeremy Keith of Clearleft. I learned a lot from the author and web developer, both through our interactions and by studying his code.

Jeremy is a fan of graceful degradation and ensures that a site will function just fine without JavaScript support in the browser. However, instead of mingling JS and non-JS styles together, he employs a particularly clever technique. He puts the standard styles in the standard CSS file. But just as his sites provide print and mobile stylesheets for printers and mobile devices, he also provides one for JS-enabled browsers. This has two nice side effects: (1) better CSS organization and (2) CSS that is applied immediately, not after JS has had an effect on the DOM.

One example is a thumbnails carousel he designed. If the browser does not have JS support, the thumbnails are arranged in a grid. If the client does support JS, he wraps up the thumbnails in a single row inside a horizontally-scrolling carousel, so they can be browsed left to right. The carousel presentation of the thumbnails requires a different set of CSS rules, which override the vanilla styles.
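
I haven’t seen the carousel’s source, but the JavaScript side amounts to something like this sketch (the thumbnails id and the carousel class name are hypothetical):

  // Wrap the thumbnail grid in a carousel container; rules in the
  // JS-only style sheet then restyle .carousel into a scrolling row.
  window.onload = function () {
    var grid = document.getElementById('thumbnails');
    var carousel = document.createElement('div');
    carousel.className = 'carousel';
    // Put the wrapper where the grid was, then move the grid inside it
    grid.parentNode.insertBefore(carousel, grid);
    carousel.appendChild(grid);
  };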

Here’s how he wires up a JavaScript-only style sheet:

  <script type="text/javascript">document.write('<link rel="stylesheet" href="/stylesheets/javascript.css" media="screen,projection">');</script>
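
The same wiring can also be done by building the link element from script instead of using document.write (a sketch of an equivalent approach, not Jeremy’s code):

  // Equivalent approach without document.write: the JS-only stylesheet
  // is still only requested when JavaScript is actually running.
  var link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = '/stylesheets/javascript.css';
  link.media = 'screen,projection';
  document.getElementsByTagName('head')[0].appendChild(link);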

Any site I’m involved in will be following suit.

Thanks, Jeremy!

Specificity, Eames. Specificity?!

Way back when, I suggested that CSS authors use classes as much as possible. It was a foggy idea at the time, but, thankfully, someone important stumbled upon the same concept and codified it. Around the same time, it started to become popular for developers to consolidate CSS and JavaScript into two files for their sites. This technique saves HTTP requests, speeding up users’ experience of your site.

Now that I have had a few years to use these techniques, I’m happy to report they work beautifully; that is, until you have large CSS and JavaScript files that take a long time to download. Long downloads risk new visitors’ first impressions. You want them to be wowed. A slow introduction doesn’t wow.

Large CSS and JavaScript files are often symptomatic of defining a heterogeneous set of styles or behaviors for very particular elements on very particular pages. In other words, styles and behaviors that aren’t re-used; ones that don’t scale. Here are some straightforward techniques to lighten those two global files:

  1. Page-specific styles. I see CSS as a way to define general styles that affect large portions of a site (using classes). If you want to do something really, really special on a particular page, just attach inline styles to the handful of elements that want to be different. (ID selectors are a dying breed.) There’s no need to define rules in a global style sheet if they’re not really global. If a page accumulates so many inline styles that the markup becomes unmanageable, you can always extract them into a page-specific stylesheet. One extra HTTP request on one page (or a handful of pages) won’t kill you or your users.
  2. Page-specific JavaScript. Users may never even get to some pages; why slow down their first request to your site to grab JavaScript for them? Don’t be ashamed to slap in a JavaScript script block at the bottom of your page’s markup, as sketched after this list.
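
For instance, a page-specific block might look like the following; the element id and the behavior are hypothetical stand-ins for whatever that one page needs:

  <script type="text/javascript">
  // Sits at the bottom of the one page that needs this behavior, so it
  // never weighs down the global, site-wide JavaScript file.
  document.getElementById('signup-promo').onclick = function () {
    this.className += ' expanded';
    return false;
  };
  </script>
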
There is a continuum between two consolidated files for all your CSS and JavaScript on one end and obtrusive, page-by-page CSS and JavaScript on the other. As with many other things in life, you’ve gotta find the healthy balance.

Document Domain

Every few years I run into an issue with JavaScript-based rich text editors and spellcheckers when they spawn pop-ups. The pop-ups open but don’t function.

If I open my Firebug console in the pop-up, I see something like:

Permission denied for <http://assets2.mysitedomain.com> (document.domain has not been set) to get property Window.tinymce from <http://www.mysitedomain.com> (document.domain has not been set).

Chrome’s console shows a similar error:

Unsafe JavaScript attempt to access frame with URL http://www.mysitedomain.com/mypage from frame with URL http://assets2.mysitedomain.com/javascripts/lib/tiny_mce/themes/advanced/source_editor.htm. Domains, protocols and ports must match.

In this case, TinyMCE’s HTML plugin is running up against JavaScript’s same-origin policy because I’m serving assets (and therefore TinyMCE pop-ups) from a different fully-qualified domain name than the page TinyMCE is embedded in. When not explicitly set, my site’s pages default to a document domain of www.mysitedomain.com, while TinyMCE’s pop-ups default to assets2.mysitedomain.com.

The simple fix is to bump up the document domain on my site’s pages to just mysitedomain.com. I do this in my global JavaScript file. I do the same thing to TinyMCE’s tiny_mce_popup.js file.

    // Relax both pages' document domain to the shared parent domain
    document.domain = 'mysitedomain.com';

(You might also know that cookies need a similar bump when trying to read and write to them across subdomains.)
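
For example, a cookie meant to be readable from any subdomain gets written with an explicit domain attribute (a quick sketch; the cookie name is made up):

    // Readable from www.mysitedomain.com, assets2.mysitedomain.com, etc.
    document.cookie = 'theme=dark; domain=.mysitedomain.com; path=/';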

Although this works, there is a problem for those developing locally: there’s a good chance they’re not developing at mysitedomain.com but something like localhost. A page at localhost certainly isn’t allowed to claim its document domain is mysitedomain.com.

To handle both cases, we can instead set the document domain smartly, by putting this in both our global JavaScript file and in tiny_mce_popup.js:

    // Last two labels of the hostname, or all of it for dotless hosts like localhost
    document.domain = /(\w+)(\.\w+)?$/.exec(location.hostname)[0];

Gift Cards Getting Declined?

Several months back, I was testing an application’s ability to charge credit cards before going live with it. The application allowed people to make one-time purchases, purchases using a previously saved credit card, and subscription purchases wherein people’s credit card information was saved for future use. Like many sites, we used a third party to charge and store the credit card information securely. (In our case we used Authorize.Net Customer Information Manager–also known as CIM.)

As part of my testing, I used VISA, MasterCard, and American Express gift cards to make sure they worked too. We didn’t want to miss out on any potential purchases. However, we ran into a curious situation when using gift cards as saved cards: the first transaction would be approved, but any subsequent attempts would always be declined.

I spent the majority of a day talking with our merchant gateway (Authorize.Net), our payment processor, and the gift card issuing banks to try to understand what in the world was going on. Just like any perfectly-executed Department of Defense project, not one person knew or could explain the big picture. Each person could only tell me what they knew about my declined test transactions and point fingers at someone else. After speaking with everyone, though, I was able to extract what was going on.

To make a transaction with a credit card, online or in the real world, you must provide a piece of verifying information. There are two options: (1) the security code on the card (three digits on the back, or four on the front of an American Express card) or (2) your physical address.

The catch about using your physical address for a gift card purchase is that you must first register the gift card with the issuing bank so that they have your address on file to verify it. The problem with address verification is that virtually no one registers their gift cards. (I didn’t even know it was possible.)

The problem with security codes is that no one other than the issuing bank is allowed to store them. The merchant gateway may capture and send along the code when making the initial purchase, and even when storing the card, but the code will never be available for subsequent transactions. This is why our gift cards were being declined when used as saved credit cards: we were sending the customer’s address to the payment gateway, but the issuing bank didn’t have an address on file to verify, and we couldn’t send the security code because Authorize.Net hadn’t stored it.

So, what could be done? Well, we couldn’t just reject gift cards on our site, because there is no way to determine whether a card is a credit card or a gift card. (The only parties who know are the cardholder and the issuing bank; credit card processors and merchant banks are agnostic.) We could have required the customer to provide a security code on every saved-card or subscription transaction, but that inconvenience to all of our customers would defeat the purpose of storing credit cards for automatic processing. The only viable option was to tell customers to register their gift cards with the issuing bank if they wish to use a gift card as a saved card or for subscriptions.

Hindsight is 20/20, so this makes perfect sense. However, it was quite confusing before the epiphany. Hopefully this will save some others a day of investigation.