April 30, 2014

Looking at Data: 2007 NSF research expenditures and research outcomes

In this week’s installment of Looking at Data, we’re digging into data from the National Science Foundation and NASA. The data can be found at Research.gov; it covers research grants from the NSF and NASA awarded in 2007 and how those grants fared in terms of publications and conference proceedings. Data is available all the way through FY 2013, but I chose 2007 because research takes time, and, well, I wanted as much data as possible on research outcomes (in this case, publications and conference proceedings).

The question I wanted to look at this week is whether “scrappiness is the Mother of invention.” This came up in a conversation I had with my friend @DaveyDeMille over lunch last week. In other words, is being well funded correlated with higher research quality, as measured by the proportion of NSF grants that lead to a publication?

First off, just some descriptive stats:

Now moving on to our question:

So although this is pretty cursory, it appears that there’s not really a relationship between getting lots of NSF funding and having lots of publication-yielding projects. I’d say, without digging more into this, that scrappiness is, if not the Mother of invention, at least the aunt.
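For anyone who wants to reproduce this kind of check, here’s a minimal sketch of computing the correlation between total funding and publication rate. The data rows and the idea of grouping by institution are made up for illustration; the actual Research.gov export has its own column names and structure.

```python
# Sketch: is total funding correlated with the share of grants
# that yield a publication? Data below is synthetic, not from
# the Research.gov export.

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical rows: (total_funding_usd, publication_rate)
rows = [
    (1_200_000, 0.42),
    (300_000, 0.51),
    (5_000_000, 0.38),
    (750_000, 0.47),
]
funding = [r[0] for r in rows]
pub_rate = [r[1] for r in rows]
print(round(pearson_r(funding, pub_rate), 3))
```

A value near zero would back up the “no real relationship” reading above; a strongly negative one would be a point for team scrappy.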

EDIT: I’ve been reading up on UX lately, following 52 Weeks of UX (an awesome resource, by the way), and came to Week 4 today. They have an excellent post on how constraints force creativity; go read it here: Constraints Fuel Creativity.

April 22, 2014

Takeaways from Optimizely’s 2014 Opticon

Last week I had the honor of flying out to San Francisco for Optimizely’s first Opticon conference to represent CLEARLINK.


I wanted to soak up all the knowledge and wisdom from the source itself. As a direct response marketing company, we’re continually trying to understand our customers and what makes them tick, so testing and optimization are central to how we deliver an improved and more compelling marketing message on behalf of our brand partners.


I was floored by the quality of the event, the speakers, and the panels Optimizely had put together!

Additionally, Rachel Johnson and I were honored to be the recipients of the inaugural Optie Award in the category for “Best Advanced Feature Use”!


Below are the three biggest takeaways from my experience.


1. Use A/B testing for “argument prevention”
Dave Nuffer (Liftopia) and Alissa Polucha (Microsoft)

Testing helps us remove “I think” and “I like,” and move towards “I know”

2. Focus on the problem, not the solution

Hiten Shah (KISSMetrics)

  1. Find out why people don’t convert
  2. Use the exact words customers do
    • What do or don’t you like?
    • What is the primary benefit you received from X?
    • How would you describe our product to a friend?
  3. Be very specific with calls-to-action
  4. Start with a hypothesis, then prioritize
  5. Create experiments that are small
  6. Continuously experiment
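The “start with a hypothesis” point can be made concrete with a basic significance check on an experiment’s results. This is a generic two-proportion z-test sketch, not Optimizely’s own stats engine, and the conversion numbers are invented:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic for an A/B conversion test."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented numbers: control converts 200/5000, variant 260/5000.
z = two_proportion_z(200, 5000, 260, 5000)
print(round(z, 2))  # |z| > 1.96 is significant at the 5% level
```

Framing the test this way forces you to state the hypothesis (the variant converts better) before you look at the data, which is exactly the discipline the list above is asking for.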

3. Experimentation never ends: iterate!

Take the learnings from your last experiment and always try to improve on them. For example, Alissa Polucha from the Microsoft Store shared how the MS Office page on microsoft.com has gone through six iterations of testing so far, just to get to where it is today.