If you enjoyed this post, share it.

If this is your first time reading, I recommend you start with my 6-month challenge and table of contents of weekly posts for the past 8 weeks.

tl;dr This week I spent a lot of time on an email experiment MVP and not as much time as I would have liked on the actual MVP v2 (actually, now MVP v3!). However, I learned a lot about email marketing, copywriting, and consumer/behavioral psychology these past seven days and don’t consider my time wasted at all.

I think you guys are in for a treat this week; I am incredibly excited to write this post and share it with my readers.

Why did I do this email experiment MVP?
It’s been almost two months since I set up my Launchrock page and less than a month since I first communicated with my beta testers sourced from Meetup (the exact length of time varies per user and communication), and I wanted to give them some updates. Like I said in a blog post last week, I did not want to go silent and leave my users hanging in the dark for months at a time.

So I decided to send my first email to them and simultaneously conduct a stealthily designed MVP to extract potentially valuable and more importantly, actionable data from them.

I’ve always been under the impression that users don’t always know what they want — moreover, users may say they want something but end up contradicting themselves through their actions. Actions speak louder than words, and what better way to test that than to give my users something and observe their actions (or lack thereof)?

I’m going to show you the exact step-by-step play-by-play of my thorough email experiment that served as a mini-MVP, if you will. Perhaps you can take these lessons to design your own experiments (and if you do, please email me — I’d love to hear about it!).

Ready to get started?

You’ll want to get comfortable, since this is going to be a very long and detailed post.

I. Executive summary
II. Experiment design
III. Hypothesis and goals
IV. Timeline
V. Follow-up emails and copywriting
VI. Actual results
VII. Unhappy emails
VIII. Post-mortem thoughts
IX. Next week and lessons learned

One of the most common types of advice we give at Y Combinator is to do things that don’t scale. A lot of would-be founders believe that startups either take off or don’t. You build something, make it available, and if you’ve made a better mousetrap, people beat a path to your door as promised. Or they don’t, in which case the market must not exist.

Actually startups take off because the founders make them take off. There may be a handful that just grew by themselves, but usually it takes some sort of push to get them going. A good metaphor would be the cranks that car engines had before they got electric starters. Once the engine was going, it would keep going, but there was a separate and laborious process to get it going.

– Paul Graham, Do Things That Don’t Scale


My experiment in a nutshell:

This email experiment MVP served four purposes:

  1. To give a product update (via MVP screenshots)
  2. To offer four potential gluten-free service options available exclusively to beta testers (more details on this below)
  3. To allow users to vote on which option appealed to them the most and thus tell me which “feature” was the most valuable
  4. To observe whether users would be willing to pay, by requiring a $1 preorder for a service option as part of “voting”

The goal of this email was to update them and also to collect actionable data and click-throughs on four major “features” I thought would set Cusoy apart from a typical gluten-free app — I wanted to know which single one was the most valuable of them all. Which one would they vote for the most? Moreover, would they want to preorder it?


Pay careful attention to how I designed my experiment. It was critical for me to separate the interest factor from the intent-to-buy factor. First: which options interested them the most? The number of click-throughs indicated a user’s particular interest, free of any bias. Then, once a user selected a service option that interested her, would she want to preorder it for $1? Introducing money as a factor makes things very interesting and actually somewhat confrontational.

This is the workflow of the experiment:

  1. Value proposition email with four options
  2. Preorder website requires $1 preorder to vote

1. Value proposition email with four options

User receives email with product update and four potential gluten-free service offerings, with a call-to-action to “vote” for one or multiple service offerings.

Here’s a sample of two of the four services I offered:



Note that the call-to-action is “I’m Interested” and that there is absolutely no mention of payment being required to vote. At this point, the user thinks that clicking “I’m Interested” constitutes a vote.

Let’s say a user is very interested in the 3 curated dish recommendations because she is tired of eating at the same restaurants and ordering the same gluten-free dishes each time. She is intrigued and clicks “I’m Interested.”

Now what happens?

2. Preorder website requires $1 preorder to vote

After she reads the email and clicks “I’m Interested,” she is taken to a preorders website that restates the gluten-free service offering. Moreover, she is then told that in order to vote, she must place a $1 preorder for the service.



Take a second to read the sales copy, then read it again.

There are six major marketing elements in play here that are very important to recognize.

Can you pick them out?

What catches your eye the most? Would you be persuaded to prepay $1 to vote?

Now let’s deconstruct this page a little so I can point out everything I did to make it as frictionless as possible and make it “easiest” for the user to say, “Yes, I will preorder for $1 to vote.”

I originally had written this out point-by-point, but I thought it might be easier to just do a quick annotation 😉


Remember, services are chosen based on cumulative number of votes — the service which gets the most votes gets “chosen.”

Note: The PayPal button actually was not there in the first preorders website iteration; it came after a user requested a PayPal payment option rather than Gumroad. The PayPal button here looks extremely tacky and turns me off — very sales-aggressive and pushy. Unfortunately, I have absolutely no control over how the PayPal button looks. I do think the button’s appearance had a significant effect on how this experiment played out later on.

Now that you have an excellent idea of the experiment purpose, design and workflow — let’s talk about my hypothesis and goals next.


tl;dr I hypothesized that one or two users would actually pay the $1 preorder and that the service offering of curated restaurants and user ratings/reviews would receive the most indirect “votes.” My goal, as I mentioned earlier, was to see users’ actions translate into actionable data.

Friction to pay
I thought I would have one or two users who would voluntarily pay. I knew going in that it would be extremely difficult to convince any user to pay: this was my first-ever email to the beta testers, and while I’ve personally and individually communicated with most of them at least once via email (and with some over the phone or in person), Cusoy was — and is — still an unknown and unproven product.

It’s a pretty tremendous leap of faith for anyone to preorder something that they haven’t seen or experienced, even if it’s for $1, less than the price of a cup of coffee. At least for coffee, you know what you’re getting and the value meets (or perhaps exceeds) the price of the cup of coffee.

With these services, you have no idea what they’re like or if you’ll actually find them valuable.

Moreover, the next thing you look at is the seller’s or company’s background — and I’m still relatively new, as is Cusoy. So I was pretty unproven to them as well.

I was curious to see if the $1 would make any difference — but at the same time, whether it’s $1, $5, $25, $100, or $1,000, a user still has to pull out her wallet, physically take out her credit card, and enter the information. (This is also why one user asked for PayPal integration: she didn’t have her credit card on her, and PayPal accounts are ubiquitous.)

Service option
I personally thought that service option #2 — curated restaurants and user ratings/reviews — would get the most clicks, or indirect “votes,” based on feedback that a lot of users wanted to see ratings/reviews of restaurants from other people like them: people with gluten-free dietary profiles out of necessity.

However, I had also offered service options that weren’t even on the radar of what users had told me — based on some recurring feedback from friends (not targeted Cusoy users) and on the fact that users don’t always know what they want until you put it in front of them.

So things got a little interesting.

Let’s now take a bird’s-eye view of the experiment timeline and its major milestones.


Here’s a quick and dirty calendar view of the email experiment MVP timeline with five important events:


1. Tuesday, September 3 – Email idea, feverish design

I got coffee with Jason Shen in DC that afternoon, and my chat with him sparked the idea of doing a quick and dirty email experiment MVP to see what users really want (and to satisfy my burning curiosity about whether they would pay for something sight unseen). I wanted to see if I could make that first dollar, that first sale — and I rationalized that people would pay for something if it truly solved their problem.

I was also very intrigued by the following:

In a previous post a couple weeks ago, I mentioned making a landing page MVP and totally scrapping it — anyway, I realized revamping my landing page was not the best medium for this MVP experiment, and more importantly, I already had a group of eager and waiting beta testers on my email mailing list whom I could work with!

I then spent the entire night in Barnes and Noble, and once the bookstore closed, I retreated to my room at home and did the following:

  • Designing the experiment, including MVP screenshots and service offerings
  • Going through multiple drafts of copywriting and copyediting in Mailchimp
  • Designing and tweaking a template layout to suit my needs
  • Setting up a WordPress website to take preorders — I tried out Gumroad, Ribbon, and PayPal (Stripe is too developer-y for me; I wanted quick and easy and stuck with Gumroad)

2. Wednesday, September 4 – Sent out email to beta testers, vote deadline Sept 12

Not going to lie, I was very nervous hitting the Send Campaign button in Mailchimp.

3. Monday, September 9 – Reminder email with promise of free guides on Thursday

This was a reminder to vote for people who were still on vacation — remember, my original email on September 4 went out just after Labor Day weekend. Judging from Mailchimp’s demographic stats, it was kind of crazy to see all the different places users had opened my email from.

Also, to “soften” the blow of asking users for a $1 preorder (after the email jump) — I felt somewhat guilty about it — I promised five free 10-restaurant guides for five different regions: San Francisco, East Bay, Peninsula, South Bay, and Napa/Sonoma (I didn’t actually end up doing Napa/Sonoma, since no beta users were there).

Oh yes — this was the other thing I spent a TON of time on this past week: creating restaurant guides. I originally wanted to do 25 restaurants per region, then decided on 10, since that’s a nice number for a sample guide. I’m very glad I chose to downsize from 25 to 10… otherwise I probably would not have finished the guides in time!

4. Thursday, September 12 – Last call to vote and released free guides

I released four free guides (which I had spent 30+ man-hours constructing over 3 days) to the users. These were 100% free — no gimmicks and no stealthy prepayment orders 😉

This was also a final reminder to vote (voting closed at Thursday midnight) and that I’d announce the “winner” service on Tuesday, September 17.

5. Tuesday, September 17 – Announce winner service

After voting finished and everything was counted, I would announce the winning service, which would be offered to beta users free for a limited period of time.


There’s a ton of incredibly interesting information out there on best practices for writing sales/marketing copy — something that really interests me, too, in the realm of behavioral psychology. It really can make all the difference in persuading users to open your emails and do what you want them to do, whether that’s clicking through, making a purchase, etc.

Here are three key iterations in my follow-up email on Monday, September 9 that I wanted to point out:

1) Email subject line – create a clear action and command for the user:

“Action Required: Vote for most valuable gluten free service offering, Upcoming Gluten Free Restaurant Guides”

I had noticed several emails that had an “Action Required…” subject line and that piqued my interest because I felt that it was directly calling out to me that I needed to do something. It made those emails stand out from other marketing emails in my inbox and I decided to do the same for Cusoy.

2) Limited time offer – create a sense of urgency:


Again, there are marketing studies and research showing that users can and will act when there is a sense of urgency. It is a tried-and-true marketing tactic. But users won’t know about a “limited time offer” unless you tell them that the offer is going to end soon. Like Parkinson’s law, if you give a user a week to do something (as I gave my users a week to “vote,” due to Labor Day weekend), he won’t do it until the very last second (if he remembers at all) — unless you remind him that your offer is going to expire.

I had also written copy saying that it would take just 60 seconds to “vote” and take advantage of the limited-time offer. Again, you want to minimize the cognitive load on the user as much as possible so he can decide and act quickly, rather than sit there wondering whether something will take too long and thus isn’t worth his time.

3) Create a clear call-to-action:



Here, it is abundantly clear what “I vote for this service” means, whereas it’s not very clear what “I’m Interested” means.

This also creates the illusion that clicking “I vote for this service” is itself the vote — when actually the user is then redirected to the preorders website and horrified to learn that she has to place a $1 preorder in order to vote 😉 But they didn’t know that! It’s part of the thorough design of this experiment, and again, it separates interest from intent to buy.

These marketing changes were noticed by one of my (more vocal) beta users:


OK, now let’s talk about the actual results. You’ve been slogging through all this text — what actually happened?


Let’s look at the Mailchimp and Gumroad/PayPal results.

1. Wednesday, September 4 – Sent out email to beta testers, vote deadline Sept 12

Email open rate: 74%
Email click rate: 47%

Out of those who opened this email, 64% clicked on a link in the email. Pretty good.
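As a quick sanity check, the clicks-per-open figure is just the ratio of the two Mailchimp rates, assuming both are measured against the same denominator (delivered emails). A minimal sketch with the September 4 numbers:

```python
def clicks_per_open(open_rate: float, click_rate: float) -> float:
    """Share of openers who clicked, assuming both rates use the
    same denominator (delivered emails)."""
    return click_rate / open_rate

# September 4 email: 74% opened, 47% clicked a link
pct = round(clicks_per_open(0.74, 0.47) * 100)
print(pct)  # 64
```

The same ratio gives the later figures, e.g. the September 9 email: 21.2% / 55% ≈ 39%.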

Remember, this email also contained MVP screenshots — thumbnails that users could click to enlarge to full size. (My original draft had full-size screenshots inside the email — yeah, bad idea! Thumbnails are better, and moreover, I can clearly see which screenshots users are most interested in by tracking their clicks.)

So, the majority of the clicks in this email were on the MVP screenshots, unfortunately.

But, I did get some good introductory data on the four potential offerings.

We’ll start the running tally of votes now (for these purposes, a “vote” is a unique “click” — Gumroad payments covered later in this section):

Service #1 (3 curated dish recommendations): 4
Service #2 (3 curated restaurants and user ratings/reviews): 2
Service #3 (featured gluten free restaurant review): 3
Service #4 (all-inclusive gluten-free concierge): 3

Interesting! Remember my hypothesis that #2 actually would come out first?

Also, very interesting — my email was opened in California (obviously), Oregon, Colorado, Hawaii, New York, Massachusetts, New Jersey, Italy, and Thailand. Clearly people were STILL on vacation… which convinced me that a follow-up reminder email was necessary.

2. Monday, September 9 – Reminder email with promise of free guides on Thursday

Email open rate: 55%
Email click rate: 21.2%

Out of those who opened this email, 39% clicked on a link in the email. OK, so not 64% like last time, but the links in this email were solely dedicated to voting for the four service options, no more MVP screenshot thumbnails to click. So this is still pretty good!

Again, this is also the email with the 3 key marketing email iterations that I mentioned above.

Let’s look at the “votes” or unique “clicks” for this email.

Note: I’m not discounting multiple clicks made by the same user (e.g., if user John clicked service #1 in the September 4 email and then clicked service #1 again on September 9) — my sample size is a bit too small, unfortunately. As it turned out, this wasn’t a problem: different people were clicking each time.

Service #1 (3 curated dish recommendations): 3
Service #2 (3 curated restaurants and user ratings/reviews): 1
Service #3 (featured gluten free restaurant review): 4
Service #4 (all-inclusive gluten-free concierge): 2

Interesting change of dynamics.

Let’s do a cumulative tally now for both the September 4 and 9 emails:

Service #1 (3 curated dish recommendations): 7
Service #2 (3 curated restaurants and user ratings/reviews): 3
Service #3 (featured gluten free restaurant review): 7
Service #4 (all-inclusive gluten-free concierge): 5

VERY interesting! Service #2, my hypothesized winner, is actually ending up as the LEAST-voted-upon service. It goes to show that users don’t really know what they want — my hypothesis was drawn straight from users’ own mouths. Seriously.

I was also very surprised that the gluten-free concierge came out in third place. That was actually one of Jason’s suggestions to me — he said, how about you suggest something service-oriented, like Postmates, Uber, or TaskRabbit? It seems there is demand for such a service.

3. Thursday, September 12 – Last call to vote and released free guides

Email open rate: 61%
Email click rate: 39%

Out of the people who opened this email, 65% clicked on a link. This is the highest clicks-per-unique-opens rate so far, probably due to my email subject line — “Download your free gluten-free restaurant guides” — which, of course, I had primed my users for in the previous Monday, September 9 email.

This was the email where I released the four free gluten-free restaurant guides. Of course, users are going to jump on that, and the click rate reflects that.

This was also a final call for votes, so let’s look at the “votes” or unique “clicks” for this email and see if this might’ve made any difference.

Service #1 (3 curated dish recommendations): 3
Service #2 (3 curated restaurants and user ratings/reviews): 0
Service #3 (featured gluten free restaurant review): 1
Service #4 (all-inclusive gluten-free concierge): 0

Wow, #2 and #4 not feeling the love here. Service #1 seems to be a recurring crowd favorite, huh? Again, very, very interesting stuff.

Let’s do a cumulative tally now for the September 4, 9 and 12 emails:

Service #1 (3 curated dish recommendations): 10
Service #2 (3 curated restaurants and user ratings/reviews): 3
Service #3 (featured gluten free restaurant review): 8
Service #4 (all-inclusive gluten-free concierge): 5

Drumroll, please! The resounding winner is Service #1: 3 curated dish recommendations.
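For what it’s worth, the cumulative tally above is just the sum of the per-email unique-click counts — a minimal sketch using the numbers from this post:

```python
from collections import Counter

# Unique clicks ("votes") per service in each of the three emails,
# keyed by service number.
votes_by_email = [
    {1: 4, 2: 2, 3: 3, 4: 3},  # September 4
    {1: 3, 2: 1, 3: 4, 4: 2},  # September 9
    {1: 3, 2: 0, 3: 1, 4: 0},  # September 12
]

tally = Counter()
for email_votes in votes_by_email:
    tally.update(email_votes)

print(dict(tally))  # {1: 10, 2: 3, 3: 8, 4: 5}

winner, count = tally.most_common(1)[0]
print(winner, count)  # Service #1 with 10 votes
```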

Mailchimp open/click rates
Let me just say these are absolutely stellar open and click rates; I’m very happy with them. I was actually naively disappointed at first that they weren’t higher — at least in the 90%+ range — but that’s near-impossible unless your email list is five people and everyone on it is a family member. I kid. As long as it’s above 50-55%, I’m happy.

To give you an idea of standard benchmarks, Mailchimp says that the industry average open rate is 17.5% and industry average click rate is 2.9% — granted, my email list is probably a drop in the ocean compared to a lot of bigger businesses out there, but I’ll take it 🙂

So, now that you know the Mailchimp tallies — do you think the Gumroad clicks will reflect that?

We’ll see.

Gumroad is the micropayments solution I used on the preorders website to make the $1 preorder as frictionless as possible. Gumroad also tracks clicks on the order buttons, so I can see what users clicked on even if they didn’t follow through to complete a transaction.

Let’s take a look at the clicks — this is at the end of the email experiment, as of today September 17:

Service #1 (3 curated dish recommendations): 2
Service #2 (3 curated restaurants and user ratings/reviews): 0
Service #3 (featured gluten free restaurant review): 1
Service #4 (all-inclusive gluten-free concierge): 0

Note: there were zero completed transactions. Very disappointing, and it disproves my hypothesis that one or two people would actually follow through and pay $1.

Interestingly, the Gumroad stats follow the Mailchimp rankings — Service #1 was 1st place on Mailchimp (and also 1st on Gumroad), while Service #3 was 2nd place on Mailchimp (and also 2nd on Gumroad).

No one paid me through PayPal. Haha. No surprise here, actually…

There was one user who requested I put PayPal on the preorders website because she was sick in bed and unable to retrieve her credit card from her purse. So I did, and I emailed her to tell her so — but unfortunately that was 4-5 days after her email… and she didn’t respond and, more importantly, didn’t end up paying.

I also think another main factor why people were turned off on the preorders website was the ugly PayPal “BUY NOW” button. Oh well.


I’m laughing as I write this, but understandably, a LOT of users were very surprised when, after the email jump, they got to the preorders website and learned that they actually had to place a $1 preorder to vote.

Some emailed me to let me know exactly how they felt. And I’m happy they did.



That’s one of my lessons here… be careful about if, when, and how you ask for money 😉 More on this later.


More users please
If I could do this over again… I wish I’d had more than 35 users on my mailing list to test these things. I actually had many more signups on Launchrock, but those were from friends, random people from HN, etc., who aren’t my target users and would mess up the design of the experiment (sorry, guys!).

And to be quite honest, I’m not sure exactly why, but none of my Launchrock users (except a couple) have opened my emails so far. Strange… if I remove those “inactive” users, my mailing list dwindles to about 25 or so.

That’s OK, though, since my target users are a very niche group, and these were all acquired for free through Meetup (no need for me to throw money at Google AdWords, Facebook ads, etc.).

I’ve acquired 5+ more beta testers since then, but I stuck with this email group for the entire duration of the experiment — it wouldn’t make sense to introduce “new” people into the email list just yet.

Personal connection
One of the things that helped my open and click rates, I think, is that I personally established a relationship with each and every one of these beta testers (primarily through Meetup). What I mean is that I’ve personally corresponded with each of them via email at least once (with some, several or even many times), or via phone, or even in person. This probably reminds you of Paul Graham’s essay Do Things That Don’t Scale — this obviously would not work so well if I had 500 or even 1,000 users.

I think the fact that they “knew” the person behind Cusoy — me, Melissa — and had established an initial connection greatly helped their “investment” in my email experiment MVP: their open rates, click rates, and voluntary emails to me with feedback (good or bad!). At the same time, they’re personally invested in Cusoy’s success because it would help solve their problem (or so I think — maybe they’re just trying to be nice and helpful to me too? Haha) and agreed to be beta testers. That doesn’t hurt.

So I think that makes up for the fact that my mailing list of beta testers is fewer than 35 people; their involvement is much greater than it would have been had my mailing list been 1,000+ or so.


So next week, still building MVP v3! Chugging along. Not much new there. I want to launch within 1.5-2 weeks.

Also — am I really going to launch a service? I got actionable data but no $1 preorders, as I said.

Hm, I’m not completely sure. But I think I might just launch Service #1 (the one that got the most votes), free for beta users for three weeks. This is definitely Paul Graham’s “do things that don’t scale” in action. Haha.

OK, lessons learned from this email experiment? Oh, where to begin…

  1. Carefully and thoroughly design your email experiment. I made sure to carefully separate the interest factor from the intent to buy, as outlined in this blog post. If users had known from the initial email that they had to place a $1 preorder to vote, I would not have gotten as many click-throughs as I did.
  2. People don’t like to pay for something unproven. The conversation immediately becomes uncomfortable once you ask for money, even when it’s just $1 with what is basically a money-back guarantee 🙂 I tried to make every single thing as frictionless as possible for the user, with a free PDF, refunds, etc. I had one interested user and a couple of clicks, but at the end of the day, no money in the bank. Not even a dollar.
  3. Email marketing and sales copy are very important. You have to put on your marketer’s hat and your user’s hat at the same time. I’m talking about the magic arts of crafting subject lines, creating urgency with limited offers, call-to-action buttons, etc. — they’re all very important for persuading users to do what you want them to do. Think to yourself: what are some email subject lines that persuaded you to open those emails over others? And once you were reading an email, what persuaded you to click on something? Think about it. It’s pretty interesting.
  4. People can surprise you sometimes. My hypothesis that Service #2 would get the most votes was shot down — it actually got the least votes, even though it was exactly what users told me they wanted! Again, that’s why you go by what users actually do, not just by what they say.

P.S. I’d love to meet you on Twitter here.

And if you enjoyed this post, please consider sharing it on FB or Twitter.
