In 2014 I took a job as Director of Product Management at America Achieves, working primarily on their Raise The Bar program. America Achieves is a nonprofit startup accelerator, and Raise The Bar was its program dedicated to aiding parents in supporting their children’s education. It was exciting to get back into advocacy (from 1993 to 1999 I handled a lot of IT and ran the website and email programs at Environmental Defense Fund), especially in support of public education.
As America Achieves’ only technical hire, I had a job a bit like the tech co-founder of a startup–I was Director of Product Management, but I was also effectively the CIO. What made me not exactly like a co-founder was that I came in about a year later. By that point, the founding grants had come in and a lot of money had been spent. I see this in nonprofits in a lot of different ways, but there’s a tendency to spend on tech as if you only have to pay for it once … they get a million dollars for tech in a grant, and they spend the million. But it’s going to cost them 20% of that each year to maintain, and where does the $200K come from? Typically, it doesn’t come from anywhere. We need that million dollars, so we take it when we can get it. America Achieves had done this, but fortunately they had done a very bad job* of it, spending a lot more money than they had to in order to get what they got. What that meant was I could fix it–with the help of Most Media we re-coded some complicated tech and moved it to modern web hosting, saving about $14K/month for an outlay of about $8K (plus my salary, of course, but it only took a month). That’s how bad their early decisions were (had they made reasonable decisions there’s no way we could have saved so much for so little).
[*Note: making bad decisions about tech is not the worst thing a startup can do. Anything you can do to get the business moving forward and growing is probably a good decision. If that means over-spending on tech to get set up fast, so you can then focus on the business instead of the technology, that’s probably overall a good decision, even if your next tech person is going to roll their eyes a little bit. I wasn’t around to know how it all happened in that first year, but they’re still going strong so they’re doing something right.]
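For the curious, the savings math above pencils out rather dramatically. The figures ($14K/month in hosting savings, an $8K one-time outlay) come straight from the story; the variable names are my own, and, as in the aside above, I’m ignoring salary:

```python
# Back-of-the-envelope payback math for the re-hosting project.
# Figures are from the story above; names are illustrative only.

monthly_savings = 14_000   # dollars/month saved on hosting after the move
one_time_cost = 8_000      # re-coding and migration outlay

# How long until the one-time cost pays for itself, and what the
# first year looks like on net.
payback_months = one_time_cost / monthly_savings
first_year_net = monthly_savings * 12 - one_time_cost

print(f"Payback in about {payback_months:.1f} months")  # ~0.6 months
print(f"First-year net savings: ${first_year_net:,}")   # $160,000
```

A project that pays for itself in under a month is almost unheard of, which is the point of the aside: only truly bad early spending leaves that much on the table.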
Anyway, this is not really the story I’m trying to tell here, but it’s important, because getting the custom software onto much cheaper infrastructure was really important to what happened next: America Achieves spun off Raise The Bar, essentially selling it to Learning Heroes, a separate org with very similar goals but very different approaches. Raise The Bar’s approach was to create quizzes that would help parents know how their kids were doing relative to grade level. If you ask parents how well their kids are doing in school, about 80% will say their kids are doing better than average. Of course, that’s impossible; and in fact more than half are actually behind grade level. So they’re not getting good information from their schools about how their kids are doing, unfortunately. The quizzes were designed to match grade-level standards, so a kid that did well on them was probably at grade level or better. A kid that did poorly could probably use additional help.
The problem here was high barriers to entry. Parents who used the quizzes found them useful and helpful, but they took almost an hour each (math and reading), so very few parents could sit down with their kids and get it done completely. Especially the working-class parents who probably needed the most help. So Raise The Bar didn’t always reach parents directly; we mainly went through teachers, who would then advise parents to use the tests, or even have kids do them during class time. Because of America Achieves’s other programs, we had an email list of a few thousand teachers, and communicated with them by email, while running social media pages for parents who’d gotten interested. It was good community support–we had great engagement–but not really a fast or great way to grow that community. The truth is, the idea was poorly conceived even if the execution was excellent. Anyone with significant web experience could have told them they wouldn’t get a large audience via math and reading quizzes that take an hour to complete. Maybe anyone without significant web experience could have told them that, too.
Learning Heroes, on the other hand, aimed at much smaller goals but a wider audience. Their goal was to reach parents at a very basic level, meeting them where they were, so to speak, and leveraging existing tools online like Khan Academy and other great providers (see Learning Tools for more on this). The basic idea was this:
1. Contract with user research firms to get an understanding of what kinds of issues, and what kind of language around those issues, resonate with parents. What subjects are an entry point to this conversation, and how do we talk about these subjects in ways parents care about and understand?
2. Get that content out.
3. Encourage parents to get more involved in their kids’ schools and schoolwork in ways that are helpful.
Number 2, of course, is the trick. Everywhere else I’d been we had really strong membership programs, including some basic level of free or in-kind membership, like a subscription to an “insider’s” newsletter, for example, or free programs for young professionals. Or participation in the online advocacy program. Only at Environmental Defense Fund (EDF) did we call it “free membership” … Everywhere else we just called it email marketing. There were all the members, and they were probably signed up for email, and then all the other people who weren’t members who were also signed up. At EDF, we launched it all with the very first growth of the Internet–when we started an email list and a website (with email signup forms), our constituency thought it was fantastic, because of the potential for saving paper. We had 75K people on the email list in a year, in 1996. And we sent them a weekly email brief with links to about seven stories, almost all of them from 25 years of newsletters, scientific reports, and economic and political analyses that we’d digitized in order to launch the website with lots of content. But we also kept the newsletter going, writing about a dozen new stories each month, and we sent that out by email as well.
At The Metropolitan Museum of Art, we had an email list of about 350K, more or less equal to the number of members, but with only about 40% overlap, if I recall. A lot of the Met’s members had been members for a long time, so we didn’t have their email addresses, or maybe they didn’t really use email. Also, on average they cared less about saving paper–in 2006-2012 most of them still wanted those glossy catalogs the Museum put out–so there wasn’t that big push to switch over. But we also had lots and lots of visitors, millions every year, many of whom wanted to stay in touch and on top of what was happening at The Met, which put on 30-40 special exhibitions every year. And then, starting in around 2008, there was social media. In a matter of a few years, my team grew the Met’s following on Facebook and Twitter to over 2 million, largely by posting lots and lots of beautiful pictures of artworks–following our feeds was fun and educational even if you never visited the Museum.
At both these places, as well as the New York Public Library and Brooklyn Museum (the other places I’d mainly worked before going to Learning Heroes), there was a large digital audience, grown mainly because those places had been around for a long time and had existing audiences that were converting over to email-, web-, and social media-based communications. What would we do at a startup that had no audience?
Well, I’ve spent a long time getting to the punchline, and it may be a little obvious, but not everything about how well it worked is obvious. We advertised. Primarily on Google, also on Facebook. The reasons it worked so well were more or less these:
- Our funders–places like Bloomberg and Gates Foundation, and others that cared about supporting public education–wanted to know how many people we reached, and how many minds we changed or actions we inspired. (This was the era of parent skepticism around Common Core, so trust in public education was low, even among traditional supporters). Basically, they wanted to know that their spend was paying off.
- Our audience was basically every parent of a child roughly 4-18 years of age. So, a lot of people. And basically none of them knew about us. We could hope to build an email and social media audience, but it would be a long time before it was any significant fraction of the number of people we should reach.
- Google and Facebook and lots of other ways to advertise online offer really good targeting. For example, on Google we could buy search terms that parents were likely to use when trying to get educational help for their kids, and directly target people most likely to benefit from our help. On Facebook we could target based on geographic region, whether they were parents and pretty well on the age of their kids, and based on interests. So if they were in any education-related groups or followed a school, we could expect them to be a better target.
What this all meant is that we could reliably predict a cost-per-click of 7 cents. In other words, for every dollar spent, we’d get about 14 people clicking through to our content. Do you see how important that is? If a funder said they wanted to reach 100K people, we could consistently expect to succeed with an advertising investment of about $7K. We didn’t necessarily tell them that number precisely; the grants would typically cover a range of activities, including the user research (see item #1 in the short list above), which was typically much more of an investment. So we could request $30K for “marketing,” promising to reach those 100K people, then use most of it to develop landing pages and other web features, and spend the remaining $7K to get the 100K pairs of eyeballs.
If you’ve been reading carefully you might notice I kind of left out the “minds we changed” part of the first bullet point, above. A click isn’t a mind changed or an action inspired–far from it, probably. But that kind of goal is very hard to measure, and funders rarely require it. Rather, we would emphasize the value of the user research, and the skill of the firms we employed, and the funders would at least know that we were changing about as many minds as we could reasonably expect.
I worked as a consultant for Learning Heroes, so after rebuilding the website with simple, solid tech that staff could manage on their own, and developing these marketing approaches, my job was done and I moved on. I don’t know how they’re doing now, except to say that they’re still going–a small, 6-person startup–and that’s probably not true of many of 2015’s nonprofit advocacy startups. And I bet they’ve grown their email and social media followings now, too.