
The Validation Board: a “Lean Startup” strategy worth testing

A few months ago, a colleague recommended I pick up a copy of the popular book The Lean Startup by Eric Ries. The book challenges “startup founders to build and run their companies in a new way, maximizing customer value while minimizing wasted effort.” Having run my own business for the past decade, I immediately connected with the author’s point that organizations in any sector can achieve better results while investing less time, effort and money. Thinking back over my career, I noticed for the first time that many of the mistakes I had made again and again would have been entirely preventable had I been using these Lean principles.

As I hopped on Google and began investigating this philosophy in more depth, I stumbled upon a website called “Lean Impact – Lean Startup for Social Good.” Since I run a for-profit business that serves the nonprofit sector, I was intrigued by the site’s description of how this work-smart approach could benefit the social sector in the following ways:

  • Figuring out what creates real impact and discarding what doesn’t
  • Operating with fewer wasted resources
  • Leveraging forward-thinking technologies to achieve our goals
  • Gathering continual feedback from our community
  • Creating a culture that sees failure as learning that brings us closer to a solution
  • Eliminating mission creep influenced by funding

What an amazing list of benefits a Lean strategy could offer! When I discovered the Lean Startup Conference was coming up in San Francisco, I excitedly booked my ticket and headed there ready to learn more. Having just returned, I’d like to share my biggest takeaway from the event, as it helped me view effective social change in a whole new way. I’d like to introduce you to the Validation Board.

The Lean philosophy is all about failing fast by testing your ideas early, before you invest a lot of effort in them. We’ve all been there… We have a “great” idea, but after months (and sometimes years) of developing it, it bombs. What happened? Most likely we had our target market wrong, or we were trying to solve a problem that didn’t really matter to that market in the first place. Our idea may very well have been amazing, but unless our target clients want it, it’s not that viable an idea after all.

Just how does the Validation Board work? Here is a quick seven-step overview. For a more detailed explanation, please watch the video below the seven steps.

1. Identify your “Customer Hypothesis”: Who is your organization’s specific target client? Although we like the idea of helping anyone who needs help, the truth is your organization will be most effective when it delivers customized solutions to a specific market.

  • Nonprofit example: women aged 25-45 from the Washington, DC, area who have been victims of domestic abuse.

2. Identify your “Problem Hypothesis”: What problem do you think they have and are actively searching to solve? In my work as a coach, I know how easy it is to see a glaring problem I can help a client solve; however, if my target market doesn’t personally identify with that same challenge, my solution to their problem won’t “sell” no matter what.

  • Nonprofit example: In an effort to serve these women, you hypothesize that they need a shelter in order to escape the abuse.

3. Skip the “Solution Hypothesis” for now: The first time you work through the Validation Board, skip this section, because we need some data about the problems our target market is currently facing around a specific issue before we come up with any potential solutions. Once we have some feedback from our market, we can then begin to test the viability of potential solutions.

4. Identify your “Riskiest Assumption”: What are you assuming about what your target clients want? In addition, what data do you need to support your assumption?

  • Nonprofit example: You assume this target market would be willing to come to a shelter, but you won’t know whether these women would actually use one until you have data to prove it. For argument’s sake, they may want to escape the abuse but might not feel safe in a shelter. Before you invest in building an expensive shelter, it is important to find out what kinds of escape resources they would want instead of just assuming you know what they need.

5. Method: How are you going to test your idea? How are you going to find out if your target market identifies with the problem you’re looking to help them solve? Are you going to ask them in person, do a survey, etc.?

6. Minimum Success Criterion: How will you know when you have enough data to validate your idea?

  • Nonprofit example: Before deciding on any particular solution to help these women escape, you may decide you need 18-20 women from your target market to say they would use a shelter. If you don’t get the numbers you need, you go back and create a new problem hypothesis. For example, maybe the women would feel more comfortable coming to a kid-friendly, established church building.

7. Pivot as necessary: As you get more feedback from your target market, adjust your approach. Perhaps the problem you set out to solve isn’t the one they really connect with, but there is another problem they’d like your help solving. Keep tweaking your concept until you have enough information to validate that your idea is viable both for you and for them. Use as many columns on the Validation Board as necessary.

Here is the video resource that explains the Validation Board in more detail: http://www.youtube.com/watch?v=HhoducyStMw

I admit that after spending two days talking about Lean Impact at the conference and then spending hours reading through the different resources on this topic, it is very easy to slip into a mindset where we feel we are handling our target market’s problems with way too much head and not enough heart. Data, numbers, hypotheses…these can seem far removed from the heart of the mission. However, I want to offer you the questions below.

What if being Lean allows you to impact more lives in a more meaningful way, giving your donors a higher ROI on their dollars, thus encouraging them to increase the frequency and amount of their gifts to your organization? What if being Lean makes your nonprofit leadership job easier and more satisfying? Wouldn’t that be worth at least looking into?

More and more, the organizations that change the world will be those that become experts at listening to their target clients’ real needs and then developing customized, innovative solutions to meet them. They will arrive at those solutions through thorough testing, attention to the real data from the people they exist to serve, and pivoting accordingly.

I’d love to hear from you…What do you think of the Lean philosophy as it applies to the nonprofit sector? Please leave a comment.

See also:

Little Bets: How Breakthrough Ideas Emerge from Small Discoveries

Leap of Reason: Managing to Outcomes in an Era of Scarcity

The End of Fundraising: Raise More Money by Selling Your Impact



Big data is not required for big insights

You’ve probably heard a lot about Big Data. Big Data is going to change the world. Big Data is going to change how organizations are run. Big Data is going to clean our garage and walk our dog.

Big Data vs. Small/Medium Data

And maybe Big Data will do all that, at least for big organizations. If you’re Coke, Fermilab, or the National Security Agency, your products or services or spying naturally produce a lot of data. Tapping into and harvesting massive streams of continuously created data, which is the hallmark of Big Data, is a natural thing to do.

But for many of us who work at small and medium organizations, Big Data is an abstraction at best. We simply don’t have massive, ongoing data streams that we can dive into to learn about our markets, our products or services, our clients, or our organization. We’re not big enough to have Big Data. But that doesn’t mean we can’t learn from the principles behind this phenomenon and use them to our advantage.

The hype around Big Data is about the data itself: massive, previously unattainable and unimaginable rivers of data pouring through your world. But the philosophy behind Big Data is actually more important. It’s about looking around to identify where those data flows are in your own environment and then tapping into them to gain insight. You don’t need Big Data to do that. It works just as well with Medium Data or Small Data, especially if you’re a medium or small organization. We too can tap into and harvest data; it just flows in smaller quantities at our scale.

Three sources from which to harvest data

So how can we start this harvesting? What can we collect? There are three main sources to consider, though we’ll concentrate mostly on the third one.

First, you can harvest data that already exists outside your organization and is updated regularly. For example, there are lots of federal surveys and data collection efforts out there, and they’re very cost-effective to retrieve if you know about them. The right ones can help you understand your environment.

Second, you can create data via ongoing special efforts, such as conducting a regular survey or instituting a special data collection effort that is not part of your daily operations. This is a bit of a different concept from harvesting data, but still falls within the realm of a streaming source of data you can use for analysis.

But the third concept is the core of where Small Data can help you: implementing a system to collect and harvest, on an ongoing basis, the data we produce in our daily operations. Or, more precisely, the data we do or could easily produce in our daily operations.

Focus on the third

Thinking about that third concept, we all have opportunities to gather data on a daily basis. Most likely, we already do to some extent, even if it’s as simple as our client names or time sheets. So we’re already in the habit of creating data. But how are we using that data? As examples, I’m always surprised by the number of organizations that record their clients’ ZIP codes but then never use that data to examine their clients’ demographic and geographic makeup. I’m also surprised by the number of nonprofits that don’t do research on their donor databases to identify their demographic sweet spots. These data are often collected but not often analyzed and leveraged to their full extent.

Beyond harvesting data that already exists, is there other data we can efficiently build into our routines that adds value, whether in understanding our clients, serving our clients, or improving our internal operations and efficiency? My company, for example, began tracking the origins of our consulting engagements a few years ago, and it has been very effective in identifying both our effective and our ineffective marketing channels. Our minor investment in that effort has paid for itself many times over.
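To make that kind of harvesting concrete, here is a minimal sketch of tallying operational data you probably already collect. It assumes a simple spreadsheet export named clients.csv with zip_code and referral_source columns; the file name and column names are hypothetical placeholders for whatever fields your own client or engagement records actually use.

```python
# A minimal "Small Data" tally: count clients by ZIP code and engagements by
# referral source from a (hypothetical) clients.csv export.
import csv
from collections import Counter

zip_counts = Counter()
source_counts = Counter()

with open("clients.csv", newline="") as f:
    for row in csv.DictReader(f):
        zip_counts[row["zip_code"]] += 1            # geographic makeup of clients
        source_counts[row["referral_source"]] += 1  # where engagements come from

print("Top 5 client ZIP codes:")
for zip_code, count in zip_counts.most_common(5):
    print(f"  {zip_code}: {count}")

print("Engagements by referral source:")
for source, count in source_counts.most_common():
    print(f"  {source}: {count}")
```

Even a tally this simple can show which neighborhoods you actually reach and which marketing channels actually produce work, without anything resembling Big Data.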

There is value in data. We all know that. The key, of course, is to manage the process so you’re gathering valuable data efficiently and then actually using it to your benefit. If you think about evaluating a program, a general rule of thumb is that 5 to 15 percent of the budget should be invested in evaluation, depending on the size of the program. If you would make that investment in a program, why not follow the same rule for your organization as a whole? It may pay off handsomely.

See also:

Level Best: How Small and Grassroots Nonprofits Can Tackle Evaluation and Talk Results

Leap of Reason: Managing to Outcomes in an Era of Scarcity



Hire from within for your best evaluation team

One of your funders wants to see hard data, and another set of stakeholders needs touching stories that aren’t just cherry-picked anecdotes. You know you should evaluate your program. In fact, you genuinely want to know what’s working in your program and what needs to be tweaked. But you have a slightly queasy feeling in your stomach because despite your best intentions, you simply don’t have the time or expertise to conduct extensive program evaluation. Here’s the good news: you may already have the resources to do program evaluation right in your office.

Take a minute to list all your key staff, volunteers and hands-on-type board members.

Here’s the next step: See if any of them match any of these descriptions:

Clear thinker: someone who really “gets” the program.

Go-getter: someone who doesn’t mind face-to-face interaction with perfect strangers and who knows how to be just aggressive enough to engage people without turning them off.

Accurate and tolerates tedium: a person who doesn’t mind doing slightly monotonous work and does it accurately.

Sees number patterns: someone who can look at a set of numbers and see patterns. (You know these people when you see them in action.)

Sees comment patterns: this person can look at a set of comments and see what they have in common.

Good writer: a person with writing skills.

Champion: a person who believes in your program and has the ear of your stakeholders.

Here are five other requirements for people who help you with evaluation:

  1. They have to be able to commit some amount of time to their work and maintain that commitment.
  2. They have to be willing to put aside their own opinions about the program.
  3. They have to be teachable.
  4. They have to be able to maintain confidentiality.
  5. They have to understand the limits of their evaluation job so they don’t run amok and make outlandish suggestions to your board just because they’re involved in evaluation.

Now, here’s a simple outline of a soup-to-nuts evaluation process:

Phase One: Planning for Evaluation (a two-step process)

Step 1: Develop a Logic Model (a 1-page description of your program’s activities, intended outcomes and how you think they relate to each other)
Step 2: Develop an Evaluation Plan (a list of the evaluation tasks you’ll do, such as surveys, focus groups, interviews, etc., that will help you understand your progress toward your intended outcomes)

Phase Two: Doing Evaluation (a six-step process)

For each evaluation task you do,

Step 1: Design the survey (for example)
Step 2: Distribute and collect it
Step 3: Enter the data
Step 4: Analyze and synthesize the data (see the sketch after this list)
Step 5: Write up a report
Step 6: Use the findings
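As a rough illustration of Steps 3 and 4, here is a minimal sketch of what tallying survey responses can look like once they have been entered into a spreadsheet. The file name survey_responses.csv and the columns satisfaction and comment are hypothetical placeholders; a real survey would use whatever questions your evaluation plan calls for.

```python
# A minimal sketch of entering and analyzing survey data: tally satisfaction
# ratings and count frequent words in open-ended comments from a
# (hypothetical) survey_responses.csv file.
import csv
from collections import Counter

ratings = Counter()        # counts of each satisfaction rating
comment_words = Counter()  # rough word frequencies across open-ended comments

with open("survey_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        ratings[row["satisfaction"]] += 1
        for word in row["comment"].lower().split():
            if len(word) > 4:  # crude filter to skip short filler words
                comment_words[word] += 1

total = sum(ratings.values())
print(f"Responses collected: {total}")
for rating in sorted(ratings):
    share = ratings[rating] / total if total else 0
    print(f"  Rated {rating}: {ratings[rating]} ({share:.0%})")

print("Most common words in comments:")
for word, count in comment_words.most_common(10):
    print(f"  {word}: {count}")
```

The rating tally gives your number-pattern person something to dig into, and the word counts give your comment-pattern person a starting point for spotting common themes before the report gets written.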

Here are six hints about who can help with each task:

  1. Clear thinker can help you with your logic model.
  2. Go-getter can help you distribute and collect surveys. With training, he/she can also conduct interviews (if he/she is removed enough from the program to be impartial).
  3. Accurate-and-tedium-tolerant member is just the person you need to do data entry.
  4. People who can see number and comment patterns will LOVE helping you with data analysis.
  5. The writer can help you write your report.
  6. And when it’s time to use your data, you’ll need your champion.

When you recruit people with these characteristics, keep in mind that people (staff and volunteers) love doing what they’re good at in service of things they believe in. And most people, especially teachable ones, like learning new things. So, while they’ll be doing you a favor, you’re also giving them a great opportunity.

Here are three caveats.

  1. There’s an additional skill-set which is very specialized: facilitation. If you have a facilitator on your team, lucky you. Put him or her to work running a focus group (if he or she is removed enough from the program to be impartial).
  2. If you have stakeholders who should be involved but don’t have any of the characteristics listed above, it may be a good idea to find an appropriate role for them so they don’t feel left out.
  3. Even with all these great resources, it’s handy to bring in an evaluation expert who can help you anticipate costs and timing and help with designing your evaluation plan and instruments.

Yes, there’s a lot to think about. But you are already far ahead of where you were 10 minutes ago! Even if you’re not ready to jump into evaluation yet, you can be confident you won’t be diving in alone when it’s time to take the plunge.

See also:

Leap of Reason: Managing to Outcomes in an Era of Scarcity

Level Best: How Small and Grassroots Nonprofits Can Tackle Evaluation and Talk Results


