Understanding the metrics incubators or accelerators need to track


Like startups, incubators and accelerators are hard work, but they aren’t startups — they are educational programs (and sometimes facilities) that use something resembling an apprenticeship model to move a (normally software) business through its life cycle at an accelerated pace. Incubators and accelerators are likely to offer resources like a place to work and/or money as well, but the value is in the educational program, which should drive the metrics that are important to these organizations.

Success isn’t easy to agree upon with these programs, and if you look at the math it isn’t a sure bet they are going to make money. Measuring success in an education model isn’t easy either, and what is essentially a marriage of a venture capital model with education does not make it easier. If you are running one of the dozens of newly minted programs, or are one of the thousands of hopefuls trying to get in, here are some things to keep in mind.

Baseline metrics

In my world there are only two styles of incubator/accelerator: TechStars and Y Combinator, which TechCrunch has tried to compare with numbers:

  • Total companies
  • Total raised by companies
  • Total rounds raised by companies
  • Money raised after seed funding (money raised after the incubator process)

What TechStars tracks (and shares openly) beyond the above list:

  • Number of employees
  • Status: active/failed/acquired

With regards to education or apprenticeship success, the key metrics are simply around completion of the program. An easy-to-find data set from the United Kingdom demonstrates what is important to a government:
  • For education: “Success rates are based on the individual aims that were expected to end in the academic year.  They are calculated as the number of learning aims achieved divided by the number started, excluding the aims of any learners that transferred onto another qualification within the same institution.”
  • For apprenticeships: “success rates are based on the number of learners who meet all of the requirements of their apprenticeship framework, divided by the number of learners who have left training or successfully completed their training in the academic year.”
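Both UK definitions above reduce to simple ratios. A minimal sketch of each calculation in Python (function and variable names are my own, not from any official dataset):

```python
def education_success_rate(aims_achieved, aims_started, aims_transferred):
    # UK definition: learning aims achieved divided by aims started,
    # excluding learners who transferred to another qualification
    # within the same institution.
    return aims_achieved / (aims_started - aims_transferred)

def apprenticeship_success_rate(completed_framework, left_or_completed):
    # UK definition: learners who met all framework requirements,
    # divided by all learners who left or completed training that year.
    return completed_framework / left_or_completed

print(education_success_rate(820, 1050, 50))      # 0.82
print(apprenticeship_success_rate(90, 120))       # 0.75
```

An incubator could adopt the same shape: cohort completions divided by cohort starts, excluding teams that moved to a later cohort.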

Why do the numbers matter?

Paul Graham is open about YC’s numbers but he struggles with what success means and acknowledges it takes a long time to see the real success of YC.

Incubators and accelerators can and should make money; Betaworks demonstrated that by paying back its investors. Using the 316 companies that have been through YC, and assuming that ‘rarely over $20,000 in investment’ translates to:

  • An average of around $18,000 each, or about $5.7 million invested by YC in total (they may have invested more, as they did follow-on rounds when funding was scarce)
  • The average value of 210 of them is estimated at $22.4 million each
  • $627 million has been raised (according to TechCrunch)
  • $4.7 billion in total value of companies
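The back-of-the-envelope math above checks out. A quick sketch using the post’s estimates (these are assumed figures, not official YC numbers):

```python
companies = 316
avg_investment = 18_000            # assumed average, from "rarely over $20,000"
total_invested = companies * avg_investment
print(total_invested)              # 5688000 -> roughly the $5.7M cited

valued_companies = 210
avg_value = 22.4e6                 # estimated average value per company
total_value = valued_companies * avg_value
print(total_value)                 # 4704000000.0 -> the ~$4.7B total value
```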

In contrast, TechStars has 114 companies at:

  • $18,000 each (plus now a $100k note option)
  • $132.3 million has been raised by TechStars alumni.

Assuming that raised money accounts for 20-25% of shares in the companies on average, you can guess that the value of the companies is not on par with YC’s, but it is still impressive for only 114 companies.
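One way to read that assumption: if the $132.3 million raised bought investors 20-25% on average, the implied combined value of the TechStars portfolio follows directly. A rough sketch of that reasoning (estimates only, not actual data):

```python
raised = 132.3e6
for stake in (0.25, 0.20):
    # If investors hold `stake` of the companies for `raised` dollars,
    # the implied total value is raised / stake.
    implied_value = raised / stake
    print(f"{stake:.0%} stake -> ${implied_value / 1e6:.1f}M implied total value")
# ~$529M at 25%, ~$662M at 20% -- well below YC's ~$4.7B
```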

I will throw a third into the mix: MassChallenge, which is clearly about education through apprenticeship. Their heavy focus on the programming side of things shows in what they talk about:

  • Over 1,000 applications
  • 111 startups, with $103 million in funding raised within 12 months of completing the program
  • 805 employees now at those 111 companies
  • No equity taken

To me these numbers demonstrate that MassChallenge’s success is driving as many companies as they can through their program. The quality of the program is measured by the funding companies raise afterwards as well as the jobs created. TechStars measures that too.

A lot of these numbers are used to inform entrepreneurs about the nature of the program, enticing them to be a part of it. Other metrics commonly shared:

  • Number of mentors available
  • Value of extra services like legal, accounting, software, etc

These are likely more vanity metrics than anything else.

Developing metrics for your incubator or accelerator to drive programming

When developing the programming you should design it around some basic questions:

  • What is a success for a company or founder that enters the program?
  • What is the time frame you expect them to realize that success?
  • What are the activities you can measure now (e.g. event attendance, company funding, jobs created, IP created)?
  • What should you measure later?
  • Does volume of applicants, participants, funded companies, etc matter?

Then you need to build some instruments to measure, monitor, and adjust. Everything you do, from speakers to mentors to other events, should move the numbers one way or another for every cohort run through the program. There are no bad metrics to start with; just start.

Building a funnel to help interpret the data

The higher the number of people at the top of the process, the more likely you are to have great things coming out the other end, and the more success you have at the bottom of the funnel, the more it fills a growing pool of resources to feed back into the process. For an incubator/accelerator, this starts at the application level.

The basic funnel is illustrated above (yes, it looks like Skok’s customer acquisition funnel). In each step there are many actions that influence the outcome down the chain, but to start:

  1. Applications – this is your top of funnel where you collect information on people and make some decisions.
  2. Interviews – at this point the teams get to meet you, the assessment should be working both ways.
  3. Funded – companies that are accepted into the program.
  4. Successful – those that meet your definition of success, at a baseline that should be all those that complete the program.
  5. Alumni – this is the growing base on the other end of the funnel that you should keep engaged.
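A minimal way to instrument that funnel, with the stage names following the list above (the counts are invented for illustration):

```python
# Stage counts for one cohort, ordered top to bottom of the funnel.
funnel = [
    ("applications", 1000),
    ("interviews", 120),
    ("funded", 12),
    ("successful", 9),
    ("alumni", 9),
]

# Conversion rate from each stage to the next; these are the numbers
# your speakers, mentors, and events should move cohort over cohort.
for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    print(f"{stage} -> {next_stage}: {next_count / count:.1%}")
```

Tracking the same ratios per cohort makes it obvious which stage a program change actually affected.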

Parts of this funnel are interconnected, and if you were to draw it out for a specific program it would get more complicated. The points below speak to how you use your alumni base and apply different definitions of success.

The additional definitions of success are more realistic. You will have companies that complete the program but fail to raise money or find revenue. However, you would still put that group into the alumni pool as they can help drive bigger success throughout the funnel.

The basic model is a science but building success is an art

If you look at the numbers of TechStars companies by location, you see the results on post-TechStars funding vary a lot by cohort and location, yet the model is ‘open’ and known. If running these programs were a science, I would expect more consistent results.

The success metric for a VC-focused incubator is profit, but they need a secondary metric that is equally important: 500 Startups aims to build a tight community of 500 startups, TechStars has its alumni network that doesn’t reject stagnant or failed companies, and YC has a very tight-knit alumni group as well. All the other metrics that indicate the health of the funnel are, I believe, what speak to the type of experience the entrepreneur is going to have being a part of it.

Entrepreneurs should be asking incubators why they are doing what they are doing, and be just as selective as the people on the other side of the table.
