Entrepreneurship programs face an ongoing challenge: selecting the best candidates as beneficiaries, while also producing results that demonstrate a return on the investment made in the individuals they support.
For the past 13 years, the Allan Gray Orbis Foundation has been running fellowship and scholarship programs for high-potential entrepreneurial youth in high school and university, with the aim of supporting their long-term emergence as high-impact entrepreneurs.
Over time, the Foundation has needed to constantly reflect on and refine the selection processes for these programs, to ensure that the candidates who make it into the Allan Gray Orbis Fellowship are the best fit for the program, and that the Foundation can continue to add value to their lifelong development path.
The Foundation defines high-growth, responsible entrepreneurs as “conductors”: individuals who not only start a business, but also help it grow and ultimately take charge of its trajectory. This definition helped the Foundation understand that the qualities it was really looking for in candidates were similar to those possessed by successful CEOs.
On 22 November 2018, the Foundation hosted its second Ecosystem Forum, at which it shared findings and lessons from refining the selection process for the fellowship. Some key lessons are highlighted below:
1. Have clear criteria and a strong theoretical and scientific rationale for your success predictor variables
Allan Gray Orbis Foundation Pillars (source: Allan Gray Orbis Foundation)
The Foundation selects potential high-impact entrepreneurs against five pillars, as pictured above. In addition to assessing whether candidates have demonstrated each of these qualities through past experience, the Foundation monitors their ongoing demonstration once candidates are selected. However, one of the lessons learnt through the evaluation process was the need for assessment models that do not bias some candidates over others. For instance, candidates who had less exposure due to their context (background, schooling experiences, etc.) could easily be at a disadvantage. To counter this, the Foundation introduced a Behavioural Anchor Rating Scale in 2007, so that regardless of other variables in the South African context, assessment tools could assess for potential rather than current competence or performance.
2. Always ask the right questions
In kicking off the validation study, the Foundation was confronted with a range of questions it had to be clear on. These centered mostly on the tools, processes, and predictors of success used as part of the selection and retention process for scholars and fellows.
Carl Herman, Assessment and Development Manager, emphasized this point at the forum: was the Foundation measuring what it intended to measure? He also stressed that organizations must exercise caution and diligence when selecting assessment tools for their programs, especially from vendors: critically assess the theory behind an assessment tool and request its technical manual before falling for the sales pitch. Before adopting an assessment tool, organizations must vet it properly for proven success and fit with the context in which it will be used, so that it does not unfairly disadvantage candidates from different cultural groups.
3. Data matters, always
Recording, storing, and collecting data is one of the biggest Achilles' heels in development work. Data is often not collected consistently or stored centrally, which makes it difficult to mine and analyse systematically for data-informed decisions. One of the biggest benefits of conducting a validation study is that the Foundation can not only use the data internally to improve decision making, but also begin to share it, as at the Forum, with external players in the space to drive collective improvement in pursuing and assessing impact.