Insights from a Higher Education Market Researcher
So Kirk, what an intriguing title — more than clickbait, right?
Well, oftentimes you’d be surprised at what data reveals! Case in point: There was a Canadian university that wanted to look at its international tuition fees.
For many years, the fees it could charge were highly regulated, until one day, the government decided to deregulate some international fees and let the schools keep any additional revenues they generated. A sort of “eat what you kill” philosophy tied to lower government grants.
While the school was certain that it could raise fees, it had no idea what a more appropriate level might be. They certainly weren’t going to try the usual “look left, look right, and put yourself in the middle” strategy for setting fees.
Through our research, we determined that international students perceived our client to be woefully underpriced. We were able to show the client that they could raise fees by more than 230% and suffer no loss of applicants.
They were (understandably) shocked; they knew their programs were underpriced but had no idea to what extent.
What was actually going on? Well, many students saw the programs’ regulated, low costs as an indicator that the university’s programs were low quality, especially compared to other international schools that charged more. At the former level of regulated fees, students and their families assumed that the school couldn’t be generating enough revenue to deliver a quality program. So they looked elsewhere.
The new fees reflected the quality that more students were looking for. Surprisingly, the new higher fees attracted more applicants — in some programs as much as tripling the number of applications.
Do you have other examples of data-driven decisions?
On the flip side, another well-established university suffered from the public perception that it was not in any way a leader in higher education. A competitive analysis study done with both existing and potential prospects revealed that public perception, at least among prospective students, had changed massively over time. The school’s reputation was far stronger than their parents’ generation assumed. The competitive studies have been repeated since, and recently revealed that in some key program areas, the university that was once thought to be a lower-tier, third-choice safety school had pulled even with some of the top universities in the country.
As one of the executives said to us after the presentation of the results, “I’ve been working towards this over my 20-year career at this university. Today I see that we have achieved what we set out to do 20 years ago.” That’s pretty powerful feedback.
From the data-driven perspective, what are some of the big missed opportunities for colleges and universities today?
Well, there are a few big ones. The first one is clarity.
There’s so much going on in a typical college or university that it’s sometimes difficult to know which way is forward.
While the private sector has a hierarchical and often direct decision-making structure, academia uses a more horizontal structure where multiple actors have input and several different perspectives vie to be heard. Institutions can also exist in a comfortable rut of, “This is how we do things, this is what we know to be true, no need to look into that, we know it works fine.”
The result can be a kind of fog of indecision that either drags on or fades away into the shadows as a new issue or urgent matter takes the forefront.
We’ve been lucky — our clients want to achieve results. But even with that goodwill, an important part of what we do is helping clients focus on the question they want the research to answer. What is their end goal? How will they and their colleagues or team use the results? How does this research fit into the institution’s big picture of goals and tactics? How will the results support evidence-based decision-making?
And the second missed opportunity?
Well, this may sound self-serving, but we find that the even larger missed opportunity is the strategic use of the data itself.
Universities and colleges are exceptionally good at generating data. At any given time, data is being generated about enrolment, grades, fees, grants and scholarships, not to mention any of the projects a school’s Institutional Research team might be running.
The number of colleges or universities that actually use the data they’re generating is very low. A hoard of data by itself can only tell you so much. It takes a deeper analysis to tease out the hidden patterns that reveal opportunities for straightforward action.
But the really big opportunity is to use that data in more intensive and creative ways to further institutional goals. Every institution has pressures to navigate, especially as we head into increasingly uncharted territory. Every institution has goals, whether it’s student success metrics, recruitment, revenue, etc.
What is keeping some universities from adopting a data-driven approach?
It’s probably a few things. One might be the reasonable (and very human) fear of the changes that an honest reading of the results will recommend.
It can feel like a giant leap of faith to follow a study’s recommendations and make significant changes, even if they come in baby steps leading to an ultimately wider change. And for the most part, post-secondary institutions are not known for taking what can look like a big risk. Not everyone likes change, and data-driven approaches to identifying and solving problems generally result in change.
Other schools have historical systems for making decisions and the shift to data-driven decisions challenges that culture. Occasionally, that culture focuses on a single decision-maker who has been in a post for many years and who does not brook disagreements with the policies and operational approaches that they put forward.
Another reason is fear of the cost. It’s always a little disappointing when you’re proposing a study whose results could generate 10 to 50 times the study’s cost in new revenues, only to have someone focus solely on the cost of the study rather than balancing it against the potential added revenues. This is, again, that part of the culture that shies away from risk-taking. The focus of the university is academic and revolves around the needs of faculty. Risk-taking puts the university in more of a corporate state of mind — and while that’s not a place everyone in the university is comfortable going, it’s becoming increasingly necessary as things change.
What is the secret to successfully implementing data-driven research?
There are two.
The first is having a clear plan in mind before you set out to collect the data. That means you need to be clear on the results that the client needs and how those results can be enriched through a variety of analytical means.
The second secret is simply making it easy for people to understand what the results actually mean.
From day one, we swore we would never produce 100-page bricks of tables and, God forbid, pie charts completely lacking in any kind of context, prioritization or analysis of the findings. In our early days, we spoke with an executive responsible for research at a very large Canadian bank about his level of interest in our evolving into a research company. He pointed to his credenza, where there were multiple cerlox-bound legal-sized and 11 x 17 books of a hundred or more pages each. He said, “There’s my research. I don’t need any more research. But I’ll pay you to go through it and tell me if there’s anything in there that I need to know.”
Research shouldn’t be a burden to anyone but rather a pathway to identifying what the priorities are, what resources are needed and what the reasonable next steps are. Our clients should come away with a very clear idea of what matters, what needs fixing, and what their most important two or three priorities ought to be.