We all know there’s a skills gap. It’s real, it’s a problem, and it’s time we moved beyond talking about what it is, how large it is, and why it matters to a more important topic: what do we do about it?
The answers, if indeed there are any, are not easy to discern, let alone implement. As appealing as skills-based learning is to nearly everyone involved in both industry and education, implementing change comes with significant obstacles. Many micro-credentialing programs operate outside the accreditation process, so they struggle to gain legitimacy and the traction that comes with it.
Let’s look at why that is, and what we might – or might not – be able to do about it.
Determining the Skills That Lead to Employability
The very first thing that needs to happen is the definition of 21st-century skills, the ones that lead to employability. In this series, we’ve already tackled the problem of defining both those skills and the very term employable. Even once we have established a common definition of terms so we are all speaking the same language, there’s another challenge.
New conversations need to be started between industries and educators: “We need more connections and discussions between industries and universities, where on-the-job-learning is part of the curriculum,” Noam Mordechay, Vice President of Enterprise Innovation at Gloat, told the BBC. Conversations that have started need to continue.
As Emeritus Professor Beverly Oliver indicated in the whitepaper Rethinking Employability Beyond 2020, placing an employability lens on the curriculum is as crucial as building the bridge for learners to the world of employment beyond the campus walls.
There are of course broader, wider issues at play. Rapid technological change is impacting nearly all job classifications globally – and has only been exacerbated by the COVID-19 pandemic. If education is the currency for the knowledge economy, are all forms of education equal?
In their recent article Good Jobs in Bad Times, Jeffrey Selingo and Matthew Sigelman outline steps institutions can take to reshape their approach and ensure learner success beyond graduation, arguing that colleges have to be “more agile, adaptive, and imaginative.”
However, as a starting point, we can’t simply throw out the current education system and broadly replace it with a new one, regardless of the conversations we’re having. We have to start with the system we have now.
Evaluating Current Curricula and Courses Broadly
This is where technology can help. There are a variety of AI initiatives that help educators perform what appear to be simple tasks until we look closely at what’s involved with each step.
EMSI, based in the US, has introduced its Skillabi program. Using technology, it allows educators to begin this process:
- Skillification – current curriculum and the skills taught are matched with skills employers are looking for in an apples-to-apples comparison.
- Market-aligned skills are identified and emphasized.
- In-demand skills are identified, skills that educators may want to add to their curriculum.
- The analysis shows where needed skills are taught in other programs and could be cross-applied through curriculum modifications.
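As a toy illustration of the comparison these steps describe, the sketch below matches taught skills against market demand with simple set operations. The skill names, programs, and demand data are all invented for the example; real tools like Skillabi work from large labour-market datasets, not hand-typed sets.

```python
# Hypothetical curriculum data: each program maps to the skills it teaches.
curriculum = {
    "software development": {"python", "algorithms", "testing"},
    "communications": {"listening", "presenting", "writing"},
}
# Hypothetical skills employers are asking for.
market_demand = {"python", "testing", "listening", "cloud computing"}

# Skillification: market-aligned skills each program can emphasise.
for program, taught in curriculum.items():
    aligned = taught & market_demand
    print(program, "aligned:", sorted(aligned))

# In-demand skills taught nowhere in the institution are curriculum gaps.
all_taught = set().union(*curriculum.values())
gaps = market_demand - all_taught
print("gaps:", sorted(gaps))

# Skills the market demands that one program lacks, but another already
# teaches, are candidates for cross-application.
dev_skills = curriculum["software development"]
borrowable = (market_demand - dev_skills) & all_taught
print("borrowable into software development:", sorted(borrowable))
```

Even this toy version surfaces the three outputs the list above describes: aligned skills per program, institution-wide gaps, and skills that could be borrowed across programs.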
The problem comes from trying to take what is essentially a basket containing a variety of fruit and making those apples-to-apples comparisons. In some cases, some of the ‘fruit’ (to use our analogy) has gone bad and needs to be thrown out. In others, we may have the wrong type of fruit in a particular basket – but it might work very well in another.
For instance, listening is a skill that is important in nearly every career, yet it may not be taught meaningfully in a software development curriculum while being taught well in a communications program. By simply ‘stealing’ or ‘borrowing’ a skill already taught in another area of the university, a valuable, transferable skill can be added to an existing program without creating something new.
This means it isn’t enough to evaluate a single program in a university; instead, the entire curriculum needs to be evaluated, categorised, and modified based not on opinion or legacy frameworks but on intelligently gathered and analysed data.
What do we do with that data? And how do we share it? Enter the Internet of Education (IoE) – a term coined in January 2020 at the World Economic Forum meeting. This has quickly become a global movement defined as ‘a global ecosystem of trust that enables networks of personalised and effective learning.’
The Learning Economy Foundation has become the steward for this movement, and as a result, has set up several ‘Labs’ in what they call Education 3.0. “Together we can create a world where degrees, credentials, online learnings, workplace development and career progress are saved on learners’ phones and verified on the spot. A world where learners can effectively map education toward their goals, scholarships, holistic living and abundant employment.”
The key is that these labs offer a “fully aligned, yet decentralised, ecosystem for collaborative innovation, investigation and experimentation for the future of education and employment.” Does this sound challenging? It is – and the ideas are still in their infancy. This article is a great starting point, a must-read overview. As great as all of this sounds however, there are additional obstacles to a decentralised system. And it starts with frameworks.
Current vs. Emerging Frameworks
We’ve talked in this series about frameworks, including the one implemented by IBM and other private companies to provide their employees and others with a micro-credential system that also helps them define a path to career advancement.
These frameworks exist for the most part outside the academic accreditation space. Why? The simple answer is trust, and it is perhaps one of the most difficult issues to overcome. Frameworks are in place for a reason. A degree from Harvard means something because the university has high standards and a rigorous curriculum that ensures a graduate has learned specific things at a high level.
While each educational institution has its own framework, there are also various government-level frameworks, for example, the Australian Qualifications Framework (AQF). These also exist for a reason: a four-year degree from any institution means the learner has at least attended a certain number of classes, and their major declares what they have had the opportunity to learn and have mastered well enough to receive a passing grade. GPA / WAM is also somewhat standardised within a framework.
Any disruption of that framework must be approved at the highest levels of the organisations that established it, including accreditation bodies such as the regional accreditors in the United States. Otherwise, the institution risks losing accreditation.
In addition, the learner must have some method of verifying to a future employer that their education has value, and that value is often found in the accreditation of schools by established organisations. While ideas are developing around self-sovereign identity (SSI) systems, often built on blockchain technology, to ensure that micro-credentials have verifiable meaning, for the most part they don’t carry the weight of a university degree.
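To make the verification idea concrete, here is a minimal, purely illustrative sketch of how an employer could check that a signed micro-credential has not been altered since the issuer created it. Real SSI systems (such as those following the W3C Verifiable Credentials model) use public-key signatures; a shared-secret HMAC stands in for that here, and all names and values are invented for the example.

```python
import hashlib
import hmac
import json

def issue_credential(payload: dict, issuer_key: bytes) -> dict:
    """The issuing institution attaches a signature to the credential."""
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(issuer_key, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

def verify_credential(credential: dict, issuer_key: bytes) -> bool:
    """An employer recomputes the signature to check the credential."""
    body = json.dumps(credential["payload"], sort_keys=True).encode()
    expected = hmac.new(issuer_key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

key = b"issuer-secret"
cred = issue_credential({"learner": "A. Student", "skill": "listening"}, key)
assert verify_credential(cred, key)        # untampered credential verifies
cred["payload"]["skill"] = "leadership"    # any tampering breaks verification
assert not verify_credential(cred, key)
```

The point of the sketch is the trust question discussed above: a signature only proves the credential is unaltered; whether the issuer’s signature carries weight with an employer is exactly the legitimacy problem accreditation currently solves.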
Here Come the Regulators
Regulation, usually by a government entity or other organisation, is a huge part of accredited education, from the Department of Education in the US to the Tertiary Education Quality and Standards Agency (TEQSA) and the Australian Skills Quality Authority (ASQA) in Australia.
These bodies in large part govern and approve the curriculum taught by schools within their jurisdiction. Variation from the established frameworks and curriculum meets roadblocks that are not easily removed. New ideas must be tested to ‘prove’ that learners still meet established standards at certain milestones and upon completion of their education.
History has proven the need for these regulations. A tragic example is the for-profit private universities that thrived for a short time in the United States: graduates found their degrees meant less than similar degrees attained elsewhere, and many found themselves underemployed, if they were employed at all.
This regulation is largely centralised and, if not directly controlled by governments, is recognised by them. Institutions that don’t play by the rules are ‘penalised’ through perhaps the most compelling lever of all: funding.
Show Me the Money
This brings us to the final point of this article, one that deserves an entire article all its own: funding. The biggest obstacle to change is often funding. Investing in emerging AI in education is extremely valuable, but it does not have the ‘curb appeal’ of electric vehicles, nor do we have an Elon Musk-like character tweeting about it and creating investor buzz.
Yet it is likely these forms of private investment will lead to the greatest innovations. Private companies must be the ‘first’ to establish frameworks that are proven to produce outcomes that satisfy industries, educators, and employees. Once they are proven to impact the skills gap and offer a solution, broader public approval will surely follow.
In turn, this will mean that a skills-based curriculum may transform what a degree looks like, whether that is a stacked deck full of micro-badges or a more traditional degree enhanced by them. What will that look like, if, or when, it does happen?
Like much of this post and the posts before it in this series, more questions are asked than are answered. So it’s your turn to talk – and we encourage you to talk, not just to us, but among yourselves, to answer:
- How do industry and education best communicate and agree upon a set of skills that leads to well-rounded and employable learners?
- How do we better evaluate current curricula and modify them into new, skills-based platforms using the technology and other tools at our disposal?
- How do we establish frameworks that help learners prove what they have learned and at the same time allow potential employers to verify that learning?
- What role do regulators play, especially in an emerging, decentralised, and collaborative environment? How do we ensure those frameworks align meaningfully?
- Where does the money come from for research, testing, and implementation?
We look forward to your answers and the further conversations that will be necessary to develop skills-based curricula that can truly make a difference.
Connect with us
Credentialate assesses, monitors, promotes and validates learners’ attainment of evidence-backed skills, supporting the transition from learner to earner. It is a secure, configurable platform that assesses and tracks attainment of competencies and issues micro-credentials as digital badges to students.
Untangling the Modern Credential Marketplace
In this blog series, we seek to untangle the modern credential marketplace by examining it from multiple perspectives, including: