Digital Badges - Credentialate Guide to Digital Badges in Education
What’s the difference between employment and employability data? What can employability data tell us about a learner's job or career prospects? In this information-rich Credentialate Guide we look at the currently available data and what data we could be capturing to give us a better understanding of learner employability, with the ultimate aim of driving graduate employability outcomes.
This guide is about a 19-minute read - enter your email below to have a PDF copy sent direct to your inbox:
Big data impacts every single business and industry around the world. When it comes to data, however, education itself needs an education. We already collect a great deal of data from educational institutions at the national level. What data should we be paying attention to? What should we be capturing? And what are we missing?
At an institutional level, graduate employment data can tell educators a lot about what is happening right now, and how effective their curriculum is - or is not - at producing desired outcomes. For most learners, the primary desired outcome of higher education is getting a job or moving up to a better one. This is often why the focus is on whether or not the graduate is employed in their chosen field of study.
But if the student is not employed, why not? Did they graduate employable but choose to remain unemployed for personal reasons, or accept a position in which they are underemployed? Or do they actually lack the skills an employer needs to deem them employable? Alternatively, they may have the skills that make them employable, but lack the documentation - the certifications, if you will - to prove those skills beyond the degree or diploma they received.
First, let’s start with a couple of definitions, to be sure we are all on the same page. Then we will look at a small portion of the data we have now and what it shows, and what data we would like to have that might be more useful.
Employment data essentially tells us what a graduate does, career-wise, a few months or, in some cases, years after graduation. Some graduate and learner data goes beyond average GPA and graduation rates. It is important to look at several different factors when evaluating outcomes - particularly employability and employment data:
This data can be very revealing, and is often a primary driver behind funding, but it is also vital to both learners and employers as well.
Why do we call this employment data? Because many types of data, under various names, are in circulation. Employment data is not our ultimate focus, but here are a few definitions for you:
What most of these boil down to, with the exception of learner data, is employment. Because what does a degree mean if it doesn't result in employment? What skills does the graduate really have that relate to a job?
But perhaps just as important: why does any of this employment data matter?
Why do we need all this data? The most commonly stated reason is to measure the practical effectiveness of learning. For example, let’s say a business school had an 80% graduation rate with an average 3.5 GPA. What if none of those students graduates with the skills they need to get a job? How effective was their education?
This matters to the learner, who wants to understand the value of the time spent learning and how that translates to real world employment. It also matters to employers who want to know what skills the learner brings to the job, and want tangible, verifiable proof of those skills.
The other primary reason is funding. As an example, in the United States, learner outcomes are often tied to Federal funding: if an institution fails to meet a certain outcome standard, it may become ineligible for certain Federal programs, can lose accreditation, and, worse, will suffer reputational damage from which it may never recover (see an excellent 2019 article in Times Higher Education on this subject).
One US example is the funding awarded for serving veterans. While there are two types of funding, what matters in this case are the requirements that apply to any institution - for-profit, non-profit, or Title IV accredited. Graduation rates, post-graduation employment rates (especially for the veteran population), and even how long learners stay in their career field are audited at least annually. If an institution cannot pass this audit, its veteran funding is denied until it can pass a new audit and reapply for the program.
Performance-based funding is also being rolled out across the globe, for instance in Australia, the United Kingdom and Canada. The principle is the same: schools that perform well get more money, and those that don't risk losing funding entirely.
Clear examples come from the failure of many for-profit "universities" with questionable business practices in the US. There are also numerous articles discussing the failures of performance-based funding, the burden of coercive reporting, and the resulting inequality for students.
In short, graduate outcomes illustrated by institutional data are the benchmark for success or failure. Right now, that data focuses primarily on employment rather than employability. But can we do better?
The answer is, yes, we can.
What does “better” look like? The answer is not simply more data, but richer data. If we are talking about employability, then richer data related to employability skills, not just industry specific skills.
For example, a bachelor's degree in business from one university might not carry as much weight as the same degree from another university. What is the difference? Is one school actually teaching soft skills that another, similar school is not? What are those skills, and how are they measured and documented? This is not easy, and at the moment assessment and evaluation of employability skills is largely subjective. Without a clear framework and well-defined standards, how can these skills be measured universally?
This is the problem at its core: how do we measure and document leadership and listening skills, for example, as part of the above business degree? How do educators measure an individual learner’s aptitudes and document them as part of a transcript?
That is the challenge, but to address the needs of learners and employers, it is one that must be met. Learners need to know and be able to articulate what skills they are getting from a course, and what those skills mean. To do this, we need to align the curriculum to the skills, track the student’s mastering of them, and then find a way to document them. Those skills need to be available for students, potential employers, and others to access.
If you ask learners what they got at the end of their course, they will typically tell you what grade they received. What they should be able to tell us is which skills they've learnt and why those skills will help them in their career - something Credentialate delivers through its personalised evidence layer.
And there is other good work being done in this area. EMSI in the United States has introduced Skillabi, an AI-based analysis tool that helps institutions to analyse curricula and determine where soft skills are already being taught in the current educational framework. This allows institutions to add courses or parts of courses to a current area of study, creating a holistic, instead of departmental, approach to curriculum development and revision.
The entire premise of the organisation is to connect students with education and work in a meaningful way, and a big part of that involves a credentialing system that goes beyond the current “grading” and evaluation systems that are an integral part of traditional frameworks but also the foundation of “employment data".
Where does this data come from and how is it gathered? First of all, within traditional frameworks, a learning institution is obligated to report graduate outcomes to either an accrediting agency, a government entity, or both.
The format of the reporting depends on the agency requesting the data, but there are typically several points addressed, similar to the questions asked above. So for example, if we were to look at the most recent UK learner outcome data, we would see:
You can see all of the current data for yourself at the HESA website. Each country or region has its own graduate outcome reporting or data set, most gathered and reported annually. Most often this data is taken from higher education institutions that offer bachelor's, master's and other advanced degrees.
However, as widespread as this data is, it's incomplete. Participation in certain parts of a graduate questionnaire is voluntary: not everyone answers every question, and some do not participate at all. Nor do all accrediting agencies share their information in a single digestible format.
And there is another issue: what about those smaller, shorter courses that teach equally valuable skills? In a recent press conference, the Lt. Governor of Idaho, Janice McGeachin, announced that Idaho had joined a limited number of other states in including skills-based training graduate outcomes in their reported statistics.
In this case, that reporting included several members of the Northwest Career Colleges, including electrical lineworker programs, beauty school and franchise training, and others. "We realised Idaho was missing out on a huge potential in recognising these career paths," McGeachin said.
However, in the process they discovered that these schools appeal to more than just high school graduates seeking a career. "Many of our students are non-traditional students," Janier Smith, Director of the Restorative Therapy Program at Stephen Senager College told us. "They choose our program over a traditional one because they want to get done faster and reenter the workforce in a productive position."
For many jurisdictions, data on skills-based education programs, including non-traditional ones, is now included in employment outcomes. Even where this data has been gathered before, it's not nearly as comprehensive as that obtained from more traditional frameworks. However, skills-based learning, especially when specifically targeted to a career path, often results in higher employment rates in skilled positions that pay more.
Without including outcomes from skills-based learning and other “micro-programs” that provide learners with certificates and credentials that create employability, we’re missing a lot of data about additional educational options and the skills learners take away from those courses.
Much of even the most robust of this data is called “experimental statistics”. In other words, we are still really in a testing stage, and models are not fully developed. Where is this data documented, and what does that documentation look like?
The HESA website mentioned above includes some of the most robust graduate outcome data available; the National Association of Colleges and Employers (NACE) in the US also provides a great deal. This data is typically documented in a variety of ways, including digital charts and graphs and downloadable spreadsheets of raw data, so users can do their own analysis.
One factor impacting this data is the student response rate. For example, the NACE data includes data from 358 schools, of which:
This is likely the largest collection of data on graduate outcomes in the United States. Even so, that means we only have data on the following:
Even the best data sources are less than complete. In the HESA data, 214,280 of the 478,805 individuals surveyed did not respond at all.
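The HESA figures above translate into a response rate of just over half, which is easy to verify:

```python
# Response-rate arithmetic for the HESA figures cited above
total_surveys = 478_805      # individuals surveyed
non_respondents = 214_280    # individuals who did not respond at all

respondents = total_surveys - non_respondents
response_rate = respondents / total_surveys

print(f"{respondents:,} responses ({response_rate:.1%} response rate)")
# 264,525 responses (55.2% response rate)
```

In other words, nearly 45% of outcomes are simply unknown, which is worth keeping in mind whenever this data is used for funding or policy decisions.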
Multiple organisations do their best to gather and aggregate this data to better inform both educators and employers. And while graduate outcomes are a popular concern among learners, most learners depend on simple article summaries like the Best Colleges for Your Money list.
This data, as useful as it is in aggregate, is much more useful at an institutional level. Where does a curriculum need to be more robust? Where are skills missing? What do students need to be employable that they are not getting? Graduate outcomes and performance data provides decision makers with a clearer picture of what is happening now, and how that aligns with future goals.
Are educators really equipping learners for the current job market? If not, why not, and how can those outcomes be improved? Current employment data essentially provides a starting point, a foundation for making data driven decisions about change.
What it all comes down to is this: higher education needs a lot more data to be competitive going forward. The high cost in time - and often money - of a four-year degree must be clearly demonstrated as worthwhile for both traditional and non-traditional students to embrace it.
Employers and educators alike need a way to learn what skills a learner actually possesses when they complete a course of study. A more robust assessment that doesn't overly burden educators must become the norm. This is the goal of QACommons and other initiatives, which seek to certify skills and ensure students are employable for their first job and the ever-changing workplace they will face.
Then the data must be documented somewhere and in such a way that an AI data search can both find it, and create a credential for it. In this way the student has a meaningful, verifiable, and secure way to illustrate the skills that make them employable.
What does that future look like? It could mean a combination of courses and micro-credentials “stacked” to form a degree or certificate, essentially a “proof of skills.” Ongoing credentialing and education could become the norm. These “digital badges” could hang on a decentralised yet verifiable framework.
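To make the "decentralised yet verifiable" idea concrete, here is a minimal sketch of what a verifiable digital badge record can look like, loosely following the Open Badges 2.0 vocabulary. All names, URLs and dates are placeholders, not real endpoints; note how the learner's email is hashed so the badge can be published without exposing their identity:

```python
import hashlib
import json

# Hypothetical learner details (placeholders for illustration only)
email = "learner@example.edu"
salt = "deadsea"

# Open Badges-style hashed identity: "sha256$" + hex(sha256(email + salt)),
# so the assertion can be public without revealing the email address
identity_hash = "sha256$" + hashlib.sha256((email + salt).encode()).hexdigest()

assertion = {
    "@context": "https://w3id.org/openbadges/v2",
    "type": "Assertion",
    "id": "https://example.edu/assertions/123",          # placeholder URL
    "recipient": {
        "type": "email",
        "hashed": True,
        "salt": salt,
        "identity": identity_hash,
    },
    "badge": "https://example.edu/badges/teamwork",      # BadgeClass describing the skill
    "verification": {"type": "HostedBadge"},
    "issuedOn": "2022-06-30T00:00:00Z",
    "evidence": "https://example.edu/evidence/123",      # personalised evidence page
}

print(json.dumps(assertion, indent=2))
```

The `evidence` field is what turns a decorative badge into proof of skills: anyone checking the badge can follow it to the learner's documented achievements.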
For this to work, we need more data. We know we are starting with what we have now, something educators, learners, and employers all know is not enough. What we don’t know, what many stakeholders are working on, is what data we do need, and how we gather, measure and evaluate it.
It is through this transformation - the shift in focus from employment data to employability data, what that really means and how we use it - that we will change the language we use, the way we verify and validate learning, and the way we look at a candidate seeking employment.
Credentialate is a secure, configurable platform that assesses and tracks attainment of competencies and issues micro-credentials to students backed by personalised evidence at scale. By automatically extracting data from existing platforms and using an organisation's own assessment rubrics, we can objectively measure awarding criteria and validate the evidence behind them.
By this same method we can automate the assessment, monitoring, promotion and validation of evidence-backed skills. For an institution, we provide the data and insights required to track skills and competencies across courses and entire programs.
Finally, we have decades of collective experience in educational technology and long-standing ties with global educational powerhouses. These solidify our ability to produce credible digital badges.
Credentialate assesses, monitors, promotes and validates learners’ attainment of evidence-backed skills, supporting the transition from learner to earner. It is a secure, configurable platform that assesses and tracks attainment of competencies and issues micro-credentials to students. If you’d like to learn more About Us and how we can work together, contact us or Schedule a Demo and let’s discuss!
The world's first Credential Evidence Platform
Launched In 2019, Credentialate is the world's first Credential Evidence Platform that helps discover and share evidence of workplace skills and analyse competency achievement across an institution like never before. It automatically extracts data from your LMS and issues a digital badge with an embedded personalised evidence page. This unique page details learner achievements using qualitative and quantitative data and can be issued together or separate to the digital badge. Available for the first time globally through Credentialate, is a suite of Skills-First Evidence Alignment tools - such as a Skills and Competency Library, Evidence Matrix builder and Framework Alignment - which recognise and support the burgeoning skills economy, and enable educators to apply backwards-design to micro-credential development. Credentialate was developed by Edalex - an edtech company on a mission to surface learning outcomes, digital assets and the power of individual achievement. Founded in 2016, Edalex brings together the team behind the CODiE award-winning openEQUELLA open source platform that centrally houses teaching and learning, research, media and library content.
Find out more at: edalex.com/credentialate