Video Interview - How Education, Industry, Society and Tech are Converging to Embrace and Empower Alternative Credentials
In this conversation-style interview, Margo Griffith and Doris Zahner, PhD, from the Council for Aid to Education discuss authentic assessment of essential skills for career success, such as critical thinking, problem-solving and written communication. Zahner discusses how authentic assessment accurately measures critical skills, the evidence from authentic assessments that is provided to educators and learners, and how different institutions use attainment of critical skills data.
1:12 Assessing in-demand critical skills and connecting learners to hiring managers
2:48 How authentic assessment effectively and accurately measures critical skills
5:34 Standardised authentic assessments vs customised assessments
6:40 What evidence of authentic assessments is provided to educators and learners?
8:24 How do institutions use attainment of critical skills data?
10:12 How does personalised evidence support CAE's authentic assessments?
13:52 How a broad range of institutions use authentic assessments
Click on the video below to view, or watch on the Edalex YouTube channel - subscribe to receive updates on new videos:
Margo Griffith (MG) - Hi everyone, I'm honored to be joined today by Dr. Doris Zahner. She's the Chief Academic Officer for
the Council for Aid to Education based in the U.S.
CAE is a leading provider of performance-based authentic assessments for measuring essential college and career readiness skills. Doris, lovely to talk with you again.
Doris Zahner (DZ) - Thank you so much for having me today Margo, it's great to be here.
MG - Absolute pleasure, and Doris in addition to your role at CAE you are also an Adjunct Associate Professor at both NYU and Columbia University and a multi-published author in the area of assessment. So your credentials really speak for themselves and I'm
very excited to get your thoughts today on how we can best tackle this challenge of authentically assessing career readiness or employability skills.
So I would love for you to tell me more about your work and also the work that you do at CAE.
DZ - Great, sure. So CAE has two assessments - the Collegiate Learning Assessment and the Success Skills Assessment, referred to as CLA+ and SSA+ - and these assessments authentically measure essential college and career skills, as we said - such as critical thinking, problem solving and written communication.
So these are the skills that hiring managers value above and beyond content knowledge and yet they're not necessarily explicitly taught in the college classroom and they're also not highlighted or reflected in University transcripts for the most part. I'm speaking predominantly for domestic U.S but I believe this to be true internationally as well, so CAE has a solution to help students and
educators and institutions and hiring managers all these people to make this connection between those who have these skills and those people who are looking to hire students with these skills into the workforce.
It's also a way for students and educators to identify where students need to improve these skills, so that it's better for their careers and they become contributing members of the workforce.
And so since 2005 we have assessed over 800,000 students, spanning middle school, high school and higher education, both domestically within the U.S. and internationally - 1,300 institutions worldwide.
MG - Wow that's an enormous amount of experience in this area. Do you think that the importance of being able to authentically showcase these really critical skills has, I guess, has the importance increased with COVID in your opinion?
DZ - I don't know necessarily if the importance has increased due to COVID, but if we just think about this - let's do a little thought experiment. If we accept the idea that critical thinking and written communication skills are important for positive career outcomes, then we should be able to help students and educators identify those students who are strong in these skills as well as those who need improvement in these skills. And in order to do so, you can use assessments.
Many assessments are traditional multiple-choice tests, so students are going in and filling out and selecting an answer - or you can do what we do, which is to use a more authentic approach, which is a performance task and we ask students to actually do something that is similar to what they would encounter in the workplace.
So, for example: CAE's assessments present students with an issue about which they need to make some kind of decision. And the scenarios that the students are put in try to imitate what they would encounter in their first job. So whether it's an internship, or they're putting together a report or some kind of response to a client or their manager - these are the types of situations they're in. We ask them to make a decision, and then they're asked to review the evidence that's presented in a document library. So we give them all the information that they need in order to come to a conclusion, and they're presented with multiple sides - or both sides - of the issue.
So it's not just like there's one correct answer, which is what makes CAE's assessments very unique. Just like in the real world, there isn't always just one correct answer. You can have multiple approaches. What we measure students on is how well they can articulate and support their decision with evidence - and that's the critical thinking piece. So the recommendation that they would put together takes the form of some kind of written response, like an email or memo as I said, to a manager or client, and they would back their decision with evidence from these documents. So you can pick decision A, as long as you're able to cite the evidence from the documents that supports decision A and refutes decision B. That's how you would do very well.
And what's also very unique about these assessments is that there's no prior knowledge that's needed. So you could be a STEM major and do very well on a performance task. You could be a Humanities Major as well. So fields of study do not have any kind of interaction with the topics of these performance tasks.
MG - How interesting. Are these assessments standard, I guess, out-of-the-box assessments, or do you also customise, or do bespoke assessments for particular institutions or organisations?
DZ - At the higher education level, if they're looking to measure critical thinking, we have different topics for the performance tasks but they are pretty much standardised - and it's important that they are, because we have norm- and criterion-referenced data points for the students. So they can tell how well they're doing based upon either a standard-setting study which was conducted by CAE - myself and my colleagues - or compared to an enormous reference sample of other comparable students, whether by their year in school or their field of study or their gender or ethnicity, et cetera.
There are different ways in which students can compare their data to each other and we think that this method of assessing, including giving this benchmarking data, is more authentic than going and selecting a single multiple-choice question answer type of assessment.
MG - And so, for institutions and learners what do they get, you know, in terms of being able to - or as a tool, or as a document or something - what do they get to be able to showcase their skills, say to future employers?
DZ - Right, so a great question. Students who take the assessments will receive individual student reports with their data, and if they do well enough - meaning that they've scored within the proficient, accomplished or advanced mastery levels - then they receive a micro-credential through Edalex, and these micro-credentials are verified.
So we make sure that there's backing behind what the students are getting, and this verifies their skills - saying that you are proficient, accomplished or advanced. And it's not just the name: these are the exact knowledge, skills and abilities that students at these mastery levels have, and that's what's outlined on these micro-credentials.
They also receive reports of their student performance, and then they can look at how they've performed against other students within their own institution, sometimes, or against a domestic or international database - it really depends on what they're interested in looking for.
At the institutional level, institutions of higher education would receive a report that aggregates all of their students who have taken the assessment. From that data they would be given distributions of how well their students are doing, as well as the number of students who are proficient or beyond within the assessments - for example, how many of them have earned the credential and the micro-credential badge.
MG - So in your experience Doris, when the institutions receive this data, what do they tend to do with that?
DZ - That's also a really good question. It depends on the research question that they want to answer. So some institutions that participate with us are only interested in value-added, or student learning gains, and in that case they take a sample of students that are entering and compare it to the students who are exiting, and they look at the difference in the performance between these two groups. That's sort of at the institutional level, and they oftentimes use it for different purposes, such as accreditation for the institution itself - for example, going to Middle States to get accredited.
Other institutions want to actually use it in a more formative setting, so they want to know how their individual students are doing, and then to have faculty members, or a writing center, or another office within the institution, help students improve their skills. And so the instrument is flexible in that respect.
And then a third use, which is not as common, is for a University program to measure the efficacy of its programs. So let's say they implemented a new curriculum and they want to know how effective that curriculum is. We have a group that decided they were going to change their freshman-year seminar format and use performance-based tasks in that seminar to help students improve their critical thinking skills. And so the assessment was then used to measure whether there was improvement, by comparing students who were part of the old curriculum versus those who were part of the new curriculum.
MG - Okay, that makes sense. So CAE and Edalex, we have a partnership and we're super excited about that partnership. For us, being able to work with an organisation that has such rich and robust experience in the area of performance-based assessment, 21st-century skills and career-readiness skills - you know, it seems that we are very closely aligned.
With Edalex, what we do is provide the personalised evidence that sits behind the micro-credential, and this comes directly from your assessments. Would you mind talking a little about, from a theoretical standpoint, what that part of the puzzle brings to some of your clients, but also to CAE as an organisation?
DZ - Sure, so we're also super excited about this partnership. Very specifically, CAE has always actually had these badges, electronic badges, but they were just badges. They just said "accomplished" - you're an accomplished critical thinker - and there was a little blurb, about a paragraph, about what that meant. But the feedback that we received from institutions and students was - well, what am I supposed to do with this? Is that it?
It was like a PDF that they could print, or - the idea was that they were supposed to be putting it on their LinkedIn pages, and again, this is in the domestic U.S. context for the most part - so they didn't really know what else to do with it. It wasn't a verified badge and it didn't have all of the backing of, you know, explaining these skills. And so when CAE and Edalex started talking, we were very excited about this idea of having verified information and having detailed information, and we were really impressed with how you took our rubric and were able to get sample badges - micro-credentials - to us that really illustrated how this could be used for students.
And so we have a client - a major client internationally - and they tested a bunch of students this past academic year, and we gave them their electronic badges. This was prior to our partnership with you, and they were like - well, this is it... what are we supposed to do with it? I guess we can issue these to the students, but then the students are like - what do they do with it? But now we've mentioned these Edalex badges, the Credentialate ones, and they're like - wow, this is really, really useful, and it can highlight and showcase their skills to potential employers. So we see this as a huge improvement upon a product that really didn't have much behind it.
The idea behind the badges was originally to increase student motivation, because students, to be honest, don't want to take tests - no one's psyched to take a test - so we thought that having these badges would help increase student motivation, so they would want to do better.
But now we have something that is a verified, authentic badge like this, that really is able to showcase and highlight student skills, so that they can bring this to employers. And again, like I mentioned earlier, these are not skills that appear on a transcript, so students can differentiate themselves from every other student out there who has a high GPA - 3.7, I think, is the average GPA these days. It's a way in which they can showcase these skills, because we know that these are the skills that hiring managers and employers are looking for in people who have recently graduated from University.
MG - Right so, one of the things with CAE that we've learned through our partnership is the breadth of clients and customers that you have - all the way from, you know, individual sections within colleges, to bigger systems and colleges, and even governments. Can you let us know a little bit about some of the work that you do across the breadth of those customers?
DZ - Sure - so, at the smallest end, our smallest clients are individual institutions of higher education, and typically so far these institutions have been using the value-added model. They'll assess entering students at the beginning of their time - the first semester in which they enter university - and then in the spring semester, during the same academic year, they'll assess exiting students. So these are not the same students.
This is a cross-sectional model - we look at the difference between the entering and exiting students and give the institution a score, either a value-added score or an effect size, depending on how many students they have and what type of institution it is, etc.
So, that's at the smallest end. Then we have systems - Universities that have multiple campuses - and they will sometimes use it across all the campuses within their system. They measure, for example, third-year students, because they want to know where they are for benchmarking, or they'll measure all of their entering students because they want to try to improve their skills. We also have groups within the United States; the government-level ones are usually state university systems, like I said, where they have multiple campuses.
But internationally, we've been involved with quite a few different types of institutions as well. In Latin America, for example, we engage with individual Universities, and most of them are looking to measure the growth of their students within particular fields of study. So they're not just interested at the institutional level - they want to know how their STEM students are doing, as an example. We also have different states testing with, again, a similar model, where it's all the campuses.
We did have a participating group internationally use the assessments for exiting students as a graduation requirement. Because our assessments are domain-agnostic - you can be from any field of study, take the assessment and do well, without prior knowledge in a content area - they wanted to replace content assessments, such as assessments in physics, and use a single assessment across all students. That way there would be a standardised assessment students could be measured against each other on, because if you have domain-specific assessments, you can't necessarily measure across domains. So that is another use.
And then we have a partnership with the OECD in Paris, and they help us with recruiting ministries - larger organisations at the ministry level. So we have had ministries of education participate with us, and in that case it's less focused on the individual student and more at the institutional level - like, how well do higher education institutions within a country do? So it's more of a benchmarking type of thing. It's not comparative, but a snapshot, I should say, of how well students are doing.
So it's everything from small schools all the way, like you said, up to ministry level.
MG - Fascinating work. Doris I learn so much every time I speak with you - thank you so much for sharing all of your knowledge today. Lovely to see you again.
DZ - Yes, great to see you, it's great to be here.
MG - Thanks.
Credentials just got personal - Unleash the power of your skills data and personal credentials
Credentialate is the world’s first Credential Evidence Platform that helps discover and share evidence of workplace skills. Launched in 2019, it was initially developed in close collaboration with leading design partner UNSW Sydney, in support of a multi-year, cross-faculty community of practice and micro-credential research project. Credentialate has continued to evolve at an accelerated pace, informed by partnership with educators and industry leaders from around the world. Credentialate provides a Skills Core that creates order from chaotic data, provides meaningful insight through framework alignment and equips learners with rich, personal, industry-aligned evidence of their skills and competencies.
Find out more at: edalex.com/credentialate