Pedagogo
Pedagogo is the podcast for anyone and everyone in higher ed, brought to you by ExamSoft. Tune in for innovative ideas, thoughtful discussions, and expert perspectives to transform your thinking and practice in education and assessment. You’ll come away from each episode with tools, resources, and strategies for success, brought to you by thought leaders, subject matter experts, scholars, and professionals. Join our host and ExamSoft Director of Education and Assessment, Dr. Divya Bheda, for an exciting fourth season, where you’ll encounter diverse ideas and perspectives on building community in higher ed and taking collaborative action to foster student success. Seasons 1-3 explore topics such as change management, democratic education, and the future of assessment, as well as best practices to foster cultural attunement in the classroom and strategies to effectively assess student learning.
Realizing the Value of Comprehensive Assessment
In our final episode of the season, we talk assessment. How do we determine what students really know about what they know? Join Dr. Allison Case and guest Dr. Natasha Jankowski, Executive Director of the National Institute for Learning Outcomes Assessment (NILOA) and Research Associate Professor in the College of Education at the University of Illinois, Urbana-Champaign, as they take a deep dive into establishing impactful practice in assessment. Learn about the four lenses through which assessment can be viewed to add value for both faculty and students.
Show Notes and Resources:
NILOA: National Institute for Learning Outcomes Assessment
Natasha talks about the Spellings Commission and their work in 2008 for standardizing measurement
AAC&U : The mission of the Association of American Colleges and Universities is to advance the vitality and public standing of liberal education by making quality and equity the foundations for excellence in undergraduate education in service to democracy.
TILT: the Transparency in Learning and Teaching project, which aims to advance equitable teaching and learning practices that reduce systemic inequities in higher education through two main activities:
- Promoting students' conscious understanding of how they learn
- Enabling faculty to gather, share and promptly benefit from current data about students' learning by coordinating their efforts across disciplines, institutions and countries
Intro (00:00):
Pedagogo: the show that brings education to your ears and meta-mastery to your assessments. Today's episode discusses the rapidly changing future of education. Spoiler alert: it may never be the same again. Pedagogo is brought to you by ExamSoft, the assessment software that keeps security and integrity in your exams while providing you actionable data for your outcomes. For all of the toughest testing challenges, ExamSoft has you covered.
Allison Case (00:29):
Welcome to the last episode of Pedagogo Season Two. Time flies when you're having fun, which helps when COVID makes it feel like time is standing still. Today, I get to talk to Natasha Jankowski, Executive Director of the National Institute for Learning Outcomes Assessment. A philosophy major, Natasha jumped at the chance to apply her philosophical deep armchair thinking to education. The result, thankfully for us, is brilliant insight, a science and an art to assessment, and forward thinking to move education forward. Tune in to hear how assessment could look in the future and how it has the chance to change everything about what our students know about what they know. Let's get started.
Allison Case (01:12):
So, Natasha, thank you so, so much for joining us today. It has been a professional goal of mine to speak with you, because I admire the work that you're doing and the voice that you have to speak to the value of learning outcomes assessment. To kick us off, I'd love it if you could tell me: what do you do every day?
Natasha Jankowski (01:29):
Sure. Well, thank you so much. I am truly delighted to be a part of this conversation. I am always waiting for a moment to talk more about assessment, so any opportunity to engage in that, and to think toward the future of it, is always so welcome. I serve as the Executive Director for the National Institute for Learning Outcomes Assessment, also known as NILOA, and I also serve as a Research Associate Professor at the University of Illinois, Urbana-Champaign in their College of Education. What I do every day is think about what we're doing when it comes to student learning, really: What do we need to think more deeply about? Where do we need to take some moments to pause? And what are the spaces, places, tools, and resources that we can get in front of folks to help them think deeply as well about such an impactful practice as assessing student learning?
Allison Case (02:21):
I love it. I can't wait to get to the answers to all of those, or at least touch on them. I think those are wonderful, wonderful questions. I love starting off with baseline questions, because I think if we put 10 different people in a room, we'd potentially get 10 different answers to this. But Natasha, what is assessment, generally speaking? What is it to you, and what do you think is the goal of assessment?
Natasha Jankowski (02:42):
Sure. Those are big, debatable questions, for sure, but I think they're necessary anytime we get into assessment conversations. The most often cited and sort of agreed-upon definition is that assessment is a systematic process by which we ask: is what we're doing really getting us to the goals, the values, and the learning that matter most for our students? So that involves processes and practices, but assessment also has a long history of being a test or a measure. So there's the process piece, but there's also this element of what your assessment instrument or tool is, and a lot of times there's a default setting to think about it from a standardized-test perspective. And then there's also the ongoing process of reporting and capturing what people are doing for the purposes of compliance. We have to report to some folks, accreditors and external bodies, that want to know: are we assuring quality? Are we doing what we said we would with our students, and how do we know? So there's definitely that accountability function for assessment. But for me, and I'm delighted that you mentioned my background in philosophy, it's really about the arguments that we make from the evidence we can compile about how we know our students know, and how we know what they know. Really what we're doing is advanced argumentation, or even evidentiary reasoning, to make the case about where we think our institution has added value to students in their learning. Thinking about it from that more argument-based stance is really a way to think about how we communicate the structure and the design of why we ask our students to participate in things, because we believe it leads them to be the examples of the learning that we really want to see. So: less process, less measure, and more, how are we really putting these pieces together to make arguments about our value as institutions?
Allison Case (04:40):
I love that.
Natasha Jankowski (04:40):
Thanks.
Allison Case (04:41):
In some regards, I was talking to the team and I said, this woman speaks in dissertation. So I just can't help but know that there's been so much work that goes into just statements or bullet points like that, because it's so important that we're identifying the real goal, the heart of what we do and why we do it.
Natasha Jankowski (05:01):
I think it's really more the hopes-and-dreams, student-centered version of our assessment that brings us back. There's a reason we put these learning goals out there: because it's what we value, what we desire for the students that are a part of our experience. It's not just this mindless act, or a research act where issues of validity and reliability start to take prominence. This is about people, and about our espoused values and our lived values, and thinking about how we infuse that throughout all of the educational experiences and opportunities that you have in your time with us when you're attending our institutions.
Allison Case (05:39):
Right. So let me ask you this. Why do you think we don't see more humanizing language in learning outcomes, if at the end of the day learning outcomes are to communicate our hopes and dreams and biggest desires for our students?
Natasha Jankowski (05:53):
Well, that's a really interesting question. I would argue there are two different ways that we've ended up where we are, in the removal, or the sanitizing, maybe, of assessment to be this very objective, measurement-only approach. Part of that is the history we have with assessment: the removal of assessment from faculty work. I'll do a brief history lesson, for those not familiar. There was a time period, prior to the 1980s, when writing across the curriculum, which actually started as a faculty-embedded practice of making co-designed rubrics and really thinking about scaffolding and curriculum design, was happening in a major way, in what looks like assessment today. And it was aligned with grading. Feedback was very formative to students, where you'd actually get the feedback and be able to use it, and faculty conversations around students really drove the work on assessment. In the '80s, you have this time period where assessment becomes mandated as a part of accreditation practice, and with that came the question: but how do we know? Faculty voices were seen as very subjective, because of grades. Some grades are about assessments; most aren't. They're about class participation, about student behaviors. Did you show up on time? Did you follow the readings and engage in a class discussion board, or something like that? Part of that might have been the learning outcome that I'm interested in, but it wasn't all of it. So we needed something else. The argument became, we can't trust grades; an A is not the same as an A, and it doesn't mean mastery. "We can't trust the faculty" then became the narrative around that: we need something alternative that's very objective. And you have this entrance of a very measurement-oriented, rational-scientific, post-positivist approach to engaging in assessment, where we could only see learning in pre-post measures.
We need standardized exams, and we need something beyond the faculty's word, and outside of that course, to tell us. And when you do that, what you've done is remove assessment from the purview of the faculty. It becomes an add-on: now are you looking at me and what I'm doing, and you're going to report on it? What are you going to do with this data? Why don't you trust my teaching? I can tell you they're confused, or I know that they got it; how do you want me to document that? So unfortunately that gave us two problems. One: how do we get qualitative data into our conversation, and what kind of evidence do we need to actually make decisions, not prove a point, which are two different things? And two: how do we get faculty to engage in something where we told them we can't trust them, that we need other measures because they can't give us what we need to know? You put people in a situation like that, and now you need an assessment professional to come and oversee this reporting process. And at that juncture, there's no space for hopes and dreams.
Allison Case (08:49):
I'm so glad you shared that, because it really contextualizes, one, our conversation, and two, maybe some of the feelings associated with assessment, both from faculty and from students.
Natasha Jankowski (09:03):
And then you have students who, most of the time, were unaware, because it was a process of checking faculty and checking programs, or programming on the part of student affairs units. It wasn't, how did you, as a student, know where you are in your learning path; that was not even part of the consideration. So at NILOA, we do national surveys on a rotating basis to get a handle on the movement of assessment work over time. And we are seeing a return to embedding assessment into the curriculum, having very authentic assessments in the form of assignments, and really that faculty ownership piece, in partnership and engagement with our students as co-learners, is starting to come back. So I'm very pleased with that return to the roots of assessment.
Allison Case (09:44):
Right? Wow. Your answer just made me realize that perhaps for a while, assessment data was almost seen as only data going out from the classroom to others, so to accreditors, to administrators, to universities, but not used as feedback within the classroom.
Natasha Jankowski (10:03):
That's so true. It also created a two-tiered system of data collection, where you had data being collected at an institution level for reporting out that was not being used to inform programs, and then program-level work going on that wasn't collected and talked about, but that's where improvement was happening. So then you have these tensions of compliance and improvement, which get lost when we think about assessment as just a practice. If it's about "how do I do it," then we're not asking what organizational structures we're creating, or what approaches to where learning happens we're supporting by the structure we create. Yes, we get into the doing, and we don't think about what we're signaling in how we go about doing this work.
Allison Case (10:48):
That's right. So, Natasha, I heard you say that you see this resurgence of looking at students and teachers as co-learners, and getting back to this in-house, in-classroom assessment that communicates to the participants. What do you think is driving that resurgence, and where have you seen that work?
Natasha Jankowski (11:10):
I think, as we look at this work over time, you have an influx of standardized tests, and then it goes back to a very embedded classroom approach. Then you have the Spellings Commission in 2008 that said, you know, we need some sort of standardized measure to really look at this. Institutions picked up standardized tests and then went, oh, we don't really know what to do with this, or how to make change in our curriculum, and we go back. So some of it is that ebb and flow that's natural to assessment practices; we think we know what we need at different points in time. But some of it has really been hard national work on the part of various organizations to bring this back to faculty. AAC&U did the VALUE project, the Valid Assessment of Learning in Undergraduate Education. They brought groups of faculty together to say, let's create rubrics that we collectively design around learning, that we can talk about at an institution level, but where the evidence comes from your courses. So we can start to say, how are we aligning what's going on in your course to really think about how that connects to the degree? Then you also have some national conversations on degree frameworks: what does it even mean to have an associate's degree or a bachelor's degree, and where do we know that's happening in the curriculum or the co-curriculum? What does that even look like? So now you have people going back, trying to map and align, and saying, well, I need to understand how this assignment rolls up and connects to this larger learning and the developmental trajectory we have our students on.
And now you start to move into conversations that we have around, "Well, if we do all this mapping and this alignment, we can actually think about our data in a very different way, and we can categorize that data differently." So how about if we made comprehensive learner records, where instead of listing the courses students took, the record is actually organized by the learning outcomes we're going for, and it's a collection of all the places and spaces in which that learning happened, with the evidence of it there? Which you couldn't do if we hadn't done the mapping and the thinking about rubrics and all of these things. So I think it's a confluence of a variety of mechanisms.
Allison Case (13:16):
Let me ask you this. In your opinion, what distinguishes that type of administrative connecting-the-dots from the type of administrative connecting-the-dots that changed assessment for the worse?
Natasha Jankowski (13:30):
So I think there are a couple of ways to think about it, but I'll home in on two. One is that there's an administrative level of crosswalking, which I want to argue is different from mapping, where I can say as an administrator, we have these institution learning outcomes.
Allison Case (13:46):
The Rosetta Stone.
Natasha Jankowski (13:47):
Yes, exactly, the translation piece needed between them. That has no impact on what's happening in each of those spheres. What it allows me as an institutional leader to see is what's happening across the institution, and to pull in and compile data to say, when we think about leadership as an institution, where are we at? Where is it happening? How are our students doing? But I'm not asking anyone to change their practice.
Allison Case (14:11):
I see, I see, that's the piece: data gathering versus dictating or prescribing behavior.
Natasha Jankowski (14:18):
Yes. And on the mapping side, we try to really argue for it, and we had done some projects looking at the different ways in which institutions, and the faculty and staff within them, went about mapping learning: what worked well and what didn't, where people got into checklist mentalities, and where there was that shift to the sort of communal understanding that we've been talking about. It was very clear that if mapping was seen as an administrative burden, then a department chair would say, all right, I've got this Excel doc, we have these classes, everybody send me what you think your class does, and I'll compile it and turn it in. At that point, we have no idea whether students really feel like they got that from the class, or whether faculty all understand and agree on what we mean when we say critical thinking. I could have said my class does it, but I only talked about it; I don't have an assignment on it. The ways in which we decide to put it on a map could be completely different, and so we have a lot of noise. But where we had departments that got together and said, okay, we have these learning outcomes; what are we talking about when we say critical thinking, and how are you doing it in your class? Are you talking about it, building off of it, or actually assigning something? Are we struggling with the same things with our students, in our program, in our critical thinking? If we have adjuncts teaching these courses, have we told them that that's what this course is doing? Do we all agree that that's the role of this course in our map? Do advisors even know to tell students, hey, these are some of these roles? It actually became a consensus-building exercise for what we believe as a program and what we're all doing. It's a sharing of our practice.
And when we have something where faculty get together and share their practice, now you get into conversations of teaching, pedagogy, and assessment that are embedded in our day-to-day. I can use this, and we can keep coming back to it as programs, having conversations and saying, well, they're struggling with this, and that's these two classes, these two assignments; let's look at that and narrow it down, but also roll back up.
Allison Case (16:22):
Well, and there's buy-in and ownership on behalf of the faculty for what those things mean and what it means to use them. Wow. I shouldn't be surprised to hear you say conversations are an enormous piece of this, but conversations are where a lot of the hard work in academia happens.
Natasha Jankowski (16:40):
Yes.
Allison Case (16:41):
It's just hard to create space for those conversations, and to continue to have them and not get weary. These are deep thoughts and big conversations to have. In our previous conversation, you mentioned the four lenses of assessment. It was so eye-opening to me. I would love it if you could backtrack just a little bit and talk about the four lenses of assessment before we talk about the future of assessment.
Natasha Jankowski (17:05):
Sure, I'm very happy to do that, and we've touched on a couple of them tangentially. We like to talk about four main lenses of how people view assessment: what it is, what it isn't, and what assessment should be doing in higher education. So think about this as, what makes the most sense to you? What sounds like, oh, that's how I think about it? And also think about which sounds like your colleagues; that's very helpful, so that as you engage in some of these conversations and dialogues, you don't talk past each other. The one that we hear most often is the measurement piece: that assessment really is about measuring student learning, and that our conversation should be about finding the most appropriate, valid, reliable measure. If you hear conversations about, well, that N size is too small, so we can't do anything with that data, you're probably talking to a measurement person. In this thinking, the role of assessment is that we have to have a pre-post to prove that learning has happened; we need information on how we did the sample, and all the questions that are around measurement. If that entirely drives the whole focus, that's a place where some people live, and you can only see measurement, at the expense of everything else. The second one that we hear most often is the compliance mentality. This is really that the only reason we engage in assessment is because somebody told us to: accreditation, a state governing body, whatever it might be. Somebody wants us to submit a report, so we're going to submit the report and never think about it again. This is where you'll hear things like "this is divorced from my teaching and learning," "this is an add-on." If you have a system where you submit a bunch of data but you don't know where it goes, that might be a sign of a compliance system. And most of our assessment is there, so don't feel bad if you're thinking, "that's me."
Most of our assessment systems have been compliance ones, and they're doing exactly what we intended. In both of those lenses, nothing is about students. There are none of those hopes, dreams, and values. There's no faculty in them other than, oh, I'm submitting a report, which is not what we want for assessment. So the last two are really where each of those live. The third one that we hear is the teaching and learning lens. That is the one driven by questions faculty have about their practice: is what I'm doing really getting students to where I want them to be? If I change it in this direction, does that help or hinder? How am I thinking about communicating the intent and design meaningfully, and how are we getting together to have those curriculum mapping conversations so we can agree as a program on what we're going for? That improvement process becomes an embedded part of our scholarly, regular conversations and dialogue. So that one is really where faculty live, and also student affairs staff and units that say, if we're offering these programs, are they doing what we intended, and how do we know? What would we look for, and how do we want to align what we're doing to really make sure we get to where we want our students to be? Now you're starting to get into some hopes and dreams. So that's the third one. The last one is student-centered, and that's really where the students now play a role and come in. I would say this is the most emergent of the four, the one we've been hearing about for the shortest time, and I am delighted that it's on the table as a conversation. You have this history of assessment where assessment was done "to" students, not "with" students. And they're like, what do I need to do? Just tell me what I need to do to get an A and pass. Our students mirror our faculty in a compliance mentality. They're like, what do you want to see?
I'll just give that to you, and then I can go about my day. This is an add-on, and I don't understand it. So student-centered assessment really brings the students in to say: hey, you are agents of your learning. You can co-create and co-design. How are you thinking about, and becoming aware of, the learning that's happening in these different spaces? How are we talking about learning meaningfully beyond just the first day on a syllabus, really getting into: when I ask you to do this, it's for these reasons, and you could apply it over there? How are you thinking about this as a student, and where are you in that learning journey? So this is much more about embedded student partnerships, student agency, and really thinking that the focus should be on how we empower students in their learning. They're the ones that do the learning and the demonstrating and the transferring and all of that. So how are they an active, integral part of this work?
Allison Case (21:42):
Wow. If that's not a reason to get up and go to work every day, I really don't know what is. They are these vessels carrying all of this learning to go and do, you know, the next generation of things. So I think you might be touching upon some of these characteristics of value-added assessment just by naming and pointing out and bringing awareness to this skill, in addition to the procedure or the mechanism. Is that right? And if so, will you talk to me a little bit more about value-added assessment versus traditional assessment?
Natasha Jankowski (22:23):
So in value-added assessment, the idea is that there should be value beyond the assessment itself for both our faculty and our students, and by "assessment" there I mean whatever thing I'm asking you to do: an assignment, a test, a debate, whatever. Beyond you turning that in or doing that, there should be some sort of lingering impact, some value added for engaging in it. And that value added is not just, oh, look, you moved up 2% in your competency toward this goal, but that I understand how that developmental sequence of learning over time, and the interconnected nature of my learning, is unfolding. Our students are working; they are living; they are caring for family and learning, surviving, you know, trying to thrive, all at the same time. And a lot of times in our traditional assessment, we act as though we can block all of that out, or create a space we can control, so that the only thing you're doing is education. We look for evidence that counts to us as academia, and not evidence that can be transferable to employers, or to a student, or to a family, or to your life in the moment, what you're experiencing and the hardship you're going through. Then there's this disconnect, this sense that if you aren't just doing education, then you're not doing enough, or you're not doing it right. But most of our students have these very complex, multifaceted, layered lives, and none of that's captured unless we really think about what the value added is of integrating our students, their lived experience, their culture, their background, how they view what counts as evidence, into our assessments. And I don't, as a faculty member, have to figure this out on my own. My students have been in education for years. They have had a long history of assessment, especially those coming from a K-12 system that had that experience. So put it forward: here's this learning outcome I'm going for.
What do you think is evidence of that? Where have you done this in your life? Think about your day: where have you flexed this skill set and this knowledge, and what did it look like? Take some pictures; go out and share who you are as a person in how you're thinking about the evidence base you're building, because the student is the carrier of it. That sense of the student as an integral part, an active participant, the integrating of their lived experience into the ways we think about what counts as evidence of learning, and really thinking about how we communicate all of that work clearly, is definitely part of that value added. And it's not super time-consuming to have those kinds of conversations; really, I think it's liberating for both students and faculty. The only caution I would add is that it won't have the desired impact beyond "that was a really great course," "that was a really great opportunity," or "that's a really cool faculty member" if it's a one-off kind of thing, right? It needs to be a programmatic, institutional decision to say, what is our role in supporting students? We've been putting it on them this whole time: you're not doing enough, you're not doing this. But how are we enabling and creating and moving that, and how are we going to ensure that every time you have an interaction or an opportunity, it is educational, and it is value-added education, not a negative takeaway?
Allison Case (25:51):
I'm so glad you clarified that, because my immediate feeling is invigoration followed by fear. I think it's because the idea of individualizing learning in this way can seem like just too much to an already overwhelmed faculty member. But what I hear you saying is that we're not necessarily talking about individualizing our final exams or our midterms or high-stakes assessment. We are individualizing the day-to-day. We can still have those high-stakes summative check-ins that are for compliance and serve as objective measures. But where we're potentially missing opportunities is in those day-to-day lessons and formative assessments, and in contextualizing what we're teaching so our students have opportunities to individualize. And that's not to say that you can't write a midterm that allows individualization. We're contextualizing in a way that invites relationship and deep learning and connection, day to day, in and out.
Natasha Jankowski (26:56):
Yes. And you can apply it with some of those very traditional assessments. Nursing is a great example of this. There are licensure exams which you need to sit for and pass, and that's not going to change; we can't change that outer system as higher education people. But I can tell you, when you're in a nursing class, the reason I put these four questions on here, and the reason I'm giving this test, is because this is exactly the structure you're going to run into on your licensure exam. So let's engage in that. Helping a student understand the why, what do I do with this, why are you asking me to do this, is so beneficial for motivation and engagement, and that has been proven in research time and time again. If I don't know why you're asking me to do something, can I even trust, as a program, that I'm getting good data on your ability to do it?
Allison Case (27:42):
Wow. Can you give me two specific examples of how you've seen this done?
Natasha Jankowski (27:46):
Sure. There's actually a whole project for those interested in this, called the Transparency in Learning and Teaching project, TILT, that got a National Science Foundation grant, I think, to look at what happens to our students and their learning if we get clear in our prompts about that why. What they did is ask faculty to take a look at their assignment prompts. So when I give you an assignment to do, take a look and see: how clear is the task? What is it that I'm actually asking you to do? Then there's the why: what's the purpose? So you've got the task and the purpose. It's, why am I taking this exam? And that can be a sentence that says: the purpose of this is to help prepare you for taking your NCLEX exam; the questions are designed to feel similar; and the reason we're doing two hours is so that it mirrors the structure, so that you feel like you're getting prepared to sit for your professional licensure exam. And then the criteria is, how am I going to judge if you've done a good job? So it's task, purpose, criteria. That criteria could say: in practice, you have to have a certain pass rate to be able to get certified, so we're going to mirror that pass rate for this test, and that's why I've chosen this, to be part of that mirror. But now you know what you're going for; that's your goal in that space. And just that three-part chunk in an assignment prompt, the task, the purpose, and the criteria by which you're going to be judged as successful, was able to close equity gaps for students, because they actually understood what they were being asked to do, why they were being asked to do it, and what the goal was that they were going for.
Allison Case (29:16):
What are some building blocks to begin transforming our assessments into ones that bring life, humanize our content, engage students, and make a deep, lasting impact, so that when they leave college, they can say, I know what I learned?
Natasha Jankowski (29:35):
So I think some of it is thinking back, individually: what was the most impactful learning you can remember, and what about it was impactful? I believe in sneaky professional development, where you walk away from something going, "Oh my gosh, I just learned all these things, and I didn't know that's what I came for." So while I was just advocating for transparency and clarity, I also think sneaky professional development has a role to play. But I do think there is a needed first step of internal reflection and conversation. What am I personally frustrated by? Is it that I keep changing some assignment or adding things to my syllabus, but I'm not seeing what I want to see from my students? Is it that I keep giving them the same feedback every year? What is it that's frustrating to me? And then connecting that frustration to thinking about what that meaningful, impactful learning experience was, and starting to do some of that self-work and thinking it through. The other thing that goes with that is thinking about who my collaborative teams are that maybe I'm not thinking of, because I have been doing this alone in my pocket and not going out. We had done assignment charrettes, which are like fancy assignment-design conversations. It sounds way cooler; it's a term from architecture education. When you're invited to a charrette, you're like, all right, check it out. But we got faculty together to peer-review each other's assignments, around a structured group of questions, and give each other pedagogy and assignment feedback, and people have been able to do it remotely as well. We did one with adjunct faculty, and we did one with students. With the students, we gave them: here's the learning outcome; what types of assignments do you think would meet this? And the projects they came up with that they wanted to do were just amazing.
And the adjunct faculty would say, well, here, I brought three assignments that I give at each of these different institutions, and I can tell you which one actually works for my students, but I have no ability to feed that information to someone else. So think about who my team is to help me rethink what we need to do. What's the role of adjuncts? What's the role of students? What's the role of our advisors? Because a lot of times we put this on faculty, faculty, faculty, faculty, and as we were saying, this is not something that you can solve alone. If we rethink who our team is, we might get some really different, innovative ideas on how to solve it, and also have a much more coherent communication to our students, because we've gotten people on board differently. And then, for sneaky professional development, it's creating opportunities and spaces to have the conversations we were having and are having now that don't have "assessment" in the title, or "learn how to submit your report," or "how to engage with your LMS in a more meaningful way," but instead are those big questions that make you go, oh, where are they going with this? What is that about? Those big teaching questions get faculty attention.
Allison Case (32:45):
Something provocative.
Natasha Jankowski (32:46):
Yes. And then invite people into dialogue and engage in that, without the stance of "I'm going to tell you how to do it." Those are really the building blocks from an institutional view. From a faculty view, I believe this is a learning and growth opportunity for the faculty as well, so you don't have to plan for everything, and that's okay. It's important to remember that assessment is a discipline and a scholarship. It's not a fad. It's not an administrative burden. There are debates, there's literature, there are conferences, there are associations, there are journals written about assessment. No one is expecting you to be an assessment expert, but there are assessment experts, and if you want to engage in conversation, they are dying to talk to you. Yes, they're ready. There's a huge literature base from which you can pull to answer these questions and engage with them. It's not as though the field just started, with everyone asking, oh, how do we get out of our own way?
Allison Case (33:52):
But eventually we knew how to do it well, and my goodness, if NILOA hasn't shown us that that's the case; there's just so much information available there. That's wonderful. Well, Natasha, thank you so, so much for your time today. I have a hundred more questions for you, and I think we could do a hundred more episodes, but I just love what you've shared with us today, because it's such a relatable message, and when you hear truth, it rings. So many of the feelings and frustrations that faculty and students are having, and so many of the hopes and dreams of students and faculty alike, I think can be addressed with these simple fixes. And I think these types of assessments, value-added assessments, align with who we are and why we got into education in the first place: to make an impact. I am so excited for our listeners to start down this path, or continue down this path. So thank you so, so much for your insight and your time. It was just such a pleasure.
Natasha Jankowski:
No, thank you so much for having me. I'm delighted to share in our conversational journey, so thank you so much for the discussion.
Allison Case (35:09):
That's it for season two of Pedagogo. We hope you've enjoyed this season dedicated to equity and enablement in higher education. Time flies when you're having fun, and this season has been a blast. I've learned so much from hosting this season. We learned about the importance of understanding our students' cultures; how to build a sustainable online classroom that offers an equitable experience to students on the ground; the role of trauma-informed pedagogy in inclusive assessment and deep learning; and the principles and importance of enablement-minded leadership in academia. And today, Natasha closed us out discussing the future of equitable and enablement-minded assessment. I've loved every minute of bringing you new thoughts and ideas from subject matter experts and peers in a way that's understandable and applicable to your lives today. I've grown personally and professionally from my interactions with my brilliant guests, and I count it an honor to have shared time with them, and with each of you. Stay tuned for season three of Pedagogo, coming soon. While I don't want to ruin the surprise, next season Pedagogo will have a brand-new host bringing you whatever's hot and happening in education. Be sure to tune in to hear your new host and to discuss new topics and trends. Allison Case, signing off. Education nation!
Outro (36:19):
Pedagogo is brought to you by ExamSoft, the assessment software that keeps security and integrity in your exams while providing you actionable data for your outcomes. For all of the toughest testing challenges, ExamSoft has you covered.
Speaker 2 (36:35):
This podcast was produced by Allison Case and the ExamSoft team, with audio engineering and editing by Carsten and the productions crew, including me. This podcast is intended as a public service for entertainment and educational purposes only and is not a legal interpretation nor a statement of ExamSoft policy, products, or services. The views and opinions expressed by the hosts or guests of this show are their own and do not necessarily reflect the views of ExamSoft or any of its officials, nor does any appearance on this program imply an endorsement of them or any entity they represent. Additionally, reference to any specific product, service, or entity does not constitute an endorsement or recommendation by ExamSoft. This podcast is the property of ExamSoft Worldwide and is protected under U.S. and international copyright and trademark laws. No other use, including without limitation reproduction, retransmission, or editing of this podcast, may be made without the prior written permission of ExamSoft.