Evaluating transfer of learning and mindsets is difficult even for the best in assessment. This month, we speak with Saadeddine (Saad) Shehab of the University of Illinois about his goals to understand how workshops using human-centered design transfer into skills and experiences later in and beyond students’ academic careers. The panel discusses the challenges of this type of assessment and considerations for both near- and far-term transfer.
See our full episode notes at https://www.centerforengagedlearning.org/transferring-mindsets-from-design-based-environments/
Dhvani Toprani hosts this month’s episode with her longtime colleague, Saadeddine (Saad) Shehab from the University of Illinois’ Siebel Center for Design. As the Associate Director of Assessment and Research for the Siebel Center, Saad wants to identify strategies for near- and far-term assessment of the transfer of learning and mindsets for participants in their center’s workshops. Our panel included Sarah Bunnell, Director of Elon University’s Center for the Advancement of Teaching and Learning; Adam Kanowitz, a senior Entrepreneurship and Innovation student at Elon University; and Dawan Stanford, President and Founder of Fluid Hive, a consultancy for using human-centered design for innovation. The panel discusses the challenges associated with measuring transfer and offers some ways to align assessment with goals.
This episode was hosted by Dhvani Toprani, edited by Matt Wittstein, and produced by Matt Wittstein in collaboration with the Elon University Center for Engaged Learning.
Limed: Teaching with a Twist
Season 3, Episode 3 – Transferring Mindsets From Design-Based Environments
Matt Wittstein (00:00:11):
You are listening to Limed: Teaching with a Twist, a podcast that plays with pedagogy. Before we get into this fabulous episode, I want you to know that we are actively seeking guests for the podcast who have unique teaching challenges or opportunities that they would like to explore. You can email me directly at mwittstein@elon.edu, that's mwittstein@elon.edu, or find more information on our website, www.centerforengagedlearning.org/podcasts. This month, Dhvani Toprani hosts one of her longtime thought partners from the design thinking space, Saad Shehab of the University of Illinois at Urbana-Champaign. Saad is in charge of assessment for their Siebel Center for Design and brings questions about transferring knowledge and mindsets in both the near and far term. Our experts include Sarah Bunnell, director of Elon University's Center for the Advancement of Teaching and Learning; Adam Kanowitz, an Entrepreneurship and Project Management major at Elon University and research lead for the Center for Design Thinking; and Dawan Stanford, president of Fluid Hive, a design studio that helps people think and solve like a designer. The conversation discusses some of the challenges of, and proposes some solutions for, assessing learning from design thinking and human-centered learning programs. Enjoy the episode. Here's Dhvani Toprani.
Dhvani Toprani (00:01:59):
Welcome to the podcast. We are so excited to have you with us.
Saad Shehab (00:02:03):
Thank you. Thank you so much, Dhvani. I'm very excited to be with you too.
Dhvani Toprani (00:02:07):
So Saad, tell us a little bit about your teaching and learning context and what is your primary role as the associate director of assessment and research at the Siebel Center for Design?
Saad Shehab (00:02:17):
As mentioned, my name is Saad Shehab. I am the Associate Director of Research and Assessment at the Siebel Center for Design. So basically myself and my team are responsible for running the research studies and providing the research support that goes into our offerings at the Siebel Center for Design. And by offerings I mean the courses, the workshops, the seminars, any learning experiences that we design at the center to help our students and our instructors learn more about design thinking and human-centered design and their possible applications in problem solving.
Dhvani Toprani (00:02:57):
That sounds very interesting, so you're trying to teach and practice design thinking as a philosophy in your work. So what challenges are you grappling with in this process?
Saad Shehab (00:03:09):
Teaching with, about, and through human-centered design and design thinking is not easy at all. I mean, there's a lot of literature that recognizes and points to the applications of design thinking in education and its role in solving educational problems of practice, but it does not translate as easily as it sounds whenever you decide to actually use design thinking in your work as an educator or a teacher. And you have options here. So you can teach with human-centered design and design thinking, meaning you yourself are going to use design thinking to do the teaching work, to design a course, to solve a certain educational problem of practice. You can teach about design thinking as a problem-solving approach, and this is where you need to be a believer in design thinking and see it as a problem-solving approach that can really help the students make progress towards solving an authentic, real-life problem.
(00:04:13):
And you can teach through human-centered design, meaning that you see human-centered design as an engine or as a vehicle that can help the students learn the applications of a certain concept or content or subject while engaging in the human-centered design process. I think even more important than all of this is the mindsets that are associated with human-centered design. So when you engage in a human-centered design experience, you are going to be applying a lot of human-centered design processes such as empathy or brainstorming or prototyping. These are super famous, right? But as you engage in these processes, we have evidence from our own research and from existing research that shows that individuals who engage in human-centered design tend to develop important mindsets like collaboration, communication, creativity, human-centeredness, and experimentation. And all of these align with a trend in education, which is competency-based or skills-based education.
(00:05:22):
And this has been a hot topic over the past 10 years: we also need to teach students those skills and mindsets, which are very relevant to any workplace after graduation. This is the key challenge right here, which is: how can we assess that students are actually developing these mindsets as they engage in a human-centered design and design thinking learning experience? Not only that, the other question or challenge will be: we know from research that these mindsets take a long time to develop, and one learning experience is not enough, or maybe two learning experiences are not enough. So the question becomes, can we track the development of these mindsets over time? Right? Can we assess or capture the transfer of these mindsets into new situations? Because if we're claiming that we need to teach our students these mindsets during their undergraduate or graduate level, four years, let me say, of higher education, will they be able to take these and actually apply them when they take a job elsewhere after they leave?
Dhvani Toprani (00:06:33):
So having worked with you, Saad, I know a little bit that you work both with students to teach things and with faculty members to help them teach things. When you explain this problem context, who are your primary learners, and how do they differ if your primary learners are different audiences?
Saad Shehab (00:06:52):
So it really differs, because when you're talking to educators slash instructors slash faculty members, you are trying to help them really understand, first of all, how to teach with, about, and through human-centered design. And there is an educators' workshop series that I lead every semester, which is composed of five virtual one-hour workshops that are taught to educators to help them first understand what design thinking and human-centered design are. And then second, how they can, for example, create a design challenge that is going to engage their students in design thinking and human-centered design. And there is actually one workshop that is specifically focused on facilitating students' engagement in those design challenges and then assessing students in those challenges. And one of the things that we focus on a lot in that workshop is the assessment layer and how it is very tied to explicit learning objectives that you need to write around design thinking and human-centered design.
(00:07:58):
It's not enough just to say that I'm going to be using design thinking as an approach to teach this course. You need to sit down and write a set of objectives that are associated with this. And these objectives can range from students learning a specific process related to human-centered design to students starting to develop one or more of the key mindsets that are associated with human-centered design. And then this is where you need to make a decision on what assessment tool you are going to use in order to capture students' development or students' learning during that experience. So in this case, the focus is more on how to assess and the process of it. When we teach students about human-centered design, though, we use some assessment tools ourselves. And given that I come from a research background and I'm a researcher by training, many of the tools that I use are more on the pre-post survey side of things.
(00:09:01):
Sometimes, if I'm very interested in measuring that near-transfer piece, which is: did the students really learn anything from this experience that will allow them to think using these things? This is where we usually give a performance task at the end and ask students to submit a written document, or sometimes a video or any form of artifact, that can showcase whether they are really going to use any of the learnings in answering this prompt or performance task. So that's basically what I do in both teaching domains, for the instructors and educators and for students.
Dhvani Toprani (00:09:41):
What you're doing is very layered and nuanced, Saad. So what I'm thinking is, what kind of assessment has been most successful in capturing learning so far for you? I heard you say pre-post testing. So are there any limitations or barriers that you have encountered while using that kind of assessment, and what more are you looking for in this assessment?
Saad Shehab (00:10:05):
From my position as the associate director of research and assessment, one of the things that I care a lot about is the generation of an impact report, because at the end of the day, anything that our academic programs team offers to the students in terms of courses or workshops or seminars, we need to be able to at least capture the impact of that learning experience on some learning. And as I mentioned, my team provides the frameworks and the assessment tools and the literature that support any learning experience that our academic programs team designs, develops, and implements. So with that being said, and to be able to produce the impact report, my team and I came up with a very nice template, or at least structure, for pre-post surveys, where we have of course a section that allows us to collect some information about the students' demographics, a little bit about where they're coming from, their major, all of that stuff.
(00:11:09):
And then there is a layer or a section that ties to some learning outcomes that we believe we can measure, or that we are interested in measuring at this moment in time, like, let's say, their understanding of design thinking or human-centered design. This means this section needs a question such as: define design thinking, or what do you think design thinking is? What processes do you think design thinking entails? And then if, for example, we care about mindset development, this is where we have found in the literature some valid and reliable instruments, like Likert-scale instruments and things like this, that we can include in this section. And then the last section, the third section, will be to collect some feedback on the experience itself, because this will help us and the academic programs team iterate on the experience and make the changes that people are recommending.
(00:12:03):
So with that being said, these three sections that we include in our pre-post tests at least help us generate an impact report that starts with describing the experience and how it was developed, followed by the results that we get from running the pre-post. And that's where your first challenge comes in, which is: you need some people to help you do this, number one. Number two, you need the students to complete this. And as you know, surveys are very hard to get completed, especially when you send them to students after or before the workshop. Even if you build in time for the students to actually do the surveys as part of the learning experience, it's sometimes hard to get done. And then another challenge, which is key: to really capture the impact on the learning outcome, your open-ended questions are the best bet, because when students self-report, they sometimes report high on the post-test because they want to make you feel good, or they don't want to be rude, or they want to make you feel that, oh yeah, it was the best experience in the world.
(00:13:15):
But when you ask them an open-ended question like define design thinking, before and after, that's where you are able to capture: oh, are they using more design thinking language or terminology in the post compared to the pre, et cetera. But as you know, Dhvani, and I know you are a researcher yourself, this requires coding and manual labor that sometimes takes hours, and sometimes there are some offerings here at SCD where more than 150 students are engaged in the learning experience. So how much time is it going to take you to take each one of those responses, analyze it, and do a pre-post comparison? So that's definitely a major challenge. These are, I would say, a summary of the challenges that come in that first layer of: can we assess or measure? My answer is yes, but it's challenging. In light of those challenges, the bigger challenges are: can we track those students if they take other experiences, and can we track the development, if it's happening, per experience, and how they are building on the learning over time?
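To make the pre-post comparison Saad describes concrete, here is a minimal sketch in Python of how the manual coding of open-ended "define design thinking" responses might be partially automated with simple keyword matching. The term list and sample responses are hypothetical placeholders, not Saad's actual instrument; a real study would use a validated coding scheme with human review rather than raw keyword counts.

```python
# Minimal sketch: comparing design-thinking terminology in open-ended
# pre/post responses. DESIGN_TERMS and the sample data below are
# hypothetical placeholders, not a validated coding scheme.

DESIGN_TERMS = {
    "empathy", "empathize", "prototype", "ideate", "brainstorm",
    "iterate", "human-centered", "user", "test",
}

def term_count(response: str) -> int:
    """Count distinct design-thinking terms appearing in a response.

    Simple substring matching: good enough for a sketch, though it
    will produce some false positives (e.g., "test" inside "contest").
    """
    text = response.lower()
    return sum(1 for term in DESIGN_TERMS if term in text)

def mean_gain(pairs: list[tuple[str, str]]) -> float:
    """Average pre-to-post change in distinct terms per student."""
    gains = [term_count(post) - term_count(pre) for pre, post in pairs]
    return sum(gains) / len(gains)

if __name__ == "__main__":
    # Toy data standing in for the 150+ responses Saad mentions.
    pairs = [
        ("Design thinking is about being creative.",
         "Design thinking is a human-centered process where you empathize "
         "with users, brainstorm ideas, prototype, and test."),
        ("It's a way to solve problems.",
         "You iterate on prototypes after testing with users."),
    ]
    print(f"Mean gain in distinct design-thinking terms: {mean_gain(pairs):.2f}")
```

A screen like this would only flag candidate responses for closer reading; Saad's later point about triangulation still applies, since keyword presence alone cannot confirm understanding.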
Dhvani Toprani (00:14:24):
So I'm curious to know, for how long are you able to engage with your learners? Are these students who are with you for four years, and do you continue to have connections with them so that you can go back and run some assessments, or is it just the four-year period when they're in school that you have access to them? If you're thinking of designing long-term assessment, how long do you have access to these learners?
Saad Shehab (00:14:50):
That's a great question. So it depends on the length of the program that the students are engaged in, right? Because if they come from, let's say, an engineering program and they are undergraduates, we're expecting to have at least access to them. And by access I mean via email, because usually when they take any learning experience with the Siebel Center for Design, even if it's a short one-hour workshop, we collect their email addresses and ways to contact them via the pretest. So I'm not worried about this part. But going to your question, if the student is an engineering student who is in their first year, then this means in the next four years we can definitely talk to the students somehow, somewhere, or reach them in order to do any kind of longitudinal study with them. If they are more in the humanities or social sciences programs, maybe it's three years.
(00:15:44):
So I assume that between three and five years, depending on the program, that's how much access we have to them. Now the challenging part, let me add, is when the students leave, because this is where it's so hard to track them. I don't know about your university, but at our university, after six months of them leaving the university, their email account at least stops working. We can't use the same email, we don't know where they are, and we don't know if they have allowed the university to forward any email through that address to their new address. I don't know what students do, but this is where it becomes very hard to track them or get in touch with them. And that's why the far-transfer piece, as I like to refer to it, is super hard to capture. And by far transfer, I mean: what did you do with the learnings that you got from our courses or workshops beyond the university?
Dhvani Toprani (00:16:43):
What value does a long-term assessment add to the work that you're doing?
Saad Shehab (00:16:48):
I think this is a wonderful question. I think it adds a lot of value because it validates this entire claim that these mindsets and processes are useful beyond any problem you solve at the moment or content you learn them in. So it's very important, and I think we have succeeded, I would say, in doing a study or two where we were able to follow the students eight months or even one year after they took a 16-week course with us that integrated design thinking and human-centered design into what they were doing. The results were very interesting, of course, but I can tell you easily that not a lot was recalled from what they had taken with us. And I'm talking about the 16-week course, which is a pretty good amount of time. But we know, again, that these mindsets and processes need a lot of time to develop. It was nice to capture the data, though. It was nice to know at least what they recalled, what still exists, what they are still trying to learn more about and apply, versus "I don't know what you're talking about."
Dhvani Toprani (00:18:02):
This is a very fascinating problem, Saad. So it sounds like you are looking for an approach to designing assessment for learning experiences that are invisible, one that aims to focus on mindset shifts for learners. Isolating the impact of the design learning also seems to be an ongoing challenge that you're grappling with in this process. And finally, an assessment approach that is not very laborious would be an ideal one for you, based on what you have said so far. I think that's a pretty fascinating problem, and I'm very excited to see what our panelists have to say about this. So thank you so much, Saad, for being with us today.
Saad Shehab (00:18:45):
Absolutely. I am excited to learn what the panelists think. And as I mentioned, I love the framing of all this within that trend of competency-based education and trying to step a little bit away from content-based education, because as we all know, I think the content is very accessible to everybody these days, and it keeps getting more and more accessible. But what really matters at the end of the day are the mindsets and the skills that you have to solve a problem. And problems get solved on a daily basis; they can be personal, they can be work-related. So the more you are equipped with these mindsets, the better you are going to excel at what you do. But I really don't believe that we have good research-supported tools, let me say, or at least frameworks, that can inform how we do this assessment, how we track over time, and how we track beyond the university experience. So I'm really, really very excited that I was able to communicate this with you, and I'm really looking forward to hearing from the panel.
Dhvani Toprani (00:20:00):
Welcome, panelists. I'm really excited to get this conversation started. Let's go around the room to introduce ourselves to our listeners. And as we do that, think about a time as a learner: what does it look and feel like when you are applying what you have learned in one context to a completely different context? Are you usually aware that you transferred learning between contexts?
Adam Kanowitz (00:20:24):
Hi listeners, my name is Adam Kanowitz. I'm a fourth-year student at Elon University, where I study project management and entrepreneurship slash innovation. I work at Elon University's Center for Design Thinking, where I'm part of senior leadership, and I also serve in workshop development, where I edit the content and workshops we're giving to students. I would say, as I am moving on out of college, the contexts in which I apply things are becoming much more common, but oftentimes it is kind of an unconscious feeling. I think the application tends to be pretty unconscious, and I recognize it after the fact or in the middle of the moment. I was having a conversation with one of my close friends about the workplace, about how it can be better at communicating with leadership, and I realized one of the styles of communication I was talking about was actually something I picked up over the summer through one of my internships. And I said, wow, this is actually a useful tool that I figured was pretty cool when I learned it, but I didn't think it would be applicable. I find myself in those situations pretty frequently, I'd say.
Sarah Bunnell (00:21:36):
I really appreciate that, Adam. My name's Sarah Bunnell. I'm the director of the Center for the Advancement of Teaching and Learning at Elon University and an associate professor of psychology. And the question, Dhvani, about what it looks like and feels like feels really important, because there is this affective moment when you have what Adam notes as a realization that transfer has actually happened. I think most of the time when that happens, it's quite implicit. We're not aware of the transference occurring, and when we are stretched to make those large leaps of transfer, it often leads to, at least for me, an aha moment of: wow, there was relevance and connection that I could draw on in this particular circumstance. I'm thinking of myself as a young learner, when I had first learned about geometric principles and ratios, and then we were working in the kitchen trying to figure out the height that a particular shelf should be based off of a picture, and I could use the ratio of the shelf in the image to calculate where the shelf should be located in the kitchen itself. And I just felt like the most awesome, competent learner in that moment, and so appreciative that I had some skills that I could apply to that situation.
Dawan Stanford (00:23:03):
Sarah, that kind of reminds me of my early days of formally learning about design thinking and human-centered design. My name's Dawan Stanford. I am the president of Fluid Hive, and I also had the wonderful privilege and experience of spending a couple of years at Elon launching the Center for Design Thinking. And I think back to when I started. I was doing design in terms of building websites and other things, but it was when I started to get into, oh, wait a minute, what is the literature saying? How are people teaching this? And launching my business. I started to find moments where I couldn't shut off my eyes. I would look at a door and say, oh wait, that door has been designed poorly. I would see a sticky note on the back of a monitor. I would see software that had been designed poorly, or a workflow challenge. I found myself asking the questions of design thinking in other contexts when I was just trying to figure things out. And sometimes I would realize I was doing it, and sometimes someone around me would be looking like, what are you doing? And you're like, oh yeah, I'm applying this over here, and they weren't expecting that in this context.
Dhvani Toprani (00:24:30):
Wonderful. So let's take that feeling and play with it to brainstorm Saad's problem. Saad is the guest caller for this episode, and he is the associate director of assessment and research at the Siebel Center for Design at the University of Illinois at Urbana-Champaign. Their center teaches with and about human-centered design, or HCD, to faculty, staff, and students as a problem-solving approach. Research from Saad's work and the literature at large has established that HCD can help learners develop skills like communication, collaboration, and creativity, and promote a growth mindset, which are as important as domain knowledge for learners to succeed in life. Now, although his learners are usually a mix of faculty, staff, and students, he wants to explore this assessment challenge specifically for students, but of course welcomes ideas for faculty and staff as well. Given that students learn by engaging with HCD, and that learning is very implicit and long-term, Saad wants to uncover ways in which he can assess and track the development of what he calls the design mindset.
(00:25:46):
He also wants to develop an assessment approach that can track the transference of the skills that students have learned while engaging with HCD into other contexts. He has so far successfully been using pre- and post-surveys to test for near transference. What I mean by near transference is transference of learning, let's say, within the period of a semester or so. And he's also used artifact-based reflection to check for learning, but he wants to push the envelope and dive a little bit deeper. He would love it if the assessment were not too laborious to implement and went beyond self-reports of learning. So I guess my first question to all of you is: how do we design an effective assessment to measure long-term transference of learning?
Adam Kanowitz (00:26:37):
So I think one of the biggest challenges that we see with this, number one, with long-term learning, is the time. Students have a typically limited time that they're at an institution, and oftentimes if you want to compile more research, you have to go through the IRB process and check in with them. We at the Center for Design Thinking at Elon have run student focus groups for several of our consulting projects, and it's always a big difficulty. So I think establishing a timeline is very, very important. We see often, in terms of classes, that you're able to do a pre- and post-survey, but actually following up once they're out of the class can be difficult. I mean, you would then run into barriers of self-selection; those who continue to follow up tend to be the more interested students. So I think creating an effective timeline for assessment projects is really important, ensuring that it stays within what's feasible for students. In terms of long-term, down-the-line assessments, that's really difficult, so maybe even provide some sort of incentive for students to join these surveys.
Sarah Bunnell (00:27:50):
I appreciate those ideas, Adam, in terms of thinking about how you create conditions of meaningful participation around the survey. And I'm struck that there's a large body of literature on self-report of learning, or self-report around expertise, that in a nutshell suggests that as humans we are fairly good at reporting on what we did. So if we ask individuals, what are the steps that you took to accomplish a particular task, in this case, what are the design-centered principles that you applied to a particular task? We are less good at explaining how we did it, so what were the decision-making steps that we went through. And we are really bad, and this is all folks, not just students, these are human-centered problems, at explaining why we did something. And so regardless of the timeline or the incentives, and Adam has pointed to some really important logistical pieces to think through, I'd also think about: are we asking questions that are positioning students to provide the information that they are actually able to contribute? And then think about what other streams of information, beyond self-report, would give you some markers of the developmental skills over time that you care about, which students themselves are less well positioned to provide.
Dawan Stanford (00:29:21):
Yeah, Sarah and Adam, I would also build on that. There's the survey that gives you some kinds of information, but there's also that qualitative storytelling and pulling in that qualitative data. So one of the things that I look for, if I'm trying to figure out where someone is in their skill level or what skills they're actually using, is to say, hey, tell me a story about the last time you faced some sort of challenge or problem, some context where they likely would've applied the skills. And then, just having them do most of the talking to get that story out, I can really hear in it: oh, wait a minute, was this person really building the problem understanding and doing the research, or did they just jump right into making stuff? What skills were they using? So it's still a self-report, but in some ways the information connected to evidence of transfer comes out in the story, whereas it might not come out the same way with the direct questions in a survey.
Dhvani Toprani (00:30:38):
Your tell-me-a-story approach was very fascinating, Dawan. So it sounds like self-report would work for some, but not all, kinds of stories or data that we want to gather. So what are other alternative assessment types or approaches that you would implement for capturing long-term learning?
Sarah Bunnell (00:31:00):
So as someone who does faculty development, I would want to be in conversation with Saad around what kind of learning is meaningful to them, and then backward design any assessment plan, using that as the directive for how we think about what other sources of evidence, as Dawan is saying, could help articulate some sense of what kind of learning we're seeing. I think this is a very common challenge that we see when we think about assessment: often the kind of learning that we're trying to develop in partnership with our learners is not going to emerge in its full force over the course of 14 weeks. And so often when I'm working with departments or programs, they will express a sense of real disappointment, or a sense of inadequacy of their current metrics for doing departmental or curricular assessment, because of the limited temporal nature of the data.
(00:32:04):
And as we move further and further afield from the time that we're working most closely with students, the relative impact that we have, compared to the huge range of additional variables and people and experiences that they're encountering, just continues to expand. So think about what feels aligned with what you want to know, in terms of what are the core skills, intellectual moves, values, and ways of being that you're seeking to foster in students through this program, and where might you look for those over time. But also keep a humble scope, to not overly claim that any one particular program is the full force of a student's intellectual journey; we are one piece in a large constellation of learning experiences, and our assessment approaches should reflect that.
Adam Kanowitz (00:33:03):
Yeah, Sarah, adding onto that and our previous conversation about what learning feels like: we often don't realize when these valuable practices, these valuable lessons we've learned through either courses or, in our case, workshops, pop up until later down the line. And we have had that trickle-in effect where other experiences also add to it, but I think that's natural and should actually be valued, seeing how having one short-term experience, whether it's a class or a workshop, can then impact how you view other classes. I think that looking at a holistic end goal is an important part of design thinking, especially, like Dawan mentioned, looking at the doors, at your daily life. Something that's very interesting to me is actually decorating my apartment now that I'm moving into a nice house for my senior year. I've noticed that the way I approach my own living space is very, very design oriented, and I didn't think about that.
(00:34:03):
That is, I have experience in woodworking, right? So I didn't think, okay, my taste in decor is coming from a design experience; I'm just looking for nice wooden stuff. But I start to notice, oh, wait a minute, this furniture is actually really well designed. I like it not only because of its appearance but because of its function. That's a core aspect of design thinking. So I think at the end of the line, maybe in senior capstones or before students graduate, it may be effective to look at their holistic experience and see if that design thinking experience pushed them towards the other experiences that they added.
Dawan Stanford (00:34:45):
One of the things I often say early in innovation or design thinking or service design training is: it's the process I can teach. That's kind of the very generic level; you can use this one or that one, they all have more or less the same stuff in them. And then you move down from that to methods, and design methods hold together fairly well, in the sense that once you understand a design method, people have similar language for talking about it. Those are the things I can teach. The mindset part that connects to design, I remind people, I cannot teach that. You get there by practicing. And one of the questions around long-term assessment is: are you designing the course so that people can practice to the degree that far transfer is possible, but also so that you have a stronger chance, between the things in your assessment and the influences throughout a learning path that Sarah was talking about, that there's enough there to build on and tease out and play with later?
Sarah Bunnell (00:36:04):
Dawan, I want to come back to this idea of you can't teach it, but you learn it through practice because it reveals or it indicates that there are some core threshold concepts of really transformative understanding that have to happen through that process. It reminds me of a conversation that I had with one of my geology colleagues. He was saying, how do I teach students to make observations in the field? The way I was taught to make observations in the field was to make observations in the field. And that feels helpful and also incredibly not helpful as a pedagogical approach because it's not a neutral experience. What should I be observing? How should I feel in this moment? When do I know that practice is sufficient to lead me to a transformative threshold concept where I am stepping through into a new way of seeing the world?
(00:37:07):
Dawan, you said that you can no longer just look with your eyes closed; your eyes are always open. And that's what we're trying to do in all of our educational experiences. As a psychologist, I want students to see psychological principles and ideas everywhere they go. That doesn't mean that I want them all to be psychologists. In fact, it would be good if they were not; I want them to be all kinds of things. But I want the transference to be so powerful that they can't turn that off. And so I'm wondering, for you all, as you're thinking about developing design thinking ways of thinking: when is enough enough? How do you know when they've stepped through that threshold?
Dawan Stanford (00:37:57):
I can point to some signs. Pretty much everyone who has learned with me eventually gets tired of hearing me ask one question, and that is: what problem are you trying to solve? It's a foundational thing, knowing that in many ways finding the problem is more important than any of the stuff people associate with design and innovation and service design, the making and the prototyping and the creating of ideas. All of that has its basis in a deep understanding of the problem you are trying to solve. And so I start to see people pull back from trying to make stuff because they don't understand the problem, or say, well, wait a minute, I'm not sure we've talked to enough people and done that gathering of insights. And so one of the things in my work is I set up everything as questions, because I found with design that it was much easier for people to start asking those kinds of questions and then back into the, oh, here's why, and here's how this works, and here's how it connects to the design jargon. But what problem are you trying to solve?
(00:39:29):
What's good evidence for your understanding of that problem? And creating ideas: how do you generate ideas based on that evidence? And so on through a process. And so one aspect of doing longer-term assessment might be looking for the kinds of questions people are asking themselves in different contexts when faced with a challenge, when trying to understand it, and connecting those to design skills and to the things that someone functioning at an expert level in that domain would ask intuitively.
Adam Kanowitz (00:40:10):
So I'm thinking of how to frame this. I very much agree with Dawan; that conversation, that the problem is more important than how you get there, comes up all the time. My coworkers and I were actually just discussing it earlier; one of them says it at every workshop. I think that at its core, design thinking, human-centered design, creative problem solving, can best be seen as a form of effective thoughtfulness. And for me, the way I look at it, not only in people after our workshops but especially in the catalysts, my coworkers who I'm training, is: can they approach problems with that sort of thoughtfulness and thoroughness that we don't typically see? That is, in my opinion, what really, really sets design thinking apart from many other approaches to problem solving in life: the thoroughness with which people need to find their answers or their solutions or what they're working towards.
(00:41:17):
And I think that comes very deeply from the root of human-centered design, which is people. When we're trying to solve people's problems, we have to be thorough. We have to understand them, because people are complicated. So seeing if somebody is able to, not necessarily call it design thinking, not necessarily explicitly say "I am a designer," but dive deep to that level of thoughtfulness, and not even just the specific type of questions but how many questions they ask. If I see them becoming a thoughtful thinker, that is how I know that design thinking or human-centered design has made a big impact on their future career or where they're going in life or their way of being.
Dhvani Toprani (00:42:00):
Okay, so this was very fascinating. It sounds like there is a lot of reflection and meaning making involved in the process of learning HCD, given how layered and complex this is. So doing reflection is one thing, but assessing reflection has its own challenges, because now you're talking about subjective experiences, and it can mean different things to different people. So I wonder, what strategy do you use as a researcher or instructor to assess such subjective experiences, whose value is enhanced by their subjectivity?
Dawan Stanford (00:42:34):
So the way I would approach it, if I were doing the tell-me-a-story kind of conversation, is that I pull a lot of that from the work of Indi Young, who is a fantastic qualitative researcher and UX designer with foundational design studio DNA; she was at Adaptive Path before it was acquired. The way she does it, she calls them listening sessions, not interviews, because they're 90, 95 percent listening, and the idea is to start to look at the kinds of statements people are making. And she has an exceptional way of parsing that out. So we're not doing, and I'm going to say the E word, empathy; we're not doing emotional empathy, largely because when we're doing emotional empathy, there's so much of the researcher that gets thrown into the interpretation. In my work, I'm looking for cognitive empathy. Regular emotional empathy is the ability to recognize and understand the emotional state of another person from their perspective. When I'm doing cognitive empathy, I'm looking at the ability to understand what people are thinking and how they're making decisions from their perspective. And I feel like I can get closer to what someone is actually conveying or expressing or experiencing when I'm doing cognitive empathy. And so parsing it that way can help me connect it back: oh, okay, here's where the design curriculum probably influenced the way they're thinking about making this decision or approaching this moment.
Sarah Bunnell (00:44:32):
I'd love to have a different conversation around empathy and teaching for and with empathy, but this also reminds me of one of the programs that I have helped support over time, Being Human in STEM, which is now a national program. It has a core curricular component where students and faculty and staff co-facilitators do a lot of reflecting on themselves, their identities, and how different readings and activities, and learning from and with each other, shift how they understand themselves and how we understand, more broadly, systemic and deeply rooted inequities in higher education and how those play out in access and outcomes for student learning. And the reason why I am thinking of that program, Dhvani, is because you asked about the assessment of subjective reports where subjectivity is the core. We have had a lot of conversations in that program about grading, about the grade-based assessment of subjective reflection, and have really applied more alternative grading approaches, looking at engagement level, looking at process, where the doing of the reflection is the value, more so than evaluating the accuracy of something, because that's not the conversation that we're having, and that's not where the learning is rooted in that particular moment.
(00:46:07):
So I don't see subjectivity or individual positionality as inherently a problem as you're thinking about immediate assessment or long-term assessment, but your feedback approach should reflect the kinds of information that we're gathering and whether we're doing evaluative work versus process-focused feedback work. Can I ask one question? Because I'm curious, with Adam in the room as a senior looking into your last year, and with us talking about long-term learning outcomes: how do you think about yourself as a learner finishing up your college pathway, with a recognition that for so much of what you have learned, you may not even realize all of the ways in which you've learned it, or its relevance, until way far down the road of your own experience?
Adam Kanowitz (00:47:08):
This is a tough one. I often find myself in the role of a researcher, so I often don't think about my own experience. I would say that I try to be very aware and appreciative of the information and the experiences that I get from professors. A lot of what I find Elon is very valuable for is making those relationships, and I find that oftentimes it's easier to identify, oh, I learned this information or I learned this life lesson through a relationship. But in terms of thinking about how I'll recognize things down the line because of the knowledge that I'm able to point to at this point in time: I know that I've learned plenty, and I'm able to use a lot of the knowledge. I feel like I received a good education, and knowing that a lot of the knowledge I received has led to more opportunities, and will continue to lead to more opportunities, is exciting.
(00:48:04):
I feel that a lot of people in my generation like to think of themselves as continuous learners. Growing up in the information age, we get so much, and it's easy to be overwhelmed, so a lot of the time we don't see that stuff until down the line. So I would say that it's invigorating and it's reassuring. But again, going back to assessment, how do you track that? It's difficult, because you need that time to mature. You need those additional experiences that may test you in different ways than school or your own experiences on a college campus have, experiences that cause you to dig deeper and use, or bring to the surface, knowledge that you may not have thought was useful.
Dhvani Toprani (00:48:49):
This has been going in a lot of interesting directions. So based on what each one of you shared, I want to almost take a U-turn at this point in time. Do you all think long-term assessments and transference of learning are the most effective way to assess learning in an HCD learning context?
Adam Kanowitz (00:49:08):
I mean, I think that human-centered design has a lot of layers to it. People develop their own takeaways from it, and again, that takes time. So it is very valuable to have these long-term assessments. But because it's such an abstract concept, there are many, many sublayers; creative confidence, for instance, builds as people go through their lives and their experiences, and the impact of taking a class freshman year may not be seen until senior year or down the line. So I think that it's a huge logistical hurdle to track, but looking at long-term outcomes, looking at five, ten years out, where are you? Are you still using those skills? I think that's the best way to really encapsulate this stuff as an impact. But I also think that on a practical level, for centers of design thinking or other on-campus centers, looking at the seeds, starting to see it maybe pop up in other directions, starting to see if there's a curiosity to learn more, whether people are opting in to continue using it, I think that is a great indicator, because that will lead to them using it in further experiences, and that can be measured in the short term.
Sarah Bunnell (00:50:27):
Dhvani, your question of whether we should be doing long-term assessment is an interesting one. And as I said before, the impact of one particular lever on an individual's learning becomes complicated, in all of the best possible ways, the further out we get. That doesn't mean it is not important, but it also means it's not the singular driver of who we are and how we think about engaging with complex problems and process. But we also know, and here's where I would argue that the value perhaps is greatest, that when we reassess over time, we actually prompt reconsolidation and new meaning making about learning. We show the linkages that may not be transparent between what we've experienced before and the problems or the opportunities that we're encountering at a particular moment. And that's what we all seek to do as educators. How do I find the relevance? How do I help you activate the context-specific knowledge that you're bringing in from a whole range of amazing learning opportunities that happened before me and concurrently with this particular class, and how do I help you see those links? So the assessment may not be, and is not inherently, a pure marker of the impact of a particular program on a particular learning outcome, but it can reaffirm the value of the learning and help the learner see the value of the work that they're doing at that particular moment.
Dawan Stanford (00:52:02):
Yeah, Sarah, I was just thinking about the effect of assessing, and, let's skip the form of the assessment for a moment, having that as just an ongoing part of the learning, perhaps extending past graduation if we're sticking to the university context, as a way of helping learners deeply integrate that knowledge. It is a way of saying, oh yeah, there's that, I remember that concept; I could have used that when I was working on that project in the third quarter last year at the nonprofit. And so the assessment in some ways becomes not only a way of assessing learning, with all of those influences, but also a way of supporting and deepening transfer over time.
Dhvani Toprani (00:53:00):
So it sounds like our faith in transference and long-term assessment is unshaken. So as a concluding remark, what do you all think we as designers, as educators, can do more of to make the transference of learning visible in our process of teaching?
Adam Kanowitz (00:53:22):
This is almost on a meta level, but really building in the human connection of human-centered design: finding where the lessons that they're learning connect to people's lives and giving people the opportunity to share that. For instance, my favorite thing is talking after I give a workshop and hearing from the one or two people who stick around to come up to me. I still remember one of my first workshops was in a law school, the Elon Law School in Greensboro, and we had just talked about mentoring and connection. One student came up to me and he said, hey, I just moved down to Greensboro, I am struggling to make friends, and that's what I was thinking about through half this workshop. And I realized, because you mentioned organizations in one activity in the workshop, I think I'm going to join Model UN; I always wanted to do that. And sure, the workshop was about making those connections and mentoring and also design thinking, so that was the overarching message, but that was his own personal application that I saw there. So having those conversations with the people who are using this thing, to find where they see the value of it, not the predefined learning objectives, I feel like that is really valuable.
Dawan Stanford (00:54:44):
It's funny, when I think about teaching and training, I realize that I can't actually teach anyone anything. All I can do is create the context for learning and make sure that I am helping people with capability and opportunity and motivation to get in there and do the work and making it easy. So I think about learning facilitation, and so one of the things for me in my teaching is thinking about practice. How can I help people see what good practice in a particular domain means?
Sarah Bunnell (00:55:26):
Those are such great answers. So building on what Adam said in terms of how do I build community and help students see their own way of connecting, and then on Dawan's point about how do I create conditions of self-efficacy so that practice feels meaningful and relevant, I would add a third piece, which is to model curiosity ourselves. To always be engaging in this curious learning mindset of: how do the things that I care about, the questions that I'm asking, how can those be in conversation beyond my own discipline or my own domain of expertise? So to invite and model that kind of inquiry of transfer, I think, can help to create the community and the self-efficacy that my colleagues have spoken about.
Dhvani Toprani (00:56:21):
I feel like we have redefined all the educational terms that we took for granted in all these episodes. So this feels wonderful, because that's exactly what we are here to do: question things, look at things differently, rethink things. I love that we talked about humble scope and human connections; I think that's the soul of HCD work. So thank you, all of you. Thank you for your time. Thank you for these great ideas, and I'm excited to share this back with Saad.
Dawan Stanford (00:56:51):
Wonderful. Thank you. Thank you.
Sarah Bunnell (00:56:53):
Thank you very much.
Dhvani Toprani (00:57:07):
Hi Saad, welcome back to the show.
Saad Shehab (00:57:09):
Hey, hey Dhvani. Thank you so much. I'm very excited to be with you again here today.
Dhvani Toprani (00:57:15):
Alright, Saad. So the panel discussion for your challenge was very reflective and full of divergent ideas. We had Sarah Bunnell, the director of the Center for the Advancement of Teaching and Learning and associate professor of psychology here at Elon University. We had Adam Kanowitz, who is a student staff member at the Center for Design Thinking and an Entrepreneurship major at Elon University, and Dawan Stanford, who is the president of Fluid Hive. To begin with, they all unanimously agreed that each one of them has grappled with your problem within the design or assessment world in some shape, size, or form, so find some solace in that. And they themselves found no easy solution. They also agreed about the need for and importance of long-term assessment in the kind of work we do as design educators. Adam, adding his student perspective, validated that many times the connections between learning moments and their practice go unnoticed by learners.
(00:58:18):
Sometimes these moments are days, months, or years apart. Sarah shared that self-reporting is a good approach to learn about the what in learning, but not so much about the how and why in the learning. She suggested validating self-reports with other data. To that point, Dawan emphasized the need for qualitative data that extracts stories from learners by focusing on their process. What he meant by that was: you are listening to their stories almost at a meta level to find patterns, not just focusing on the content of the story. Clarity about the core skills and values associated with design learning would greatly support this process for you. The panel also recommended using backward design to design assessment, where you begin by identifying your long-term learning outcomes and then design spaces where your learners can express those practices and where you, as a researcher, can observe and collect data.
(00:59:24):
The panel also discussed that for something as complex as design learning, the learning won't emerge in its full force by the end of the course or workshop, so pre- and post-surveys can only give you so much from that perspective. In fact, design learning becomes more meaningful when it is infused with other authentic life experiences. So if, as researchers, we are trying to isolate the impact of design learning, we are perhaps doing a disservice to its very identity. Process-focused assessment is what will get us a more accurate reflection of the learning that has happened in these spaces, something the panelists emphasized multiple times during the conversation. We then eventually reflected on the idea of assessment and long-term assessment for design learning. The panel suggested that how we define impact reports around this complex form of learning needs to be very humble in its scope. They suggested that we can teach our learners the methods and share all the knowledge with them, but the mindset comes only with practice. So with all of that information, I'm curious to hear, Saad, what are your thoughts? I know you are a deep thinker. Was there anything that resonated with you, that surprised you, that you've already thought about?
Saad Shehab (01:00:52):
So all of what has been said definitely resonated with me. Two things that felt new to me were the idea of backward design, and intentionally doing backward design, because I think as researchers, many times we think as backward designers, or we use backward design, but we don't really explicitly write on paper what we are actually trying to get at, and then whether we can create some spaces or some opportunities where we can collect enough data, or the appropriate data, to check that out or to check students' learnings. So I really appreciate the idea of backward design and the emphasis on it again. And then I love the "humble in its scope." I kind of coded that, because I really felt that that's absolutely right. We are sometimes very ambitious as teachers and as researchers. We want our students to learn things and develop knowledge and skills around stuff that we ourselves are still developing.
(01:02:01):
And we forget sometimes that it took us lots and lots of learning journeys and learning experiences to arrive there. And because we probably love our students so much, we want them to get where we are in a very short time, which doesn't happen. We know that learning doesn't work this way. So I really love the idea of humble in the scope, and that's something I have realized myself in my role, because I remember when I started doing research studies to capture the impact of design thinking learning experiences on students, I always wanted to capture everything, to measure every single development of every single mindset and every single learning in every single design thinking process. But then I came to realize that this is impossible. You really need to be very humble. You probably need to select one or two metrics, or one or two learning outcomes, and just work around those, and then try to focus more on collecting multiple forms of data. Which brings me to that validation piece, which I also do all the time, because I believe in that idea: sure, validated instruments, especially surveys, for example, are important, but you also need to learn how to triangulate the data and collect multiple forms of data from multiple sources, so you really can capture what's going on and be sure that, yeah, this is what's really happening. So yeah, that's my first reaction to what I've heard.
Dhvani Toprani (01:03:40):
And to that point, Saad, I think the subjectivity is so important in this learning experience. When Adam made that point about how your learning really does not shine until it is infused with other existing authentic experiences, it really made me think: how far can we go with standardized testing mechanisms in a situation like design learning? I was also very curious to rethink what an impact report can look like in your context. Now that we have some of these conversations in our minds, what can an impact report look like for you?
Saad Shehab (01:04:21):
So actually, I'm in my fifth year in this position, and I have to say it wasn't until two months ago that I succeeded in writing an impact report for a course that I felt at least inspired me as a researcher slash teacher. I've been struggling with this question over the past five years: what really composes a good, satisfactory impact report? And I can, off the top of my head, list some headings that I wrote under. First of all, of course, you definitely need the learning objectives and the purpose of the course. What are the learning objectives? What's the purpose of the course? What are students going to get from this learning? I think the second thing you can talk about is your development of the course. As a teacher, it's really nice here to add your voice on how you designed some learning experiences that can help the students meet those learning objectives.
(01:05:17):
Then you write a little bit about the participants themselves: who are you assessing, or who were you assessing, in this course? And then comes the evaluation part, which I think is the heart of the impact report, and that's where everything we have talked about comes in: the idea of humbleness, the idea of what you report on, the idea of even defining your metrics. So one of the things that I like to focus on in a design thinking course, especially one that aims to introduce students to design thinking, is simply their understanding of human-centered design or their understanding of design thinking. We're talking here about an open-ended question of: what is design thinking? They can provide their pre-answer and post-answer, and then you need to come up with a reference and tell the people in that impact report: I am assessing students' answers based on this reference, or based on the presence of the following elements in their definitions.
(01:06:10):
And then see if there is any change from pre to post, to say that, yeah, sure, they got it. In an introduction to design thinking course, for me, if we take the humble perspective, that's more than enough. If you met the students' expectations, and if you really found out that they have learned something about design thinking, specifically its definition, that's more than enough for me, so that you can claim, okay, this experience really led to something. And more importantly, in my opinion, in an impact report is that last section, which you entitle "Recommendations for Future Iterations." If you were to repeat this course again, in light of this analysis, in light of this data, in light of this impact, what do you think you need to change to make it better, more engaging, more advanced, et cetera?
Dhvani Toprani (01:07:01):
A lot of what you said, the piece about the humans being in the center, is also reflected in the work that you do. This was very fun, Saad. Thank you for doing this. I hope these ideas give you some new pathways to explore, because I know there's always more to do in the world of design and design learning. So yeah, thank you so much for being on the show.
Saad Shehab (01:07:25):
I want to say one more thing, though. This experience with you, and thinking about these things again, and now what you have mentioned from the panel, has been great, at least in validating a model that I have been thinking about that relates to the assessment of design thinking and design thinking learning experiences. I think we are still going to be faced with two important challenges, or let me call them black boxes. One is that idea of near transfer: how do I capture whether what I'm seeing as impact is actually leaving with the students so they can use it in new contexts? And then another black box, which is: does this persist over time? That's the far-transfer piece. So whoever is listening to this and thinking about assessment in these domains, use this framework to think about things, and hopefully together we can come up with the solutions and answers to these questions. But thank you and your team for this wonderful opportunity. I highly appreciate it. It was a great learning experience for me, and it's always good to talk to you, Dhvani. You've been a friend for a long time, and I appreciate our friendship and our work together.
Dhvani Toprani (01:08:43):
The feeling is very mutual, Saad. I always get to learn something new about design when I talk to you, so that's always fun. Thank you.
Saad Shehab (01:08:51):
Of course. Thank you.
Matt Wittstein (01:09:02):
Limed: Teaching with a Twist is produced in collaboration with the Center for Engaged Learning at Elon University. For more information, including show notes and additional engaged learning resources, visit www.centerforengagedlearning.org. Limed: Teaching with a Twist is a creation of Matt Wittstein, associate professor of Exercise Science at Elon University. Episodes are developed and hosted by Matt Wittstein and Dhvani Toprani, Assistant Director of Learning Design and Support at Elon University. Olivia Taylor, a Class of 2026 Music Production and Recording Arts major, is our Summer 2024 intern and serves as a producer and editor for the show. Original music for the show was composed and recorded by Kai Mitchell, an alumnus of Elon University. If you enjoyed our podcast, please take a few moments to subscribe, rate, review, and share our show. We aim to bring insightful and relevant content to educators each month, and we would love to hear from you. If you're interested in being a guest on the show, do not hesitate to reach out. Our most updated contact information can be found on the Center for Engaged Learning website. Thanks for listening and stay zesty.