Friday 3 June 2016

Course Write-up: Gathering feedback

Hello! It's been a while since I've written up any courses I've been on, in part because I am very lucky that I work in a place where we share what we learned on courses at our weekly staff meetings, so I haven't felt the need to reflect or assimilate in writing. But it's always beneficial to write these things down for my own sake as well as to share them more broadly, so I thought I'd try to make more of a habit of it, at least when I feel it would be useful for my own practice. I'm working on some other work-based and professional development habits, which I hope to talk a bit more about in an upcoming post. Meanwhile, here's a write-up from a course I went on this week.

Sally Stafford's recent session on gathering evaluation and feedback was one I'd been looking forward to for a while. As someone who teaches a fair amount, I often rely on the course leaders to pass on feedback from their students about what I might improve, and we are still working on strategies for consistently getting evaluations from the training sessions we give. While I really enjoyed this course, it was a bit of a stretch to think of ways of incorporating it into my own practice, as there was a heavy bent toward feedback on exhibitions and outreach events rather than training sessions. Even so, it certainly got me thinking about creative ways of framing questions and assessing learning outcomes.

The first point Sally made was that when people think about feedback, they're often only thinking about gathering opinions after the fact. However, effective projects look for feedback throughout, from the initial development phase. By doing this you can ensure that you are delivering content that people really want, in a way they want. Teaching sessions have this built in, in that the process is inherently iterative: ideally, your training gets better each time you deliver it based on the feedback you've had before, and there is no "final" product to get feedback on. However, it is useful to think about impact in the same way museums do. What have people taken away from my session? What was the impact on their practice? Impact is a fiendishly difficult thing to measure, but various sectors are under increasing pressure to demonstrate measurable impact to justify funding, staff and other resources. This is not to say that everything needs to be reduced to a number, like "72% of participants said this training was Very Useful". I'm always more inspired by individual comments, like "I learned a lot, thank you!" or "I never knew librarians knew so much about this stuff!" But in large volumes those become more and more difficult to parse, and if you are involved in a project that requires you to justify funding, you may be dealing with people who find the numbers much more compelling than a few glowing remarks when presented in a report.

GLOs and designing questions

Central to Sally's process were the Generic Learning Outcomes, a framework used by the MLA to assess learning in non-classroom contexts. While much of the discussion around the GLOs focused on the context of exhibitions or outreach to school groups, I found the framework to be a useful prompt to think about what questions one could ask to gather feedback about different facets of learning.

  • Knowledge and understanding: While this is fairly straightforward, asking bluntly "How much/what did you learn?" is not necessarily going to give you brilliant feedback. Any parent who has asked their kid what they learned in school today will be familiar with the non-committal shrug followed by that slippery syllable, "Stuff". The group talked about potentially asking for one thing that stuck out, a single fact for example. Since I often do follow-up sessions or series of classes, I could always ask at the beginning of a class for one thing they learned in the previous session.
  • Skills: This is a tricky one to get verbal feedback on, but could be tested through doing an activity before and then after the session. An approach that Sally used for our session was a target, where we rated our confidence with evaluation before and after the session using sticky dots placed correspondingly on the target. I think confidence is a good operative word when asking people to self-report on their skills. 
  • Attitudes and values: Another tricky one to ask about, as Lucy was tactfully explaining, as it has to do with subjects that people are sensitive about. Sally offered "Are you a bigot?" as an example of a potentially insensitive way of gathering feedback about this aspect. :) My content is often very value-laden. I talk pretty openly about my mistrust of metrics as a good indicator of the quality of an article, about the flaws in the peer review process, and I advocate passionately for Open Access publishing. Rather than gathering this in the form of feedback after the fact, I usually seek to have a discussion during the session where people are invited to share their points of view.
  • Enjoyment, inspiration and creativity: While I don't think I'll collect finger paintings from my students, it's worth thinking of creative ways people could respond, especially if they have felt inspired by something in a session. I refer to this facet later as the 'Ah-hah' moment and discuss it in more detail.
  • Activity, behaviour and progression: Once again, I often have the opportunity to find out what people learned from a previous class, and I think I could take greater advantage of that to see if behaviour has changed in response to something I've taught. I usually ask if anyone has been using a particular tool or technique and ask for feedback, but perhaps I could do this in a more structured way.

Creative feedback methods

I'm obviously still mulling over how to ask for feedback and how to record it, but the session was certainly not short of ideas. Many of them would suit a UX context better as they're fairly involved, e.g. focus groups, behavioural mapping, observation, interviews etc. Some suit different audiences better than others, for instance role-playing or drawing would be great for kids but I somehow doubt I could get a room full of stressed MPhils to see the value of such tasks (as much as I believe that creative endeavours are good for stress levels). There are some ideas I'm tempted to use, however. For shorter sessions on a particular tool or skill, I'd love to adapt the target method to show the change in confidence levels. For series of sessions I'd like to build feedback into subsequent sessions and help people reflect on how their practice of academic research is developing. It's definitely gotten some gears turning in my brain regarding how I could gather feedback beyond my usual post-its at the end asking for one thing they've learned and one thing they'd improve.

Ethics and accessibility

I wanted to raise a couple of issues that didn't come up during the course. First, if you're gathering data from people, I think it's essential to speak to someone at your institution who knows about research ethics. They may say that no further approval is needed, but the moment you start gathering artifacts or quotes from people, observing their behaviour or intruding on their time, it's important to think about the ethical considerations:

  • Is participation voluntary?
  • Have you informed people that they're being watched?
  • Have you informed them of how you will use their data and do you have a plan to follow through with that?
  • Who will have access to the feedback they give you?

Unless told otherwise, people have a reasonable expectation of anonymity when taking part in studies. It doesn't have to be a signed consent form in every instance and can be very light-touch. For example, I plan on adding a quick verbal disclaimer when asking for feedback that it's anonymous and that any feedback they give us will be used to make our training programs better. There is a blurry line between user experience research and feedback, so I would err on the side of caution and consult someone who can give guidance on what you need to tell participants and on how to store the data safely and anonymously.

Similarly, it's important to think about comfort levels. One method discussed for use with teaching sessions was task-based feedback, for example acting exercises to gather feedback during a session. It was mentioned that adults were likely to be self-conscious about this, but that it would be engaging for children. I agree, but I think it's important to be aware that if this is built into the curriculum of a particular course or training session, it may not be accessible for people on the autistic spectrum or with other social differences. To make course content equally accessible to all, I would be interested in finding a way for students to opt in rather than making it a requirement, or in seeking other ways to gather similar feedback.

Accessibility sprang to mind again when we looked at feedback methods using red, green and amber coloured pieces of card to let the instructor know how confident or engaged participants felt. Again, alternatives that are accessible to colourblind participants would be useful to prepare ahead of time. These are just a few examples; there are many ways to build accessibility into your feedback process if you take the time to consider who is being excluded by the method you have chosen.

Ah-hah moment

I think my favourite concept from the course (and my own 'Ah-hah' moment) was the idea of focusing feedback on what inspired someone: one idea they'll take away from the session, or one lightbulb that lit up during it. Even if a student in one of my sessions paid me pretty much no attention but had an 'Ah-hah' moment regarding their own work while sitting there, I feel I at least provided the space for that inspiration, and I'd love to know about it.

Kids visiting an exhibition may not grasp your thesis, but they will certainly remember the taxidermy pigeons because they're surprising. Or they might remember that, like Charles Darwin, they too keep a journal and like drawing the animals they see. Similarly, my students may not remember everything I tell them about Data Management, but I hope they remember the story about the guy who lost 6 months of work when his laptop containing his PhD and the backup disk, both in his rucksack, were stolen in a pub. Or, when looking at conference posters for design tips, maybe a student will finally figure out what methodology she'll use for her own dissertation (this literally happened in my class yesterday). I don't really mind if she took on board less of what I taught as long as she left my class excited about something to do with her work.

I think that, beyond tips for improving the actual content or timing of the sessions, this will be the focus of generating feedback. It gives students the chance to reflect on what they're excited about, which will reinforce whatever inspiration they've had, and it gives me a window into what material resonates with students. On a personal development level, I've just started a weekly reflection/accountability practice where I write about what went well vs. not so well, which includes noting down my own 'Ah-hah' moments. It's really helpful to try to capture what's inspiring you from the courses you're taking or the books you're reading, as it makes it easier to remember and therefore implement any changes you might think of as a result (especially if, like me, you're consuming so many great articles and podcasts that great ideas are often driven out and forgotten, no matter how inspiring they were).

So, to try to synthesise a write-up in which I concluded pretty much nothing: it was a good session and it's definitely got me thinking. These concepts might go on the back-burner for a bit, but I can have a look at this post later on when I'm developing courses to see if it prompts any more 'Ah-hah' moments.