How to collect, analyze, and apply different types of learner feedback
Everyone wants to be heard. I know I do, and I think you do as well. We like to praise what has gone well and offer some “words of encouragement” for things we don’t like. I’m trying to be nice here, and I’m sure you try to be nice as well. Our entire online culture is filled with opportunities to offer our thoughts – blog comments, social media posts, emails to the editor, online surveys for the products we buy, the services we use, or customer support calls we make to get our problems solved.
Sometimes it doesn’t matter if I’m right or wrong – because that can be subjective. I just want to express my needs, wants, desires, concerns, hopes, and dreams to someone and feel like they are listening and taking me seriously. They may not be able to action every suggestion, but at least I’ve had my say. Sometimes the communication is between me and a direct report or someone to whom I report, sometimes it is peer-to-peer, and sometimes it is truth to power. And there is another kind of feedback that is critically important to my success: learner feedback.
Receiving feedback can be painful, but is necessary for your growth
As an in-person and online course instructor, I’ve had to hear learner feedback that was immensely helpful, and sometimes the feedback was a bit hurtful if I chose to take it personally. Errors happen, links break, information changes, best practices evolve, or third-party tools stop being free or available. Learner feedback told me what needed fixing long before I could hope to find it on my own – checking every link in multiple courses. Learners told me when I used a term that I didn’t adequately define. They asked for deeper explanations or better examples. The most hurtful was the person who complained about me clearing my throat during a recorded one-hour presentation. You can only do so much, sometimes.
Realize that your training content is receiver-focused
A wise man once told me, “All communication is receiver-focused.” That is especially true when creating online training – because the quality of the training communication (courses) that you provide your learners will determine how well your learning and development program performs. I’m not just talking about factual accuracy here. That’s easy. I’m also talking about the look and feel of the course, all aspects of your online training delivery platform, how various course elements behave, and how learners are notified, enrolled, logged in, and so on. All that “non-course-content stuff” and more.
Where does the term “learner feedback” come from?
“Feedback” is a term that originated in the audio industry, describing the sound created when a loop forms from the source to the microphone to the monitor/speaker system and is picked up again by the microphone. It was unpleasant, annoying, and thought of as unprofessional. For our purposes, let’s agree to define learner feedback as the ability of your team to “speak truth to power” about their experience taking your courses, critiques of the course content itself, ongoing Q&A for your course content, and the ability to use crowdsourced wisdom to find knowledge gaps and opportunities to improve your training materials and delivery.
As providers of online training, we need to invite, accept, assess, and action learner feedback. That’s when other people’s opinions become the lifeblood of your business, nourishing your decisions and helping you improve the training experience for everyone.
In our industry, online corporate training, inviting and collecting learner feedback is one of the most valuable and professional things you can do to create a better learner experience for your team and improve your training outcomes.
Being open to your learner feedback
Chances are, when you started building out your online training, you took your existing training materials and converted them into an online format. You may have used resources from your instructor-led training programs – such as slide decks, audio presentations, handouts, quizzes, tests, and interactive exercises – and assembled your online training from those materials.
Did you pause to ask yourself, “How long has it been since this material was updated?” It’s easy for text-based course materials to live forever, with changes made in real time through in-person instruction – all those little details and interesting tidbits that instructors throw in as part of their interactive presentation and classroom discussion. It’s like university textbooks versus the professor’s lecture; one is static, the other is filled with spontaneous anecdotes and examples to illustrate a point.
Make sure you understand your learner feedback – discover the “why” behind the feedback
So, you’ve made sure that your course content is up to date and engaging – maybe even interactive. You probably got a lot of instructor feedback or information from your subject-matter experts to fill in the gaps of knowledge from their side of the teaching equation. Did you get any learner feedback on what they thought of the material? Either before its conversion to an online training format or after, when it was assigned to them to take?
When you receive feedback from your learners, it can be worth digging into the meaning behind it. Why they made the feedback request can be as powerful as the request itself. If you are unsure of the reason for the feedback or don’t understand it fully, your reaction to it may miss the mark.
Learner feedback is vitally important to the ongoing evolution of your course content and delivery. Learners can offer a unique perspective that may challenge you – or your team – but it is so valuable that you ignore it at your peril. Being open to learner feedback also means remembering that they are complimenting and critiquing “the course and its delivery,” not the people who are responsible for it, even when it is hard to hear. Fortunately, more often, it can be great to hear how your training content or program touches hearts and changes lives.
Using learner feedback to achieve your learning objectives
In the classroom, the instructor can read their audience's facial expressions and body language and adjust their delivery accordingly. Knowing when to pause, reframe, and check for understanding is easy. We lose that in the world of digital training materials. One course has to meet the needs of the entire cohort that takes it – and given that a cohort is made up of individuals, the course may not adequately meet the instructional needs of every student to the same degree. Having a consistent, measurable feedback process allows you to get the insights you would have missed if you didn’t ask the question.
Your learners need to feel comfortable providing you with feedback. Depending on your company, you may want to offer both identified and anonymous feedback collection. Identified feedback ensures that you can answer a specific question that affects one learner or a cohort of learners taking a course. Anonymous feedback gives those with concerns a safe way to share them without fear of retribution. It happens. It can be hard to tell your boss that the course they assigned to you just didn’t arm you well enough for the realities of your job.
What constructive learner feedback can reveal
You may be surprised at the types of things you will learn about your course or delivery platform once you invite learner feedback. Here’s a list of some possibilities:
Surface-level errors and inaccurate material feedback
Typos, grammar errors, and formatting problems
No matter how many times you perform quality assurance (QA), there is always a chance an error will creep in.
Broken or incorrectly configured links
Links to your resources can change and take the learner to an unexpected place. It happens! Especially with references to external resources, as third-party links can break without warning.
Accessibility and usability-based feedback
Issues around accessibility
Especially regarding seizure triggers, adequate contrast between typeface color and background, or the use of colors for those with limited color vision. Do you need a mouse to navigate a course, or can you use a trackball or joystick? How well does your course adapt for learners who are visually or hearing impaired? Does your course work well for those who are neurodivergent?
Issues around diversity or inclusion
Language continues to evolve around sensitive terms that were once commonplace or highly gendered – now we talk about the largest bedroom in a house being the primary bedroom, we have a safelist on our email inbox, and we have legacy accounts for long-term agreements that were signed before more current clauses and conditions were added. Job titles and roles are becoming increasingly gender-neutral – like being the “chair” of a committee, not the chairman or chairwoman.
Course content and training delivery-based feedback
Relevant course visuals
The need for more illustrations, better graphs, schematics, or charts – ensuring that all the visual representations improve, rather than detract from, the course material.
Course delivery
How well did the delivery platform work “for them,” given whatever limitations of technology, internet bandwidth, and device they were using?
Course timings
Did the course move too quickly or too slowly? What can you add to give your learners more control over their learning experience – the ability to stop, start, pause, rewind, and fast-forward any audio or video?
Course pace
Is there too much filler? Too many social scenarios in your simulations or animations that detract from the meat of the course?
Content fit
Where the training doesn’t adequately prepare learners for the job it is supposed to prepare them to do. The training content may be too advanced or too basic, making assumptions about the learner’s current level of understanding.
Post-course related learner support
Could the course benefit from downloadable performance supports (formerly known as job aids or cheat sheets) that the learner could print off or save separately to allow them to access information outside of the course delivery itself? Is there a need for more or better definitions and improvements to a glossary or knowledge base?
Setting the expectation about giving feedback
You may have to train your learners on how to provide you with effective feedback. We all bear the scars of past teachers or employers who would take offense at the slightest criticism and behave punitively towards those who spoke out – even if those providing feedback had good intentions.
You can establish a tone for the feedback you receive by making “feedback” part of your onboarding program – setting the guidelines for how learners will be asked to give feedback, what the outcomes will be, affirming the non-punitive response to even harsh feedback (aka criticism) that is received. You can also give examples of how to provide constructive versus non-constructive feedback on a few topics to guide your learners toward clear, concise, and honest communication back to you when the feedback survey comes out.
There are two well-established models that help us learn how to give and accept feedback. They have their roots in pedagogy and have slowly migrated into the adult training world. Using one of these or making your own, you can help your learners frame their feedback in the most constructive way.
The FIDeLity model
The FIDeLity model is where the expectation of feedback is that it is Frequent, Immediate, Discriminating (meaning based on clear criteria and standards), and delivered Lovingly or Loyally – meaning kindly and with the intent of improvement. Let’s break that down:
Frequent
You want to invite feedback as an ongoing part of learner engagement – not have monthly or (gasp!) quarterly feedback meetings. These can devolve into complaint sessions rather than productive conversations because there is so much to cover, and people have been saving up their opinions for a while and may be frustrated or unhappy.
Immediate
It should be “quick and easy” for learners to provide you with feedback as close to the moment they are experiencing friction as possible so they can capture the details. That allows you to address small issues before they become amplified and problematic for a larger number of learners.
Discriminating and Lovingly or Loyally
I tend to tie the two together. By setting expectations for how the feedback will be delivered, you can guide your learners to be respectful in tone and understand what your standards are and what is possible within the system. You have a role in teaching them how to give the most valuable feedback possible.
The SMART model
The SMART model is more well-known, and variations of it are used in productivity and management workshops. When it comes to feedback, we are asking for Specific, Meaningful, Applicable, Reflective, and Timely inputs from our learners. This is likely the more familiar and self-explanatory of the two models, but I’ll expand on each element.
Specific and Meaningful
Your learners should point out specific problems or be able to qualify their emotional response to your courses. There is a big difference between “your course sucks” and “when I don’t see any people of color in the photography used in the course, I feel unseen and marginalized.” That’s specific.
Applicable and Reflective
When it comes to feedback about content, versus mechanical issues like typos, you want your learners to think through their feedback, and you may need to help them understand that what they perceive as an issue may in fact reflect a company policy or industry standard of which they were not aware. Your goal is to improve your course content and delivery to positively impact training outcomes, and all valuable qualitative feedback will help you move closer to better outcomes.
Timely
This is a version of “see something, say something” – ensuring that your learners have adequate mechanisms to provide you with feedback in an ongoing and immediate manner. It is also courteous and respectful to acknowledge feedback upon receipt, even if you can’t act on their suggestions right away or, perhaps, ever. If they take the time to provide feedback, they should at least get a thank-you.
We can use these models whenever we deliver feedback to our team, our peers, and even our employers because they help present the feedback we deliver with the intent of creating positive change. This then becomes part of the company culture, and everyone benefits from clear, well-intentioned constructive feedback.
How do I collect learner feedback?
You can collect feedback in many ways. Some may depend on your chosen learning management system or microlearning platform. Some can be automated; others require more organization and preparation to accomplish. Let’s look at various ways to collect learner feedback – this is by no means an exhaustive list. Some of these may be built into your eLearning platform, and some may need to be delivered by email, via a third-party platform, or (gasp) in person.
Built-in eLearning platform functionality
Does your eLearning platform have any built-in feedback mechanisms? For example, our OttoLearn microlearning platform has a “provide feedback” function that allows the learner to send a message to their administrator from almost anywhere in the course that they are taking. This feedback might take the form of “email your instructor” or “contact your course admin” and may generate an email, send a text message, or post a note to a public, semi-private, or private chat forum. Talk to your business account manager to ensure you know what built-in options exist in your chosen platform and that you, your admin team, and learners can access them.
Star ratings and smile sheets
Digital versions of the old-fashioned paper-based brief survey, where participants are asked to rate their satisfaction by picking from a row of smiley faces ranging from angry to happy, hence the name. Today’s smile sheets are basically graphic versions of our next solution.
Polls
Even tools like Microsoft Teams have a built-in polling feature where you can pose a question to your cohort and ask for a 1-10 rating. The software aggregates the data for you. Great for in-the-moment insight, but these may not provide the depth of feedback you need to create meaningful change.
Pre- or post-course surveys
There are several easy-to-use and low-cost survey tools like Typeform and SurveyMonkey (non-compensated mentions of survey tools I have used successfully). Because you are building the survey yourself, it can be as long or short as you wish. Many of these platforms allow for branching scenarios – where recipients are shown questions based on their response to the previous one – so they allow you to easily dig down into specifics – “chasing the mice,” as my boss puts it.
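If you’re curious what that branching looks like under the hood, here is a minimal sketch in Python – purely illustrative, not tied to Typeform, SurveyMonkey, or any other tool, and the questions and branch rules are invented for the example. Survey platforms let you configure the same logic visually, without writing any code.

```python
# A minimal sketch of branching survey logic, independent of any survey tool.
# The questions and branch rules below are hypothetical examples.

QUESTIONS = {
    "q1": {
        "text": "How would you rate the course overall? (1-5)",
        "branch": lambda answer: "q2_low" if int(answer) <= 2 else "q2_high",
    },
    "q2_low": {
        "text": "What was the single biggest problem you ran into?",
        "branch": lambda answer: None,  # end of survey
    },
    "q2_high": {
        "text": "Which part of the course helped you most in your role?",
        "branch": lambda answer: None,  # end of survey
    },
}


def run_survey(answers_source):
    """Walk the question graph, choosing the next question from each answer."""
    responses = {}
    current = "q1"
    while current is not None:
        question = QUESTIONS[current]
        answer = answers_source(question["text"])
        responses[current] = answer
        current = question["branch"](answer)
    return responses


if __name__ == "__main__":
    # Simulate a learner who rates the course poorly, so the survey
    # "chases the mice" and asks the follow-up about what went wrong.
    scripted = iter(["2", "The videos kept buffering on my phone."])
    print(run_survey(lambda _prompt: next(scripted)))
```

The point is simply that each answer determines the next question, so a low rating can trigger a “what went wrong” follow-up while a high rating asks something entirely different.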
These surveys can be built into the course as one of the final components, or sent out as a link by email. Email delivery might be integrated with your eLearning platform, or you may have to send the emails manually. You could send learners who complete a course one survey, and send those who did not complete the course in the time allotted a different one. You have lots of flexibility to create specific surveys to get the right questions in front of the right cohort of learners.
I would recommend taking some training on how to properly frame survey questions and analyze results. You don’t have to become a statistician, but there are some survey best practices that will help you get the most accurate results and perform the best analysis. Sometimes these courses are right within the survey platform you choose.
Opinion platforms
There are numerous feedback and survey programs like TinyPulse (non-compensated mention) that allow you to ask a question weekly and then collect and anonymize the data, so you don’t know who has provided you with what comments. At Neovation, we send out a weekly question to the team about a wide variety of topics to measure work satisfaction and engagement. About 60% of the team responds weekly, and the insights have proven invaluable.
In-person interviews and focus groups
Sometimes, you just have to sit down with your learners and let them tell you how they feel about your training. This can be coordinated by third-party consultants who specialize in drawing out constructive feedback from highly facilitated group discussions. You must create a safe environment for people to speak freely in order to get all the feedback you are looking for. Paying people for their time is customary if these focus groups or interviews are held outside of office hours.
And there are likely many more ways to collect feedback. What other ones have you used?
What do I do with my learner feedback?
There’s no point in collecting learner feedback unless you do it strategically and have a plan to take action based on what you learn. Some of the “next steps” can be easy wins, while others will involve many moving parts to resolve.
Addressing surface-level learner feedback
Typos, bad punctuation, grammatical errors, and non-inclusive language can be easily corrected.
Factual errors can be confirmed and corrected. Gaps can be filled, definitions supplied, and resources created by your course design team or your hired course creation partners. Those are the easiest things to fix based on learner feedback.
Seek an expert’s help with accessibility-related learner feedback
Accessibility issues can often be addressed within the existing content by asking an instructional designer to bring it up to the minimum standards required for accessible content. Neovation’s in-house accessibility champion, our Product Manager, Ashleigh Lodge, is writing articles about digital accessibility for the Learning Hub, so I urge you to read them.
Some learner feedback may prompt deeper research and review
It’s more challenging when you are analyzing more “emotional” or “experience-based” feedback – like the material not being engaging, the timings making it difficult to absorb the information, or the information being overly simplistic or, conversely, overly in-depth and not meeting the learner’s needs. These feedback items need to be analyzed, aggregated, and then carefully addressed. You don’t want to base major course change decisions on the feedback of one or two learners – you need to weigh each piece of feedback and determine how urgent and important it is to address.
You may need to do additional research to dig into the cause of the problem that your learner experiences or what made them feel the way they did to give you the rating that they gave. Sometimes the problem, unfortunately, can be learner-centered – not allowing themselves adequate time, focus, preparation, or “whatever” to give the course material/delivery a fair evaluation.
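To make that weighing process a little more concrete, here is a rough sketch – assuming a hypothetical triage step where your course admin tags each piece of feedback with a category and a severity – of how you might score categories by how many learners raised them and how serious they are. The categories, weights, and threshold are illustrative assumptions, not a prescribed methodology.

```python
# A rough sketch of one way to aggregate experience-based feedback before
# acting on it. Categories, severity weights, and the threshold are invented.

from collections import Counter

# Each item is (category, severity 1-3) as triaged by your course admin.
feedback_items = [
    ("pacing too fast", 2),
    ("pacing too fast", 2),
    ("content too basic", 1),
    ("pacing too fast", 3),
    ("audio volume inconsistent", 2),
]

counts = Counter(category for category, _severity in feedback_items)
max_severity = {}
for category, severity in feedback_items:
    max_severity[category] = max(max_severity.get(category, 0), severity)

# Simple priority score: how many learners raised it x how serious it is.
priorities = sorted(
    ((counts[c] * max_severity[c], c) for c in counts), reverse=True
)

for score, category in priorities:
    flag = "review now" if score >= 6 else "monitor"
    print(f"{category}: score {score} ({flag})")
```

Whatever scoring scheme you use, the goal is the same: separate the one-off gripes from the patterns that genuinely warrant a deeper content review.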
Set your learner expectations when they provide feedback
When asking for feedback, you want to set clear expectations about how that data will be collected and analyzed and your priorities in acting on the feedback you receive. Quick fixes for small errors are one thing – and your learners should be thanked for pointing out errata in the course material. Major course changes may take longer, but you will want to reassure your learners that they have been heard and that their feedback is driving longer-term improvement plans.
Above all, don’t frustrate your learners and diminish the value of their feedback by asking them to provide feedback continually if nothing has changed from the last round. It is disheartening to take the time and focus to provide feedback only to see the same misspelling show up when they take their refresher training or do their annual compliance course. You never want your learners to feel that providing feedback is a waste of their time. Acknowledge feedback, say thank you, and confirm that corrections have been made and bigger changes may be coming if warranted.
Learners want to learn while having the best experience possible
Building a corporate culture of ongoing training means keeping learners excited to learn. Their training content must be meaningful, helpful, and informative and keep them engaged with their tasks and roles and the company’s vision for their future. What better way to engage your learners and make them part of your extended eLearning team than to offer them a variety of avenues to provide feedback and then demonstrate that you value what they have to say about your training materials and delivery?
Remember – you can take a bow when the feedback is positive, and you get to roll up your sleeves and improve the training experience when the feedback is more challenging. Don’t take it personally. You asked for their feedback – and it is a gift to you and your team to improve your courses from the grassroots perspective.
Correlate your learner feedback with your learning analytics
While what your learners tell you is only one component of a larger learning analytics strategy, don’t sleep on the information you receive. Involve your learners in the discussion as you put forward changes and solutions. Take the “soft” information you’ve gathered – directly from your learners – and compare it with your “hard” data – your learning analytics. If your organization is gathering learner feedback and collecting training performance metrics, then use both to make data-driven training decisions and watch the value of your training increase. And, when you’re ready, continue your discovery into learning analytics and training ROI with our Learning Hub guide.
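As a simple illustration of what comparing soft and hard data can look like, here is a minimal Python sketch that lines up average post-course satisfaction with average assessment scores by course. The course names, numbers, and threshold values are invented; in practice, this data would come from your survey tool and your LMS exports.

```python
# A minimal sketch of lining up "soft" learner feedback with "hard" learning
# analytics at the course level. All figures below are invented examples.

from statistics import mean, correlation  # correlation requires Python 3.10+

# Average post-course satisfaction (1-5) from your feedback surveys.
feedback_scores = {"Safety 101": 4.6, "Sales Onboarding": 3.1, "Privacy Basics": 3.9}

# Average final assessment score (%) pulled from your LMS analytics.
assessment_scores = {"Safety 101": 88, "Sales Onboarding": 64, "Privacy Basics": 79}

courses = sorted(set(feedback_scores) & set(assessment_scores))
satisfaction = [feedback_scores[c] for c in courses]
performance = [assessment_scores[c] for c in courses]

print(f"Average satisfaction: {mean(satisfaction):.2f}/5")
print(f"Average assessment score: {mean(performance):.1f}%")
print(f"Correlation between the two: {correlation(satisfaction, performance):.2f}")

# Courses where learners are both unhappy and underperforming are the
# strongest candidates for a deeper content review.
for c in courses:
    if feedback_scores[c] < 4.0 and assessment_scores[c] < 75:
        print(f"Flag for review: {c}")
```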
After all, you need to change your team’s behavior to achieve your training objectives and move those precious corporate KPIs. That’s the entire purpose of training – to help your team do their jobs better, more efficiently, and more effectively, to increase productivity and profitability, so everyone in the company can benefit.
Training engagement is key – and nothing is more engaging than feeling like you are part of a conversation by providing feedback to an active listener to improve the training experience for everyone.