The value of feedback from learners

By Scott Granville

I was asked an interesting question recently about whether feedback from learners played a role in our decision to create a sequel drama series (Skippers Pass Homecoming), a first for Chasing Time English. The simple answer is yes, but the timeliness of the question led to some deeper thinking around the collection of feedback, and how important it has become in both our short-term and long-term planning.

Direct feedback from learners is a valuable asset for any course or programme, regardless of scale and scope. There are all sorts of measures, informal and formal, in-class or self-guided, that can be applied to collecting feedback, and by analysing and reflecting on this information, changes for future improvement can be considered and implemented. As a classroom teacher, I have always tried to gather feedback on a regular basis, even in small informal ways such as follow-up verbal questions at the end of a lesson. Where there is any concern about the validity of such responses (students wanting to make the teacher happy rather than providing honest feedback), anonymous written forms are a simple substitute. In any case, a happy and engaged learner is better than a frustrated or unmotivated one.

With our Chasing Time English series packages, we are conscious of balancing an engaging narrative that hooks the audience (learners) in with language input and outcomes that are meaningful and valuable at the target proficiency level. For that reason, feedback from learners has become one of the core resources informing our entire production planning process.

When we released Skippers Pass on FutureLearn in 2021, it was the first time that we had delivered one of our courses directly to learners. Thanks to its strong uptake (in raw numbers), we now have course data from 40,000 users across 179 countries. The feedback can be sorted into three categories:

1) In-course activity response

2) End of week response

3) End of course response

Rather than the small sample sizes we normally work with (class groups or individuals), we can now look at feedback across regular intervals, quantify the overall response to particular activity types, and understand whether the narrative has kept the audience engaged. Without drilling into the analytics in detail, here are several key takeaways from user feedback:

1) Learners enjoyed the opportunity to engage with other learners in discussion forums without the need for educator prompting (over 70% of active learners participated in discussion activities more than once).

2) Activities directly connected to the ‘world’ of the narrative had higher positive response rates than unrelated activities, regardless of type (reading, grammar, vocabulary, listening).

3) Learners enjoy prediction and speculation, with high positive response rates throughout a narrative series, but a majority prefer a definitive conclusion rather than an open-ended one.
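
For readers curious about the kind of tallying behind figures like these, here is a minimal sketch in Python. It assumes feedback has been exported as per-activity records with an activity type and a simple positive/negative rating; the field names and sample values are hypothetical and not the actual FutureLearn export format.

```python
# Minimal sketch: positive response rate by activity type.
# Field names ("activity_type", "rating") and sample records are hypothetical.
from collections import defaultdict

feedback = [
    {"activity_type": "vocabulary", "rating": "positive"},
    {"activity_type": "vocabulary", "rating": "negative"},
    {"activity_type": "discussion", "rating": "positive"},
    {"activity_type": "grammar", "rating": "positive"},
]

totals = defaultdict(lambda: {"positive": 0, "all": 0})
for response in feedback:
    counts = totals[response["activity_type"]]
    counts["all"] += 1
    if response["rating"] == "positive":
        counts["positive"] += 1

for activity, counts in sorted(totals.items()):
    rate = 100 * counts["positive"] / counts["all"]
    print(f"{activity}: {rate:.0f}% positive ({counts['all']} responses)")
```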

The first of these points was perhaps the most surprising, and it went against the preconceptions a number of our team held about the role an educator should play in what is essentially a self-guided course. There is a lot to unpack in that finding, and I will save it for a future post on learning design.

It was the third finding that directly influenced our immediate planning and resulted in a pivot in our most recent production in order to create a sequel. With thousands(!) of learner responses asking for closure for the main character, Emma, we decided to put a scheduled series on hold and shift focus to the sequel to Skippers Pass. It will be interesting to gauge the success of this decision over the next 6-12 months and to see whether putting that feedback into action results in positive uptake of the course.

Without doubt, we are in the fortunate position of controlling the materials creation process, which is what enables such a significant change to occur. Yet regardless of the size of the change required, the important takeaway is that the learner’s voice should always be heard and, when appropriate, included as a valued factor in decision making.