The Riverside Lighthouse Assessment Journey in 2020

2020 was an opportune and vital year to introduce evidence- and outcomes-based AfL for our teachers and students. The COVID-19 pandemic had disrupted lives and livelihoods in Singapore and around the world. Home-based learning affected the nature and extent of teachers’ instruction and support for students. Many educators and parents were concerned about whether students were learning adequately for their school assessments. Yet, given how powerfully assessment drives learning, our school chose to focus on using assessment to support and enhance learning.

Given the imperatives of MOE initiatives from Full Subject-Based Banding to PDLP, it is critical for schools to set a clear and sustainable focus on AfL. We embarked on a bespoke AfL professional development (PD) approach that would enable teachers to examine the efficacy and evidence of their practices, and learn new ways of impacting students’ learning. At the same time, we wanted an approach that could tap on and develop the assessment leadership of our Key Personnel (KPs) and Lead/Senior Teachers (LSTs).

The ambitious nature of our AfL PD required our school leaders, KPs, LSTs and teachers to work closely together. For school leaders, working closely with the HODs provided the opportunity to build the capacity of the HODs and the staff team, and to ensure consistency and connections across the school.

This was our 2020 journey in learning about effective assessment feedback to ensure and enhance learning, told from the perspectives of my colleagues.

Overview and Outcomes (Lin Shaojie, Lead Teacher, Mathematics)

“For the next Lighthouse session, you will bring along the samples of your assessment feedback that reflect…” Teachers scribbled down the requirements of their new assignment at full tilt, as they contemplated ways to adapt to the suggested actions and pondered over the significance of the changes to our current assessment feedback practices.

This was our recurring experience with AfL PD! We pushed ourselves to do the “homework” because we realised that assessment feedback should be doing more for students’ learning and that a number of our feedback practices did not seem to be producing good outcomes.

We faced several challenges when our students did not act on the feedback given. Could our feedback have been made more specific to help them understand? Or were students unwilling to do corrections in the first place? As teachers who want to see better learning outcomes from students’ cognitive engagement with feedback, we need to cater to the learning needs of our students.

Our weekly PD sessions were designed to guide our teachers to look further and beyond their current methods. We wanted to enhance our assessment feedback practices to inspire students to self-regulate and learn on their own. To build a sustainable model, we invited an expert assessment consultant to co-design a series of PD sessions that focused on practical methods teachers could apply immediately in their classrooms. Our mass Lighthouse sessions alternated with department professional learning community sessions, where further discussions and reflections took place in smaller groups. Before each mass session, the consultant would meet the key personnel of each department to discuss how assessment feedback artefacts could be prepared and used to discuss, critique and enhance feedback practices in their subjects. These pre-session consultations allowed early detection and resolution of potential issues, helped identify focal points for discussion, and made every minute of the actual session count.

However, teachers’ AfL PD was not confined to the Lighthouse sessions. Learning tasks were given after each session to provide teachers with opportunities to apply what they had learnt, and to examine students’ responses to their modified feedback practices. The combination of learning from PD sessions and applying feedback principles to their respective contexts between sessions created a continual and sustained period of learning for teachers. It was also an invaluable opportunity for KPs and LSTs to exercise and exemplify assessment leadership in their disciplines.

As we honed our feedback practices continually to respond to students’ learning needs, we learnt how to right-size students’ learning with better alignment to process-level outcomes, thereby increasing the degree of student independence. Assessment for learning helped both teachers and students realise what was important for learning. Likewise, it was important for teachers in Riverside Secondary to realise and reflect on what they had learnt about AfL.

Our enhanced AfL practices directed us to understand how, and how well, students were learning. Likewise, we wanted to know what the experience of learning AfL was like for our teachers, and we opted for a non-evaluative approach.

A survey of teachers’ AfL learning was conducted with four questions (adapted from Stephen Brookfield’s Critical Incident Questionnaire):
  • What was most puzzling to you?
  • What was said that was affirming or helpful?
  • What was something useful you learnt?
  • What would your students like you or your colleagues to do now?
The reflections of our colleagues on their learning were heartening, and provided diverse perspectives and good ideas for implementation. It took courage to accept the limitations of our current practices. The additional effort and time set aside for adjusting to a less familiar feedback system spoke volumes of the teachers’ incremental mindset and newfound motivation to help our students achieve intended learning outcomes.

Integrating Assessment Feedback into Classroom Instruction (Tio Eng Chu, Head of Department, Science)

It was indeed a challenge to grapple with the underlying issues of assessment for learning, and to examine what would work best to ensure that students achieved their intended learning outcomes. This required deep, difficult and systematic thinking, and four questions in particular were helpful in organising our approach:
  1. What do you (teachers) know about the students and the task?
  2. What do you (teachers) do on the students’ work?
  3. What do students do on your (teachers’) feedback?
  4. What do students get from your (teachers’) feedback?
This helped teachers to reflect on their roles in assessment, as well as to refine and clarify how assessment feedback could be integrated with classroom instruction. It dawned on us that feedback is not effective if students do not act on it. However, we could not assume that every student could act on every single piece of feedback we wanted to provide. Prioritisation is key, and hence it is useful to differentiate between mandatory and optional levels of action expected of students (Tan, 2014). On top of that, students must be given sufficient opportunities to demonstrate their learning, as well as their understanding of the teachers’ feedback, both in class during lessons and in individual assignments. The following diagram summarises what teachers can do to help students become more independent in achieving desired learning outcomes.
We also learnt that assessment feedback practices differed according to the disciplinary needs of different subjects. Amidst the diversity of assessment feedback approaches, three key ideas of effective assessment feedback for ensuring and enhancing students’ achievement of learning outcomes were identified:


Science: Short questions are deceptively simple. The complex learning processes need to be analytically unpacked for students. Including questions that reflect specific process outcomes helps students frame their thoughts in answering short questions, and indicates how students may be cognitively engaged with the complexities of science process thinking in assessment feedback.


Mathematics: Effective assessment feedback should also prepare students to become increasingly independent in their learning. Assessment feedback should guide students to make sense of their work without directly specifying their next steps, and create ample opportunities for students to act on the feedback, so as to develop their productive dispositions and proficiency levels. This would develop and sustain their learning over the longer term.


English: Assessment feedback need not be given only after students have produced work. When students’ learning gaps can be anticipated, a pre-writing task works as a form of feedback to pre-empt students’ mistakes and guide them towards improved learning. This form of feedback helps activate students’ schema and keeps them focused on any feedback given previously. It also makes any feedback given after a task more effective.


Here are three examples of assessment feedback practices that illustrate these key ideas in practice.

AfL in Science Process Skills (Haryani Hamidan, Teacher, Chemistry)

A constant challenge for our science students is fully understanding the complex processes and related concepts in short science questions. The following is an example of a deceptively simple science question that merely requires students to ‘fill in the blanks’.
The question above may look easy, as it only requires short answers. However, looks are indeed deceiving: this seemingly simple question requires multiple levels of processing, as shown in the diagram below. When students see ‘potassium’, they will write potassium metal, and when they see ‘silver’, they will write silver metal and silver oxide, without much thought. However, most will end up answering the question wrongly.
There is more than one reason why this question is challenging for many students. They may not have used the flowchart, may not have memorised the solubility of salts, or may have forgotten the acid reactions. Failure at any one of these steps prevents them from answering the question correctly. I believe that when students are aware of the process required to answer the question, they will feel more guided and confident.

Hence, I decided to unpack the thinking process by modifying this deceptively easy question.
In this modified version, questions were added to make explicit the process of answering the final question. It provides an authentic reflection of the thinking processes involved. It also provides me with a tool to identify the specific process-level outcome that is lacking for my students. When students are exposed to such questions often, they gain an idea of how to organise their thoughts when faced with something similar. It was also important for me to understand what students thought about the help being provided.

A focus group discussion (FGD) was carried out to find out how the students felt about the modified assignments. In general, students found them helpful: the added questions guided their approach to the question, made them think more in depth, and helped them focus on how to answer. They also appreciated that such questions helped them identify their own specific learning gaps. At the same time, they worried that they might become overly dependent on those questions and might not be able to think on their own during examinations. As such, they suggested having a final assignment that does not include the added questions. Students also suggested that the guiding questions be made more detailed and specific, as some students may not see the relevance of an added question to the main question.

Having been given an opportunity to share my approach with my colleagues, I gained more suggestions on how to move forward. In the FGD, one student commented that some students may not need the modified assignment, as they were already able to answer the original question correctly. Hence, some teachers suggested providing both versions of the assignment and letting students choose between them; this could serve as a resource for differentiated instruction. The formatting of the modified assignment is also important in encouraging students to answer the added questions. This suggestion was raised because some of the modified assignments did not provide space for students to write their responses. This feedback is helpful for me in crafting my future assignments.

Big Idea: Short questions are deceptively simple. The complex learning processes need to be analytically unpacked for students. Including questions that reflect specific process outcomes helps students frame their thoughts in answering short questions, and indicates how students may be cognitively engaged with the complexities of science process thinking in assessment feedback.

Assessment for Learning in Mathematics (Lin Shaojie, Lead Teacher, Mathematics)
The procedure in focus that week was prime factorisation. My students dutifully found the lowest common denominator with what they had just learnt. Generally, they arrived at 1200, but probably did not understand the significance of that number. The immediate question that came to mind when marking was “What could this student be thinking?”, and the second was “What do I want him to be thinking?”

The fastest way to take away a student’s learning is to directly specify what he needs to do next, i.e. divide 1200 by 150. The other extreme would be merely indicating the correct and wrong parts of the response with a tick and a cross or circle, leaving the student to figure out the problem on his own with near-zero guidance. For Feedback A, I wanted the student to explain the meaning of her final answer. For Feedback B, I tried to convince the student that the final value was not logical, and prompted her to review her solution. My goal was to guide students to identify the misconceptions in their thinking and make sense of what they were writing through clear, quality written feedback, supplemented by teacher-student dialogic feedback and peer-to-peer conversations.
In the subsequent lesson, some students could solve the problem after reading the feedback, while others attempted it but still could not understand the feedback. Hence, I provided the latter with a simpler yet similar problem. Working with smaller numbers helped these students mitigate some challenges pertaining to number sense and computational fluency, while maintaining the same constraints. I believe this helped them develop strategic competence in solving such problems. Virtually all the students made sense of the intermediate answer in the new problem, transferred the understanding to new problems, and increased their productive dispositions with their newfound confidence and motivation.

How do we know that our students have learnt? I believe this happens when they are able to bridge learning gaps through completing corrections with minimal support. We could provide more opportunities for them to transfer their understanding to similar contexts. Nevertheless, in order to fully assess their levels of understanding, I would also need to increase the complexity of the questions continually, and quiz them on the same topic at different junctures across the year.

Big Idea: The key to effective feedback is to guide students to make sense of their work without directly specifying their next steps, and to create ample opportunities for students to act on the feedback, so as to develop their productive dispositions and proficiency levels.

Assessment for Learning in English (Lee Jin Wee Joel, Teacher, English Language)

Target Group: 1NA and 1NT OOS students
Lesson Focus: Tenses

Students with a weak foundation in the English language tend to confuse past and present tenses when writing their essays. This was evident in the skill we were focusing on for the term: personal recounts.

In students’ first-ever personal recount submission, most mixed up their tenses, with the short submission being a mixture of past and present. I was concerned, and I knew I had to address this gap before I could move on to any subsequent form of writing in their four years of secondary school education.

To address this gap, I spent 30 minutes focusing on past and present tenses and subsequently got my students to rewrite their letters. In addition, I got students to underline all the action verbs they could spot, as these are the words they should pay attention to when checking whether their essay was written in the right tense. However, despite this exercise and the lesson on tenses, the majority of the students were still unable to produce a clean piece of writing without mistakes in their tenses. Most students received feedback similar to the one below, which was aimed at being encouraging while reminding them of the feedback they had received prior to this task.
Unlike typical feedback practices, where feedback is given after submission, my students already knew that a recount should be in the past tense, having received two rounds of feedback. I therefore anticipated the problem and designed a pre-writing task as a form of feedback before their writing, instead of waiting for their next submission, which would have been too late.

For their next submission, an essay, students had to fill in this worksheet before writing. The exercise focused on reminding students that they were recounting an event from the past, so their writing should be in the past tense. It also helped them organise their essays.
There were significant improvements in their second submission in terms of organisation and grammar, with the majority of the class now able to submit a relatively clean piece of writing in terms of past and present tense. I attribute this to the constant reminder provided by the pre-writing task, as students referred to it while writing their personal recounts.

I then made it a point to set a pre-writing task, as a form of feedback, for any submission in which I anticipated problems, to remind students of what was expected of them and guide them towards it.

Big Idea: Assessment feedback need not be given only after students have produced work. When students’ learning gaps can be anticipated, a pre-writing task works as a form of feedback to pre-empt students’ mistakes and guide them towards improved learning. This form of feedback helps activate students’ schema and keeps them focused on any feedback given previously. It also makes any feedback given after a task more effective.

Conclusion

The approach to assessment leadership in RSS is intentional and thoughtful. It requires assessment leadership at all levels, with KPs and LSTs working together to lead assessment in their respective departments. Hence, one key aspect of an assessment journey is to strengthen the assessment leadership of the HODs and LSTs, and to support them at the same time.

We are cognisant that having an expert on our assessment journey is not sustainable in the long run. What is critical is for assessment leaders in the various subjects to have the confidence and clarity to lead their departments on the assessment journey, from unpacking concepts of AfL to rethinking their practices. By co-constructing the Lighthouse sessions with the expert and working closely with staff on artefacts (staff’s feedback and students’ actions), both KPs and LSTs delved deeper into their practices and got to know their colleagues’ work better. In turn, this clarified their strategic roles as curriculum leaders and gave them concrete opportunities to develop and support their colleagues.

Changing mindsets is a common challenge in most organisations. Given the successes of past practices and beliefs about learning, it is not easy to convince teachers to change their approach to teaching. With the use of students’ artefacts and information, it was easier to convince teachers to shift their mindsets about assessment.

As assessment leaders who wanted to see improvement, we needed to lead and role-model the change first, by setting clear standards and demonstrating a strong willingness to make appropriate adjustments. This required a common understanding and coordination among school leaders, KPs and LSTs, with the help of the expert consultant. The learners’ disposition displayed by the school leaders in this journey helped create a safe environment for staff to engage in their learning. Together, we could offer coherent assessment leadership and help our colleagues see the value in learning and sharing practices. It was important to move towards a collective understanding that disparate practices are not merely matters of teacher preference, as feedback practices need to meet a minimum standard of effectiveness. Teacher resistance was expected. Nonetheless, as we resurfaced our shared beliefs and goals, and our empathy for struggling learners, our mutual respect and trust in each other helped us see the meaning in adapting to the change.

This video from RSS captures what our teachers have learnt in the journey of assessment. We hope it will encourage your heart, and spur you on to experience, enjoy, and evidence assessment literacy in meaningful ways.