Coherence in Science Assessment Literacy

 by Fung Eng Chu, Riverside Secondary School
     Tan Xing Yu, Riverside Secondary School
     Tan Thiam Soon, Riverside Secondary School
     Lee Min Huey, Riverside Secondary School


Assessment has been the main focus of our professional development sessions (called lighthouse sessions) and Science Department meetings. In 2020, the lighthouse sessions deepened our understanding of what assessment can do for learning. Our journey on assessment continues in 2021 in various ways, with a common department goal of making students more independent in their learning by focusing on process-level feedback.

This is a pivotal change that requires much effort, from reviewing existing assessment tasks to revising feedback comments. Despite the hard work, teachers see the value of such change. As one Science teacher put it:

“I see more value in trying to start working on our feedback with the purpose of enabling our learners to attain the process level. Through the sessions, I see that there is a link between a well-crafted question and how I can enhance the effectiveness of our feedback in helping our learners to attain, at least, the process level.”

With the emerging demand for ICT skills and the need to manage students with different learning needs, teachers attend courses on e-pedagogy, differentiated instruction and blended learning. Often, teachers view these courses in isolation. Hence, it is essential to continue the dialogue with teachers and reiterate that assessment should support and serve the learning needs of students in all our teaching approaches. The following are examples of using assessment to support students towards process-level outcomes in Science.

The implementation of the new Lower Secondary Science syllabus gives us an opportunity to leverage the integrative activity created by Science CPDD to assess students differently using rubrics. Our Science teacher, Xing Yu, elaborates on using rubrics to encourage self- and peer assessment in Secondary One Science this year.

In addition, emerging ICT tools and platforms such as the Student Learning Space (SLS) allow teachers to reconsider the role of technology in designing meaningful assessment tasks, which our Physics teacher, Thiam Soon, shares in detail later.

To give helpful process-level feedback, teachers have to be skilful in designing learning tasks that surface specific learning gaps in students’ answers. Min Huey shares her own experience in the last segment of this article.

Using rubrics in science assessment – Xing Yu

After completing several science topics, we wanted students to apply their learning across topics in an authentic task. We leveraged the integrative activity designed by CPDD in Activity Book 1A. Students were required to write a proposal stating their plans, experimental procedures and an evaluation of their plans for water purification.

We developed rubrics to assess students’ application of their learning to the given task. The main aim of the rubrics was to make expectations clear and to encourage students to be more self-directed in striving for their best. We hoped students would take the initiative to improve their plans by self-evaluating against the rubrics. Annex 1.1 shows part of the first draft of our rubrics. Through our PD sharing and discussion, we realised that we might not achieve our objective with this set of rubrics. For example, students may not understand terms like “scientifically sound”. To improve our rubrics, we referred to the example given by Tan, Salim & Manimala (2020) of how teachers crafted their rubrics by including positive, criterion-specific examples. Those teachers included descriptions in their rubrics to show what they meant by “apt and precise” vocabulary and “conveying messages in a compelling manner” in writing. Taking these ideas, we decided to unpack the term “scientifically sound” using an example from a scenario in which students are required to separate a mixture containing soluble and insoluble solids.

Annex 1.2 shows how we unpacked the term by giving examples of the steps required for the separation, classified into different levels of attainment. We also included comments on the strengths and weaknesses of the examples given. This approach not only gave the students an idea of what is “scientifically sound”; it also prompted us to think through exactly what we expect at each level of attainment. Two concerns with such a detailed rubric are that it takes more time to create and that students may be overwhelmed by the amount of information presented. To keep the work manageable, we unpacked only the one criterion that was hardest to comprehend. For the students, we broke the rubric into two parts, giving only the information necessary for the project at that point. Part one consists of the criteria for planning and peer evaluation, while part two consists of the criteria for conducting the investigation, and for concluding and evaluating the project.

We felt that the detailed rubrics guided the students well in planning their investigations. The majority of the groups provided sound procedures, even though they had different foci for their investigations. The rubrics reminded students to include illustrations to complement their procedures. To score well on creativity, students created diagrams to match the plans they came up with instead of using readily available images. Annex 1.3 gives an example of student work.

Rubrics are useful tools for assessing tasks that allow students to produce a wide variety of work. To encourage students to learn more independently through rubrics, we must be mindful to provide age-appropriate content, with examples where necessary. Doing so also helps us to be more precise about our expectations.

Technology-enabled Feedback – Thiam Soon

With technology so pervasive in our lives, it is natural to explore how it can supplement our students’ learning experience. Technology can enhance learning by providing a different platform for teachers to assess students’ learning.

In my science class, I focused on two Key Applications of Technology (KAT) outlined in e-pedagogy: “Support Assessment for Learning” and “Facilitate Learning Together”. The topic was the Kinetic Model of Matter in Physics.

In “Support Assessment for Learning”, the Student Learning Space (SLS) was used to have students respond to questions via the Interactive Thinking Tool (ITT). Students were given time to submit their responses online, after which the teacher reviewed and commented on the responses verbally in class, providing feedback to the students.

In “Facilitate Learning Together”, the students again used the ITT in the SLS, but this time, instead of the teacher reviewing and providing feedback, the students were tasked to evaluate their peers’ work and give feedback on it. The teacher’s role was to facilitate, guide and monitor the students in the peer review exercise. We hoped this would be a positive step towards students becoming more self-directed and taking charge of their learning.

The diagram below shows how we used the ITT in SLS to focus on giving feedback that improves students’ answers, rather than simply responding to the questions.

For more details, click here to access the full article.

Process-level feedback – Min Huey

In the teaching of Biology, many students struggle to answer lengthy and complicated application questions. This struggle is often masked by a myriad of other problems, such as the inability to unpack the information given in application questions and weak mastery of content knowledge. Process-level feedback helps the teacher identify learning gaps clearly and provide targeted feedback, allowing students to reflect on their work, improve the skills needed to attempt future application questions and attain subject mastery.

Process-level feedback gives students targeted feedback to improve their mastery of the processes involved in attempting a question to show conceptual understanding (Hattie & Timperley, 2007). This helps students attain subject mastery and transferable skills beyond the initial task (Goh, 2021). To give effective process-level feedback, the teacher has to consider the learner’s profile and craft assessment tasks that allow targeted feedback to help students close the learning gap. An assessment task can be designed to collect evidence of students’ learning in a variety of ways, such as drawing and writing in prose.

In the full article here, I elaborate on how I changed my correction process and redesigned the assessment task to give process-level feedback. After marking the assessment task, I noticed three learning gaps and addressed them by redesigning the task for use during corrections. The new correction process allowed me to give process-level feedback targeting the learning gaps (shown in the table below).

In summary, complicated and lengthy Biology application questions can be unpacked to

●      make students’ thinking visible (being intentional with the questions that help to scaffold students’ thinking)

●      allow for targeted feedback (breaking down one lengthy question into many small, targeted questions)

●      enable students to express their understanding through drawings (to help students who struggle with expressing their answers in prose)