
In Conversation with Dr Rachel Goh

 


What prompted/persuaded you to join NIE as a teaching fellow four years ago?

As a newly-minted doctoral graduate, the opportunity to work in a higher education context was a godsend. My original intent was to apply the learning from my doctoral research in facilitating teacher professional development in the context of lesson study. The opportunity to work with NIE faculty enabled me to build my curriculum and assessment expertise and participate in cutting-edge research on assessment feedback in schools, which far exceeded my expectations. 

What were the incidents or experiences that have impacted you the most?

The incidents that have impacted me the most revolved around my interactions with struggling students in the higher degree programme, many of whom were trying to cope with the academic demands as well as manage challenges at work and transitions at home. In sitting beside my students to work out the learning issues with them, I found myself developing greater empathy for learners and being more cognisant of the assessment theory-practice nexus at work. 

How has your philosophy and your pedagogy in assessment changed?

My pedagogy in assessment has become less teacher-directed and more student-driven over time, shaped largely by a deepening belief in the importance of fostering student voice and developing self-regulation. This philosophy, informed by my research experience, fuelled the changes I made in involving students in assessment: co-constructing rubrics, having students make feedback requests, and engaging them in peer- and self-assessment.

What would be your proudest achievement?

My proudest achievement would be the publication of the book ‘Designing Quality Assessment Feedback in Schools’, which I worked on with middle leaders from the MLS assessment feedback elective and teachers from schools. This book project would not have been possible without the encouragement of my mentors, A/P Kelvin Tan and Dr Tay Hui Yong, who also gave invaluable feedback on the manuscript.

What is one thing you would say is most vital for assessment to meet the needs of all learners?

For assessment to meet the needs of all learners, one vital thing is to reframe what is deemed a measure of success. Valuing students’ personal best, where we compare their current performance to their past performances, places a primacy on learners’ individual progress and development. This shift in emphasis can counter the pressures of normative comparison and ensure that the intended learning outcomes are achieved by each and every learner under our care, including those struggling to learn.

What are your future plans for, and with, assessment?

I hope to promote teacher inquiry into assessment practices in schools to challenge ourselves to ensure that all learners benefit adequately from assessment. 

 

 


Big Ideas of Effective Feedback

by Rachel Goh 

This is an excerpt from the introductory chapter of the book “Designing Quality Assessment Feedback Practices in Schools”.

“Class, I’ve marked your essays and given you comments,” announces the teacher. Restless murmuring fills the classroom as students collect back their assignments in dreary anticipation of corrections to be done. “Luckily, I passed,” exclaims a student, heaving a sigh of relief. A few students scrutinize their papers, seemingly trying hard to decipher the teacher’s handwriting. Others browse quickly through the written comments, satisfied that they’ve understood the teacher’s feedback.

How would you want your students to receive and respond to feedback comments?

Knowing what feedback should do for students’ learning hinges on understanding its role in Assessment for Learning (AfL). AfL has been defined in the MOE Assessment Glossary as “assessment that supports teaching and learning with specific use of learner-centred approaches and strategies. It is primarily used for ensuring (emphasis mine) that the intended learning outcomes are achieved by students” (Ministry of Education Singapore, 2017).

This article seeks to clarify what feedback should do for students’ learning by introducing three Big Ideas of effective feedback:
  1. Effective feedback should help students find and fix their learning gap.
  2. Effective feedback should make students want and be good enough to fix their gap.
  3. Effective feedback should persuade students to go beyond fixes and gaps.
In rethinking what feedback should do for student learning based on the three Big Ideas, I would like to offer a definition of effective assessment feedback practice: It is not about teachers unilaterally giving feedback to students. It is not about telling students what their weaknesses are without reservations. It is not about getting students to do corrections. Effective assessment feedback practice is ultimately about personally persuading and convincing each learner of the value, and values, of learning your subject discipline.

What is your idea of effective assessment feedback practice? Read the following examples and pen down what would make the feedback practice more effective.

Example 1.1

Subject/Topic/Level: Science/Food chains and food web/Primary 6
Contributed by Teo Guat Soon

Part A: Description of assessment challenge

From the analysis of students’ responses, two issues were observed. For part (a), students were not able to give an explanation to support their choice of a food producer. For part (b), students failed to state what “affected” meant and did not distinguish between the terms “main food source” and “only food source” in their reasoning.

Part B: Description of assessment feedback practice

The teacher collected the students’ assignments and selected a few responses that were representative of student misconceptions and/or poor conceptual reasoning. In the next lesson, she gave each student a copy of the collated unmarked samples of anonymised students’ answers. As part of a whole-class discussion, she got students to compare the selected samples and notice wrong concept(s) or wrong term(s) used, and the evidence of a lack of conceptual reasoning. As the teacher elicited and consolidated the learning points, the students took notes of their specific gaps on the worksheet. Finally, the teacher got the students to work on their refined answers. The teacher marked the refined answers with ticks to annotate the gaps that were closed, and with crosses and cues to highlight gaps that students still needed to address. Example 1.1.2 illustrates the feedback provided.

The teacher had moved away from the typical practice of marking students’ assignments, going through the answers in class, and getting students to copy down the model answer as corrections. She instead took the effort to select weaker and stronger exemplars of students’ work and had them make judgements on how well the answers addressed the question requirements. Unpacking the criteria of good performance also enabled students to compare their own answers against these criteria and address their specific learning gaps. This example illustrates a critical difference between giving feedback and feedback as practice. Feedback as practice moves from the act of the teacher telling students all their mistakes toward the co-construction of knowledge through showing and concluding.

Big Idea 1: The key to effective feedback is helping students find and fix their learning gap.
What should feedback do for student learning? Assessment feedback should help all students find and fix their learning gap. Assessment feedback should help develop students’ capacity to make judgement and take requisite action. Helping students find and fix their learning gap will ensure that they develop the capacity to make judgement and utilize feedback for improvement.


Example 1.2

Subject/Focus: Chinese Language/Spelling
Contributed by Tay Choon Hong

Part A: Description of assessment challenge

The motivation to learn Chinese can be low for some students because their preferred mode of communication is English. The strokes of Chinese characters are also relatively difficult to master. Some students are easily content with what they have achieved and are not motivated to put more effort into their work. What should the teacher do? (a) Tell students what their weaknesses are without reservations, or (b) tell students what they are capable of achieving. Which is the better choice for motivating students to improve?

Part B: Description of assessment feedback practice

The answer to the above question about the teacher’s necessary action seems pretty obvious. The teacher was cognizant that feedback comments could influence students’ self-perception of their competence and shape their beliefs about their learning ability. It led him to want to practise positive feedback so that students would be more willing to act on the feedback. Examples 1.2.1 and 1.2.2 show the teacher’s comments to a student on his performance across two spelling tests. He had noticed from previous spelling tests that the student tended to make careless mistakes in writing the strokes of Chinese characters, so he wrote a comment asking the student to see him to address this concern, which the student eventually did. The teacher continued practising positive feedback in the next spelling test, as shown in Example 1.2.2, by encouraging the student to continue to apply the method taught so that he could achieve his best performance.

The teacher reflected that in his past feedback practice, there was a tendency to focus on the weaknesses of students instead of their potential. Adopting a formative assessment mindset helped in shifting attention to what students could potentially do. While acknowledging that writing such comments can be time-consuming, the investment has yielded benefits for students in the form of greater motivation to learn. The teacher described what has been helpful in practising positive feedback: (a) thinking from the perspective of the learner: how are they likely to react to the written or verbal feedback?; (b) including comments with a positive connotation to try again instead of merely directing students to do corrections; and (c) giving students a concrete opportunity to act on the feedback.

Big Idea 2: The key to effective feedback is making students want and be good enough to fix their gap.
What should feedback do for students’ learning? Assessment feedback should make all students want and be good enough to fix their learning gap. Assessment feedback should build students’ motivation and ability to use feedback to enhance learning for the long term. Making students want and be good enough to fix their gap will build their will and skill in receiving and responding to feedback for sustainable learning.


Example 1.3

Subject/Topic/Level: Mathematics/Linear Law/Secondary 3
Contributed by Lee Chin Hock

Part A: Description of assessment challenge

Procedural understanding is required in the learning of Linear Law in mathematics, as it is in other subjects, for example process skills in Science. Teachers will usually teach the concept and then use examples to demonstrate the steps to solve the questions. However, some students still encounter difficulty in solving questions involving procedural understanding. In this case, the teacher noticed three common mistakes that students make in answering questions involving Linear Law: (a) students encounter difficulty in joining all the points to plot the linear graph when most points are not nicely collinear; (b) students do not extend the line to cut the vertical axis and, as such, are unable to find the Y-intercept; and (c) students forget to replace Y and X with the actual labels of the axes; instead, they assume Y = y.

Part B: Description of assessment feedback practice

The teacher wanted to help students develop a checklist to guide their procedural thinking. By then, students were familiar with the conversion of a linear form to a non-linear equation. The learning focus was to help students with the conversion of a non-linear form to a linear equation. He began at the feed-up stage, using a question on converting a linear form to a non-linear equation to help students recall the procedural steps. He elicited students’ responses to draw up a checklist for the procedure. This helped students understand what a good checklist would involve. Using another question, the teacher demonstrated the steps involved in converting a non-linear form to a linear equation. Students were asked to create a checklist of the steps involved on their own. They could refer to the first checklist as needed. Selected students were then invited to present their checklists to the class for feedback. Through questions posed by the teacher as feedback, the presenting students were able to revise their checklists, and the ensuing class discussion also prompted other students to compare their checklists with those presented to identify and close the gaps in their procedure. A student-generated checklist is shown in Example 1.3.1.
Later, an opportunity to transfer their learning to another topic was created. Students were asked to create a checklist on plotting a cumulative frequency curve. By then, they were able to do it independently without additional instructional scaffolding and a combination of peer feedback and teacher feedback could be employed.

Big Idea 3: The key to effective feedback is persuading students to go beyond fixes and gaps.
What should feedback do for students’ learning? Assessment feedback should persuade students to go beyond corrections in moving their learning forward. Assessment feedback should develop students to become self-regulated learners, capable of monitoring their learning in the long term. Persuading students to go beyond fixes and gaps will help develop their self-regulation for learning in and beyond the classroom.

How did the three Big Ideas connect-extend-challenge your thinking on what would make the feedback practice more effective? Share your thoughts with us in the AFAL community on this Padlet https://padlet.com/AFAL/BigIdeas

Focus

E-Pedagogy in support of AfL and AFAL: What does it do for all learners?
by Rachel Goh 



The theme for this issue is E-Pedagogy (E-Ped) in support of AfL and AFAL. While various definitions of E-Ped have been offered (e.g., Baldins, 2016), I prefer the conceptualisation of E-Ped as "approaches to teaching that utilize the affordances of digital information and communication technologies” (Way, 2009). This view contrasts with a narrower conception of E-Ped as an assemblage of teaching strategies for E-learning.

There is an ongoing debate about whether E-Ped is a branch of pedagogy (Baldins, 2016), understood as educational practice informed by a particular learning theory (e.g., behaviorism, cognitivism, social constructivism), or is “a radically different vision of pedagogy based on soft skills and digital literacies” (Livingstone, 2012, p. 9). Beyond such a binary framing, a more generative way of theorizing is perhaps to consider E-Ped as reflexive pedagogy: one that draws on different learning theories, with decisions about which theory is most apt flowing from judgements about the intended learning outcomes (Kalantzis & Cope, 2020; James, 2006).

In this regard, a principled approach to E-Ped would support the ambitious views of employing E-Ped intended “to accelerate and deepen learning by making it more active and personalized” (MOE, 2020a) and “for active learning that creates a participatory, connected and reflective classroom to nurture the future-ready learner” (MOE, 2020b). If E-Ped is to be employed to deepen students’ learning, it would not be difficult to make the connection to how it hinges on supporting AfL with the ultimate goal of getting students “to take ownership by playing an active role in the process of learning in school and beyond” (Leong, Ismail, Tay, Tan, & Lin, 2019).

So, how can E-Ped support a learner-centred AfL process for all learners?

Watch this video for an overview of the discussion on E-Ped in support of AfL and AFAL.



The COVID-19 pandemic has thrust us into home-based learning at a time when face-to-face interactions are constrained. While the best of technological tools cannot fully replicate the learning conditions afforded by in-person human interactions, digital affordances, when harnessed appropriately, can enhance learning and enable different meaningful learning experiences, in and beyond the physical classroom. Drawing from published literature and school practices, examples will be presented on how digital technologies have been harnessed to elicit evidence on where students are in their learning, engage them in quality feedback, and elevate their role in assessment.

We begin by describing three technology-enabled strategies that may be employed separately or in combination at the substitution or augmentation levels to enhance AfL (Puentedura, 2013). Then, how technology can be harnessed to modify task design to enable deep learning will be discussed. Finally, an example will be presented to illustrate the affordances of technology in redefining student learning, specifically in enabling the development of student voice.

The three technology-enabled strategies that are of interest are the use of: (1) online knowledge surveys, (2) online student-generated questions and peer responses, and (3) electronic reflective journals.

  • Online knowledge surveys are employed to elicit information on student understanding, the evidence from which is used formatively for course design/redesign. A knowledge survey consists of a set of questions that cover the content of the course, which students answer by responding to a rating scale of their confidence to respond with competence to each question (Bahati, Fors, Hansen, Nouri, & Mukama, 2019). Knowledge survey practices can serve formative assessment purposes by giving students opportunities to self-assess their understanding and learning gaps at key junctures as the course progresses. While a knowledge survey, in and of itself, may be a poor indicator of learning, gains in student self-efficacy were reported when comparing pre- and post-instruction self-assessment surveys (Bowers, Brandon, & Hill, 2005; Clauss & Geedey, 2010). The implication for tech-enabled self-assessment to support all learners is the need to raise student awareness of the different resources they can tap to close the identified learning gaps, for example, revising self-study materials, engaging in peer/teacher dialogue to clarify understanding, etc.
  • Getting students to generate questions related to the learning material and to respond to one another’s questions is a way of activating students as learning resources for one another (Wiliam & Leahy, 2016). The feature article on the practice in Juying Secondary School provides an illustrative case of peer assessment enabled through the affordances of Flipgrid in the context of blended learning for the teaching of oracy in Ms Lau Jia Yun’s classroom. Click here to view the article and video.
  • Journals, enabled by technology such as Google documents, can offer a digital learning space for student reflection beyond the limits of physical records. When well-designed, journal prompts and tasks help students reflect on critical learning incidents and/or learning interactions over a given period of time (Thorpe, 2004). Electronic reflective journals, when used in support of AfL, can be used to prompt students to pose questions to elicit teacher feedback on their written work. This can help teachers offer more focused feedback and gain insights on student thinking. The affordances enabled through in-text comments allow students to respond to the teacher’s questions/comments, create a digital space to extend dialogic feedback, and provide a reference for future work.
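To make the pre- and post-instruction comparison behind knowledge surveys concrete, here is a minimal sketch of how confidence gains per question might be computed. The function name and data shapes are illustrative assumptions, not taken from Bahati et al. (2019) or any other cited study.

```python
# Illustrative sketch: comparing pre- and post-instruction confidence
# ratings from a knowledge survey to flag questions where students'
# self-assessed learning gaps remain. (Hypothetical helper; the rating
# scale and dict layout are assumptions for illustration.)

def confidence_gains(pre, post):
    """pre/post: {question_id: [confidence ratings, e.g. on a 1-3 scale]}.

    Returns {question_id: mean gain}. Low or negative gains flag content
    to revisit, e.g. through self-study materials or teacher dialogue.
    """
    gains = {}
    for question in pre:
        mean_pre = sum(pre[question]) / len(pre[question])
        mean_post = sum(post[question]) / len(post[question])
        gains[question] = round(mean_post - mean_pre, 2)
    return gains
```

A teacher could sort the resulting dictionary by gain to decide which topics warrant follow-up dialogue before the course moves on.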

In the study by Bahati et al. (2019), in which 109 pre-service teachers were surveyed, the results showed that the online knowledge survey was the e-assessment strategy that participants were most satisfied with in terms of both quality of engagement and quality of feedback, followed by electronic reflective journals and online student-generated questions. The study found no relationship between the students’ scores and learner satisfaction with aspects of blended learning, unlike another study (Chitkushev, Vodenska, & Zlateva, 2014). While the link between learner performance and learner satisfaction with the elements of blended learning is not definitive, the affordances of E-Ped to promote greater student agency and ownership of their learning lend weight to the curriculum imperative for using E-Ped to support all learners.

Beyond the inclusion of one or more technology-enabled strategies at the substitution or augmentation levels (Puentedura, 2013), technology can be harnessed to modify task design in support of AfL. In a study involving 410 first-year undergraduates in a psychology course, an online cognitive assessment tool (OCAT) was designed to employ different cognitive learning strategies to enable students to experience deep approach learning (DAL), as opposed to adopting a surface approach to learning (SAL) characterized by rote memorization. The OCAT, which takes a multiple-choice format, begins with a free recall response, where students can type as much information as they want in a dialogue box, to enable active retrieval and help them experience “prime associations” (Shaw, MacIsaac, & Singleton-Jackson, 2019, p. 128). The OCAT offers retrieval cues for incorrect responses and second opportunities to answer questions for fewer marks, and ends with explanations for both correct and incorrect responses as immediate feedback. The study found that students, regardless of whether they were learning-oriented (LO) or grade-oriented (GO), had the highest level of engagement with the second-opportunity feature, but only students who were high LO took advantage of the paired retrieval cues, such as taking time to read a text or review a video, to gain a deeper understanding of the material. This study illustrates the affordances of technology in enabling “repetitive attempts, retrieval cues and immediate feedback” (Shaw et al., 2019, p. 139) in transforming online tasks to support AfL. The implication for E-Ped in supporting all students, regardless of their academic orientations, is to raise student awareness of the educative potential of the built-in cognitive learning features of online assessments to help them make the transition into deep approach learning.
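The attempt-cue-feedback flow described above can be sketched in code. This is a minimal illustration of the design logic, not the actual OCAT of Shaw et al. (2019); the item fields, mark values, and function name are hypothetical.

```python
# A minimal sketch of the OCAT-style flow described in the text:
# first attempt at full marks; if incorrect, a retrieval cue and a
# second opportunity at reduced marks; an explanation as immediate
# feedback in all cases. All names and values here are illustrative.

from dataclasses import dataclass

@dataclass
class Item:
    prompt: str
    options: list
    answer: int        # index of the correct option
    cue: str           # retrieval cue shown after an incorrect first try
    explanation: str   # immediate feedback shown at the end

def score_attempts(item, first, second=None, full=1.0, reduced=0.5):
    """Return (marks, feedback_shown) for up to two attempts."""
    feedback = []
    if first == item.answer:
        feedback.append(item.explanation)
        return full, feedback
    feedback.append(item.cue)          # retrieval cue after an incorrect response
    if second == item.answer:
        feedback.append(item.explanation)
        return reduced, feedback       # second opportunity earns fewer marks
    feedback.append(item.explanation)  # explanation is given either way
    return 0.0, feedback
```

The key design choice mirrored here is that feedback is never withheld: even a student who fails both attempts receives the cue and the explanation, so the task remains educative rather than purely evaluative.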

Try this simple OCAT designed using Google Forms. Experience its affordances in supporting AfL through a combination of knowledge survey, retrieval cues, second tries, and immediate feedback.

In the final example, I want to present a case of how E-Ped transforms/redefines students’ learning experience to enable the development of student voice. The use of audio/video technology to complement written feedback is not new. What is gaining traction is the use of screen-casting technology to record verbal feedback with live annotations on student work which enables teachers to engage students in feedback beyond the limits of in-person interaction.

Keen to use screen-casting for tech-enabled feedback? Visit this link for curated how-to tutorials on using Zoom and Google Meet for screen-casting.
https://sites.google.com/view/assessment-literacy-for-all/home/AfL/Engaging


Minimally, the screen-cast video recording compels students to watch/listen to the feedback. The use of technology-enabled feedback in itself does not guarantee that students will take up the feedback. Opportunities to act on feedback need to be orchestrated by designing follow-up actions expected of students, be it in revising their writing or working on a parallel task.

Beyond harnessing digital technology to capture feedback, the study by Van der Kleij, Adie, and Cumming (2017) explored the use of video technology in enabling student voice in assessment feedback. Students were involved in video-stimulated recall of feedback conversations with their teacher; through viewing videos of these prior feedback interactions, they were able to self-reflect on their involvement in the feedback process and on what they could or should have clarified, raised or discussed in the conversation. The study demonstrated the affordances of technology in elevating the role of students in assessment by encouraging them to make their voices heard and allowing them to participate in feedback as a dialogic practice, as opposed to a teacher monologue.

Beyond the “how to”, there are issues that are more intractable. On equity grounds, whether all students have access to such technology-enabled learning is a fundamental concern (Klenowski, 2015). In the case of HBL in Singapore and elsewhere, “access is a far more complex issue than mere provision of facilities” (Furlong et al., 2000, p. 94). The availability of computers, reported as a machine-to-student ratio, does not necessarily mean genuine access for all learners. Provision for school use does not always mitigate low access at home (Facer & Furlong, 2001), which is a current concern in uplifting all students.

In this respect, the second feature article on school practice offers insights from parent surveys on their perspectives of the challenges and affordances of HBL. From their lived experience during the circuit breaker, we can draw important implications for orchestrating the necessary conditions to better support learning on the home front when in-person interactions in schools are constrained. Click here to read the article.

Beyond the practical considerations of HBL, we need to begin a dialogue on making E-Ped-supported AfL accountable for all learners in our specific contexts. How should we go about “ensuring that intended learning outcomes are achieved by ALL students” (MOE, 2019)? What would E-Ped in support of AfL entail for the least, the last and the lost amongst our learners?

Suggested citation:

Goh, R. (2020). E-Pedagogy in support of AfL and AFAL: What does it do for all learners? Assessment for All Learners (AfAL) Bulletin, October 2020.


References:
Bahati, B., Fors, U., Hansen, P., Nouri, J., & Mukama, E. (2019). Measuring learner satisfaction with formative e-assessment strategies. International Journal of Emerging Technologies in Learning, 14(7), 61–79. https://doi.org/10.3991/ijet.v14i07.9120

Baldiņš, A. (2016). Insights into e-pedagogy concept development. Procedia-Social and Behavioral Sciences, 231, 251-255.

Bowers, N., Brandon, M., & Hill, C. D. (2005). The use of a knowledge survey as an indicator of student learning in an introductory biology course. Cell Biology Education, 4, 311–322.

Chitkushev, L., Vodenska, I., & Zlateva, T. (2014). Digital learning impact factors: Student satisfaction and performance in online courses. International Journal of Information and Education Technology, 4(4), 356.

Clauss, J., & Geedey, K. (2010). Knowledge surveys: Students’ ability to self-assess. Journal of the Scholarship of Teaching and Learning, 10, 14–24.

Facer, K., & Furlong, R. (2001). Beyond the myth of the ‘cyberkid’: Young people at the margins of the information revolution. Journal of Youth Studies, 4(4), 451–469.

Furlong, J., Furlong, K., Facer, K., & Sutherland, S. (2000). The national grid for learning: A curriculum without walls? Cambridge Journal of Education, 30(1), 91–110.

James, M. (2006). Assessment, teaching and theories of learning. In J. Gardner (Ed.), Assessment and learning (pp. 47–60). London: Sage.

Kalantzis, M., & Cope, B. (2020). Introduction: The Digital Learner–Towards a Reflexive Pedagogy. In Handbook of Research on Digital Learning (pp. xviii-xxxi). IGI Global.

Klenowski, V. (2015). Fair assessment as social practice. Assessment Matters, 8, 76-93.

Leong, W. S., Ismail, H., Tay, H. Y., Tan, K., & Lin, R. (2019). Adopting learner-centred AfL process. [Brochure]. Singapore: Author.

Livingstone, S. (2012). Critical reflections on the benefits of ICT in education. Oxford Review of Education, 38(1), 9–24.

Ministry of Education Singapore. (2020a, August 19). Infosheet on SkillsFuture for Educators (SFED). https://www.moe.gov.sg/docs/default-source/document/media/press/2020/infosheet-on-SFEd.pdf

Ministry of Education Singapore. (2020b, September 15). Singapore Learning Designers Circle e-Bulletin Term 3/2020. https://drive.google.com/file/d/1HcgA4jvWTf2ZxVwXhtlensM4n4RCdwyc/view

Ministry of Education Singapore. (2019, June 20). Assessment concepts in assessment portal. Abstract retrieved from MOE Singapore intranet website OPAL.

Puentedura, R. R. (2013, May 29). SAMR: Moving from enhancement to transformation [Web log post]. Retrieved from http://www.hippasus.com/rrpweblog/archives/000095.html

Shaw, L., MacIsaac, J., & Singleton-Jackson, J. (2019). The efficacy of an online cognitive assessment tool for enhancing and improving student academic outcomes. Online Learning, 23(2), 124–144.

Thorpe, K. (2004). Reflective learning journals: From concept to practice. Reflective Practice, 5(3), 327–343.

Van der Kleij, F., Adie, L., & Cumming, J. (2017). Using video technology to enable student voice in assessment feedback: Video, student voice and assessment feedback. British Journal of Educational Technology, 48(5), 1092–1105. https://doi.org/10.1111/bjet.12536

Way, J. (2009). Emerging E-Pedagogy in Australian Primary Schools. In Leo Tan Wee Hin & R. Subramaniam (Ed.), Handbook of Research on New Media Literacy at the K-12 Level: Issues and Challenges (pp. 588–606), London: IGI Global.

Wiliam, D., & Leahy, S. (2016). Embedding formative assessment. Hawker Brownlow Education.