Beyond testing accommodations, learners could also be provided with a simple audio or video explanation of assessment instructions to enhance their comprehension of the assessment criteria. Clear and explicit assessment criteria, when learners are closely guided by them, could also promote greater independence and ownership of learning.
Offering choices in formative assessment tasks such as quizzes, or encouraging independent contributions in online class discussion forums or on digital notice boards, are examples of formative assessments that can be embedded within instruction to add variety, entice learner participation, enable learners to monitor their own learning and develop a supportive community. Summative assessment tasks could take the form of a poster design, journal writing, a mini TED Talk or a video demonstration of a skill, inspiring greater intrinsic motivation for learners to harness the best of what they have learnt and invest effort accordingly. Providing options and allowing students to make choices about assessment conditions are therefore not only plausible ways to engage students in more optimal performance, thereby minimising differential threats; they may also help learners find relevance between what they have learnt and the rich experiences they bring to, and gain from, the process of learning.
A valid question that may be on educators’ minds is: if learners’ emotions are so important and should be considered in assessments, how can they best be accounted for to obtain accurate estimates of learner knowledge, understanding, skills and strategies? One way to recognize the effects of emotion on performance is to measure emotions themselves as part of a more comprehensive approach to assessment. Learner engagement in assessments could be extended to include their written or oral feedback at the end of lessons or assessments, as a gesture of offering choices and extending a more agentic role in the learning and assessment process (Rose et al., 2018, p. 171). Taking into account students’ interests and experiences with assessments is an important step towards more equitable instruction (Shepard, Penuel & Davidson, 2017).
Apart from giving students options as a strategy to pique their interest, it is equally vital to address potential barriers so as to help them manage their emotions and sustain motivation.
Alleviate potential obstacles to engagement in assessments
Although learners’ failure to achieve desired learning outcomes, as demonstrated in their assessment results, could be attributed to inherent learning difficulties or gaps, it could also indicate a lack of engagement. Many factors can affect learners’ engagement with assessments. For example, learners with SpLDs who have weak memory or processing skills would appreciate instructions and questions that are brief, simple and direct, rather than lengthy sentences littered with sophisticated jargon, as suggested by a student with specific language difficulties below.
Excerpt 5: Suggestion by a student on how assessment items can be made more accessible to learners with SpLDs
Students with dyslexia struggle to decode words and to retrieve phonics concepts, spelling and grammatical rules when editing a passage containing spelling or grammatical errors. They will be less frustrated by cognitive overload if they are given options to choose from, as suggested by the student below.
Excerpt 6: Suggestion by a student on how assessment items can be made more accessible to learners with SpLDs
Consider also how a student with selective mutism would perform under the typical environment and conditions of a school or high-stakes oral examination, which require him or her to read an unseen passage and to engage in conversation with examiners he or she has never met before. Such assessment methods assume that all learners are able to speak on demand. Would the results obtained accurately reflect the student’s ability to articulate his or her thoughts and ideas? Alternative assessment methods should be explored in order to present more accurately what students with significant learning difficulties know and can do (Frey & Gillispie, 2018). In this regard, the potential of assistive devices or technological tools could be harnessed to enable such learners to respond optimally in assessment environments. Technology could provide viable means not only to help students cope, but also to leverage their interests in order to better engage them in assessments.
Leverage technological tools
Fostering better student engagement in assessments is a learning process for educators. One way to begin is by conducting a digital survey or online poll on platforms such as Kahoot, Padlet or Mentimeter at the end of lessons to find out the kinds of tasks or activities learners have found, or would find, most appealing. Technological tools could also be used astutely as alternative platforms for learners to demonstrate and apply knowledge. A number of schools have explored blogging, vlogging and student podcasts as assessment options. Tasks could also be varied according to levels of challenge or difficulty, without modifying the original construct. Success criteria should be made clear and explicit so that learners can self-monitor the achievement of learning goals and task accomplishment.
Digital tools can also be harnessed to monitor learner engagement in reading and reading comprehension activities. One example we can take inspiration from is Udio, a web-based tool designed by the Center for Applied Special Technology (CAST, 2011) to provide struggling readers with levels of reading challenge sufficient to motivate and develop reading interest and comprehension. Potential frustrations with word-level decoding are minimised by making a Text-to-Speech (TTS) function and an online dictionary available. Students can choose reading materials that interest them and are directed to an emotional response screen upon completing the reading activity. Teachers can then observe students’ emotional engagement with the texts through these affective responses, which provide a valuable basis for subsequent conversations or discussions about the text and the reading activity (CET, 2016; Rose et al., 2018). The potential of this application could perhaps be explored in the Student Learning Space (SLS) platform accessed by students and teachers in Singapore schools.
Conclusion
Providing flexibility and options in assessments may be a mammoth task, requiring effort in terms of time, cost, logistics and manpower, as well as a major paradigm shift at the systemic level; however, it can begin in small steps and can be explored and implemented in schools. The 21st Century Competencies Framework clearly places a strong emphasis on the holistic development of learners (MOE, n.d.). Traditional and standardised examinations can evaluate only specific skill sets out of the whole range of cognitive and affective abilities underlying learner performance. New thinking and more flexible approaches are required to cater to students with distinctive learning needs.
Assessment is a process of discovery about our learners, and it should extend beyond evaluating what they know or do not know. Cognitive and affective factors beyond this realm of ‘knowing or not knowing’ are just as capable of influencing school outcomes and, ultimately, growth and success beyond the classroom, yet they tend to be overlooked. Recognising the value of these factors in determining learner engagement and performance in assessments is important in guiding instruction that aims to close achievement gaps, and in designing assessment measures that best address the learning challenges of learners with SpLDs. Assessments should seek to enable these learners to showcase their knowledge and potential, instead of further disabling them.
References
American Educational Research Association (AERA), American Psychological Association, & National Council on Measurement in Education. (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
CAST. (2011). Universal design for learning guidelines version 2.0. Wakefield, MA: Author.
DAS. (2020). Other specific learning differences. Retrieved June 27, 2022, from https://www.das.org.sg/about-dyslexia/what-is-dyslexia/other-specific-learning-differences.html
Della Sala, S., & Cubelli, R. (2007). “Directional apraxia”: A unitary account of mirror writing following brain injury or as found in normal young children. Journal of Neuropsychology, 1(1), 3–26.
Duncan, H., & Purcell, C. (2017). Equity or Advantage? The effect of receiving access arrangements in university exams on Humanities students with Specific Learning Difficulties (SpLD). Widening Participation and Lifelong Learning, 19(2), 6-26.
Elliott, S. N., Kurz, A., & Schulte, A. (2015). Maximizing access to instruction and testing for students with disabilities: What we know and can do to improve achievement. Smarter Balanced Assessment Consortium. UCLA: McGraw Hill.
Every Student Succeeds Act (ESSA) of 2015, Pub. L. No. 114-95 (2015).
Freire, P. (2009). Chapter 2 from Pedagogy of the Oppressed. Race/Ethnicity: Multidisciplinary Global Contexts 2(2), 163-174. https://www.muse.jhu.edu/article/266914.
Frey, J. R., & Gillispie, C. M. (2018). The accessibility needs of students with disabilities: Special considerations for instruction and assessment. In Handbook of Accessible Instruction and Testing Practices (pp. 93-105). Springer, Cham.
Ganschow, L., & Sparks, R. L. (2000). Reflections on language study for students with language learning problems: Research, issues and challenges. Dyslexia, 6, 87–100.
Ganschow, L., Sparks, R. L., & Javorsky, J. (1998). Foreign language learning difficulties: An historical perspective. Journal of Learning Disabilities, 31, 248–258.
Ivcevic, Z., & Brackett, M. A. (2014). Predicting school success: Comparing conscientiousness, grit, and emotion regulation ability. Journal of Research in Personality. http://ei.yale.edu/publication/predicting-school-success-comparing-conscientiousness-grit-emotion-regulation-ability-2/
Kettler, R. J. (2012). Testing accommodations: Theory and research to inform practice. International Journal of Disability, Development and Education, 59, 53–66.
Plattner, H. (2013). An introduction to design thinking. Institute of Design at Stanford, 1-15.
Della Sala, S., Calia, C., De Caro, M. F., & McIntosh, R. D. (2014). Transient involuntary mirror writing triggered by anxiety. Neurocase: The Neural Basis of Cognition.
Meyer, A., Rose, D. H., & Gordon, D. (2014). Universal design for learning: Theory and practice. Wakefield, MA: CAST Professional Publishing.
Ministry of Education, Singapore (n.d.) 21st Century Competencies. Retrieved from https://www.moe.gov.sg/education-in-sg/21st-century-competencies
Ministry of Education. (2011). Psychoeducational assessment and placement of students with special educational needs: Professional practice guidelines [PDF file]. Singapore: Ministry of Education. Retrieved from https://www.moe.gov.sg/docs/default-source/document/education/specialeducation/files/professional-practiceguidelines.pdf
Mokhtar, F. (2019, March 5). GCE O- and N-Level exams to be replaced by new national common exam in 2027. Today Online. https://www.todayonline.com/singapore/gce-o-and-n-level-exams-be-replaced-new-national-common-exam-2027
Nelson, J. M., Lindstrom, W., & Foels, P. A. (2015). Test anxiety among college students with specific reading disability (dyslexia): Nonverbal ability and working memory as predictors. Journal of Learning Disabilities, 48(4), 422–432.
Piechurska-Kuciel, E. (2010). Reading anxiety and writing anxiety in dyslexia: Symptomatic and asymptomatic adolescents. Advances in Research on Language Acquisition and Teaching: Selected Papers, GALA, 375–386.
Rose, D. H., Robinson, K. H., Hall, T. E., Coyne, P., Jackson, R. M., Stahl, W. M., & Wilcauskas, S. L. (2018). Accurate and informative for all: Universal Design for Learning (UDL) and the future of assessment. In Handbook of accessible instruction and testing practices (pp. 167-180). Springer, Cham.
Rose, D. H., & Gravel, J. W. (2013). Using digital media to design student-centered curricula. In R. E. Wolfe, A. Steinberg, & N. Hoffmann (Eds.), Anytime, any-where: Student-centered learning for students and teachers (pp. 77–101). Cambridge, MA: Harvard Education Press.
Shepard, L. A., Penuel, W. R., & Davidson, K. L. (2017). Design principles for new systems of assessment. Phi Delta Kappan, 98(6), 47-52.
Steele, C. M. (1997). A threat in the air: How stereotypes shape intellectual identity and performance. American Psychologist, 52(6), 613.
Please click on the sidebar to access these remaining articles
- In Conversation with Ms P. Durka Devi, Teaching Fellow, Learning Sciences and Assessment, NIE.
- In Conversation with Mr Tan Ken Jin, School Staff Developer, Bartley Secondary School.
- Assessment Practices with academically low progress learners in a Singapore Primary School by Mr Jerome Chong
- Recommended Publications