The Holistic Education Alternative Learning (HEAL) Approach to Assessment Stress

Towards better engagement of learners with SpLDs in assessments through HEAL

by Siti Asjamiah Asmuri, Lead Educational Therapist, Dyslexia Association of Singapore (DAS)



Let’s begin by revisiting this all-too-familiar quote, which has generated much debate among educators and school leaders on assessments.  Most of us would have attempted to situate ourselves in the shoes of animals other than the monkey or the bird, whose innate characteristics predispose them to accomplish the task successfully.  While we may question the term ‘fair selection’ used in this picture, a more pertinent question to ask ourselves is how we have responded since the time we first came across this quote.  What has changed, and how far have we come?

Recent developments in Singapore’s educational landscape have been encouraging.  They reflect a growing interest in embracing diversity and accepting learner variability as the new norm. One example is the gradual replacement of streaming in secondary schools with the subject-based banding system, in which students have the option of taking subjects at their level of proficiency.  Another landmark agenda in the pipeline, announced by then Education Minister Ong Ye Kung in 2019, is the introduction of a new common national examination that would complement subject-based banding and replace the current GCE O- and N-Level examinations (Mokhtar, 2019).  This could be a good time to reflect on and re-examine assessment approaches and practices for our increasingly diverse learners, including those with diagnosed Specific Learning Differences (SpLDs).  SpLDs refer to differences or difficulties some individuals have with particular aspects of learning.  These affect the way information is learned and processed, and may be diagnosed in learners where there is an observed lack of achievement at age and ability level, or a large discrepancy between achievement and intellectual ability.  Some examples of SpLDs are dyslexia, dyscalculia, dysgraphia and Attention Deficit Hyperactivity Disorder (ADHD) (DAS, 2020).

Earlier this year, a friend who is a Special Needs Officer in a secondary school shared an image of a student’s work sent by her colleague, who was seeking advice on what appeared to be the student’s written response to a Geography test question, parts of which had been written completely in reverse, as seen below.


Excerpt 1: sample of a written response to a test question in a Geography paper by a Secondary 2 student with dyslexia

This student had been diagnosed with dyslexia, a specific learning difficulty primarily affecting the skills involved in accurate and fluent word reading and spelling (MOE, 2011).  Though reverse or mirror writing is not to be construed as a symptomatic trait, it is not unusual for some individuals to present it.  Some studies have attributed this to working memory deficits or visual processing issues, while at least one study has suggested stress and anxiety as a possible contributor (Della Sala, Calia, De Caro, & McIntosh, 2014).  Fortunately for this student, her teacher was willing to try her best to decode the words and give her the marks she deserved.  At the same time, the teacher also wondered how best she could support the student in future assessments, especially if this persisted.

A number of studies have noted emotional disturbances such as anxiety, fear of failure, feelings of inadequacy and low motivation as negative consequences of language learning deficits, and these have been found to cause learners to produce ineffective or incoherent writing (Piechurska-Kuciel, 2010; Schweiker-Marra & Marra, 2000; Ganschow et al., 1994; Ganschow & Sparks, 1996). Test anxiety and the fear of disappointing parents are indeed evident in our Singapore students. This would be particularly true for students struggling with dyslexia, and the effects can sometimes be seen in their writing and test performance, as shown below.


Excerpt 2:  Response from a Secondary 2 student with dyslexia on how he felt about exams and his performance in exams


Excerpt 3: Response from a former Primary 6 student with dyslexia on how she felt about exams and exam results

In another study, by Nelson, Lindstrom and Foels (2015), college students with dyslexia were reported to have higher test anxiety than those without dyslexia when required to attempt tasks that were heavily language-based and demanded extensive reading.  We can argue that there may be other stress-inducing factors, such as culture and social expectations, and that stress and anxiety are also commonly experienced by many students with or without learning difficulties.  However, for students with learning difficulties, accompanying executive function deficits such as poor working memory and slow processing speed may gravely impact their coping mechanisms, adaptability and ability to respond appropriately, resulting in test performance that may not truly reflect their learning and capabilities.  What this might also suggest is that stress and anxiety could be attributed to test formats, conditions or methods that are not conducive to facilitating optimal performance (Elliott, Kurz & Schulte, 2015).
 
A key implication here is that emotions do have the propensity to influence learner engagement with assessments and test performance.  Assuming that existing assessment practices and methods remain unchanged, equipping these students with relevant study skills, time-management and self-monitoring strategies is one way they can be better supported to cope with the demands and rigours of assessments.  Test accommodations, such as granting extra time to compensate for deficits in working memory and processing speed, and integrating mindfulness activities into the curriculum, are other forms of support that schools and learning organisations have been providing.  However, the effectiveness of such adaptive measures has not been sufficiently investigated.  In fact, a study by Duncan and Purcell (2017) of undergraduate students with a valid diagnosis of dyslexia, dyspraxia or dysgraphia from the English, History and Law faculties, who received examination adjustments in the form of extra time or extra time with the use of a word processor, indicated that these adjustments were not able to fully place the students on a level playing field with their peers.

Several educational frameworks that have garnered considerable interest today place emotions as a key tenet; the Universal Design for Learning (UDL) and Design Thinking (DT) frameworks are two examples.  UDL was conceptualised based on research guiding the development of flexible learning environments that accommodate individual learning differences by providing multiple means of engagement, representation, and action and expression.  By giving due consideration to how emotions may affect learner engagement and performance, UDL embraces a broader and more comprehensive view of assessment that goes beyond gauging learners’ skills and knowledge and providing timely feedback and information to improve learning outcomes.  It seeks to gather information about the way learners interact with learning environments and assessment methods, and encourages educators to explore and implement alternative options to minimise barriers and maximise opportunities for all learners to grow and succeed (Meyer, Rose, & Gordon, 2014; Rose & Gravel, 2013; CAST, 2011; Rose & Meyer, 2002).  Likewise, DT recognises empathy as the first and most important step in the design thinking process.  By encouraging educators to put themselves in their learners’ shoes, it allows them to better understand the challenges learners face, their physical and emotional needs, how they think about the world, and what is meaningful to them when designing curriculum and assessments (Plattner, 2013).

UDL and Design Thinking are certainly useful frameworks for guiding our instructional and assessment practices for all students.  Students with SpLDs, in particular, face additional challenges that require a systematic approach to address their learning needs.  In my experience working with these students, I have come to realise that they are unique individuals with distinctive needs, and that a bespoke approach is required to bring out the best in them and their learning.  With this in mind, and taking inspiration from the key principles of the UDL and Design Thinking frameworks, I have formulated the following HEAL approach to guide us in our instructional and assessment practices.

H - Help these students to identify themselves as people, not problems. I wanted this approach to focus on these students appreciating themselves as unique individuals, and this calls for a humanistic approach, which will be explained in greater detail below.

E - Engaging students with SpLDs calls for educators to reframe their expectations of student engagement. What kind of behaviours would we expect from students with SpLDs to indicate that they are engaged? There is often a misconception, and a temptation, to equate being engaged with being ‘attentive’, that is, watching and listening to the teacher. We should, instead, address students’ desire for learning by focusing on elevating their interest through the tasks and activities organised. Students have their own unique ways of engaging.

A - Alleviate factors that may pose potential barriers to students with SpLDs demonstrating their learning. At times, the methods or materials used in an assessment demand additional skills or understanding not directly connected to what is being measured or tested. Construct-irrelevant demands, such as requiring students to write proper sentences to explain the workings of a mathematical problem, may hinder students with language difficulties from demonstrating their learning, thus calling into question the accuracy of the data derived. Regularly evaluating existing assessment measures and tools enables us to reflect on assessment validity, so that students with SpLDs can be given more equitable opportunities to demonstrate their potential. Hence, the purpose of alleviating barriers that are usually inherent in standardised formats of assessment is not to make assessments easier, but fairer for these students.

L - In an age where learning is increasingly driven by technology, we should leverage technological devices and assistive tools not just to engage learners, but also to break down the barriers to learning and collaborating with others that students with SpLDs often face in their efforts to access the school curriculum. Some examples of technological applications are Text-to-Speech (TTS) to help with reading and Speech-to-Text (STT) to help with spelling and writing.

Humanistic approach to assessments

Paulo Freire’s humanistic approach to education emphasises human liberation from oppressive systems and the importance of recognising the potential of the whole person in the learning process to facilitate growth (Freire, 2009). Adopting the humanistic approach, therefore, places humans and being humane at the heart of the curriculum.  This entails studying assessment data beyond the analysis of quantitative scores to include observations of learner behaviour and interaction with assessment methods and instruments. As shared earlier, emotions have been found to influence learner performance, and different test instruments generate different emotional effects for different learners.  Learners who work well under pressure may thrive in high-stakes examinations. Students with poor working memory, on the other hand, may perform better and have a better chance of experiencing success and progress in frequent, bite-sized assessments. The highly controlled or timed conditions of summative assessments, such as End-of-Year or high-stakes examinations, demand that students write quickly and accurately; these tend to impose additional cognitive load and exacerbate stress for students with SpLDs, thus placing them at a disadvantage. Emotions, and the patterns of engagement they engender, may pose problems for accurate measurement of constructs, such as knowledge of Math computations or writing proficiency. It is possible that inappropriate choices of assessment methods and instruments could be the cause of a strong and differential variance on prescribed construct-relevant measures. One way to address this is to use flexible and comprehensive assessment tools that give teachers better insights into the possible reasons for such variance. Even students themselves can be a good source of such information, as evident in the suggestion given by a student with dyslexia below.


Excerpt 4: Feedback from a secondary school student with dyslexia about examinations

‘H’ therefore reminds us that our overall approach to assessments should be humane. This also implies the need to see our learners as humans who need to be engaged and interested in their learning, instead of viewing them as robots ready to perform on demand under time constraints and rigid test conditions.

Engage and elevate interest and motivation

In providing learners with appropriate and multiple means of engagement, assessments have the potential to support interest, drive motivation and develop persistence. Just as students learn more effectively when they are engaged and motivated, their performance on assessments can be enhanced by increasing engagement. Learners’ interaction and engagement with test tools and methods provide valuable information on their ability to demonstrate their learning and strengths at an optimum level.  It is important that educators frequently observe and look out for behavioural indicators in students who may be experiencing challenges in applying the skills tested when using a specific tool.  Are there any available accommodations to address the challenge, and will the accommodation(s) change the construct being measured by the test (Kettler, 2012)?  In giving such learners access to appropriate test accommodations or tools, the validity of the inferences made from test scores could be enhanced. Testing accommodations are adjustments that do not alter the assessed construct and are applied to test presentation, environment, content, format (including response format), or administration conditions for particular test takers.  These may either be embedded within assessments or applied after the assessment is designed (AERA, 2014).  Some examples of common testing accommodations include varying test presentation (e.g. oral delivery, drawing and typing), varying test duration (e.g. extended time, delivery across multiple days or testing sessions) and providing reading support, among others (Sireci, Scarpati & Li, 2005).


Other than testing accommodations, learners could also be provided with a simple audio or video explanation of assessment instructions to enhance their comprehension of the assessment criteria.  Clear and explicit assessment criteria can also promote greater independence and ownership of learning when learners are closely guided by them.

Offering choices in formative assessment tasks such as quizzes, or encouraging independent contributions in online class discussion forums or on digital notice boards, are some examples of formative assessments that can be embedded within instruction to add variety, entice learner participation, enable learners to monitor their own learning and develop a supportive community.  Summative assessment tasks could take the form of designing a poster, journal writing, conducting mini TED Talks or producing a video demonstration of a skill, which could inspire greater intrinsic motivation for learners to harness the best of what they have learnt and invest effort accordingly.  Hence, providing options and allowing students to make choices about assessment conditions are not only plausible ways to engage students towards more optimal performance while minimising differential threats; they may also help learners find the relevance of what they have learnt to the rich experiences they bring to, and gain from, the process of learning.



A valid question that may be on educators’ minds is this: if learners’ emotions are so important and should be considered in assessments, how then can they best be accounted for to obtain accurate estimates of learner knowledge, understanding, skills and strategies?  One way to recognise the effects of emotion on performance is to measure emotions themselves as part of a more comprehensive approach to assessment.  Learner engagement in assessments could be extended to include their written or oral feedback at the end of lessons or assessments, as a gesture of offering choices and extending a more agentic role in the learning and assessment process (Rose, Robinson, Hall, Coyne, Jackson, Stahl, & Wilcauskas, 2018, p. 171).  Taking into account students’ interests and experiences with assessments is an important step towards more equitable instruction (Shepard, Penuel & Davidson, 2017).

Apart from giving students options as a strategy to pique their interest, it is equally vital to address potential barriers to help them manage their emotions and sustain motivation.
     
Alleviate potential obstacles to engagement in assessments   

Although learners’ failure to achieve desired learning outcomes, as demonstrated in their assessment results, could be attributed to inherent learning difficulties or gaps, it could also indicate a lack of engagement.  There could be many factors affecting learners’ engagement with assessments.  For example, rather than instructions and questions presented in lengthy sentences and littered with sophisticated linguistic jargon, learners with SpLDs who have weak memory or processing skills would appreciate instructions and questions that are brief, simple and direct, as suggested by a student with specific language difficulties below.


Excerpt 5: Suggestion by a student on how assessment items can be made more accessible to learners with SpLDs

Students with dyslexia who struggle to decode words, or to retrieve the phonics concepts, spelling and grammatical rules needed to edit a passage containing errors in spelling or grammar, will be less frustrated by cognitive overload if they are given options to choose from, as suggested by the student below.


Excerpt 6: Suggestion by a student on how assessment items can be made more accessible to learners with SpLDs

Consider also how a student with selective mutism would perform in the typical environment and conditions of a school or high-stakes oral examination, which requires him or her to read a passage he or she has never seen before and to engage in a conversation with examiners he or she has never met.  Such assessment methods assume that all learners are able to speak on demand.  Would the results obtained then accurately reflect the student’s ability to articulate his or her thoughts and ideas?  Alternative assessment methods should be explored in order to present more accurately what students with significant learning difficulties know and can do (Frey & Gillispie, 2018).  In this regard, perhaps the potential of assistive devices or technological tools could be harnessed to enable such learners to respond optimally in assessment environments.  Technology could provide viable means not only to help students cope, but also to leverage their interests in order to better engage them in assessments.

Leverage the use of technological tools

Fostering better student engagement in assessments is a learning process for educators.  One way to begin is by conducting a digital survey or online poll on platforms such as Kahoot, Padlet or Mentimeter at the end of lessons, to find out the kinds of tasks or activities learners have found, or would find, most appealing.  Technological tools could also be used astutely as alternative platforms for learners to demonstrate and apply knowledge.  A number of schools have explored blogging, vlogging and student podcasts as assessment options.  Tasks could also be varied according to levels of challenge or difficulty, without modifying the original construct.  Success criteria should also be made clear and explicit for the purpose of self-monitoring the achievement of learning goals and task accomplishment.


Digital tools can also be harnessed to monitor learner engagement in reading and reading comprehension activities.  One example we can take inspiration from is a web-based tool called Udio, designed by the Center for Applied Special Technology (CAST, 2011) to provide struggling readers with levels of reading challenge sufficient to motivate and develop reading interest and comprehension.  Potential frustrations with word-level decoding are minimised by making a Text-to-Speech (TTS) function and an online dictionary available.  Teachers are also able to observe students’ emotional engagement with the texts when students provide their affective responses online.  Students can choose reading materials that interest them and are directed to an emotional response screen upon completion of the reading activity.  Their responses could then provide a valuable basis for conversations or discussions about the text and the reading activity thereafter (CET, 2016; Rose et al., 2018).  The potential of this application could perhaps be explored in the Student Learning Space (SLS) platform that is accessed by students and teachers in Singapore schools.

Conclusion

Providing flexibility and options in assessments may be a mammoth task.  It requires effort in terms of time, cost, logistics and manpower, as well as a major paradigm shift at the systemic level, but it can begin in small steps and can be explored and implemented in schools.  The 21st Century Competencies Framework clearly places a strong emphasis on the holistic development of learners (MOE, n.d.).  Traditional, standardised examinations can only evaluate specific skill sets out of the whole range of cognitive and affective abilities underlying learner performance.  New thinking and approaches that are more flexible would be required in order to cater to students with distinctive learning needs.

Assessment is a process of discovery about our learners and this should extend beyond evaluating what they know or do not know.  Cognitive and affective factors beyond this realm of ‘knowing or not knowing’ are just as capable of influencing school outcomes and ultimately, growth and success beyond the classroom, and these, more often than not, tend to be overlooked.    Recognising the value of these factors in determining learner engagement and performance in assessments is important in guiding instruction aiming to close achievement gaps, and designing appropriate assessment measures that would best address the learning challenges of learners with SpLDs.   Assessments should seek to enable these learners to showcase their knowledge and potential, instead of further disabling them.

References

American Educational Research Association (AERA), American Psychological Association, & National Council on Measurement in Education. (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

CAST. (2011). Universal design for learning guidelines version 2.0. Wakefield, MA: Author.

DAS. (2020). Other Specific Learning Differences. Retrieved June 27, 2022 from  https://www.das.org.sg/about-dyslexia/what-is-dyslexia/other-specific-learning-differences.html

Della Sala, S., & Cubelli, R. (2007). “Directional apraxia”: A unitary account of mirror writing following brain injury or as found in normal young children. Journal of Neuropsychology, 1, 3–26.

Della Sala, S., Calia, C., De Caro, M. F., & McIntosh, R. D. (2014). Transient involuntary mirror writing triggered by anxiety. Neurocase: The Neural Basis of Cognition.

Duncan, H., & Purcell, C. (2017). Equity or Advantage? The effect of receiving access arrangements in university exams on Humanities students with Specific Learning Difficulties (SpLD). Widening Participation and Lifelong Learning, 19(2), 6-26.

Elliott, S. N., Kurz, A., & Schulte, A. (2015). Maximizing access to instruction and testing for students with disabilities: What we know and can do to improve achievement. Smarter Balanced Assessment Consortium. UCLA: McGraw Hill.

Every Student Succeeds Act (ESSA) of 2015, Pub. L. No. 114-95. (2015).

Freire, P. (2009). Chapter 2 from Pedagogy of the Oppressed. Race/Ethnicity: Multidisciplinary Global Contexts 2(2), 163-174. https://www.muse.jhu.edu/article/266914.

Frey, J. R., & Gillispie, C. M. (2018). The accessibility needs of students with disabilities: Special considerations for instruction and assessment. In Handbook of Accessible Instruction and Testing Practices (pp. 93-105). Springer, Cham.

Ganschow, L., & Sparks, R. L. (2000). Reflections on language study for students with language learning problems: Research, issues and challenges. Dyslexia, 6, 87-100.

Ganschow, L., Sparks, R. L., & Javorsky, J. (1998). Foreign language learning difficulties: An historical perspective. Journal of Learning Disabilities, 31, 248-258.

Ivcevic, Z., & Brackett, M. A. (2014). Predicting school success: Comparing conscientiousness, grit, and emotion regulation ability. Journal of Research in Personality. http://ei.yale.edu/publication/predicting-school-success-comparing-conscientiousness-grit-emotion-regulation-ability-2/

Kettler, R.  J. (2012). Testing accommodations: Theory and research to inform practice. International Journal of Disability, Development and Education, 59, 53–66.

Meyer, A., Rose, D. H., & Gordon, D. (2014). Universal design for learning: Theory and practice. Wakefield, MA: CAST Professional Publishing.

Ministry of Education, Singapore (n.d.) 21st Century Competencies. Retrieved from https://www.moe.gov.sg/education-in-sg/21st-century-competencies

Ministry of Education. (2011). Psychoeducational Assessment and Placement of Students with Special Educational Needs: Professional Practice Guidelines. [PDF file]. Singapore: Ministry of Education. Retrieved from https://www.moe.gov.sg/docs/default-source/document/education/specialeducation/files/professional-practiceguidelines.pdf

Mokhtar, F. (2019, March 5). GCE O- and N-Level exams to be replaced by new national common exam in 2027.  Today Online. https://www.todayonline.com/singapore/gce-o-and-n-level-exams-be-replaced-new-national-common-exam-2027

Nelson, J. M., Lindstrom, W., & Foels, P. A. (2015). Test anxiety among college students with specific reading disability (dyslexia) nonverbal ability and working memory as predictors. Journal of Learning Disabilities, 48(4), 422-432.

Piechurska-Kuciel, E. (2010). Reading anxiety and writing anxiety in dyslexia: Symptomatic and asymptomatic adolescents. Advances in Research on Language Acquisition and Teaching: Selected Papers, GALA, 375-386.

Plattner, H. (2013). An introduction to design thinking. Institute of Design at Stanford, 1-15.

Rose, D. H., Robinson, K. H., Hall, T. E., Coyne, P., Jackson, R. M., Stahl, W. M., & Wilcauskas, S. L. (2018). Accurate and informative for all: Universal Design for Learning (UDL) and the future of assessment. In Handbook of accessible instruction and testing practices (pp. 167-180). Springer, Cham.

Rose, D. H., & Gravel, J. W. (2013). Using digital media to design student-centered curricula. In R. E. Wolfe, A. Steinberg, & N. Hoffmann (Eds.), Anytime, anywhere: Student-centered learning for students and teachers (pp. 77-101). Cambridge, MA: Harvard Education Press.

Shepard, L. A., Penuel, W. R., & Davidson, K. L. (2017). Design principles for new systems of assessment. Phi Delta Kappan, 98(6), 47-52.

Steele, C. M. (1997). A threat in the air: How stereotypes shape intellectual identity and performance. American Psychologist, 52(6), 613.


