Open access peer-reviewed chapter

What is the Point of Assessments?

Written By

Nenadi Adamu

Submitted: 30 January 2023 Reviewed: 05 February 2023 Published: 03 March 2023

DOI: 10.5772/intechopen.110379

From the Edited Volume

Higher Education - Reflections From the Field - Volume 4

Edited by Lee Waller and Sharon Kay Waller


Abstract

Assessments in higher education remain a crucial means by which students’ understanding of and engagement with course content are measured. Evidence suggests that effective assessments must not only enhance students’ learning but also encourage them to recognize quality and to improve their performance in future tasks. Some argue that, because of the emphasis given to employability in the delivery of higher education, assessments are becoming increasingly simplistic and less innovative, leading to a fast-paced approach to studying that limits student imagination and engagement. This paper explores the rationale and strategy for assessing first-year students on a BA Health and Social Care degree, with specific focus on a core 30-credit unit, by reflecting on the changes made to the unit and its written assessment and evaluating the outcome of those changes using student performance, ending with the question: ‘What is the point of assessments?’

Keywords

  • health and social care
  • higher education
  • assessments
  • teaching and learning
  • student engagement

1. Introduction

Assessments in higher education are used as a tool to test understanding as well as to improve students’ abilities in future or similar tasks [1]. Gibbs et al. [2], however, identify that because of the changing nature of higher education, the shifting emphasis towards employability, and a new, fast-paced approach to studying, assessments are becoming less creative, raising the question of what we are assessing for. Villarroel et al. [3] use the phrase ‘authentic assessments’ to describe assessments that strike a balance between learning and employability. They argue that assessments should essentially enhance employability by encouraging students to develop the skills needed in employment post-graduation. Reimann and Khosronejad [4] rightly suggest that the design of assessments needs to be an authentic process that effectively combines pedagogy with the rationale for the assessment.

1.1 Why do we assess?

“Assessment is a moral activity. What and how we choose to assess shows quite starkly what we value” (Knight, 1995, cited in [5]).

The idea of ‘Assessment for Learning’ (AfL) features heavily in the discourse on the rationale for assessments, as well as in teaching and learning strategies in higher education (HE), where assessments are viewed fundamentally as a form of testing or evaluation [6, 7]. The Quality Assurance Agency (QAA) policies also highlight the importance given to assessments; this is evident in their argument that assessment strategies need to combine challenge with opportunities for learning and growth [8].

The literature on assessments in HE acknowledges a shift in assessment strategies: the traditional mode of frequent assessments has been maintained only by traditional institutions such as Cambridge, Oxford and the Open University, while elsewhere students now receive fewer assessments and less feedback. This raises the question of the purpose of assessments in the current climate and the impact on assessment strategies [6, 9, 10]. To answer the question of why we are assessing, it is important to look at who we are assessing, what we are assessing, and the value of what we assess as seen in the outcome of our assessments.

1.2 Who are we assessing—the cohort

Jones and Wehlburg [11] highlight the need for assessment tasks that inspire engagement from students, and this is important as we consider who our students are. Student engagement is influenced by a number of factors, but the diversity of our cohorts is a significant element to consider when evaluating why we assess. Northedge [12] argues that HE has seen a ‘diversification of students and courses’ over the years, and our student cohorts now have diverse life experiences and multiple responsibilities across work, study and personal lives. We must therefore recognize that students bring their lived experiences with them to their learning [13].

Looking more closely at the students we are assessing, it is important to start with the course they are studying. The Health and Social Care degree has remained highly popular, with recruitment growing steadily over the last few years. The Health and Social Care Act 2012 and the Health and Care Act 2022 have led to increased investment in the health and social care sectors, which means more and more people have become interested in pursuing a qualification in Health and Social Care with a view to securing employment in the field [14]. There is greater awareness of the benefits the course offers, and the course fundamentally aligns with the vision of widening participation; it also serves as a stepping stone for applicants hoping for a career in professional subjects such as nursing and social work. Internal data suggests that many of our students have limited access and opportunities due to a lack of social capital. As a course team with large cohorts and a wide range of learning abilities, the challenge of designing assessments that strike a good balance across that range of abilities and levels of commitment remains evident.

This led to the decision to revisit the current curriculum and assessment strategy to ensure that we are actively offering inclusive practice in both teaching and, more specifically, in our assessments [15]. Inclusive practice as described by Leeds Beckett [16] means that, as a course team, our assessments are designed to be meaningful (clear and unambiguous) and authentically contextualized. This is important because it is easy to focus on comparing our strategies and methods with those of close competitors in an attempt to benchmark our provision, when the focus should arguably be on our students and improving their overall learning experiences and outcomes.

1.3 What are we assessing? The assessment

Reimann and Khosronejad [4] suggest that the design of assessments needs to be an authentic process that effectively combines pedagogy with the rationale for the assessment, and Bloxham and Campbell [17] emphasize the need for the academic community to engage in dialogue to ensure that assessments continue to meet performance and quality standards; this evaluation provided an opportunity to do just that. Inclusivity is recognized as a crucial element in the design and implementation of assessments in higher education [18], and assessments must be designed to reflect the diverse range and ability of learners, particularly in widening-participation universities [19]. This is also reflected in the Quality Assurance Agency (QAA)’s quality code, which is clear about the need to ensure that all students are offered the same opportunities to evidence their learning through the adoption of diverse strategies for demonstrating the achievement of learning outcomes. Evidence also shows that to achieve this, course teams must be creative and flexible in developing assessment strategies across courses and departments, offering students a variety of methods and formats so that all students have a chance at succeeding [20].

The assessment was changed from a 1500-word report to a 2000-word portfolio, which was deemed more inclusive and reflective of a student-centered approach. The portfolio required students to use different academic skills and draw on a range of sources, accommodating different strengths and, in so doing, optimizing their grade potential. Drawing on the Universal Design for Learning (UDL) guidelines on offering choice and autonomy [21], students were given four broad topic areas to choose from, engaging student interest and acknowledging diverse learners, their individual backgrounds and their employment routes. This also afforded students the ability to link learning to employment, particularly for those already in practice; students with no practical experience would be able to learn from colleagues during workshops and assignment preparation sessions and, in so doing, engage in community and collaboration [21].

1.4 The outcome

The assessment was undertaken by first-year undergraduates at eight different sites, including the main campus. Of the eight sites, four have been used for this analysis, as data is available to compare performances.

For the main campus, in the 2020/2021 academic year, the lowest grade on the unit was 10% and the highest 75%, with half of the class scoring between 42% and 58%. In the 2021/2022 academic year, the highest grades went up, with more students scoring in the upper quartile range, but fail grades also increased. The average grade was 45.4 in 2021/2022 compared to 48.1 in 2020/2021. There were, however, more students in the 2020/2021 cohort (128) than in the 2021/2022 cohort (95 students), which suggests that grades worsened overall (Figure 1).

Figure 1.

This graph shows the grades for the previous assessment in the previous academic year: the lowest grade was 10 and the highest 75, with fifty percent of the class scoring between 42 and 58. With the new assessment, the highest grades went up, with more students scoring in the upper quartile range, but fail grades increased; the average grade was 45.4 in 2021/2022 compared to 48.1 in 2020/2021. There were, however, more students in the 2020/2021 cohort (128) than in the 2021/2022 cohort (95 students), which suggests that grades worsened overall.

Figures 2-4 show the comparison of performance across the three other campuses. Overall, the patterns are similar to that of the main campus: higher top grades but more fails overall.

Figure 2.

This shows that there were similar issues at Campus A, with more fails in 2021/2022 under the new assessment.

Figure 3.

A drop in the highest grade in 2021/2022 with the new assessment, and an average grade worse than the previous year.

Figure 4.

A slight improvement was observed in this cohort, with a better performance on average.


2. Observations

  • It is important to compare the grades for the different academic years side by side. Taken at face value, the data shows that grades worsened in 2021/2022.

  • The highest grades were better in 2021/2022, with the top grade awarded being 85%, suggesting that those who engaged with and understood the assignment did better.

  • The proportion of students graded 1–10 was higher in 2021/2022. This could be for a number of reasons, including academic offenses, which are given 1%, and the fact that those who used a topic of their own choice were automatically given 10%.

  • The proportion graded 20–25% was slightly higher in 2021/2022, predominantly because of citing misleading references and non-UK-based sources.

  • There was an increased ability to identify contracted assignments due to the prescriptive nature of the assessment; those suspected of this were invited to a viva, and their grades were capped at 25%.

  • Marking evidence suggested that students did better in the section on legislation and case law than in the section on media and the introductory section. This is explored further in the discussion (Figures 5 and 6).

Figure 5.

This shows the performance across all eight sites where the old assessment was attempted in the 2020/2021 academic year. Most grades were between 41% and 51%.

Figure 6.

This shows the overview of performance across all eight sites in 2021/2022 for the new assessment. Overall, most grades remained in the 41–51% bracket, suggesting that there was not much improvement in outcome for students as a whole.


3. Discussion

The importance of assessments in HE remains under consideration, and there are those who argue that assessments act as evidence of student learning and that students’ learning is achieved through the assessment process [22]. Although much focus is placed on teaching and learning strategies as a way to improve the student experience and enhance engagement, assessments must be seen to be fit for purpose and to take cognizance of the diverse nature of our student groups [23]. Regardless of this, it is still imperative to explore what really is the point of assessments in HE. This question involves considering different approaches and perspectives, and understanding that there are complexities to be addressed to ensure that the assessment process is not just another mandatory exercise in HE, executed as an expectation of both staff and students, but an actual tool that can be used to improve students’ learning experience.

Drawing on the assessment used for this study, the first thing to be considered was the cohort being assessed. The data showed that a significant number of students across the two academic years scored 1–10%, which is indicative of academic offenses and, in addition, of students who did not follow the assignment specifications, suggesting that they may have contracted out their work to a third party. Thinking of who we are assessing, this is relevant because it can be argued that students may be deciding to get help with their assessments as a result of a lack of confidence in their own abilities. This was one of the criteria in mind when the new assignment was designed as a four-part portfolio, to allow students with a range of abilities to engage successfully with the tasks and maximize their grade outcomes. Bretag et al. [24] argue that the design of an assessment can be a determinant of students’ ability to contract it out, and from the figures above, the increase from 2.6% to 9.8% of students falling in the lower grade category shows that more students were not engaging with the new assessment despite the changes. This could be an area for further research to explore the barriers limiting our students from engaging with different assessment types.

Another issue identified is what we are assessing for. There is significant discourse around the need for inclusive assessments, and this formed the basis for the redesign of the assessment. The redesign was influenced by some of the key themes identified in the culturally informed assessments toolkit, such as ‘unclear and inconsistent instructions’, ‘inexperience in certain assessment modes’ and ‘clarity of assessments’. Drawing on the guidelines of the Universal Design for Learning (UDL), the planning and design of the assessment was structured around providing ‘multiple means of engagement, representation, action and expression’ [21]. The assignment design also offered choice and autonomy, with students able to choose from four specific areas to write about, thus engaging student interest and acknowledging diverse learners, their individual backgrounds and their employment routes. This also afforded students the ability to link learning to employment, particularly for those already in work; for students with no practical experience, the idea was that they would learn from colleagues during workshops and assignment preparation sessions and, in so doing, engage in community and collaboration. In addition to inclusivity, the new assessment was designed to mitigate academic offenses and address the authenticity of students’ work by ensuring that elements of the assessment required personalized reflections that could not be successfully completed without engaging in the assessment preparation sessions.

In reflecting on the question of what we are assessing, this assignment was designed around UDL themes to assess for interest, perception, comprehension, and expression and communication. Section one was the background or introduction, which tested academic writing skills, research skills, and knowledge of the issue being discussed. The second section required the use of a media source (print or video) reporting on the social issue; this allowed students to show their interest in an issue, as well as their perception of how the media influenced the discourse on the issue and, inadvertently, the law and practice. The third section, on legislation and case law, allowed students to engage with legislation and evidence their awareness of the law and case law, exploring how the law is used in practice. The data showed that students did better in section three, which was a surprise, as this was deemed (by both staff and students) to be the most challenging section; students did worse in the second section, which had been included to bring a non-academic element to the assessment so that those who struggled with academic skills (research, writing and analysis) could maximize their grades.

It can therefore be argued that, since what we assess is useful only if it is deemed relevant by students and other stakeholders such as politicians, industry, academics and society in general [22], it is worth exploring what the point of assessments is.


4. Conclusion

The purpose of this reflective piece is to challenge us as academics, teachers, and members of faculties to engage in frank dialogue about the point of assessments. The experience of redesigning an assessment with a view to improving student engagement and grade outcomes, drawing on specific frameworks that arguably could have ensured these goals were met, has been an interesting one with very surprising outcomes. Grades on the unit worsened, and students did not do better in the non-traditional elements of the assignment. Bad academic practice increased, which could be because of students’ lack of confidence and/or comprehension of the assessment’s purpose and requirements. There are no quick answers, but there is an understanding that, to continue to encourage better outcomes and experiences for students, we need to keep asking ourselves ‘what is the point of assessments?’.

References

  1. Bryan C, Clegg K, editors. ‘Introduction’ in Innovative Assessment in Higher Education: A Handbook for Academic Practitioners. London: Routledge; 2019
  2. Gibbs HD, Bonenberger H, Hull HR, Sullivan DK, Gibson CA. Validity of an updated nutrition literacy assessment instrument with the new nutrition facts panel. International Journal of Food Sciences and Nutrition. 2020;71(1):116-121
  3. Villarroel V, Bloxham S, Bruna D, Bruna C, Herrera-Seda C. Authentic assessment: creating a blueprint for course design. Assessment & Evaluation in Higher Education. 2018;43(5):840-854
  4. Reimann P, Khosronejad M. Designing authentic assessments in higher education. In: Measuring and Visualizing Learning in the Information-Rich Classroom. New York: Routledge; 2015. pp. 108-122
  5. Durkin L. The imperfect practice of assessment in higher education. Assessment Update. 2021;33:6-13. DOI: 10.1002/au.30270
  6. Sambell K, McDowell L, Montgomery C. Assessment for Learning in Higher Education. London: Routledge; 2012
  7. Wiliam D. What is assessment for learning? Studies in Educational Evaluation. 2011;37:3-14
  8. QAA. 2018. Available at: https://www.qaa.ac.uk/quality-code. [Accessed: 3 April 2021]
  9. Carless D. Exploring learning-oriented assessment processes. Higher Education. 2015;69(6):963-976
  10. Gibbs G. Why assessment is changing. In: Innovative Assessment in Higher Education: A Handbook for Academic Practitioners. London: Routledge; 2019
  11. Jones BM, Wehlburg CM. Learning outcomes assessment misunderstood: Glass half-empty or half-full. Journal of the National Collegiate Honors Council -- Online Archive. 2014:439. Available from: https://digitalcommons.unl.edu/nchcjournal/439 [Accessed: 20 November 2022]
  12. Northedge A. Rethinking teaching in the context of diversity. Teaching in Higher Education. 2003;8:17-32
  13. Andresen L, Boud D, Cohen R. Experience-based learning. In: Understanding Adult Education and Training. New York: Routledge; 2020. pp. 225-239
  14. Factor F. SASS Self-evaluation Document. School of Applied Social Sciences, University of Bedfordshire. 2020. Available on Request
  15. University of Bedfordshire Culturally Inclusive Toolkit. Available at: https://www.beds.ac.uk/cle/focus/assessment. [Accessed: 17 December 2019]
  16. Leeds Beckett University. 2018. Inclusive Teaching and Assessment Practice: Guidance for Staff. Available at: https://www.leedsbeckett.ac.uk/staffsite/-/media/files/staff-site/quality-assurance/key-information/validation/inclusive-assessment-guide-final.pdf. [Accessed: 4 December 2019]
  17. Bloxham S, Campbell L. Generating dialogue in assessment feedback: exploring the use of interactive cover sheets. Assessment & Evaluation in Higher Education. 2010;35(3):291-300
  18. Sambell K, Brown S, Graham L. Engaging students with positive learning experiences through assessment and feedback. In: Professionalism in Practice. Cham: Palgrave Macmillan; 2017. pp. 139-187
  19. Butcher J, McPherson E, Shelton I, Clarke A, Hills L, Hughes J. Unfit for purpose? Rethinking the language of assessment for widening participation students. Widening Participation and Lifelong Learning. 2017;19(2):27-46
  20. JISC. 2016. Inclusive Assessment. Available at: https://www.jisc.ac.uk/guides/transforming-assessment-and-feedback/inclusive-assessment. [Accessed: 5 November 2020]
  21. CAST. 2018. Universal Design for Learning Guidelines version 2.2. Available at: http://udlguidelines.cast.org. [Accessed: 16 December 2019]
  22. Ibarra Sáiz MS, Rodríguez Gómez G, Boud D, Rotsaert T, Brown S, Salinas ML, Rodríguez Gómez HM. The future of assessment in Higher Education. 2020. Available at: https://roderic.uv.es/bitstream/handle/10550/78020/7495606.pdf?sequence=1 [Accessed: 15 December 2022]
  23. Sambell K, McDowell L, Sambell A. Supporting diverse students: Developing learner autonomy via assessment. In: Innovative Assessment in Higher Education. New York: Routledge; 2006. pp. 178-188
  24. Bretag T, Harper R, Burton M, Ellis C, Newton P, van Haeringen K, et al. Contract cheating and assessment design: exploring the relationship. Assessment & Evaluation in Higher Education. 2019;44(5):676-691
