Methods

Research Questions

As outlined above, this project was originally conceived not as a research project but as an in-class assignment. Following the completion of this assignment (and after grades were awarded), we became interested in examining the student responses and reflections more systematically. We therefore applied for research ethics approval to use these student contributions for this secondary purpose and developed several formal research questions we wished to address. The following general question is addressed in this report: To what extent does this rubric (and its related assignment, “You Be the Grader”) have instructional value to learners? More specifically:

  • To what extent can meaningful information be gleaned from this assignment to scaffold subsequent instructional activities?
  • To what extent do students demonstrate development as self-regulated learners, facilitating their continued learning beyond this assignment and course?

Data Collection

To address these questions, we aimed to obtain information from the assignment responses that could inform future instructional units directly addressing the students’ gaps in knowledge and competence regarding academic writing. We also aimed to collect information on the students’ own developing autonomy and self-regulation.

Participants

A total of 49 out of 65 students from the three classes (75 percent) granted consent for their assignments to be used for this research purpose. Because of the anonymized nature of the data, we do not have exact demographic details for these students, but almost all students in the three classes were in their 20s, and approximately 85 percent of each class consisted of Chinese international students studying in a variety of colleges. This ESL support writing course is required, and students must receive a minimum grade of 70 percent to continue in their programs.

Data Collection Procedures

The first data source consisted of the comments students made on the “You Be the Grader” assignment, which required them to justify every grade they gave on each essay for each of the four criteria in the rubric. We collected all learner comments on three of the five essays (6, 26, and 30) because they represented a range of performance, with official scores from 2.5 to 4.5 out of 5. Students wrote an average of 9.5 comments on each essay, for a total of approximately 1,500 separate comments on all essays by all students.

The second data source consisted of the reflections students were asked to write following the grading activity. In addition to questions about the time the assignment took and their prior experience with and impressions of the rubric, students were asked to comment on the challenges of completing the assignment, how their understanding of academic writing had improved, and one thing in their own writing they planned to change as a result of the exercise (as a way of encouraging them to consider their future development beyond the course). Consenting students provided approximately 300 comments in response to these questions.

Data Analysis: Grading Comments

Step 1: Two members of the team read all comments independently, first identifying all comments representing engagement with the rubric, i.e., comments in which students referred to elements of the rubric and linked these elements to specific content and language in the essays. We then compared our selections and came to agreement on a final list. In this way, we disregarded vague, unsubstantiated comments that could have been made without reading the essay or the scale at all (e.g., “The essay was interesting but there were problems with organisation”). During this stage, we determined that 40 of the 49 students provided at least one comment that could be considered as engaging meaningfully with the scale.

Step 2: For the comments that engaged with the rubric, we identified dominant patterns in the students’ perceptions of the quality of the essays and in their understandings of the elements of the rubric and of quality writing. We also identified misunderstandings and other comments that suggested gaps, i.e., elements to feed forward into future instruction. These patterns were identified independently and then discussed to reach consensus.

Data Analysis: Learner Reflections

In Baker et al. (2020), we introduce these reflections, discussing them in terms of what they reveal about students’ reported learning. Here, we take an LOA approach and re-examine these reflections for evidence of increased autonomy and self-regulated learning. We first read through the comments for all references to autonomous future learning, that is, to independent work beyond the course. These were not simply lists of elements students thought were weak in their writing, but references to the efforts they would need to make for their own improvement. Vague comments (e.g., “I need to work more on my writing”) were excluded. We also collected all comments related to the benefits of having and studying the rubric, to see the extent to which they provided evidence of increased learner agency and metacognitive awareness. Finally, we collected comments related to the benefits of peer assessment. These comments were reviewed and reorganized to respond directly to the research questions. While some students performed this task in a perfunctory manner, merely repeating wording from the rating scale, at least two-thirds of every class responded with substantive comments related to metacognitive awareness, increased agency, the benefits of peer assessment, or plans for future learning.
