We address the challenge of assessing performance when IELTS (Academic Writing Task 2) candidates may have memorized and reproduced lengthy chunks of text that potentially disguise their true proficiency. Our profiling procedure distinguishes text that is more likely to reflect the candidate’s genuine linguistic knowledge from text that is less likely to do so. The procedure was applied to 233 retired scripts written by Chinese candidates, and the results are analyzed by band and test centre.
As expected, errors decreased as band increased. Similarly, the quantity of non-generic nativelike text increased with band. However, the use of material copied from the question and of ‘generic’ nativelike text (text that can be used in most essays) remained constant across bands for all but one test centre. Using the mean profiles as norms, a script known to be problematic was examined to demonstrate how profiling can isolate the nature of differences. Three less extreme ‘outlier’ scripts from the main sample were also examined, to help locate a threshold for what counts as a problem and to demonstrate why unusual profiles can occur. To assist examiners, a simplified version of the profiling procedure is offered that can be used as an informal diagnostic.
The profiling procedure recognizes the legitimacy of producing some pre-memorized nativelike material in a writing test by contextualizing it within the broader pattern of the candidate’s written performance overall. The procedure requires more refinement than was possible within this modest project, but it already suggests potential strategies by which IELTS examiners can recognize memorized material in writing tests.