Developing the assessment literacy of IELTS Test users in higher education

The rapid growth of IELTS has resulted in an expanding number of people providing information about the IELTS Test, setting standards, interpreting scores and advising test-takers. This project examined the assessment literacy needs of university test users (including admissions, marketing, academic and English language staff), how well these needs are being met and what other approaches could be adopted to meet them. The study took the form of a “proactive evaluation” (Owen, 2006), which included:

  • An online survey and face-to-face interviews to investigate the assessment literacy needs of IELTS Test users at two Australian universities and how well these needs are met by current resources.
  • A discourse analytic study of the IELTS Guide (2009).
  • Comparative evaluations of different IELTS Test resources and the institutional sections of the IELTS, TOEFL and PTE Academic websites.
  • A review of best practice in staff online training programs.

The survey and interview findings indicated that information about the IELTS Test was mostly needed for advising prospective students about English language entry requirements and for making admissions decisions. To these ends, test users were mainly focused on four topics: the minimum IELTS scores for entry to courses at their university, the different components of the IELTS Test, how long IELTS scores remain valid, and the relationship between IELTS scores and other evidence of English proficiency accepted by their university.

The survey and interview results also indicated that the needs of IELTS Test users were reasonably well met. Most respondents mainly consulted their institution’s English language entry regulations and, to a lesser extent, the official IELTS website for information about the test. After reading the IELTS Guide (2009), survey respondents generally found it informative, although some believed it could have included more information about the meaning and interpretation of IELTS test scores.

A discourse analytic study of the IELTS Guide suggested that it had more of a marketing emphasis than an educational one, which may limit its usefulness as a training document. The comparative evaluations of different IELTS Test resources and the institutional sections of the three test websites suggested that the IELTS website was an informative resource, although some of its content and user-friendliness could be improved. In both the survey and the interviews, the most popular alternative approach to learning about the IELTS Test was online tutorials, an approach that has not been used to date by the IELTS partners. A detailed example of best practice in online training programs is provided. Finally, recommendations are made for developing the assessment literacy of IELTS Test users.