ACARA news, November 2015

Automated essay scoring: a viable solution for NAPLAN online writing tasks

30 November 2015

Today, ACARA has released results of research which provides additional evidence that automated essay scoring is a viable solution for marking NAPLAN online writing tasks.

The research, which began in 2012, has found that four separate and independent automated essay scoring systems were able to mark NAPLAN persuasive writing tasks as reliably as human markers.

ACARA CEO, Robert Randall, welcomed the findings, which form a part of a comprehensive research and development program designed to prepare for the transition to NAPLAN online.

“Automated essay scoring of the writing component of NAPLAN will result in parents and teachers receiving their children’s and students’ results within two weeks of the tests being taken,” Mr Randall said.

“The precision and earlier provision of the results will help teachers tailor their teaching to student needs.

“Teachers and other markers will continue to be involved in the process: by training the automated essay scoring system (via 1,000 human-scored essays); by marking a sample of essays to check and validate what the system is doing; and by marking essays that the system might have difficulty with (because they differ from the essays the system has been trained to mark).

“The research results show that automated essay scoring works for NAPLAN-type writing, but we will continue with our research to refine the system and to gather more evidence, which we will use to assure parents and teachers of the viability of automated essay scoring and to make a final decision about proceeding. If need be, we could double mark samples of student essays, until everyone is comfortable with automated essay scoring.”
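As an illustrative sketch only, the human-in-the-loop process Mr Randall describes could be organised as below. The scoring and typicality checks here are toy stand-ins, not ACARA's actual system: real automated essay scoring systems use model-specific confidence measures to decide which essays to route to human markers.

```python
# Hypothetical sketch of routing essays between an automated scorer
# and human markers. All functions below are invented for illustration.

def route_essay(essay, auto_scorer, is_typical):
    """Return an automated score, or flag the essay for a human marker.

    Essays the system can handle confidently are auto-marked; essays
    that look unlike the training data are routed to a human.
    """
    if is_typical(essay):
        return ("auto", auto_scorer(essay))
    return ("human", None)

# Toy stand-ins: a length-based score on a 0-6 scale, and a crude
# typicality check on word count.
auto_scorer = lambda essay: min(6, len(essay.split()) // 50)
is_typical = lambda essay: 100 <= len(essay.split()) <= 1000

print(route_essay("word " * 300, auto_scorer, is_typical))  # → ('auto', 6)
print(route_essay("too short", auto_scorer, is_typical))    # → ('human', None)
```

In practice the sample of auto-marked essays that humans re-mark (the check-and-validate step in the quote) would be drawn from the "auto" branch.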

Four separate, experienced vendors were engaged in 2012 to score a sample of NAPLAN persuasive essays, using the current NAPLAN writing rubric. The vendors represented a cross-section of different approaches and methods for automated assessment of writing. They were provided with 1,014 essays, along with scores given by human markers, to train and validate their automated essay scoring systems. After training and validation, the vendors used their systems to mark a further 339 test essays. On overall scores and on each writing criterion assessed, the four automated essay scoring systems achieved levels of agreement comparable with those of the human markers.
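The release does not specify which agreement statistic was used to compare the automated systems with human markers. Quadratic weighted kappa is a standard measure of human–machine agreement in essay scoring, sketched below; the score range and sample scores are invented for illustration.

```python
# Quadratic weighted kappa: chance-corrected agreement between two
# raters, penalising disagreements by the square of their distance.
# Pure-Python sketch; the scores below are invented examples.

def quadratic_weighted_kappa(rater_a, rater_b, min_score, max_score):
    """Agreement between two lists of integer scores (1.0 = perfect)."""
    n = max_score - min_score + 1
    # Confusion matrix of observed score pairs.
    conf = [[0] * n for _ in range(n)]
    for a, b in zip(rater_a, rater_b):
        conf[a - min_score][b - min_score] += 1
    total = len(rater_a)
    # Marginal score distributions for each rater.
    hist_a = [sum(row) for row in conf]
    hist_b = [sum(conf[i][j] for i in range(n)) for j in range(n)]
    observed_penalty = 0.0
    expected_penalty = 0.0
    for i in range(n):
        for j in range(n):
            weight = ((i - j) ** 2) / ((n - 1) ** 2)
            observed_penalty += weight * conf[i][j]
            expected_penalty += weight * hist_a[i] * hist_b[j] / total
    return 1.0 - observed_penalty / expected_penalty

# Invented human and machine scores on a hypothetical 0-6 criterion.
human = [3, 4, 5, 3, 2, 4, 5, 3]
machine = [3, 4, 4, 3, 2, 4, 5, 2]
print(round(quadratic_weighted_kappa(human, machine, 0, 6), 3))  # → 0.877
```

A system "marking as reliably as human markers" would show a machine–human kappa comparable to the kappa between two independent human markers on the same essays.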

“What is exciting about this research is that, although the four vendors had different automated essay scoring systems, they were all able to mark the essays as well as the human markers,” Mr Randall said. 

“This is not the end of the research into automated essay scoring,” Mr Randall explained. “We intend to expand on this research in 2016 to include a larger sample of students and multiple prompts within and across writing genres (persuasive and narrative) before making a final decision about the approach to be used in 2017.”

View our infographic that illustrates the automated essay scoring research (PDF 1.3 kb).

About NAPLAN online

Federal, state and territory education ministers have agreed that NAPLAN will move online from 2017, over a two- to three-year period. NAPLAN online will provide better assessment, more precise results and faster turnaround of information. Significant planning, development, research and trialling are going on behind the scenes to make sure we are all ready to move NAPLAN online.

My School updated with 2015 Terms 1 and 2 student attendance data

19 November 2015

Today, the My School website has been updated with school student attendance rates for Semester 1 (Terms 1 and 2) of 2015. Attendance data are provided for all students, Indigenous students and non-Indigenous students. Data are suppressed where student numbers are fewer than, or equal to, five.

In this update, a new student attendance measure has been introduced: the proportion of students attending school 90 per cent or more of the time across the semester (Terms 1 and 2).
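As a minimal sketch, the new 90 per cent measure and the small-cohort suppression rule described above could be computed as follows. Function names and figures are illustrative only, not My School's actual implementation.

```python
# Hypothetical illustration of the My School attendance measures.
# All names and figures below are invented for illustration.

def level_of_attendance(student_rates, threshold=90.0):
    """Percentage of students attending at or above the threshold rate."""
    at_or_above = sum(1 for rate in student_rates if rate >= threshold)
    return 100.0 * at_or_above / len(student_rates)

def suppress_if_small(value, n_students, max_suppressed=5):
    """Suppress a figure when the cohort has five or fewer students."""
    return None if n_students <= max_suppressed else value

# Invented semester attendance rates (per cent) for six students.
rates = [96.0, 88.5, 91.2, 100.0, 79.0, 93.4]
print(round(level_of_attendance(rates), 1))  # → 66.7
print(suppress_if_small(92.1, 4))            # → None
```

The suppression step runs last, so a cohort of five or fewer students reports no figure at all rather than an imprecise one.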

On 18 December 2015, a further update will add student attendance rates for Term 3.

Attendance data reporting is a COAG (Council of Australian Governments) initiative to help improve learning outcomes for Indigenous students by monitoring attendance. Data are reported twice a year: for Terms 1 and 2, and for Term 3.

2014 NAP – ICT literacy report shows a decline in ICT literacy

17 November 2015

The 2014 National Assessment Program (NAP) – information and communication technology (ICT) literacy report has been released today by the Education Council.

The NAP – ICT literacy test assesses student ICT knowledge, understanding and skills, as well as students’ ability to use ICT creatively, critically and responsibly.

In October and November 2014, around 10,500 Year 6 and Year 10 students participated in the NAP – ICT literacy online test. Samples of students were randomly selected from over 650 government, Catholic and independent schools in metropolitan, rural and remote areas around the country.

The report shows a significant decline in the mean performance of Year 6 students in 2014, compared to the last assessment in 2011. Similarly, the mean performance of Year 10 students is significantly lower than the mean performance in all previous NAP – ICT literacy assessments (2005, 2008 and 2011). The report also shows that in each year level, there has been a reduction in the percentage of students meeting the NAP – ICT literacy proficient standards.

ACARA CEO, Robert Randall, says the proficiency standards set in this assessment are challenging but they are reasonable and attainable for Year 6 and Year 10 students. For example, Year 6 students were asked to search a website to find appropriate material, format a document, crop an image and create a short slide show. Students in Year 10 were asked to design an online survey, use software to add two new levels to an online game and create a short animated video. 

“The decline in performance is of concern, and there is a need for a renewed focus on the teaching of digital technologies in schools,” says Mr Randall.

“Schools now have access to the Australian Curriculum: Digital Technologies, which covers the core aspects of ICT literacy that are vital for students to engage in a world dependent on these technologies for future employment and social interaction.

“We cannot expect students to reach the proficiency standard represented by the NAP – ICT literacy assessment on their own, through personal use of technology. There is a need for explicit attention to the teaching and learning of the knowledge, understanding and skills that were the subject of this test and that are in the Australian Curriculum: Digital Technologies.”

Education ministers have determined that the National Assessment Program – Literacy and Numeracy (NAPLAN) will begin to move online from 2017. The familiarity that students have with technology, as observed in the survey taken after the conclusion of the NAP – ICT literacy test, confirms the viability of the move to NAPLAN online. The NAP – ICT literacy results do not mean that the achievement of this goal has been compromised. The content of the NAP – ICT literacy test is focussed on higher order thinking and on the achievement of specific knowledge, understanding and skills relevant to a sophisticated use of information and communication technologies.

Read the full 2014 National Assessment Program (NAP) – information and communication technology (ICT) literacy report on the NAP website.

See examples of the questions asked in the NAP – ICT literacy assessments.

NAPLAN online: update to technical specifications

09 November 2015

On 12 October 2015, ACARA, in conjunction with Education Services Australia (ESA), released NAPLAN online technical specifications (PDF 105 kb) via its NAP website. 

When the technical specifications were released, ACARA noted that:

The use of on-screen keyboards or external keyboards for tablets is subject to research that may support the use of on-screen keyboards with tablet devices for students who prefer this method. Keyboard purchases should be delayed until ACARA releases further research.

Preliminary findings from a device mode effect study (final report to be released in January 2016) have further informed our position on devices suitable for NAPLAN online. In summary, the study (based on more than 3,500 students from Years 3, 5, 7 and 9 across 72 schools around the country) found that:

  • NAPLAN online is capable of being taken on a range of devices (laptops and tablets), with no consistent device effect across content domains, item types and year levels.
  • An external keyboard is not necessary for successful interaction with online items when students are responding to tests on tablets (although external keyboards for tablets could still be used if a school preferred this approach).

The study did identify some limited device effects, which were small, not pervasive and centred on specific item types and features. ACARA is confident that, as student familiarity with devices improves between now and NAPLAN going online, these minor issues will be addressed.  

ACARA will now use these preliminary findings to:

  • update the NAPLAN online technical specifications
  • work with state and territory authorities to develop strategies to increase student familiarity with devices and online testing in the lead-up to 2017 and beyond
  • refine item development processes to manage the limited device effects identified
  • conduct further device mode effect studies across 2016.

For further information, email [email protected]