Thursday morning started with a presentation by Diane Steward from the British Council Istanbul, who gave detailed information on the IELTS exams and answered the participants’ questions. She recommended both the website and the specially designed DVD, which give plenty of samples, band descriptors, feedback to candidates, etc.
Building on the IELTS information given by the opening speaker, Barry talked briefly about Lexile results in relation to different exams (e.g. IELTS, TEEP and iBT). In terms of text complexity, the texts used in the IELTS Reading component were the most consistent: they were mainly at undergraduate level, with some falling within the postgraduate level. The range for the other tests varied widely, from pre-undergraduate to postgraduate levels.
Barry then presented a case study illustrating how a grammar test can be developed using the following steps: planning – design – development – administration – evaluation.
One difficulty he faced during the project was the absence of a grammatical progression list to categorise structures according to the CEFR levels (the CEFR does not make explicit reference to grammar). The Core Inventory, which identifies key grammatical and lexical resources at each of the levels, was produced after the project. These lists, prepared by the British Council and EAQUALS, represent a very useful resource for test designers: www.teachingenglish.org.uk/sites/teacheng/files/Z243%20E&E%20EQUALS%20BROCHURErevised6.pdf
Participants then discussed types of grammar questions, such as rewrite-type and TOEFL-type items, reflecting that they were not sure whether these test grammar or something else. Barry suggested Purpura (2004) Assessing Grammar, CUP, and Heaton (1990) Classroom Testing, Longman.
At the start of the afternoon session, participants wrote down their ‘light bulb’ moments: illuminating moments from the week. Sue will put these up on a wall linked to the blog on Friday.
Wrap-up session questions
Q: What would be your three favourite tasks for assessing explicit grammar teaching?
A: It all depends on your context and purpose. You should be the judge of it.
Q: Can you, or should you, assess learners on everything you have taught?
A: No, you have to select and compromise. You can never satisfy all the teachers. If an item is in the curriculum, teachers have no right to say that they haven’t taught it. Just say ‘tough’ if they say ‘but I haven’t taught it’.
Q: Could you tell us what you think about assessing process writing?
A: It has three stages: planning, doing and revising. I wouldn’t assess the developmental stages.
Young learners should only be assessed on speaking.
Q: When can you get away without an item analysis?
Q: In item analysis, what percentage shows that an item is successful?
A: In achievement tests, hopefully 100%. However, over 95% correct answers may mean the item is too easy and tests nothing. In general, when more than 50% of the candidates answer a question correctly, this shows that it is a good question. In proficiency tests, a facility value below 20–25% or above 75% means the question is not successful.
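The facility value Barry refers to is simply the proportion of candidates who answer an item correctly. A minimal sketch of how that check might look, assuming binary-scored responses (1 = correct, 0 = incorrect) and the proficiency-test thresholds mentioned above; the item names and scores are hypothetical:

```python
def facility(responses):
    """Facility value: proportion of candidates answering correctly."""
    return sum(responses) / len(responses)

def flag_item(responses, low=0.25, high=0.75):
    """Flag an item whose facility falls outside the acceptable band."""
    p = facility(responses)
    if p < low:
        return p, "too hard"
    if p > high:
        return p, "too easy"
    return p, "ok"

# Hypothetical binary scores for three items across eight candidates
items = {
    "item1": [1, 1, 1, 1, 1, 1, 1, 0],  # facility 0.875 -> too easy
    "item2": [1, 0, 1, 0, 1, 1, 0, 0],  # facility 0.500 -> ok
    "item3": [0, 0, 1, 0, 0, 0, 0, 0],  # facility 0.125 -> too hard
}
for name, scores in items.items():
    p, verdict = flag_item(scores)
    print(f"{name}: facility {p:.3f} ({verdict})")
```

For an achievement test the band would be set differently (e.g. flagging only items well below the expected near-ceiling performance), which is why the thresholds are parameters rather than fixed values.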
Q: How long should a tester wait to test a newly taught item?
A: A week or so; postponing it further is not a good idea.