Author Message
PostPosted: Tue Mar 19, 2013 12:41 pm 
Joined: Tue Jul 21, 2009 9:56 pm
Posts: 8228
I'm considering setting up an e-petition to KCC asking them to:

1. Waiting to hear whether the changes to more "coaching resistant" 11plus papers in other authorities are successful before changing the specification for the 11plus test, thus keeping the September 2014 test the same

2. Consulting with parents prior to making changes

3. Repeating the headteacher consultation to help identify what is actually wrong with the current test and the changes that would be beneficial

4. Using data from secondary schools and the past 13 years of testing by GL assessment in Kent to see whether the test results have been reliable enough in terms of identifying potential, and if not, why not. Without this analysis it is difficult to ensure that changes will be for the better.

5. Involving an expert in testing in formulating the specification and the tender evaluation criteria

6. Making clear what the decision making process and timescale is for this whole exercise

7. Explaining the 11plus process more clearly to parents of year 4 children via Kent primary schools, in order to help level the playing field and encourage more to consider trying the 11plus. For example, even now there will be parents who don't know that you can buy GL Assessment practice papers, or that you can enter for the 11plus whatever the school thinks of your child's chances

Would you sign it? What else would you put in the petition?


PostPosted: Tue Mar 19, 2013 12:53 pm 
Joined: Mon Aug 22, 2011 8:20 pm
Posts: 1706
Location: Warwickshire
mystery wrote:
Using data from secondary schools and the past 13 years of testing by GL assessment in Kent to see whether the test results have been reliable enough in terms of identifying potential, and if not, why not.
That's a difficult one to prove because the educational environment will have an impact on outcome - so you can't really say that the eventual result is a reflection of just "potential".


PostPosted: Tue Mar 19, 2013 1:33 pm 
Joined: Tue Jul 21, 2009 9:56 pm
Posts: 8228
True - I suppose what I'm trying to get at is that a "good" test would be giving grammar schools the kind of children who, if they continue to do well, would be getting Bs and above at GCSE, and the non-selectives the potential Cs and below. If the grammars are getting many children who passed the 11plus but are not this capable, and the non-selectives are getting many children who didn't pass but are capable of a pretty good sweep of Bs and above, then, arguably, the test needs fixing.

Yes, quite how one measures this is tricky. But there must be some way of doing this other than just going on the tales of so many children who are coached to get into grammar who can't hack the pace, and so many children who were capable but failed on the day who have gone to the non-selectives. There's a mix of data for year 7 pupils that could be thrown into the equation.

A testing expert would be able to have a fair stab at it. Obviously there must always be a fair amount of arbitrariness at the borderlines in any test, but again, a testing expert would be able to look at the data provided by the tendering company to see if the tests being proposed were as valid and reliable as one could hope for.


PostPosted: Tue Mar 19, 2013 1:42 pm 
Joined: Fri Oct 26, 2012 3:43 pm
Posts: 29
In its current format the test is creating a disproportionate number of top scorers. The extra forms in the SS's this year have shown how the scores start dropping away as would be expected, with the lower scores needed for admission. Take away those extra places and a problem emerges in West Kent and I don't think the Grammar annexe is necessarily the solution given the number of vacant spaces in some of the Maidstone Grammars.
I sat the 11+ 30 years ago when it was compulsory to do so. I'm sure coaching was taking place back then but it certainly wasn't commonplace. I remember seeing the NVR practice paper and feeling quite flustered over it as I had never seen anything like it before and the same applied to many of my friends. 10 of us still made it to the Girls' Grammar though.
I really feel for all the parents going through this uncertainty with their Yr 4s, but at some point one year group is going to be the first to sit any new tests.


PostPosted: Tue Mar 19, 2013 1:52 pm 
Joined: Tue Jul 21, 2009 9:56 pm
Posts: 8228
I don't know whether there was a disproportionate number of top scorers in this year's test - we'd need to see the bell curve for that. Some of the FOI data on here suggested that might be the case, but it's a bit hard to work through in the format it was given. And it's not because of "test format" that a test produces a poor distribution (as did the Bucks test); it's because they haven't put enough of the harder questions in, and presumably haven't tested the papers / item spread adequately beforehand. Maybe this is a matter of managing existing contracts more closely too, and setting out what is acceptable for the test?

If it's not a good test now (which your take on the data suggests it isn't, as a test with too many people scoring highly is not a good discriminator at the pass / fail boundary), this is even more reason for not bunging a load of public money hurriedly at changing the test to something that may not be any better... these things require time and thought from people who know about them, not a hasty political decision.


PostPosted: Tue Mar 19, 2013 2:12 pm 
Joined: Fri Oct 26, 2012 3:43 pm
Posts: 29
I certainly agree that they could look at making changes to the existing format, possibly by introducing different question types and harder questions. However, as you point out, Mystery, our primary schools are not consistently teaching at the levels needed to achieve a level playing field for all our children.
It seems an almost impossible task to find a suitable test that will produce the desired outcomes KCC are seeking.


PostPosted: Tue Mar 19, 2013 4:34 pm 
Joined: Mon Aug 22, 2011 8:20 pm
Posts: 1706
Location: Warwickshire
mystery wrote:
I don't know whether there was a disproportionate number of top scorers in this year's test - we'd need to see the bell curve for that. Some of the FOI data on here suggested that might be the case, but it's a bit hard to work through in the format it was given. And it's not because of "test format" that a test produces a poor distribution (as did the Bucks test); it's because they haven't put enough of the harder questions in, and presumably haven't tested the papers / item spread adequately beforehand. Maybe this is a matter of managing existing contracts more closely too, and setting out what is acceptable for the test?

If it's not a good test now (which your take on the data suggests it isn't, as a test with too many people scoring highly is not a good discriminator at the pass / fail boundary), this is even more reason for not bunging a load of public money hurriedly at changing the test to something that may not be any better... these things require time and thought from people who know about them, not a hasty political decision.
In Kent for 2012 entry, 76 children scored 418, 98 scored 419, and 589 scored the maximum 420.
For 2013 entry (with an increased maximum of 423 - i.e. 3 x 141), 880 scored over 420, including over 660 on the full 423.

Out of a secondary transfer cohort of about 15000 that's over 4% scoring at 141 - whereas statistically it should be about 0.32%. A big difference. Standardising against the cohort, rather than a theoretical whole population would give more granularity to the results - whether or not they were to change test provider.
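As a side note, the "about 0.32%" figure can be sanity-checked in a few lines, since standardised scores are conventionally scaled to a normal distribution with mean 100 and standard deviation 15. This is a sketch, not anything from KCC or GL Assessment; the 15000 cohort and the 660 top scorers are the figures quoted above:

```python
# Rough check of the expected vs observed fraction of children at the
# 141 score ceiling, assuming scores ~ Normal(mean=100, sd=15).
from math import erf, sqrt

def normal_sf(x, mean=100.0, sd=15.0):
    """P(score >= x) under a normal distribution (survival function)."""
    z = (x - mean) / (sd * sqrt(2))
    return 0.5 * (1 - erf(z))

expected = normal_sf(141)              # fraction expected at/above 141
observed = 660 / 15000                 # figures quoted in the post above
print(f"expected at/above 141: {expected:.2%}")   # roughly 0.3%
print(f"observed at 141:       {observed:.2%}")   # 4.4%
```

So the observed top-end pile-up really is more than ten times what the standardisation model predicts, which is consistent with the ceiling-effect complaint.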

From some figures posted here recently:
Quote:
VR - 141 (raw score 64 out of 80 correct)
NVR - 141 (raw score 50 out of 72 correct)
it looks like a wide range of raw scores can all give the same outcome. Whatever the other pros and cons, that's one thing which would certainly change with CEM tests.
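A toy sketch of why that happens: with a linear standardisation to mean 100 / SD 15, every raw score whose standardised value lands above the cap collapses onto 141. The cohort mean and SD below are invented for illustration; only the 80-question paper length and the 141 ceiling come from the figures above.

```python
# Illustration (hypothetical cohort statistics): many raw scores map to
# the same capped standardised score of 141.
def standardise(raw, mean_raw, sd_raw, cap=141):
    """Linear standardisation to mean 100 / SD 15, capped at `cap`."""
    score = round(100 + 15 * (raw - mean_raw) / sd_raw)
    return min(score, cap)

# 80-question VR paper; assumed cohort mean 38 correct, SD 9
for raw in (60, 64, 70, 80):
    print(raw, "->", standardise(raw, 38, 9))
# 60 -> 137, while 64, 70 and 80 all print 141
```

Under these (made-up) numbers, everything from 64 correct upwards is indistinguishable at the top, which is exactly the discrimination problem being discussed.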


PostPosted: Tue Mar 19, 2013 5:01 pm 
Joined: Tue Jul 21, 2009 9:56 pm
Posts: 8228
But they haven't asked for that in the specification, have they? And they haven't managed GL Assessment during the current contract to those levels, if that was what was wanted.

The specification that you provided the link for is not really clear about what is wanted, is it? How can you fairly judge what comes in if you haven't set out a broad approach to what you want in the first place?

This is a public procurement exercise not a piece of guesswork as to what would please some Kent councillors at the moment. What is the value of this contract roughly?


PostPosted: Tue Mar 19, 2013 6:30 pm 
Joined: Fri Oct 26, 2012 3:43 pm
Posts: 29
Okanagan, looking on the Warwickshire forum, I see the historical data for the qualifying scores for each school since the introduction of the CEM test. Although it means absolutely nothing to me, I can see that the first year produced a similar score for all schools on the initial allocation. This year the qualifying scores have a wider gap between schools, so could it be that tutoring is having an effect on these? I can see, though, that the qualifying scores are very variable from year to year, which I'm guessing is the whole point of them.
Am I correct in thinking that the raw scores needed to qualify are also much lower?


PostPosted: Tue Mar 19, 2013 7:33 pm 
Joined: Mon Aug 22, 2011 8:20 pm
Posts: 1706
Location: Warwickshire
Yes, raw scores needed are probably 60-65% on average - although it varies according to which school, and at some a raw score nearer 50% overall will get a place. The test here may be a bit different to what you would get, though, as it's an opt-in test which probably only about 25% take - and even then this year the average raw score on the numerical reasoning (maths) part of it was only 23% :shock: It's definitely set here with the expectation that they won't finish it, and they won't get it all right - hence it does differentiate at the top end. Which it needs to, as Warwickshire (particularly South Warwickshire) only has super-selectives - so there are no grammars catering for the top 25% as you have. I suspect you'd get something closer to the one they're putting in in Buckinghamshire, as it tests almost the full ability range, with the intention of selecting the top 25% (ish).

Historically, what happened in Warwickshire was that after the first year of CEM tests the priority areas for the three schools in the Eastern area (Lawrence Sheriff, Rugby High and Ashlawn) changed, so that there were 2 areas - the required score went up in the outer area and down in the inner area, which had a higher priority for places, when they had previously had equal priority. The other 3 schools weren't affected, and the required scores remained broadly similar until this year, when 4 of the schools increased their admission numbers, hence the fall in the required score for those 4. The variation in scores this year is largely because the boys' school for the southern part of the county (King Edward VI) didn't increase its admission numbers when the others did. There is some suggestion that in the first year people doing DIY preparation did well, but that this has tailed off a bit since. I've not seen any hard evidence of this though.

