
Standardised scores

Posted: Mon Oct 06, 2008 12:45 pm
by shuff
Can anyone explain this to me? Is standardisation done using scores from children all over the UK who were previously chosen as guinea pigs, or is it done using the scores of the very children who have just taken the test?

Re: Standardised scores

Posted: Mon Oct 06, 2008 1:01 pm
by dadofkent
shuff wrote:Can anyone explain this to me? Is standardisation done using scores from children all over the UK who were previously chosen as guinea pigs, or is it done using the scores of the very children who have just taken the test?
From the children taking the test, so that the top 25% (in the case of Kent) of that year's cohort can be selected.

Posted: Mon Oct 06, 2008 2:25 pm
by ergo
I'm not sure, but I was under the impression that your first thought was correct, and that the test is first taken by a randomly selected trial group with the scoring then worked out from their performance?

Posted: Mon Oct 06, 2008 3:06 pm
by KenR
I think this question was covered earlier in the year in a discussion about Judd scores:
I suspect the problem comes from the different ways you can Age Standardise.

The most common approach is to standardise against the actual candidate exam population - this results in an average score of 100 (still with a maximum of 140), and the 87th percentile corresponds to a score of about 117.

But some LEAs use tests that have already been pre-standardised (by trialling) against a national average population. In this case the average score is very much higher, although the maximum age-standardised score is still 140 - it's just that there are lots of scores of 140.

It looks as though Kent use the latter approach - we can make this assumption as Judd School state that the average score is actually 120, not 100.

The problem is that I don't know how the CAF tests are standardised.

However, Judd state that this year's minimum pass mark was 413, which is an average of just under 138 across the three tests - so this is equivalent to a normal NFER cohort-type standardisation score of about 118, or the 88th to 89th percentile, i.e. circa 12% getting offers. Sounds like it's a tough school to get into.
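
If anyone wants to sanity-check those figures, here's a rough sketch in Python. It assumes the standardised scores follow a normal distribution with a mean of 100 and a standard deviation of 15, which is the usual convention for age-standardised tests but not something I can confirm for the Kent papers:

Code:
from math import erf, sqrt

def percentile(standardised_score, mean=100.0, sd=15.0):
    """Percentile rank of a standardised score under an assumed
    normal distribution (mean 100, SD 15)."""
    z = (standardised_score - mean) / sd
    return 100.0 * 0.5 * (1.0 + erf(z / sqrt(2.0)))

print(f"117 is roughly the {percentile(117):.0f}th percentile")        # ~87th
print(f"118 is roughly the {percentile(118):.0f}th percentile")        # ~88th-89th
print(f"413 over three papers is {413 / 3:.1f} per test on average")   # ~137.7

On those assumptions the percentile figures above check out; the step from a pre-standardised 138 down to a cohort-standardised 118 is my own estimate rather than anything published.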

Best thing to do is to try a few similar sample papers at home in strict test conditions and check the scores.

Posted: Mon Oct 06, 2008 4:48 pm
by medwaymum
My understanding is that, whatever process they use to standardise the children taking the test, they end up selecting 21% of the highest scorers, who are deemed 'successful', and the remaining 4% are selected via headteachers' appeals.

Posted: Mon Oct 06, 2008 5:51 pm
by One Down
KenR, the quote from a previous discussion contains a number of inaccuracies. The Judd website does not say that the mean score is 120; it says that the minimum selective score is 120. The mean cannot be 120, or 50% of the children would pass. According to NFER, 11+ tests still have a mean of 100. The Judd website does say that 140 is the most common score, which is true because it includes all those who would have scored 140+, i.e. those in the right-hand tail of the bell curve above 140. This may be where some confusion occurred.

Kent use local standardisation for their 11+, as you can see on the NFER website. This has to be the case, as 11+ candidates are mostly a self-selecting group of the most able children, so standardising them against the general population would just produce lots of 140s, not the spread required to differentiate between abilities. Local standardisation is why we can never work out how many age standardisation marks our children may get - it depends on how clever the other children taking the test are (and specifically how clever those born in the same month are). As NFER point out, the 11+ is standardised using the one and only group of children to take the particular test written for them.
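
For anyone curious, here's a very rough sketch of what a local age standardisation might look like. The grouping by month of birth, the target mean of 100, SD of 15 and cap of 140 are all assumptions on my part - NFER don't publish the exact procedure:

Code:
from collections import defaultdict
from statistics import mean, stdev

def locally_standardise(candidates, target_mean=100, target_sd=15,
                        floor=70, cap=140):
    """candidates: list of (birth_month, raw_score) pairs for everyone who
    sat this particular paper. Returns age-standardised scores in order."""
    by_month = defaultdict(list)
    for month, raw in candidates:
        by_month[month].append(raw)

    scores = []
    for month, raw in candidates:
        group = by_month[month]                      # children born in the same month
        mu = mean(group)
        sd = stdev(group) if len(group) > 1 else 0.0
        z = (raw - mu) / sd if sd else 0.0           # distance above/below the month-group average
        scores.append(round(min(max(target_mean + target_sd * z, floor), cap)))
    return scores

The point is that a child's score depends entirely on the month group - the other candidates born in the same month who sat the same paper - which is why it can't be worked out in advance.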

If anyone is interested, I found this link while looking for something else.
http://www.nfer.ac.uk/publications/pdfs ... ialing.pdf
It is from NFER, showing how they trial test questions to make up a bank to choose from for future tests, and it makes interesting reading. So what happened to our NVR?? Makes you think they must make one of the papers each year harder on purpose, then...??? :twisted:

Posted: Mon Oct 06, 2008 6:08 pm
by dadofkent
One Down wrote:KenR, the quote from a previous discussion contains a number of inaccuracies. The Judd website does not say that the mean score is 120; it says that the minimum selective score is 120. The mean cannot be 120, or 50% of the children would pass. According to NFER, 11+ tests still have a mean of 100. The Judd website does say that 140 is the most common score, which is true because it includes all those who would have scored 140+, i.e. those in the right-hand tail of the bell curve above 140. This may be where some confusion occurred.

But the test is selecting the top 25% of the whole year's cohort, including those who do not take the test. Therefore, significantly more than 25% of those who actually take the test will pass. If, for argument's sake, only 25% of the cohort who were eligible to take the test did so, then they would all pass.
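
To put some numbers on that - the cohort size and take-up figures below are entirely made up for illustration:

Code:
cohort = 10_000                  # hypothetical size of the whole year group
places = int(0.25 * cohort)      # top 25% of the cohort gain selective places

# Assuming every selected child is among those who actually sit the test,
# the pass rate among sitters rises as take-up falls.
for sitters in (10_000, 6_000, 2_500):
    print(f"{sitters:>6} sit the test -> {places / sitters:.0%} of them pass")

With full take-up, 25% of sitters pass; if only a quarter of the cohort sat the test, every sitter would pass.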

Posted: Mon Oct 06, 2008 6:49 pm
by KenR
Hi One Down

I've had a further read of the information on the Judd website about the Kent tests and I'm pretty sure that the tests will be pre-standardised against a national trial sample.

The Kent Tests – Information for Parents
The maximum score in any of the Kent PESE tests is 140, but this does not equate to 100%.
Historically, the score of 140 equates to approximately 75% on any one test.

A minimum selective score of 120 equates to approximately 50%. The tests are in:
Verbal Reasoning (80 questions)
Non-Verbal Reasoning (72 questions)
Mathematics (50 questions)

Figures are approximate because older children in the year group needed to gain a slightly higher % than this, whilst younger children could gain a slightly lower %. In addition, there is a variation from year to year.

There is no guarantee that this pattern (standardisation) will continue in January 2008 or that commercially produced practice tests match this distribution.

Judd Entry Line
Scores in the three tests are added together and students are then rank-ordered according to their scores (with distance from the school determining the ranking for students on the same aggregate score). 420 is the most common score in Kent and is achieved by approximately 50% of those offered a place at Judd. In 2007 the entry line for Judd (the score of the 125th student offered a place) was 413 (this has varied between 403 and 414 in recent years).
The reason I'm fairly sure is that, for a normal age standardisation, the standardised scores correspond to percentile ranks, with a score of 133-138 being around the 99th percentile and 139+ being well inside the top 1%. Given that Judd say that 140 is the most common score and that this equates to about 75% on any one test, it must be a national sample standardisation. In a normal standardisation against the cohort, 140 is not a very common score.

This is not uncommon; Warwickshire did this for many years using computer pre-generated results for the Moray House tests. Again, these produced lots of scores of 140 (70 out of 946 in 2006/2007) - a max of nearly 5% for one of the tests.

Posted: Mon Oct 06, 2008 8:01 pm
by One Down
This is turning into an interesting discussion for those of us scrutinising the details as a diversion from thinking about when the results will arrive!

Dad of Kent, you are absolutely correct that not everyone takes the exam. However, various bits of info suggest more than 50% of children fail in Kent, and the pass mark changes from year to year, so 120 seems to be arbitrary.

KenR, I appreciate your more extensive knowledge of various areas. Most of my info was gleaned from the 11+ section of the NFER website, as the Kent 11+ is set by NFER. This passage is taken from the 11+ selection section of the website:

Tests that are sold to schools 'off-the-shelf', for purposes other than secondary selection, are usually accompanied by a standardisation table that enables the teacher to look up a pupil's standardised score. This table can be computed in advance because, prior to publication of the test, it has been administered to a large representative national sample of pupils and the standardised scores that are in the table are a reflection of the raw scores and ages of all the pupils in that sample.

In contrast, NFER constructs 11+ tests especially for a school or Local Authority and this is not taken before or after by any other group of pupils. Therefore, the tests are standardised using the first and only group of pupils to take the tests - the 11+ candidates - and this is known as a "local standardisation".

It should be understood that, without special adjustment, locally standardised scores are not comparable to nationally standardised scores. This is because the pupils in a national sample are specially chosen to be nationally representative, whereas this cannot necessarily be said for the group of pupils that is entered for the secondary selection tests at one particular school or in one particular Local Authority.

Any ideas? I suppose in the end, it doesn't really matter, as the only thing that counts is what's in that envelope!

Posted: Tue Oct 07, 2008 7:31 am
by KenR
Hi One Down

I guess the only way to find out would be to write to or email Kent LEA and request clarification - this would obviously need to be done by a Kent parent.

It's interesting that Warwickshire switched from Moray House last year (due to claims of cheating) and changed to a bespoke NFER test for one year only, prior to switching again to a bespoke University of Durham test this year.

When they ran the NFER test last year, I believe they wanted the marking and scoring to be in a similar range to the previous test (pass mark 126), and so I think they must have adopted a national rather than a local standardisation. So I suspect that NFER can do this if required.

As you say, a somewhat academic exercise, as it's what's in the envelope that counts!

Best of luck

Regards

Ken