Is this real research or just a commercial activity?

In the Herald this morning:

‘SCOTTISH children are no longer the best at reading across the UK and Ireland – after being knocked off the top spot in a “stinging blow” to education leaders. The largest literacy study ever conducted in the UK, written by Professor Keith Topping from the University of Dundee, has revealed that Scottish pupils are now joint second in terms of their level of reading comprehension – on a par with England and behind Northern Ireland.’

I can find no sign of ‘proper’ peer-reviewed research, published in a credible journal, but only this press release on the Dundee University website:

‘The What Kids Are Reading Report 2020, written by the University of Dundee’s Professor Keith Topping for reading practice and assessment provider Renaissance UK, showed that Scottish youngsters have slipped behind Northern Ireland and have come joint second in terms of level of reading comprehension, alongside England.’

In an effort to access the actual report, I find that What Kids Are Reading reports are published by a US corporation which seems to fund ‘research’ as part of a marketing strategy to stimulate sales of a narrow range of books, which it describes as the ‘Top 20 Books’ for different levels of reading. Renaissance reports also include links to its products.

I can’t seem to access any report with details of sampling, methodology or data. The Dundee press release has no link to it but, oddly, does list some of the ‘favourite books and authors.’ I remain open to being corrected about this, if someone can provide me with a link.

I see, regarding last year’s report:

The annual What Kids are Reading Report, analysed by the University of Dundee’s Professor Keith Topping, was written using data compiled by literacy and assessment provider Renaissance UK. The study analysed the reading habits of 1,057,720 pupils across the UK and Ireland, including 29,524 in Scotland.

The data was provided by a private business?


16 thoughts on “Is this real research or just a commercial activity?”

  1. Well, good for NI kiddies!

    We don’t need to be top at everything all the time 😉

    And I’m not keen on the way children’s learning has been politicised – it’s far too much pressure to put on young people. It’s good to know there are good educational standards and children are benefitting, but just like the pizza scoring – what is a useful measure, and do we want children being pressured into unnecessary achievements? Reports like this sound pretty sketchy, and don’t help.


  2. Here’s an actual expert talking about progress on tackling the attainment gap –

    “Some encouraging indicators are also emerging. In December 2019 PISA reported that pupils’ social background has less influence on reading and maths attainment in Scotland than the OECD average and attainment between the most and least disadvantaged children and young people has narrowed on most indicators. For example, the gap between the most and least disadvantaged narrowed in the percentage of pupils achieving expected CfE level in literacy, and numeracy in primary schools. In secondary schools the gap between the most and least disadvantaged has narrowed in the percentage of pupils in S3 achieving CfE 3rd level or better in numeracy.”


  3. Found this while, like you, searching for the report on which several newspaper reports are based this morning, to inform my Masters class about to start next week at the University of Glasgow on ‘Developing Literacy’. What is this about …! Can only find the US report (as yet unpublished) and no useful reliable evidence.


  4. To really see what these “Experts” of all kinds, on many subjects, are about, we need to delve a bit more into where they come from, what their background is, and what prompted their question or report on any given issue. As we have seen repeatedly in the past, many have a motive. Knowing their history and where they got their status should be a subject before we listen to anything they say.


  5. This is probably only for ‘anoraks’ here rather than given a wider profile. And with apologies for the length.

    Having delved into the full text, including the data tables and technical annexes, of the report entitled ‘What Kids Are Reading’, I confess to struggling to verify either the assertions made by The Herald or the ‘proof’ claimed by the Scottish Tory spokesperson on education Jamie Greene that it shows failing standards. I’m also perplexed by the claims based on this report in the University of Dundee’s press release, a source where one would expect clarity and objectivity.

    Firstly on the origins of the report. It is based on research commissioned by a commercial company, Renaissance, a company which sells educational software to schools. This particular report exploits data generated by the classroom use of its commercial ‘Accelerated Reader’ package, described as a personalised practice and daily progress monitoring system with accompanying computer quizzes which test pupils’ understanding (comprehension) of each individual book they have read.

    It is a relative decline in comprehension amongst pupils in Scotland that the University of Dundee’s press release focuses on, and this has been picked up by others.

    The report examines ‘quality of comprehension’ using a metric it terms the ‘Average Percent Correct’ (APC) score. This is assessed on the basis of pupils’ performance on book ‘quizzes’ – the higher the score, the better the pupil’s comprehension. (As an aside, at one point the report states: “The most effective indicator of quality of implementation is Average Percent Correct (APC)” – so is the implication here that the APC is a measure influenced by how well the computer package has been implemented?)

    In the report, quiz-based APC data – the metric of comprehension – are reported separately for pupil cohorts ranging from Year 1 to Year 11 (England’s terminology) over the school year 2018-19. (Years 1 to 7 equate to Scotland’s P1 to P7 in the primary sector.)

    Searches for APC (i.e. the comprehension-related) data disaggregated at a country (‘regional’) level yielded surprising results. Given the prominence (and policy significance) of this matter in the University press release and the nature of the Tory spokesperson’s comments, I expected something quite definitive. However, I have failed to find a reference in the main report or its appendices to APC scores being combined (aggregated) across all years by individual country: therefore I was unable to verify the claim that Scotland’s ranking has changed over time in any overall sense. If an overall determination has indeed been made by the researchers, their method and a statement of this conclusion are not evident in these terms within the report!

    There is a series of tables in an appendix (from Table 63 onwards) labelled “Variation in Reading by Region”; each table focuses on only one Year-based cohort of pupils in turn. Oddly, it is only in a footnote to each table, in small type, that APC scores (i.e. the crucial comprehension measure) for each country are given, for the 2020 report and for the previous year’s for comparison.

    So to the nub of the matter, the APC scores for Scotland relative to the other home countries:

    YEAR 1: there are no UK APC data for this Year outside England – so no ranking of Scotland is possible

    YEAR 2: Scotland’s score ranks third equal in the 2020 report out of the four UK nations: it ranked second of three in the 2019 report

    YEAR 3: Scotland’s score ranks second equal in the 2020 report: in the 2019 report it had the same score and this was equal to all other UK nations in that report

    YEAR 4: Scotland ranks first in the 2020 report of the four UK nations: it also ranked first in the 2019 report

    YEAR 5: Scotland ranks first in the 2020 report: it ranked first equal in the 2019 report

    YEAR 6: Scotland ranks fourth in the 2020 report: it ranked first equal in the 2019 report

    YEAR 7: Scotland ranks second equal in the 2020 report: it also ranked second in 2019.

    Although one can continue the analysis up to the Year 9 to 11 cohorts, the data are less directly comparable as the transition from primary to secondary school takes place in different ‘years’ across the UK. Also, the number of pupils involved becomes much smaller in the older cohorts. But for completeness – Scotland ranks first equal on APC scores in Year 8 in the 2020 report (it was fourth in 2019 with the same score); for Years 9 to 11 combined, Scotland ranks first in the 2020 report, having ranked second equal the previous year.

    So does an overall, significant performance decline in the data for Scotland jump out at you? Me neither! It’s all rather ‘odd’.

    Over the various Year groups and over two years of data (which, for the avoidance of doubt, cannot justify any conclusions about a trend to better or worse results anyway!), it is some stretch to see in this report any compelling ‘proof’ of the failure within Scottish education that the Scottish Tory spokesperson claims it demonstrates!

    Moreover, I failed to find a technical explanation of the statistical significance (or not) of the differences recorded in average APC scores, either between the different countries of the UK or between the two time periods. It is in the APC scores for Year 6 that Scotland’s relative ranking sees the most evident change: this is due to its score of 0.93 in the 2019 report dropping to 0.91. But what is the statistical significance of such a change? I think the University of Dundee and its principal investigator in this commercially funded study should explain this to the Scottish teaching profession, the Scottish public, The Herald and, of course, the Scottish Tories’ education spokesperson!
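    As a rough back-of-envelope illustration of the question (not an answer from the report, which publishes no per-country quiz counts), one could treat APC as the proportion of quiz questions answered correctly and sketch a naive two-proportion z-test. The cohort sizes below are entirely hypothetical:

```python
from math import sqrt

def two_proportion_z(p1, n1, p2, n2):
    """Naive two-proportion z-test statistic for the difference p1 - p2."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)              # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # standard error of the difference
    return (p1 - p2) / se

# Hypothetical question counts -- the report publishes no such figures.
z = two_proportion_z(0.93, 5000, 0.91, 5000)
print(round(z, 2))  # z ≈ 3.69 for these made-up inputs
```

    Note that even where such a naive test would call a 0.02 drop ‘significant’, quiz answers are clustered within pupils and books, so the independence assumption fails and the test would overstate significance – which is exactly why the researchers, not their readers, should publish the proper analysis.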

    Dear John, please alert us to something much more interesting and rewarding tomorrow!!


      1. Indeed. Without showing that the tests of reading are valid and reliable, the level of statistical significance which led to the conclusions (as StewartB points out) and, of course, how the pupils were chosen – were they a randomised sample? – the “research” is useless. Maybe you can email the author and ask him to respond (or not) to the criticisms on here.

        (Most tests of reading, or anything else, use standardised tests which have been rigorously tested on a large population. Even these have their problems, but at least they have been constructed using established procedures.)

