Introduction
Boost Insights includes a range of free reports to help analyse your test data. These include:
- School overview report (digital report)
- Question-level analysis report (digital report)
- Group report (PDF)
- Individual learner report (PDF)
- Learner progress report (PDF)
- Test performance comparison (digital report)
- Age performance review (digital report)
- Group average review (digital report)
Scroll down for a detailed summary of each report.
School Overview Report (Digital Report)
The School overview report lets you easily compare attainment across year groups for each academic year. The stacked bar chart shows the percentage of learners working towards the expected standard, working at the expected standard or working at greater depth across the assessment suites used in your school during the year.
This report is available for: New PiRA, New PUMA, PiRA for Scotland, PUMA for Scotland, New GaPS, NTS Reading and NTS Maths.
Note that the report is only generated for pupils who have taken the test for their specific year group, e.g. all Year 2 pupils who have taken New PiRA 2 Autumn. Any pupils who have taken a paper intended for a different year group, i.e. those working below, are not included in the report.
When you first land on the report it will be blank until you make your selections from the filters. You need to select:
- The product(s)/test suite(s) used in your school (up to 3)
- The academic year you want to generate the report for
- The academic term (Autumn (Term 1), Spring (Term 2) or Summer (Term 3))
Clicking on a bar in the School overview report (e.g. Year 4) will load the Question-level analysis report for that specific year group and test paper (e.g. Year 4, New PiRA 4 Autumn).
Question-level Analysis Report (Digital Report)
The question-level analysis report lets teachers analyse gaps in learning to inform targeted teaching at a question and strand level. The report can be run for a specific class, group or year group on one test.
You can select multiple classes OR groups OR year groups to generate the report (i.e. you can only choose multiples within the same category).
Note that if a learner has taken a test paper twice, the QLA report will only pull in their most recent result.
The QLA report is separated into two parts: Overall scores and Strand scores. The key explaining the colour coding can be found by hovering over the (i) button.
The report lets you view each learner’s specific results for every question and strand. You can also view the overall performance of the group via the average scores for each question and strand. If a test has facility/national average values available then a difference is calculated between the facility/national average and your learners’ average score. The question facility/national average is the percentage of pupils in our standardisation sample who got a question right and the strand facility/national average is the percentage score the standardisation sample achieved on a specific strand. The facility/national average allows you to compare the performance of your group/class/year group of learners to our standardisation sample.
Note that facility/national average values are not available for ART, AMT, BNST and SSRCT.
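To make the comparison concrete, here is a small illustrative sketch (in Python, with made-up marks and a made-up facility value) of how a group average on one question can be set against the facility/national average. The report does this calculation for you; the sketch only shows the arithmetic behind the difference row.

```python
# Illustrative only: comparing a group's average on one question with the
# facility/national average. All figures below are invented for the example.

learner_scores = [1, 0, 1, 1, 0, 1]   # marks awarded to your group on Q1 (1 = correct)
facility = 72.0                       # % of the standardisation sample who answered Q1 correctly

group_average = 100 * sum(learner_scores) / len(learner_scores)   # 66.7%
difference = group_average - facility                             # -5.3 percentage points

print(f"Group average: {group_average:.1f}%")
print(f"Difference vs facility/national average: {difference:+.1f} percentage points")
```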
For learners working below:
If you have any learners who are working below and have taken a test that is outside of their age range, e.g. Year 4 pupils who have taken New PUMA 2 Autumn, you can use the ‘Add learner(s) working below range’ feature at the bottom of the QLA report. This ensures that the average score for the group/class/year group reflects the number of learners in the group and is not inflated.
[image]
Click the + and – buttons to add or remove learners working below range. Once added, the averages in the Overall scores and Strand scores sections will update. Note that this information is not saved: if you navigate away from the page, it will be lost.
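If it helps to see why this matters for the averages, the sketch below (Python, with made-up marks) assumes that learners added as working below range are counted in the denominator without contributing marks on the in-range paper; the report handles the exact recalculation for you.

```python
# Illustrative sketch of why adding learners working below range matters.
# Assumption for this example: added learners are counted in the denominator
# but contribute no marks on the in-range paper, so the average is not inflated.

scores = [14, 18, 11, 16, 19]   # marks scored by learners who sat the in-range paper
working_below = 2               # learners added via the 'working below range' feature

average_without = sum(scores) / len(scores)                  # 15.6
average_with = sum(scores) / (len(scores) + working_below)   # 11.1

print(f"Average ignoring learners working below:  {average_without:.1f}")
print(f"Average including learners working below: {average_with:.1f}")
```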
If you wish to keep a copy of the QLA for your group/class/year group which includes learners working below range you can click export test data to download the report as a spreadsheet:
[image]
Learners working below will be added at the bottom of the list of names in the export, one row per addition.
[image]
This report is available for all products excluding SSRCT. Note that for our wide-span tests (ART, AMT and BNST) there will not be specific facilities provided for each question and each strand, as the test papers for these products can be used across a wide age range. Therefore the facility row and difference row at the top of the report will be hidden, and you will only be able to view your learners’ average scores for each question and each strand.
Group Report (PDF)
The Group report lets you view the performance summary of learners in a group or class on a specific test, including the proportion of learners working below, at or above expectations and the group’s average strand performance. For every learner you can review their raw score, relevant standardised scores and the number of marks awarded for each strand.
In the ‘Proportions matching expectations’ graph, the data is displayed as a count not a percentage, e.g. 20 pupils in total working at expectations.
This report is available for all products, but the ‘Proportions matching expectations’ graph will not show for the following products: ART, AMT, BNST and SSRCT. For these products, there will also not be any national average data reported in the ‘Average strand performance’ bar chart.
Individual Learner Report (PDF)
View the performance of an individual learner on a specific test. Overall performance includes the learner’s raw score and relevant standardised scores. A strand performance graph is also included.
This report is available for all products.
Learner Progress Report (PDF)
Compare the performance of one learner across multiple tests (up to six) over time. The overall performance for each test includes the learner’s raw score and relevant standardised scores. The Strand performance bar chart can be used to home in on specific learning gaps.
This report is available for all products.
Test Performance Comparison (Digital Report)
Compare the performance of a class, group or year group across two test papers from the same test suite, viewing the results side-by-side.
Results are presented as raw scores, standardised scores and age-standardised scores.
Two test papers must be selected to generate the report.
The difference is calculated for you and colour-coded to highlight if a learner is performing below, at or above average. When calculating the difference, if a standardised score or age-standardised score contains a less than (<) or more than (>) sign, the number that precedes or follows the sign is used in the calculation e.g. if a pupil has scores <69 and 71, then the difference will be calculated as 71 – 69, and the result will be displayed as <2.
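As a purely illustrative sketch of the rule described above, the hypothetical Python function below drops a leading < or > before subtracting and carries the sign through to the displayed result, mirroring the <69 and 71 example. The report performs this calculation for you.

```python
# Illustrative sketch of the documented difference rule: if a score carries a
# '<' or '>' sign, the number next to the sign is used in the arithmetic and
# the sign is carried through to the displayed result. Scores are made up.

def score_difference(earlier: str, later: str) -> str:
    sign = ""
    values = []
    for raw in (earlier, later):
        if raw[0] in "<>":
            sign = raw[0]              # remember the sign for the displayed result
            values.append(int(raw[1:]))
        else:
            values.append(int(raw))
    diff = values[1] - values[0]
    return f"{sign}{diff}"

print(score_difference("<69", "71"))   # "<2", matching the example in the text
print(score_difference("100", "112"))  # "12"
```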
This report is available for all products excluding BNST and SSRCT.
Age Performance Review (Digital Report)
Compare the age performance of a class, group or year group on one test. Review a learner’s age-standardised score, actual age at the time of the test and the attainment age on the test.
The difference between a learner’s actual age and attainment age is calculated for you and colour-coded to highlight if a learner is performing above, at or below their actual age.
When calculating the difference, if an age-standardised score or attainment age contains a less than (<) or more than (>) sign, the number that precedes or follows the sign is used in the calculation, e.g. if a pupil has scores <69 and 71, then the difference will be calculated as 71 – 69, and the result will be displayed as <2.
This report is available for all products excluding BNST and SSRCT.
Group Average Review (Digital Report)
Compare the group average of a class, group or year group across the products (test suites) used in an academic year. To generate the report an academic year must be selected.
For each assessment and for each termly paper, review the average standardised score, average age-standardised scores and average attainment age of learners who took the test.
For each measurement the difference is calculated for you and colour-coded to highlight if the group has made above-average, average or below-average progress. When calculating the difference, if a standardised score, age-standardised score or attainment age contains a less than (<) or more than (>) sign, the number that precedes or follows the sign is used in the calculation, e.g. if a pupil has scores <69 and 71, then the difference will be calculated as 71 – 69, and the result will be displayed as <2.
[image]
To access the Group average review report, you must select the Product Comparison Reports tab in Analyse data: dashboard & reports. For detailed guidance, see Generating Reports.