The statistics module, available on FLOWmulti, can be used to gain insight into the performance of both test-takers and questions on a particular flow. The module is available to both assessors and reviewers: assessors see summary statistics only for the students they have been assigned to, whereas reviewers have oversight of all students enrolled on the flow.
To access the statistics module as an assessor, click the left-hand flow menu from inside the marking tool and choose Statistics.
As a reviewer, use the Statistics button on the flow landing page.
The landing page shows a graphical overview of the generated statistics, split up into several sections which are explained further below.
Test-level statistics are provided in 3 charts, shown below.
A histogram depicting the number of test-takers within each scoring interval. Use the wrench icon to adjust the size of the bins to 2, 5, 10 or 20 point ranges.
A pie chart which shows the proportion of items (note this is at the item-level rather than the individual question-level) marked as either fully correct, partially correct, incorrect, unanswered or awaiting manual scoring. You can adjust the chart by clicking on the legend to hide or show different categories.
A summary of the scoring, including the mean, maximum and minimum scores achieved, an indication of the most and least difficult items (based on average scores), and which test-taker scored the highest overall.
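If you export the underlying scores for further analysis, these test-level figures can also be reproduced outside the platform. The sketch below is a minimal example assuming a hypothetical export file (flow_scores.csv) with a "Total score" column; the file name and heading are placeholders rather than the exact export format, so adjust them to match your own download.

```python
import pandas as pd

# Load an exported score table (file name and column heading are assumed
# for this example; check the headings in your own CSV export).
scores = pd.read_csv("flow_scores.csv")
totals = scores["Total score"]

bin_width = 5  # corresponds to choosing 5-point ranges with the wrench icon

# Count test-takers per scoring interval, mirroring the histogram.
edges = list(range(0, int(totals.max()) + 2 * bin_width, bin_width))
print(pd.cut(totals, bins=edges, right=False).value_counts().sort_index())

# Summary figures: mean, maximum and minimum total score.
print(totals.mean(), totals.max(), totals.min())
```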
Item-level and participant-level statistics are shown under four separate tabs, each described in the following sections. Most of the data available here can be exported in case you wish to load it into other statistical applications to conduct further analysis.
The participant tab gives an overview of how each test-taker has performed on the test.
- Participant information (which can be anonymised depending on the flow settings).
- Access the individual participant view to drill down into a more detailed overview of that particular test-taker's performance.
- See a summary of any annotations added against the participant (assessors only).
- Submission status.
- The calculated score (includes items that have been manually scored).
- A visual indicator of the distribution of scoring outcomes across the test items (values are correct, partially correct, incorrect, awaiting manual scoring or unanswered).
- Export the table to a CSV file. It is possible to select which columns should be included in the export.
The score tab shows a more detailed view of how test-takers have performed across the different questions and items in the test.
- Participant information (which can be anonymised depending on the flow settings).
- Access the individual participant view to drill down into a more detailed overview of that particular test-taker's performance.
- Score information split into different columns for the auto-scoring and manual scoring as well as the total score.
- The scoring applied to each item in the test, with a colour-coded indication of whether the item was answered correctly or incorrectly. You can see whether a manual score has been applied by hovering over a value.
- Export the table to a CSV file. It is possible to select which columns should be included in the export.
The assignment tab shows a summary of how the questions performed in relation to the scores achieved by the test-takers.
- Item number or ID.
- Access more detail by drilling down into each individual question.
- Average, variance and spread columns, showing the mean score achieved on the item, the variance of those scores and their standard deviation (see the sketch after this list).
- A visual indicator of the distribution of scoring outcomes across the test items (values are correct, partially correct, incorrect, awaiting manual scoring or unanswered).
- Export the table to a CSV file. It is possible to select which columns should be included in the export.
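As an illustration of what the average, variance and spread columns represent, the sketch below recomputes the same per-item figures from a hypothetical long-form export of item scores. The file and column names are assumptions, and the text above does not state whether sample or population variance is used, so treat the exact values as indicative.

```python
import pandas as pd

# Exported per-item scores in long form (assumed file and column names:
# one row per participant per item, with the score achieved on that item).
item_scores = pd.read_csv("item_scores.csv")  # columns: Item, Participant, Score

per_item = item_scores.groupby("Item")["Score"].agg(
    average="mean",
    variance="var",   # sample variance; the module may use the population form
    spread="std",     # standard deviation
)
print(per_item)
```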
The selected responses tab shows a list of the individual items with a correct answer key, a points value for the correct answer and an indication of which response option was selected by each test-taker. Where the test is set to shuffle answer options, the data visible here refers to each response option anchored in its original position from when the test was built (not necessarily the letter the student selected in their shuffled view).
Please note that this feature does not support all question types and works best for single best answer style items.
It is possible to export this data to load into other statistical packages if required.
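For example, the exported selected-responses data lends itself to a simple distractor analysis. The sketch below assumes a hypothetical export with one row per participant per item, containing the selected option and the answer key; the file and column names are placeholders rather than the exact export format.

```python
import pandas as pd

# Exported selected responses (assumed layout: one row per participant per
# item, with the option letter they chose and the item's answer key).
responses = pd.read_csv("selected_responses.csv")  # columns: Item, Participant, Selected, Key

# How often each response option was chosen on each item, as a percentage.
option_freq = (
    responses.groupby("Item")["Selected"]
    .value_counts(normalize=True)
    .mul(100)
    .round(1)
)
print(option_freq)

# Proportion of test-takers who chose the keyed answer on each item.
responses["Correct"] = responses["Selected"] == responses["Key"]
print(responses.groupby("Item")["Correct"].mean())
```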
The individual item overview shows a more detailed set of summary statistics about that specific item, including a summary, the score and answer distributions, a comparative distribution boxplot, granular item analysis and details of the test item itself.
Shows a brief summary of the item performance data including the average score, variance and standard deviation.
A pie chart showing the distribution of scoring outcomes on the item. Options are correct, partially correct, incorrect, unanswered and awaiting manual scoring. You can adjust the chart by clicking on each label in the legend to show or hide it in the chart.
A pie chart showing the distribution of responses on the item. Not all question types are supported and this feature works best with single best answer style questions. You can navigate between the individual questions in the item using the dropdown field at the bottom right.
Shows the summary statistics mapped onto a boxplot for comparison against other items in the test. The currently selected item is shown in blue.
Shows the score achieved by each test-taker on the item, split into auto-scored and manually scored credit. It also displays the correct answer and the answer chosen by the test-taker.
A preview of the item as it was displayed to test-takers. Where the test is set to shuffle answer options, the answer options shown here are anchored in their original position from when the test was built (not necessarily the order in which they were presented to individual test-takers).
The individual participant overview provides details of the test-taker's performance across the test as a whole and on each item within it. This includes summary information, mark and answer distributions, a cohort comparison (histogram) and a detailed outline of how they responded to each item.
Shows the test-taker's biographical information, which can be anonymised in the flow settings.
Shows the proportion of the total available marks the test-taker has been awarded, with a visual indication of the split between auto-scored and manually awarded points.
A pie chart displaying the distribution of the scoring outcomes for all of the items in that test-taker's submission. Options are correct, partially correct, incorrect, unanswered or awaiting manual scoring. You can adjust the chart by clicking on each label in the legend to show or hide it in the chart.
A histogram that shows where the test-taker sits within the overall cohort distribution of scores. The bin in which the test-taker resides is denoted by the dark blue bar. Use the wrench icon to adjust the size of the bins.
A complete overview of the test-taker's performance across all items in the test. This includes the status of each item (correct, partially correct, incorrect, unanswered or awaiting manual scoring), the scoring achieved by auto-validation and by manual marking, and how that score compares to the cohort mean for each item.
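Because the per-item cohort comparison is simply an average across all submissions, it can also be reproduced from exported data if you want to take the analysis further. The sketch below assumes a hypothetical long-form export with Item, Participant and Score columns and a hypothetical participant identifier; adjust these to match your own files.

```python
import pandas as pd

# Compare one test-taker's item scores with the cohort mean per item.
# File, column names and the participant identifier are assumptions made
# for this example.
item_scores = pd.read_csv("item_scores.csv")  # columns: Item, Participant, Score

cohort_mean = item_scores.groupby("Item")["Score"].mean()
one_participant = (
    item_scores[item_scores["Participant"] == "Participant 12"]
    .set_index("Item")["Score"]
)

comparison = pd.DataFrame(
    {"Participant score": one_participant, "Cohort mean": cohort_mean}
)
comparison["Difference"] = (
    comparison["Participant score"] - comparison["Cohort mean"]
)
print(comparison)
```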