It’s not easy being an e-marking team manager. With students counting the days until results are published, you need to meet your own marking targets while also assessing the marking of your team of examiners. You need technology that supports both accuracy and speed.
So when our long-term client RM wanted to make sure its RM Assessor e-marking platform was working for team managers, we began with one question in mind: what will really make life easier for this pressured user group?
We’d already been working on the development and user experience design of RM Assessor, but our focus for this project was entirely on the experience of team managers.
We ran workshops to understand their needs and challenges, and mapped out how they currently complete tasks on RM Assessor – and how this could improve. We then used sketching, wireframes, prototypes and several rounds of user testing to validate our findings before making changes.
We worked closely with the RM team to help develop their skills too – sharing our approaches to things like user testing and impartial user interview techniques.
RM Assessor’s users were at the heart of our thinking throughout the process, so it means a lot that their initial feedback has been so positive.
94% of markers rated RM Assessor3 as good or excellent.
95% said it made marking more efficient.
98% said it was intuitive to use.
The training we gave RM has the potential to improve other aspects of the platform too, helping to ensure future development is always based on the genuine experiences and expectations of users.
Having an intensive phase of the project focused on user experience really helped us to make this complex area of the system better for team managers. The feedback from the examining bodies has been very positive.
Whether you want to create a new digital product or make an existing one even better, we’d love to talk it through.