Assessment & Marking

Moving to e-Assessment: a case study


By Dr Therese Lawlor-Wright, Lecturer in Management of Projects,
the School of Mechanical, Aerospace and Civil Engineering


I teach on the core ‘MSc in Management of Projects’ unit in the School of MACE. The programme is very large, taking up to 330 full-time students per year, and students are mainly from overseas with a variety of backgrounds. This unit runs in the first semester, when many students are still struggling with language comprehension and need to become familiar with the vocabulary and terminology of Project Management. In this core unit, we include eLearning material which introduces them to terminology and some core concepts from the Association for Project Management. The eLearning is very popular with the students as it allows them to study the material at home and to replay it as often as they like. In this way, students quickly grasp the vocabulary and key concepts and are better prepared for higher-level study in their MSc course.

The assessment of this learning takes the form of a multiple-choice, closed-book test with 60 questions in one hour (the same format as the accredited exam). I ran this test in paper-based form with the students for two years. Due to the high number of students and the need for adequate spacing and supervision in the test environment, this required booking additional classrooms with the support of teaching assistants and academic staff. It also required time to prepare and photocopy test papers and answer sheets, to mark the papers and then to transfer the marks into Blackboard for release to the students.

Running the paper-based test was quite a logistical feat due to the need to split the large student group across different venues and arrange invigilation, transport of papers, etc. Obtaining additional rooms for the test was difficult, and co-ordinating the activities of already busy staff was challenging. After the test, the marking was shared between six staff but needed to be organised and distributed, then the marks checked and collated. My office became loaded with paper, and I was extremely concerned about the possibility of answer sheets being misplaced, as well as the potential for error in transferring marks from paper to electronic systems.


The idea of moving to online assessment had been considered about one year previously. I found it quite daunting as it was a new approach – introducing the potential for Blackboard gremlins to strike – jeopardising my best-laid plans. I had a couple of meetings with the EPS eLearning team throughout the year, during which we discussed how it could be done. The eLearning staff supported me greatly in setting up the assessment in the Blackboard environment. This was done months in advance to allow for quality checking; questions were presented in random order and kept in a hidden test folder.

I realised that the EPS eLearning team were experts in running these tests with student groups across the Faculty. I was reassured that they were used to dealing with students and the everyday problems they present, such as panicking and losing logins. I came to realise that running the test was something I would do in partnership with the team rather than on my own. They explained how we could use an ‘assessment desktop’ on the University PCs to restrict students during the test to the University Blackboard system, blocking access to external sites and personal email. We agreed the best solution was to use the School computer cluster. As this has a capacity of 160, it meant two sittings of the test with a 30-minute changeover between the two student groups. With password protection on the tests and access restricted to computers in the test cluster, I was reassured that students outside the computer cluster could not gain access. We made a plan for the day itself, and I was assured that the eLearning team would be there to support me, right from clearing the cluster until the end of the tests. Knowing they would be on hand to deal with any Blackboard or IT gremlins helped me to make the final decision to go for electronic assessment and definitely reduced the anxiety associated with trying something new.

In briefing the students, I gave clear instructions (using Blackboard announcements) on which students should attend each session. I also gave an in-class briefing about what to bring to the test and what to expect at the test session. On the day before the test, I put plenty of signs up in the computer cluster to warn other students about the booking. I had scheduled the first session immediately following another class (with a one hour gap to allow preparation of the room and installation of the assessment desktop). This was to avoid the unpleasant task of having to ask students who were not taking the test to leave just as they had arrived and started their work.

On the Day of the Test

I arrived at the cluster one hour beforehand to find the eLearning team already clearing the room in order to install the assessment desktop. Later I learned that there had been some unforeseen problems in rolling out the assessment desktop, which meant some manual intervention was needed and it was all hands on deck to type instructions into 160 PCs. The atmosphere was calm and, although gremlins had indeed struck, they were very effectively zapped!

Students were admitted 20 minutes before the start of the test. They were directed to the available PCs and each student then logged into Blackboard and opened the test folder. Students then waited until the test start time, when the eLearning team displayed the passwords. When students entered the password, they could begin the test. Being presented with questions in a random order meant there was less opportunity for ‘overlooking’ other students, and the test atmosphere was calm and relaxed. When they were finished, students submitted their answers and were allowed to leave. Both test sessions started and finished on time.

After all students had finished the test and left, we cleared and checked the room. After a telephone call, IT Services removed the assessment desktop so the computers in the cluster were restored to normal use. We had occupied the cluster for four hours; I could hardly believe it was all over and everything was so calm. The key seemed to be working as an integrated team; we had different areas of responsibility and knew our roles. I was on hand with some teaching assistants to support the test but, in reality, the students were familiar with the Blackboard environment and all was very straightforward. We took a group photo, then released the computer cluster back to the School.

Immediately following the test, I worked with one of the eLearning team to check that all was in order with the scores recorded in Blackboard. I was impressed with the level of detail of the results (breakdown by student and by question). This would allow me to find if there was any unforeseen problem with a question or to check on a student’s individual results if needed. I did not find any problems, so we downloaded the scores into a spreadsheet and put the results on timed release for the students the next day. I was able to quickly identify the average mark and notify the administrator about one student who had not attended.

In previous years, I worked hard to get the results released to the students before Christmas. Last year, after considerable effort, the results were released 10 days after the test. This year, the results were available for release immediately. I allowed myself one day before releasing the marks to the students to resolve any unforeseen issues, but in fact there were none. This is definitely a smoother experience for the students; the test atmosphere was very calm, and I have not received a single query about the test results.

One week after the test, I am still astonished that it all worked so smoothly. At this stage in previous years, my office was full of test papers that needed shredding and I was struggling to get the marks together for release. This year we have saved some trees as well as some time, and students have had their results nine days earlier than previously. What's not to like?

The School of MACE has a strategy to improve quality and reduce staff workload, and this experience shows that electronic assessment has the potential to do both. It took a while for me to become confident enough to make the transition from the paper-based system. In reality, it was the support of the EPS eLearning team, as well as the Blackboard tools, that allowed this to happen. I look forward to working together in the future to improve other courses.

To anyone thinking of making a similar move, I would advise starting early and working with the eLearning team. With their support, you can discover what is possible, investigate alternatives and resolve any issues about moving to e-assessment.