
edX Usability Testing


For my HF750 Testing and Assessment course at Bentley, my team of six worked with a great sponsor, edX. We were tasked with examining the "first hour" that new users spend on the website, with particular attention to any sources of attrition.

I was fortunate to be helped by great teammates who were well versed in project management, quantitative analysis, business, and usability research. I gained so much knowledge throughout the term that I will carry forward in my goal of becoming a UX professional.

Team Management

Our team had six members and three major deliverables, so a pair of team members took on the role of "project managers" for each deliverable. C and D worked on the Test Plan. M and A worked on the Expert Review. Finally, K and I (Grace) worked on the Final Presentation.

I live very close to edX headquarters, so I volunteered to be the contact person for edX.

We first learned about severity scales, particularly the big four: Nielsen; Dumas and Redish; Rubin and Chisnell; and Wilson. Then we conducted individual Expert Reviews of the website based on Nielsen's heuristics and Nielsen's severity rating scale. We consolidated all six expert reviews into a spreadsheet, and each of us rated the severity of each problem on a scale from 0 (not a usability problem) to 4 (a catastrophic problem). From this, we produced a merged Expert Review, which we presented to edX.
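To give a sense of the consolidation step: the merged spreadsheet amounts to one row per problem with one severity column per reviewer, and averaging those columns gives a group severity for prioritizing findings. Here is a minimal Python sketch of that idea; the problems and ratings below are invented for illustration and are not our actual findings.

```python
from statistics import mean

# Hypothetical merged expert-review data: each problem maps to the
# severity ratings (0 = not a problem ... 4 = catastrophic) given by
# the six reviewers.
ratings = {
    "Search filters are hard to discover": [3, 2, 3, 3, 2, 3],
    "Registration form gives vague error messages": [4, 3, 4, 3, 4, 3],
    "Course pages load slowly": [1, 2, 1, 1, 2, 1],
}

# Rank problems by mean severity, highest first, to prioritize the report.
for problem, scores in sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{mean(scores):.1f}  {problem}")
```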

Following our expert reviews, we developed a test plan in which we outlined the usability test. My team member C, who had experience conducting usability tests, greatly contributed to the success of this component. Our test plan examined the general journey of a first-time user, with the expectation that we would examine the positives and negatives in the initial evaluation of the website. The test included six usability tasks, the System Usability Scale (SUS), and after-scenario questions (ASQ). When creating the test plan, we also included a moderator's guide and consent forms. My role was to work on selected components of the test plan, including the test procedure, the equipment needed, the sponsor's responsibilities, and the consent forms.

Usability Test

Each member of the team moderated two usability tests: one in-person session and one remote session. Each member then observed and took notes for two additional sessions. Most of our in-person sessions were held at the Bentley User Experience Center, which meant we could take advantage of the camera equipment, computers, and the observation room (with a two-way mirror and live recordings of the usability session).

Moderating: In my experience moderating sessions, I found the in-person session very natural and easy to complete. It was great to apply many of the techniques we learned about in class and in our readings. Remote sessions, however, require more preparation in advance and are subject to technical difficulties, which I fortunately did not experience during my moderated session. I would love the opportunity to conduct more tests and build on this experience. In the future I would also like to try synchronous and asynchronous tests via UserZoom or UserTesting. This summer I hope to familiarize myself with both programs.

Observing: In my role as observer, I entered notes into a note-taking grid that C created. This grid had room for general observations, post-task questions, post-test questions, and the SUS. One important thing I learned was the benefit of writing down the times at which certain "key" observations or participant quotes occurred. Because all of our sessions were video recorded, this gave us the opportunity to go back and review those parts of the test.
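As a rough illustration of what one row of such a grid might look like (the columns and the sample entry here are hypothetical, not C's actual template), a timestamped note can be stored as simply as this:

```python
import csv

# Hypothetical columns for a note-taking grid: one row per observation,
# with a timestamp so the moment can be found again in the video recording.
FIELDS = ["session", "task", "timestamp", "type", "note"]

rows = [
    {"session": "P03", "task": "Task 2", "timestamp": "00:14:32",
     "type": "quote", "note": "I expected the search bar to be at the top."},
]

with open("notes.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```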

Additionally, edX generously provided Amazon gift card honorariums for each of the twelve participants in our study. I was responsible for emailing out these gift cards to participants post-test.

After the usability sessions were complete, we reviewed the note-taking grids for all the sessions that we had observed or moderated (four for each of us). Then we entered our main findings in four areas: 1) whether the key problems identified in the Expert Review were validated or not, 2) new positive findings, 3) new negative findings, and 4) other recommendations for the ER. Following this, K and I, as project managers for the final presentation, assigned parts of the presentation in a way that we felt best suited our team's strengths and would form a cohesive whole. D completed the introduction; K completed the high-level overview of findings and the global themes identified; A completed Task 1 (First Impressions); C completed Task 2 (Searching and Browsing); M completed Task 3 (Registering); and I completed Tasks 4 and 5 and the summary.

The summary was a challenge for me to create. I looked up how to calculate a SUS score online and used a template that A provided to format the data. A also provided the ASQ charts. In the future, I would love to do a more rigorous statistical analysis of the information we collected; however, that was out of the scope of this class (and we were trying to consolidate our already very long presentation to fit within an hour).
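For anyone curious, the standard SUS scoring method is simple enough to sketch in a few lines: for each odd-numbered item take the response minus 1, for each even-numbered item take 5 minus the response, then sum the ten adjusted values and multiply by 2.5 to get a score out of 100. The Python below is a generic illustration of that published formula, not our course spreadsheet, and the sample responses are made up:

```python
def sus_score(responses):
    """Return the SUS score (0-100) for a list of ten 1-5 responses."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    # Index 0, 2, 4, ... are the odd-numbered (positively worded) items:
    # they contribute (response - 1). The even-numbered (negatively worded)
    # items contribute (5 - response).
    adjusted = [(r - 1) if i % 2 == 0 else (5 - r) for i, r in enumerate(responses)]
    return sum(adjusted) * 2.5

# Made-up example responses, not our participants' data:
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```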

It was great to work on the final presentation and the ER with the team. Throughout, we collaborated well and offered suggestions on each other's sections. We first presented in class, revised the presentation based on our professor's feedback, and then presented it to edX.

We will check in a few months from now to see whether any of our findings have influenced the website. We hope they improve the experience of edX users!

