User Testing

Three user tests were carried out, one for each of the three fidelity stages. There are various ways in which a product can be evaluated; personally, I find heuristic evaluations to be very effective. However, as meeting deadlines was a factor, I chose to use a “think-aloud” method instead. The same technique was used for all three user tests, so I will explain the procedure for the medium fidelity testing only.

Medium Fidelity Evaluation

Yvonne Rogers states that a product must be checked repeatedly during the design process to see whether it continues to meet the needs of the user. These checks are known as formative evaluations. Evaluations carried out to assess the usability of the completely finished product are known as summative evaluations. However, due to the objectives and time constraints of this thesis, the product was not subjected to any summative evaluation; this sort of evaluation would take place at a later stage, before releasing the product to the public. Once a higher fidelity version of the mobile application was created, user testing was performed once again to obtain user feedback. The testing structure followed the framework presented in “Cooperative Evaluation: A run-time guide”. In this guide, Monk provides advice on how to recruit users, how to prepare tasks, and how to interact with and record participants, and he highlights the importance of debriefing. Some noteworthy tips include:

  • Recruit users who are similar to the target users.
  • Be specific with tasks and examine important features twice.
  • Keep the session light and informal. Assure the user that anything they might see as their own mistake is in fact a flaw in the design/system that they have uncovered, and is therefore valuable information.
  • Encourage the user to think aloud in order to give an insight into their thought process. Keep the user talking.
  • Reflect on the prototype with the user afterwards to confirm any issues or difficulties they had during the testing.

Usability Test

After the redesigns suggested by the low fidelity user testing were implemented in the medium fidelity digital wireframe, it was possible to host another user testing session to gather updated user feedback. The medium fidelity user testing was an opportunity to assess things that could not be assessed in the first user test, such as the use of colours, fonts and finer details. Furthermore, the product was at a more complete stage, allowing more complex tasks to be tested. The InVision digital wireframe provided an opportunity to test the full user experience and capability of the application as a whole. It was populated with dummy data to represent the real-time API data that would ordinarily be accessed online.
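For illustration only, the kind of per-POI record that this dummy data stood in for might be sketched as below. The names and values are hypothetical and simply mirror the content shown in the wireframe (recent pictures and recent posts about a point of interest); they are not the application’s actual data model or API schema.

  // Hypothetical sketch only: the shape of content each wireframe screen was seeded with.
  // Names and values are illustrative, not the application's real data model.
  data class PoiActivity(
      val poiName: String,               // the point of interest being followed
      val recentPhotoUrls: List<String>, // latest pictures uploaded from the POI
      val recentPosts: List<String>,     // what people are currently saying about the POI
      val lastUpdated: String            // timestamp of the most recent content
  )

  // An example of the kind of static dummy entry the wireframe screens displayed.
  val dummyPoi = PoiActivity(
      poiName = "Phoenix Park",
      recentPhotoUrls = listOf("placeholder_photo_1.jpg", "placeholder_photo_2.jpg"),
      recentPosts = listOf("Great atmosphere here this evening!"),
      lastUpdated = "2016-08-03T18:30:00Z"
  )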

Before the evaluation commenced, the InVision prototype was installed onto an Android phone: the application prototype was imported from its URL onto a Motorola Moto G3, and the battery was fully charged.

Place: Various on-location settings, dependent on participants’ availability.
Time: 3rd, 4th and 5th of August 2016
Equipment:

  • Motorola Moto G3
  • Notebook and pen for documenting the process
  • Camera to take pictures of the testing
  • Sound recorder to allow future reflection on the sessions

Adhering to Nielsen’s user testing theory that five test subjects are ideal, five participants (four male and one female) were tested. Four of the participants had recent experience using Android software, whereas the fifth was less familiar with it. However, all were frequent users of smartphones and social media platforms, and all were accustomed to using mobile applications to complete an objective. The participants were told that they would be using a mobile application titled “Right Here, Right Now”, designed to provide the user with real-time information about POIs of their choosing (including the most recent pictures uploaded from the POI and recent posts about what people are saying there). Each user test was performed individually and took approximately 25-30 minutes to complete the tasks, followed by a 5-minute debriefing. Participants were told of the importance of providing a running commentary so that their thought processes could be understood as they made decisions and interpreted what was presented.
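The five-user figure follows from the problem-discovery model popularised by Nielsen and Landauer, in which the number of usability problems found with n test users is modelled as

  Found(n) = N(1 − (1 − L)^n)

where N is the total number of usability problems in the design and L is the proportion of those problems found by a single user, commonly estimated at around 31%. With L = 0.31, five users uncover roughly 85% of the problems (1 − (1 − 0.31)^5 ≈ 0.84), which is why five is widely treated as a sensible cut-off for formative testing of this kind.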

A list of 17 tasks was created to assess the usability of the mobile application. The users completed all 17 tasks using the InVision digital wireframe version of the application. These tasks had three primary goals:

  • Test the user’s understanding of the user interface components of the application
  • Assess the user’s understanding of a POI’s “recent activity” provided by social media, i.e. their understanding of the chosen word association, the origin of the content, and their ability to relate the content to real-time events at a chosen POI.
  • Monitor the users’ experience.

Check out the development stage.