“QNAP OceanKTV is the first NAS-based karaoke application on the market allowing you to create a quality karaoke experience at home or at work. Store your favorite songs in the QNAP NAS, attach it to a big-screen TV via HDMI, and start singing with OceanKTV!” — QNAP
OceanKTV offers a full-featured karaoke singing experience, with versions for QNAP NAS, iPad, iPhone, Android, and TV. All of the supported platforms sync and work together in real time, share the same feature set, and offer a user experience tailored to each device. In developing the user experience vision for the product, we felt that understanding our customers' context of use was of utmost importance, as was their wide range of capabilities; the defined customer personas included people with little to no computer experience.
One of the main goals of this particular study was to find out what problems customers encountered when accomplishing specific tasks in their context of use. Earlier studies and preliminary user tests had surfaced a number of problems to investigate, but this was the first usability test conducted in the field. Post-test qualitative interviews provided an opportunity to gain a deeper understanding of the customers' problem space.
Though it was planned to be available and localized for various international markets, the software itself was tailored for the Greater China market as a means to drive sales of the hardware product. All user research and testing was conducted locally with Chinese speakers.
My primary responsibility was the user experience for the mobile platforms, iOS and Android. Prior to this test I performed heuristic analysis, literature reviews, and user research (through field visits and interviews), and was in constant communication with the product team using wireframes and prototypes of various fidelities. I also participated in creating a set of overarching guidelines for designing interfaces for TV and for handheld hardware remote controls. Perhaps the most interesting activity was gaining an understanding of the whole KTV experience by conducting field visits to KTV parlours throughout Taiwan, in order to understand first-hand the kind of experience we were trying to create (despite living in Taiwan for many years, my dislike of singing had kept me from participating in this common activity).
With the assumption that familiarity reduces learning time and may drive initial adoption, the initial user experience direction provided by product management closely resembled many of the commercial systems that users encounter in KTV rooms throughout Taiwan and China. A comprehensive competitive analysis was performed and initial prototypes were created, but the reference experiences were fraught with usability problems and didn't always translate well to smaller handheld screens.
We caught many of these problems internally, including through testing in a controlled environment, but had yet to involve our customers in later iterations. We were particularly concerned with customers' ability to navigate and find items within a large data set using a mobile device in a distracting environment.
I designed the test, was the chief facilitator, was responsible for analysis, and took care of the extensive preparations required for an on-site test such as this. During the test I had the invaluable aid of an event host and two other facilitators who were familiar with the test design. This allowed me to focus on observing, interviewing, and note-taking.
Approval was provided by the director of the design center and the head of product. Involving product management was key, as they would be required to read and possibly act upon our recommendations. Our test also had to fit within the project's timeline.
This usability test came at the tail end of a number of iterations (we used a modified Agile approach) on interfaces that allowed for interaction across a number of devices. At this stage we were using late-stage, high-resolution prototypes with most functions working as intended. From the start of test design through test preparation, recruitment of participants, running the test, and finally analysis, the test took about seven working days to complete.
Test design involved defining the problems we wanted to focus on, refining the purpose of the test, deciding which parts of the software would benefit most from usability testing, defining who the participants would be, creating the specific tasks to test, creating the post-test questionnaire and interview questions, and more.
Session planning and recruitment involved ensuring we had the time, location, and equipment we needed. It's easy to make mistakes here, so I created a large checklist in OmniFocus to ensure nothing was forgotten. To find participants who matched our personas, I relied upon my facilitators' networks.
Running the test required me to relinquish control to the moderators, who managed people and created a suitable environment, while I had the participants accomplish various tasks using the think-aloud protocol and took notes. During the interviews (“conversations with a purpose”), I asked a facilitator to take notes so I could give the participants my undivided attention. Throughout the test my goal was to talk as little as possible, simply guiding the participants or asking follow-up questions (I like to use the four whys technique).
The results were quantified using a four-point severity rating that we developed internally but which borrowed heavily from the examples of others.
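As an illustration of how such a rating feeds into a report, a severity tally might be computed like the sketch below. The scale labels, the `severity_summary` helper, and the sample findings are all hypothetical; the internal rubric is not reproduced here.

```python
from collections import Counter

# Hypothetical 4-point scale (0 = cosmetic .. 3 = blocker); labels are
# illustrative, not the rubric actually used in the study.
SEVERITY = {0: "cosmetic", 1: "minor", 2: "major", 3: "blocker"}

def severity_summary(findings):
    """Tally usability findings by severity level so the most serious
    issues can be prioritized in the final report."""
    counts = Counter(f["severity"] for f in findings)
    return {SEVERITY[level]: counts.get(level, 0) for level in sorted(SEVERITY)}

# Made-up findings for demonstration only.
findings = [
    {"issue": "search results mix languages", "severity": 3},
    {"issue": "tab labels truncated", "severity": 1},
    {"issue": "no visible path back to home screen", "severity": 2},
]
print(severity_summary(findings))
```

A tally like this makes it easy to argue which recommendations deserve attention first.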
OceanKTV was assessed using the think-aloud protocol, enabling us to examine the interface and identify possible usability pain points. From these we distilled a list of findings that impeded the software's usability and proposed recommendations to address the issues and improve the customer's overall experience using OceanKTV.
Some of these recommendations included:
- Improve the legibility of, and increase the user's confidence in, search results by providing stronger language filtering, removing the cumbersome tab interface, helping users understand where and what they are searching, and providing stronger search-result feedback through a status indicator in the search bar and multiple signifiers confirming that results are relevant.
- Redesign the layout of the start screen to support giant leaps in navigation and to better support customers' desire to browse first (then search) by adding more relevant information based on their interests, and improve the return-to-home capability by ensuring there is always a visible method of getting back to that screen.
What I learned
Competitive analysis was often misused in Taiwan. Instead of using the method to highlight the strengths and weaknesses of competing products in order to make more informed decisions about product strategy, it was often an excuse to find apps or UI to copy, without a deep understanding of what led to those design decisions. While user research was a common activity, in-depth user interviews often took a back seat to a copy-machine approach. This approach introduced problems that appeared in testing and could perhaps have been addressed earlier in the process. I should have pushed back harder against being the product team's human copy machine.
Running on-site usability tests can be fraught with problems. This test would have been cancelled due to technical difficulties were it not for the resourcefulness of my helpful host, who found another room to host our test. Being prepared for all possible contingencies is key.
The following documents are available upon request (some are in Chinese): Test Plan, Moderator Script, Scenarios (participant copy), System Usability Scale (SUS), and the Final Usability Report.
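For context, the System Usability Scale mentioned above is scored with a standard formula: odd-numbered (positively worded) items contribute (response − 1), even-numbered (negatively worded) items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal sketch (the sample responses are made up, not data from this study):

```python
def sus_score(responses):
    """Score one SUS questionnaire: a list of ten responses,
    each from 1 (strongly disagree) to 5 (strongly agree)."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

# Hypothetical participant: 4s on odd items, 2s on even items.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```

Note that a SUS score is not a percentage; 68 is commonly cited as the average benchmark.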
Ocean KTV can be downloaded from the App Store.
I was a Senior Engineer at QNAP and partnered with various project teams to lead the user experience vision for mobile and desktop software.