DJ2 is a live streaming application that connects to Network Attached Storage via a desktop client and iOS and Android mobile apps. Designed for both scheduled and in-the-moment live broadcasts, it fits classrooms, homes, live events, or anywhere with a network connection.
DJ2 offers a clean, easy-to-use live streaming experience, with social network integration, a feedback system, stream replays, subscriptions, and a scheduling interface.
I was responsible for developing the user experience vision and collaborating with the product manager on the original software spec. This was accomplished through a lengthy literature review, user research, interaction design, and expert reviews. Deliverables included sketches, wireframes, and a number of lengthy shared Google Docs, one version of which can be viewed below (as a Word doc):
This was an entirely new product category for QNAP, so defining the initial problems to solve rested primarily with the Product Managers. As the initial feature spec was completed, lead engineers and UX were invited to help shape the document. It was an entirely Agile and collaborative process, with a bit of well-intentioned arguing, lots of writing, many meetings, and the eventual involvement of the user interface team.
The product teams were often distributed, with meetings held via video conference and Skype calls. When language was an issue, which it often was, I used a quick sketching workflow: I would sketch a possible solution, take a photo, and have it automatically shared via Dropbox to Skype. I prefer face-to-face meetings, but with Taiwan's excellent network infrastructure this kind of solution was a good alternative.
Below are some examples from the mobile version of DJ2, showing early flows, high-fidelity wireframes, and some UI work I did to bring the direction to life.
This direction was completed with input from the product managers and based on user research (competitive analysis and interviews). Its similarity to Meerkat, Periscope, and Ustream was intentional, though debated.
One of the key differences in direction I introduced was reducing the number of tasks users could perform on each screen. By focusing on the tasks users most want or need to perform, we reduced the complexity of the interface and, with it, their cognitive load.
A round of high-fidelity wireframes for the iOS version
Some UI samples I finished
I spent a great deal of time exploring the record button above. I didn't have time to test the variations with users, but settled on this shape and label because I felt it was the clearest and offered the best touch target.