
Microsoft

 


The Surface Hub is an interactive whiteboard developed and marketed by Microsoft as part of the Microsoft Surface family. It is a wall-mounted or roller-stand-mounted device with a multi-touch, multi-pen touchscreen that runs the Windows operating system. These devices are targeted at businesses for collaboration and videoconferencing.

MY ROLE

  • Design strategist

  • Lead interaction designer

  • Visual designer

CHALLENGES

  • Microsoft acquired Jeff Han's Perceptive Pixel, which had previously developed large-screen multi-touch displays such as the CNN Magic Wall. Microsoft intended to mass-produce the devices with Windows as the operating system, which inherently broke all of the existing UI experiences on the device.

  • Selling a design-led culture into an engineering-centric environment.

IMPACT

  • My responsibility as the lead UI designer was to redesign the existing UI on the device to seamlessly work within the Windows ecosystem.

  • I partnered with the founder of Perceptive Pixel and his engineering team to design a new set of pen and touch UI experiences that resulted in UI patterns that still run on the device today.

Above, I am presenting to the founder and inventor of Perceptive Pixel, Jeff Han (on the right).


 

Below are a few examples of the UI experiences I created for the project, code-named LSX (Large Screen Experiences).

When I kicked off the project, I needed to set the context for what it meant to design for public computing experiences. I coined the ‘spectator’ and ‘performer’ vocabulary that our team adopted when speaking about our personas.


Above are the three categories of public computing paradigms I created. We used these categories as a foundation when designing our UI solutions.


 

Inspiring through storytelling. I hired a storyboard artist to illustrate our environments and key scenarios, inspiring the team with our UX vision for this new computing paradigm.


Walk up and use. The critical ‘walk up and use’ use case is defined as the set of UI affordances and experiences that enable users to understand and interact with the device successfully from their very first encounter. The majority of our users were first-time users.

High-level task flow for the ‘walk up and use’ use case.


A sample of the many UI sketches for the ‘walk up and use’ core use case.


Early prototype movie I created. We would view these on the actual device during our design reviews.

 

Lead designer for the hero application, Whiteboard.

Above is one of the early prototypes I designed for the whiteboard app experience. Early on, we considered duplicating the UI to enable seamless, real-time collaboration. After prototyping and testing, we found this was overkill: usually one person was writing while the other observed.


This is the final UI, which is still in use today.