Looking at the current wearables landscape, these devices often take one or more of the following roles:
Tracker: The wearable collects data about the wearer’s activity or physiological condition. This data can be used to monitor the user’s state as well as to encourage her to improve her fitness, movement, and other health factors. Jawbone UP is an example of such a wearable.
Messenger: The wearable device, often more readily accessible to the user than his smartphone, displays selected alerts and events from that device, such as a phone call, incoming message, or meeting reminder. The user can then decide whether to pick up the phone and act on it right away or respond later.
Note that most wearables acting as a “Messenger” today rely on a Bluetooth connection with the smartphone for their operation. Through this connection, they essentially mirror selected event notifications, so users are alerted more quickly. The Pebble watch functions as such a device. Another is the Moto 360, which displays a variety of timely alerts and notifications, as shown in Figure 4-11.
Figure 4-11. Examples of alerts and notifications on the Moto 360
Facilitator: The wearable facilitates certain communication, media, or other activities that are already available on the smartphone (or other devices) by offering a simpler, more convenient experience. For example, capturing a video with Google Glass is much easier than with a smartphone. Instead of having to turn on the screen, unlock the device, launch the camera app, switch to video mode, and then hold the device in front of the body to shoot, the user can trigger the Google Glass camera immediately (via a voice command or button click), right in the moment, and capture the video seamlessly while remaining an integral part of the experience (rather than holding a device that places a “wall” between her and the scene).
Enhancer: The wearable device augments the real world with information that is overlaid on the environment and its objects, potentially with the capability to interact with and digitally manipulate that information. The film Minority Report is a popular-culture reference for this type of AR technology. In the iconic scene shown in Figure 4-12, agent John Anderton (Tom Cruise) manipulates, in perfect orchestral fashion, a heads-up AR display interface at the Justice Department headquarters.
Figure 4-12. An image from the famous movie Minority Report, directed by Steven Spielberg (Paramount Home Entertainment)
AR-equipped wearables open up a wide array of use cases that these devices can enhance and optimize (from gaming, navigation, shopping, and traveling, to communication, education, medicine, search, and more). It is clearly a promising direction toward which the industry is advancing, one which could offer a more natural, unmediated, immersive interaction with the world. I expect that AR technologies along with their implementation in consumer products will significantly mature and grow over the next 5 to 10 years, leading to a tremendous change in the way we perform daily tasks, engage with content and media, and interact with the environment (and each other).
At the same time, it’s worth noting that currently very few devices can actually deliver such experiences (especially the interaction layer, on top of the information overlay), let alone for broad daily consumer use cases. Recall that at the beginning of this chapter I mentioned Oculus Rift and CastAR as two smartglass devices that integrate AR/VR technology focused on gaming. Another example is Meta’s Pro AR eyewear, one of the pioneering devices to introduce an interactive holographic UI. As of this writing, this wearable is on the cusp of becoming available, with an anticipated price tag of $3,650 (still far beyond mass-market reach).
If you look carefully at these different roles, you’ll notice that they greatly affect other UX factors in the design, further underscoring the interconnectedness of all these factors. For example, Facilitator and Enhancer both require the wearable to have a display, physical or projected, so that users can view the information and potentially interact with it as well (we talk more about the types of wearable displays in the next section). This also means that the device needs to be located where the user can easily reach it (within touch, sight, and, preferably, hearing). These requirements essentially restrict you to the upper front part of the body (from the hips up). The head, arms, hands, and neck, as well as pockets, are usually the most convenient locations.
A Tracker usually needs to be placed at a specific body location and/or have direct contact with a certain body part in order to reliably record the desired data. This narrows down the location options yet still leaves room for creativity. For instance, if the device needs to be in contact with the chest, you could design it as a chest band, a necklace, or even a smart t-shirt. The preferred route depends on the type of data you want to collect, the technology used, the target audience, and the use cases in focus.
DISCUSSION: FROM WEARABLE DEVICES TO SMART CLOTHING
Athos is one of the pioneers in smart workout gear (https://www.liveathos.com/). The company’s athletic clothing is equipped with sensors that measure muscle exertion from the chest, shoulders, arms, back, quads, hamstrings, and glutes, plus heart rate and breathing — all rolled up into a single piece of clothing, as illustrated in Figure 4-13.
Figure 4-13. Athos smart workout gear, equipped with sensors throughout
With its clothing, Athos demonstrates an important principle of wearable design: keep it simple and single. In other words, it’s better to design the wearable experience around a single unit rather than break it into multiple components, such as a set of trackers worn across the body or smartglasses that come with a separate touchpad for interaction.
Multiple components are much harder to manage on a day-to-day basis; they are more cumbersome to put on and remove, and people are more likely to lose pieces along the way. As a result, they can be a significant barrier to adoption and ongoing use. Athos cleverly packaged all of its sensors in a single piece of clothing, so that as a consumer, you need only put on a shirt (a daily, familiar behavior), and the rest happens seamlessly.
Looking ahead, touchscreen t-shirts, which allow interaction on the clothing itself (through a projected display), appear to be just a few years away (http://mashable.com/2013/02/15/armour39/).
As the wearable market and technology continue to develop, we will see the list of wearable roles grow richer, both in the functional and interaction possibilities within each role and in entirely new roles (probably tailored more closely to specific domains, market segments, and needs).
In any case, remember that these roles are not necessarily mutually exclusive. Some wearables do choose to focus on a specific role only; consider, for example, MEMI (see Figure 4-14), an iPhone-compatible smartbracelet that serves as a messenger wearable. The bracelet uses light vibrations to notify the user of important phone calls, text messages, and calendar alerts.
Figure 4-14. MEMI smartbracelet, designed as chic jewelry that looks nothing like a “technological device”
Others, however, integrate multiple roles within the same device. Samsung’s Galaxy Gear smartwatch is an example of a wearable that serves as a tracker, messenger, and facilitator, all in one device. It has a pedometer that tracks step data; it is linked with the smartphone and displays notifications on the watch screen; and it facilitates actions such as calling, scheduling, and taking a voice memo by speaking to the Gear device (which is immediately accessible).
Deciding which route to take with a wearable (single role or multifunctional) goes back to the design fundamentals: your users, use cases, and the broader ecosystem of devices. As with any UX design, there’s a trade-off between simplicity and functionality; the more features you add to the product, the more complex it is to use. Therefore, make certain that you focus on your users and try to determine what they really need by looking across the entire experience map (with all its touch points) before adding more features.
Also, wearables are very small devices; thus, they are very limited in terms of display and interaction (which we discuss in the upcoming sections). Additionally, you have other devices in the ecosystem that people are using (smartphones, tablets, and so on), and these can either work in conjunction with the wearable or take care of some of the functionality altogether, thereby relieving the wearable of the burden.
Remember the discussion about the need to consider gender in fashion preferences and the current masculine-dominated wearable industry?
The MEMI smartbracelet shown in Figure 4-14 is focused on changing exactly this status quo. It is designed and branded as “Wearable Technology Made by Women for Women.” On their Kickstarter page, the founders explain, “Our friends don’t want to wear big, black, bulky tech devices. In fact, they don’t wear ‘devices,’ they wear jewelry. So, we set out to create a bracelet that is both stylish and functional” (http://kck.st/1phgyxX). The MEMI design is definitely a refreshing change in the wearables landscape, and the project has already exceeded its funding goal of $100,000.