Using Research and Context for Enhancement

Research is essential to the success of any business, and wearable technology is changing how we conduct it. Lightweight wearable devices let researchers see things from the user's perspective better than ever before. Tobii, SensoMotoric Instruments, and Applied Science Laboratories have all created eye tracking glasses for research. Using these glasses, researchers get an inside look at user experiences such as shopping, driving, gaming, viewing advertisements, and playing sports. They can then use this data to build contextual experiences that enhance everyday activities and enterprises.

Smart Shopping

According to the Food Marketing Institute, the average grocery store carries 43,844 items, making sheer assortment just one of the countless factors that affect shoppers' decisions. By shopping through the consumer's perspective, researchers can observe which products catch customers' eyes, how long it takes to make a decision, and whether the consumer had already decided on an item before walking down the aisle. Tobii glasses can answer these questions by showing whether the user picks up a box without hesitation or scans the selection before picking one up for examination. Does the shopper spend more time comparing nutrition facts or prices? By seeing the shopping experience through the eyes of the customer, researchers learn what matters most to someone making a purchase.

Researchers can use this information to further develop the technology. With the integration of features similar to those of Google Glass, the glasses could learn a customer's shopping habits to provide a contextual experience. For health-conscious shoppers, the glasses could display nutrition information as the user pauses in an aisle glancing over the products. For price-conscious shoppers, they could display a price comparison of similar items or show the available coupons and specials. A running total of the items placed in the cart could be displayed on the glasses so customers know what to expect at the register.
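The running-total idea above is simple enough to sketch. This is a minimal, hypothetical illustration (the item names and prices are invented, and no real glasses API is shown): each scanned item updates a total the display could refresh.

```python
# Hypothetical sketch: a running cart total the glasses could display.
# Item names and prices are invented for illustration.

def cart_total(items):
    """Sum the prices of (name, price) pairs scanned into the cart."""
    return round(sum(price for _, price in items), 2)

cart = []
for item in [("cereal", 3.49), ("milk", 2.99), ("coffee", 7.25)]:
    cart.append(item)
    # The glasses would refresh the displayed total after each scan.
    print(f"Added {item[0]:<8} running total: ${cart_total(cart):.2f}")
```

In a real integration the price lookup would come from the store's catalog rather than a hard-coded list.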

Crash Course in Drivers’ Usability

Car manufacturers can use eye tracking glasses to build features into cars that better satisfy their customers. When drivers get into a new car for the first time, can they easily figure out how to control the windshield wipers, headlights, and temperature? Do proximity sensors that warn of a potential collision actually improve drivers' reaction times?

Eye tracking technology can do more than increase an automotive manufacturer's sales; it can also help reduce accidents by pinpointing hazardous roads, signage, speeding zones, and distractions. With eye tracking glasses, researchers can tell where a driver's eyes are focused and observe the factors that contribute to accidents. In 2012, there were 33,561 motor vehicle fatalities in the United States. The data collected could be used to educate drivers and improve driving conditions, reducing that number.

The driving experience could also be significantly improved by integrating contextual experiences. As a driver approaches a traffic light, the glasses could flash a yellow or red alert to slow down at the intersection. Instead of looking down at an in-dash navigation unit, the driver could see an arrow displayed at each turning point, along with the mileage to the next turn and the ETA.

Improved Quality Analysis

Before we begin developing a mobile app at ChaiOne, dedicated user experience researchers work to find the best mobile solution for the intended users. Our researchers already spend a great deal of time observing and questioning users, and eye tracking glasses would make that work even more effective. Once an app is developed, our Quality Analysts test it countless times across all devices to ensure the product is flawless and easy to use. We, along with the customers who ask us to build mobile apps, can benefit from eye tracking glasses by seeing users' interactions with our apps from their perspective. The same concept applies to quality analysis across all industries to improve products and services for users.

Enhanced Enterprises

Wearable technology, such as Vuzix interactive glasses, helps enterprises increase efficiency and minimize error. The lightweight glasses can easily be worn by employees wherever they go, truly mobilizing enterprises so that employees no longer feel tied down to laptops or handheld devices.

In the oil and gas industry, the glasses can display efficiency stats and error warnings for all equipment on the rig. When a technician arrives at the rig, the glasses' display can direct him straight to the malfunctioning part of the equipment so he can quickly pinpoint and fix the issue. He can even arrive with the right parts and tools for the repair, because an employee on the rig forwarded the error message from the glasses ahead of time.
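The dispatch step described above amounts to mapping a forwarded error code to a parts list before the technician leaves. Here is a minimal, hypothetical sketch; the error codes, part names, and the lookup itself are invented for illustration, not a real rig system.

```python
# Hypothetical sketch: turn a forwarded error code into a dispatch note
# so the technician arrives with the right parts. All codes and parts
# below are invented for illustration.

PARTS_FOR_ERROR = {
    "PUMP_PRESSURE_LOW": ["seal kit", "pressure gauge"],
    "VALVE_STUCK": ["actuator", "lubricant"],
}

def prepare_dispatch(error_code):
    """Return the parts to bring for a forwarded error code."""
    parts = PARTS_FOR_ERROR.get(error_code)
    if parts is None:
        # Unknown code: the technician diagnoses on site instead.
        return {"error": error_code, "parts": [], "note": "diagnose on site"}
    return {"error": error_code, "parts": parts, "note": "bring listed parts"}

print(prepare_dispatch("PUMP_PRESSURE_LOW"))
```

A production version would pull the mapping from the equipment vendor's maintenance database rather than a hard-coded table.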


Context and wearable devices are the future of technology. Eye tracking glasses provide insight into the user experience, showing exactly how context can improve devices and everyday activities. By learning users' patterns, predictive analytics can build contextual experiences unique to each user. ChaiOne has created a platform, ContextHub, to make it easier for developers to create contextual experiences of their own. ContextHub lets you better understand users with one line of code and ties the Internet of Things, including wearable technology, together. Although ContextHub is currently available only as a private beta, you can sign up for early access here.


How can you see wearable technology enhancing your enterprise?

Let us know your thoughts and tweet @chaione.


Google I/O Introduces Contextual Awareness

On June 24th, Google hosted its annual developer conference, Google I/O, at the Moscone Center in San Francisco. The event kicked off with Sundar Pichai, Senior Vice President of Android, Chrome & Apps, who shared that Android users cumulatively check their phones over 100 billion times per day. While Google is pleased that its 1 billion Android users are actively using their smartphones, Android developers want to improve the user experience through contextual awareness and ultimately decrease the number of times a user must physically check their phone.

Contextual Awareness

Users increasingly own multiple devices, so Google is working to create a seamless, connected experience across them. Whether you are using a device at home for entertainment or at work for productivity, contextual awareness enhances the experience by bringing users the information they need when they need it. As Google integrates all of a user's devices, it is keeping mobile first, configuring watches, cars, televisions, and more to center around the phone.

Android Wear

Each user receives an average of 150 notifications a day. With wearables, these interactive notifications and other relevant information appear right on your watch. In a world of multiple devices, wearables act as a key: using context, tasks as simple as entering a password to unlock your phone can be eliminated. Over Bluetooth, the phone detects a paired smart watch nearby and skips the password, unlocking automatically. Although Apple has already built fingerprint passwords into its devices, Android's personal unlocking skips even the wait for the phone to read and accept your fingerprint.
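The unlock decision above comes down to a simple check: is a trusted wearable actually nearby? This is a hypothetical sketch of that logic only, not Android's implementation; the device IDs and the signal-strength threshold are illustrative assumptions.

```python
# Hypothetical sketch of a "personal unlocking" decision: skip the
# password if a paired, trusted wearable is within Bluetooth range.
# Device IDs and the RSSI threshold are invented for illustration.

TRUSTED_WEARABLES = {"watch-a1b2"}
RSSI_UNLOCK_THRESHOLD = -60  # signal strength in dBm; closer = higher

def should_skip_password(nearby_devices):
    """nearby_devices maps device_id -> last RSSI reading."""
    return any(
        device_id in TRUSTED_WEARABLES and rssi >= RSSI_UNLOCK_THRESHOLD
        for device_id, rssi in nearby_devices.items()
    )

print(should_skip_password({"watch-a1b2": -45}))  # trusted and close
print(should_skip_password({"watch-a1b2": -80}))  # trusted but too far
print(should_skip_password({"headset-x9": -40}))  # nearby but untrusted
```

Gating on signal strength, not just pairing, matters: a watch left across the house should not leave the phone unlocked.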

Developers will now be able to write code that runs directly on the wearable itself, not just bridge notifications from mobile. Apps install on both the wearable and the phone simultaneously. David Singleton, Android's Director of Engineering, said, "The best wearable apps respond to the users' context, put glanceable cards in the stream and allow the user to take direct action in just a few seconds." Even Pinterest can now enhance your devices using context: you will receive an alert whenever you are near a restaurant, store, or any location pinned by someone you follow. Eat24 lets you order food in less than 20 seconds, and the app learns your eating habits to enable easy ordering. If you tend to order Jimmy John's for lunch on Wednesdays and Domino's for dinner on Fridays, you will receive a notification at your usual time asking if you would like to place your normal order. Complete it in just three taps: one on the notification, another to confirm, and the last to pay.

Android Auto

Users check their phones an average of 125 times per day, and in the US, 25% of accidents involve using a device while driving. By integrating context, devices can predict behavior, know what matters to users, and surface exactly what is most important at any given moment, helping drivers keep their eyes on the road and off their devices. Android Auto makes it easier and safer to use the phone features you want while driving by putting them right in the dashboard. Normal car controls and voice control make handheld devices unnecessary.

Contextual awareness brings drivers the information they want. Google Maps is easier than ever with all of its features available right in the dashboard, and voice control eliminates any need to take your hands off the wheel. Voice-controlled texting aims to stop users from looking down at their phones while driving. With the Android Auto SDK, developers can build their apps into users' driving experiences. Over 40 automotive manufacturers have joined the Open Automotive Alliance, and by the end of this year you will be able to purchase a new car that supports the Android Auto driving experience.


ChaiOne's platform, ContextHub, can further enhance contextual experiences by connecting the entire Internet of Things. With ContextHub, developers can integrate context into their apps with the addition of just one line of code, and the potential contextual experiences are endless. For example, a developer could use data from your calendar so ContextHub knows your schedule and plans accordingly. The need to manually set an alarm clock could be eliminated, since your calendar knows when you need to wake up and when to start brewing your coffee or tea. When you get in your car, directions to your destination and current driving conditions could be displayed without your having to request them. Once you leave the house, ContextHub could tell your security system to arm, your lights to turn off, and your thermostat to adjust. As you approach your house on the way home, geofencing technology could instruct the thermostat to set a comfortable temperature; when you arrive, your garage door could open and the security system could disarm using sensors in communication with your phone. Having studied your behavior patterns to learn your interests, your phone could alert you when your favorite show is on and automatically turn on your smart TV. Integrating apps with ContextHub brings endless possibilities to technology.
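The arriving-home scenario above hinges on geofencing: firing actions once the phone's location enters a radius around the house. Here is a minimal sketch of that mechanism using the haversine distance formula; ContextHub's real API is not shown, and the coordinates, radius, and action names are illustrative assumptions.

```python
# Hypothetical sketch of a geofence check: trigger "arriving home"
# actions once a location update falls within a radius of the house.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (haversine formula)."""
    r = 6_371_000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

HOME = (29.7604, -95.3698)  # illustrative coordinates (downtown Houston)
RADIUS_M = 200              # illustrative geofence radius

def on_location_update(lat, lon):
    """Return the smart-home actions to fire for this location fix."""
    if distance_m(lat, lon, *HOME) <= RADIUS_M:
        # In a real integration these would be device API calls.
        return ["set_thermostat_comfortable", "open_garage", "disarm_security"]
    return []

print(on_location_update(29.7605, -95.3699))  # inside the geofence
print(on_location_update(29.8000, -95.3698))  # well outside -> no actions
```

Production geofencing would also debounce repeated fixes near the boundary so the actions fire once per arrival, not on every GPS update.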


Wearables, television, and automotive are just the start of building contextual experiences. Any device or sensor with Bluetooth or Internet connectivity can be used to enhance user experiences. Although smart homes have been the biggest area for context so far, industries such as automotive, retail, oil & gas, healthcare, and entertainment have also been working to integrate it. As technology keeps trending toward context, developers will continue to look for ways to simplify daily activities and beyond.