Google I/O Introduces Contextual Awareness


On June 25th, Google hosted its annual developer conference, Google I/O, at the Moscone Center in San Francisco. The event kicked off with Sundar Pichai, Google's Senior Vice President of Android, Chrome & Apps, who shared that Android users collectively check their phones over 100 billion times per day. While Google is pleased that its 1 billion Android users are actively using their smartphones, the company wants to improve the user experience through contextual awareness and ultimately decrease the number of times a user must physically check their phone.

Contextual Awareness

Users increasingly rely on multiple devices, so Google is working to create a seamless, connected experience across them. Whether you are using a device at home for entertainment or at work for productivity, contextual awareness enhances the experience by bringing you the information you need when you need it. As Google works to integrate all of a user's devices, it is keeping mobile at the center, with watches, cars, televisions, and other devices designed around the phone.

Android Wear

Each user receives an average of 150 notifications a day. With wearables, these interactive notifications, along with other relevant information, appear right on your watch. In a world of multiple devices, wearables act as a key. Using context, a task as simple as entering a password to unlock your phone can be eliminated: if you are wearing a paired smartwatch, the Bluetooth connection tells your phone to skip the password and unlock automatically. Although Apple has already built fingerprint unlocking into its devices, Android's personal unlocking skips even the wait for your phone to read and accept a fingerprint.
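
Android handles this personal unlocking at the system level, but the underlying idea is simple to sketch. Below is a minimal, hypothetical Java illustration of proximity-based unlocking inside a single app, built on the platform's standard Bluetooth connection broadcasts; the device name is a placeholder, and this is not how the system feature itself is implemented.

    import android.bluetooth.BluetoothDevice;
    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;
    import android.content.IntentFilter;

    // Toy sketch: treat a paired watch's Bluetooth connection as a trusted
    // signal and relax an in-app password prompt while it is connected.
    // Requires the BLUETOOTH permission in the manifest.
    public class TrustedWatchReceiver extends BroadcastReceiver {
        private static final String TRUSTED_NAME = "My Watch"; // placeholder name

        private static boolean watchConnected = false;

        public static boolean isWatchConnected() {
            return watchConnected; // callers may skip the password prompt when true
        }

        public static void register(Context context, TrustedWatchReceiver receiver) {
            IntentFilter filter = new IntentFilter();
            filter.addAction(BluetoothDevice.ACTION_ACL_CONNECTED);
            filter.addAction(BluetoothDevice.ACTION_ACL_DISCONNECTED);
            context.registerReceiver(receiver, filter);
        }

        @Override
        public void onReceive(Context context, Intent intent) {
            BluetoothDevice device =
                    intent.getParcelableExtra(BluetoothDevice.EXTRA_DEVICE);
            if (device == null || !TRUSTED_NAME.equals(device.getName())) {
                return; // ignore devices other than the trusted watch
            }
            if (BluetoothDevice.ACTION_ACL_CONNECTED.equals(intent.getAction())) {
                watchConnected = true;   // watch in range: unlock automatically
            } else {
                watchConnected = false;  // watch gone: require the password again
            }
        }
    }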

Developers can now write code that runs directly on the wearable itself, going beyond notifications bridged from the phone. When an app is installed on the phone, its wearable component is installed automatically at the same time. David Singleton, Android's Director of Engineering, said, "The best wearable apps respond to the users' context, put glanceable cards in the stream and allow the user to take direct action in just a few seconds." Pinterest, for example, will use context to alert you whenever you are near a restaurant, store, or any location pinned by someone you follow. Eat24 lets you order food in less than 20 seconds, and the app learns your eating habits to make ordering even easier. If you tend to order Jimmy John's for lunch on Wednesdays and Domino's for dinner on Fridays, you will receive a notification at your usual time asking if you would like to place your normal order. You can complete the order in just three taps: one on the notification, another to confirm, and a last tap to pay.
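
For the bridged notifications Singleton describes, the Android support library's NotificationCompat.WearableExtender lets a phone app attach wearable behavior, such as a one-tap action, to an ordinary notification. The sketch below shows the general shape; the reorder service, icon, and strings are placeholders, not Eat24's actual code.

    import android.app.PendingIntent;
    import android.content.Context;
    import android.content.Intent;
    import android.support.v4.app.NotificationCompat;
    import android.support.v4.app.NotificationManagerCompat;

    public class OrderNotifier {
        // Posts a notification that Android Wear bridges to the watch as a
        // glanceable card with a one-tap "Place usual order" action.
        public static void notifyUsualOrder(Context context) {
            // Hypothetical service that places the user's usual order.
            Intent reorder = new Intent(context, ReorderService.class);
            PendingIntent reorderPending = PendingIntent.getService(
                    context, 0, reorder, PendingIntent.FLAG_UPDATE_CURRENT);

            NotificationCompat.Action action = new NotificationCompat.Action.Builder(
                    R.drawable.ic_reorder, "Place usual order", reorderPending) // placeholder icon
                    .build();

            NotificationCompat.Builder builder = new NotificationCompat.Builder(context)
                    .setSmallIcon(R.drawable.ic_launcher)
                    .setContentTitle("Lunch time")
                    .setContentText("Order your usual from Jimmy John's?")
                    // Wearable-specific behavior rides along with the notification.
                    .extend(new NotificationCompat.WearableExtender().addAction(action));

            NotificationManagerCompat.from(context).notify(1, builder.build());
        }
    }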

Android Auto

Users check their phones an average of 125 times per day, and in the US, 25% of accidents involve device use while driving. By integrating context, devices can predict behavior, learn what matters to users, and surface exactly what is most important at any given moment, helping drivers keep their eyes on the road and off their screens. Android Auto makes it easier and safer to use the phone features you want while driving by putting them right in the dashboard. Standard car controls and voice control make reaching for a handheld device unnecessary.

Contextual awareness brings drivers the information they want. Google Maps is easier than ever, with all of its features available right in the dashboard and voice control eliminating any need to take your hands off the wheel. Voice-controlled texting aims to stop users from looking down at their phones while driving. With the Android Auto SDK, developers can build their apps into the driving experience. Over 40 automotive partners have joined the Open Automotive Alliance, and by the end of this year you will be able to purchase a new car that supports Android Auto.
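
The Auto SDK itself was not yet public at the time of the keynote, so the sketch below instead uses the NotificationCompat.CarExtender API that later shipped in the Android support library for exactly this voice-reply messaging scenario; the sender, message, icon, and broadcast actions are placeholders.

    import android.app.PendingIntent;
    import android.content.Context;
    import android.content.Intent;
    import android.support.v4.app.NotificationCompat;
    import android.support.v4.app.NotificationManagerCompat;
    import android.support.v4.app.RemoteInput;

    public class CarMessageNotifier {
        // Marking the conversation read and capturing the spoken reply are
        // handled by broadcast receivers elsewhere in the app (omitted here).
        public static void notifyIncomingMessage(Context context) {
            PendingIntent readIntent = PendingIntent.getBroadcast(context, 0,
                    new Intent("com.example.ACTION_MESSAGE_READ"),   // placeholder action
                    PendingIntent.FLAG_UPDATE_CURRENT);
            PendingIntent replyIntent = PendingIntent.getBroadcast(context, 1,
                    new Intent("com.example.ACTION_MESSAGE_REPLY"),  // placeholder action
                    PendingIntent.FLAG_UPDATE_CURRENT);

            // Lets the car read the message aloud and take a spoken reply.
            RemoteInput voiceReply = new RemoteInput.Builder("extra_voice_reply")
                    .setLabel("Reply by voice")
                    .build();

            NotificationCompat.CarExtender.UnreadConversation conversation =
                    new NotificationCompat.CarExtender.UnreadConversation.Builder("Alice")
                            .addMessage("Are you on your way?")
                            .setReadPendingIntent(readIntent)
                            .setReplyAction(replyIntent, voiceReply)
                            .setLatestTimestamp(System.currentTimeMillis())
                            .build();

            NotificationCompat.Builder builder = new NotificationCompat.Builder(context)
                    .setSmallIcon(R.drawable.ic_message)  // placeholder icon
                    .setContentTitle("Alice")
                    .setContentText("Are you on your way?")
                    .extend(new NotificationCompat.CarExtender()
                            .setUnreadConversation(conversation));

            NotificationManagerCompat.from(context).notify(2, builder.build());
        }
    }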

ContextHub

ChaiOne's platform, ContextHub, further enhances contextual experiences by connecting the entire Internet of Things. With ContextHub, developers can integrate context into their apps by adding just one line of code, and the potential to create contextual experiences is endless. For example, a developer could use data from your calendar so that an app knows your schedule and plans accordingly: the need to manually set an alarm every day disappears when your calendar knows when you need to wake up and when to start brewing your coffee or tea.

When you get in your car, directions to your destination and current driving conditions could be displayed without your having to ask. Once you leave the house, ContextHub could arm your security system, turn off the lights, and adjust the thermostat. As you approach your house on the way home, geofencing could instruct the thermostat to set a comfortable temperature, and when you arrive, your garage door could open and the security system could disarm through sensors communicating with your phone. Having studied your behavior to learn your interests, your phone could even alert you when your favorite show is on and automatically turn on your smart TV. Integrating apps with ContextHub brings endless possibilities.
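
ContextHub's actual one-line integration is not reproduced here, but the geofence trigger described above, warming the house as you approach, is easy to picture with Google Play services' own geofencing API. A minimal sketch, assuming placeholder coordinates and a transition receiver defined elsewhere:

    import com.google.android.gms.location.Geofence;
    import com.google.android.gms.location.GeofencingRequest;

    public class HomeGeofence {
        // Builds a geofence around the house; when the phone crosses it, a
        // PendingIntent fires and the app could, say, adjust the thermostat.
        public static GeofencingRequest buildRequest() {
            Geofence homeFence = new Geofence.Builder()
                    .setRequestId("home")                       // placeholder id
                    .setCircularRegion(29.7604, -95.3698, 150)  // placeholder lat, lng, radius (m)
                    .setExpirationDuration(Geofence.NEVER_EXPIRE)
                    .setTransitionTypes(Geofence.GEOFENCE_TRANSITION_ENTER
                            | Geofence.GEOFENCE_TRANSITION_EXIT)
                    .build();

            return new GeofencingRequest.Builder()
                    .setInitialTrigger(GeofencingRequest.INITIAL_TRIGGER_ENTER)
                    .addGeofence(homeFence)
                    .build();
        }

        // Registering the request takes a connected GoogleApiClient and a
        // PendingIntent for the transition events, e.g.:
        // LocationServices.GeofencingApi.addGeofences(apiClient, buildRequest(), pendingIntent);
    }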

Conclusion

Wearables, television, and automotive are just the start of building contextual experiences. Any device or sensor with Bluetooth or Internet connectivity can be used to enhance the user experience. Although smart homes have seen the most use of context so far, industries such as automotive, retail, oil & gas, healthcare, and entertainment are also working to integrate it. As technology keeps moving toward context, developers will continue to look for ways to simplify daily activities and beyond.