Here are some of the highlights from this year's Google I/O keynote.
Android O
Android O is the next version of Google’s Android platform. Here are some of the new features that will become available on devices over the summer:
- Picture-in-picture support allows users to minimize a video and keep browsing other apps while the video and audio continue to play (a minimal sketch follows this list). This feature has been available on the iPad since iOS 9. Offering picture-in-picture support should help increase video engagement on Android.
- Autofill with Google makes it easier to set up a new device, log in and remember credit cards. It’s a complementary solution to Google’s Smart Lock feature, which makes it easy for users to remember passwords and log in automatically. The autofill API/SDK will be available for app developers to use in their apps (see the autofill sketch after this list).
- One of the challenges with developing for Android has been the slow adoption of the latest operating system versions by device manufacturers; it can take years before the majority of users are running the latest Android release. Google’s Project Treble attempts to improve this: “we’re re-architecting Android to make it easier, faster and less costly for manufacturers to update devices to a new version of Android.” If it succeeds, users may adopt the latest versions of the operating system relatively quickly, making it possible for you to offer the latest OS features sooner.
- Notification dots/badges have been available on iOS devices for quite some time and will now be available in Android O. Notification dots let you indicate some form of status on the app icon, for example that something is new within your app. Users can long-press an app icon to view a contextual menu and glance at the notifications (see the notification channel sketch after this list).
- Android Instant Apps functionality is now open to all developers. Instant apps let users run your app instantly without installing it, and can be launched from any URL, including search, social media, messaging and other deep links.
- Android Vitals is a project focused on improving battery life, boot times, graphics rendering and stability. The developer tools have also improved to give developers better insight into the performance of their apps.
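To give a feel for the picture-in-picture mode mentioned above, here is a minimal Kotlin sketch of a video Activity that shrinks into a PiP window when the user leaves the app. The Activity name, aspect ratio and trigger point are my own assumptions for illustration, and the manifest would also need android:supportsPictureInPicture="true" for this Activity.

```kotlin
import android.app.Activity
import android.app.PictureInPictureParams
import android.util.Rational

// Minimal sketch (assumptions noted above): a video Activity opting into
// picture-in-picture on Android O (API 26).
class VideoActivity : Activity() {

    // When the user leaves the app while a video is playing, shrink into a PiP
    // window so playback (video and audio) continues over other apps.
    override fun onUserLeaveHint() {
        super.onUserLeaveHint()
        val params = PictureInPictureParams.Builder()
            .setAspectRatio(Rational(16, 9)) // illustrative 16:9 video window
            .build()
        enterPictureInPictureMode(params)
    }

    override fun onPictureInPictureModeChanged(isInPictureInPictureMode: Boolean) {
        super.onPictureInPictureModeChanged(isInPictureInPictureMode)
        // Hide playback controls while minimized and restore them when the user returns.
    }
}
```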
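The autofill framework largely comes down to annotating your views so an autofill service knows what they contain. Below is a rough Kotlin sketch of a login screen; the layout, field names and login flow are illustrative assumptions, not a prescribed setup.

```kotlin
import android.app.Activity
import android.os.Bundle
import android.view.View
import android.view.autofill.AutofillManager
import android.widget.EditText
import android.widget.LinearLayout

// Rough sketch: a login screen marked up for the Android O autofill framework,
// so an autofill service (e.g. Autofill with Google) can offer to fill and save credentials.
class LoginActivity : Activity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Hints tell the framework what each field expects; the same can be
        // declared in XML with android:autofillHints.
        val username = EditText(this).apply {
            setAutofillHints(View.AUTOFILL_HINT_USERNAME)
        }
        val password = EditText(this).apply {
            setAutofillHints(View.AUTOFILL_HINT_PASSWORD)
        }

        setContentView(LinearLayout(this).apply {
            orientation = LinearLayout.VERTICAL
            addView(username)
            addView(password)
        })
    }

    // Hypothetical hook: call after a successful manual login so the autofill
    // service can offer to save the entered credentials.
    private fun onLoginSucceeded() {
        getSystemService(AutofillManager::class.java)?.commit()
    }
}
```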
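Notification dots hang off Android O’s new notification channels: whether a badge appears is controlled per channel. A small Kotlin sketch follows; the channel id, name and notification content are assumptions for illustration.

```kotlin
import android.app.Notification
import android.app.NotificationChannel
import android.app.NotificationManager
import android.content.Context

// Sketch: post a notification whose channel allows a notification dot (badge)
// on the launcher icon in Android O.
fun notifyWithBadge(context: Context) {
    val manager = context.getSystemService(NotificationManager::class.java)

    // Channels are new in Android O; badges are enabled per channel.
    val channel = NotificationChannel(
        "updates", "Updates", NotificationManager.IMPORTANCE_DEFAULT
    ).apply { setShowBadge(true) }
    manager.createNotificationChannel(channel)

    val notification = Notification.Builder(context, "updates")
        .setSmallIcon(android.R.drawable.ic_dialog_info)
        .setContentTitle("Something new in the app") // placeholder content
        .build()
    manager.notify(1, notification)
}
```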
Virtual Reality
Google announced the next version of its Daydream VR platform, codenamed Euphrates, along with a new standalone headset. The standalone headset is built specifically for VR and requires no cables, phone or PC; everything is built into the headset. The portability of the device, and the fact that the hardware and software are tailored specifically to deliver the best possible VR experience, make this a convenient and user-friendly VR headset. This may help increase the adoption rate of VR headsets and make the hardware more accessible to users. One of the killer features of the new device will be the ability to share what the headset user is seeing on a TV using Chromecast.
Augmented/Mixed Reality
Google has also been working on advancing AR capabilities. New mobile devices supporting Google’s augmented reality computing platform, Tango, are being released this year. Tango devices use sensors, GPS and machine learning to try to understand the space around them, much as we do. One of the examples given at the keynote was using a combination of Google Maps and computer vision to offer indoor navigation, which Google is calling the Visual Positioning Service (VPS).
AI/Machine Learning
AI and machine learning advancements were highlighted throughout all of Google’s announcements: product and platform improvements that support Google’s shift from a ‘mobile first’ to an ‘AI first’ strategy, creating services and interfaces that let humans interact with computers in more natural and immersive ways. Some of the highlights included:
- Voice recognition software continues to improve, and computers are getting much better at understanding speech. Products are evolving beyond keyboard and mouse input to use voice and computer vision, with Google Assistant and Google Home spearheading this evolution in interaction interfaces. Google Assistant will also be available for iPhone users, and an update to Google Home and Chromecast lets users see visual responses to their conversations with Google Home on their TV.
- Google also showcased some of its computer vision technology through Google Lens. The technology will be available through Google Assistant and Google Photos first. It uses machine learning to understand the world around you; some of the examples given were using your camera to identify flowers, recognising and understanding text in images, and translating that text if necessary. Soon you will be able to have a conversation with Google Assistant about what you and it see. The Google Assistant SDK is available for developers to use in their own devices.
- TensorFlow, Google’s open-source machine learning library, will soon be available to use in apps. A lightweight version called TensorFlow Lite will be made available to app developers, letting them build lean deep learning models that run entirely within their Android apps (a rough sketch follows this list).
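TensorFlow Lite had not shipped at the time of the keynote, so the details may change, but on-device inference is expected to look roughly like the Interpreter-style call below. The model file, input size and output shape are assumptions for illustration; a real model defines its own tensors.

```kotlin
import org.tensorflow.lite.Interpreter
import java.io.File

// Rough sketch (assumptions noted above): run a small on-device model with a
// TensorFlow Lite Interpreter and return its output scores.
fun classify(modelFile: File, pixels: FloatArray): FloatArray {
    val interpreter = Interpreter(modelFile)

    // Input: a single flattened image; output: scores for 10 illustrative classes.
    val input = arrayOf(pixels)
    val output = Array(1) { FloatArray(10) }

    interpreter.run(input, output)
    interpreter.close()
    return output[0]
}
```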
As tech giants like Google offer smarter and more advanced products and services based on AI and machine learning, users will expect the same level of interaction and personalisation from content publishers. For many narrow tasks, algorithms now outperform humans.
For more information about Google’s plans for 2017, check out the Google Developers channel on YouTube and the keynote presentation:
https://youtu.be/Y2VF8tmLFHw