Google's preview release of its open-source TensorFlow Lite software for machine learning developers signals a shift in the field of AI.
By enabling AI models to run directly on mobile devices, with no internet connection to the cloud, Google positions TensorFlow Lite as the groundwork for the Artificial Intelligence of Things (AIoT).
The software library aims to provide a more lightweight machine learning solution for smartphones and embedded devices. Available for both Android and iOS apps, TensorFlow Lite paves the way for developers to create AI-powered systems that aren't just capable and accessible, but also disposable.
Products like Google Assistant, Amazon's Alexa, Microsoft's Cortana and Apple's Siri are among the most popular mainstream uses of AI. With voice-recognition capabilities, they are interactive AIs.
With access to TensorFlow Lite, developers can focus on applying the existing capabilities of trained models to new data, rather than building models that must learn new capabilities from existing data. They can design AI features around existing hardware, without having to add new components for AI applications.
TensorFlow Lite can run on the Snapdragon processors inside many smartphones, making those devices AI-capable without a cloud connection.
The TensorFlow team said:
"We plan to prioritize future functional expansion based on the needs of our users. The goals for our continued development are to simplify the developer experience, and enable model deployment for a range of mobile and embedded devices."
TensorFlow Lite ships with models optimized and pre-trained for mobile: MobileNet, a vision model able to identify objects across 1,000 different classes; Inception v3, an image-recognition model; and Smart Reply, an on-device conversational model that provides one-touch replies to chat messages.
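To sketch what an app does with the output of an image classifier like MobileNet: the model emits one raw score per class, and the app converts those scores to probabilities and picks the top labels. A minimal illustration with made-up scores and hypothetical label names (real apps load the 1,000 labels from a file shipped with the model):

```python
import numpy as np

labels = ["cat", "dog", "toaster", "tulip"]   # hypothetical label file
scores = np.array([0.1, 2.3, 0.4, -1.0])      # made-up raw model outputs (logits)

# Softmax turns raw scores into probabilities that sum to 1
probs = np.exp(scores - scores.max())
probs /= probs.sum()

# Take the two highest-probability classes
top = probs.argsort()[::-1][:2]
print([(labels[i], round(float(probs[i]), 3)) for i in top])
```

Here "dog" wins with roughly 77% probability; the same top-k step is how a phone app decides which labels to show the user.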
Redesigned from scratch, TensorFlow Lite is lightweight, cross-platform, and optimized for mobile devices.
According to the TensorFlow Lite website, the software is designed to meet the following criteria:
- Enabling smart appliances, creating new possibilities for on-device intelligence.
- Stronger user data privacy paradigms where user data doesn't leave the mobile device.
- Ability to serve 'offline' use cases, where the device does not need to be connected to a network.
Supporting the Android Neural Networks API, TensorFlow Lite lets developers run AI models on a smartphone and, as new data arrives, run those models to produce new results. It is, in effect, machine learning on the go, without internet connectivity.
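The on-device workflow has two halves: a model is converted once into TensorFlow Lite's compact flatbuffer format, and the app then feeds it new data through the TF Lite interpreter, entirely offline. A minimal sketch, using a tiny stand-in Keras model rather than a real mobile model:

```python
import numpy as np
import tensorflow as tf

# A tiny stand-in model (a real app would use something like MobileNet)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(2),
])

# Step 1: convert the model to the TensorFlow Lite flatbuffer format
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Step 2: on the device, load the flatbuffer and run inference locally
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

x = np.random.rand(1, 8).astype(np.float32)   # "new data" arriving on-device
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)
```

Nothing after the conversion step requires a network: the flatbuffer and the interpreter are all the device needs, which is what makes the offline and privacy-preserving use cases above possible.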
Being capable and accessible, but at the same time disposable, TensorFlow Lite is paving the way for disposable voice-controlled chips and AI-powered appliances that don't expose data to hackers. The aim of the project is to create an ecosystem where AI can be implemented on any device, no matter how dumb or disposable it is.
Google engineer and TensorFlow technical lead Pete Warden told MIT: "What I want is a 50-cent chip that can do simple voice recognition and run for a year on a coin battery." TensorFlow Lite could turn that vision into reality.