Facebook and its properties aren't the only ones interested in Snapchat's features. Google also has an eye on them, and this time it's about augmented reality (AR).
Google has borrowed a Snapchat-style AR feature for the Android version of its Motion Stills app. The feature brings AR objects to animated GIFs and videos.
Users can add objects like chickens, dinosaurs, and others to any horizontal surface using their smartphone camera. The difference is that Google makes the result look a bit more natural, using motion-tracking technology to avoid the visual anomalies that are common in AR.
Motion Stills version 2.0 allows users to insert animated 3D objects into the real world before recording.
Starting in December 2017, Google Pixel 2 owners running Android 8.1 Oreo gained access to Google's AR Stickers. Built on Google's ARCore technology, the feature lets Pixel 2 users insert Star Wars and Stranger Things characters, among other animated characters, directly into their surroundings.
But with Google's Motion Stills app, any Android phone with a built-in gyroscope can offer a similar feature, no ARCore required.
And considering that the minimum supported Android version is 5.1 (released in 2015), Google is making sure Motion Stills can reach many potential users who are interested in AR.
Motion Stills is less advanced here. To achieve similar abilities, it uses a device's existing accelerometer and gyroscope as reference points to make a virtual object "stick" to a flat plane (like a table or a hand), tracking the device's position so the object is shown accurately.
It then separates out the camera's rotation so it can estimate the camera's 3D translation in real time.
After that, it combines this estimate with the gyroscope data to give a truly perspective-correct view of the virtual object, so even if the user pans away and comes back, no recalibration is needed. Relative to the initial rendering on the ground plane, the app tracks the camera's six degrees of freedom (three for translation and three for rotation), which allows it to accurately transform and render the virtual object within the scene.
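Google hasn't published the exact math behind Motion Stills, but the idea described above, a gyro-derived rotation plus an estimated translation forming a six-degree-of-freedom camera pose that reprojects the anchored object every frame, can be sketched with a standard pinhole camera model. Everything below (function names, focal length, image center) is illustrative, not Google's implementation:

```python
import numpy as np

def rotation_y(theta_rad):
    """World-to-camera rotation for a camera panned by theta about the
    vertical axis (the kind of motion a gyroscope reports)."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    return np.array([[ c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0,   c]])

def project(point_world, R, t, f=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of a world point for a camera with pose (R, t):
    R is the rotation (3 degrees of freedom) and t the estimated 3D
    translation (the other 3), matching the 6DoF tracking described above."""
    p_cam = R @ (point_world - t)            # world frame -> camera frame
    x, y, z = p_cam
    return np.array([f * x / z + cx, f * y / z + cy])

# A virtual object anchored on a flat plane 2 m in front of the camera.
anchor = np.array([0.0, 0.0, 2.0])

start  = project(anchor, rotation_y(0.0), np.zeros(3))   # initial view
panned = project(anchor, rotation_y(0.4), np.zeros(3))   # user pans away
back   = project(anchor, rotation_y(0.0), np.zeros(3))   # ...and returns

# The object drifts across the image while panning, but reprojects to the
# exact same pixel once the pose returns -- no recalibration required.
print(np.allclose(start, back))  # True
```

Because the pose is tracked continuously rather than re-anchored each frame, the reprojection stays consistent through arbitrary camera motion, which is what lets the user pan away and return without the object jumping.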
And to scale objects up or down proportionally, Google uses the same technology that YouTube uses to track and blur faces.
The end result looks similar to what the Pixel 2 can do, letting users insert polygonal chickens, robots, and dinosaurs into their scenes, although the branded experiences from the Pixel 2 aren't available initially.