Since the official launch of ARCore, Google has been working quietly on improving its augmented reality (AR) platform. Now, the company appears ready to unveil next-generation upgrades that promise to make AR experiences look far more realistic.

The upgrades, part of ARCore’s new Depth API, will soon allow developers to perform what is known as occlusion, in which a virtual object is blocked from view by real-world objects in the scene. For example, you could place a virtual cat in your living room and watch it disappear from view when you angle your camera so that a bed, a table or some other object sits in between.
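For developers, what makes this possible is a per-frame depth map supplied by the Depth API, which a renderer can compare against each virtual object’s distance from the camera. The following is a minimal Kotlin sketch of that idea, using publicly documented ARCore Android calls; the helper function names are illustrative, and the rendering and error handling a real app would need are omitted.

```kotlin
// Sketch only: enabling ARCore's depth mode and reading the per-frame depth
// image that makes occlusion possible.
import android.media.Image
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

fun enableDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        // Depth is estimated from ordinary camera motion, so no dedicated
        // depth sensor is needed.
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

fun depthImageFor(frame: Frame): Image? =
    try {
        // A 16-bit depth map in millimetres. A renderer compares it against
        // each virtual object's depth so that real geometry in front of the
        // object hides it, which is what occlusion means here.
        frame.acquireDepthImage16Bits()
    } catch (e: NotYetAvailableException) {
        // Depth needs a few frames of camera movement before it is ready.
        null
    }
```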

Google says it achieves this by optimizing existing software, so you will not need a phone with a special sensor or processor to see it. All of the processing also happens on the device itself, without any help from the cloud. As long as you have a phone that supports ARCore, which covers pretty much every new Android phone released in the last few years, you will be able to access these new features.
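For developers, that compatibility check is itself just a software query. A rough Kotlin sketch, assuming the standard ARCore Android SDK (the helper name here is illustrative):

```kotlin
// Sketch only: a purely software check for depth support. No special
// hardware is queried.
import android.content.Context
import com.google.ar.core.ArCoreApk
import com.google.ar.core.Config
import com.google.ar.core.Session

fun supportsDepthOcclusion(context: Context): Boolean {
    // First, is ARCore supported and installed on this device at all?
    val availability = ArCoreApk.getInstance().checkAvailability(context)
    if (availability != ArCoreApk.Availability.SUPPORTED_INSTALLED) return false

    // Depth is computed from ordinary camera frames on the device itself,
    // so this returns true on most recent ARCore-capable phones.
    val session = Session(context)
    val supported = session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)
    session.close()
    return supported
}
```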

Google’s AR division gave a real-time demo last week to show off the new depth technology. The demo showed AR objects moving through an environment by going around and over real-world objects. It involved a cooking robot that engages in a food fight with you, taking the room’s furniture and walls into account, with desserts leaving realistic splatters on surfaces. The demo also let you create colorful shaped blocks out of thin air that could bounce off virtually any surface, even the handlebars of an exercise bike.

In fact, these features are already available as part of updates to the home design app Houzz and Google’s own AR in Search feature. Google says more than 200 million Android devices will also get occlusion for any object that has an AR model in Google Search.

Google is also reported to have developed ways for AR objects to interact with the real world more realistically: moving through an environment the way a real 3D object would and colliding with surfaces the way you would expect physical matter to.

The company does not have a timeline for when it expects to release this toolset more broadly, but these capabilities are likely to show up in apps and AR web experiences some time next year.

