Metaverse is not dead! I have been working a lot recently with AI and Microsoft Mesh, but the metaverse is much more than that: it is not just a virtual world. It is also a “platform” where physical and digital realities can converge. By using robotic avatars, we can bridge the gap between the real and the virtual, allowing us to interact with both environments in novel ways. For instance, we can use robots to explore and manipulate physical spaces that are otherwise inaccessible or dangerous. This is nothing new. But what if you connect that to a new experience that links virtual spaces with physical ones? The virtual form can be enhanced with additional information and collaboration. Imagine working on a construction site with a team of experts from different countries, who can see through robots’ eyes, interact, and give you feedback and guidance in real time. Pretty cool. Sci-fi? Read on…
The potential of robotic avatars for the metaverse is intriguing, and it is not limited to the industrial sector. We can create new experiences and services that combine the best of both worlds, enhancing our physical reality with digital elements and vice versa. Robotic avatars can be a new interface for the metaverse.
Autonomous robots at industrial sites
When I read Telia’s article about the use of service robots, it sparked the idea that I need to write about other opportunities for the metaverse. The article described how a robot dog could autonomously scan its surroundings and thus create a dynamic virtual representation of the space it works in. This makes it possible to keep digital twins up to date with the latest information, especially when the imaging is analyzed with vision AI (Artificial Intelligence).
Frans the robodog is a four-legged mobile robot platform designed for industrial use, developed by Telia as part of the multi-purpose service robotics as operator business innovation project. It can collect and analyze data, model its environment in 3D, and perform gas and thermal imaging measurements. Frans has been tested in various industries, including construction, forestry, and mining, where it can monitor the progress and quality of work, detect hazards and equipment failures, and assist with small tasks. It can also operate in dangerous or hot environments where human workers would be at risk. One of its most impressive features is its ability to create a digital 3D model of its surroundings using its camera and measurement technology. This model can be used to create a metaverse, a virtual representation of the physical world, which can be accessed using VR/MR/XR (Virtual Reality/ Mixed Reality / eXtended Reality) headset/glasses or a computer screen, enabling remote collaboration and communication across separate locations.
Service robots can be utilized in various sectors, tasks, and roles, both indoors and outdoors: maintenance, surveillance, and even customer service.
Instead of a person doing the scan manually, a robot can model the environment continuously. When something changes, the model can be updated. Using vision AI to map the scanned elements onto digital twins links that digital model to the physical space. And no, this is not something that just emerged yesterday: Spot the robot dog fitted with scanner for 3D site data capture – Construction Management. Boston Dynamics already has a range of robots for versatile needs. Autonomous robots walking around the work site are not sci-fi; they are already a reality, and they have proven to be very useful. Read more in the Forbes article.
Picture from Boston Dynamics Spot page.
Taking the next step
We know Microsoft Dynamics 365 Remote Assist, which can be used to connect an expert to a local maintenance person, and they can use video, annotation, and other tools to reach the goal – fix the problem or finish the job. What else do we need? A digital twin? Let’s use Microsoft Mesh to create that. And we already established that there are robots roaming around industrial sites on various tasks. What do all these mean together for collaboration and the metaverse?
Instead of just creating a 3D model or digital twin based on scans, how about we start using robots as the physical representation of a person in the industrial metaverse? Someone could enter the Microsoft Mesh digital twin in virtual reality and get connected to the robot on site, which that person could walk around the site simply by defining, in the VR environment, where they want to go.
Controlling the robot in real time from virtual reality – based on what the robot sees – can easily cause issues. The more user-friendly solution is, in my opinion, to instruct the robot to move to a specific location and let it handle navigation and the transition by itself. After all, these robots are already assumed to be performing various autonomous tasks at the site.
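As a thought experiment, the “pick a destination, let the robot navigate” interface described above can be sketched in a few lines of Python. Everything here is hypothetical – `Waypoint`, `RobotNavigator`, and `go_to` are illustrative names, not any vendor’s API – and a real system would dispatch the goal to the robot’s own navigation stack asynchronously.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    """A named location in the site's digital twin, e.g. picked by the user in VR."""
    name: str
    x: float
    y: float

class RobotNavigator:
    """Hypothetical wrapper around a robot's autonomous navigation stack.

    The VR user only chooses a destination; pathfinding and obstacle
    avoidance are assumed to be handled by the robot itself.
    """

    def __init__(self):
        self.position = (0.0, 0.0)
        self.status = "idle"

    def go_to(self, waypoint: Waypoint) -> str:
        # In a real system this would send a navigation goal and return
        # immediately; here we simply simulate the robot reaching it.
        self.status = f"navigating to {waypoint.name}"
        self.position = (waypoint.x, waypoint.y)
        self.status = "arrived"
        return self.status

nav = RobotNavigator()
result = nav.go_to(Waypoint("turbine hall", 12.5, 3.0))
print(result)        # arrived
print(nav.position)  # (12.5, 3.0)
```

The key design choice is that the remote user’s input is a destination, not joystick commands – which sidesteps latency and motion-sickness problems of driving a robot live from VR.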
Adding more capabilities, video at minimum, would enable remote inspection of parts and areas. Remote-controlled drones with video feeds are nothing new, but what would be new in industrial use is the ability to interact with the people who are there – in the physical space – without a middleman. Unlike with Remote Assist, the remote person would be able to move around (as the robot) and use it as a channel for speech, video, and – depending on the robot – interactions via arms or tools.
It is possible to add an arm to Spot. Picture from Boston Dynamics.
But it gets way cooler than that. Agility Robotics’ Digit is an autonomous, human-shaped and human-sized robot that can perform many of the tasks a human can. Are two legs better than four? Probably not when it comes to stability, but they do enable more versatile use cases.
Picture from Agility Robotics.
GXO Logistics is testing Digit in a Spanx warehouse. Take a look.
That is not the only place where you can encounter Digits. Amazon is already testing them as well.
Keeping all this in mind, it may not be long before we start seeing more robots walking around warehouses, industrial sites, factories, and so on. This brings me to the next topic – the point of this article.
Robopresence
A robot can then be the stand-in in the physical world for the remote expert, making collaboration in the metaverse possible and real. The remote expert could pull more details, engineering plans, and measurements into the virtual space, going beyond what is shown in the video – since the whole area is 3D modeled, it is possible to see the big picture. This would allow use cases where a robot is given a task (say, checking some detail in the factory) and, when it arrives at the destination, the expert is notified so he/she can take control of the robot to perform the task. People physically at the site could talk with the robot, so they would not need to wear a headset or hold a device and point the camera in the right direction. Instead of a holopresence, this would be a robopresence.
Some benefits: the interaction becomes more natural, and remote experts can move around, choose their own view to see for themselves what the situation is on site, and investigate without a local person assisting – all while staying in the virtual environment. Talk about a real metaverse experience! And while robopresence is not in use, the robot could carry on with its automated tasks instead of just idling around. Buying robots only for robopresence will not deliver the ROI, but as more autonomous walking robots appear on sites, this would be a natural next step to utilize them.
There is a problem with a warehouse device? The operator gets a notification about something unusual and puts in a work order for a remote expert. The remote expert can “hijack” one of the robots on site, move it to the area with the issue, and start inspecting what is wrong – and possibly fix it. Or, in case a local person is needed, the remote expert could work with a local technician – but instead of having to keep a call open, the local person would be talking and interacting with the robot, which makes the remote expert present both virtually and physically at the same time.
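The hand-off in that scenario – the robot does its routine work until an expert takes over, then resumes afterwards – is essentially a small state machine. A minimal sketch, with all class and method names purely illustrative:

```python
class RobopresenceSession:
    """Hypothetical control-handoff logic for a site robot.

    States: "autonomous" (robot performs its routine tasks) and
    "teleoperated" (a remote expert has taken over). Illustrative
    only; not any vendor's API.
    """

    def __init__(self, robot_id: str):
        self.robot_id = robot_id
        self.state = "autonomous"
        self.expert = None

    def request_control(self, expert: str) -> bool:
        # Only one expert at a time; taking control suspends the
        # robot's own task queue.
        if self.state != "autonomous":
            return False
        self.state = "teleoperated"
        self.expert = expert
        return True

    def release_control(self) -> None:
        # Expert is done; the robot resumes its automated duties.
        self.state = "autonomous"
        self.expert = None

session = RobopresenceSession("spot-07")
granted = session.request_control("remote.expert@example.com")
print(session.state)  # teleoperated
session.release_control()
print(session.state)  # autonomous
```

The important property is that teleoperation is a temporary, exclusive lease on the robot – which is what lets the same fleet serve both automated inspections and on-demand robopresence.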
Autonomous robots are often used for analyzing and scanning the environment, so a person doesn’t have to go there personally. Picture from Boston Dynamics page.
Would these robots be four- or two-legged? Dogs or humanoids? It depends on the needs, but in environments with more complex tasks where versatility is required, humanoids will have an edge, since those sites have been designed with humans in mind. So, this is a full “it depends on the needs” answer. Would robopresence be better than the local technician using a mixed reality device (HoloLens)? It may sound off at this point in time, but I do think this eventually becomes another “it depends” question. There are tasks that can be performed via robots without needing to involve a local technician at all. Then there are tasks where it is far quicker for the maintenance technician to grab the HoloLens and work on the problem with D365 Guides and Remote Assist. Of course, there is also the aspect that talking to a robot will feel weird at first.
AI x Metaverse would be the thing here as well. Vision AI linked to the video feed can be used to detect anomalies, summarize the situation, and suggest next steps – and these are just a few examples. GPT-4 Turbo with Vision can already identify and describe what is happening in pictures or videos. This can be part of the HoloLens experience, but also shown in the virtual reality digital twin.
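To make that concrete: asking a vision-capable model about a robot’s camera frame boils down to a chat-style request carrying both a text instruction and an image reference. The sketch below only builds the request body (no call is made); the model name and the frame URL are placeholders, and the message shape follows the widely used “image_url content part” chat format.

```python
import json

def build_vision_prompt(frame_url: str) -> dict:
    """Build a chat-style request body asking a vision model to analyze
    one camera frame from the robot. Model name and URL are placeholders.
    """
    return {
        "model": "gpt-4-turbo",  # placeholder model name
        "messages": [
            {
                "role": "user",
                "content": [
                    {
                        "type": "text",
                        "text": (
                            "Describe any anomalies visible in this frame "
                            "and suggest next steps for the technician."
                        ),
                    },
                    {
                        "type": "image_url",
                        "image_url": {"url": frame_url},
                    },
                ],
            }
        ],
    }

body = build_vision_prompt("https://example.com/robot/frame-001.jpg")
print(json.dumps(body, indent=2))
```

The model’s text answer could then be surfaced in both places the article mentions: as an overlay in the HoloLens view and as an annotation in the Mesh digital twin.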
Industrial robots of course have use cases beyond manufacturing or the process industry. For example, think of hazardous environments that are dangerous to humans, such as nuclear power plants, chemical factories, or deep-sea exploration. Mining and exploration are other great industrial use cases where robots can work under harsh conditions. Some of this is already being done, since remotely controlled robots are just equipment that can be replaced easily.
Scifi or reality?
Is this a far-off vision or already reality? Copilot for Dynamics 365 Guides is connecting AI with mixed reality. While it relies on the person working at the site, with AI assisting, there are various building blocks that connect with this vision. There are autonomous robots at some sites walking around and performing tasks – some are robotic dogs such as Boston Dynamics’ Spot, and some are humanoid robots such as Agility Robotics’ Digit, being piloted at GXO Logistics and Amazon. Robopresence is something that could be done today, but most sites don’t have these autonomous robots… Yet.
Photo from GeekWire: Amazon started its initial real-world testing of Digit last week. (GeekWire Photo / Todd Bishop)
What do you think about robopresence? Are there cases of this already in the wild, or is someone already running a pilot?