
Brown, Cornell researchers develop VR software that uses robot proxy for enhanced remote collaboration

New technology allows for hands-on collaboration over a distance, regardless of size differences between facilities


The idea for VRoxy was sparked by Cornell PhD student Mose Sakashita's frustrations working remotely as a teaching assistant during the COVID-19 pandemic.

Photo Courtesy of Juan Siliezar / Brown University

Virtual reality has the potential to make remote collaboration far more feasible in the workplace, according to Brown researchers. With headsets, robots can act as proxies for remote workers in healthcare facilities or classrooms.

But current VR technology has limitations when it comes to spatial awareness. If the space in which a person operates a VR headset doesn't match neatly with the space a robot proxy might operate in, movement can become challenging. On Oct. 29, a team of researchers from Brown and Cornell presented new VR software that aims to alleviate this problem at the 2023 Association for Computing Machinery Symposium on User Interface Software and Technology.

The software, called VRoxy (a play on the term "virtual reality proxy"), allows remote collaborators to use a small space to physically interact with others in much larger facilities via a robot proxy.

"When people focus on VR interactions, … they mainly focus on collaboration with digital aspects," Mose Sakashita, a PhD student at Cornell studying human-computer interaction, told The Herald. "We wanted to focus on collaboration around the physical objects in the physical space."


"How can we facilitate complex tasks that cannot be supported by video-conferencing systems like Zoom?" he asked.

Sakashita, the lead author of a recent paper describing the project, was among those presenting at last week's symposium.

The idea for VRoxy was sparked by his frustrations working remotely as a teaching assistant during the COVID-19 pandemic. "It was really difficult for me to engage in collaboration," Sakashita said. "Those dynamic or nuanced cues like head direction … or my body position in relation to physical objects (were) completely missing." But robots have the capacity for these nuanced movements, he said. So "we started working on projects that use robotic embodiment to enhance the sense of being physically together."

Existing VR software allows users to control robots remotely, but it requires them to have a space that mimics their human collaborators' facilities in size and layout. "They're likely not going to have the same lab space or the infrastructure for it," Brandon Woodard GS, a PhD student at Brown studying human-computer interaction, told The Herald.

Woodard worked on the methodologies and testing of VRoxy remotely, operating a robot located at Cornell's Ithaca campus all the way from Providence.

VRoxy users see a 3D rendering of the remote space they are working in. When the user is navigating this space, "you're kind of in this cartoon world," Woodard said. "If we were constantly just running a 360-degree (live) video stream, … there (could) be a lot of glitches."

The user navigates this space by walking around and using teleportation links, represented by circles on the ground. If the user steps into a circle, they can teleport a much greater distance in the VR space than they are actually traveling in their physical location.

VRoxy "allows a remote person to be as mobile as anyone else who is physically in the same space," if not more so, wrote Jeff Huang, associate professor of computer science and Woodard's PhD advisor, in an email to The Herald. With just a few steps, users can even "move" to a completely different building, activating a different robot, Woodard explained.

Once the user navigates to their desired workspace within the facility, a 360-degree video feed is displayed. "So now, instead of seeing a 3D copy of the room, you actually see the robot's camera view. You can see people there, and you can fully look around and start interacting with people," Sakashita said.

As the user moves around, the robot can mimic their non-verbal cues, such as head rotation, facial expression, eye gaze and pointing gestures, Sakashita explained.


Once the user has completed their desired tasks in a particular location, they can go back into navigation mode where the 360 camera is disabled and they once again enter the 3D rendering.
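The two-mode behavior described above, a lightweight 3D rendering while navigating and the robot's live 360-degree feed once the user settles at a workspace, can be sketched as a simple state machine. This is purely an illustrative sketch; the class and method names are assumptions for explanation, not the actual VRoxy implementation.

```python
class ProxySession:
    """Hypothetical sketch of VRoxy's navigation/interaction modes."""

    def __init__(self):
        # Users start in the low-bandwidth 3D-rendered "cartoon world."
        self.mode = "navigation"

    def step_into_teleport_circle(self, destination):
        # A short physical step maps to a much larger jump in the VR
        # space, letting a small room stand in for a large facility.
        return f"teleported avatar to {destination}"

    def arrive_at_workspace(self):
        # Swap the 3D copy of the room for the robot's live camera view.
        self.mode = "interaction"
        return "360-degree live feed enabled"

    def leave_workspace(self):
        # Disable the camera and return to the rendered environment.
        self.mode = "navigation"
        return "3D rendering enabled"


session = ProxySession()
print(session.step_into_teleport_circle("lab bench"))
print(session.arrive_at_workspace())
print(session.leave_workspace())
```

Streaming live 360-degree video only in interaction mode keeps bandwidth low and avoids the glitches Woodard describes during navigation.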

Though the software is still in its infancy, the team is already working on improvements. "We have a mobile robot now, (but) it can only point and reference things," Woodard said. "The next step is to have a robot that can grab objects and manipulate them."

With future improvements, the software has the potential to be implemented in various fields, he added. Some of the team's target scenarios include remote teaching and telehealth.

"I can (also) really see remote surgery benefiting from something like this, especially in countries where they don't really have the infrastructure they may need," Woodard said.


In any scenario, physical presence and non-verbal cues help facilitate a better understanding of colleagues' intentions and enhance the social aspects of collaboration, researchers told The Herald.

"Zoom and video conferencing tools have focused on the face and spoken word as the primary channel for communication," Huang wrote. "Being able to share and interact at the room level allows a fuller whole-person social experience."


Liliana Cunha

Liliana Cunha is a staff writer covering Science and Research. She is a sophomore from Pennsylvania concentrating in Cognitive Neuroscience. In her free time, she loves to play music and learn new instruments.


