The future of collaboration in virtuality

by Pelican Press

Teleconferencing has further advanced collaboration within organisations and with partners and customers. Driven by the needs that the pandemic imposed in 2020, teleconferencing tools are now in common use. Simultaneously, a wide range of collaboration platforms emerged, facilitating various types of teamwork.

Extended reality (XR) – augmented and virtual reality (AR and VR) – the metaverse and digital twins now promise to unleash even more effective forms of collaboration. These new technologies enable intuitive visual and auditory landscapes, ranging from simple graphical representations of rooms and participants to complete immersion in realistic environments that place users inside the applications.

Expanding on this, spatial computing will increasingly free users from clunky interfaces that can distract and often require training. Instead, the user’s virtual surroundings will make it natural to interact in ways that resemble real-life behaviour.

Sound and sight will provide directional understanding, and sensor fusion will add layers of environmental, equipment and system data from the real world to virtual environments. Increasingly accurate and authentic representations of actual objects and phenomena will blur the lines between the real world and virtual representations. In fact, a team member in a real-world facility might share visualisations of this environment virtually with other team members using AR or VR. Field technicians, for instance, could almost seamlessly share on-location conditions with experts in command centres. 
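
To make the idea of sensor fusion concrete, the short Python sketch below – a minimal illustration with hypothetical sensor names and values, not a description of any particular product – combines two noisy readings from a real-world machine into a single estimate that a virtual overlay could attach to the machine’s digital twin, using simple inverse-variance weighting.

```python
def fuse_readings(readings):
    """Combine independent (value, variance) sensor readings with
    inverse-variance weighting: quieter sensors count for more."""
    weights = [1.0 / variance for _, variance in readings]
    fused_value = sum(value * w for (value, _), w in zip(readings, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance


# Hypothetical example: two temperature probes on the same pump,
# surfaced as one overlay label on the pump's digital twin.
probes = [(72.4, 0.25), (71.8, 1.0)]  # (degrees C, noise variance)
value, variance = fuse_readings(probes)
print(f"Overlay label: {value:.1f} °C (±{variance ** 0.5:.2f})")
```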

Current headset challenges

Geoffrey Bund, CEO of Headwall, a provider of AR- and VR-enhanced command-and-control centre applications, notes that headsets have not caught up with the media hype cycle that has positioned AR and VR as the next computing environment within reach. Ubiquity – at least in the mass market – will take a while.

Bund points to journalist Nilay Patel of The Verge, who argues that “you have to deliver a lot of value to someone to get something on their face”. Bund adds: “So far, headset size, weight and power concerns have prevented the exponential growth predicted in the consumer space because the value isn’t quite there.”

Matthew Ball, author of The metaverse: And how it will revolutionise everything, shares this sentiment. “It’s fair to say the technology has proved harder than many of the best-informed and most financially endowed companies expected,” he says.

Ball lists a number of applications – civil engineering, industrial design, workplace training, and others – where XR is not uncommon any more. But he noted in 2023: “It’s difficult to say that a critical mass of consumers or businesses believe there’s a ‘killer’ AR/VR/MR experience in the market today.”

Jani Jokitalo, senior advisor for information and communications technology at Business Finland, also perceives the need to improve displays. One of the stubbornly remaining challenges is to reduce the nausea that headset users can experience – an obvious hurdle to effective collaboration, particularly over longer periods. Part of the sensation is triggered by vergence-accommodation conflict.

Vergence is the rotation of the eyeballs towards each other to form a single stereoscopic image of an object, while accommodation is the adjustment of the eyes’ lenses to bring that object into sharp focus. In the real world, the two adjustments always agree. In the virtual world, however, vergence and accommodation are often mismatched, resulting in virtual reality sickness. New display technologies, including varifocal lenses, can overcome the mismatch, but they still need to mature.
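
To see the mismatch in numbers, the small Python sketch below – a simplified illustration assuming an interpupillary distance of 63mm and a headset with a fixed focal plane at 1.5m, both figures chosen only for the example – computes the angle the eyes converge at for virtual objects at different distances while the focus distance stays the same.

```python
import math

IPD_M = 0.063        # assumed interpupillary distance (metres)
FOCAL_PLANE_M = 1.5  # assumed fixed focal distance of the display (metres)

def vergence_angle_deg(distance_m):
    """Angle between the two eyes' lines of sight for an object at distance_m."""
    return math.degrees(2 * math.atan(IPD_M / (2 * distance_m)))

for virtual_distance_m in (0.5, 1.5, 5.0):
    print(f"virtual object at {virtual_distance_m} m: "
          f"vergence ~{vergence_angle_deg(virtual_distance_m):.2f}°, "
          f"accommodation stuck at {FOCAL_PLANE_M} m")
```

Only for objects rendered near the assumed 1.5m focal plane do the two cues agree; everywhere else the conflict that varifocal approaches aim to remove remains.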

Another consideration is the weight of headsets and the balance of that weight so devices are not front-heavy. Users have to counteract the weight with muscle effort, which becomes exhausting over extended use. According to Jokitalo, “technical factors such as weight, miniaturisation, computational power, battery capacity, energy consumption and heat dissipation” need to improve while simultaneously considering “usability, comfort and, above all, cost”.

Headwall’s Bund also lists developments that will affect collaboration of the future. Headset technology is advancing fast, and Bund sees an important advantage in the ability of users to see the world around them through cameras within the headset.

First, users feel more comfortable using headsets if they are aware of their physical environment, so they can avoid hurting themselves or damaging objects. Second, and particularly relevant for collaboration in mixed environments, users can still see their team members or coworkers, so relatively natural interactions can take place.

Also, improved hand and eye tracking will allow the system to collect data that helps with interacting with virtual objects and with rendering collaboration-relevant images. Finally, adding detailed facial features, including expressions and eye blinking, conveys team members’ reactions and therefore facilitates intuitive interaction.
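
As a rough sketch of how eye-tracking data can feed interaction with virtual objects, the Python snippet below – purely illustrative, with a made-up object and positions – checks whether the tracked gaze ray passes through a spherical bounding volume around a shared virtual object, one common way of working out what a user is looking at.

```python
def gaze_hits_sphere(origin, direction, centre, radius):
    """True if a gaze ray from `origin` along unit vector `direction`
    passes through the sphere at `centre` with the given `radius`."""
    to_centre = [c - o for c, o in zip(centre, origin)]
    t_closest = sum(a * b for a, b in zip(to_centre, direction))  # projection onto the ray
    if t_closest < 0:
        return False  # the object is behind the user
    dist_sq = sum(a * a for a in to_centre) - t_closest ** 2      # squared ray-to-centre distance
    return dist_sq <= radius ** 2

# Hypothetical example: eye at the origin looking straight ahead (+z),
# a shared whiteboard floating 2 m away and slightly to the right.
eye, gaze = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)
print(gaze_hits_sphere(eye, gaze, centre=(0.1, 0.0, 2.0), radius=0.3))  # True
```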

The ever-increasing capabilities of immersive technologies will enable improved and entirely new forms of collaboration. Many simultaneous developments will catapult XR devices towards truly immersive applications that pull a growing amount of information into a coherent, comprehensive and interactive landscape.

The distinction between AR and VR is increasingly relegated to technological and conceptual discussions – for practical purposes, AR and VR combine in XR and MR (mixed reality) applications.

Imitating real environments

Karoliina Salminen, smart manufacturing lead at Finnish research and technology organisation VTT, views effective persistency as a crucial aspect and believes that digital and physical worlds “must be blended autonomously based on current and individual needs”. To do so, she says, “maintaining the situational awareness of the physical worlds, virtual worlds and users” is crucial.

Problematically, the current technology is not yet sufficiently mature. Another consideration is communication between all three elements – the physical environment, the virtual landscape and the user – and the ability to adapt to the situation at hand and the tasks a user is trying to accomplish.

Salminen also notes the need to create natural communication and effective collaboration between and among users – the ability to sense and experience collaboration partners’ surroundings as naturally as possible to closely match face-to-face, real-world interactions. “Shared reality” is the keyword in this context.

Current AR and VR technologies allow many visual and verbal aspects of communication to be shared, but they do little to convey sensations from the other senses. In real-world collaboration, team members experience environments holistically and simultaneously – many experiences therefore do not need to be mentioned or spelled out explicitly.

For instance, the texture of device surfaces or the temperature of environments can be difficult to assess when sharing relies on verbal and explicit communication, which means intuitive and visceral understanding of issues and benefits can suffer in virtual collaborations. Implicit sharing of experiential information is a challenge, but also a crucial aspect of improving collaboration.

Moving towards spatial computing

The move towards increasingly sophisticated spatial computing represents a step forward for natural-feeling collaboration. The concept goes back to 2003, when Massachusetts Institute of Technology researcher Simon Greenwold published Spatial computing, a document that outlined this new concept of interacting with digital devices.

“Spatial computing is human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces. It is an essential component for making our machines fuller partners in our work and play,” he wrote.

Spatial computing is a stepping stone towards increasingly immersive environments that connect the physical and virtual worlds in visceral and intuitive ways. Interface technologies in virtuality will enable seamless interaction with digitally created environments. Input options in the metaverse and advances in sound and haptics will make the virtual world genuinely immersive, letting users become one with digital landscapes. Such environments could open up new opportunities in training and collaboration.

Training applications will not only provide theoretical knowledge in intuitive ways, but can also teach physical actions and even establish muscle memory through repeated prompting of movements. Such physical involvement will further collaboration in research efforts, product design and operational tasks, to name just a few areas that will benefit from spatial computing. In particular, spatial computing can enable seamless remote interaction with physical systems and objects. For example, workers can operate machinery, researchers can handle lab equipment, and surgeons can collaborate on challenging medical procedures.

Similarly, spatial audio can augment, or even create, soundscapes that provide authenticity and additional layers of information. For example, conversations can be followed more easily when directional sound helps identify speakers. Machinery noise can reveal the location of equipment. Changes in a sound’s location and character can convey movement, for instance by mimicking the frequency shift of a passing car’s engine noise.
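
The passing-car example comes down to the classical Doppler effect. The short Python sketch below – a simplified model with hypothetical numbers, ignoring distance attenuation and reflections – shows how the frequency a stationary listener hears rises as the source approaches and falls as it recedes, exactly the cue a spatial audio engine can reproduce.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def observed_frequency(f_source, car_x, car_speed, lateral_offset):
    """Frequency heard by a listener at the origin as a car travelling in the
    +x direction along the line y = lateral_offset passes by (car at x = car_x)."""
    distance = math.hypot(car_x, lateral_offset)
    v_towards_listener = -car_speed * car_x / distance  # radial component of the car's velocity
    return f_source * SPEED_OF_SOUND / (SPEED_OF_SOUND - v_towards_listener)

# Hypothetical example: a 100 Hz engine tone, car at 20 m/s, passing 5 m away.
for x in (-50, -10, 0, 10, 50):
    print(f"car at x = {x:>4} m: listener hears {observed_frequency(100, x, 20, 5):.1f} Hz")
```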

Spatial computing could also establish the mapping of environments, the fusion of a growing amount of sensor data, and even more seamless interaction within and with virtual environments and overlaid elements and objects. Naturally, artificial intelligence (AI) will play a role here, using the data to render environments and visuals. In entertainment, for example, AI will create personalised applications and storylines. In professional applications, AI will establish layers of information that can flexibly cater to operational and organisational needs.

Cathy Hackl, author of Into the metaverse and coauthor of Spatial computing: An AI-driven business revolution, argues that the market is currently experiencing “an evolution from mobile computing into spatial computing – computing that extends beyond the screens”.

She notes: “It starts to understand the physical world and expands computing further. Spatial computing will enable devices to understand the world in ways they never have been able to do before. It is going to change human-to-computer interaction, and eventually every interface – whether it’s a car or a watch – will become spatial computing devices.”


Martin Schwirn is the author of “Small data, big disruptions: How to spot signals of change and manage uncertainty” (ISBN 9781632651921). He is also senior advisor, strategic foresight, at Business Finland, helping startups and incumbents to find their position in tomorrow’s marketplace


