Evolution of Industrial Robots from the Perspective of the Metaverse: Integration of Virtual and Physical Realities and Human–Robot Collaboration (2024)

1. Introduction

During the transition from Industry 4.0 to Industry 5.0, industrial robotics technology is undergoing profound changes [1]. Industry 4.0 focuses on automation and data exchange, realizing intelligent and highly integrated manufacturing processes through IoT, cloud computing, and AI technologies [2]. Industry 5.0 further emphasizes human–machine collaboration, aiming to enhance production efficiency and flexibility through advanced intelligent technologies and human–machine interaction while also focusing more on sustainability and human-centric development.

In this transition, the application of metaverse technology in the industrial field makes the industrial metaverse a new direction for technological development. The industrial metaverse creates an immersive and interactive virtual environment, allowing technicians to perform simulations and experiments in the virtual world, breaking the limitations of traditional industrial operations [3]. As a core element of the future industrial metaverse, industrial robots will inevitably achieve virtual–real integration with digital models; to monitor and control this core element, a collaborative interface between humans and virtual–real models is required.

By constructing virtual models, a manipulable platform beyond physical boundaries is provided, allowing users to operate and collaborate autonomously without geographical and temporal constraints, thus addressing challenges such as high costs, slow technological updates, high operational difficulty, and the significant safety hazards faced in the intelligent application of current industrial robots [4]. Meanwhile, virtual–real integration and collaborative interaction technologies can further support data collection, analysis, decision making, and control, truly realizing controllable virtual–real linkage.

Current virtual modeling technologies, such as ABB Ltd.’s RobotStudio [5], KUKA Robotics’ KUKA Sim Pro [6], and Elephant Robotics’ MyStudio [7], achieve, to varying degrees, functions such as robot simulation, offline programming, visual analysis, and virtual debugging. However, they cannot effectively achieve cross-platform interoperability, real-time communication, and control, nor meet the growing needs for scalability and customization. ROS (Robot Operating System) is the most widely used software framework for robot development, supporting various operating systems and hardware platforms [8] and providing rich development resources. The ROS offers a robust real-time communication mechanism which, through the WebSocket protocol, enables real-time communication with and control of robots on the WEB platform, mirroring the actual posture of the robots [9] and thus making cross-platform interoperability possible. On this basis, it can also interface with WEB standards for virtual and augmented reality [10], constructing robot models in the industrial metaverse and promoting the realization of Industry 5.0.

Therefore, this paper designs a virtual–real interaction system based on ROS and WEB technologies, capable of effectively loading robots of various platforms, models, and brands, displaying and controlling robot actions in real time, and allowing for customized modifications to robots, providing a foundation for the subsequent industrial metaverse.

This system can achieve the following expected societal impacts in terms of technological development and application, particularly through the deployment of the industrial metaverse:

  • The system can realize cross-platform and interoperable robot management and control. This will significantly improve production efficiency, reduce issues caused by incompatibilities between different brands and models of equipment, and lower costs for enterprises related to equipment maintenance and management.

  • The system supports cross-regional communication and collaboration, and is adaptable to global and diverse industrial environments. Through virtual modeling and real-time interaction, cross-cultural teams can collaborate more effectively, enhancing work efficiency and team cohesion.

  • The open-source nature of the system can promote deep technical exchange and cooperation. Developers and researchers can use this system to develop and validate new algorithms and functions, thereby advancing robot technology and virtual reality technology.

The rest of this paper is structured as follows: Section 2 explores the development trajectory of the metaverse and industrial robots and the connotations of virtual–real integration; Section 3 discusses preliminary solutions for human–machine collaboration in the context of the industrial metaverse and proposes a design framework for a virtual–real interaction system, implementing a cross-platform multi-industrial robot interaction system based on WEB technology; Section 4 focuses on key technologies in virtual–real integration and virtual–real model communication technology, and designs and implements a general communication mechanism based on the ROS; Section 5 introduces the detailed method of the virtual robot URDF model and loading; Section 6 demonstrates the setup of various system environments; Section 7 compares the system’s performance with traditional simulation software; and Section 8 provides conclusions and future work prospects.

2. The Development Trajectory of Metaverse + Industrial Robots and the Connotation of Virtual–Physical Integration

2.1. Development Trajectory

With continuous technological advancements, the application of industrial robots is also evolving to meet increasingly complex manufacturing demands and technical standards. The evolution of virtual–physical integration in industrial robot applications can be divided into four stages.

1. Early Automation and Programming:

From the 1960s to the 1970s, with the emergence and use of the first industrial robots, the focus was on basic automation technology and programming. During this period, all operational steps were performed in the real world, with the primary goal of enabling operators to understand and master the operation of the equipment. By the late 1970s to the 1980s, applications began to include microprocessor control and more flexible programming languages, such as VAL (Variable Assembly Language). At this stage, all operational steps were also completed in the real world.

2. Integrated Systems and Enhanced Functions:

In the 1990s, applications started focusing on the use of robots in integrated systems, involving enhanced functions such as vision recognition and intelligent sensing. During this development stage, the application of industrial robots began using offline data to recreate the actual working scenarios of robots in virtual environments [11]. Although the operation of robots in reality and virtual simulation was not synchronized in time, this method promoted the diversification and parameterized planning of applications [12]. Meanwhile, these virtual scenarios could be connected to real control systems, allowing for semi-physical simulation and debugging of the actual control systems, enhancing the interactivity and practicality of operations [13].

3. Intelligent and Collaborative Robots:

From the early 21st century to the present, the focus has shifted to the intelligence and collaborative capabilities of industrial robots, emphasizing the integration of robot systems with AI and IoT technologies. At this stage, to overcome the inconsistency in time scales, the application of industrial robots has realized information-driven virtual scenarios that are synchronized with real systems in real time. For example, Adriana Florescu et al. use Tecnomatix Plant Simulation (Siemens Product Lifecycle Management Software Inc.) to create a simulation model of a robotic arm system [14]; F. Gabriele Pratticò et al. use a VRTS (an HTC Vive Pro kit with the Unity game engine) for virtual reality training with robots [15]; Fuwen Hu et al. use SOLIDWORKS for dual-arm modeling and simulation [16]; and Xi Wang, Luis Pérez, and Pengxiang Xia use Unity v2020.1 to simulate industrial robot working environments [17,18]. Based on real-time, accurate virtual scenarios, parallel application systems synchronized with real-time status are constructed to simulate future planning and path selection [19]. During the simulation of parallel systems, the optimal solution is selected, connected to the actual control system through information flow, and converted into control commands that guide the physical robots to execute tasks according to the optimal strategy.

4. Industrial Metaverse Evolution:

In the future, the industrial metaverse will be a new stage in the application of industrial robots. In this entirely virtual industrial environment, broader global collaboration and communication can be achieved, creating boundless operational and training scenarios, making production and operation geographically unrestricted, and greatly enhancing industrial efficiency and effectiveness [20]. Alberto Martínez-Gutiérrez et al. discussed enabling technologies (e.g., VR, DT, IIoT, etc.) to design a connection framework to achieve the purpose of the industrial metaverse [21]. The industrial metaverse will provide an immersive and interactive operational platform where users can perform real-time operational training and solve complex engineering problems through virtual collaboration, further promoting innovation and the application of industrial robot technology [22].

2.2. Analysis of Virtual–Physical Integration in Various Stages

The levels of intelligence and virtual–physical integration have progressively increased across the four development stages described above. Figure 1 illustrates the system characteristics, integrated technological concepts, and human–machine collaboration methods exhibited at each stage, revealing the connotations of virtual–physical integration at each stage.

The system characteristics exhibited at each stage, categorized by the degree of intelligence, are as follows:

1. Offline Simulation and Display:

During the Integrated Systems and Enhanced Functions stage, this system characteristic was mainly achieved through 3D modeling and virtual reality technologies. For example, using CFD (Computational Fluid Dynamics) offline modeling technology, it is possible to effectively monitor the state of smart factories and provide operational guidance [23]. By the stage of Intelligent and Collaborative Robots, with the development of augmented reality (AR) and digital twin technologies, virtual information can be superimposed on the actual physical environment, allowing operators to intuitively see the equipment status and operational guidance in the real environment, thereby improving operational efficiency and accuracy.

2. Immersive Training and Simulation:

Initially, this system characteristic was achieved through virtual reality (VR) technology. With the help of head-mounted devices and controllers, operators could enter a fully simulated industrial environment to perform equipment operation and maintenance training. VR training has significant advantages in improving operator skills and proficiency [24]. Similarly, AR, digital twins, parallel systems, and cyber-physical systems (CPS) technologies can also realize this system characteristic, endowing it with new connotations.

3. Real-time Simulation and Monitoring:

This system characteristic is realized through AR technology, which allows real-time monitoring of equipment operational status. Operators can test and optimize equipment behavior in a virtual environment, where the simulation environment can reflect the reactions of real equipment in real time, achieving efficient virtual–physical integration. In CPS factory modeling and simulation, real-time monitoring and feedback are used to optimize production processes [25].

4. Production Process Simulation and Validation:

This system characteristic combines the advantages of virtual simulation and actual operations, using digital twin technology to simulate and validate production processes. Operators perform operations in a virtual environment to ensure the optimization of production processes and the effectiveness of actual operations. The application of digital twin technology in multi-robot systems enhances production line optimization and operational efficiency [26].

5. Reverse Control:

This system characteristic is achieved through the combination of real-time data feedback and advanced algorithms, enabling reverse control of equipment. Behaviors and adjustments learned in the virtual environment can be directly applied to actual operations, enhancing adaptive and efficient operations. Real-time monitoring servers allow for real-time data feedback and reverse control of equipment, improving the system’s adaptability and efficiency [27].

6. Human–Machine Integration and Collaboration Platform:

This system characteristic creates an immersive, fully virtual workspace where operators and team members can interact in real time with colleagues around the globe. The virtual collaboration and innovation platform allows for work and learning in an environment without geographical and physical limitations, realizing global collaborative innovation in the industrial metaverse. In industrial environments, intuitive adaptive control technologies that enhance human–machine interaction can significantly improve team collaboration efficiency and innovation capabilities [28].

The proportion of information technology in the above system characteristics continues to increase, deepening the transition from physical to virtual. The integrated technological concepts evolve from 3D modeling, VR, and AR to digital twins, parallel systems, and CPS, progressively enhancing the degree of virtual integration and supporting the industrial system through trial and error, deduction, simulation, and computation enabled by information technology.

Three-dimensional modeling is the foundational technology for realizing VR, which, along with VR and AR technologies, collectively realizes digital twins [29]. Digital twins can extend into parallel systems, and when combined with control, they form programmable CPS. The higher the degree of virtual–physical integration, the more intelligent the system characteristics supported. For instance, digital twin technology can support offline simulation and display, immersive training and simulation, real-time simulation and monitoring, and production process simulation and validation. Parallel systems further enable reverse control, and the highest degree of virtual–physical integration in CPS can provide a complete human–machine integration and collaboration platform.

As the degree of virtual–physical integration increases, user participation evolves from on-site operations in the real world to on-site operations via virtual interfaces, and ultimately to virtual operations by virtual humans in a virtual world.

3. Preliminary Plan for Human–Machine Collaboration in the Metaverse + Industrial Robots

Human–machine collaboration (HMC) refers to the complementary advantages of humans and machines working together to accomplish tasks, resulting in higher efficiency, better outcomes, and greater flexibility compared with working independently. The goal of HMC is to leverage human intelligence, creativity, and judgment in combination with the computational power, precision, and repeatability of machines, thereby optimizing workflows, enhancing productivity, and creating value.

The integration of metaverse technology with industrial robotics significantly expands the prospects for HMC applications across various fields. In virtual training and simulation, new employees can quickly acquire operational skills through interaction with virtual devices. During training, the system records user operation data, analyzes and evaluates these data, and provides personalized training programs [30]. In industrial manufacturing, operators can remotely control robotic arms using VR devices to perform precise assembly and adjustments. Multiple operators can collaborate in a virtual environment, optimizing production processes and increasing efficiency [31].

For system maintenance and repair, experts can remotely guide on-site operators through the maintenance and repair process using the system. They can view the equipment status via VR devices, provide operational suggestions, and monitor the repair process in real time, ensuring accuracy and safety in operations. This approach enhances the efficiency of maintenance and repair tasks [32,33].

3.1. Concept of Implementing Industrial Metaverse Human–Machine Collaboration via WEB

By integrating WEB, ROS, and WEBXR technologies, an industrial metaverse platform is constructed, enabling users to collaborate with industrial equipment in virtual reality, as shown in Figure 2.

The real-time motion data of the robotic arm are transmitted to the ROS through a socket connection. The ROS (Robot Operating System), serving as the core backend, is responsible for processing and managing these data to achieve real-time interactive operation of the robotic arm [34].

The ROS converts the processed data into model data and subscription messages, which are transmitted to the front-end Three.js application via WebSocket [35]. Three.js uses these data to render a 3D model of the robotic arm and display it on the webpage, allowing users to view the arm’s actions and status in real time within a browser.

Three.js integrates the WEBXR API to transmit the rendered 3D scene to VR devices. When users wear VR headsets, they can interact immersively with the 3D model in virtual reality. Operations performed by users in VR are transmitted back to Three.js through the WEBXR API, which then sends the operation data back to the ROS, achieving real-time virtual and physical interaction.

Users operate within the virtual scene using VR devices, and these operation data are fed back to the ROS in real time. The ROS adjusts the robotic arm’s actions based on the user’s operations and updates the latest status in the virtual scene via Three.js, forming a closed-loop control system.
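As an illustrative sketch rather than a complete implementation, the VR leg of this loop can be wired up with roslibjs and Three.js’s WebXR support as follows. The topic name ‘/robot/control’ follows Pseudocode 2 in Section 4.2; the scene, camera, and robot objects are assumed to come from the model-loading code of Section 5, and the rosbridge address is a placeholder.

import * as THREE from 'three';
import { VRButton } from 'three/examples/jsm/webxr/VRButton.js';
import ROSLIB from 'roslib';

const ros = new ROSLIB.Ros({ url: 'ws://localhost:9090' }); // rosbridge server (illustrative)

// Enable WebXR so the rendered scene is presented on the VR headset.
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.xr.enabled = true;
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer));

// Publish the user's VR operations back to the ROS, closing the loop.
const commandTopic = new ROSLIB.Topic({
  ros: ros, name: '/robot/control', messageType: 'sensor_msgs/JointState'
});
const controller = renderer.xr.getController(0);
controller.addEventListener('selectend', () => {
  commandTopic.publish(new ROSLIB.Message({
    name: Object.keys(robot.joints),                        // 'robot' from Section 5
    position: Object.values(robot.joints).map((j) => j.angle)
  }));
});

// WebXR rendering uses setAnimationLoop instead of requestAnimationFrame.
renderer.setAnimationLoop(() => renderer.render(scene, camera));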

This system has extensive application prospects in various fields, such as industrial manufacturing, remote training, and virtual experiments, providing robust technical support for the transition from Industry 4.0 to Industry 5.0.

3.2. Design Architecture of Multi-Industrial Robot Connection WEB Simulation System

This system is divided into three parts: industrial robots, the ROS, and the WEB platform, as shown in Figure 3.

  • Controller and Body Part: The core of the robotic system includes the underlying hardware and controllers, consisting of physical robots, drive motors, sensors, processors, and controllers that directly control these hardware components. This framework includes processors that run control algorithms and handle robotic computation tasks, motor drivers that receive control commands to drive robot motors and execute actions, sensors that collect robotic state information such as position, speed, and environmental data, communication interfaces for data transmission between hardware components, control algorithms for robotic motion control such as path planning and posture adjustments, and interfaces linking sensors, actuators, communication modules, and other hardware components.

  • ROS: The ROS is a flexible framework for robotic applications, providing service design, hardware abstraction, low-level device control, common functionality implementation, and message passing. This framework includes topics for asynchronous message passing via a publish–subscribe mechanism, services offering request–response style interactions, ActionLib for handling asynchronous server communication such as long-duration movements, a parameter server for storing and managing ROS parameters used for configuration, IO management related to hardware devices, and nodes in the ROS for information publishing, subscribing, and parameter reading [36].

  • WEB Platform: As a cross-regional, cross-platform user interface, the WEB platform allows users to interact with the robotic system over the Internet. This platform addresses three key issues: the WEB interface accessible via browsers providing a graphical user interface, data transmission and communication through protocols like WebSocket for data exchange with the ROS [37], and backend processing for logical operations and data exchange, such as importing URDF models and exchanging robotic joint data. This framework includes login authentication to ensure system security and control access permissions, a real-time data display showcasing robot status such as position, speed, and fault information, a user interface for sending commands like start, stop, and path planning, and motion detection for monitoring robotic movements and providing real-time feedback.

These three parts work together to provide a complete process from user commands to robot actions. Users issue commands via the WEB platform, the ROS acts as middleware to process these commands and transmit them to the controller, and the controller executes the corresponding physical operations, controlling the robot to complete tasks. This framework allows for the rapid development and deployment of robot applications and can be flexibly adapted to different hardware and software requirements.

The system requires two-step communication to support the interaction between industrial robots and the WEB platform. The physical industrial robot connects to the ROS, and the ROS connects to the WEB platform, with the ROS used as middleware to achieve real-time monitoring and control of the physical industrial robot’s posture remotely.

3.3. Function Design and Implementation of the WEB Platform

The WEB platform can display the real-time posture of robots and provide features such as coordinate axis switching, state tracing, and multi-industrial robot loading. The combination of these functions ensures that the multi-industrial robot platform can offer an efficient, interactive, and comprehensive operating environment, thereby enhancing the skill levels of operators and increasing production efficiency.

3.3.1. Real-Time Posture Display of Industrial Robots

After the ABB IRB2400 robot model is loaded, functions defined in the code represent the rotation angles of each joint of the robotic arm. These data are received in real time by the ROS through messages sent by the industrial robot. Based on these functions, the robotic arm’s posture is then updated in real time, adjusting each joint according to its rotation angle, as shown in Figure 4.

First, the ROS acquires real-time motion parameters of the robot, such as arm movement, axis movement, movement direction, motion trajectory, and timestamps. While reading these parameters, the system also stores them. Then, the parameters are transmitted to the model on the WEB platform to achieve real-time posture display of the robot model.

In each iteration, the system calls functions and renders the scene to update the posture of the robotic arm in real time. Additionally, users can manually input the desired angles in the angle input fields to manually control the movement of the robotic arm, thereby achieving reverse motion control of the physical robot.
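A minimal roslibjs sketch of this forward/reverse loop is given below. The topic names match Section 6.3 and Pseudocode 2; the input element ID and joint name are illustrative, and robot is assumed to be a loaded URDFRobot instance (Section 5).

const jointStates = new ROSLIB.Topic({
  ros: ros,
  name: '/robot_joint_states',
  messageType: 'sensor_msgs/JointState'
});

// Forward posture: apply each joint angle received from the ROS to the model.
jointStates.subscribe((msg) => {
  msg.name.forEach((jointName, i) => robot.setJointValue(jointName, msg.position[i]));
});

// Reverse control: publish a manually entered angle back to the ROS.
const control = new ROSLIB.Topic({
  ros: ros,
  name: '/robot/control',
  messageType: 'sensor_msgs/JointState'
});
document.getElementById('axis1-angle').addEventListener('change', (e) => {
  control.publish(new ROSLIB.Message({
    name: ['joint_1'],                                   // illustrative joint name
    position: [Number(e.target.value) * Math.PI / 180]   // degrees to radians
  }));
});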

3.3.2. Coordinate Axis Switching

To adapt to a new coordinate system, such as changing the upward direction from the Y-axis to the X-axis, the model needs to be rotated, and its position adjusted, as shown in Figure 5.

The system first reads the global coordinate axes from the URDF model file to ensure the model is correctly loaded and displayed on the platform. It then sets the X, Y, and Z axes and their corresponding −X, −Y, −Z to meet most coordinate axis switching requirements.

By applying a transformation matrix to the loaded robot model, all the joints can be adjusted according to the new coordinate system. Furthermore, the model’s rotation and position need to be updated to match the new coordinate system.
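The following sketch illustrates this adjustment. Because URDF models are commonly Z-up while Three.js scenes default to Y-up, the exact rotation for each case depends on the model’s native frame, so the mappings below are illustrative.

function setUpAxis(robot, axis) {
  robot.rotation.set(0, 0, 0);                        // undo any previous switch
  const m = new THREE.Matrix4();
  switch (axis) {
    case '+Z': m.makeRotationX(-Math.PI / 2); break;  // Z-up model in a Y-up scene
    case '+X': m.makeRotationZ(-Math.PI / 2); break;  // X becomes the upward direction
    case '-Y': m.makeRotationZ(Math.PI); break;       // inverted Y-up
    default:   m.identity();                          // '+Y': the Three.js default
  }
  robot.applyMatrix4(m);                              // rotates all joints with the root
  robot.position.set(0, 0, 0);                        // re-seat the model in the new frame
  robot.updateMatrixWorld(true);
}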

3.3.3. Multi-Industrial Robot Platform Loading

In the industrial metaverse, a simulation platform capable of loading and managing multiple industrial robots offers an innovative and efficient solution. This platform integrates technologies such as virtual reality, augmented reality, the Internet of Things (IoT), and artificial intelligence, creating a highly immersive, interactive, and intelligent industrial operating environment, as shown in Figure 6.

By loading multiple industrial robots from different platforms and models onto the simulation platform, users can view the robots on various terminals such as laptops and tablets via the WEB.

First, the system parses the robot models and then checks whether the platform already has a model displayed. If a model is already present, the system offsets the model’s coordinates during loading to prevent overlap; if there is no model, it loads directly. After loading, the system assigns the active joints of the robots, enabling the loading of multiple industrial robots.
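This flow can be sketched as follows, where the spacing constant and the filter on fixed joints are illustrative choices:

const loadedRobots = [];
const SPACING = 1.5; // metres between robots on the platform (illustrative)

function addRobotToPlatform(robot) {
  // Offset new models so they do not overlap robots already displayed.
  robot.position.x = loadedRobots.length * SPACING;
  scene.add(robot); // 'scene' is the platform's Three.js scene

  // Assign the active (movable) joints so each robot can be driven independently.
  const activeJoints = Object.values(robot.joints).filter((j) => j.jointType !== 'fixed');
  loadedRobots.push({ robot, activeJoints });
}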

The robots shown in the figure from right to left are the IRB2400 model robotic arm from ABB, the P5 model from the panda series of Elephant Robotics, and the KR3 R540 model robotic arm from KUKA. The industrial metaverse can achieve more efficient, safe, and flexible industrial operations and management. This multi-robot collaborative simulation system not only enhances the efficiency and safety of collaborative operations but also reduces costs and risks. Additionally, it improves training effectiveness and fosters technological innovation, providing robust technical support for the transition from Industry 4.0 to Industry 5.0.

4. System Communication Implementation Plan

4.1. Communication between Physical Robot and ROS

There is bidirectional communication between the physical robot and the ROS, as shown in Figure 7.

ROS nodes receive key information about the robot, such as the state of the robotic arm, joint positions, motion trajectories, and timestamps, by subscribing to ROS topics. These nodes can include controller nodes of the physical robot, which use components such as a controller bridge and sensor drivers to publish the robot’s control data and sensor data to the corresponding ROS topics [38].

After these data are published through the topics, other ROS nodes can subscribe to them to receive the data. For example, a processing node may load the robot’s URDF model, obtain model information such as joint configurations, default states, and collision detection data, and compute coordinate transformations based on the received data. Additionally, this processing node can create a virtual environment and use technologies like ROS Bridge to customize topic information, generating a virtual robot model and a user simulation interface to achieve mapping between the physical robot and the virtual model.

Users interact with the virtual model through the simulation interface, allowing them to monitor and control the physical robot mapped to the virtual model. The ROS uses algorithms like Cartesian path planning and forward/inverse kinematics based on topic information to plan the motion trajectories of the robot’s joints, ensuring that the trajectories conform to the preset thresholds of the robot model, thereby controlling the physical robot to perform actions according to the planned movements.

The process is represented in pseudocode as follows:

Pseudocode 1: Figure 7 process
1: Initialize ROS node and set up communication
 Initialize ROS node 'robot_control_node'
2: Robot control loop
 Loop until ROS node is shut down
3: Get robot state and sensor data from the real environment
 sensor_data = get_sensor_data_from_hardware()
 joint_state = get_joint_state_from_controller()
4: Process sensor data and calculate motion dynamics
 motion_dynamics = calculate_motion_dynamics(sensor_data, joint_state)
5: ROS Bridge handles communication with the virtual environment
 Send sensor data and joint state to ROS Bridge
 Receive VR environment data from ROS Bridge
6: Update the VR environment with the latest robot state
 vr_data = get_vr_data_from_ros_bridge()
 update_vr_environment(vr_data)
7: Web-based VR engine updates the scene using Three.js
 update_web_vr_scene(vr_data)
8: Sync URDF model with the real robot state
 urdf_model = load_urdf_model('/path/to/urdf')
 update_urdf_model(urdf_model, joint_state)

Through this method, real-time bidirectional communication is achieved between the virtual model in the ROS and the physical robot, effectively connecting the virtual and physical worlds and enhancing the flexibility and application range of the robot system.

One of the obstacles to implementing this solution is that some robot companies do not provide corresponding URDF models for their robots, requiring users to convert CAD models into URDF models. For example, KUKA only provides CAD models.

4.2. Communication between ROS and WEB Platform

The connection between the ROS and the WEB platform is established as shown in Figure 8.

The URDF model stored on the ROS parameter server is converted into a 3D model usable by Three.js, and relevant configuration data, such as robot arm parameters, rotation axis parameters, and collision information, are transmitted alongside it. The system uses ROSBridge to build a communication bridge between the ROS and the WEB application, converting ROS messages into a format suitable for the WEB [39]. Then, utilizing Three.js, a JavaScript library for rendering 3D models and environments on web pages, detailed and interactive robots and their operating environments are created.

With the help of rosjs, the system can achieve basic interactions with the ROS, while ros3djs is used for 3D visualization of robots and environments [40], providing a real-time, dynamic, and interactive robot control and simulation environment on the WEB platform.
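A minimal sketch following the canonical ros3djs pattern is shown below, assuming a running rosbridge WebSocket server (e.g., started with roslaunch rosbridge_server rosbridge_websocket.launch) and tf2_web_republisher; the server address and mesh path are placeholders.

const ros = new ROSLIB.Ros({ url: 'ws://localhost:9090' }); // rosbridge WebSocket server

// 3D viewport on the page.
const viewer = new ROS3D.Viewer({ divID: 'robot-view', width: 800, height: 600, antialias: true });

// Streams coordinate transforms from the ROS to the browser.
const tfClient = new ROSLIB.TFClient({
  ros: ros, fixedFrame: '/base_link', angularThres: 0.01, transThres: 0.01, rate: 10.0
});

// Loads the robot description from the parameter server and renders it;
// 'path' is the URL from which the referenced mesh files are served.
const urdfClient = new ROS3D.UrdfClient({
  ros: ros, tfClient: tfClient, path: 'http://localhost:8000/meshes/', rootObject: viewer.scene
});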

The process is represented in pseudocode 2 as follows:

Pseudocode 2: Figure 8 process
1: Initialize ROS node and set up communication
 Initialize ROS node 'web_robot_control_node'
2: Main function for the Web platform
 Function web_platform_main():
   Initialize Web page
   Initialize rosjs and ros3djs
   Register keyboard control event listeners
   Set up WebSocket connection
   Connect to WebSocket server
   On receiving user input:
     handle_user_input(input_data)
   On receiving ROS messages:
     update_web_page(ros_data)
3: Function to handle user input
 Function handle_user_input(input_data):
   Parse the user input
   Build the corresponding ROS control message
   Publish the control message to the '/robot/control' topic
4: Function to update the Web page
 Function update_web_page(ros_data):
   Update the robot state shown on the Web page
   Use Three.js to update the rendering of the 3D model
5: Main function for the ROS
 Function ros_system_main():
   Initialize ROSBridge and tf2_web_republisher
   Set up ROS topic subscriptions and publications
   Loop until ROS node is shut down:
     Get robot state and sensor data
     Publish sensor data and joint state to ROS topics
     Get control commands and send them to the physical robot
   Shut down ROS node
6: Set up ROS topic subscriptions and publications
 Function set_up_ros_topic_subscriptions_and_publications():
   Subscribe to the '/robot/state' topic
   Create a publisher for the '/robot/control' topic

After completing these two communication steps, information exchange between the WEB page and the physical robot is realized. Through this integration, we can create an interactive, multi-user industrial robot simulation platform in the metaverse, which is not only suitable for education and training but also provides strong support for the research and application of industrial robots.

This integration method demonstrates the great potential of combining advanced WEB technology with industrial robot simulation, offering new possibilities for virtual laboratories and the planning and testing of complex tasks.

5. Virtual Robot URDF Model and Loading

5.1. URDF Model Classes

Three.js does not directly support loading URDF models, so it requires parsing the URDF model for loading. The system introduces modules such as STLLoader and ColladaLoader in Three.js, which are used to load different types of 3D model files. STLLoader is used to load STL format files, a file format commonly used in 3D printing and computer-aided design, representing 3D models through triangular facets. ColladaLoader is used to load Collada format files, which contain information about the model’s geometry, texture, lighting, and even animations and physical effects. Additionally, the system includes custom URDF model classes such as URDFBase, URDFCollider, URDFVisual, URDFLink, URDFJoint, and URDFMimicJoint to handle different components of the model. Their respective functions are shown in Table 1.

Each class inherits from Object3D, which means they can all be operated and transformed as 3D objects in the Three.js environment.

The URDFBase class provides common properties and methods for all other URDF-related classes. The functions of URDFCollider, URDFVisual, and URDFLink classes are mainly implemented by inheriting the methods of Object3D. The URDFJoint and URDFMimicJoint classes extend the functionalities of the base class, adding properties and methods to control joint behaviors. The URDFRobot class acts as a top-level container, managing all links, joints, and other components, and provides methods for obtaining and setting joint values.
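The hierarchy can be sketched as follows; the property names follow Table 1 and the description above, while the actual implementation carries additional bookkeeping (joint origins, mimic factors, and so on).

class URDFBase extends THREE.Object3D { }
class URDFCollider extends URDFBase { }   // collision geometry container
class URDFVisual extends URDFBase { }     // render geometry container
class URDFLink extends URDFBase { }       // one rigid body of the robot

class URDFJoint extends URDFBase {
  constructor() {
    super();
    this.jointType = 'fixed';             // revolute, prismatic, fixed, ...
    this.axis = new THREE.Vector3(0, 0, 1);
    this.limit = { lower: 0, upper: 0 };
    this.angle = 0;
  }
  setJointValue(value) {
    // Clamp to the URDF limits, then rotate this node about its joint axis;
    // child links inherit the rotation through the scene graph.
    this.angle = Math.min(Math.max(value, this.limit.lower), this.limit.upper);
    this.quaternion.setFromAxisAngle(this.axis, this.angle);
  }
}

class URDFRobot extends URDFLink {        // top-level container
  constructor() { super(); this.links = {}; this.joints = {}; }
  setJointValue(name, value) { if (this.joints[name]) this.joints[name].setJointValue(value); }
}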

5.2. URDF Model Conversion

The process of converting a URDF model into the URDFRobot’s visual and collision model framework is shown in Figure 9. The system parses the URDF file, which defines the robot’s structure and properties in XML format. Next, the system loads the STL or Collada models referenced by the URDF file, which describe the geometry of each robot component in detail. Based on these geometric data, the system creates Visual objects and Collider objects, used for rendering the robot’s appearance and performing collision detection in physical simulations, respectively.

Then, the system constructs the URDFRobot object, which contains all the robot’s parts and their relationships. To simulate the connections and movements between the robot’s parts, the system creates URDFLink and URDFJoint objects, which reflect the physical properties of the links and the motion constraints of the joints. The system also processes and generates the data needed for the interactive interface and analyzes the motion range of the robot joints to ensure accurate reflection of the robot’s motion limits in the simulation.

Finally, the entire robot model is converted and exported in a Three.js-compatible format, allowing the robot model to be rendered on the WEB platform using the Three.js library, and thereby providing users with a visual operation and interaction experience.
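A compressed sketch of this pipeline, using the browser’s DOMParser together with the classes of Section 5.1, is shown below. loadMesh is a hypothetical helper that selects STLLoader or ColladaLoader by file extension, and the parent–child wiring of links and joints is omitted for brevity.

async function parseURDF(urdfText) {
  const xml = new DOMParser().parseFromString(urdfText, 'text/xml');
  const robot = new URDFRobot();

  // One URDFLink per <link>, loading any referenced STL/Collada geometry.
  for (const linkEl of xml.querySelectorAll('robot > link')) {
    const link = new URDFLink();
    link.name = linkEl.getAttribute('name');
    const mesh = linkEl.querySelector('visual geometry mesh');
    if (mesh) link.add(await loadMesh(mesh.getAttribute('filename'))); // hypothetical helper
    robot.links[link.name] = link;
  }

  // One URDFJoint per <joint>, recording its type and motion limits.
  for (const jointEl of xml.querySelectorAll('robot > joint')) {
    const joint = new URDFJoint();
    joint.name = jointEl.getAttribute('name');
    joint.jointType = jointEl.getAttribute('type');
    const limit = jointEl.querySelector('limit');
    if (limit) joint.limit = {
      lower: Number(limit.getAttribute('lower')),
      upper: Number(limit.getAttribute('upper'))
    };
    robot.joints[joint.name] = joint;
  }
  return robot;
}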

5.3. WEB Interaction of Robot Model

After converting the URDF model to the Three.js format, the next step is to add the instance of the converted URDFRobot class to the Three.js 3D scene. Then, using the Three.js rendering engine, the robot model is rendered into the ‘<canvas>’ element on the webpage, providing users with a visual interactive interface.

Additionally, by using the ‘setJointValue’ method of the URDFJoint class, the system can receive and apply the position data of the robotic arm from the ROS, updating these data in real time to the joints of the robot model. This allows the dynamic state of the industrial robot to be reflected on the webpage in real time, enabling users to intuitively observe every movement of the robot during task execution.
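Pulling these steps together, a minimal rendering setup can be sketched as follows, where robot is the converted URDFRobot instance:

const canvas = document.querySelector('canvas');
const renderer = new THREE.WebGLRenderer({ canvas: canvas, antialias: true });
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, canvas.clientWidth / canvas.clientHeight, 0.1, 50);
camera.position.set(2, 2, 2);
camera.lookAt(0, 0, 0);

scene.add(new THREE.AmbientLight(0xffffff, 0.8));
scene.add(robot);

// Joint values received from the ROS (Section 6.3) are applied with
// setJointValue before each frame is rendered.
renderer.setAnimationLoop(() => renderer.render(scene, camera));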

6. Experimental Environment Deployment

6.1. Industrial Robot Simulation

This system utilizes the ABB IRB2400 robot (ABB Ltd.) in RobotStudio to simulate a physical industrial robot. RobotStudio, developed by ABB Robotics, can fully reproduce the behavior of a physical industrial robot. After the ABB robot action code is loaded via the teach pendant, the robotic arm begins executing the actions corresponding to the code. The overall effect is shown in Figure 10.

6.2. Industrial Robot Connection to ROS

In the Ubuntu 18.04 operating system, enter the following command: roslaunch abb_irb2400_moveit_config moveit_planning_execution.launch sim:=false robot_ip:=xxx.xxx.xxx.xxx, where ‘xxx.xxx.xxx.xxx’ is the IP address of the specific robot to be connected. After a successful connection, a connection success message will be displayed on the RobotStudio teach pendant.

In this step, the ROS provides a communication interface for data exchange between the robot and the computer. By invoking drivers, the ROS sends control commands to the robot’s motors and actuators and receives the robot’s status information.

6.3. ROS Connection to WEB

Load the ROSlib library in the WEB. ROSlib is a JavaScript client library that enables communication with the ROS in the browser. By using ROSlib, a new ‘ROS’ object can be created, which will connect to the server running the ROS via the WebSocket protocol:

var ros = new ROSLIB.Ros({url: 'ws://<your-ros-bridge-server>:<port>'});

Enter the IP address of the ROS to connect to the robot already in communication with the ROS. Next, create a new ‘Topic’ object for the robot’s status information. It subscribes to the corresponding topic in the ROS as a listener to receive real-time data from the robot, as shown in Figure 11.

var listener = new ROSLIB.Topic({
  ros: ros,
  name: '/robot_joint_states',
  messageType: 'sensor_msgs/JointState'
});

listener.subscribe(function(message) {
  // Apply the received joint positions to the Three.js model here,
  // e.g., via the setJointValue method described in Section 5.3.
});

The ROS uploads the robot’s joint positions, speeds, torques, temperatures, status data, target positions, path planning, motion commands, and timestamps for synchronizing robot status and motion trajectory data. This information allows the model in the WEB to simulate the industrial robot’s status in real time.

The collaboration of the entire system relies on these seamless information flows from the robot to the ROS, and then to the WEB application, and the information is finally displayed in 3D form on the user interface through Three.js. This lays the foundation for information flow synchronization for the future industrial metaverse. One of the obstacles encountered during this part of the research is the need to first examine the packets sent by the physical robot to determine which packet contains the arm position parameters and which one contains the movement trajectory of the arm. If the manufacturer’s development documentation does not provide this information, it will be necessary to identify it through manual comparison.

7. System Performance Analysis

To validate the system’s performance, a comparative analysis was conducted on model loading times and on response times for driving commands, against the virtual simulation and communication software of two established industrial robotics companies, KUKA and ABB. The robot models used for testing were the KUKA KR150 and ABB IRB2400, with benchmarking scripts employed as the testing tool.

7.1. Model Loading Time Analysis

Figure 12 compares model loading in our system with that in KUKA Sim Pro (version 1.1) and RobotStudio (version 6.08), the robot simulation software from KUKA Robotics GmbH and ABB Ltd., respectively.

Under the condition of loading the same model, each platform loaded the model 25 times, and the loading times were recorded by the testing tool. The test results are shown in Figure 13 and Figure 14.

According to the test results, our system’s average loading time for the KUKA KR150 model is 2.953 s, compared with 5.426 s for KUKA Sim Pro, resulting in a loading efficiency improvement of approximately 45.58%. The average loading time for the ABB IRB2400 model is 4.9224 s, compared with 6.5384 s for RobotStudio, resulting in a loading efficiency improvement of about 24.72%.

The significant improvement in model loading efficiency can be attributed to our system’s smaller codebase, which excludes functionalities such as offline programming, signal analyzers, and sales tools, focusing solely on model simulation. This specialization allows for greater efficiency in the loading of the same models compared with traditional-brand robot simulation software.

7.2. Model Drive Response Time

For the response time test, each model’s actions were divided into three groups, driving axes 1, 2, and 3, respectively. The model’s three-axis markings and driving directions are shown in Figure 15.

Each group of actions was tested 25 times. In the same environment, the control command issuance for the models was tested. The response time test results for our system’s model driving are shown in Figure 16 and Figure 17.

According to the test results, the average time for driving KUKA KR150 axes 1, 2, and 3 with our system is 43.72 ms, 46.48 ms, and 54.08 ms, compared with KUKA Sim Pro’s 76.92 ms, 79.64 ms, and 90.08 ms, resulting in response time improvements of approximately 43.16%, 41.64%, and 39.96%, respectively. The overall average response time improvement is about 41.50%.

For the ABB IRB2400, the average time for driving axes 1, 2, and 3 with our system is 62.76 ms, 64.12 ms, and 57.12 ms, compared with RobotStudio’s 85.32 ms, 89.48 ms, and 83.44 ms, resulting in response time improvements of approximately 26.44%, 28.34%, and 31.54%, respectively. The overall average response time improvement is about 28.75%.

The improvement in response times can be attributed to the limitations of traditional simulation software’s teach pendants, where commands must pass through protocols such as TCP/IP, EtherNet/IP, and CAN bus, or vendor-specific languages such as RAPID, before reaching the model. In contrast, our system transmits commands directly via WebSocket, resulting in faster model driving response times.

8. Discussion

This paper explores methods to optimize the application and experimentation of industrial robots in the context of the industrial metaverse through WEB and ROS technologies. The following discusses system performance and future developments:

  • System Performance

1. Integration and Operational Efficiency:

The system combines WEB and ROS technologies, enabling users to remotely access and control industrial robots through a browser. The ROS, serving as middleware, provides robust hardware abstraction, device control, and messaging capabilities, while WEB technology makes the user interface more friendly and easy to operate. This combination not only enhances operational convenience but also significantly reduces maintenance costs and operational risks.

2. Virtual and Physical Integrated Experimental Environment:

By creating a virtual environment, this system allows users to conduct operational experiments in a safe and risk-free setting. This virtual–physical integration model offers a dynamic and interactive operational experience, enabling users to perform and test industrial robot operations in a near-realistic environment, enhancing the realism and practicality of the experiments.

3. Multi-Industrial Robot Simulation:

The system supports the simultaneous loading and simulation of multiple industrial robots, greatly improving the flexibility and scalability of experiments. Users can simulate the collaborative operations of multiple robots in a virtual environment, testing complex industrial processes and tasks, thus increasing the efficiency and effectiveness of the experiments.

  • Industrial Metaverse Prospects

1. Global Cooperation and Collaboration:

The industrial metaverse provides an immersive and interactive operational platform, allowing technicians worldwide to interact and collaborate in real time within the same virtual space. With the application of WEB technology, users can access the system through a browser without the need for special hardware, achieving seamless cross-platform and cross-device collaboration. This significantly lowers the barriers to remote collaboration, promoting global technical exchange and cooperation.

2. Enhancing Industrial Efficiency and Safety:

The industrial metaverse, through the application of virtual reality (VR) and augmented reality (AR) technologies, creates a safe and efficient environment for industrial robot operations and training. Users can conduct operational training in a virtual environment, simulating complex industrial scenarios to enhance operational skills and responsiveness. All operations are carried out in a controlled and safe virtual environment, greatly reducing the risk of equipment damage and personal injury.

3. Innovation and Application Prospects:

The application of the industrial metaverse extends beyond production management and optimization to include education and training, remote collaboration, and supply chain management. As technology continues to advance, the industrial metaverse will achieve deeper applications in fields such as industrial manufacturing and healthcare. This study provides theoretical and practical support for this transformation, anticipating that industrial metaverse technology will continue to drive the application of industrial robots towards more efficient, safe, and economical directions, ultimately achieving comprehensive innovation.

9. Conclusions

This paper discusses how to improve the application and experimentation of industrial robots in the industrial metaverse through WEB + ROS technology. The core technologies of the system include the ROS (Robot Operating System), which provides robust backend support for robot control and data exchange, and WEB technology, which makes the system interface user-friendly and easy to operate. By integrating these technologies into the industrial metaverse platform, the system can offer a seamless and dynamic operational experience, allowing users to perform industrial robot operations and tests in an environment close to reality. Additionally, the introduction of virtual–real integration and human–machine collaboration technologies enables users to conduct robot operations and collaboration in a virtual environment, achieving more efficient training and operations.

By comparing traditional industrial robot simulation software (such as ABB RobotStudio v6.08 and KUKA Sim Pro v1.1), it has been demonstrated that the virtual–real interactive system based on ROS and WEB technology has significant advantages in terms of real-time performance, cross-platform compatibility, scalability, WEBXR integration, and economic performance. The system provided a design framework for a cross-platform multi-industrial robot interactive system, realized the WEB visualization of URDF models, improved the system’s model loading and driving response performance, and supported multi-robot collaborative simulation. Virtual–real integration technology has achieved seamless connection between virtual and real environments, enabling users to perform simulated operations in a virtual environment, enhancing training effectiveness and operational safety.

Future research should focus on further optimizing system performance, expanding the range of robot support, enhancing the human–machine interaction experience, and promoting the development of the industrial metaverse. In the short term, the system and device support will be optimized. In the mid-term, virtual reality technology will be deeply integrated, and advanced functional modules will be developed. In the long term, a complete industrial metaverse ecosystem will be constructed, and industry standards will be promoted. Additionally, challenges such as technology integration, real-time performance assurance, data security and privacy protection, and user training and promotion need to be addressed. Through the continuous development of virtual–real integration and human–machine collaboration technologies, it is expected that more efficient, safer, and economical industrial robot applications will be achieved, driving the transition from Industry 4.0 to Industry 5.0.

This system provides new solutions for industrial robot applications and experiments and has established a basic experimental platform based on this concept. In the future, industrial metaverse technology will achieve deeper applications in industrial manufacturing, healthcare, and other fields. This paper’s research aims to provide theoretical and practical support for this transition. Through in-depth analysis and practical verification, we expect that industrial metaverse technology can continuously promote industrial robot applications towards more efficient, safer, and economical directions, ultimately realizing the comprehensive evolution from Industry 4.0 to Industry 5.0. The further application of virtual–real integration and human–machine collaboration technologies will make industrial operations more intelligent and collaborative, pioneering a new era in industrial robot technology.

Author Contributions

Conceptualization, J.Y. and Z.W.; methodology, J.Y.; software, Z.W.; validation, J.Y., Z.W. and Y.Y.; formal analysis, W.W.; investigation, Z.W.; resources, J.Y.; data curation, W.W.; writing—original draft preparation, Z.W.; writing—review and editing, Z.W.; visualization, J.Y.; supervision, N.L.; project administration, N.L.; funding acquisition, J.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was jointly funded by the Science and Technology Project of Jiangsu Provincial Administration for Market Regulation in 2023 [KJ2023053] and the Jiangsu Petrochemical Process Key Equipment Digital Twin Technology Engineering Research Center Open Project [DTEC202301].

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Ordieres-Meré, J.; Gutierrez, M.; Villalba-Díez, J. Toward the industry 5.0 paradigm: Increasing value creation through the robust integration of humans and machines. Comput. Ind. 2023, 150, 103947. [Google Scholar] [CrossRef]
  2. Kusiak, A. Smart manufacturing. In Springer Handbook of Automation; Springer International Publishing: Cham, Switzerland, 2023; pp. 973–985. [Google Scholar]
  3. Park, H.; Ahn, D.; Lee, J. Towards a metaverse workspace: Opportunities, challenges, and design implications. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; pp. 1–20. [Google Scholar]
  4. Javaid, M.; Haleem, A.; Suman, R. Digital twin applications toward industry 4.0: A review. Cogn. Robot. 2023, 3, 71–92. [Google Scholar] [CrossRef]
  5. RobotStudio. Available online: https://new.abb.com/products/robotics/es/robotstudio (accessed on 14 July 2024).
  6. KUKA Sim Pro. Available online: https://www.kuka.com/en-hu/products/robotics-systems/software/simulation-planning-optimization/kuka_sim/ (accessed on 14 July 2024).
  7. MyStudio. Available online: https://www.elephantrobotics.com/download/ (accessed on 14 July 2024).
  8. Baek, E.T.; Im, D.Y. ROS-based unmanned mobile robot platform for agriculture. Appl. Sci. 2022, 12, 4335. [Google Scholar] [CrossRef]
  9. Kaarlela, T.; Padrao, P.; Pitkäaho, T.; Pieskä, S.; Bobadilla, L. Digital twins utilizing XR-technology as robotic training tools. Machines 2022, 11, 13. [Google Scholar] [CrossRef]
  10. Xing, Y.; Shell, J.; Fahy, C.; Xie, T.; Kwan, H.Y.; Xie, W. Web XR user interface research: Design 3D layout framework in static websites. Appl. Sci. 2022, 12, 5600. [Google Scholar] [CrossRef]
  11. Guertler, M.R.; Schneider, D.; Heitfeld, J.; Sick, N. Analysing Industry 4.0 technology-solution dependencies: A support framework for successful Industry 4.0 adoption in the product generation process. Res. Eng. Des. 2024, 35, 115–136. [Google Scholar] [CrossRef]
  12. Kong, X.T.; Yang, X.; Huang, G.Q.; Luo, H. The impact of industrial wearable system on industry 4.0. In Proceedings of the 2018 IEEE 15th International Conference on Networking, Sensing and Control (ICNSC), Zhuhai, China, 27–29 March 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–6. [Google Scholar]
  13. Baker, S.; Waycott, J.; Carrasco, R.; Hoang, T.; Vetere, F. Exploring the design of social VR experiences with older adults. In Proceedings of the 2019 on Designing Interactive Systems Conference, San Diego, CA, USA, 23–28 June 2019; pp. 303–315. [Google Scholar]
  14. Florescu, A.; Barabas, S.A. Modeling and simulation of a flexible manufacturing system—A basic component of industry 4.0. Appl. Sci. 2020, 10, 8300. [Google Scholar] [CrossRef]
  15. Pratticò, F.G.; Lamberti, F. Towards the adoption of virtual reality training systems for the self-tuition of industrial robot operators: A case study at KUKA. Comput. Ind. 2021, 129, 103446. [Google Scholar] [CrossRef]
  16. Hu, F.; Wang, W.; Zhou, J. Petri nets-based digital twin drives dual-arm cooperative manipulation. Comput. Ind. 2023, 147, 103880. [Google Scholar] [CrossRef]
  17. Wang, X.; Yu, H.; McGee, W.; Menassa, C.C.; Kamat, V.R. Enabling Building Information Model-driven human-robot collaborative construction workflows with closed-loop digital twins. Comput. Ind. 2024, 161, 104112. [Google Scholar] [CrossRef]
  18. Xia, P.; Xu, F.; Song, Z.; Li, S.; Du, J. Sensory augmentation for subsea robot teleoperation. Comput. Ind. 2023, 145, 103836. [Google Scholar] [CrossRef]
  19. Vishwarupe, V.; Joshi, P.; Maheshwari, S.; Kuklani, P.; Shingote, P.; Pande, M.; Pawar, V.; Deshmukh, A. Exploring human computer interaction in industry 4.0. In AI, IoT, Big Data and Cloud Computing for Industry 4.0; Springer International Publishing: Cham, Switzerland, 2023; pp. 21–38. [Google Scholar]
  20. Schröder, J.H.; Schacht, D.; Peper, N.; Hamurculu, A.M.; Jetter, H.C. Collaborating across realities: Analytical lenses for understanding dyadic collaboration in transitional interfaces. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; pp. 1–16. [Google Scholar]
  21. Martínez-Gutiérrez, A.; Díez-González, J.; Perez, H.; Araújo, M. Towards industry 5.0 through metaverse. Robot. Comput.—Integr. Manuf. 2024, 89, 102764. [Google Scholar] [CrossRef]
  22. Albarrak, L.; Metatla, O.; Roudaut, A. Using virtual reality and co-design to study the design of large-scale shape-changing interfaces. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; pp. 1–17. [Google Scholar]
  23. Silvestri, L. CFD modeling in Industry 4.0: New perspectives for smart factories. Procedia Comput. Sci. 2021, 180, 381–387. [Google Scholar] [CrossRef]
  24. Abich, J.; Parker, J.; Murphy, J.S.; Eudy, M. A review of the evidence for training effectiveness with virtual reality technology. Virtual Real. 2021, 25, 919–933. [Google Scholar] [CrossRef]
  25. Weyer, S.; Meyer, T.; Ohmer, M.; Gorecky, D.; Zühlke, D. Future modeling and simulation of CPS-based factories: An example from the automotive industry. IFAC-PapersOnLine 2016, 49, 97–102. [Google Scholar] [CrossRef]
  26. Berg, L.P.; Vance, J.M. Industry use of virtual reality in product design and manufacturing: A survey. Virtual Real. 2017, 21, 1–17. [Google Scholar] [CrossRef]
  27. Campeau-Lecours, A.; Côté-Allard, U.; Vu, D.S.; Routhier, F.; Gosselin, B.; Gosselin, C. Intuitive adaptive orientation control for enhanced human–robot interaction. IEEE Trans. Robot. 2018, 35, 509–520. [Google Scholar] [CrossRef]
  28. Villani, V.; Pini, F.; Leali, F.; Secchi, C. Survey on human–robot collaboration in industrial settings: Safety, intuitive interfaces and applications. Mechatronics 2018, 55, 248–266. [Google Scholar] [CrossRef]
  29. Tao, F.; Zhang, H.; Liu, A.; Nee, A.Y.C. Digital twin in industry: State-of-the-art. IEEE Trans. Ind. Inform. 2018, 15, 2405–2415. [Google Scholar] [CrossRef]
  30. Ladosz, P.; Coombes, M.; Smith, J.; Hutchinson, M. A generic ROS-based system for rapid development and testing of algorithms for autonomous ground and aerial vehicles. In Robot Operating System (ROS): The Complete Reference; Springer: Berlin/Heidelberg, Germany, 2019; Volume 3, pp. 113–153. [Google Scholar]
  31. Bai, H.; Sasikumar, P.; Yang, J.; Billinghurst, M. A user study on mixed reality remote collaboration with eye gaze and hand gesture sharing. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–13. [Google Scholar]
  32. Brudy, F.; Budiman, J.K.; Houben, S.; Marquardt, N. Investigating the role of an overview device in multi-device collaboration. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–13. [Google Scholar]
  33. Grandi, J.G.; Debarba, H.G.; Maciel, A. Characterizing asymmetric collaborative interactions in virtual and augmented realities. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 127–135. [Google Scholar]
  34. Bezemer, M.M.; Broenink, J.F. Connecting ROS to a real-time control framework for embedded computing. In Proceedings of the 2015 IEEE 20th Conference on Emerging Technologies & Factory Automation (ETFA), Luxembourg, 8–11 September 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1–6. [Google Scholar]
  35. Nuratch, S. Design and implementation of microcontroller-based platform-independent Real-time WebSocket Server for monitoring and control applications. In Proceedings of the 2017 14th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON), phu*ket, Thailand, 27–30 June 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 624–627. [Google Scholar]
  36. Oliveira, M.; Castro, A.; Madeira, T.; Pedrosa, E.; Dias, P.; Santos, V. A ROS framework for the extrinsic calibration of intelligent vehicles: A multi-sensor, multi-modal approach. Robot. Auton. Syst. 2020, 131, 103558. [Google Scholar] [CrossRef]
  37. Anand, H.; Rees, S.A.; Chen, Z.; Poruthukaran, A.J.; Bearman, S.; Antervedi, L.G.P.; Das, J. OpenUAV cloud testbed: A collaborative design studio for field robotics. In Proceedings of the 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE), Lyon, France, 23–27 August 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 724–731. [Google Scholar]
  38. Joseph, L.; Johny, A. Getting started with Ubuntu Linux for robotics. In Robot Operating System (ROS) for Absolute Beginners: Robotics Programming Made Easy; Apress: Berkeley, CA, USA, 2022; pp. 1–52. [Google Scholar]
  39. Krūmiņš, D.; Schumann, S.; Vunder, V.; Põlluäär, R.; Laht, K.; Raudmäe, R.; Aabloo, A.; Kruusamäe, K. Open remote web lab for learning robotics and ROS with physical and simulated robots in an authentic developer environment. IEEE Trans. Learn. Technol. 2024, 17, 1325–1338. [Google Scholar] [CrossRef]
  40. Myllymäki, M.; Hakala, I. Distance learning with hands-on exercises: Physical device vs. simulator. In Proceedings of the 2022 IEEE Frontiers in Education Conference (FIE), Uppsala, Sweden, 8–11 October 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–6. [Google Scholar]

Figure 1. Industrial metaverse evolution: virtual–physical trajectory diagram.

Figure 2. Concept diagram of the WEB-based industrial metaverse.

Figure 3. Framework diagram of the WEB simulation system for connecting industrial robots.

Figure 4. Robotic arm posture display and workflow.

Figure 5. Coordinate axis switching and implementation workflow.

Figure 6. Multi-terminal WEB display and implementation workflow.

Figure 7. Bidirectional communication diagram between the physical robot and ROS.

Figure 8. Communication diagram between ROS and the WEB platform.
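As Figure 8 indicates, the WEB platform talks to ROS over the rosbridge WebSocket interface. The sketch below, written against the roslib JavaScript client, illustrates one plausible wiring for this link; it is a minimal example under assumed names: the rosbridge URL (ws://localhost:9090) and the command topic (/web_joint_command) are placeholders, not values taken from this system.

```typescript
// Minimal sketch of WEB <-> ROS communication through rosbridge.
// The WebSocket URL and topic names are assumptions for illustration.
import ROSLIB from 'roslib';

const ros = new ROSLIB.Ros({ url: 'ws://localhost:9090' });
ros.on('connection', () => console.log('Connected to rosbridge'));
ros.on('error', (err) => console.error('rosbridge error:', err));

// ROS -> WEB: receive joint states from the robot driver in real time.
const jointStates = new ROSLIB.Topic({
  ros,
  name: '/joint_states',
  messageType: 'sensor_msgs/JointState',
});
jointStates.subscribe((msg) => {
  const { name, position } = msg as unknown as { name: string[]; position: number[] };
  name.forEach((joint, i) => updateVirtualJoint(joint, position[i]));
});

// WEB -> ROS: publish a command composed in the browser (hypothetical topic).
const command = new ROSLIB.Topic({
  ros,
  name: '/web_joint_command',
  messageType: 'sensor_msgs/JointState',
});
function sendJointCommand(names: string[], positions: number[]): void {
  command.publish(new ROSLIB.Message({ name: names, position: positions }));
}

// Stand-in for the routine that drives the rendered three.js model.
function updateVirtualJoint(joint: string, value: number): void {
  /* update the corresponding joint of the virtual robot */
}
```

Because rosbridge serializes messages as JSON over a single WebSocket, the same subscription code runs unchanged in desktop and mobile browsers, which is what makes the multi-terminal display of Figure 6 possible.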

Figure 9. Process flow diagram for URDF model conversion.

Figure 10. Overall performance and motion diagram of the industrial robot.

Figure 11. Diagram of real-time data reception from the robot.

Figure 12. Robot model loading.
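For the loading step in Figure 12, a URDF description can be parsed directly in the browser and attached to a three.js scene. The sketch below uses the open-source urdf-loader package as a stand-in for the paper's custom loader classes (see Table 1); the asset paths and the joint name are hypothetical.

```typescript
// Minimal sketch of browser-side URDF loading with the open-source
// urdf-loader package; paths and the joint name are placeholders.
import * as THREE from 'three';
import URDFLoader from 'urdf-loader';

const scene = new THREE.Scene();
const loader = new URDFLoader();

// Map package:// URIs in the URDF to wherever the web app serves the meshes.
loader.packages = { my_robot: '/assets/my_robot' };

loader.load('/assets/my_robot/urdf/robot.urdf', (robot) => {
  scene.add(robot);
  // Drive one joint of the freshly loaded model (hypothetical joint name).
  robot.joints['joint_1']?.setJointValue(Math.PI / 4);
});
```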

Figure 13. Comparison diagram of model loading time with KUKA Sim Pro.

Figure 14. Comparison diagram of model loading time with ABB RobotStudio.

Figure 15. Model driving direction diagram (KUKA, top left; ABB, bottom left; system, right).

Figure 16. Comparison chart of response time with KUKA Sim Pro driver.

Figure 17. Comparison chart of response time with ABB RobotStudio driver.


Table 1. Custom URDF class functions.

| Class Name | Inheritance | Function | Characteristics |
| --- | --- | --- | --- |
| URDFBase | Object3D | Provide properties and methods for other classes | Store URDF node data and names |
| URDFCollider | URDFBase | Represent the collision detection part of the robot model | Specialized for collision detection |
| URDFVisual | URDFBase | Represent the visual part of the robot model | Render the robot model |
| URDFLink | URDFBase | Represent the links in the robot model | Represent the rigid body parts of the robot |
| URDFJoint | URDFBase | Manage the robot's joints | Support and control various joint types |
| URDFMimicJoint | URDFJoint | Simulate the behavior of other joints | Joint movements for dependency relations |
| URDFRobot | URDFLink | Represent the complete robot model | Overall management of the robot model; obtain framework and set joint values |
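Taken together, the classes in Table 1 form a small inheritance tree rooted in three.js's Object3D. The TypeScript skeleton below restates that hierarchy; the member names and method bodies are illustrative guesses inferred from the table, not the authors' exact implementation.

```typescript
import { Object3D } from 'three';

// Skeleton of the Table 1 class hierarchy; members are illustrative only.
class URDFBase extends Object3D {
  urdfNode: Element | null = null; // stores the raw URDF XML node
  urdfName = '';                   // stores the node name
}

class URDFCollider extends URDFBase {} // collision-detection part of the model
class URDFVisual extends URDFBase {}   // visual part used for rendering
class URDFLink extends URDFBase {}     // a rigid-body part of the robot

class URDFJoint extends URDFBase {
  jointType: 'fixed' | 'continuous' | 'revolute' | 'prismatic' = 'fixed';
  angle = 0;
  setJointValue(value: number): boolean {
    this.angle = value; // a real loader would clamp to limits and update transforms
    return true;
  }
}

class URDFMimicJoint extends URDFJoint {
  mimicJoint = '';  // name of the joint being followed
  multiplier = 1;   // value = multiplier * leader + offset
  offset = 0;
}

class URDFRobot extends URDFLink {
  joints: Record<string, URDFJoint> = {};
  links: Record<string, URDFLink> = {};
  setJointValue(name: string, value: number): boolean {
    return this.joints[name]?.setJointValue(value) ?? false;
  }
}
```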

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.


© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).