WiMi Developed an Assistive Robotic Technology and a Control Approach Based on Hybrid BCI

WiMi Hologram Cloud, a leading global Hologram Augmented Reality (“AR”) technology provider, announced that it has developed an assistive robotic technology and a control approach based on a hybrid brain-computer interface (BCI). The technology combines an eye tracker, an EEG recording device, a webcam, and a robotic arm so that the user can accurately control the movement of the robotic arm through a hybrid gaze-BCI.


This assistive robot control technology lets users control the movement of the robotic arm’s end-effector through the hybrid BCI, enabling more precise and flexible manipulation. The technology was developed to improve the robot’s reaching performance so that grasping tasks can be completed automatically. To achieve this, the development team divided the task into three key phases and leveraged natural human visual-motor coordination.

First, the user specifies the target for the assistive robot with the hybrid BCI in discrete selection mode. A virtual rectangle appears around the target, confirming to the user that the target position has been communicated to the robot. The system then automatically switches to continuous velocity control mode and enters the second phase, in which the user steers the robotic arm’s end-effector with the hybrid BCI while avoiding collisions with obstacles. Once the end-effector enters a pre-specified area directly above the target object, it automatically stops and hovers there. Finally, a pre-programmed routine is executed: the end-effector moves downward, adjusts the gripper’s orientation to match the direction of the target in the workspace, and grasps the object. This design effectively reduces the number of degrees of freedom the user must control while still allowing the object to be reached in three dimensions.
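
The three phases behave like a simple state machine: a discrete selection hands over to velocity control, which hands over to an automated grasp once the end-effector reaches the hover region. The sketch below illustrates that flow in Python; it is not WiMi’s implementation, and the hover radius, offsets, and velocity values are illustrative assumptions.

```python
# A minimal sketch (not WiMi's implementation) of the three-phase flow described
# above: discrete target selection, continuous velocity control with a hover
# trigger, and a pre-programmed grasp. Thresholds and offsets are assumptions.
from enum import Enum, auto

import numpy as np


class Phase(Enum):
    TARGET_SELECTION = auto()   # phase 1: discrete selection mode
    VELOCITY_CONTROL = auto()   # phase 2: continuous velocity control mode
    AUTO_GRASP = auto()         # phase 3: pre-programmed grasping routine


HOVER_RADIUS = 0.05  # metres; assumed size of the pre-specified area above the target


def control_step(phase, end_effector_pos, target_pos, user_velocity_cmd):
    """Advance the controller by one tick; return (next_phase, velocity_cmd)."""
    if phase is Phase.TARGET_SELECTION:
        # The target has been confirmed (virtual rectangle shown to the user),
        # so the system switches to continuous velocity control.
        return Phase.VELOCITY_CONTROL, np.zeros(3)

    if phase is Phase.VELOCITY_CONTROL:
        hover_point = target_pos + np.array([0.0, 0.0, 0.15])  # assumed area above the target
        if np.linalg.norm(end_effector_pos - hover_point) < HOVER_RADIUS:
            # End-effector entered the pre-specified area: stop and hand over.
            return Phase.AUTO_GRASP, np.zeros(3)
        return Phase.VELOCITY_CONTROL, user_velocity_cmd

    # Phase.AUTO_GRASP: descend toward the target; gripper orientation and
    # closing would be handled by the pre-programmed routine.
    return Phase.AUTO_GRASP, np.array([0.0, 0.0, -0.05])


if __name__ == "__main__":
    phase, pos = Phase.TARGET_SELECTION, np.array([0.0, 0.0, 0.4])
    target = np.array([0.3, 0.2, 0.0])
    for step in range(200):
        phase, vel = control_step(phase, pos, target, np.array([0.03, 0.02, -0.02]))
        pos = pos + vel
        if phase is Phase.AUTO_GRASP:
            break
    print(f"hover reached after {step} ticks; phase={phase.name}, pos={pos.round(3)}")
```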

One of the key points of the technology is its hybrid BCI, which combines gaze tracking and brain-computer interface technologies so that the robot can be controlled through a discrete selection mode and a continuous velocity control mode. In the discrete selection mode, the user inputs the target position by gazing at it; the system then automatically switches to the continuous velocity control mode, which moves the robotic arm’s end-effector toward the target position according to the user’s velocity commands.
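
One common way to implement gaze-based discrete selection is dwell-time selection: the target the user fixates on for long enough is taken as the input. The sketch below illustrates that idea; the target regions, dwell threshold, and 60 Hz gaze rate are assumptions for illustration, not details WiMi has disclosed.

```python
# A minimal dwell-time selection sketch for the discrete selection mode: the
# target fixated on for long enough becomes the selected target. All values
# (targets, dwell time, gaze rate) are illustrative assumptions.
import numpy as np

GAZE_RATE_HZ = 60
DWELL_TIME_S = 1.0                               # assumed confirmation dwell time
DWELL_SAMPLES = int(GAZE_RATE_HZ * DWELL_TIME_S)

# Candidate targets as axis-aligned screen rectangles: (x_min, y_min, x_max, y_max)
TARGETS = {
    "cup":    (100, 200, 180, 280),
    "bottle": (300, 150, 360, 260),
}


def select_target(gaze_points):
    """Return the first target fixated on for DWELL_SAMPLES consecutive samples."""
    streak_name, streak_len = None, 0
    for gx, gy in gaze_points:
        hit = None
        for name, (x0, y0, x1, y1) in TARGETS.items():
            if x0 <= gx <= x1 and y0 <= gy <= y1:
                hit = name
                break
        if hit is not None and hit == streak_name:
            streak_len += 1
        else:
            streak_name, streak_len = hit, 1 if hit else 0
        if streak_len >= DWELL_SAMPLES:
            return streak_name                   # selection confirmed
    return None


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated gaze: a noisy fixation inside the "cup" rectangle for ~1.5 s.
    fixation = np.array([140.0, 240.0]) + rng.normal(0, 5, size=(90, 2))
    print(select_target(fixation))               # expected: "cup"
```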

The goal of WiMi’s technology is to achieve accurate intent perception, efficient motion control, and natural human-computer interaction. Its underlying logic combines several technical components and algorithms to ensure the stability, reliability, and performance of the control system. Within that logic, the eye tracker and the EEG recording device play a key role, capturing the user’s intent and attention by monitoring the gaze point and EEG signals in real time. The eye tracker follows the user’s eye movements to determine the gaze point and direction of view, while the EEG recording device records the user’s brain activity and extracts intent- and attention-related features through signal processing and analysis algorithms.
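
A typical feature-extraction step in EEG-based interfaces is to compute per-channel band power over short windows. The sketch below shows one minimal version of that idea; the sampling rate, frequency bands, and simulated signal are assumptions, since WiMi has not disclosed its signal-processing pipeline.

```python
# A minimal EEG feature-extraction sketch: per-channel band-power features from
# a short window of EEG. Bands, sampling rate, and the simulated signal are
# assumptions for illustration only.
import numpy as np

FS = 250                                          # assumed EEG sampling rate (Hz)
BANDS = {"alpha": (8, 13), "beta": (13, 30)}      # assumed bands of interest


def band_power_features(eeg_window):
    """eeg_window: (n_channels, n_samples) array -> (n_channels * n_bands,) features."""
    n_samples = eeg_window.shape[1]
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / FS)
    psd = np.abs(np.fft.rfft(eeg_window, axis=1)) ** 2 / n_samples
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))   # mean power per channel in the band
    return np.concatenate(feats)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(FS) / FS                        # one 1-second window
    # Simulated 4-channel EEG: noise plus a 10 Hz (alpha) rhythm on channel 0.
    eeg = rng.normal(0, 1, size=(4, FS))
    eeg[0] += 3 * np.sin(2 * np.pi * 10 * t)
    print(band_power_features(eeg).round(2))
```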

Eye-movement data and EEG signals must be processed and decoded in real time in order to extract the user’s intent and attentional state. This involves techniques such as machine learning, pattern recognition, and signal processing to recognize and decode what the user intends and where their attention is directed.
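
In practice, the decoding step usually maps each window of features (such as the band-power features above) to an intent label with a trained classifier; linear discriminant analysis is a common choice in BCI research. The sketch below uses synthetic calibration data and assumed intent labels purely for illustration; it is not WiMi’s decoder.

```python
# A minimal intent-decoding sketch: windowed features are classified into intent
# labels with a standard machine-learning model (LDA here). Labels and the
# synthetic calibration data are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

LABELS = ["rest", "move_left", "move_right"]      # assumed intent classes


def make_synthetic_dataset(n_per_class=100, n_features=8, seed=0):
    """Synthetic feature vectors, one cluster per intent class (stand-in for calibration data)."""
    rng = np.random.default_rng(seed)
    X, y = [], []
    for label_idx in range(len(LABELS)):
        centre = rng.normal(0, 3, size=n_features)
        X.append(centre + rng.normal(0, 1, size=(n_per_class, n_features)))
        y.append(np.full(n_per_class, label_idx))
    return np.vstack(X), np.concatenate(y)


if __name__ == "__main__":
    X, y = make_synthetic_dataset()
    clf = LinearDiscriminantAnalysis().fit(X, y)  # offline calibration

    # Online decoding: each new feature window is mapped to an intent label.
    new_window = X[150] + np.random.default_rng(1).normal(0, 0.5, size=X.shape[1])
    intent = LABELS[int(clf.predict(new_window[None, :])[0])]
    print("decoded intent:", intent)
```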

Environment sensing and obstacle avoidance form another important component. The technology uses sensors to perceive the surrounding environment and the locations of obstacles. Environment-sensing data and planning algorithms enable safe paths to be computed and collisions to be avoided in real time, and this information is combined with user commands to ensure the safety and accuracy of the robotic arm’s movement. A shared controller fuses the user’s commands with the robot’s autonomous commands to form new control commands that precisely drive the motion of the end-effector, and the actuation system converts these control commands into the actual movement of the robotic arm to achieve accurate position control and gripping. This requires motion control algorithms, motion planning, and actuation control strategies to work together so that the user’s intent is communicated precisely and the task is completed accurately.
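
A common way to realize such a shared controller is to blend the user’s velocity command with an autonomous obstacle-repulsion command before sending it to the arm. The sketch below shows a minimal potential-field version of that idea; the gains, influence radius, and blending weight are assumptions, not WiMi’s parameters.

```python
# A minimal shared-control sketch: the user's velocity command is blended with
# an autonomous obstacle-repulsion command. Gains and the blending weight are
# illustrative assumptions.
import numpy as np

INFLUENCE_RADIUS = 0.25    # metres; obstacles further away exert no repulsion
REPULSION_GAIN = 0.02
BLEND_ALPHA = 0.6          # weight on the user's command vs. the autonomy command


def autonomy_command(end_effector_pos, obstacles):
    """Sum of repulsive velocities pushing the end-effector away from nearby obstacles."""
    cmd = np.zeros(3)
    for obs in obstacles:
        offset = end_effector_pos - obs
        dist = np.linalg.norm(offset)
        if 1e-6 < dist < INFLUENCE_RADIUS:
            cmd += REPULSION_GAIN * (1.0 / dist - 1.0 / INFLUENCE_RADIUS) * offset / dist
    return cmd


def shared_command(user_cmd, end_effector_pos, obstacles):
    """Fuse the user command and the autonomy command into the command sent to the arm."""
    return BLEND_ALPHA * user_cmd + (1.0 - BLEND_ALPHA) * autonomy_command(end_effector_pos, obstacles)


if __name__ == "__main__":
    pos = np.array([0.20, 0.00, 0.30])
    obstacles = [np.array([0.30, 0.00, 0.30])]     # an obstacle 10 cm ahead
    user_cmd = np.array([0.05, 0.00, 0.00])        # user pushes toward the obstacle
    print(shared_command(user_cmd, pos, obstacles).round(4))
```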

The visual feedback interface provides intuitive user interaction and feedback. The GUI displays a real-time view of the robotic arm’s working area, presenting information such as the target position, obstacles, and the status of the robotic arm. Augmented reality can provide additional visual feedback, such as displaying virtual rectangles and indicating the orientation of target objects, to further improve the accuracy and efficiency of operation. Through this interface, users can monitor the robot’s motion, the position of the target object, and the system’s responses in real time, giving them a clear understanding of, and control over, the system’s behavior.
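
The kind of overlay described above can be produced by drawing annotations onto each camera frame before display. The short sketch below, using OpenCV on a synthetic frame, illustrates the idea; the target box and status text are assumptions rather than WiMi’s actual interface.

```python
# A minimal visual-feedback sketch: draw a virtual rectangle around the selected
# target and a status line onto a (here synthetic) camera frame.
import numpy as np
import cv2

frame = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in for a webcam frame
target_box = (300, 150, 360, 260)                 # (x0, y0, x1, y1) of the selected target
status = "phase: velocity control | target: bottle"

x0, y0, x1, y1 = target_box
cv2.rectangle(frame, (x0, y0), (x1, y1), (0, 255, 0), 2)                          # virtual rectangle
cv2.putText(frame, status, (10, 25), cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 255, 255), 1)
cv2.imwrite("feedback_overlay.png", frame)        # or cv2.imshow in a live GUI loop
```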

The transfer, processing, and coordination of data play an important role in the assistive robot control technology. From the eye tracker and EEG recording device to the shared controller and actuation system, data flows between the different components and is processed and parsed along the way. Reliable data transfer and coordination keep the system responsive and stable, so that the user’s intent is accurately communicated to the robot and precisely executed. In addition, error handling and fault-tolerance mechanisms are built into the underlying logic. The system must be able to detect and handle potential errors or anomalies, such as sensor failures, communication interruptions, or motion errors, and its fault-tolerance mechanisms ensure that it responds and recovers appropriately in abnormal situations, preserving the system’s reliability and safety.
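
One simple form of such a fault-tolerance mechanism is a watchdog: every data source publishes a heartbeat, and the control loop commands a safe stop whenever a stream goes stale. The sketch below illustrates this pattern; the component names and timeout values are assumptions.

```python
# A minimal watchdog sketch: each data source publishes a heartbeat, and the
# control loop commands a safe stop if any stream goes stale. Names and
# timeouts are illustrative assumptions.
import time

TIMEOUTS_S = {"eye_tracker": 0.2, "eeg": 0.2, "arm_feedback": 0.1}   # assumed per-stream limits


class Watchdog:
    def __init__(self):
        self.last_seen = {name: time.monotonic() for name in TIMEOUTS_S}

    def heartbeat(self, name):
        """Called whenever a fresh sample arrives from the named component."""
        self.last_seen[name] = time.monotonic()

    def stale_components(self):
        now = time.monotonic()
        return [n for n, t in self.last_seen.items() if now - t > TIMEOUTS_S[n]]


def control_loop_tick(watchdog, velocity_cmd):
    """Return a zero (safe-stop) command if any required data stream has dropped out."""
    stale = watchdog.stale_components()
    if stale:
        print(f"fault: stale streams {stale}; commanding safe stop")
        return [0.0, 0.0, 0.0]
    return velocity_cmd


if __name__ == "__main__":
    wd = Watchdog()
    wd.heartbeat("eye_tracker")
    wd.heartbeat("eeg")
    time.sleep(0.15)                  # simulate a dropout of the arm feedback stream
    print(control_loop_tick(wd, [0.05, 0.0, 0.0]))
```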

Through a series of experiments and evaluations, the hybrid BCI-based assistive robot control technology has been verified to perform well in target specification, motion control, and grasping tasks. The system has a wide range of application prospects and can be used in industrial production, medical care, education and training, service robotics, and entertainment. Future research directions include further improving the performance and applicability of the system, optimizing the control algorithms and human-machine interface, expanding the system’s application areas, and combining it with other advanced technologies to further enhance its perceptual ability and autonomy.

This assistive robot control technology is of great significance in the field of human-robot interaction and intelligent robotics, providing new ideas and methods for realizing automation solutions and artificial intelligence applications. The development of the assistive robotic technology based on hybrid BCI will facilitate the development of human-machine collaboration, improve productivity and quality of life, and promote scientific and technological innovation and social progress.


