BEIJING, Nov. 14, 2023 /PRNewswire/ — WiMi Hologram Cloud Inc. (NASDAQ: WIMI) ("WiMi" or the "Company"), a leading global Hologram Augmented Reality ("AR") Technology provider, today announced that it has developed DigiFlightGlove, a machine learning-based system for real-time human-drone interaction. The technology aims to blend drone control seamlessly with human gestures: by integrating flexible sensors and microprocessors into a comfortable glove, it allows the user to control and navigate a drone precisely through simple gestures and movements.
Moving drones through three-dimensional space has always been a challenge. WiMi’s DigiFlightGlove overcomes the limitations of traditional control methods by combining gesture recognition with machine learning: a user simply wears the glove and can instantly and precisely interact with the drone through gestures and movements.
Features of WiMi’s DigiFlightGlove include a multi-modal command structure, machine learning-based gesture recognition, intelligent task scheduling algorithms, real-time performance and high accuracy. An integrated sensor system on the glove captures minute movements of the user’s hand, and a built-in microprocessor transmits this data to a host system. There, the signal data is transformed into a smooth data set that can be recognized by four different machine learning algorithms.
During the development process, the team collected thousands of data samples to train and optimize the deep-learning neural network. These samples cover a wide range of gestures and movements to ensure that DigiFlightGlove accurately recognizes and interprets the user’s intent. After repeated experiments, this technology achieved a 98.5% accuracy rate, providing a solid foundation for wearable smart gloves to control drones.
WiMi’s DigiFlightGlove technology is based on wearable smart gloves that enable interaction between humans and drones through machine learning and sensor technology:
Smart glove and sensor integration: First, a variety of components are integrated into the glove, including flexible sensors and a microprocessor. The sensors capture information about the user’s hand movement and posture.
Data acquisition and processing: When a user wears the glove, the sensors begin to collect hand movement data, such as the bending angle of the fingers and the orientation of the palm. This data is processed by the built-in microprocessor and converted into digital signals.
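WiMi does not publish its firmware, but the acquisition stage might look like the following minimal Python sketch. The sensor-reading functions, channel layout and the roughly 50 Hz sampling rate are illustrative assumptions, with random values standing in for real hardware reads:

```python
import random
import time

NUM_FINGERS = 5

def read_flex_sensors():
    # Stand-in for the glove's ADC reads; real hardware would return
    # per-finger bend measurements from the flexible sensors.
    return [random.uniform(0.0, 90.0) for _ in range(NUM_FINGERS)]

def read_imu_orientation():
    # Stand-in for an IMU query giving palm orientation (roll, pitch, yaw).
    return (random.uniform(-180, 180),
            random.uniform(-90, 90),
            random.uniform(-180, 180))

def acquire_sample():
    """One time-stamped sample of hand state, as the microprocessor might emit it."""
    return {
        "t": time.time(),
        "finger_bend_deg": read_flex_sensors(),
        "palm_rpy_deg": read_imu_orientation(),
    }

if __name__ == "__main__":
    for _ in range(3):
        print(acquire_sample())
        time.sleep(0.02)  # ~50 Hz sampling (assumed rate)
```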
Data pre-processing: The raw data collected must be pre-processed to remove noise and instability. This may include steps such as filtering, calibration and data alignment to ensure that subsequent machine-learning models can accurately interpret hand movements.
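The release names filtering and calibration without specifying methods. A simple sketch under those assumptions, using an exponential moving-average filter and a per-user rest/max calibration (both illustrative choices, not WiMi’s disclosed method):

```python
import numpy as np

def lowpass(raw, alpha=0.2):
    """Exponential moving-average filter to suppress sensor jitter.
    alpha is an assumed smoothing factor; lower = smoother but laggier."""
    raw = np.asarray(raw, dtype=float)
    smooth = np.empty_like(raw)
    smooth[0] = raw[0]
    for i in range(1, len(raw)):
        smooth[i] = alpha * raw[i] + (1 - alpha) * smooth[i - 1]
    return smooth

def calibrate(raw, rest_value, max_value):
    """Map raw readings onto [0, 1] using per-user calibration poses
    (hand at rest vs. fully bent)."""
    raw = np.asarray(raw, dtype=float)
    return np.clip((raw - rest_value) / (max_value - rest_value), 0.0, 1.0)

if __name__ == "__main__":
    noisy = [10, 12, 55, 54, 56, 57, 55]  # raw finger-bend readings (illustrative)
    print(calibrate(lowpass(noisy), rest_value=10, max_value=90))
```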
Feature extraction and data transformation: Features such as the angles of the finger joints, the hand posture and the speed of movement are extracted from the pre-processed data. The extracted features are then converted into a format the machine learning model can understand, usually a set of numeric vectors.
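One common way to turn a short window of sensor samples into such a numeric vector is with simple per-channel statistics. The release does not detail the feature set, so this sketch is an assumption about the approach:

```python
import numpy as np

def extract_features(window):
    """window: (T, D) array of calibrated channels (finger bends, palm orientation)
    over a short time window; returns one fixed-length numeric feature vector."""
    window = np.asarray(window, dtype=float)
    mean = window.mean(axis=0)                                # average posture
    spread = window.std(axis=0)                               # variability of each channel
    velocity = np.abs(np.diff(window, axis=0)).mean(axis=0)   # movement speed
    return np.concatenate([mean, spread, velocity])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    window = rng.random((25, 8))           # 25 samples x 8 channels (illustrative)
    print(extract_features(window).shape)  # (24,): 3 statistics x 8 channels
```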
Machine learning model training: For the smart glove to recognize different gestures and movements, a machine learning model must be trained. The model is provided with a large amount of sample data labeled with the various gestures and actions it should recognize.
Model testing and optimization: After training, the model is tested to evaluate its accuracy in recognizing gestures and actions. The model is then optimized and tuned based on the test results to improve accuracy.
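The release mentions four machine learning algorithms and a deep-learning neural network without naming them. The following sketch substitutes an off-the-shelf scikit-learn classifier on synthetic data purely to illustrate the train/test/optimize loop, not WiMi’s actual models or results:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the labeled gesture dataset described above:
# each row is a feature vector, each label a gesture class.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 24))
y = rng.integers(0, 6, size=1000)  # e.g., 6 gesture classes (assumed)

# Hold out a test split to evaluate recognition accuracy.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A random forest here stands in for the unspecified algorithms.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Tuning would iterate on hyperparameters based on this score.
print(f"held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```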
Real-time recognition and command generation: In actual use, as the user performs gestures and movements, the data collected by the smart glove is recognized in real time by the trained machine learning model. The model translates the recognition results into corresponding UAV control commands, such as ascend, descend and turn.
Task scheduling and drone control: The recognized gesture commands are mapped to drone control commands through a task scheduling algorithm. For example, one gesture may tell the drone to turn left while another commands upward flight. The task scheduling algorithm generates appropriate control commands based on the gesture sequence and real-time requirements, ensuring the drone moves as the user intends.
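WiMi’s scheduling algorithm is not disclosed; a minimal sketch of the gesture-to-command mapping with a simple debounce follows. The gesture labels and command names are hypothetical:

```python
from collections import deque

# Hypothetical mapping from recognized gesture labels to drone control commands.
GESTURE_TO_COMMAND = {
    "palm_up": "ASCEND",
    "palm_down": "DESCEND",
    "tilt_left": "TURN_LEFT",
    "tilt_right": "TURN_RIGHT",
    "fist": "HOVER",
}

def schedule(gesture_stream, stable_n=3):
    """Emit a command only after the same gesture is recognized stable_n times
    in a row: a simple debounce so the drone follows deliberate gestures,
    not momentary misclassifications."""
    recent = deque(maxlen=stable_n)
    last_sent = None
    for gesture in gesture_stream:
        recent.append(gesture)
        if len(recent) == stable_n and len(set(recent)) == 1:
            command = GESTURE_TO_COMMAND.get(gesture)
            if command and command != last_sent:
                yield command
                last_sent = command

if __name__ == "__main__":
    stream = ["fist", "fist", "fist", "palm_up", "palm_up", "palm_up", "palm_up"]
    print(list(schedule(stream)))  # ['HOVER', 'ASCEND']
```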
User interface and interaction: To let users interact with the system intuitively, a graphical user interface (GUI) displays the recognized gestures and the corresponding UAV control commands. Users can see their gestures on the interface as well as the UAV’s response.
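The release does not describe the actual interface; such a status display could be as simple as this illustrative Tkinter sketch:

```python
import tkinter as tk

# Minimal status display: shows the latest recognized gesture and the
# command sent to the drone. All names and labels here are illustrative.
root = tk.Tk()
root.title("DigiFlightGlove status")

gesture_var = tk.StringVar(value="gesture: -")
command_var = tk.StringVar(value="command: -")
tk.Label(root, textvariable=gesture_var, font=("TkDefaultFont", 16)).pack(padx=20, pady=5)
tk.Label(root, textvariable=command_var, font=("TkDefaultFont", 16)).pack(padx=20, pady=5)

def on_recognition(gesture, command):
    """Call this from the recognition loop to refresh the display."""
    gesture_var.set(f"gesture: {gesture}")
    command_var.set(f"command: {command}")

root.after(1000, on_recognition, "palm_up", "ASCEND")  # demo update after 1 s
root.mainloop()
```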
WiMi’s DigiFlightGlove technology offers unprecedented flexibility in the field of human-drone interaction. With the smart glove, users can easily control a drone’s movements with gestures for precise flight and navigation, opening up more innovative possibilities for drone applications.
WiMi’s DigiFlightGlove technology responds to the growing demand for drone applications, building on advances in wearable technology and machine learning. By merging wearables, machine learning and drone technology into a new interaction paradigm, it is expected to find applications across a wide range of industries, including aviation, rescue, entertainment, logistics, agriculture and construction. As drone applications expand and wearable technology matures, market demand for this technology will gradually increase, and investors, entrepreneurs and large corporations are likely to seek collaboration and investment opportunities in this space to drive its development and commercialization. The technology opens up unprecedented possibilities for drones in civilian applications: control becomes more intuitive and natural, making it easy for ordinary people to navigate complex flight tasks.
About WIMI Hologram Cloud
WIMI Hologram Cloud, Inc. (NASDAQ:WIMI) is a holographic cloud comprehensive technical solution provider that focuses on professional areas including holographic AR automotive HUD software, 3D holographic pulse LiDAR, head-mounted light field holographic equipment, holographic semiconductor, holographic cloud software, holographic car navigation and others. Its services and holographic AR technologies include holographic AR automotive application, 3D holographic pulse LiDAR technology, holographic vision semiconductor technology, holographic software development, holographic AR advertising technology, holographic AR entertainment technology, holographic ARSDK payment, interactive holographic communication and other holographic AR technologies.
Safe Harbor Statements
This press release contains "forward-looking statements" within the meaning of the Private Securities Litigation Reform Act of 1995. These forward-looking statements can be identified by terminology such as "will," "expects," "anticipates," "future," "intends," "plans," "believes," "estimates," and similar statements. Statements that are not historical facts, including statements about the Company’s beliefs and expectations, are forward-looking statements. Among other things, the business outlook and quotations from management in this press release and the Company’s strategic and operational plans contain forward-looking statements. The Company may also make written or oral forward-looking statements in its periodic reports to the US Securities and Exchange Commission ("SEC") on Forms 20-F and 6-K, in its annual report to shareholders, in press releases, and other written materials, and in oral statements made by its officers, directors or employees to third parties. Forward-looking statements involve inherent risks and uncertainties. Several factors could cause actual results to differ materially from those contained in any forward-looking statement, including but not limited to the following: the Company’s goals and strategies; the Company’s future business development, financial condition, and results of operations; the expected growth of the AR holographic industry; and the Company’s expectations regarding demand for and market acceptance of its products and services.
Further information regarding these and other risks is included in the Company’s annual report on Form 20-F and the current report on Form 6-K and other documents filed with the SEC. All information provided in this press release is as of the date of this press release. The Company does not undertake any obligation to update any forward-looking statement except as required under applicable laws.
Source: WiMi Developed a Machine Learning-Based Real-Time Human-Drone Interaction with DigiFlightGlove
This content was prepared by our news partner, Cision PR Newswire. The opinions and the content published on this page are the author’s own and do not necessarily reflect the views of Siam News Network.