Recent Publications

Leveraging Ultrasound Sensing for Virtual Object Manipulation in Immersive Environments

Published in IEEE BSN, 2023

Hand gesture recognition is a fundamental component of intuitive and immersive user interfaces in virtual reality (VR) applications. This paper presents a data-driven approach that uses forearm ultrasound data and deep learning for hand gesture recognition in VR interfaces. The proposed methodology involves acquiring data from a subject, training a model on the acquired data, and evaluating the model both on the training data and during real-time inference. The evaluation metrics primarily focus on accuracy, measuring how often the classifier correctly identifies hand gestures. Four hand gestures were primarily considered for the study and demonstration. For offline evaluation with a 20% test-train split, an accuracy of 91% was observed; for online evaluation, an accuracy of 92% was achieved. Given these promising results, classification of 7 hand gestures was also analyzed for both offline and online evaluation. The latency of the pipeline, from ultrasound data acquisition using screenshots to sending commands for VR object manipulation, was measured to be 59.48 milliseconds. The results demonstrate the effectiveness of the approach in accurately recognizing hand gestures, both during training and in real-time inference. We supplement our results with a video of forearm ultrasound data being used to control a custom-designed VR game in a low-latency fashion. This research provides valuable insights into the performance and applicability of ultrasound-based hand gesture recognition in VR interfaces. By employing deep learning and real-time data acquisition, the approach paves the way for intuitive and immersive interactions in various VR applications. The study contributes to the field of body sensor networks, highlighting the potential of forearm-ultrasound-based, data-driven techniques for enhancing user interaction and immersion in VR environments.
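
For illustration only, the sketch below mirrors the offline evaluation protocol described above (a 20% held-out test split and classification accuracy). The MLP classifier and the synthetic `frames`/`labels` arrays are placeholders, not the deep learning model or data used in the paper.

```python
# Minimal sketch of the offline evaluation protocol: 20% test split, accuracy.
# The MLP stands in for the paper's deep learning model; frames/labels are
# assumed placeholders for preprocessed ultrasound frames and gesture indices.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
frames = rng.random((400, 64 * 64))    # placeholder: 400 flattened ultrasound frames
labels = rng.integers(0, 4, size=400)  # placeholder: 4 gesture classes

X_train, X_test, y_train, y_test = train_test_split(
    frames, labels, test_size=0.2, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(128,), max_iter=300, random_state=0)
clf.fit(X_train, y_train)
print(f"offline accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2%}")
```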

Recommended citation: K. Bimbraw, J. Rothenberg and H. Zhang (2023). "Leveraging Ultrasound Sensing for Virtual Object Manipulation in Immersive Environments", IEEE 19th International Conference on Body Sensor Networks (BSN), Boston, MA, USA, 2023, pp. 1-4, doi: 10.1109/BSN58485.2023.10331075. https://ieeexplore.ieee.org/document/10331075

Estimating Force Exerted by the Fingers Based on Forearm Ultrasound

Published in IEEE IUS, 2023

Biosignal-based finger force estimation is an active area of research, with applications in teleoperation, human-machine interaction, and rehabilitation robotics. Traditionally, surface electromyography has been used to estimate hand grip and finger forces. In this paper, we show that forearm ultrasound can be used to estimate the force exerted by the fingers. A wireless ultrasound probe strapped to the forearm acquired the ultrasound data, and a force sensor provided the ground truth. Accuracy percentages and root mean square error (RMSE) values were obtained for shuffled and non-shuffled data subjected to a test-train split, for all the fingers. Averaged over all the fingers, the classification accuracy was 98.4 percent for the shuffled data and 82 percent for the non-shuffled data. For continuous estimation, the average RMSE was 0.02 N for the shuffled data and 0.2 N for the non-shuffled data. With a maximum force of 5 N, the average RMSE corresponds to 4 percent of the maximum force for the non-shuffled data and 0.4 percent for the shuffled data. These results show the potential of forearm ultrasound for estimating finger forces.
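
As a worked illustration of the error metric above, the short sketch below computes the RMSE between placeholder force-sensor readings and ultrasound-based estimates and reports it as a percentage of the 5 N maximum force; the numbers are made up for demonstration.

```python
# Minimal sketch of the reported metric: RMSE between predicted and ground-truth
# finger force, expressed as a percentage of the 5 N maximum. Data are placeholders.
import numpy as np

y_true = np.array([0.0, 1.2, 2.5, 3.8, 5.0])  # force-sensor ground truth (N)
y_pred = np.array([0.1, 1.0, 2.7, 3.9, 4.8])  # ultrasound-based estimates (N)

rmse = np.sqrt(np.mean((y_pred - y_true) ** 2))
max_force = 5.0
print(f"RMSE: {rmse:.3f} N ({rmse / max_force:.1%} of max force)")
```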

Recommended citation: K. Bimbraw and H. K. Zhang, "Estimating Force Exerted by the Fingers Based on Forearm Ultrasound," 2023 IEEE International Ultrasonics Symposium (IUS), Montreal, QC, Canada, 2023, pp. 1-4, doi: 10.1109/IUS51837.2023.10306652. https://ieeexplore.ieee.org/abstract/document/10306652

Simultaneous Estimation of Hand Configurations and Finger Joint Angles Using Forearm Ultrasound

Published in IEEE Transactions on Medical Robotics and Bionics, 2023

This paper extends our IEEE ICRA 2022 work, focusing on generalizing the classification and joint angle estimation tasks to arbitrary ML/DL architectures and presenting an extensive analysis of data across different image resolutions, acquisition speeds, and subjects.

Recommended citation: Bimbraw, K., Nycz, C. J., Schueler, M., Zhang, Z., & Zhang, H. K. (2021). "Simultaneous Estimation of Hand Configurations and Finger Joint Angles Using Forearm Ultrasound", IEEE Transactions on Medical Robotics and Bionics, vol. 5, no. 1, pp. 120-132, Feb. 2023, doi: 10.1109/TMRB.2023.3237774. https://ieeexplore.ieee.org/abstract/document/10020174

Towards The Development of a Low-Latency, Biosignal-Controlled Human-Machine Interaction System

Published in IEEE SII, 2023

In this paper, we report the development of a system pipeline that uses both physical and bioelectrical sensors as input modalities to demonstrate near-natural motion synchronization between a human and a robotic arm. This effort is exemplified by a method for reducing modular and system-level operational latency to achieve congruent human-machine interaction (HMI), developed through analyzing and simulating common mechanical motions. Furthermore, we explored several efficient machine learning (ML) models that work reliably with a variety of time-series biosignals reflective of user intent, allowing a diversity of sensors to serve as system inputs. We believe our system pipeline represents a first step in unveiling otherwise hidden components within biosignal-controlled HMI systems, and that meeting the key challenges will bring us closer to a natural, human-intent-controlled, remotely operated HMI platform, with applications that extend far beyond major sectors of academia and industry.

Recommended citation: K. Bimbraw and M. Zheng (2023), "Towards The Development of a Low-Latency, Biosignal-Controlled Human-Machine Interaction System", 2023 IEEE/SICE International Symposium on System Integration (SII), Atlanta, GA, USA, 2023, pp. 1-7, doi: 10.1109/SII55687.2023.10039467. https://ieeexplore.ieee.org/abstract/document/10039467

Prediction of Metacarpophalangeal joint angles and Classification of Hand configurations based on Ultrasound Imaging of the Forearm

Published in IEEE ICRA, 2022

With advances in computing and robotics, it is necessary to develop fluent and intuitive methods for interacting with digital systems, AR/VR interfaces, and physical robotic systems. Hand movement recognition is widely used to enable this interaction. Hand configuration classification and metacarpophalangeal (MCP) joint angle detection are important for a comprehensive reconstruction of hand motion. Surface electromyography and other technologies have been used to detect hand motion. Ultrasound images of the forearm offer a way to visualize the internal physiology of the hand from a musculoskeletal perspective. Recent work has shown that these images can be classified using machine learning to predict various hand configurations. In this paper, we propose a Convolutional Neural Network (CNN)-based deep learning pipeline for predicting MCP joint angles. We supplement our results by using a Support Vector Classifier (SVC) to classify the ultrasound information into several predefined hand configurations based on activities of daily living (ADL). Ultrasound data from the forearm were obtained from 6 subjects who were instructed to move their hands according to predefined hand configurations relevant to ADLs. Motion capture data were acquired as the ground truth for hand movements at different speeds (0.5 Hz, 1 Hz, and 2 Hz) for the index, middle, ring, and pinky fingers. We obtained promising SVC classification results on a subset of our collected dataset, and demonstrated a correspondence between the predicted and actual MCP joint angles for the fingers, with an average root mean square error of 7.35 degrees. We implemented a low-latency (6.25-9.1 Hz) pipeline for predicting both MCP joint angles and hand configurations, aimed at real-time control of digital devices, AR/VR interfaces, and physical robots.
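
For illustration, the sketch below shows how the per-finger joint angle error reported above can be computed as an RMSE against motion-capture ground truth; the angle arrays are synthetic placeholders, and the paper's CNN predictor is not reproduced.

```python
# Minimal sketch of the joint-angle evaluation: RMSE (degrees) between predicted
# MCP angles and motion-capture ground truth, averaged over the four fingers.
# All arrays are synthetic placeholders standing in for the CNN's outputs.
import numpy as np

fingers = ["index", "middle", "ring", "pinky"]
mocap_angles = {f: np.random.default_rng(i).uniform(0, 90, 100)
                for i, f in enumerate(fingers)}            # ground truth (deg)
predicted_angles = {f: a + np.random.default_rng(i + 10).normal(0, 7, 100)
                    for i, (f, a) in enumerate(mocap_angles.items())}

rmse_per_finger = {f: float(np.sqrt(np.mean((predicted_angles[f] - mocap_angles[f]) ** 2)))
                   for f in fingers}
print({f: f"{v:.1f} deg" for f, v in rmse_per_finger.items()})
print(f"average RMSE: {np.mean(list(rmse_per_finger.values())):.1f} deg")
```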

Recommended citation: Bimbraw, K., Nycz, C. J., Schueler, M., Zhang, Z., & Zhang, H. K. (2021). "Prediction of Metacarpophalangeal joint angles and Classification of Hand configurations based on Ultrasound Imaging of the Forearm." 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 2022, pp. 91-97, doi: 10.1109/ICRA46639.2022.9812287. https://ieeexplore.ieee.org/abstract/document/9812287

Tele-operative Robotic Lung Ultrasound Scanning Platform for Triage of COVID-19 Patients

Published in IEEE Robotics and Automation Letters, 2021

This paper presents a cost-effective tele-operative robotic platform for two-dimensional lung ultrasound (LUS), addressing COVID-19 diagnosis challenges while minimizing physical contact between clinicians and patients. The framework's key contribution is improving the accessibility and safety of LUS, and its successful application in humans was demonstrated. My primary role was the development of the ultrasound data acquisition software.

Recommended citation: Tsumura, R., Hardin, J. W., Bimbraw, K., Odusanya, O. S., Zheng, Y., Hill, J. C., Hoffmann, B., Soboyejo, W., Zhang, H. (2021). "Tele-Operative Low-Cost Robotic Lung Ultrasound Scanning Platform for Triage of COVID-19 Patients" In: IEEE Robotics and Automation Letters, 6(3), 4664-4671. https://pubmed.ncbi.nlm.nih.gov/34532570/

Augmented Reality-Based Lung Ultrasound Scanning Guidance

Published in MICCAI ASMUS, 2020

Lung ultrasound (LUS) is an established non-invasive imaging method for diagnosing respiratory illnesses. With the rise of SARS-CoV-2 (COVID-19) as a global pandemic, LUS has been used to detect pneumopathy for triaging and monitoring patients who are diagnosed with or suspected of having COVID-19 infection. While LUS is cost-effective, radiation-free, and more portable than chest X-ray and CT, its accessibility is limited by its user dependency and the small number of physicians and sonographers who can perform appropriate scanning and diagnosis. In this paper, we propose a framework for guiding LUS scanning using augmented reality, in which the LUS procedure is guided by projecting the scanning trajectory. To develop such a system, we implement a computer vision-based detection algorithm to classify different regions on the human body. The DensePose algorithm is used to obtain body mesh data for the upper body captured with a monocular camera. The torso sub-mesh is used to extract and overlay the eight regions corresponding to the anterior and lateral chest for LUS guidance. To minimize instability of the DensePose mesh coordinates across different frontal camera angles, a machine learning regression algorithm is applied to predict an angle-specific projection model with respect to the chest. ArUco markers are used to define the ground-truth chest regions to be scanned during training, and a single additional ArUco marker is used to detect the center line of the body. The augmented scanning regions are highlighted one by one to guide the scanning path for the LUS procedure. We demonstrated the feasibility of guiding the LUS scanning procedure through the combination of augmented reality, computer vision, and machine learning.
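
As a rough illustration of the angle-specific correction step, the sketch below fits a simple regression mapping the camera's frontal angle to a 2D offset for the projected chest regions; the variable names, linear model, and data are placeholders and do not reproduce the paper's actual regression or features.

```python
# Hedged sketch: regress a 2D projection offset from the camera's frontal angle,
# standing in for the angle-specific projection model described above.
import numpy as np
from sklearn.linear_model import LinearRegression

camera_angles = np.linspace(-30, 30, 13).reshape(-1, 1)          # degrees, placeholder
region_offsets = np.column_stack([0.4 * camera_angles.ravel(),
                                  0.1 * camera_angles.ravel()])  # pixel offsets, placeholder

model = LinearRegression().fit(camera_angles, region_offsets)
print("predicted offset at 15 degrees:", model.predict([[15.0]])[0])
```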

Recommended citation: Bimbraw, K., Ma, X., Zhang, Z., Zhang, H. (2020). "Augmented Reality-Based Lung Ultrasound Scanning Guidance". In: Medical Ultrasound, and Preterm, Perinatal and Paediatric Image Analysis. ASMUS 2020, PIPPI 2020. Lecture Notes in Computer Science, vol 12437. Springer, Cham. https://link.springer.com/chapter/10.1007/978-3-030-60334-2_11

Towards Sonomyography-Based Real-Time Control of Powered Prosthesis Grasp Synergies

Published in IEEE EMBC, 2020

The paper describes the classification of ultrasound information and its mapping onto a soft robotic gripper. In our pipeline for real-time ultrasound-based control of a soft robotic gripper, we trained a machine learning model to classify four hand grasping configurations with an average accuracy of 93%. The paper is a step toward intuitive and robust biosignal-based control methods for robots.

Recommended citation: Bimbraw, K., Fox, E., Weinberg, G. and Hammond, F. L. (2020). "Towards Sonomyography-Based Real-Time Control of Powered Prosthesis Grasp Synergies." 2020 42nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Montreal, QC, Canada, 2020, pp. 4753-4757. https://ieeexplore.ieee.org/document/9176483

A teach pendant to control virtual robots in Roboanalyzer

Published in IEEE RAHA, 2016

The paper describes the design of a teach pendant for RoboAnalyzer, a robot teaching software. Using the low-cost Arduino-based teach pendant, the user can control both physical and virtual robotic systems.

Recommended citation: Mehta, I., Bimbraw, K., Chittawadigi, R. G., & Saha, S. K. (2016). "A teach pendant to control virtual robots in Roboanalyzer", 2016 International Conference on Robotics and Automation for Humanitarian Applications (RAHA) (pp. 1-6). IEEE. https://ieeexplore.ieee.org/abstract/document/7931881

Performance improvements of a 6-DOF motion platform

Published in IEEE RAHA, 2016

The paper describes how we improved the performance characteristics of a 6-DOF Stewart platform used as a driving simulator.

Recommended citation: Bimbraw, K., Mehta, I., Venkatesan, V., Joshi, U., Sabherwal, G. S., & Saha, S. K. (2016). "Performance improvements of a 6-DOF motion platform", 2016 International Conference on Robotics and Automation for Humanitarian Applications (RAHA) (pp. 1-5). IEEE. https://ieeexplore.ieee.org/document/7931899