Introduction

Each year, nearly 55,000 people in the U.S. are diagnosed with oral cavity and oropharyngeal cancer, and another 12,470 are diagnosed with laryngeal cancer. This means that about 1 in 190 men and 1 in 830 women will develop throat cancer at some point in their lives.

One area where autonomous robotics can be explored is the medical field, where implementation most often occurs in surgical applications. Soft materials have been explored for robotics, especially in the biomedical field, where compliance and biocompatibility are important considerations. More specifically, an autonomous soft robot could be used for surgical applications involving cancerous tissue in the throat and esophagus, where it could aid endoscopy or perform other procedures, such as biopsies.

Endoscopy is the most common technique for detecting and diagnosing these cancers, and the large diameter and relative simplicity of the upper gastrointestinal tract make it an ideal environment for developing a soft-robotic paradigm for guiding an endoscope to potential cancer sites. Traditional endoscopy requires a trained oncologist to feed a camera, either rigid or flexible, through the mouth or the nasal cavity to see any problem regions. The test itself is typically short, lasting less than 2 minutes, and usually requires numbing spray if entering through the nasal cavity; most oncologists prefer this route because going through the mouth can trigger patients' gag reflexes. Traditional endoscopy also requires an endoscopy tower, a large piece of equipment that holds the electrical connections and supporting technology needed to use the camera properly. Diagnosis remains difficult because tumors are often buried deep in the throat or hidden under the surface of the mouth, identifying the small tumors associated with throat cancer takes a highly trained clinician, and only 20-30% of patients will have a visible sign that tells them they have throat cancer.

To extend past the scope of endoscopy and have a broader impact on the medical field, we decided to also incorporate a biopsy element into our robot. Currently, to take a biopsy for suspected throat cancer, an oncologist uses small, rigid forceps to reach the region of interest (ROI), with the whole process guided by an endoscope. The tissue collected is then sent to a lab and tested for Human Papillomavirus (HPV) and throat cancer. The process is time-consuming because the patient must be brought into the operating room, placed under anesthesia, operated on, allowed to recover from the anesthesia, and then discharged. It is also difficult because the oncologist is trying to reach a specific, narrow area with a very small field of view.

Our project aims to improve upon this by creating a soft robot that can perform endoscopies and biopsies, for use in private hospitals and rural areas, to target health disparities. Our design and actuation are inspired by the MIT Meshworm.

Meshworm: A Peristaltic Soft Robot With Antagonistic Nickel Titanium Coil Actuators

Meshworm is a soft autonomous earthworm robot created by researchers at the Massachusetts Institute of Technology (MIT) and Harvard University, from which we drew inspiration. It primarily consists of a robot body made of polyetheretherketone (PEEK), a mesh-like tubing material, and antagonistic nitinol actuators. Nitinol is a shape-memory alloy of roughly 50% nickel and 50% titanium, and it was formed into coils to create the robot's inchworm-like motion. The nitinol actuators were connected to a circuit that heats them with electrical current, causing them to contract or elongate.

With that said, our design will differ slightly as we incorporate forceps and focus on biomedical applications. We plan to replicate the longitudinal and circumferential actuators from Meshworm, as well as use PEEK for the robot body. Our design differs in the methodology for actuating the forceps and in the need for a biocompatible casing around the robot body to minimize complications when entering the human esophagus.

Sketch of the MIT Meshworm

Project Statement

The aim of this project is to design a soft robot with inchworm-like locomotion that can perform small-scale biopsies at cancerous tissue regions in the throat.

Clinical Interviews

Meetings with Duke Health

As part of the information-gathering portion of our research cycle, we conducted clinical interviews with healthcare professionals affiliated with Duke Health. These professionals are otolaryngologists, or Ear/Nose/Throat (ENT) specialists. We met with Tami Runyan, PA-C, and Walter Lee, MD, MHS.

Tami Runyan is a Physician Assistant in Duke Health Otolaryngology. Her work mostly involves the nasal cavity, so from the beginning of the meeting our consideration of esophagoscopy was limited; we had not yet accounted for patients' gag reflexes if the robot were to enter through the mouth. She informed us early on that ENT care is becoming inundated and competitive in major areas, which leaves major health inequities and creates disparities for communities that lack access to this care. She suggested it would be worth designing something disposable or easily cleaned for reuse, as a means of making endoscopy inexpensive and targeting those health disparities in rural communities. This shaped our design criteria and led us to consider different materials, size constraints, and overall designs. At the end of the meeting, she invited the team to visit her at Duke South Clinic during the semester to see the equipment in person, ask follow-up questions, and have our team lead, Jasmine, get scoped.

Our next meeting in the Duke Health system was with Walter Lee, Chief of Staff of Head and Neck Surgery, who works specifically in Surgical Oncology. Our meeting with him was a major turning point for our design, primarily in terms of the robot's application. He informed us that aiding endoscopy is not a pressing need, as his team had recently designed a compact flexible scope that allows endoscopy to be performed with a smaller instrument and less equipment. He therefore suggested that we transition to aiding biopsy. This transition let us relax our size design criteria, since the robot could now move through the throat with rigid forceps connected to the end of the body to take the biopsy at the desired location. He mentioned a need for automating other surgical procedures, such as tracheoesophageal punctures and procedures related to ear infections, but stressed that our most useful application would be in biopsy. As a result, we decided to tweak our application slightly and incorporate forceps.

Visit to Duke South Clinic: Jasmine’s Testimonial

As mentioned, our meeting with Tami Runyan led us to the Duke South Clinic for Jasmine to get scoped so that we could better understand the endoscopy process. Going in, Jasmine's expectations centered on how the endoscope would feel. She expected the test to resemble a COVID-19 test, since we knew beforehand that it would be conducted through the nasal cavity. She was unsure whether she would be numbed, as Tami had told us numbing is used on a case-by-case basis. Lastly, she anticipated that the test would take around 3 minutes.

Photos from the visit: flexible scope; endoscopy tower; punch biopsy tool; Jasmine being scoped by Tami Runyan, PA-C; endoscopy views of Jasmine's nasal cavity and throat.

The visit was very informative as we moved out of our research cycle, and many of Jasmine's expectations differed from what actually occurred. Jasmine recounted that the examination did not feel like a COVID-19 test at all, largely because Tami used a flexible scope, which allowed her to feed the scope slowly through Jasmine's nostril. Tami entered through the left nostril because it looked more open; in practice it was not, and feeding the scope through that side proved more difficult. She was more successful through the right nostril, and we were able to see more on the camera as a result. This told us that if a patient needed to be scoped through a particular nostril, the doctor would have no choice but to feed the scope through it, open or not. Jasmine did end up being numbed, and she attested that this was the worst part of the experience: the numbing sensation in her nose and mouth persisted for another 2 hours after we left the clinic. Lastly, the test lasted about 3 minutes, as Jasmine guessed, because Tami paused to explain what she was seeing to aid our discovery; a routine endoscopy would last no more than 30 seconds to 1 minute. All in all, this experience was very helpful and allowed us to move forward with concrete considerations for the design of our robot.

Brainstorming

As a team, we came up with seven ideas. The first design, denoted in the Pugh scoring matrix as Design A, was a modular soft pneumatic continuum robot made of silicone. Each of its three segments had three tubes running through it, and pressurizing the tubes creates movement: pressurizing one tube bends the segment away from that tube, pressurizing two tubes bends the segment between them, and pressurizing all three elongates the module. The second design, Design B, was a modular soft robot made from a silicone-ethanol emulsion; it required heating three nitinol wires running through each module, causing the ethanol to boil and expand. The third design, Design C, was a hybrid vine/continuum robot using a series pouch motor, with a camera fixed to the end and connected to a tether. The fourth design, Design D, was a modular pneumatic continuum robot in which a camera rotates to get a full view of the region of interest. The first design of the second matrix, Design E, was based on a nitinol robot designed by PhD student Kent Yamamoto. Design F used magnetic tentacles steered by a magnetic endoscope to move through the body. The final idea, Design G, was based on the MIT Meshworm, in which nitinol (nickel-titanium) coils are wrapped around a mesh inchworm-like body and an applied current causes contraction and expansion.

To score these designs, we decided on our five most important design criteria: biocompatibility, ease of use, forceps feasibility, cost, and reusability. We assigned each a weight based on its level of importance: biocompatibility 50%, ease of use 20%, forceps feasibility 20%, cost 5%, and reusability 5%. It is worth noting that we considered field of view as one of our initial design criteria, as seen in the scoring matrices, but eliminated it as the project developed and we transitioned from endoscopy to biopsy. We anticipate that the trained oncologist using our robot would have conducted an endoscopy prior to the surgery, so the problem region would already be identified; field of view therefore did not need to be a significant design criterion, though a stereo-vision camera would still be used. We then scored each design from 1 to 5 in each category and multiplied by the weights to get a weighted score. With these weights, the highest-scoring idea was Design E (yellow highlight) from the second matrix, the robot made by Kent. However, constructing that design required a femtosecond laser, which was inaccessible on the timeline and scale of our project, so we chose our second option: the MIT Meshworm.
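For reference, the weighted scoring reduces to a dot product of each design's 1-5 scores with the criterion weights. A minimal sketch of the calculation (the scores below are placeholders for illustration, not our actual matrix values):

```python
# Weighted Pugh scoring: weighted_score = sum(weight_i * score_i).
criteria = ["biocompatibility", "ease of use", "forceps feasibility", "cost", "reusability"]
weights = [0.50, 0.20, 0.20, 0.05, 0.05]

# Placeholder 1-5 scores for a few designs (not our real matrix values).
designs = {
    "A": [3, 4, 2, 3, 3],
    "E": [5, 4, 4, 2, 3],
    "G": [4, 4, 4, 4, 4],
}

for name, scores in designs.items():
    weighted = sum(w * s for w, s in zip(weights, scores))
    print(f"Design {name}: {weighted:.2f}")
```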

Objectives

Electrical Goals

With respect to the fleshed-out final design of our biopsy robot, there are both electrical and mechanical goals. For the electrical components, we hope to manipulate the robot so that it operates with inchworm-like locomotion. This relies on the actuation aspect of the robot, which involves shape-memory alloys that will be explained shortly. In addition, we aim to incorporate a PID control system to control the robot's motion. PID (Proportional-Integral-Derivative) control continuously computes the error between a measured output and a desired setpoint and adjusts the control input using three terms: one proportional to the error, one to its accumulated integral, and one to its rate of change. PID control also involves tuning parameters (the three gains) that we will adjust based on the needs of our design and how we anticipate the robot should be controlled. The last main electrical goal revolves around the integration of machine learning. More specifically, our team is using YOLOv8, a computer vision model, as the artificial intelligence and machine learning component of this design. In the context of this project, we primarily want the robot to be able to detect itself and its surroundings. Surroundings here refers to the areas the robot would be sensing within the throat; a trained oncologist would already have gathered any information needed to perform the biopsy at a given location. To incorporate YOLOv8, we are training the model on images of the robot against varying backgrounds, annotating the front, back, and whole body so that the robot can detect itself in any environment.
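As a concrete illustration of the control law, here is a minimal discrete-time PID sketch in Python; the gains and setpoint are placeholders for illustration, not our tuned values:

```python
class PID:
    """Minimal discrete-time PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive a measured position toward x = 100 (placeholder gains).
controller = PID(kp=0.8, ki=0.1, kd=0.05, setpoint=100.0)
u = controller.update(measurement=92.0, dt=0.1)  # control effort for this time step
```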

Mechanical Goals

Moving forward, one of the mechanical goals of the project calls for creating coils from the nitinol shape-memory alloy. Shape-memory alloys are exactly what they sound like: materials that can be trained to remember a shape or orientation and return to it when exposed to an external stimulus, most often heat. In our case, we are using nitinol wire, an alloy of nickel and titanium. We wound the nitinol around a core wire to create coil (spring-like) shapes and then annealed them at 400 °C for 20 minutes. This effectively trains the nitinol: if the coils are deformed, reheating the wire returns it to the coil shape. With that said, the coils can be wrapped around the body of the robot, and a current applied to the nitinol wires will create the actuation and inchworm locomotion. Another mechanical goal stems from the nitinol coils: the robot body will be made of polyetheretherketone (PEEK) tubing, and to create movement we need to divide the robot into sections, with the number of sections equal to the number of coils. Lastly, since our primary application is controlled, small-scale biopsy, our final mechanical goal is to determine how to connect and integrate forceps with a soft robot to perform the procedure. Forceps are rigid, so the connection to a soft body is one to experiment with, and it would also require an entirely different actuation mechanism.

Educational Goals

There are also larger-scale goals that our team is targeting related to the engineering curriculum and current research affiliated with Duke. We hope this project will play a role in integrating medical robotics and soft-materials work into the undergraduate MEMS curriculum. MEMS is a broad field, and as it stands, the primary implementation of this project would likely fall within a biomedical engineering (BME) course. Regardless of department, classes where we see this work being implemented include ME 344 (Control Systems), ME 221 (Structures and Properties of Solids), BME 221 (Biomaterials), and BME 303 (Modern Diagnostic Imaging Systems).

Research Goals

With respect to the research being done at and around Duke, there are two main research questions that we hope can inspire work in the medical robotics and engineering fields. The first asks how soft robots are able to perform or facilitate specific surgical procedures. This could branch into considerations about surgeries beyond biopsies, material selection, ethics in engineering, and how the application may change if the robot is scaled. The second asks how accurately soft robots can sense their location in space given their nonlinearity and flexible nature. This leans into the machine learning and data science realm and requires us to think about how the lack of rigidity affects the overall analysis of these bodies when contemplating their use inside a human.

System Decomposition

When thinking about our project, we split it into five subsystems, which were then broken into more specific project components. The five subsystems are electrical, mechanical, application, actuation, and safety and compatibility.

The electrical subsystem is best represented by self-sensing controls and machine learning. The robot must be self-sensing, knowing its position and location at all times. We use two PID controllers, one for the x direction and one for the y direction, to make sure the robot travels along the desired path; if it starts to deviate, the PID control adjusts its path. For the AI and machine learning element, we created an image set of our robot against various backgrounds so that our YOLOv8 model can identify in real time what is our robot versus what is not.
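To make the two-axis arrangement concrete, here is a sketch of two independent PID loops correcting a tracked centroid toward a path point; the target, measurement, and gains below are illustrative placeholders, not values from our controller:

```python
# Two independent PID loops, one per image axis, steering the tracked
# robot centroid toward a target path point (illustrative values only).

def pid_step(state, error, dt, kp, ki, kd):
    """One PID update; state carries the running integral and last error."""
    state["integral"] += error * dt
    derivative = (error - state["last_error"]) / dt
    state["last_error"] = error
    return kp * error + ki * state["integral"] + kd * derivative

x_state = {"integral": 0.0, "last_error": 0.0}
y_state = {"integral": 0.0, "last_error": 0.0}
target = (320, 240)      # desired centroid in pixels (placeholder)
measured = (300, 260)    # centroid from the YOLOv8 bounding box
dt = 0.05                # control period in seconds

u_x = pid_step(x_state, target[0] - measured[0], dt, kp=0.6, ki=0.05, kd=0.02)
u_y = pid_step(y_state, target[1] - measured[1], dt, kp=0.6, ki=0.05, kd=0.02)
# u_x and u_y would then be mapped to actuator heating commands.
```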

The mechanical subsystem involves the physical implementation and modeling of the robot in the real world. The first aspect was the robot itself: a soft body made of mesh wire woven around three nitinol wires, which break the robot into segments that create the inchworm motion. Because the robot will be deployed in either the esophagus or the nasal cavity, depending on the level of complexity, we also had to find suitable materials to mimic the elasticity of human anatomy, leading us to TPU. Finally, we made a CAD model of the robot, tethering system, forceps connection, and camera connection to visualize the system and its implementation together.

The application subsystem depends on whether the goal is for the robot to simply reach a pre-planned location or also to take a biopsy there. Either way, the robot needs a camera for visualization and sensors for detection. If the robot is intended to take a biopsy, it also needs integrated forceps and a bite block to ensure the patient does not close their mouth during the procedure.

The second-to-last subsystem is actuation, the method by which the robot actually creates the inchworm movement. Our method of creating the actuators comes from the Meshworm project, where a drill press was used to wind the springs.

The last but most important subsystem is safety and compatibility. The robot must be biocompatible so that it does not cause a reaction in the patient during the procedure, and it must not cause any bodily harm: we want to leave the patient in a better situation than they started in. Safety and compatibility also tie into the robot's environment. We have to be mindful that a patient undergoing a biopsy is under anesthesia, so the robot must not disrupt it. We also have to consider the chemical composition of mucus and how it could degrade the robot's electrical components, and of course account for the general anatomy of the esophagus, as that is our workspace.

Module Breakdown

Project Progress: Our Design

Nitinol Coils

Our first element of progress was with the nitinol coils. We first wanted to determine how to make the coils optimally, so we experimented with iterating on the MIT Meshworm setup using materials we already had. We connected a core wire to a drill with a weight attached at the bottom to provide tension, held the nitinol in tension perpendicular to the core wire, and pulsed the drill to wind the nitinol into coils. Finally, the core wire and nitinol were clamped together so the nitinol would not come undone, and the two were annealed in a furnace.

Once we had refined the coil-making process, we made five coils with different-diameter core wires to determine which was best for creating nitinol spring actuators. We used 22-, 24-, 26-, 28-, and 30-gauge core wires and progressively hung weights until each coil stopped stretching. With this data we plotted a force-strain curve and determined that the 28-gauge coil was best because of its stiffness.
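The stiffness comparison amounts to fitting a slope to each coil's force-extension data. A sketch of how this could be done (the measurements below are placeholders, not our recorded data):

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder force (N) vs. strain data for one coil; not our recorded values.
strain = np.array([0.0, 0.05, 0.10, 0.15, 0.20])
force = np.array([0.0, 0.12, 0.25, 0.36, 0.50])

# The slope of a linear fit through the data approximates the coil stiffness.
stiffness, intercept = np.polyfit(strain, force, 1)
print(f"Approximate stiffness: {stiffness:.2f} N per unit strain")

plt.plot(strain, force, "o-", label="28-gauge core (placeholder data)")
plt.xlabel("Strain")
plt.ylabel("Force (N)")
plt.legend()
plt.show()
```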

Annealing

A key part of the prototyping process involved annealing and heat treatment. The coils were annealed for 20 minutes at 390 °C, and the PEEK tubing was annealed at 140 °C for roughly 10 minutes. We used a Lindberg/Blue M Moldatherm box furnace for these processes.

3D Printing

To understand the environment we were working in, our group also used 3D printing to model the larynx. We printed with thermoplastic polyurethane (TPU) filament; TPU is a flexible filament material, and we wanted to accurately depict the movement and lack of rigidity of the throat in which our robot would be operating.

Circuitry

Lastly, to provide heat to our nitinol coils and trigger the shape-memory effect, we crimped the nitinol to electrical wire. We then soldered that wire to a circuit consisting of three 220 Ω resistors in parallel, connected to an Arduino Uno for 5 V power and ground. Ideally, the final implementation of this design would use a Raspberry Pi instead of an Arduino Uno.
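For reference, three 220 Ω resistors in parallel give an equivalent resistance of 220/3 ≈ 73.3 Ω, so the nominal current from the 5 V supply through the resistor bank alone would be about 68 mA (this back-of-the-envelope check ignores the nitinol's own resistance and the board's current limits):

```python
# Equivalent resistance of n equal resistors in parallel: R/n.
r_single = 220.0   # ohms
n = 3
r_eq = r_single / n            # ~73.3 ohms
i_nominal = 5.0 / r_eq         # ~0.068 A from a 5 V supply
p_total = 5.0**2 / r_eq        # ~0.34 W dissipated across the bank
print(f"R_eq = {r_eq:.1f} ohm, I = {i_nominal*1000:.0f} mA, P = {p_total:.2f} W")
```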

YOLOv8 Computer Vision Model

We developed our object tracking code step by step, aiming to become comfortable with each piece of software in isolation before combining them. First, we modified example code from OpenCV's Python tutorials to interface with a laptop webcam. Remnants of this code can be found in "TestInstall code.py" on our GitHub; comments in the code explain the function of each line. Note that on different operating systems, you may need additional drivers for the webcam to interface with OpenCV. Furthermore, VideoCapture takes an optional argument that selects the API backend used to capture the image. Some of these are available by default; others require a more complicated setup. See the OpenCV VideoCaptureAPIs docs for reference.
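A minimal version of that webcam test looks like the following (device index 0 assumed; a backend such as cv2.CAP_DSHOW can be passed as the optional second argument if the default fails):

```python
import cv2

# Open the default webcam; a backend API (e.g., cv2.CAP_DSHOW on Windows)
# can be passed as an optional second argument to VideoCapture.
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("Could not open webcam")

while True:
    ok, frame = cap.read()   # grab one frame
    if not ok:
        break
    cv2.imshow("webcam", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # quit on 'q'
        break

cap.release()
cv2.destroyAllWindows()
```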

The next step was to integrate YOLOv8 into our code. Ultralytics publishes code snippets for model training and object tracking using OpenCV. We modified our OpenCV code in "TestInstall code.py" to perform object classification on the webcam feed with a model pretrained on the COCO dataset (see "Tracking code.py" in the GitHub). COCO comprises over 200,000 labelled images of common objects, but we needed to track an uncommon object, a PEEK mesh. So we created our own custom dataset with Roboflow, comprising about 600 images of a PEEK mesh with the front and back ends marked by green and red tape, respectively. In each image, we labelled all mesh, front, and back instances. We trained the model using YOLOv8 hyperparameters and the initial weights from the pretrained model; this can be found in "Training code.py" on our GitHub. We used Google Colab for training, which offers free GPU access and significantly accelerates the process. The trained model had high accuracy and confidence on the training data, as can be seen in the video below, and also had low confusion. However, it performed poorly on unfamiliar images, suggesting the model was overfit to the training data. We want to increase the diversity of the images in our dataset and reduce the number of epochs to reduce overfitting.

We also began integrating depth information from a RealSense stereo camera. Information on how stereo cameras obtain depth can be found here. We are developing code to extract depth information at the center of the bounding boxes of tracked objects, for use in a PID control algorithm. The unfinished code for obtaining depth of tracked objects is in "Depth Tracking.py", code for creating a depth image is in "stereocamera example.py", and unfinished code for PID control is in "PIDCapstone(1).py".
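A sketch of the intended pipeline, running a trained YOLOv8 model on RealSense color frames and reading the depth at each bounding-box center; the model path is a placeholder, and the ultralytics and pyrealsense2 packages are assumed to be installed:

```python
import numpy as np
import pyrealsense2 as rs
from ultralytics import YOLO

model = YOLO("best.pt")  # placeholder path to our trained weights

# Stream color and depth from the RealSense camera.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)
align = rs.align(rs.stream.color)  # align depth pixels to the color image

try:
    while True:
        frames = align.process(pipeline.wait_for_frames())
        color_frame = frames.get_color_frame()
        depth_frame = frames.get_depth_frame()
        if not color_frame or not depth_frame:
            continue

        image = np.asanyarray(color_frame.get_data())
        results = model(image, verbose=False)

        # Read depth (meters) at the center of each detected bounding box.
        for box in results[0].boxes.xyxy:
            x1, y1, x2, y2 = box.tolist()
            cx, cy = int((x1 + x2) / 2), int((y1 + y2) / 2)
            depth_m = depth_frame.get_distance(cx, cy)
            print(f"Detection center ({cx}, {cy}) at {depth_m:.3f} m")
finally:
    pipeline.stop()
```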

Robot Motion

This is a short video of the robot bending as a result of its connection to the nitinol coils. In the video, you will only see one longitudinal coil, but ideally a final design would include three longitudinal actuators and up to five circumferential actuators to create the inchworm motion.

Future Work

Our main future work for this project includes combining the circumferential actuators with the longitudinal actuators to create a fully working robot that can both bend with nine degrees of freedom and move with inchworm-like locomotion. We also want to continue training and updating the YOLOv8-based model, with the goal of placing the robot in our simulated phantom esophagus and seeing whether YOLO can still differentiate between the robot, the esophagus, and potentially even diseased tissue. We also want to incorporate the RealSense stereo-vision camera, which will aid the YOLOv8 model and provide the endoscopic element needed for the biopsy. Finally, we want to incorporate the biopsy element of our idea through an actual forceps connection and an actuator.

The next stage of our future work is the actual implementation of our robot in the physical esophagus system. We would need to create a biocompatible casing for the robot body and its interior components, as well as tether the end of the robot body to the bite block for control and navigation.

Meet the Team!

This team was composed of 3 Master’s students in the MEMS department with interests in materials science and robotics. Click the images below to go to our respective websites and learn more about us!

Jasmine King
Francis Sampson

Acknowledgements

We couldn’t do this project by ourselves, so special thanks to the following!

Duke MEMS
Capstone Professor: Dr. George Delagrammatikas
Capstone TA: Paavana Srinivas
Patrick McGuire
Evan Kusa
Dr. Siobhan Oca
Ravi Prakash

Duke Health
Tami Runyan, PA-C 
Walter Lee, MD, MHS

Dynalloy
Jeff Brown 

References

1. Bashir, Adnan & Al-Naami, Bassam. (2013). A Fusion Technique Based on Image-Statistical Analysis for Detection of Throat Cancer Types. JJMIE, 4(6).

2. Baalamurugan, K. M., Singh, P., & Ramalingam, V. (2022). A novel approach for brain tumor detection by self-organizing map (SOM) using adaptive network based fuzzy inference system (ANFIS) for robotic systems. International Journal of Intelligent Unmanned Systems, 10(1), 98-116. doi:https://doi.org/10.1108/IJIUS-08-2020-0038.

3. Chan, J., Pan, F. T., & Li, Z. (2018). Design and Motion Control of Biomimetic Soft Crawling Robot for GI Tract Inspection. 2018 13th World Congress on Intelligent Control and Automation (WCICA), Changsha, China, 2018, pp. 1366-1369, doi: 10.1109/WCICA.2018.8630626.

4. Ciocan, R. A., Graur, F., Ciocan, A., Cismaru, C. A., Pintilie, S. R., Berindan-Neagoe, I., Hajjar, N. A., Gherman, C. D. (2023). Robot-Guided Ultrasonography in Surgical Interventions. Diagnostics, 13(14), 2456. https://doi.org/10.3390/diagnostics13142456.

5. Elsisy, Moataz & Chun, Youngjae. (2021). Materials Properties and Manufacturing Processes of Nitinol Endovascular Devices. 10.1007/978-3-030-35876-1_4.

6. Farber, Eduard & Zhu, Jia-Ning & Popovich, Anatoliy & Popovich, Vera. (2020). A review of NiTi shape memory alloy as a smart material produced by additive manufacturing. Materials Today: Proceedings. 30. 10.1016/j.matpr.2020.01.563.

7. Hamarneh, G., et al. (2014). Towards multi-modal image-guided tumour identification in robot-assisted partial nephrectomy. 2nd Middle East Conference on Biomedical Engineering, Doha, Qatar, 2014, pp. 159-162, doi: 10.1109/MECBME.2014.6783230.

8. Jiang, P., Peng, J., Zhang, G., Cheng, E., Megalooikonomou, V., & Ling, H. (2012). Learning-based automatic breast tumor detection and segmentation in ultrasound images. 2012 9th IEEE International Symposium on Biomedical Imaging (ISBI), Barcelona, Spain, 2012, pp. 1587-1590, doi: 10.1109/ISBI.2012.6235878.

9. Kaczmarek, B. F., Sukumar, S., Petros, F., Trinh, Q.-D., Mander, N., Chen, R., Menon, M., & Rogers, C. G. (2013). Robotic ultrasound probe for tumor identification in robotic partial nephrectomy: Initial series and outcomes. International Journal of Urology, 20: 172-176. https://doi.org/10.1111/j.1442-2042.2012.03127.x.

10. Kaye, D. R., Stoianovici, D., Han, M. (2014). Robotic ultrasound and needle guidance for prostate cancer management: review of the contemporary literature. Curr Opin Urol. 2014 Jan;24(1):75-80. doi: 10.1097/MOU.0000000000000011. PMID: 24257431; PMCID: PMC4000157.

11. Manfredi, L., Capoccia, E., Ciuti, G., et al. (2019). A Soft Pneumatic Inchworm Double balloon (SPID) for colonoscopy. Sci Rep 9, 11109 (2019). https://doi.org/10.1038/s41598-019-47320-3.

12. Seok, S., Onal, C. D., Cho, K.-J., Wood, R. J., Rus, D., & Kim, S. (2013). Meshworm: A Peristaltic Soft Robot With Antagonistic Nickel Titanium Coil Actuators. IEEE/ASME Transactions on Mechatronics, 18(5), 1485-1497, Oct. 2013, doi: 10.1109/TMECH.2012.2204070.