This year has seen a range of medtech advances, all with the potential to improve patient care, diagnostics and imaging.
We’ve pulled together a list of six of the most exciting ones that we read about.
Six inspiring medtech advances in 2019
A robotic arm that doesn’t use brain implants
Why make this?
Brain computer interface (BCI) technology could help paralysed individuals and those with movement disorders live more independent lives.
BCI research has previously focused on invasive brain implants. Implants work, but they are risky for patients, complex to fit and expensive. Using external sensors instead means receiving “dirtier” signals, because the signals must travel through brain tissue and bone.
Researchers have developed a new framework that combines novel sensing and machine learning techniques. With it, they have been able to detect signals from deep within the brain, allowing a higher level of control over a robotic arm than previous non-invasive methods.
Professor Bin He and his team showed the framework enables a “mind-controlled robotic arm exhibiting the ability to continuously track and follow a computer cursor.”
“This work represents an important step in [non-invasive] brain-computer interfaces, a technology which someday may become a pervasive assistive technology aiding everyone, like smartphones.” Bin He, Department Head and Professor of Biomedical Engineering at Carnegie Mellon University.
Next gen CT imaging
Why make this?
Computed tomography (CT) imaging is used by radiology departments worldwide to help diagnose patients. It’s effectively a 3D X-ray: it uses X-ray radiation but can examine far more than broken bones. However, the level of detail in traditional CT is still not as high as it could be.
The goal was to build on CT by improving the level of detail in the resulting images without increasing patients’ exposure to X-rays during scanning.
Philips researchers and a global team at several universities and hospitals are leading research into the next generation of CT imaging, called Spectral Photon Counting CT (SPCCT). Their clinical prototype handles photons differently from traditional CT. Traditional CT passes X-rays through the patient while spinning the system around them, recording an image from the photons that make it through; SPCCT goes further.
SPCCT doesn’t just count how many photons come back through a person; it also measures the energy of each photon. With this information, it can present a higher-resolution 3D model of the anatomy under examination, without increasing a patient’s exposure to X-rays compared with conventional CT.
“[M]uch like we perceive different photon energy levels as different [colours] of light, SPCCT enables clinicians to ‘see’ X-rays in full [colour].” Philips
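To make the “full colour” idea concrete, here is a minimal sketch in Python. All of the numbers, pixel grid and energy bin edges are made up for illustration and are not Philips’ actual detector design; the point is only that photon counting sorts each detected photon into an energy bin, yielding colour-like channels on top of the conventional per-pixel count:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical detector readout: each detected photon is (row, col, energy in keV).
photons = np.column_stack([
    rng.integers(0, 4, size=1000),    # pixel row
    rng.integers(0, 4, size=1000),    # pixel column
    rng.uniform(20, 120, size=1000),  # photon energy in keV
])

# Conventional CT effectively just counts photons per pixel...
counts = np.zeros((4, 4))
# ...while photon counting adds energy bins, like colour channels.
bin_edges = [20, 50, 80, 120]         # illustrative keV thresholds
spectral = np.zeros((4, 4, 3))

for r, c, e in photons:
    r, c = int(r), int(c)
    counts[r, c] += 1
    b = np.searchsorted(bin_edges, e) - 1  # which energy bin this photon falls in
    b = min(max(b, 0), 2)
    spectral[r, c, b] += 1

# Per pixel, the spectral channels sum back to the conventional count:
assert np.allclose(spectral.sum(axis=2), counts)
```

The extra energy axis is what lets clinicians distinguish materials that would look identical in a plain photon count.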
Prevent stillbirths by using the same sensors found in your smartphone
Why make this?
There are 2.6 million stillbirths per year worldwide. Detecting a foetus’s heartbeat is key to ensuring that a pregnancy is continuing as normal.
Existing monitors for foetal heartbeat and movement are bulky, expensive and have poor battery life, and they can only be set up by trained professionals.
Assistant Professor Negar Tavassolian and her team used smartphone sensors to measure foetal heartbeats and movements. Their device offers a way to check foetal health at home, and its battery life exceeds 24 hours, enabling consistent monitoring that can help detect changes in movement.
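As an illustration of the kind of signal processing involved, the Python sketch below uses entirely synthetic numbers (it is not the team’s device or code) to show how a heart rate can be recovered from a noisy vibration-like signal, such as an accelerometer trace, by finding the dominant spectral peak within a plausible foetal heart-rate band:

```python
import numpy as np

fs = 200.0                     # sample rate in Hz (illustrative)
t = np.arange(0, 30, 1 / fs)   # 30 seconds of "accelerometer" samples

rng = np.random.default_rng(2)
fetal_hr_hz = 140 / 60.0       # ~140 bpm, a typical foetal heart rate
# Synthetic heartbeat vibration buried in sensor noise.
signal = 0.2 * np.sin(2 * np.pi * fetal_hr_hz * t) + rng.normal(0, 0.5, t.size)

# Estimate heart rate from the dominant spectral peak in a plausible band.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 1.5) & (freqs < 3.5)   # 90-210 bpm search window
peak_hz = freqs[band][np.argmax(spectrum[band])]
bpm = peak_hz * 60
```

Real foetal monitoring is far harder than this toy example (maternal heartbeat, breathing and motion all interfere), but the same idea of pulling a periodic signal out of cheap motion sensors is what makes a smartphone-grade, low-power device plausible.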
“Almost a third of stillbirths occur in the absence of complicating factors. Our device could let a pregnant woman know if her [foetus] is compromised and she needs to go to the doctor.” Negar Tavassolian, Stevens Institute of Technology
Source: Stevens Institute of Technology
A robotic capsule that could check colons for signs of disease
Why make this?
Checking for visible signs of abnormalities in the lower gastrointestinal tract means performing an endoscopy, an invasive and painful procedure. Globally, eight million people die each year from diseases of the digestive tract.
The aim was a way to examine the gastrointestinal tract that’s less invasive than the present method, while being easy to guide and offering equal or better imaging quality.
The result is a tiny robotic capsule, the “Sonopill”, that takes micro-ultrasound images. Micro-ultrasound imaging is better at identifying some cell changes related to cancer.
Professor Pietro Valdastri and his team have now shown that it’s feasible to guide a robotic capsule inside a colon and take micro-ultrasound images. A robotic arm outside the patient carries a magnet of the opposite pole to a magnet inside the capsule, so moving the arm steers the capsule. The team used artificial intelligence (AI) to track the capsule’s position.
“This discovery has the potential to enable painless diagnosis via a micro-ultrasound pill in the entire gastrointestinal tract.” Professor Pietro Valdastri, Chair of Robotics and Autonomous Systems, Leeds School of Electronic and Electrical Engineering
Source: University of Leeds
Biological imaging sped up thanks to an algorithm created for Netflix recommendations
Why make this?
The biological and chemical processes within living specimens change quickly and continuously, which makes them challenging to image.
Raman spectroscopy identifies the chemical makeup of complex samples, and it has shown promising results for analysing biological samples such as cancer cells. But it’s too slow to record the changes in biological processes.
The goal was to increase the speed of bio-imaging while keeping the volume of data collected manageable. The new method uses an algorithm to speed up data collection, plus cheaper and faster hardware in the form of a digital micromirror device, a type of spatial light modulator.
Hilton de Aguiar and his team’s algorithm was originally developed for a Netflix competition to predict film preferences. Netflix never used it, but it works well for reducing the amount of data Raman spectroscopy needs for bio-imaging, speeding up imaging while enabling higher compression.
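The shared idea behind film recommendation and compressive imaging is recovering a full data set from a fraction of its entries by assuming the data is low-rank. The Python toy example below sketches a generic “hard-impute” style completion loop; it is an illustration of that principle, not the team’s actual algorithm, and every dimension and value is made up:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy low-rank "data" matrix, e.g. pixels x spectral channels (illustrative).
U = rng.normal(size=(30, 2))
V = rng.normal(size=(2, 20))
full = U @ V                              # rank-2 ground truth

# Sample only ~50% of the entries, as a compressive acquisition would.
mask = rng.random(full.shape) < 0.5
observed = np.where(mask, full, 0.0)

# Fill in the missing entries by alternating a rank-2 SVD projection
# with re-imposing the measured samples.
X = observed.copy()
for _ in range(200):
    u, s, vt = np.linalg.svd(X, full_matrices=False)
    X = (u[:, :2] * s[:2]) @ vt[:2, :]    # best rank-2 approximation
    X[mask] = full[mask]                  # keep known measurements fixed

# Relative error of the reconstruction against the ground truth.
rel_err = np.linalg.norm(X - full) / np.linalg.norm(full)
```

In film recommendation the matrix is users by films with most ratings missing; in compressive Raman imaging it is measurements that were deliberately skipped to save acquisition time.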
“We combined compressive imaging with fast computer algorithms that provide the kind of images clinicians use to diagnose patients, but rapidly and without laborious manual post-processing.” Hilton de Aguiar, leader of the research team at École Normale Supérieure in France
Source: The Optical Society
Closing the “terahertz gap” paves the way for sensor innovation
Why make this?
Terahertz radiation is non-ionising and can pass through fabrics and plastics, making it ideal for medical use. It could enable speedier skin cancer diagnosis through a portable scanner, and the same technology could support quality control in pharmaceuticals by measuring the molecules in products to ensure ingredients are evenly distributed.
Terahertz radiation has long been challenging to achieve in a working system, leading to a “terahertz gap”.
The solution is semiconductor chips that can emit terahertz radiation at “precise frequencies”. Directing this radiation at pharmaceuticals in production can determine the composition of their molecules, and it can achieve a resolution that could inform cancer diagnoses.
As a proof of concept, Associate Professor Gerard Wysocki and his team created a test tablet containing ingredients found in pharmaceuticals, arranged in three zones. The system identified the components in their zones and spotted where they had spilt from one zone into another, emulating what happens when active ingredients are not mixed in properly.
They also showed how the same technology could perform detailed imaging of a coin. The system currently has to be cooled to a low temperature to work, so the next challenge is finding a practical way to operate it.
“Imagine that every 100 microseconds a tablet is passing by, and you can check if it has a consistent structure and there’s enough of every ingredient you expect.” Gerard Wysocki, an Associate Professor of electrical engineering at Princeton University