Sneak Peek: Inside Philips' health tech research labs

MobiHealthNews got a tour of the new technologies under development — including contactless monitoring, augmented reality for surgery and the NICU of "the future."
By Laura Lovett

P.J. Keenan, development engineer at Philips Research, models the HoloLens 2 technology.

While Philips made a name for itself in the 20th century making TVs, VCRs and other electronics, its more recent history is exclusively focused on the healthcare industry. But even the fabric of Philips’ healthcare research is quickly shifting, incorporating new technologies—both in hardware and software. 

“In the past we were always thought of as a hardware company. So, our head of strategy said ‘Philips made stuff you could put in a box and drop on your foot,’” Joe Frassica, chief medical officer and head of research, Philips North America, told MobiHealthNews. “Now 60% of our R&D work is with software, AI and machine learning.”

At the company’s Cambridge, MA research headquarters, the team is working on myriad new tools including contactless monitoring, augmented reality and new technologies for the NICU. MobiHealthNews got a sneak peek at three new technologies in development at Philips labs. 

Contactless monitoring

For clinicians, finding a patient’s pulse and respiration rate may be more hands-off in the future. Philips innovators are now looking to employ cameras and software to capture these vitals.

“The red box, centered around the face, looks for very small changes in the color of the skin. So those are our heartbeat and appear all over the body, anywhere we have skin,” Kees van Zon, principal scientist and group leader at Philips Research, told MobiHealthNews. “It is easy to detect. We can’t see those changes with the human eye, but a camera and software can actually detect that.”

The technology is also able to use the camera and AI to analyze a patient's breathing. 

“Another important vital is the respiration rate, the breathing rate. The way we measure that is by simply looking for the motion of our upper chest due to breathing,” van Zon said. “You can easily see that with the human eye. The system can also detect this. We then track this respiration rate.”
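The principle van Zon describes, a pulse recovered from tiny color changes in the skin of the face and a respiration rate recovered from the slower motion of the chest, can be illustrated with a rough sketch in code. Everything below (the frame rate, the frequency bands, the use of a green-channel signal and the synthetic data) is an assumption for illustration, not a detail of the Philips system.

```python
# Illustrative sketch of camera-based vitals estimation: pulse from small skin-color
# changes in a face region, respiration from the motion of a tracked chest point.
# NOT Philips' algorithm; bands, channel choice and signals are assumptions.
import numpy as np

FPS = 30  # assumed camera frame rate

def dominant_rate_bpm(signal, fps, low_hz, high_hz):
    """Return the strongest frequency in [low_hz, high_hz], in cycles per minute."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return 60.0 * freqs[band][np.argmax(power[band])]

def estimate_vitals(face_green_means, chest_y_positions, fps=FPS):
    """face_green_means: mean green value of the face region per frame.
    chest_y_positions: vertical position of a tracked chest point per frame."""
    heart_rate = dominant_rate_bpm(face_green_means, fps, 0.7, 4.0)   # roughly 42-240 bpm
    resp_rate = dominant_rate_bpm(chest_y_positions, fps, 0.1, 0.7)   # roughly 6-42 breaths/min
    return heart_rate, resp_rate

if __name__ == "__main__":
    # Synthetic 20-second recording: a 72 bpm pulse hidden in noisy skin color,
    # and chest motion at 15 breaths per minute.
    t = np.arange(0, 20, 1.0 / FPS)
    skin = 0.5 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 1.0, t.size)
    chest = 3.0 * np.sin(2 * np.pi * 0.25 * t) + np.random.normal(0, 0.5, t.size)
    hr, rr = estimate_vitals(skin, chest)
    print(f"heart rate ~{hr:.0f} bpm, respiration ~{rr:.0f} breaths/min")
```

In a real system the face region and chest point would come from computer vision tracking on the live video; the sketch starts from those per-frame measurements and simply looks for the strongest frequency in a plausible physiological band.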

In the lab, a patient or participant sits in front of the camera while it calibrates. But future uses could be a bit more on the go. For example, van Zon said the technology could be used to monitor patients in the emergency department waiting room. The idea is that when a patient checks in, they are asked whether the camera may monitor their vitals while they wait to be seen. If they consent, the camera scans the room and takes in their vitals.

“As soon as we encounter a consenting patient it is going to take the vital signs of that person and put them in that patient's record and we know what person it is because we have added this patient recognition feature,” van Zon said. “So now we build up a trend of vital signs for each person that has consent. In the background we have an algorithm running that looks at all these trends and is going to be able to check and flag if there is any sign of deterioration. If that happens there is an alert to a care provider.”
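Van Zon describes that background algorithm only at a high level: a running trend of vitals for each consenting patient, plus a check that flags possible deterioration and alerts a care provider. A minimal sketch of that shape might look like the following; the patient IDs, thresholds and the simple trend rule are all illustrative assumptions rather than the Philips logic.

```python
# Illustrative sketch of per-patient vital-sign trends with a simple deterioration
# check. Thresholds and the trend rule are assumptions, not the Philips algorithm.
from collections import defaultdict, deque
import statistics

WINDOW = 10          # recent readings kept per patient (assumed)
HR_LIMIT = 120       # alert above this heart rate, bpm (assumed)
RR_LIMIT = 24        # alert above this respiration rate, breaths/min (assumed)
HR_RISE_LIMIT = 20   # alert if heart rate climbs this much across the window (assumed)

trends = defaultdict(lambda: deque(maxlen=WINDOW))

def record_vitals(patient_id, heart_rate, resp_rate):
    """Append a reading to the patient's trend; return an alert string if a rule fires."""
    trends[patient_id].append((heart_rate, resp_rate))
    readings = list(trends[patient_id])
    latest_hr, latest_rr = readings[-1]

    if latest_hr > HR_LIMIT:
        return f"ALERT {patient_id}: heart rate {latest_hr} bpm above {HR_LIMIT}"
    if latest_rr > RR_LIMIT:
        return f"ALERT {patient_id}: respiration {latest_rr}/min above {RR_LIMIT}"
    if len(readings) == WINDOW:
        first_half = statistics.mean(hr for hr, _ in readings[: WINDOW // 2])
        second_half = statistics.mean(hr for hr, _ in readings[WINDOW // 2 :])
        if second_half - first_half > HR_RISE_LIMIT:
            return (f"ALERT {patient_id}: heart rate trending up "
                    f"({first_half:.0f} -> {second_half:.0f} bpm)")
    return None

# Example: a consenting patient whose heart rate drifts upward while waiting.
for hr in [82, 84, 85, 88, 90, 104, 108, 112, 115, 118]:
    alert = record_vitals("waiting-room-007", heart_rate=hr, resp_rate=16)
    if alert:
        print(alert)  # would be routed to a care provider in the scenario described
```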

HoloLens for surgery 

A team at the lab is exploring the use of augmented reality to help clinicians during intraoperative procedures.

“We have 3D CTs, 3D MRIs, 3D ultrasounds but we are still providing it to the healthcare provider on 2D screens as slices and then asking them to reconstruct it into something in a volume in their head, which physicians are great at doing, but we felt there was still an opportunity there to do more around presenting that data,” Molly Flexman, principal scientist at Philips Research, told MobiHealthNews. 

With this issue in mind, researchers looked at a number of options, including 3D printing, gesture control and voice control. The timing was key: those conversations were happening in 2016, the same year the HoloLens came out.

“As soon as we started using the HoloLens, it was our belief that this was the solution we were looking for, and it seemed to address a lot of the challenges that we were looking at,” Flexman said.

Now clinicians can access a number of images through the HoloLens while their hands are busy with surgery. Philips is currently working with a HoloLens 2 prototype. Through this program, clinicians can view images during surgery and control the system with their eyes. The images can also be viewed in 3D.

“The reason why we are focusing on intraoperative is that that is the pain point that physicians came back with. What they all said is 'my screen is never where I need it so I’m always working awkwardly,'” Flexman said. “As soon as we get a couple team members in the room, we either have the screen in the way or it is too far away and there’s a glare. … We’ve just had such a strong reaction around this problem that it has become a bit more of our focus around how we address this.”

NICU tech

The neonatal intensive care unit (NICU) is another major focus area for Philips’ labs. A team of researchers, designers and scientists is working on overhauling the typical NICU with both new technology and new design. The team has designed single-family rooms and bay rooms to cater to the needs of high-need newborns. The rooms are complete with a projector for calming images, a tool that reminds visitors to wash their hands, and a tool that allows care providers to see a baby’s condition with the scan of a badge instead of needing to enter the room.

Among the various tech features is a new platform that has an interface for families and caregivers. 

“One of the things that nurses like about this tool is that they typically walk around with a little cheat sheet with what they have to do in regards to care for that baby,” Operations Release Manager Jessica Durney told MobiHealthNews. “The nice thing about this is the task list [on the dashboard], so if they have to schedule something they can click and drag this over. The other thing is this will show up in the parent app. So, the parents can see, ‘oh my baby is going to have an MRI; it may not be a good time for me to come in.’ Or if they want to be here, they can be. Then when they get done they can check off the box.”
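A rough sketch of the shared task list Durney describes, with clinicians scheduling care tasks, the same list surfacing in the parent app, and tasks checked off when done, might look like this. The classes, fields and example tasks are assumptions for illustration, not the Philips platform's data model.

```python
# Illustrative data model for a shared NICU care-task list visible to both
# clinicians and parents. Names and fields are assumptions, not the real platform.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class CareTask:
    description: str
    scheduled_for: datetime
    done: bool = False

@dataclass
class BabyCarePlan:
    baby_id: str
    tasks: List[CareTask] = field(default_factory=list)

    def schedule(self, description, when):
        """Clinician drags a task onto the schedule."""
        task = CareTask(description, when)
        self.tasks.append(task)
        return task

    def parent_view(self):
        """What the parent app shows: upcoming and completed tasks, in time order."""
        return [
            f"{t.scheduled_for:%a %H:%M} - {t.description} ({'done' if t.done else 'scheduled'})"
            for t in sorted(self.tasks, key=lambda t: t.scheduled_for)
        ]

plan = BabyCarePlan("nicu-bed-12")
mri = plan.schedule("MRI", datetime(2019, 10, 15, 9, 30))
plan.schedule("Kangaroo care with parents", datetime(2019, 10, 15, 14, 0))
mri.done = True  # nurse checks off the box after the scan
print("\n".join(plan.parent_view()))
```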

The platform also has an education component for parents. So, if their baby has a specific need or issue, the family can quickly access resources explaining the condition.

“They can at least get a basis and talk to the physician if they have any questions,” Durney said. 

The technology also has a portal for clinicians, which integrates with the EHR and pinpoints different issues and trends. While many of a sick patient’s vitals may be out of the normal range, the system identifies the ones that are dangerous.

“When I open the EMR, everything is in red because the patient is sick — and that doesn’t help me because there is too much data there. Here we can extract the data and put it in a format that I recognize,” Frassica said. “So, this is how I round on patients head to foot—so when I come in the room, or outside of the room in the simple view, I can know I need to look at the lungs and the heart today.”
