Q&A: Building an autonomous module to monitor astronauts' health

Dr. Jim Feng, CEO of Phyxable, discusses the company's work with the Canadian Space Agency's Health Beyond Initiative to develop innovative tools for space health.
By Jessica Hagen

Photo courtesy of Phyxable

Phyxable, a virtual care platform that connects patients with physiotherapists and offers users tools to be proactive with their health, was selected by the Canadian Space Agency as one of five companies to contribute to its 2023 Health Beyond Initiative, which supports sustainable and innovative solutions that address health challenges faced by astronauts.

Dr. Jim Feng, CEO of Phyxable, sat down with MobiHealthNews to discuss the Connected Care Medical Module the physiotherapy company and its partners are working on with the Agency to analyze astronauts' health during space flight. 

MobiHealthNews: How is Phyxable working with the Canadian Space Agency?

Dr. Jim Feng: CSA and NASA, and everybody who is currently on this new spaceflight journey to Mars, are looking for better ways of maintaining human health. And if you look at space health, especially deep space health, where there are long flights that could be six months or more in duration, in a microgravity environment, how you take care of those patients, and of ourselves, is really much different from what we do on Earth. 

We don't have the resources, number one. That's the key component there. And then you can only have, probably, generalists who know a little about a lot of things, but not a lot about any one thing. 

So one component of the CSA, and that's public information, is that the RFP [request for proposal] that they put out was specifically for two components. One is the hardware side. So how do you actually build little clinics and hospitals that can be autonomous in nature, as well as self-sustained? So we built this really cool medical pod, and we'll be able to showcase it in the upcoming months.

It'll be at the Canadian Space Agency for NATO and NASA for people to come through and view, so it's a really exciting time. So we have this pod that's fully autonomous. There's like 30 solar panels, a rain catch system, UV sterilization, like you name it, we got it. We even have robotics in there. 

The hardware is built by three partners. So it's Phyxable, Micron Digital and WizCraft Design. So we have all the skill sets to build the components, the hardware components of it. And yeah, the pod is actually called Advanced Medical Pods.

And then a key component of this is that it has to be interoperable. You have to be able to plug and play different devices in there to be able to get different solutions. Because we can't build it all, at the end of the day, we want to be able to integrate. So that's key. So there's going to be a lot of players that are going to be working on the Phyxable platform.  

And then, on the other side, is the software. We created a really cool AI-based tool that we're going to launch to the world, and I think it solves a big problem for interoperability. It's called Morpheus. There's more to come. That's Phyxable and Micron Digital together. We're working on the project together, led by Rohit Seth.

MHN: How is the module going to connect providers here on Earth to astronauts in space effectively, especially since there can be a lag in communication time depending on their location?

Feng: There's definitely a delay. But I think the difference with what we're building, along with a mesh network that Micron Digital has, which we're incorporating into the system, is that it allows for different ways of connectivity.  

It's fully autonomous in its own right. It can work on its own, but if you need to branch out, it has a way to synchronize with the cloud on Earth and back. And in regards to telemedicine, the conversation changes in how you deal with cases. The workflow changes. And if the workflow changes, you have to change how the information is relayed: text and voice, images, closed captioning and things in between. 
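The pattern Feng describes, a module that keeps working with no link at all and synchronizes opportunistically when connectivity returns, is essentially store-and-forward. A minimal generic sketch of that idea is below; this is purely illustrative, not Phyxable's Morpheus code, and the class and field names are invented:

```python
from collections import deque


class StoreAndForwardQueue:
    """Buffers outbound health records while the Earth link is down
    and delivers them in order once connectivity returns.

    Illustrative sketch only; a real deep-space system would add
    persistence, acknowledgements and retransmission on top of this.
    """

    def __init__(self):
        self._pending = deque()   # records not yet delivered
        self.delivered = []       # records confirmed sent to Earth

    def record(self, payload):
        # Always store locally first: the module must stay fully
        # functional even when there is no link at all.
        self._pending.append(payload)

    def flush(self, link_up: bool) -> int:
        """Attempt delivery; returns the number of records sent."""
        sent = 0
        while link_up and self._pending:
            self.delivered.append(self._pending.popleft())
            sent += 1
        return sent
```

During a communication blackout, `flush(link_up=False)` is a no-op and nothing is lost; when the link comes back, the backlog drains in order, which is why telemedicine over such a link shifts from live conversation toward asynchronous messages, images and captions.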

And what we're projecting is that we'll have a demo up and ready by our showcase in the second week of July. So that's when we'll actually be fully out to the world.  

A lot of stuff has to be autonomous on the ships, on the space station, on the gateway, on the lunar bases, and hopefully, eventually on the Mars bases. And then that's where really good AI can come in to make some suggested implementation tasks and how to do it for the generalists that are there as well.

So these are things that we're building for the system as well. So we're really excited for that, because I think ultimately it really expedites how we work here on Earth.
