DMBN Web Exclusive: TeraRecon's Robert Taylor on life in the clouds

February 15, 2011
by Brendon Nafziger, DOTmed News Associate Editor
At the Radiological Society of North America’s annual meeting last November, there were two big trends in health IT: “cloud” and mobile-based solutions.

Almost every vendor flogging a health IT product had one in the “clouds” – that is, the data weren’t housed locally, but rather stored on a public or private server (the “cloud”) and usually accessed through the Web.

Why this jump to the clouds? And what role will mobile devices actually play in radiology? DOTmed News recently spoke with Robert Taylor, CEO of TeraRecon, a 12-year-old company that makes 3D advanced visualization software for MRI, PET and CT imaging, to get his thoughts on radiology’s top IT trends.

DMBN: At RSNA you showcased a cloud-based version of your 3D visualization suite, called iNtuition CLOUD. At this year's show, many vendors of image processing, EHR and RIS products moved applications to public and private clouds – why?

Robert Taylor: You’re definitely right. Many, many companies are talking about cloud, and at RSNA this year we saw a lot of cloud marketing. But a lot of the companies did not seem to understand the true nature of cloud computing or its true benefits, and we actually had to do quite a bit of work to explain to attendees what this technology really means.

But the reason we went in this direction is a combination of a few things. I mean, first of all, the infrastructure is now in place to really deliver this kind of application, and ours is obviously a data-heavy, advanced visualization application that involves the upload of gigabytes of data to the cloud every day, and then real-time interaction back from the cloud to a physician’s desktop to actually use the tools.

Really, Internet technology capable of supporting that hasn’t been available for very long. Even though the backbone has been there and data centers have been there, the connectivity right through to the physician’s desktop hasn’t been. But the time has come where we can feasibly deploy something like this. And that aligns with our own technology road map, as our company has always been a client-server company.

TeraRecon disrupted the market back in the early part of the decade by introducing client-server 3D, the AquariusNET server, and that really changed the whole paradigm of advanced visualization away from workstations into a tool that was accessible from any computer in the hospital. So our architecture has always been well suited to this kind of model, and we took the leap to get the entire application running through the Web browser, which really meant a cloud-based offering started to make sense. So I think from a technology point of view and an infrastructure point of view, the timing was right.

What are some of the main challenges to cloud?

Where this concept, this whole way of working, is still pretty nascent is in the legal and privacy frameworks that health care providers have to get their arms around to really get comfortable with putting image data and patient data into the cloud, and then delivering that back into the enterprise as a service. For that reason, we put the cloud online, we deployed our data centers, we have the service available, but we basically made it free; so anybody can go to our www.terarecon.com/cloud site and sign up for an account. We did this to really stimulate the discussion: first of all, let people try it, understand the value, discover the potential of cloud-based advanced visualization. That will foster the discussion about how to get this deployed as a tool in the enterprise.

Could you talk more about the legal framework?

Traditionally if, for example, you bought a TeraRecon solution, you’d be putting a server into your data center. You’d be storing the images there. You’d have an institutional firewall that protects all of that data, and internal policies you’re comfortable with to protect the security and privacy of that data and avoid potential loss of data.

Now if you move to the cloud-based offering, there’s a pro and con. The pro is you don’t need the data center. You don’t need the internal IT infrastructure, so you can save an awful lot of internal resources in the deployment of the solution, and let TeraRecon take care of that, with our data center and our IT team managing the assets. But it does mean you’re handing over the image data to a company, to us, and then you have to be comfortable that, as a health care provider, you’re able to meet your obligations in terms of privacy, security and data protection when you’re really subcontracting that to a company. That gives people pause. They have to think about what kind of contractual relationship will be appropriate to ensure that they’re meeting their obligations, and that they’ve done enough due diligence on our security policies and what we’re doing in terms of safeguards to protect the data. Because nobody wants to be in a situation where they’ve handed over their data to a third party, and it got lost or disclosed inappropriately.

It’s nothing insurmountable, but it’s a new concept…Legal teams at the bigger university medical centers and bigger health care practices are putting together their point of view on how they can take this kind of functionality outside their four walls, and yet still remain comfortable with their obligations to meet all the regulations and legal requirements.

How onerous is that legal burden on practices looking to buy a cloud-based application?

It’s not particularly onerous. It’s the kind of thing they have to do anyway…I’d say this is something that is typical, routine work that is performed every year by health care institutions. We sign HIPAA agreements with all of our customers already. We already have remote access into their systems, and we’re exposed to their data in that regard. It’s just a process that has to be gone through. It’s nothing particularly major, nothing too difficult to overcome, but obviously moving into the cloud does add this dimension of the question: “How do you ensure your cloud provider is taking care of the responsibilities you have once you’re subcontracting that to them?” So nothing too onerous, but it will take a bit of time. 2011 will be a big year for cloud.

According to the Federal Communications Commission, close to one-third of federally funded rural clinics and hospitals and close to a quarter of critical access hospitals lack broadband access. Is this broadband gap going to become an obstacle to broader deployment of cloud products?

You’re going to need a reasonable Internet connection or you’re not going to be able to access the service. But the momentum already underway toward broadband, especially among health care facilities, is huge. And it’s really a matter of time. Right now is the first time that advanced visualization, the really heavy number crunching and big data lifting of advanced visualization, is coming onto the cloud. By the time the two-thirds of hospitals with broadband have embraced this and explored the concept, the remaining third will surely have reasonable broadband and be able to take advantage of it.

How does the cost of having a service hosted on a cloud compare with having your own in-house server?

The way to purchase the cloud solution really is quite flexible. In fact, this is one of the great advantages of a cloud-based offering. TeraRecon has taken the time and effort and made the investment to set up the infrastructure, so adding a user, or adding some capacity, is a relatively easy thing to do. The big difference between the cloud offering and an internal solution is we can sell pure functionality. Whereas if we’re selling a software license to a hospital, they need to think about where they’re going to put the server and how they’re going to maintain that server, so there’s an additional budget they have to account for.

With the cloud, you can come in at the very ground level and buy the ability to do one case a week, without deploying any equipment, without having to make any kind of investment or long-term commitment…

I think the cloud-based model has tremendous potential for increasing accessibility of the technology, because it opens up a segment of the market where the volume was a little low, or unpredictable, so people avoided getting into advanced visualization, and as a result patients were denied access to that kind of technology in the provision of care. Cloud lets you dip your toe in the water without a big commitment.

What’s the kind of market you see most willing to take this on?

If you’re a 3D service provider providing services to a large number of customers, working up their cases, this kind of cloud technology is a perfect fit. Or if you’re a radiology group that has a number of imaging centers and the radiologists are providing services for a number of hospitals in a geographical region, cloud is again a perfect fit.

We’re also seeing on the research side of the house tremendous interest, because any kind of research based on imaging typically involves multiple centers…

And then I think the fourth area that’s very interesting is those very low-volume users, who may not be 3D imaging specialists but are also very interested. So if you’re a cardiac team doing a couple of cardiac CTs a week, it’s very difficult to invest in a full-blown 3D solution. But if you can access all those tools on a low-volume basis, it makes sense.

You mentioned the Web access, and that brings us to mobile viewing. Where do you see this going in imaging?

I think there’s no question that mobile devices, like tablets and phones, offer tremendous value in terms of getting access to imaging data. So being a 3D provider, we developed mobile support for 3D visualization. We have an iPhone app, an iPad app. We embrace it as an extension of our Web access.

Being able to use a mobile device opens up many really interesting new dimensions. In Japan, for example, we had a trial of our system at a university medical system, and the surgeons adopted the iPad. They took it into the operating room, in a research setting, and were able to use it in surgery to address an issue they could not address until the iPad was available. They took the 3D image, manipulated it on the iPad, and laid the iPad down right next to the patient’s head. So they had the same vasculature in the same orientation in the 3D image right next to the actual patient. That was never possible before, when the monitor was fixed to the wall or a boom arm.

Do you see radiologists making diagnoses off devices?

If you think about a mobile device, there’s the question of FDA regulation of marketing claims: whether it’s suitable for diagnostic use. It seems a bit odd to me to focus heavily on diagnosing from a phone. I don’t think radiologists or other physicians are really seeking permission or the ability to make diagnostic interpretations from their phones…

But they want ready access to relevant information. Right now, you can email an image to a referring physician and demonstrate something to them in a consulting situation. It doesn’t have to be displayed as a fully diagnostic image.

So the main role is to share images with customers --

Without question, that is the “killer application”: the ability for radiologists to deliver to their referrer... If a referrer gets just a text report, that has limited value. If it’s only accessible when the person’s sitting at their computer, it has limited value. But if that radiologist finds a big aneurysm and can deliver it through an interactive Web tool on a phone, a surgeon can see that…and can start thinking about how to manage it before getting back to their computer. It has value. You’re adding value to the service you’re providing to the physicians, and that’s exactly what the radiologists we’re talking to want to do.