Front Page

Monday, June 14, 2010

Native or Mobile Web - What's Your APPetite?

I'm often asked which is the better approach to mobile applications (apps) for accessing information via smartphones - a native app or a mobile web app. This question has become increasingly relevant given the tremendous potential impact of mHealth on physician adoption of IT. In fact, physicians continue to adopt smartphones at a greater rate than the consumer market; as a result, this question will only grow in importance for organizations and vendors to consider as part of their mobile strategy.

If I had been asked this question only a few years ago, I would have undoubtedly said native apps were preferable, if only because cellular networks and Wi-Fi connectivity were not nearly as widespread as they are today. Similarly, device browsers and form factors have greatly improved, making browsing and navigating the mobile web far more practical. Owing to at least these two important factors, the decision to go native or mobile web is not nearly as cut and dried as it has been historically.

Essentially, a native app is an application that is installed on a mobile device (usually a smartphone). Although apps developed for and distributed through Apple's iTunes App Store predominate, other mobile platforms, namely Android and, to a lesser extent, BlackBerry, are beginning to make headway in the native app space. Still, people tend to think of Apple devices when discussing apps, if only because more data exists on their distribution and use, due in part to the closed ecosystem Apple has created via the App Store model.

In simplest terms, a mobile web app is an icon on a device's home screen that links to a website optimized for a smartphone's mobile browser. Currently, most web apps require connectivity (either Wi-Fi or cellular) for use, although this could change to the extent that web apps begin to use HTML5, Google Gears, or Widgets to cache and deliver content offline.
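
To make the HTML5 possibility concrete, here is a minimal sketch (in TypeScript) of how a web app might use the HTML5 application cache to keep working offline. The manifest file name (app.manifest) and the log message are hypothetical; the sketch assumes the page is served with a manifest attribute on its html element:

```typescript
// Minimal sketch: reacting to HTML5 application cache events so a
// mobile web app can serve cached content without connectivity.
// Assumes the page is served with <html manifest="app.manifest">,
// where app.manifest lists the resources to store locally.

// Cast to any: the applicationCache API is not present in every
// browser (or in all DOM typings), so we feature-detect first.
const cache = (window as any).applicationCache;

if (cache) {
  // "updateready" fires after the browser has downloaded a newer
  // copy of every resource listed in the manifest.
  cache.addEventListener("updateready", () => {
    if (cache.status === cache.UPDATEREADY) {
      cache.swapCache();        // switch to the freshly downloaded cache
      window.location.reload(); // reload so the new version takes effect
    }
  });
} else {
  // No offline caching available: the app still needs a live connection.
  console.log("Offline caching unsupported; connectivity required.");
}
```

The companion app.manifest file is just a text file beginning with the line CACHE MANIFEST, followed by the URLs of the pages, scripts, and stylesheets to cache; Google Gears and widget runtimes offer analogous local-storage mechanisms.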

So which is the better app approach? Sorry to disappoint, but the answer is: it depends. There are advantages and disadvantages to both, summarized in the following table:

[Table: advantages and disadvantages of native vs. mobile web apps]

Whether a native or web app strategy is pursued can depend upon many factors including some of the following:
  • Support environment - Does the organization have the IT infrastructure to support applications installed on mobile (remote) devices that may not be easily retrievable (e.g., those belonging to physicians who practice at the hospital only one day per week)?
  • Device ownership and management - Are devices provided to end users by the organization, or do users provide their own? Is there a narrow, tightly managed list of supported devices, or is it a take-all-comers approach?
  • Version control - Does the organization require lengthy internal certification before accepting newer versions of apps?
  • The number of users - Does the organization need to support deployment and administration for a select number of users or for many hundreds of users (e.g., an entire staff of physicians)?
  • Connectivity - How widely available and stable is Wi-Fi and/or cellular connectivity (e.g., hospitals are historically inconsistent with respect to Wi-Fi and cellular access)?

Ultimately, the ideal position for both a vendor and a customer/end user is to be able to offer and use either or both application options.

Monday, June 7, 2010

You Say Physicians Don't Like Technology?

Many in the healthcare IT industry generally believe that physicians don't like technology. They cite years of research showing that physicians do not adopt the technology that is ostensibly purchased for them. The research, it turns out, is true - fewer than 10% of hospitals have achieved significant physician adoption of Computerized Physician Order Entry (CPOE), and fewer than 5% have achieved adoption of electronic documentation. Most of the "successful" adoption comes from Academic Medical Centers, which employ physicians and residents and can therefore control system use to a greater extent. But 90% of US hospitals are community hospitals with largely voluntary staffs composed of independent practitioners - physicians who can and do practice elsewhere, including multiple offices and even other hospitals. I would argue these community hospitals are the true test bed for physician adoption of IT.

The notion that physicians don't use technology simply because they don't like it is incorrect. Healthcare is replete with examples of physicians incorporating ground-breaking technologies of all kinds into their practice of medicine. From medical devices like implantable defibrillators to the most sophisticated imaging technology, physicians have shown a willingness, indeed a penchant, for adopting technology. These examples are not limited to technologies that involve direct patient care - witness their adoption of smartphones. The share of physicians using smartphones surged to 64% in 2009 and is projected to reach 81% by 2012. This adoption rate outpaces that of consumers, among whom 65% are expected to own a smartphone by 2012.

"Usability" is often cited as the main culprit behind meager physician technology adoption statistics. Calls for improved user interfaces and screen layouts often lead to attempts to weave these constructs into EHR certification criteria, for example. Indeed, much of the recent talk in the industry is about making usability a requirement of Meaningful Use certification. As with prior attempts to legislate usability, however, these efforts are largely doomed to fail, because the color, size, and location of a button on a screen are not the primary culprits behind historically poor physician adoption.

The primary reasons for poor adoption have more to do with utility than usability. Simply put, if the technology is of no real benefit (or worse, a detriment) to physicians and their practice of medicine, they will not use it. CPOE is the poster child for this challenge. Since before the Institute of Medicine's landmark 1999 report "To Err Is Human," the industry has tapped technology, namely CPOE, as the keystone for reducing medical errors. Despite broad agreement that this is a chief benefit of CPOE, physicians have shown no real inclination to use these systems. Do physicians not believe in reducing or avoiding medical errors? Of course not. Instead, physicians struggle with systems that do not support their logical workflow and that require them to provide information and respond to alerts better suited to other clinicians, such as nurses, pharmacists, and radiologists. These systems add time to physicians' already busy schedules - time they cannot spare (a 10% reduction in physician productivity results in a 20% reduction in revenue).

Ask yourself: would you use something that provided no direct benefit to your daily work, or worse, provided no benefit AND took more of your time? What if that "something" wasn't even designed for your use - would you use it then? That's essentially what we are asking physicians to do: use technology that wasn't designed for their benefit but that we feel is worthwhile nonetheless.

You say physicians don't like technology? I say they don't like technology that does not benefit their practice of medicine.