Eyeware Computers Circumvent Logging In

It's standard practice to auto-minimize or lock electronic health records (EHRs) after a few minutes of inactivity. The rationale is sound: people forget to lock the computer when they walk away, so the computer needs to auto-lock to prevent unauthorized access. Unfortunately, this practice is extremely frustrating for users. Unlocking and logging back into the EHR 20 or 30 times daily gets old fast. I've spoken with dozens of doctors who enter three sets of usernames and passwords to access their hospital EHR. Just imagine going through three sets of credentials dozens of times daily. It's quite aggravating.

Google Glass and other eyeware computers have sensors that can detect when the user puts the glasses on and takes them off. This is a powerful concept. It means that users only need to authenticate and log in once per wear instead of once every ten minutes. Eyeware computers, by virtue of the inherent advantages of the form factor, can perform certain tasks in 10 seconds that take 30-45 seconds on traditional form factors. Let's walk through a simple scenario - looking up a patient's blood pressure - to see how the substantial time savings are possible:

Laptop: wake the laptop from sleep, log into Windows, log into Citrix (optional depending on the hospital), log into the EHR, find the patient, find the blood pressure.

Smartphone/tablet: wake the device from sleep, log into Windows/iOS, log into the EHR, find the patient, find the blood pressure.

Glass: nod head to wake Glass from sleep, ask "Ok Glass, what's Mr. Smith's blood pressure?"
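To make the Glass flow concrete, here's a rough sketch of the lookup such a voice query might trigger behind the scenes. It assumes a hypothetical FHIR-style server and patient ID (the URL and identifiers are made up); the LOINC codes shown are the ones commonly used for a blood pressure panel, but the actual integration would depend entirely on the hospital's EHR.

```python
import requests

# Hypothetical FHIR base URL and patient ID -- illustrative only.
FHIR_BASE = "https://ehr.example.org/fhir"
PATIENT_ID = "12345"

def latest_blood_pressure(patient_id):
    """Fetch the most recent blood pressure Observation for a patient.

    Assumes a FHIR R4-style server; 85354-9 is the LOINC code commonly used
    for a blood pressure panel, with systolic (8480-6) and diastolic (8462-4)
    components.
    """
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "code": "85354-9",
                "_sort": "-date", "_count": 1},
        timeout=5,
    )
    resp.raise_for_status()
    entries = resp.json().get("entry", [])
    if not entries:
        return None

    readings = {}
    for comp in entries[0]["resource"].get("component", []):
        code = comp["code"]["coding"][0]["code"]
        value = comp["valueQuantity"]["value"]
        if code == "8480-6":
            readings["systolic"] = value
        elif code == "8462-4":
            readings["diastolic"] = value
    return readings

# What the Glass voice trigger would ultimately speak or display:
bp = latest_blood_pressure(PATIENT_ID)
if bp:
    print(f"Blood pressure: {bp['systolic']:.0f}/{bp['diastolic']:.0f} mmHg")
```

The speech recognition, patient matching, and card rendering would all happen in the Glass app itself; the point is simply that the whole interaction reduces to a single hands-free utterance.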

Glass doesn't replace traditional EHR form factors such as laptops, tablets, and desktops. It simply doesn't have the screen real estate to act as a complete form factor replacement. But as a complementary form factor, Glass excels at streamlining brief EHR interactions that are cumbersome in traditional form factors.

Convention or Configuration in Healthcare?

This post originally appeared on HIStalk

One of the classical debates in computer science is convention versus configuration. This debate manifests itself in programming language design, product design, marketing, and in what hardware and software can and cannot do and how they do it.

In the early eras of computing, configuration reigned supreme. The first computers had no security, no encryption, no rules, and very little if any software infrastructure to sit on top of. The early computers were in many ways a clean slate, akin to John Locke’s tabula rasa. Software and hardware were completely configurable. Before the Apple II, the only people who could acquire a personal computer were those who knew where to buy all of the components and how to assemble them. Assembly oftentimes required soldering. Talk about high barriers to entry.

Apple has risen to become the world’s most profitable company because it embraces convention and eschews configuration wherever possible. Apple pioneered the first personal computer hardware that people couldn’t physically open or configure. Just as importantly, Apple makes software design decisions for its customers so that they don’t have to. It turns out that people want computers that "just work." They don’t want to deal with hundreds of settings and options.
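The same trade-off shows up directly in software interfaces. As a rough, hypothetical illustration (not any vendor's actual API), a convention-oriented design makes the common case a one-liner and treats overrides as the exception:

```python
# Convention over configuration: sensible defaults cover the common case,
# and only unusual deployments need to override anything.
DEFAULTS = {
    "session_timeout_minutes": 10,
    "date_format": "YYYY-MM-DD",
    "audit_logging": True,
}

def create_ehr_session(user, **overrides):
    """Start a session using conventions; overrides are the exception."""
    settings = {**DEFAULTS, **overrides}
    return {"user": user, "settings": settings}

# The common case: no decisions required of the user.
session = create_ehr_session("dr_smith")

# The rare case: an explicit, deliberate deviation.
kiosk_session = create_ehr_session("triage_kiosk", session_timeout_minutes=2)
```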

So how has healthcare IT fared in the continuous march towards convention over configuration? It’s been a mixed bag.

In many ways, healthcare IT has always been very conventional. Follow the rules, fill in the boxes, and get paid per Uncle Sam’s protocol. If you deviate slightly, you don’t get paid. Vendors reflect their customers, who in turn reflect the reimbursement system.

In other ways, healthcare technology has become far more configurable. Many EHRs boast over 500 or even 1,000 templates to expedite data entry. There’s no need for that many templates, but their existence reflects the underlying reality that most doctors would, in their ideal world, document according to their own standards.

Epic takes a lot of heat for not being configurable enough. Many say that’s not true. Most of Epic’s decisions that limit configurability are probably made in light of the fact that hospital EHR vendors are service companies, not software companies. They know that once you open the Pandora’s box of customization, it never ends. Epic focuses on its proven (and extremely difficult) training and deployment process. Cerner accommodates more configurability but has a poorer deployment record.

Which way should healthcare IT trend? Should vendors make more decisions on behalf of their clients? Because of their access to and insight into other organizations, vendors may be more qualified to make many of these decisions, given how crucial IT is to most clinical functions today. Or not. Leave a comment with your thoughts.

EHRs Propagate "Best" Practices

This post originally appeared on HIStalk

Earlier today I had a discussion with a critical access hospital. I discovered that every doctor at the hospital had developed his or her own insulin sliding scale. The sliding scale paper forms were actually titled "Smith’s Sliding Scale" and "Johnson’s Sliding Scale" (names changed to maintain anonymity). All on-staff physicians completed medical school within 10 years of one another, but none of them could agree on a standard sliding scale.

The hospital deployment team encouraged the hospital to standardize. So we Googled insulin sliding scales. We found that virtually no one on the Internet agrees on what the standard insulin sliding scale should be. We did find a sliding scale on CMS’s website, but there was so much other (mis)information on the web that I could hardly consider CMS’s sliding scale "official."

We herded sheep and got all the doctors in a room together to agree upon a standard sliding scale for the hospital. I was curious to learn why they preferred different sliding scales, and what evidence each had to support their claims. Not a single doctor tried to claim their sliding scale was a best practice, or that it was even backed up by any clinical evidence. Each doctor simply stated that they had been taught their respective sliding scales long ago and that everyone else’s was inferior to their own.
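For readers outside clinical settings, a sliding scale is essentially a lookup table from a blood glucose range to an insulin dose, which is exactly why the disagreement was so striking: as data, it's trivial to represent and, once agreed upon, trivial for the EHR to enforce. The numbers below are purely illustrative placeholders, not clinical guidance and not the scale this hospital adopted.

```python
# A sliding scale is just a lookup table: glucose range (mg/dL) -> units of insulin.
# Values below are illustrative placeholders only, NOT clinical guidance.
SLIDING_SCALE = [
    (0,   150, 0),   # below 150: no coverage
    (150, 200, 2),
    (200, 250, 4),
    (250, 300, 6),
    (300, 400, 8),
]

def coverage_dose(glucose_mg_dl):
    """Return the insulin dose (units) for a glucose reading, per the table."""
    for low, high, units in SLIDING_SCALE:
        if low <= glucose_mg_dl < high:
            return units
    return None  # out of range: escalate to the physician rather than dose automatically

print(coverage_dose(220))  # -> 4
```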

Everyone in the industry knows that EHR deployments bring about major workflow changes. Depending on the hospital’s previous policies and the flexibility of the EHR being installed, these changes can be dramatic. One of the most frequent change management challenges I’ve witnessed has been standardization where it didn’t previously exist. EHRs can enforce organization-wide standardization where it was previously impossible to enforce. Before, physicians could say, "I don’t care about the other doctors, my template / order / sliding scale / frequency / security / whatever needs to be different." Now the answer is, "The system doesn’t support that, so you need to come to an agreement with the others." It’s pretty hard for doctors to argue against that. In response, many doctors have quit or retired rather than conform.

EHRs present opportunities to standardize care within and across healthcare organizations and providers. This is not necessarily a good thing. We need to continually experiment to figure out what the best solution really is, because most of the time, we don’t actually know. It takes 20 years to standardize medical best practices because, in medicine, we need to experiment and collect huge amounts of data before we can be certain. How can we expedite that process?

We’ll experiment, collect data, and analyze it in every conceivable way. Every day there are hundreds of thousands of clinical encounters that each generate dozens of data points. With enough data and the right tools, we will correlate every data point. Through raw statistical analysis, we’ll be able to predict diagnoses and outcomes with high levels of confidence. We’ll suggest diagnoses with confidence intervals to doctors as they enter data into EHRs.
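As a toy sketch of what that raw statistical analysis could look like, the snippet below estimates a probability distribution over diagnoses from co-occurrence counts of findings and diagnoses, naive-Bayes style. The encounters, findings, and diagnoses are invented for illustration; a real system would need vastly more data, rigorous validation, and properly calibrated confidence intervals.

```python
from collections import Counter, defaultdict

# Invented encounter data for illustration: (findings, diagnosis) pairs.
encounters = [
    ({"fever", "cough"}, "pneumonia"),
    ({"fever", "cough"}, "bronchitis"),
    ({"fever", "cough", "hypoxia"}, "pneumonia"),
    ({"cough"}, "bronchitis"),
    ({"fever", "hypoxia"}, "pneumonia"),
]

diagnosis_counts = Counter(dx for _, dx in encounters)
finding_counts = defaultdict(Counter)  # finding_counts[dx][finding] -> count
for findings, dx in encounters:
    for f in findings:
        finding_counts[dx][f] += 1

def diagnosis_distribution(findings):
    """Naive-Bayes style score: P(dx) * prod P(finding | dx), then normalize."""
    scores = {}
    total = sum(diagnosis_counts.values())
    for dx, dx_count in diagnosis_counts.items():
        score = dx_count / total
        for f in findings:
            # Laplace smoothing so an unseen finding doesn't zero out a diagnosis.
            score *= (finding_counts[dx][f] + 1) / (dx_count + 2)
        scores[dx] = score
    norm = sum(scores.values())
    return {dx: s / norm for dx, s in scores.items()}

print(diagnosis_distribution({"fever", "cough"}))
```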

When every doctor is presented with a probability distribution of diagnoses backed by billions of data points, they will usually agree with the computer. In time, opinions and practices may slowly converge.

Context is King in Eyeware Computing

In the 1960s, there was a single computing context: large projects run by large companies and governments with lots of money and engineers. Today, billions of people compute in virtually every context of their lives (drinking, driving, watching TV, reading, etc.). As computing form factors have shrunk, human-computer friction has decreased and the range of contexts has expanded; the two are inversely correlated.

About 60 seconds into this video from Google I/O 2013, Robert Scoble talks about the power of context with regard to Google Glass. He's right. Context defines eyeware computing.

Google issued four user interface guidelines to Glass developers:

1. Design for Glass

2. Don't get in the way

3. Keep it timely

4. Avoid the unexpected

The first one is rather generic, but the final three guidelines collectively assert that apps need to be contextual. Contextually relevant information does not get in the way, doesn't show up at the wrong time, and doesn't show up when it's not expected or desired. Context is king.

Smartphones are very contextual devices. They know who you are, where you are, what you like, where you need to go, and more. Glass knows all of that too. But in addition to all of that, Glass provides one incredible new context: what you see. This is profound. Glass will eventually be able to pull virtually any information that you need without any user intervention based on what you're looking at. Scoble is right: Google needs to develop a contextual OS with triggers for all kinds of contexts - such as mode of transportation, entering a room, seeing a person, etc - that developers can plug into. The number of contextually-driven triggers is endless. In healthcare, for example, Glass could:

1. display DNR status to clinicians if a patient codes

2. beep if the active chart in the EHR is not the patient that the doctor is looking at

3. automatically display new information that's available for the patient since the last encounter
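Here's a minimal sketch of what that kind of contextual plug-in model might look like: apps register handlers for named context events, and the platform fires them when its sensors infer the context. The event names and payload fields are assumptions for illustration, not an actual Glass or Android API.

```python
from collections import defaultdict

# A minimal context-trigger registry: apps subscribe to context events,
# and the "contextual OS" fires handlers when a context is detected.
_handlers = defaultdict(list)

def on_context(event_name):
    """Decorator: register a handler for a named context event."""
    def register(fn):
        _handlers[event_name].append(fn)
        return fn
    return register

def context_detected(event_name, **payload):
    """Called by the platform when sensors infer a context change."""
    for handler in _handlers[event_name]:
        handler(**payload)

@on_context("patient_coding")
def show_dnr_status(patient_id, **_):
    print(f"[card] DNR status for patient {patient_id}: ...")

@on_context("patient_in_view")
def check_active_chart(patient_id, active_chart_id, **_):
    if patient_id != active_chart_id:
        print("[beep] Active EHR chart does not match the patient in view")

# Simulated platform events:
context_detected("patient_coding", patient_id="p1")
context_detected("patient_in_view", patient_id="p1", active_chart_id="p2")
```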

Please comment and list contexts where you think Glass and other eyeware computers can automatically provide guidance.

In Defense of Email, Subjects, and Threads

Everyone hates on email. Facebook is trying to kill email by making Facebook chat - a social stream - the primary form of digital communication across all computing devices. They're trying to "reduce complexity" by removing the "clutter". Facebook is adapting the key tenet of its flagship product - the News Feed stream - to other communication channels.

I'm going to defend email. Facebook's social stream in no way reflects human thinking and processes, especially when people are collaborating across multiple projects. Facebook-style streams are useful for fast and immediate communication, but they're terrible at organizing and managing complex information. The more complex and varied the nature of the information, the less effective Facebook is at managing it.

Google's Matias Duarte, Director of User Experience for Android, has recently outlined a company-wide effort to present data to users in cards (Google Now and Google Glass already implement cards). Per Duarte, cards "make very clear the atomic unity of things; they're still flexible while creating a kind of regularity." He is absolutely correct. The human brain perceives and organizes information into "units." Some units are big, others are small. They are all atomic - singular and immediately understandable - in nature:

Yuo cna reed tihs becaseu the rbain recognizes langaueg in antomci "wdor" untis

Conversations that include thousands of words are remembered as a single conversational unit

Emails are organized into threads by subject
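Duarte's point can be made concrete as a data structure: whatever the source - a long conversation, an email thread - it collapses into the same small, self-contained unit. The field names below are assumptions for this sketch, not Google's actual card schema.

```python
from dataclasses import dataclass

# An illustrative "card": one atomic, self-contained unit of information.
@dataclass
class Card:
    title: str
    summary: str
    source: str

def card_from_email_thread(subject, messages):
    """Collapse an entire thread into one unit, keyed by its subject."""
    return Card(title=subject, summary=f"{len(messages)} messages", source="email")

def card_from_conversation(participants, word_count):
    """A long conversation is still remembered -- and shown -- as one unit."""
    return Card(title=" & ".join(participants),
                summary=f"conversation, ~{word_count} words", source="chat")

print(card_from_email_thread("Q3 budget", ["msg1", "msg2", "msg3"]))
```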

Email more accurately models how people think about and manage the disparate information in their lives, if used correctly (please don't respond to my "thank you" email with a totally random request). In our digital age of hyper-everything, email overflow is a real problem, but the basic structure of email as a way to organize and think about information is sound. I'm not suggesting that email is perfect. It's not. Email has many problems, but it's far better than social streams that don't model how humans manage and think about different pieces of information.

Most of email's problems are not inherent to the medium itself, but in how it's used. Perhaps the biggest problem with email is that it's abused for every kind of communication - office jokes, to-do lists, email blasts, information feeds, project updates, calendar invites, and more. There are specific applications designed to accommodate each of these scenarios, but most people don't know or don't care to use different applications in different contexts. Unfortunately, everyone just falls back on email because it's simple, free, and they already know how to use it.

Perhaps the ideal email client of the future is the one that recognizes all of these distinct use cases, and automatically sorts email accordingly.
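As a sketch of what that sorting might look like, here's a handful of hand-written rules; the categories and keywords are assumptions for illustration, and a real client would presumably learn them from user behavior rather than hard-code them.

```python
import re

# Illustrative rules only: a real client would learn categories from behavior.
RULES = [
    ("calendar",   r"\b(invite|meeting|calendar)\b"),
    ("to-do",      r"\b(action required|please review|due by)\b"),
    ("newsletter", r"\b(unsubscribe|weekly digest)\b"),
    ("project",    r"\b(status update|milestone|sprint)\b"),
]

def categorize(subject, body):
    """Return the first matching category, or 'conversation' as the default."""
    text = f"{subject} {body}".lower()
    for category, pattern in RULES:
        if re.search(pattern, text):
            return category
    return "conversation"

print(categorize("Weekly digest: health IT news", "Click unsubscribe to stop"))
# -> 'newsletter'
```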