Dr. Steven Horng launched a Google Glass pilot program at Beth Israel Deaconess Medical Center late last year because he thought the futuristic device could help save lives. One night in January proved that.
A patient with bleeding in the brain told Horng he was allergic to certain blood pressure drugs — which the doctor needed to slow the hemorrhage — but didn’t know which ones. Horng had little time to leaf through the man’s medical files or search for records on a computer, but with Google Glass, he didn’t have to. Instead he quickly called up the patient’s information on the device’s tiny screen and saved his life with the correct medication.
This week, Beth Israel Deaconess is expanding the use of Google Glass to its entire emergency department, and the hospital said it is the first in the United States to employ the device for everyday medical care. Now, whenever ER doctors begin their shifts, they will slip on pairs of the high-tech glasses as routinely as they put on scrubs.
“We’re doing this to prove that the technology can work and really motivate others to explore this space with us,” said Horng, who helped pioneer the use of Google Glass at the hospital.
Google Glass — called simply “Glass” by the Internet giant and users of the device — is the company’s foray into wearable computing, which some technologists believe is the next frontier in mobile devices. The eyeglass frame has a small clear screen over one eye that can display nearly anything a smartphone can, from email to web pages. Specially programmed Glass devices, such as the ones at Beth Israel, can access information for specific purposes, such as medical records, visible only to the wearer.
In addition to its small screen, Google Glass features a high-resolution camera that can capture videos and still images. A thin touchpad on the right earpiece lets the user control the device much as a laptop touchpad does, though Glass also responds to voice commands and head movements.
It is not yet commercially available, and Google has not announced a public release date. But the company has made prototype versions available to hand-picked “explorers” — mostly software developers willing to shell out $1,500 apiece — to identify potential commercial uses.
One such use may well be in the medical community, which has been quick to embrace Glass. Hospitals and doctors across the US have been testing it to conduct remote consultations with specialists, record live video during surgery for teaching, or provide quick access to patients’ charts, vital signs, doctors’ notes, and other medical information, all without having to use their hands to operate a computer.
“And not only is it hands free, it’s always on, always in front of you and always giving you information,” said Horng, adding that he frequently uses Glass to retrieve information that comes up during conversations with patients.
“Rather than having to excuse myself, it means I can quickly access that information without having to interrupt the patient, lose eye contact, or even leave the room,” said Horng, who also holds degrees in computer science and biomedical informatics.
Getting used to Glass isn’t easy.
The screen sits just below the top frame of a pair of glasses, so that it is unobtrusive when the wearer looks straight ahead but snaps into focus when the eyes glance up and to the right. Horng said he spent four months perfecting the display so that it is easy to read. It shows only a few lines of text at a time, and doctors can scroll through additional information by tilting their heads.
Beth Israel doctors get access to their patients’ records because Google Glass can read what are known as Quick Response codes — or QR codes — which are much like the bar codes found on the sides of cereal boxes and other commercial products.
Beth Israel has begun posting QR codes on the doorways to patients’ rooms. Each code is unique to a patient and linked to that patient’s records in the hospital’s electronic database. Before entering the room, a Beth Israel doctor can scan the QR code with Glass, and the patient’s information promptly appears on the screen.
The Glass devices used at Beth Israel were modified to read the QR codes by Wearable Intelligence, a San Francisco startup. Patient records are not shared with Google, according to hospital officials.
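For readers curious how such a lookup works in principle, the flow is simple: the doorway QR code encodes only an opaque per-patient identifier, which the Glass software resolves against the hospital’s own database. Below is a minimal sketch of that idea; the names (`PATIENT_RECORDS`, `lookup_patient`, the `BIDMC-0042` identifier) are hypothetical, as Wearable Intelligence’s actual integration is not public.

```python
# Minimal sketch of a QR-code-to-record lookup, as described above.
# All identifiers here are illustrative stand-ins, not the real system.

# In-memory stand-in for the hospital's electronic records database.
PATIENT_RECORDS = {
    "BIDMC-0042": {
        "name": "J. Doe",
        "allergies": ["lisinopril"],
        "notes": "Hypertension; monitor closely.",
    },
}

def lookup_patient(qr_payload: str) -> dict:
    """Resolve a scanned QR payload (an opaque patient ID) to a record.

    The QR code on the doorway encodes only the identifier, not the
    record itself, so no medical data is stored in the code.
    """
    record = PATIENT_RECORDS.get(qr_payload)
    if record is None:
        raise KeyError(f"No record for QR payload {qr_payload!r}")
    return record

record = lookup_patient("BIDMC-0042")
```

Because the code carries only an identifier, the sensitive data stays inside the hospital’s database, consistent with the hospital’s statement that patient records are not shared with Google.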
Other hospitals are also experimenting with Google Glass. In February, surgeons at the Indiana University Health Methodist Hospital in Indianapolis used the device during an operation to remove an abdominal tumor. Voice commands enabled the doctors to call up the patient’s MRI scans and keep the images in their field of vision for easy reference, without having to put down a scalpel.
At the University of California-Irvine Medical Center, experienced doctors have been able to monitor procedures performed by resident physicians who are wearing Glass devices that stream video live to their mentors.
Later this month, 13 doctors from around the US will gather at Google’s Cambridge office to pitch clinical uses for Glass in a contest sponsored by the Presidential Innovation Fellows program, the Massachusetts Institute of Technology, and the website MedTech Boston.
The common goal of many Glass projects is to keep health care workers’ hands free to perform their jobs.
“It’s literally the holy grail of hospital IT,” said John Rodley, cofounder of Twiage, a Cambridge startup developing a Glass application that lets ambulance attendants quickly relay patient information to hospital emergency rooms. An EMT wearing Glass could use voice commands to snap photos of a patient’s injuries, dictate notes, and send them to the nearest ER.
Beth Israel and Wearable Intelligence plan to expand the hospital’s use of Google Glass in the near future. One possibility is letting doctors consult with one another remotely.
For example, at Rhode Island Hospital in Providence, emergency room doctors are using the device to stream video of patients who arrive with burns or rashes to dermatologists, who can help direct the course of treatment.
“It’s essentially a live teleconference from your point of view,” said Wearable Intelligence chief executive Yan-David Erlich. “You can call a physician in another wing of the hospital who can see what you see and guide you by talking just to you.”