Clinicians in California will use an Apple Watch system that lets them dictate their notes after a visit and have the transcription, plus relevant health data, automatically appended to a patient's medical records.
A new service for doctors from Altais has them wear Apple Watches during patient visits and use the technology to drastically cut the time needed to write up their notes. The system, a platform developed in conjunction with Notable Health and Blue Shield of California, uses machine learning to automate parts of the process.
"Our goal is to help physicians seamlessly leverage technology to improve the health and well-being of their patients," said Jeff Bailet, M.D., president and CEO of Altais, "all while reducing administrative hassles and enhancing their professional gratification."
Instead of typing notes and entering details into a patient's Electronic Health Record (EHR), a doctor will be able to dictate them into his or her Apple Watch.
This can be done during or after the visit; once the notes are entered into the Apple Watch system, natural language processing will determine the key points, and the most relevant data will then be added to the EHR automatically.
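Neither Altais nor Notable Health has published an API for this pipeline, but the dictate-extract-append flow described above can be sketched in a few lines. Everything here is a hypothetical illustration: the record structure, the function names, and especially the keyword-matching "extraction" (a production system would use a trained clinical NLP model, not a term list).

```python
# Hypothetical sketch of the dictation -> NLP -> EHR pipeline described in the
# article. All names and structures are illustrative assumptions, not the
# actual Altais/Notable Health system.

import re
from dataclasses import dataclass, field

@dataclass
class EHRRecord:
    """A toy stand-in for a patient's electronic health record."""
    patient_id: str
    notes: list = field(default_factory=list)

# Toy stand-in for clinical NLP: match dictation against a small term list.
KEY_TERMS = {"hypertension", "diabetes", "pneumonia", "follow-up"}

def extract_key_points(transcript: str) -> list:
    """Return the clinically relevant terms found in a dictated transcript."""
    words = set(re.findall(r"[a-z\-]+", transcript.lower()))
    return sorted(words & KEY_TERMS)

def append_dictation(record: EHRRecord, transcript: str) -> None:
    """Attach the raw transcript plus its extracted key points to the record."""
    record.notes.append({
        "transcript": transcript,
        "key_points": extract_key_points(transcript),
    })

record = EHRRecord(patient_id="demo-001")
append_dictation(record, "Patient reports stable hypertension; schedule follow-up in 3 months.")
print(record.notes[0]["key_points"])  # -> ['follow-up', 'hypertension']
```

The point of the sketch is the separation of concerns: the raw transcript is preserved verbatim while a separate extraction step decides what gets surfaced in the structured portion of the record.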
"As a general internist taking primary care of an elderly population with multiple complex illnesses," says Richard Thorp, MD, president and CEO of Paradise Medical Group, "I will now have a maximally efficient workflow, streamlined data entry, and patient input pre-built into each of my patient encounters, and that is extremely exciting."
As well as cutting down admin time for doctors and making sure data is captured accurately, the service is intended to directly assist patients, too.
A Blue Shield app will let patients in the region whose doctors are on the program receive appointment reminders, check their insurance, and run self-assessment health surveys.
The Apple Watch is increasingly being used by health professionals, though it is chiefly assisting researchers looking into hearing, reproductive and general health.
Comments
I record health visits with my doctor.
How long until it turns out that snippets of audio are being reviewed for quality by an off-shore third party?
As a physician working at a hospital system that uses the “Cadillac” of electronic medical records (EMR) software, I can tell you that the machine learning and “automation” have a long, long way to go.
The “automation” part of the machine learning basically looks for key words in my documentation, as well as at lab results and diagnoses that have been entered by humans, to try to match patterns and suggest additional medical issues. The suggestions are wrong probably 70 percent of the time, and have already been documented 25 percent of the time (but the software doesn’t recognize this).
The few times that the suggestions are “relevant,” I’m getting asked to clarify something like which bacteria is responsible for causing a pneumonia. How the f*%k should I know? Most of the time respiratory cultures don’t grow the culprit bacteria; if it DOES grow, it usually takes a few days before there is enough to identify, and we document that anyway.
I’ve found that the AI portion of this hundreds-of-millions-of-dollars EMR is more of a distraction than anything else. I don’t know any physician who doesn’t simply ignore it.
On a related topic, I would point out that the documentation a physician dictates must meet an EXTREMELY complex set of criteria, covering tons of nonsense bullet points that change based on the patient and various diagnoses. Dictating an entire note as a single recording into a watch would bypass all the advantages that the crazy expensive EMR software grants you, such as automating certain parts of the note (insertion of lab values, test results, etc.). The article also doesn’t say whether it’s (A) Siri transcribing (which works extremely poorly when medical terminology is involved), (B) custom dictation software such as Dragon Medical, or (C) a human transcription service typing this up.
Lastly, to those of you who think computers are “good” at reading studies such as X-rays, EKGs, etc., I can assure you that while this may be true in science fiction movies, these things suck in real life. The automated EKG interpretation algorithms have been progressing for decades, but still suck. They get some things right, but get just as much wrong. We aren’t going to see these tools perform nearly as well as humans anytime in the foreseeable future.