Electronic [Medical, Patient, Health] Records (2006)
Workshop by Richard Rathe, AMIA Meeting, November 07, 2006
Original Slides Abridged, Annotated, & Converted to HTML in 2025
Intro (Bad News & Good News)

Learn to think like an Informatician…
What's Left Out?
- Communication
  - May be the key function for clinicians!
- Social Construct/Social Contract
  - Users work within a social context.
- Process
  - Does not address quality or optimization.
  - Computers can easily make things worse!
An Example
Problem: In an ambulatory clinic, new lab results and follow-up such as repeat mammograms are falling through the cracks.
Solution: An automated alert system?
Right?
Not so fast…
The Cedars-Sinai Experience
…the biggest complaint — with potentially dangerous implications — involved the automatic alerts that flashed on the screen every time a doctor made an out-of-the-ordinary request. Designed to catch errors before they occur, the alerts became an unending series of questions, reminders and requests on fairly basic decisions.
A Simpler Solution?
Prior to the 2000s, many primary care practices kept a simple accordion file, divided by month. As lab and imaging results came in, a copy was filed in the appropriate slot as a reminder. For example, in January you receive a mammogram result recommending follow-up at six months. You or your staff would file the report (or a copy) in the July slot. At the beginning of each month someone would pull out the current batch and take appropriate action. Nothing was lost. Nothing was overlooked.

Bottom Line: Any automated reminder system must either a) be as simple to use as an accordion file or b) add significant value for the folks who have to use it.
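The accordion file is essentially a "tickler" data structure: twelve slots, one per month, with a file-ahead operation and a pull-the-batch operation. A minimal sketch, with hypothetical names and illustrative data (not from any real EHR system):

```python
from datetime import date

class TicklerFile:
    """Month-indexed reminder file, modeled on the paper accordion file."""

    def __init__(self):
        # One slot per month, 1 (January) through 12 (December).
        self.slots = {month: [] for month in range(1, 13)}

    def file_reminder(self, received: date, months_ahead: int, note: str):
        """File a note in the slot N months after the month it arrived."""
        slot = (received.month - 1 + months_ahead) % 12 + 1
        self.slots[slot].append(note)

    def pull_batch(self, month: int):
        """At the start of each month, pull and empty that month's slot."""
        batch = self.slots[month]
        self.slots[month] = []
        return batch

tickler = TicklerFile()
# A January result recommending six-month follow-up lands in the July slot.
tickler.file_reminder(date(2006, 1, 15), 6, "Repeat mammogram, Ms. Smith")
print(tickler.pull_batch(7))  # ['Repeat mammogram, Ms. Smith']
```

The entire "system" is two operations and a dozen slots, which is the bar any automated replacement has to clear.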
What's Going On Here?
Norman's Law — A major problem occurs when those who suffer from technology's deficits and those who benefit are not the same people. (Donald Norman, Usability Expert)
Horky's Law — We generally deploy computers because we want to control something [or somebody!]. (Ralph Horky, Hospital VP)
Who suffers? Who benefits? Who's in control?
Learning from Failure
- Ambulatory Example
  - 1996 article describes a $500K mistake. Users elected to enter bogus information just to make the system happy so they could get their work done.
- Inpatient Example
  - In 2003 Cedars-Sinai Medical Center withdrew a computer-based physician order entry system because it was too disruptive and too slow for clinicians to use.
- Error Reduction
  - 2005 U Penn study found that computerization did decrease certain types of errors, but it introduced entirely new classes of errors that hadn't been anticipated.
There are two possible outcomes: If the result confirms the hypothesis, then you've made a measurement. If the result is contrary to the hypothesis, then you've made a discovery. — Enrico Fermi
Input / Data Capture
Ideal: Direct data entry by the person who is responsible for it.
Reality: Data entry is often delayed and done by a proxy.
Reality: Data entry is time consuming and clinician time is valuable.
Example: Cost of transcription transferred from a hospital to the radiologists working there.
Since this talk was given, there has been a huge cost transfer from organizations to individuals providing direct patient care.
Scope?
Collecting Everything vs. the Minimal Necessary
Example: The Continuity of Care Record

Architecture
- Distributed (from many, many)
- Federated (from many, one)
- Monolithic/Repository (THE ONE!)
Most real systems are a mixture.
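The three architectures can be contrasted by how they answer one question: "what records exist for this patient?" A minimal sketch, with hypothetical system names and illustrative data:

```python
# Distributed: many independent silos -- the asker must visit each one
# and gets back many separate answers.
lab_system      = {"pt42": ["CBC 2006-01-03"]}
imaging_system  = {"pt42": ["Mammogram 2006-01-10"]}
pharmacy_system = {"pt42": ["Tamoxifen refill"]}

def distributed_lookup(patient):
    """Caller queries each silo separately: from many sources, many views."""
    return [system.get(patient, []) for system in
            (lab_system, imaging_system, pharmacy_system)]

def federated_lookup(patient):
    """A broker queries the same silos but merges the results into a
    single view: from many sources, one answer."""
    merged = []
    for system in (lab_system, imaging_system, pharmacy_system):
        merged.extend(system.get(patient, []))
    return merged

# Monolithic/repository: everything already lives in one central store.
repository = {"pt42": ["CBC 2006-01-03", "Mammogram 2006-01-10",
                       "Tamoxifen refill"]}

def repository_lookup(patient):
    return repository.get(patient, [])
```

A federated lookup and a repository lookup can return the same merged view; the difference is where the data actually lives and who maintains it.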
Take Home Ideas
1) EHRs are not just computerized paper medical records:
- Clinical/Economic/Political Aspects
- Definition Varies by Point of View
- The more interesting aspects have less to do with record keeping and more to do with process improvement.
2) Good intentions are often overwhelmed by one or more of the big problems faced by those who implement EHRs:
- Data Capture Hurdles
- Misalignment of Work/Benefit
- Garbage In/Garbage Out
- Lack of Standards
- Unintended Consequences are Everywhere!
3) Use Grudin's Razor to see more clearly:
When those who benefit [from a technology] are not those who do the work, then the technology is likely to fail or, at least be subverted.
4) Consider Norman's Four Efficiencies:
Does a proposed innovation allow someone to:
- Do more in less time?
- Increase the diversity of what is done?
- Communicate with others?
- Transform the work process itself?
(Donald Norman, Things That Make Us Smart)
5) Should systems be modified to fit users' needs, or should users change to fit the systems' needs?
- We hope for the former.
- We generally get the latter.