UCLA AI Scribe Study 2026: What Physicians Need to Know
Updated January 2026
A landmark UCLA Health study published in the New England Journal of Medicine AI examined the real-world effectiveness of AI scribe technology across 238 physicians, 14 specialties, and 72,000 patient encounters. The results provide critical evidence for physicians considering AI documentation tools.
Key Finding: Nabla AI scribe users reduced their documentation time by approximately 41 seconds per note (nearly 10%), with physicians reporting potential benefits for burnout and work-related stress.
This guide breaks down the study findings, explains what they mean for practicing physicians, and provides practical guidance on implementing AI scribe technology based on evidence.
Study Overview: What UCLA Examined
Research Design
Scope:
- 238 physicians across 14 specialties
- 72,000 patient encounters
- 6-month study period
- Two AI scribe tools: Microsoft DAX and Nabla
- Control group using usual documentation methods
Specialties Included:
- Primary Care (Family Medicine, Internal Medicine)
- Surgical Specialties
- Medical Specialties (Cardiology, Pulmonology, etc.)
- Pediatrics
- Emergency Medicine
- Psychiatry
- And 8 other specialties
Outcomes Measured:
- Documentation time per note
- Physician burnout scores
- Work-related stress levels
- Documentation accuracy and completeness
- Patient safety events
Key Findings at a Glance
- ✅ Time Savings: Nabla users saved ~41 seconds per note (10% reduction)
- ✅ Burnout Reduction: Potential benefits for burnout and work-related stress
- ✅ Multi-Specialty Effectiveness: Benefits observed across 14 different specialties
- ⚠️ Accuracy Concerns: Clinically significant inaccuracies noted "occasionally"
- ⚠️ Safety Event: One mild patient safety event reported
- ✅ Provider Satisfaction: Physicians reported overall positive experience
Detailed Results: What the Numbers Show
Documentation Time Reduction
Nabla AI Scribe Results:
- Time saved: 41 seconds per note on average (a 9.1% reduction in documentation time)
Cumulative Daily Impact: For a physician seeing 20 patients per day:
- Time saved: 41 seconds × 20 patients = 13.7 minutes daily
- Weekly savings: ~1 hour 8 minutes per week
- Annual savings: ~60 hours per year
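The daily, weekly, and annual figures above follow from simple multiplication; here is a minimal Python sketch, assuming 20 patients/day, 5 clinic days/week, and 250 clinic days/year (illustrative scheduling values, not parameters from the study):

```python
# Cumulative time savings from the study's per-note figure.
SECONDS_SAVED_PER_NOTE = 41  # Nabla users, per the UCLA study

def daily_minutes_saved(patients_per_day: int) -> float:
    """Minutes saved per clinic day at 41 seconds saved per note."""
    return SECONDS_SAVED_PER_NOTE * patients_per_day / 60

daily = daily_minutes_saved(20)       # ~13.7 minutes/day
weekly = daily * 5                    # ~68 minutes (~1 h 8 min) per week
annual_hours = daily * 250 / 60       # ~57 hours (the article rounds to ~60)
print(f"{daily:.1f} min/day, {weekly:.0f} min/week, {annual_hours:.0f} h/year")
```

Changing `patients_per_day` shows why the benefit scales with visit volume: at 30 patients/day the annual savings approach 85 hours.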
Microsoft DAX Results:
- Also showed documentation time benefits
- Specific time savings varied by specialty and user
- Overall positive trend but less dramatic than Nabla
Burnout and Well-Being Impact
The study found "potential benefits for physician burnout and work-related stress" among AI scribe users. This aligns with other research:
Complementary Evidence:
- JAMA Network Open study: 31% drop in reported burnout with ambient AI scribes
- Oncology Nurse Advisor report: AI scribes may reduce documentation time and burnout
Mechanism of Burnout Reduction:
- Less after-hours documentation work
- More time with patients during encounters
- Reduced cognitive load from documentation tasks
- Increased sense of control over workday
Accuracy and Safety Findings
Critical Finding: Physicians rated clinically significant inaccuracies as appearing "occasionally" on 5-point Likert-scale assessments.
Types of Inaccuracies Identified:
- Omissions: Missing clinical information
- Pronoun errors: Incorrect attribution of symptoms or history
- Terminology mistakes: Incorrect medical terms or anatomical locations
- Context errors: Misunderstanding of clinical context
Patient Safety Event:
- One mild patient safety event reported during the study
- Event was identified and corrected through physician review
- Underscores importance of verification before signing notes
Safety Implications:
- ⚠️ Never sign AI-generated notes without thorough review
- ⚠️ Verify all diagnoses, medications, and treatment plans
- ⚠️ Check that clinical context is accurately captured
- ⚠️ Ensure all patient-reported symptoms are correctly documented
What This Means for Your Practice
Should You Adopt AI Scribe Technology?
The Evidence Supports AI Scribes If:
- ✅ You spend >2 hours daily on documentation
- ✅ You regularly document after clinic hours
- ✅ You feel documentation detracts from patient care
- ✅ You're experiencing burnout or work-related stress
- ✅ You're willing to implement structured review processes
Consider Waiting If:
- ❌ You're already efficient with documentation (<30 min/day)
- ❌ Your organization doesn't support AI tool adoption
- ❌ You're uncomfortable with technology implementation
- ❌ Budget constraints prevent investment in AI tools
- ❌ Your specialty has unique documentation that AI may not handle well
Which AI Scribe Tool Should You Choose?
The UCLA study examined Microsoft DAX and Nabla, but many other options exist in 2026:
Tools with Published Efficacy Data:
- Nabla: UCLA study showed 10% documentation time reduction
- Microsoft DAX (Nuance): Included in UCLA study, widely adopted
- Abridge: Growing body of evidence for effectiveness
Other Notable Options:
- OpenAI for Healthcare: Launched January 2026
- athenaAmbient: Launching February 2026, free for athenahealth users
- SOAPNoteAI: Purpose-built SOAP note generation with ambient listening
Selection Criteria:
- Evidence: Does it have peer-reviewed efficacy data?
- Specialty fit: Does it support your specialty's documentation needs?
- EHR integration: Does it work with your electronic health record?
- HIPAA compliance: Does it offer Business Associate Agreements (BAAs)?
- Cost: Does the pricing fit your budget?
- Support: What implementation and ongoing support is provided?
Implementing AI Scribes: Lessons from UCLA Study
Best Practices for Implementation
Based on the UCLA study findings and clinical experience:
1. Start with a Pilot Program
- Select 2-5 physicians to test AI scribe first
- Choose early adopters comfortable with technology
- Run pilot for 4-8 weeks before broader rollout
- Gather structured feedback weekly
2. Establish Review Protocols
Given the "occasional" inaccuracies found in the UCLA study:
Mandatory Review Checklist:
- Verify all diagnoses are accurate
- Check all medications (names, doses, frequencies)
- Confirm procedures and treatments are correctly documented
- Ensure patient-reported symptoms match your clinical assessment
- Verify all anatomical terms and laterality (left/right)
- Check for pronoun errors (patient vs. family member statements)
- Confirm all follow-up plans and referrals are accurate
3. Train Physicians on Effective AI Use
Conversation Techniques for Better AI Capture:
- Speak clearly and use complete sentences
- Explicitly state clinical findings ("The patient has...")
- Verbalize key negative findings
- State diagnoses and treatment plans explicitly
- Avoid ambiguous pronouns ("he said" - specify who)
Example: Good vs. Poor AI Scribe Input
Poor (Ambiguous):
"So the pain is worse at night?" "Yeah." "Okay, and it radiates?" "Sometimes."
Good (Clear for AI):
"The patient reports that his lower back pain worsens at night. The pain occasionally radiates down his left leg to the knee, consistent with L5 radiculopathy."
4. Monitor Quality Metrics
Track These Metrics:
- Documentation time per note (before vs. after)
- Edit rate (% of AI-generated text modified)
- Documentation completeness scores
- Billing coding accuracy
- Physician satisfaction surveys
- Patient safety events related to documentation
Target Benchmarks:
- Documentation time reduction: 15-30%
- Edit rate: <25% (lower is better, but some editing is expected)
- Documentation completeness: >90%
- Physician satisfaction: >80% positive
- Safety events: Zero tolerance
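These benchmarks can be encoded as a simple pass/fail check for pilot data. A sketch in Python; the metric names and the sample pilot values are illustrative assumptions, not figures from the study:

```python
# Target benchmarks from the list above, expressed as pass/fail checks.
BENCHMARKS = {
    "time_reduction_pct": lambda v: 15 <= v <= 30,  # 15-30% reduction
    "edit_rate_pct":      lambda v: v < 25,         # <25% of text edited
    "completeness_pct":   lambda v: v > 90,         # >90% complete
    "satisfaction_pct":   lambda v: v > 80,         # >80% positive
    "safety_events":      lambda v: v == 0,         # zero tolerance
}

def evaluate(metrics: dict) -> dict:
    """Return a pass/fail result for each benchmark present in `metrics`."""
    return {name: check(metrics[name])
            for name, check in BENCHMARKS.items() if name in metrics}

# Hypothetical pilot results for one practice:
pilot = {"time_reduction_pct": 18, "edit_rate_pct": 22,
         "completeness_pct": 94, "satisfaction_pct": 85, "safety_events": 0}
print(evaluate(pilot))  # every benchmark passes for this sample
```

Running the same check weekly during a pilot makes regressions (for example, a rising edit rate) visible early.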
Common Pitfalls to Avoid
Based on UCLA study and clinical experience:
1. Over-Reliance on AI
- ❌ Don't assume AI is always correct
- ✅ Always verify before signing
2. Insufficient Training
- ❌ Don't deploy without proper physician training
- ✅ Provide hands-on training and support
3. Poor Audio Quality
- ❌ Don't use AI scribe in noisy environments without optimization
- ✅ Use quality microphones and test audio capture
4. Ignoring Edit Patterns
- ❌ Don't ignore repeated AI errors
- ✅ Report systematic errors to vendor for improvement
5. Rushing Implementation
- ❌ Don't force adoption without buy-in
- ✅ Let early adopters demonstrate value first
Comparing UCLA Study to Other Research
Supporting Evidence for AI Scribes
JAMA Network Open Multicenter Study (2024-2025):
- Finding: 31% drop in reported burnout
- Finding: 30% boost in physician well-being
- Scope: Multicenter randomized controlled trial
- Conclusion: Ambient AI scribes significantly improve well-being
VA National Pilot (2025-2026):
- Scope: VA piloted ambient AI documentation in a system serving 800,000+ veterans
- Status: Nationwide rollout planned for 2026
- Significance: Largest government healthcare AI deployment
- Source: Military.com report
GeneOnline Review (2025):
- Finding: AI scribes consistently save 1-2 hours documentation time daily
- Finding: ~20% cuts in note-taking time typical
- Finding: ~30% reductions in after-hours work
- Conclusion: Technology is mature and effective
Addressing Skepticism
Common Physician Concerns:
Concern 1: "AI documentation isn't accurate enough."
Evidence: The UCLA study showed inaccuracies are "occasional" and caught through review. No serious safety events occurred in 72,000 encounters.
Concern 2: "It will take more time to edit AI notes than write my own."
Evidence: The UCLA study showed a net 10% time savings even after editing. Other studies show 20-30% savings.
Concern 3: "I don't trust AI with my patients."
Response: AI assists, but you remain in control. You review and approve everything before it becomes part of the medical record.
Concern 4: "My documentation style is too specialized."
Evidence: The UCLA study included 14 different specialties with positive results across them.
Financial Analysis: Is AI Scribe Worth the Cost?
Cost-Benefit Calculation
Typical AI Scribe Costs (2026):
- Individual subscription: $100-200/month
- Enterprise deployment: $150-300/physician/month
- Implementation costs: $500-2,000 one-time
Time Value Calculation: Using UCLA study findings (41 seconds per note):
Scenario: Physician seeing 20 patients/day
- Time saved: 13.7 minutes/day
- Annual time saved: ~60 hours
- Value of time (at $150/hour): $9,000/year
- Annual tool cost: $1,200-2,400/year
- Net benefit: $6,600-7,800/year
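The net-benefit arithmetic above reduces to one line of code. A sketch using the article's illustrative assumptions (~60 hours saved per year valued at $150/hour, tool cost of $100-200/month), not study-measured values:

```python
# Net annual benefit = value of hours saved minus subscription cost.
# The 60 hours/year and $150/hour figures are illustrative assumptions.
def annual_net_benefit(monthly_cost: float,
                       annual_hours_saved: float = 60,
                       hourly_value: float = 150) -> float:
    return annual_hours_saved * hourly_value - monthly_cost * 12

print(annual_net_benefit(100))  # 7800.0 at $100/month
print(annual_net_benefit(200))  # 6600.0 at $200/month
```

Substituting your own hourly value and visit volume gives a practice-specific estimate; the break-even tool cost at these assumptions is $750/month.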
Additional Financial Benefits:
- Billing optimization: 10-15% revenue increase from better documentation
- Reduced burnout: Lower turnover costs
- Increased patient volume: See more patients with saved time
ROI Timeline: Typically 3-6 months to positive return
Billing and Coding Impact
The UCLA study didn't specifically examine billing impacts, but clinical experience shows:
Documentation Completeness → Higher Billing Levels:
- Better documentation supports higher E/M codes
- AI prompts for missing elements needed for billing levels
- Average billing increase: 10-20% where clinically appropriate
Example:
- 20 patients/day × 250 work days/year = 5,000 encounters/year
- 10% billing optimization × $150 average/encounter = $75,000 additional revenue
- AI scribe cost: $2,400/year
- Net billing benefit: $72,600/year
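The billing example above can be checked the same way. A sketch where the 10% optimization rate, $150 average revenue per encounter, and $2,400/year tool cost are the article's illustrative assumptions, not study findings:

```python
# Net billing benefit = (encounters x avg revenue x optimization rate)
# minus the annual AI scribe cost. All defaults are illustrative.
def annual_billing_benefit(patients_per_day: int = 20,
                           work_days: int = 250,
                           avg_revenue: float = 150,
                           optimization_rate: float = 0.10,
                           annual_tool_cost: float = 2400) -> float:
    encounters = patients_per_day * work_days  # 5,000 encounters/year
    added_revenue = encounters * avg_revenue * optimization_rate
    return added_revenue - annual_tool_cost

print(annual_billing_benefit())  # 72600.0
```

Note how sensitive the result is to `optimization_rate`: even a 2% rate still covers the tool cost several times over at this volume.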
Specialty-Specific Considerations
Specialties Where AI Scribes Excel
Based on UCLA study and clinical feedback:
High-Value Specialties:
- Primary Care: High visit volume, straightforward documentation
- Psychiatry/Mental Health: Conversational encounters translate well
- Pediatrics: Parent conversations captured effectively
- Urgent Care: High volume, time-pressure benefits
- Cardiology: Complex but structured assessments
Specialties with Unique Challenges
May Require Specialty-Specific Training:
- Surgery: Operative notes have specific structure and terminology
- Emergency Medicine: Rapid encounters, interruptions common
- Radiology: Findings-based documentation, less conversational
- Pathology: Microscopic findings, technical terminology
These specialties can still benefit but may need:
- Specialty-specific templates
- Additional training on verbalization of findings
- Custom workflows for procedure documentation
The Future: Beyond the UCLA Study
Next Generation AI Scribe Features
Based on 2026 trends:
Emerging Capabilities:
- Agentic AI: Beyond transcription to workflow management
- Multi-modal capture: Video, images, and audio integrated
- Real-time guidance: AI suggests questions during encounter
- Predictive documentation: AI anticipates likely documentation based on chief complaint
Coming Soon:
- Autonomous order placement: AI generates orders for approval
- Billing optimization: Real-time coding suggestions
- Population health integration: AI links documentation to population metrics
- Patient-facing summaries: Automatic after-visit summary generation
Regulatory Landscape
FDA Oversight:
- AI scribes currently considered clinical decision support
- Regulatory framework evolving in 2026
- Some tools may require FDA clearance as AI capabilities expand
Reimbursement:
- Medicare/Medicaid don't yet reimburse for AI scribe tools directly
- Some commercial payers exploring coverage
- Tax deductible as business expense
Conclusion: What Should You Do?
Action Steps Based on UCLA Study Evidence
If You're Considering AI Scribe Technology:
- ✅ Step 1: Review the UCLA study full text (NEJM AI)
- ✅ Step 2: Request demos from 2-3 AI scribe vendors
- ✅ Step 3: Start with a pilot program (4-8 weeks)
- ✅ Step 4: Establish mandatory review protocols
- ✅ Step 5: Measure outcomes (time, satisfaction, accuracy)
Key Takeaways from UCLA Study:
- Time savings are real but modest: 10% reduction (41 seconds/note)
- Burnout benefits are meaningful: Improved physician well-being
- Accuracy requires vigilance: Always review AI-generated notes
- Multi-specialty effectiveness: Works across diverse clinical settings
- Safety events are rare: One mild event in 72,000 encounters
The Bottom Line:
The UCLA study provides high-quality evidence that AI scribe technology:
- Reduces documentation time
- Improves physician well-being
- Works across multiple specialties
- Is safe when used with appropriate oversight
For physicians struggling with documentation burden and burnout, AI scribes represent an evidence-based solution worth serious consideration.
Frequently Asked Questions
What did the UCLA AI scribe study find?
The UCLA study, published in the New England Journal of Medicine AI in 2026, examined 238 physicians across 14 specialties and 72,000 patient encounters. It found that Nabla users reduced documentation time by nearly 10% (41 seconds per note), with potential benefits for physician burnout and work-related stress.
Which AI scribe tools did the study examine?
The UCLA study examined two commercially available AI scribe applications: Microsoft DAX and Nabla. Both showed documentation benefits, with Nabla users seeing the greater reduction in documentation time compared to usual care.
How accurate are AI-generated notes?
The study noted that clinically significant inaccuracies appeared "occasionally" in AI-generated notes, as rated on 5-point Likert-scale questions. One mild patient safety event was reported during the study. Physicians must always review and verify AI-generated documentation before signing.
How much time do AI scribes save per note?
Nabla users in the UCLA study saved approximately 41 seconds per note on average, a reduction of roughly 9% in documentation time. Over a full clinical day with 20 patients, this translates to about 14 minutes saved daily.
Do AI scribes reduce physician burnout?
Yes, the UCLA study found potential benefits for physician burnout and work-related stress among users of AI scribe technology. This aligns with a JAMA Network Open multicenter study that found a 31% drop in reported burnout with ambient AI scribes.
Is it safe to rely on AI-generated documentation?
The UCLA study supports the use of AI scribes but emphasizes the critical importance of physician review. AI documentation should always be verified for accuracy, completeness, and clinical appropriateness before being finalized. The provider remains ultimately responsible for all documentation.
Medical Disclaimer: This content is for educational purposes only and should not replace professional medical judgment. Always consult current clinical guidelines and your institution's policies.
