Patient Data Privacy and Consent
Protect patient data when using AI tools—understand DPDP Act 2023 requirements, de-identification techniques, and consent best practices for Indian clinics.
You have just finished a busy OPD. You want to quickly draft discharge instructions using ChatGPT. You paste the patient’s history—name, age, address, Aadhaar reference, diagnosis—and hit enter. The AI gives you a beautiful output. But you have just shared identifiable health data with a third-party server, potentially violating the Digital Personal Data Protection Act, 2023 (DPDP Act).
This article will help you use AI tools confidently while keeping patient data safe and staying compliant with Indian law.
What Problem This Solves
Doctors are under enormous documentation pressure. AI can help—but the moment you paste patient details into any AI tool, you create a privacy risk.
The core problems:
- Legal exposure: The DPDP Act 2023 makes you accountable for how patient data is processed, even by third-party tools
- Trust erosion: Patients expect confidentiality; a data leak destroys years of trust
- Unclear boundaries: “But I’m just using ChatGPT for my notes” is not a valid defence if identifiable data leaves your system
- Consent gaps: Most clinics don’t have AI-specific consent language
What this article gives you:
- A practical de-identification checklist you can use in 30 seconds
- Understanding of what the DPDP Act means for your clinic
- Ready-to-use consent language for AI-assisted services
- Safe prompt patterns that protect both you and your patients
How to Do It (Steps)
Step 1: Understand What the DPDP Act 2023 Means for You
The Digital Personal Data Protection Act, 2023 applies to any digital processing of personal data. Here is what matters for your clinic:
| DPDP Act Concept | What It Means for You |
|---|---|
| Data Principal | Your patient—they have rights over their data |
| Data Fiduciary | You and your clinic—you’re responsible for how data is used |
| Processing | Any action on data: collecting, storing, sharing, or pasting into AI |
| Consent | Must be free, specific, informed, unconditional, and unambiguous |
| Purpose Limitation | Use data only for the stated purpose |
Key implication: When you paste patient data into ChatGPT, Claude, or any AI tool, you are “processing” that data through a third party. Unless you have informed consent and appropriate safeguards, this may violate the Act.
Step 2: Learn the De-Identification Rule
Before any patient information touches an AI tool, remove or replace these identifiers:
The De-Identification Checklist (memorise this):
| Remove This | Replace With |
|---|---|
| Full name | "Patient" or initials like "Mr. R" |
| Age (exact) | Age range: "male in 50s" |
| Date of birth | Omit entirely |
| Specific dates | "Day 3 of admission", "2 weeks ago" |
| Address/locality | "Urban setting" or omit |
| Phone number | Omit entirely |
| Aadhaar/ID numbers | Never include |
| Hospital/clinic name | "Tertiary care hospital" |
| Doctor names | "Treating physician" |
| Workplace | Omit or generalise: "office worker" |
| Relative names | "Spouse", "daughter" |
The 30-Second Test: Before pasting anything, ask yourself: “If this text appeared on a public website, could anyone identify who this patient is?” If yes, remove more details.
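If your clinic handles a high volume of notes, the pattern-based parts of this checklist can be semi-automated. Below is a minimal Python sketch, not a vetted tool: the regex patterns and placeholder labels are illustrative assumptions, and regexes cannot catch names, addresses, or workplaces, so the 30-second manual test above still applies.

```python
import re

# Pattern-based identifiers that can be caught mechanically.
# Names, addresses, and workplaces still need a human eye.
PATTERNS = [
    (re.compile(r"\b\d{4}\s?\d{4}\s?\d{4}\b"), "[ID REMOVED]"),            # 12-digit Aadhaar-style numbers
    (re.compile(r"\b(?:\+91[\s-]?)?[6-9]\d{9}\b"), "[PHONE REMOVED]"),     # Indian mobile numbers
    (re.compile(r"\b\d{1,2}[/-]\d{1,2}[/-]\d{2,4}\b"), "[DATE REMOVED]"),  # dd/mm/yyyy-style dates
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL REMOVED]"),       # email addresses
]

def scrub(text: str) -> str:
    """Mask pattern-detectable identifiers before text goes near an AI tool."""
    for pattern, replacement in PATTERNS:
        text = pattern.sub(replacement, text)
    return text

note = "52M, ph 9820512345, seen 14/03/2024, Aadhaar 1234 5678 9012, burning feet x 3 weeks"
print(scrub(note))
# 52M, ph [PHONE REMOVED], seen [DATE REMOVED], Aadhaar [ID REMOVED], burning feet x 3 weeks
```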
Step 3: Know What Is Safe to Use
Safe to paste into AI tools (after de-identification):
- Clinical symptoms and signs (without identifying context)
- Lab values and investigation results (without patient name/ID)
- Generic medication queries
- Disease education requests
- Template and format requests
Never paste into AI tools:
- Patient names, phone numbers, addresses
- Aadhaar, ABHA ID, insurance policy numbers
- Prescription images with patient details
- WhatsApp conversations with patients
- Photographs of patients (even wound photos if identifiable)
- Scanned documents with letterheads and signatures
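A complementary safeguard is a guard that warns, rather than rewrites, when text still contains items from the never-paste list. Here is a minimal sketch under the same assumptions as the scrubber above; an empty result does not prove the text is safe, it only means no obvious pattern was found.

```python
import re

# Red flags from the "never paste" list that a pattern can detect.
# Photos, scans, and chat exports need a policy, not a regex: simply don't upload them.
RED_FLAGS = {
    "possible Aadhaar-style ID": re.compile(r"\b\d{4}\s?\d{4}\s?\d{4}\b"),
    "possible phone number": re.compile(r"\b(?:\+91[\s-]?)?[6-9]\d{9}\b"),
    "possible email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "honorific followed by a name": re.compile(r"\b(?:Mr|Mrs|Ms|Dr|Shri|Smt)\.?\s+[A-Z][a-z]+"),
}

def check_before_paste(text: str) -> list[str]:
    """Return warnings for obvious identifiers; an empty list does NOT prove safety."""
    return [label for label, pattern in RED_FLAGS.items() if pattern.search(text)]

for warning in check_before_paste("Dr. Sharma started Ecosprin; patient ph 9820512345."):
    print("WARNING:", warning)
# WARNING: possible phone number
# WARNING: honorific followed by a name
```

Note that the de-identified form "Mr. R" passes the honorific check, because the pattern only flags full names.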
Step 4: Update Your Consent Process
Add AI-specific language to your registration form or consent document. Patients should know if AI assists in their care—even for documentation.
Suggested consent addition:
“This clinic may use AI-based tools to assist with documentation, patient education materials, and administrative tasks. No identifiable patient information is shared with these tools. The treating doctor reviews all AI-assisted outputs before use. AI is never used for diagnosis or treatment decisions.”
Step 5: Create a Clinic Data Handling SOP
Even a simple one-page document helps:
- What tools are approved: List specific AI tools staff may use
- What data can be used: De-identified clinical information only
- What is prohibited: Names, contact details, IDs, photos
- Who reviews outputs: Doctor must verify before use
- Incident protocol: What to do if someone accidentally shares identifiable data
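If your clinic uses small scripts like the sketches above, the SOP can also double as a machine-readable config, so the approved-tools list lives in one place. This is a hypothetical sketch; every tool name, field, and role below is a placeholder for your own SOP, not a recommendation.

```python
# Hypothetical: the one-page SOP captured as a config your scripts can share.
CLINIC_AI_SOP = {
    "approved_tools": ["ChatGPT (enterprise)", "Claude"],
    "allowed_data": "de-identified clinical information only",
    "prohibited": ["names", "contact details", "Aadhaar/ABHA/insurance IDs",
                   "patient photos", "scanned documents"],
    "output_review": "treating doctor verifies before any output is used",
    "incident_contact": "clinic data protection lead",  # placeholder role
}

def is_tool_approved(tool_name: str) -> bool:
    """Match the tool's first word against the SOP's approved list."""
    approved = {t.split()[0].lower() for t in CLINIC_AI_SOP["approved_tools"]}
    return tool_name.split()[0].lower() in approved

print(is_tool_approved("ChatGPT"))        # True
print(is_tool_approved("SomeNewAITool"))  # False
```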
Example Prompts
These prompts demonstrate SAFE data handling patterns:
Prompt 1: Drafting Patient Instructions (De-Identified)
A patient in their 40s with newly diagnosed Type 2 diabetes (HbA1c 8.2%)
needs lifestyle counselling. They work a desk job and have limited time
for exercise.
Create a practical lifestyle advice sheet in simple English. Include:
- Diet modifications suitable for Indian vegetarian food
- Realistic exercise suggestions for busy professionals
- Warning signs to watch for
- When to contact the clinic
Keep it under 400 words, suitable for Grade 8 reading level.
Prompt 2: Structuring Clinical Notes (Safe Pattern)
Convert these de-identified OPD notes into SOAP format:
"Middle-aged male, diabetic for 5 years, presents with burning feet
for 3 weeks. No ulcers. On metformin 1g BD. Recent HbA1c 7.8%.
Examination: reduced vibration sense bilateral feet. Pulses normal."
Constraints:
- Do not add clinical information not provided
- Mark any missing essential details as "[TO BE ADDED]"
- Use "Assessment considerations" not "Diagnosis"
Prompt 3: Creating Consent Checklist (No Patient Data Needed)
Create a consent discussion checklist for upper GI endoscopy
in an Indian outpatient setting.
Include:
- Indication categories (not patient-specific)
- Procedure explanation in simple terms
- Common risks and rare serious risks
- Alternatives to discuss
- Post-procedure care instructions
- Documentation checkboxes for the clinician
Format as a printable one-page checklist.
Prompt 4: Medication Education Sheet
Create a patient information sheet for someone starting
Amlodipine 5mg for hypertension.
Include:
- What the medicine does (simple explanation)
- How to take it
- Common side effects and what to do
- Warning signs needing immediate medical attention
- Lifestyle tips that help the medicine work better
- Myths vs facts about BP medication
Language: English with Hindi translations for key terms.
Reading level: Grade 6-8.
Bad Prompt → Improved Prompt
Bad Prompt (Privacy Violation):
My patient Ramesh Kumar, age 52, from Andheri West, Mumbai,
phone 98205XXXXX, came today with chest pain. He had an MI
in 2019 at Lilavati Hospital. His current medications from
Dr. Sharma are Ecosprin 75mg and Atorvastatin 20mg. He works
at Tata Consultancy. His wife Sunita is worried. Please create
a follow-up plan.
What is wrong:
- Full name included
- Exact age, location, phone number
- Hospital name, doctor name
- Workplace, relative name
- All of this is now on a third-party server
Improved Prompt (Privacy-Safe):
Male patient in early 50s with history of MI (7 years ago).
Currently on antiplatelet and statin therapy. Presented today
with recurrent chest discomfort.
Create a structured follow-up plan including:
- Red flag symptoms for patient and family to watch
- Recommended investigations to consider
- Lifestyle reinforcement points
- Follow-up timeline suggestions
- When to seek emergency care
Format as a patient handout + clinician checklist.
What changed:
- No names (patient, family, other doctors)
- Age range instead of exact age
- No location, phone, or workplace
- Clinical facts preserved for useful output
- Same clinical utility, with no identifiable data exposed
Common Mistakes
Mistake 1: “It’s just for my personal use”
Reality: The moment data leaves your device for an AI server, a third party is processing it. The DPDP Act’s exemption for personal or domestic use covers private individuals, not professional clinical work, so the Act applies regardless of your intent.
Mistake 2: Assuming AI tools are HIPAA/DPDP compliant
Reality: Most free AI tools (ChatGPT free tier, etc.) explicitly state in their terms that data may be used for training. Enterprise versions may offer better guarantees—read the terms carefully.
Mistake 3: De-identifying names but leaving everything else
Reality: A “52-year-old diabetic male from Andheri who works at TCS and had an MI at Lilavati in 2019” is identifiable even without a name. Remove ALL identifying combinations.
Mistake 4: Sharing prescription images or scanned documents
Reality: These contain multiple identifiers—patient name, clinic letterhead, doctor’s signature, dates. Never upload these to AI tools.
Mistake 5: Using AI outputs without review
Reality: Even with perfect de-identification, AI can generate incorrect medical information. You remain clinically and legally responsible for everything you give to patients.
Mistake 6: No documentation of AI use
Reality: If there is ever a dispute, you should be able to show that you followed safe practices. Keep a simple log of how AI assists your workflow.
Clinic-Ready Templates
Template 1: De-Identification Quick Reference Card
Print this and keep it next to your workstation:
BEFORE PASTING INTO AI — REMOVE:
[ ] Patient name → use "Patient" or "Mr./Ms. X"
[ ] Exact age → use "in their 40s/50s/60s"
[ ] Date of birth → omit
[ ] Specific dates → use "Day 3" or "2 weeks ago"
[ ] Address/area → omit or use "urban/rural"
[ ] Phone/Aadhaar/ABHA → never include
[ ] Hospital/doctor names → use "referring hospital"
[ ] Workplace → generalise: "office worker"
[ ] Family member names → use "spouse/child/parent"
30-SECOND TEST: Could someone identify this patient
if this text appeared publicly? If yes, remove more.
Template 2: Patient Consent Addendum for AI-Assisted Services
CONSENT FOR AI-ASSISTED CLINICAL SERVICES
I understand that [Clinic Name] may use artificial intelligence
(AI) tools to assist with:
- Preparing patient education materials
- Drafting documentation (reviewed by my doctor)
- Administrative and scheduling tasks
I understand that:
1. No information that can identify me personally (name, contact
details, ID numbers) is shared with AI tools
2. Only de-identified clinical information may be used
3. My treating doctor reviews all AI-assisted content before it
is used in my care
4. AI tools do not make diagnostic or treatment decisions
5. I can ask questions about how AI is used in my care
[ ] I consent to AI-assisted services as described above
[ ] I do not consent to AI-assisted services
Patient/Guardian Signature: _______________ Date: ___________
Template 3: Clinic AI Usage Log (Simple Version)
AI TOOL USAGE LOG
Date: ___________
Staff Name: ___________
Tool Used: [ ] ChatGPT [ ] Claude [ ] Other: _______
Purpose:
[ ] Patient education material
[ ] Documentation draft
[ ] Template creation
[ ] Administrative task
[ ] Other: ___________
De-identification confirmed: [ ] Yes
Output reviewed before use: [ ] Yes
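If paper logs are hard to sustain, the same fields can live in a CSV file that opens in any spreadsheet. Below is a minimal sketch mirroring Template 3; the file name and column names are assumptions.

```python
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("ai_usage_log.csv")  # assumption: kept on the clinic's own machine
FIELDS = ["date", "staff", "tool", "purpose", "deidentified", "output_reviewed"]

def log_ai_use(staff: str, tool: str, purpose: str,
               deidentified: bool, output_reviewed: bool) -> None:
    """Append one row mirroring Template 3's paper log."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()  # write column headers on first use
        writer.writerow({
            "date": date.today().isoformat(),
            "staff": staff,
            "tool": tool,
            "purpose": purpose,
            "deidentified": "yes" if deidentified else "NO",
            "output_reviewed": "yes" if output_reviewed else "NO",
        })

log_ai_use("Dr. A", "ChatGPT", "patient education material",
           deidentified=True, output_reviewed=True)
```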
Safety Note
Critical reminders for safe AI use in your clinic:
- You are always accountable. The DPDP Act holds you responsible as the Data Fiduciary. “The AI made a mistake” is not a defence.
- De-identification is your shield. When done properly, de-identified data is not considered “personal data” under the Act. This is your pathway to safe AI use.
- Consent must be informed. Patients should know AI assists in their care. Add this to your registration process.
- AI outputs need verification. Never give patients AI-generated content without clinical review. AI can confidently state incorrect information.
- Enterprise tools may offer better protection. If your clinic uses AI heavily, consider paid enterprise versions that offer data processing agreements and no-training guarantees.
- When in doubt, leave it out. If you are unsure whether information is identifiable, err on the side of removing it.
Copy-Paste Prompts
Prompt A: Safe Clinical Query Template
Clinical scenario (de-identified):
[Describe patient as: age range + gender + relevant conditions]
[Include: presenting complaint, duration, relevant history]
[Include: examination findings, investigations if relevant]
Request:
[What you need: differential considerations / patient education /
documentation draft / follow-up plan]
Constraints:
- Do not add clinical information not provided
- Use "considerations" not "diagnosis"
- Flag any missing critical information
- Format as: [specify: SOAP / bullet list / patient handout]
Prompt B: Patient Education Generator (Safe)
Create a patient education sheet for [condition].
Patient context (no identifying details): [age range, relevant
lifestyle factors like "office worker" or "homemaker"]
Include:
- What is this condition (simple explanation)
- Common symptoms
- Lifestyle modifications (India-appropriate)
- Medication adherence tips (generic, not prescription-specific)
- Warning signs requiring medical attention
- Myths vs facts
Language: [English / English + Hindi / other]
Reading level: Grade 6-8
Length: 300-500 words
Prompt C: Documentation Draft (De-Identified)
Convert these de-identified clinical notes into [SOAP format /
discharge summary format / referral letter format]:
[Paste de-identified notes here]
Rules:
1. Do not invent clinical details
2. Mark missing information as [REQUIRED: ___]
3. Use "Assessment considerations" not "Diagnosis"
4. Include a "Red flags discussed" section
5. End with "Reviewed and verified by treating physician"
Prompt D: Consent Checklist Creator
Create a consent discussion checklist for [procedure name].
Setting: Indian outpatient clinic
Include sections for:
- Indication (general categories, not patient-specific)
- Procedure explanation (simple language)
- Expected benefits
- Common risks (with approximate frequencies if known)
- Rare but serious risks
- Alternatives including no treatment
- Post-procedure expectations and care
- Warning signs to report
- Documentation checkboxes
Format: Printable one-page checklist with tick boxes.
Do’s and Don’ts
Do’s
- Do de-identify every piece of text before pasting into AI tools
- Do use age ranges ("50s") instead of exact ages
- Do replace names with generic terms (“Patient”, “treating physician”)
- Do add AI consent language to your registration forms
- Do review every AI output before using it with patients
- Do keep a simple log of AI tool usage in your clinic
- Do train your staff on de-identification practices
- Do use the 30-second test: “Could this identify someone?”
- Do prefer enterprise AI tools with data protection agreements for heavy use
- Do stay updated on DPDP Act rules and notifications
Don’ts
- Don’t paste patient names, phone numbers, or addresses into AI
- Don’t upload prescription images, scanned documents, or patient photos
- Don’t assume “personal use” exempts you from data protection rules
- Don’t share Aadhaar, ABHA ID, or insurance numbers with AI tools
- Don’t use AI outputs without clinical verification
- Don’t forget that combinations of details can identify someone (age + location + workplace + condition)
- Don’t use WhatsApp chat exports with patients in AI tools
- Don’t share AI-generated content with patients without review
- Don’t assume all AI tools have the same privacy standards
- Don’t let convenience override compliance—the risks are not worth it
1-Minute Takeaway
The DPDP Act 2023 makes you accountable for how patient data is processed—including when you paste it into AI tools.
The solution is simple: de-identify first. Remove names, exact ages, dates, locations, phone numbers, ID numbers, and any combination that could identify someone. Use the 30-second test: “If this text appeared publicly, could anyone identify this patient?”
Safe AI use is possible. De-identified clinical scenarios, generic education requests, and template creation are all safe. Paste clinical facts without personal identifiers, and you get the same useful AI output without putting any patient’s identity at risk.
Update your consent process. Add a simple clause informing patients that AI may assist with documentation and education materials. Transparency builds trust.
Always review AI outputs. You remain the doctor. AI drafts; you verify, modify, and approve. The clinical and legal responsibility stays with you.
Privacy compliance is not about avoiding AI—it is about using AI smartly. With good habits, you protect your patients, your practice, and yourself.