Navigating AI in Legal Practice: What Lawyers Should Never Share with ChatGPT

Artificial Intelligence (AI) tools like OpenAI's ChatGPT are revolutionizing every industry, including the practice of law, helping lawyers with research, drafting, and administrative tasks. However, using AI effectively—and ethically—requires a firm understanding of professional responsibilities. Lawyers must comply with confidentiality obligations, avoid improper disclosures, and protect client data.

Before you integrate AI tools into your legal practice, keep these five golden rules in mind—five categories of information you should never disclose to ChatGPT or any other general-purpose AI tool.

1. Personally Identifiable Information (PII)

Never enter personally identifiable information (PII) about clients, employees, or third parties. General-purpose AI tools are not designed to safeguard sensitive data, and anything you enter could be used to train future models or accessed by third parties.

Examples to avoid sharing:
❌ Client names, addresses, Social Security numbers
❌ Phone numbers or email addresses
❌ Birthdates, driver’s license numbers, or passport details

2. Confidential or Privileged Client Information

Under ABA Model Rule 1.6 and state bar regulations, lawyers must maintain strict confidentiality regarding client matters. Entering client case details into AI tools could breach the duty of confidentiality and risk waiving attorney-client privilege, exposing sensitive legal strategies and confidential records.

Examples to avoid sharing:
❌ Case details, legal arguments, or strategy discussions
❌ Contracts, pleadings, or deposition transcripts
❌ Internal client communications

💡 Best Practice: If you must summarize legal principles, redact any identifying details and speak in hypotheticals.
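For firms that want redaction to be a repeatable habit rather than an ad hoc step, a rough sketch of automated pre-screening is shown below. It is a minimal illustration only: the regex patterns and the client-name list are hypothetical assumptions, they will not catch every identifier, and automated redaction never replaces attorney review before a prompt leaves the firm.

```python
# Minimal, illustrative sketch of redacting identifying details before text
# reaches a general-purpose AI tool. The patterns and client-name list are
# hypothetical examples, NOT a production-grade PII filter; attorney review
# of the redacted text is still required.
import re

# Example patterns for common identifiers (illustrative only).
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b(?:\+?1[-.\s]?)?\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

# Hypothetical names to scrub; in practice this would come from the matter's party list.
CLIENT_NAMES = ["Jane Doe", "Acme Holdings LLC"]

def redact(text: str) -> str:
    """Replace known identifiers with generic placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    for name in CLIENT_NAMES:
        text = re.sub(re.escape(name), "[CLIENT]", text, flags=re.IGNORECASE)
    return text

if __name__ == "__main__":
    draft = ("Jane Doe (jane.doe@example.com, 555-123-4567, SSN 123-45-6789) "
             "is negotiating a supply agreement with Acme Holdings LLC.")
    print(redact(draft))
    # -> "[CLIENT] ([EMAIL REDACTED], [PHONE REDACTED], SSN [SSN REDACTED]) ..."
```

Routing draft prompts through a filter like this before they reach any external tool is one simple way to make the "redact first" habit stick.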

3. Proprietary or Trade Secret Information

Businesses entrust law firms with trade secrets and proprietary information, making it crucial to prevent unauthorized disclosure. AI systems are not bound by confidentiality agreements, and even with privacy controls, sensitive corporate information should not be input into external AI models.

Examples to avoid sharing:
❌ Internal corporate policies, contracts, or financial records
❌ Patent applications or proprietary technology descriptions
❌ M&A negotiations or non-public regulatory filings

4. Regulated Data (HIPAA, GDPR, etc.)

Certain legal matters involve regulated data, such as health information (HIPAA) or personal data under GDPR. Sharing regulated data with AI tools can result in legal and ethical violations, exposing firms to regulatory penalties.

Examples to avoid sharing:
❌ Medical records or health information
❌ Data subject to banking and financial regulations (e.g., credit card numbers)
❌ Any cross-border data that may be subject to privacy laws like GDPR

💡 Best Practice: If AI is used in compliance-related work, consult IT and legal experts to ensure data is anonymized and handled securely.
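As one illustration of what "handled securely" can mean at the prompt level, the sketch below screens a draft prompt for card-like numbers (validated with the standard Luhn checksum) and a hypothetical medical-record-number pattern before anything is submitted. The patterns are assumptions chosen for demonstration; a real compliance control would be designed with your IT, privacy, and legal teams.

```python
# Minimal, illustrative pre-submission screen for regulated-data patterns.
# The checks below (a Luhn test for card-like numbers and a simple
# medical-record-number pattern) are hypothetical examples, not a
# substitute for a firm-approved compliance review.
import re

CARD_CANDIDATE = re.compile(r"\b(?:\d[ -]?){13,19}\b")
MRN_CANDIDATE = re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE)

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def flag_regulated_data(prompt: str) -> list[str]:
    """Return a list of reasons the prompt should be held back."""
    issues = []
    for match in CARD_CANDIDATE.finditer(prompt):
        if luhn_valid(match.group()):
            issues.append("possible payment card number")
    if MRN_CANDIDATE.search(prompt):
        issues.append("possible medical record number")
    return issues

if __name__ == "__main__":
    prompt = "Client disputes a charge on card 4111 1111 1111 1111 (MRN: 00123456)."
    problems = flag_regulated_data(prompt)
    if problems:
        print("Do not submit:", ", ".join(problems))
```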

5. Anything You Wouldn’t Want Publicly Shared

A simple rule of thumb: if a leak would be damaging or embarrassing for you, your client, or your firm, don’t enter it into an AI tool. Even with privacy safeguards, AI platforms are not foolproof, and cybersecurity risks exist.

Examples to avoid sharing:
❌ Internal law firm policies or billing practices
❌ Unpublished legal opinions or research memos
❌ Anything subject to an NDA or confidentiality agreement

Legal AI Tools vs. ChatGPT: Why Data Disclosure Rules Differ

Unlike general-purpose AI tools like ChatGPT, legal-specific AI solutions are designed with legal workflows, compliance, and security in mind. Solutions like GetGC.AI, Spellbook, Marveri, Midpage, and Alexi take different approaches to AI-driven legal work, each with unique policies on data privacy, confidentiality, and model training.

  • GetGC.AI emphasizes California Bar AI compliance with data isolation and encryption.

  • Spellbook positions itself as a business-use AI tool with policies tailored toward contract automation.

  • Marveri and Midpage focus on streamlining legal research and workflows, each with its own security framework.

  • Alexi is designed for legal research and memo drafting, with privacy measures that lawyers should carefully review.

Because AI security and compliance standards vary across providers, lawyers must review each tool’s policies to ensure they align with professional responsibility requirements. To explore these differences in depth, check out the Aloha Legal Whitepaper, which provides a comprehensive breakdown of legal AI solutions, their capabilities, and how law firms can implement them securely and ethically.

📌 Download the Aloha Legal Whitepaper today to ensure you’re using AI responsibly in your legal practice.

Final Takeaway: AI Can Enhance Legal Work—If Used Responsibly

While AI is a powerful tool, not all AI solutions are created equal when it comes to privacy and compliance. General tools like ChatGPT should never be used for sensitive client work, while specialized legal AI solutions offer more secure options with built-in confidentiality measures.

Best Practices for AI in Law Firms:

✅ Use AI for general legal research, summarization, and brainstorming—but always fact-check the output.
✅ Redact client details and sensitive information before entering prompts.
✅ Ensure AI providers have strong security policies and do not use inputs for model training.
✅ Vet legal-specific AI tools before handling client data.
✅ Get client consent if AI-generated materials will be incorporated into legal work.

Interested in how AI can safely improve your law firm’s efficiency? Contact Aloha Legal today to explore tailored AI solutions designed for legal professionals.

Robb Miller

Dual national (US & Canadian) attorney providing legal (securities, IP, corporate) and extra-legal services to technology companies, investors, and funds. Passionate about Fintech, AI, LegalTech, Healthtech, and other disruptive technologies; loves to help companies with corporate identity and partnership ecosystems.

https://robbmiller.me