Compliance
ABA Rule 1.6 and AI Tools: What Small Law Firms Must Know Before Signing Up for Anything
Most AI tools were not built with attorney confidentiality obligations in mind. Here is what to check before you let any AI tool touch a client file — and the five questions every lawyer should ask before signing up for anything.

Alex Cuomo
Co-founder, LexVault · March 30, 2026 · 5 min read

Artificial intelligence is changing how law firms work. Document review that once took hours now takes minutes. Research that required a paralegal's full afternoon can be done in seconds. The tools are genuinely useful — and small firms, which have always had to do more with less, have the most to gain.
But before you sign up for any AI platform and start uploading client files, there is one thing you need to think through carefully: your confidentiality obligations under ABA Model Rule 1.6.
If you want to see how a tool built with these obligations in mind actually works, explore LexVault's beta — or read more about how we handle data isolation and compliance.
What Rule 1.6 Actually Requires
Rule 1.6 requires lawyers to make reasonable efforts to prevent the inadvertent or unauthorised disclosure of client information. The key word is reasonable. The ABA has been clear that this does not mean you can never use cloud tools or third-party software — but it does mean you need to understand how those tools work before you use them.
ABA Formal Opinion 477R (2017) addressed cloud computing specifically and confirmed that lawyers may use cloud-based services for client data as long as they apply reasonable care. That includes understanding the vendor's data practices, security measures, and what happens to the data after you stop using the service.
The same logic applies to AI tools, and arguably with greater force, because AI introduces a specific risk that cloud storage does not: your client's documents could be used to train a model that other users' queries then draw from.
The AI Training Problem
Most consumer-grade AI tools — including many that market themselves to professionals — train their models on user inputs. When you upload a contract and ask the AI to summarise it, that contract may become part of the training data that improves the model for everyone else. In practice, this means confidential client information could, in some form, influence responses given to other users.
This is not a hypothetical concern. Several major AI providers have been criticised for doing exactly this with enterprise customers who did not read the fine print carefully enough.
For lawyers, this is a serious problem. The client never consented to their information being shared with a technology company's model training pipeline. Doing so without consent — and without a proper Data Processing Agreement in place — is difficult to justify under Rule 1.6.
Five Questions to Ask Any AI Vendor
Before you let an AI tool touch a client file, get clear answers to these questions:
- Is my data used to train your models? The answer should be an unambiguous no — not "we may use anonymised data" or "we use data to improve our service." No training, full stop.
- Is my firm's data isolated from other firms' data? Each firm's documents should be stored and processed in an isolated environment. A multi-tenant setup where all clients share the same database is a red flag.
- Where is the data stored? US-based infrastructure (ideally AWS or Azure US regions) is the standard for US firms; EU firms should look for EU data residency. Know where your data physically sits.
- Is a Data Processing Agreement available? A DPA is a formal contract that sets out how a processor handles personal data. Any reputable vendor working with law firms should have one ready to sign, publicly documented rather than merely "available on request." LexVault's DPA is publicly available and comes into effect automatically at signup.
- What happens to my data if I cancel? You should be able to export your data, and the vendor should delete it from their systems within a defined timeframe after cancellation.
What "Reasonable Efforts" Looks Like in Practice
The bar is not perfection — it is reasonableness. Courts and ethics committees will look at whether you took steps proportionate to the sensitivity of the information and the risks involved.
In practice, that means:
- Reading the vendor's privacy policy and terms of service before uploading client data
- Checking whether a DPA exists and what it covers
- Satisfying yourself that the vendor's security practices are appropriate (SOC 2, encryption at rest and in transit, access controls)
- Documenting your due diligence if you work in a jurisdiction where the state bar has issued ethics opinions on AI use
You do not need to conduct a full technical audit. But you do need to go beyond clicking "I agree" on a terms page.
The Bottom Line
AI tools can be used ethically and in compliance with Rule 1.6 — but only if you choose tools that were designed with attorney obligations in mind. The difference between a tool built for general consumers and one built for law firms is not just features. It is the underlying architecture, the data practices, and the contractual commitments the vendor is willing to make.
When evaluating any AI tool for your firm, treat the data practices as a threshold requirement — not a nice-to-have. If a vendor cannot give you clear answers to the five questions above, that is your answer.
You might also find it useful to read about the hidden cost of document search at small law firms — a related problem that AI can help solve once you have the right tool in place.
LexVault
Built with these obligations in mind
Data isolated per firm. No AI training. DPA at signup. US infrastructure.
Explore the beta