Walk into any small business owners' Facebook group right now and you will find the same conversation happening over and over. Someone asks whether they can use ChatGPT to write their client contracts. Dozens of people say they already do. Nobody mentions the liability.

The American Bar Association has. And their position is more nuanced than either the AI enthusiasts or the AI skeptics want to admit.

What AI Does Well

AI models handle standard contract language with impressive accuracy. Non-disclosure agreements, basic service agreements, freelance contracts, simple vendor agreements. For these use cases, the output from current AI models is often better than what a non-lawyer would write from scratch. The structure is correct. Common clauses are included. The language is clear.

For low-stakes agreements between parties who trust each other, AI-drafted contracts work. The market has figured this out. Millions of small businesses are using them daily.


Where It Gets Dangerous

The problems show up in three places. First, jurisdiction. Contract law varies significantly by state and country. AI models do not always flag when a clause is unenforceable in a specific jurisdiction. They generate text that looks correct without surfacing that it may not hold up in a California court versus a Texas court.

Second, novel situations. Standard clauses do not cover every business relationship. When your situation is slightly unusual, AI models fill in the gaps with plausible-sounding language that may not protect you the way you think it does. You do not find out until something goes wrong.

Third, liability for what is missing. The most dangerous contracts are not the ones with bad clauses. They are the ones missing the right clauses. AI cannot supply the clause you never knew to ask for.

The Bar Association Position

The ABA's formal guidance treats AI-assisted contract drafting the same way it treats any other legal tool. It is acceptable as a starting point. It is not a replacement for attorney review on anything with real financial or legal exposure. The unlicensed practice of law question remains unsettled in most states.

The practical guidance is straightforward. Use AI for low-stakes, standard agreements. Get a human attorney to review anything involving significant money, intellectual property, employment, or a party you do not fully trust.

The Bottom Line

Can AI write your legal contracts? Yes, and for many situations it does it well. Can it replace an attorney? Not when it matters. The problem is that you often do not know when it matters until after something breaks.

The small business owners using AI for contracts are not wrong. They are just operating with a risk profile most of them have not fully thought through.