AI for Attorney Intake Questionnaires: Safe UPL Practices

<i>I help small law firms in Central Florida use AI to screen leads and gather facts — without crossing the line into unauthorized practice of law. Here's what I've learned from working with firms in Maitland and Lake Mary.</i>

I was sitting in a conference room in Maitland last month, talking to a solo family law attorney. She was drowning in intake calls — sixty missed calls a day, she said, and most of them were people asking basic questions about divorce timelines or child custody. She wanted to use AI to help. But she was terrified of UPL. She’d heard stories of chatbots giving bad legal advice and getting firms fined. She asked me point-blank: “Can I use AI to screen potential clients without breaking the rules?”

The answer is yes — but only if you build the system carefully. In this post, I’ll walk through what’s allowed, what’s not, and how small firms in Orlando and beyond can use AI for intake questionnaires safely. I’ll use real examples from Central Florida firms I’ve worked with, and I’ll keep the jargon out of it.

Why Intake Is the Perfect Place to Start With AI

Most law firms lose money on intake. A paralegal spends 15 minutes on the phone with a lead, only to find out the case isn’t in the firm’s practice area. Or worse, the lead never gets a call back because the front desk is overwhelmed. AI can handle the first pass — gathering basic facts, checking conflicts, and routing the lead to the right attorney — without giving legal advice.

I’ve seen a three-attorney firm in Winter Park cut their intake response time from 4 hours to under 10 minutes by using an AI voice agent that asks structured questions. The system asks about case type, opposing party, and desired outcome. It doesn’t interpret the law. It doesn’t predict outcomes. It just collects information and sends a clean summary to the attorney.

That’s the key distinction: AI can ask questions; it cannot answer them. As long as your intake bot sticks to fact-gathering, you’re on safe ground.

The UPL Line: Three Rules Every Firm Must Follow

The unauthorized practice of law (UPL) rules vary by state, but Florida has clear guidance. I’ve reviewed the Florida Bar’s opinions on technology, and here are three non-negotiable rules for any AI intake system:

  1. No legal analysis. The AI cannot interpret facts or apply law. For example, it can ask “How long have you been married?” but it cannot say “In Florida, you need to be separated for six months before filing.”
  2. No predictions. The AI cannot say “You have a strong case” or “You might get $50,000.” It can only gather information.
  3. Clear disclaimer. Every interaction must state that the AI is not a lawyer and that information provided does not constitute legal advice.
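Rule 3 is the easiest one to enforce in software: hard-code the disclaimer into the session opener so no conversation can start without it. Here's a minimal sketch — the function name and disclaimer wording are my own placeholders, not Florida Bar-approved language, so have your ethics counsel sign off on the actual text:

```python
# Illustrative only: this disclaimer wording is a placeholder, not
# Florida Bar-approved language. Run the final text past ethics counsel.
DISCLAIMER = (
    "I am an automated intake assistant, not a lawyer. Nothing in this "
    "conversation is legal advice. Your answers will be reviewed by an attorney."
)

def open_session(firm_name: str) -> str:
    """Build the first message of every intake session, disclaimer included."""
    return f"Welcome to {firm_name}.\n\n{DISCLAIMER}\n\nLet's get started."

print(open_session("Example Family Law, P.A."))
```

Because the disclaimer lives in the opener rather than in the model's prompt, there's no way for the AI to "forget" it.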

One firm in Lake Mary learned this the hard way. Their chatbot started answering questions like “Can I get alimony?” with generic summaries of Florida law. The Florida Bar sent a warning letter. They had to shut the bot down and rebuild it with stricter guardrails. The rebuild took two weeks and cost $4,500 in lost leads. Don’t make that mistake.

Building a Compliant Intake Questionnaire: A Practical Walkthrough

Let me show you what a safe, effective AI intake questionnaire looks like. I helped a family law firm in Oviedo set one up last quarter. Here’s the structure we used:

  • Step 1: Case type selection. The AI asks the lead to choose from a list of practice areas: divorce, child custody, adoption, etc. No descriptions, just labels.
  • Step 2: Fact collection. The AI asks specific, factual questions. For divorce: “How long have you lived in Florida?” “Do you have minor children?” “Are you currently employed?”
  • Step 3: Conflict check. The AI asks for the names of any opposing parties and checks against a database. If a conflict is found, the system flags it for human review.
  • Step 4: Scheduling. The AI offers available consultation slots from the attorney’s calendar.
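In code, the four steps above reduce to a fixed script plus a name lookup. Here's a rough sketch of the pattern — the question wording, the `KNOWN_PARTIES` list, and the function names are my own illustrations, not the Oviedo firm's actual system:

```python
# Minimal intake-flow sketch: fixed questions, a conflict flag, no interpretation.
# KNOWN_PARTIES stands in for the firm's real conflict database.
KNOWN_PARTIES = {"jane doe", "acme holdings llc"}

QUESTIONS = {
    "divorce": [
        "How long have you lived in Florida?",
        "Do you have minor children?",
        "Are you currently employed?",
    ],
}

def run_intake(case_type: str, answers: list, opposing_party: str) -> dict:
    """Collect facts and flag conflicts; never analyze or predict."""
    conflict = opposing_party.strip().lower() in KNOWN_PARTIES
    return {
        "case_type": case_type,
        "facts": dict(zip(QUESTIONS[case_type], answers)),
        "opposing_party": opposing_party,
        "conflict_flag": conflict,  # a human reviews every flagged match
        "next_step": "attorney review" if conflict else "offer consultation slot",
    }

summary = run_intake("divorce", ["6 years", "Yes, two", "Yes"], "Jane Doe")
print(summary["next_step"])
```

Notice there's no branching on the substance of the answers — the script never changes based on what the lead says, which is exactly what keeps it on the fact-gathering side of the line.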

The entire process takes about 3 minutes. The attorney gets a clean intake form with no interpretation. She told me it saved her paralegal 12 hours a week. The firm saw a 40% increase in qualified leads because they weren’t wasting time on bad fits.

What Happens When You Cross the Line? A Real Central Florida Example

I’ll be honest — not every firm gets it right. A personal injury firm in Sanford tried to automate their initial consultation. They trained an AI on Florida personal injury law and let it answer questions like “How much is my case worth?” The AI gave dollar estimates based on past settlements. Within three months, a former client sued, claiming the AI’s estimate was misleading. The firm settled for $75,000 and had to remove the bot.

That’s the worst-case scenario. But even small missteps can trigger bar complaints. I’ve seen firms get fined $2,500 for not having a proper disclaimer on their chatbot. The rules are strict, but they’re also clear. If you stick to the three rules above, you’ll be fine.

Tools and Templates: What I Recommend for Small Firms

You don’t need a custom-built AI system to get started. Several off-the-shelf tools work well for intake questionnaires, as long as you configure them correctly. I’ve tested a few with Central Florida firms:

  • Typeform + Zapier + GPT. You can build a simple questionnaire that sends answers to a GPT model for summarization. The GPT never talks to the lead — it only processes data. This is the safest option.
  • Voice AI platforms. I work with a few vendors that offer AI voice agents specifically for legal intake. They come pre-configured with UPL-safe scripts. I helped a firm in Clermont implement one, and it handled 200 calls in the first week without a single UPL issue.
  • Microsoft 365 Copilot. If your firm uses Office 365, you can use Copilot to draft intake summaries from transcriptions. Copilot doesn’t interact with clients — it just helps your team process information faster.
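For the Typeform + Zapier + GPT setup, the only AI-touching piece is the summarization step. Here's a sketch of what that step might look like — the prompt wording and function name are mine, and in practice Zapier would hand the Typeform payload to whatever model vendor you've configured:

```python
def build_summary_prompt(intake: dict) -> str:
    """Turn a Typeform payload into a summarization prompt for the model.

    The model only sees the collected answers; it never talks to the lead.
    """
    lines = [f"- {question}: {answer}" for question, answer in intake.items()]
    return (
        "Summarize the following intake answers for the reviewing attorney. "
        "Report facts only; do not analyze, interpret, or predict.\n"
        + "\n".join(lines)
    )

payload = {
    "Case type": "Divorce",
    "Years in Florida": "6",
    "Minor children": "Yes, two",
}
print(build_summary_prompt(payload))
```

The "facts only" instruction in the prompt is a belt-and-suspenders measure; the real safeguard is that the model's output goes to the attorney, never to the lead.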

Before you buy any tool, run it through an AI readiness assessment. I’ve seen firms waste $10,000 on a system that didn’t fit their workflow. An assessment helps you identify what you actually need.

“The line between gathering facts and giving advice is thin. But if you build your intake system to only ask questions, never answer them, you’re almost certainly safe.” — from my conversation with a Florida Bar ethics attorney

Training Your Team: How to Avoid Common Mistakes

Even the best AI system can cause problems if your staff uses it incorrectly. I’ve seen paralegals copy-paste AI-generated summaries into emails to clients without reviewing them. That’s a UPL risk if the summary contains any legal analysis.

Here’s what I recommend for training:

  • Review every AI output. Before any information goes to a client, a licensed attorney must review it. This isn’t optional.
  • Set clear boundaries. Tell your staff: the AI is for intake only. If a client asks a legal question during intake, the AI should say “I can’t answer that. Let me connect you with an attorney.”
  • Audit regularly. Every month, review a random sample of AI interactions. Look for any instance where the bot might have crossed the line. Fix those immediately.
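The "set clear boundaries" rule can be wired into the bot itself. Here's a crude sketch of the deflect-don't-answer pattern — the trigger phrases and function names are my own illustrations, and a production system would use a trained classifier rather than keywords, but the shape is the same:

```python
# Crude keyword screen for legal-advice questions. A real system would use
# a classifier, but the deflect-don't-answer pattern is identical.
ADVICE_TRIGGERS = ("can i get", "do i have a case", "how much", "what are my rights")

DEFLECTION = "I can't answer that. Let me connect you with an attorney."

def respond(client_message: str, scripted_reply: str) -> str:
    """Return the scripted intake reply unless the message asks for advice."""
    text = client_message.lower()
    if any(trigger in text for trigger in ADVICE_TRIGGERS):
        return DEFLECTION
    return scripted_reply

print(respond("Can I get alimony?", "Next question: do you have minor children?"))
```

The monthly audit then becomes a check on this screen: any transcript where an advice-style question got a substantive reply is a borderline response to fix.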

One firm in Apopka does a monthly “UPL scrub” where they review 50 random chatbot transcripts. They’ve caught three borderline responses in six months and fixed them before any client saw them. That’s the kind of diligence that keeps you out of trouble.

What About AI That Helps Attorneys Draft Questions?

A separate but related topic: AI tools that help attorneys draft intake questionnaires. These are completely safe because they don’t interact with clients. For example, I’ve used GPT to generate a list of 20 questions for a personal injury intake. The attorney reviewed and edited them before adding them to the form. That’s just using AI as a productivity tool — no UPL issue at all.

If you’re curious about more AI terms, check out our AI glossary for plain-English definitions.

I also offer a fractional AI officer service for firms that want ongoing guidance. We meet monthly to review your AI systems and make sure they stay compliant as the technology changes.

If you’re ready to start, contact me for a free 30-minute consultation. I’ll help you design an intake system that saves time and keeps you safe.


Frequently asked questions

Can I use AI to answer client questions about Florida law?

No. Any AI that interprets or applies law to a specific situation likely constitutes UPL. You can only use AI to gather factual information or provide general legal information that is not tailored to the client's circumstances. Always include a disclaimer.

What if my AI only summarizes Florida statutes?

Even summarizing statutes can be risky if the AI selects which statutes to show based on the client's facts. The safest approach is to avoid any legal content in the AI's responses. Stick to fact-gathering questions only.

Do I need a disclaimer on my AI intake form?

Yes. Every interaction must clearly state that the AI is not a lawyer, does not provide legal advice, and that the information collected will be reviewed by an attorney. This is a Florida Bar requirement.

Can I use AI to check for conflicts of interest?

Yes, as long as the AI only compares names against a database and flags potential matches. It should not interpret or analyze the conflict. The attorney must make the final decision.

What's the best tool for a small law firm in Orlando?

For most small firms, a simple Typeform questionnaire integrated with a GPT model for summarization is the safest and most affordable option. Voice AI platforms are also good if you handle high call volume. I recommend an AI readiness assessment first.

Can I be held liable if my AI violates UPL?

Yes. The firm and the supervising attorney are responsible for the AI's actions. Penalties can include fines, reprimands, and even disbarment in severe cases. Always review AI outputs and train your staff.

Ready to talk it through?

Send a one-line description of what you are trying to do. I will reply within one business day with a plain-English next step. Email or use the form →