12 Questions to Ask Before Starting Any AI Project
A property management company spent $18,000 on an AI tenant screening tool. Three months in, they discovered it couldn't connect to their existing property management software. The vendor had never asked what systems they used. The company had never asked if integration was included. Eighteen thousand dollars and nobody asked the obvious questions.
Most AI projects fail not because the technology breaks, but because someone skipped the pre-flight checklist. These twelve questions sort out whether a project is ready to launch or headed for an expensive lesson.
Business Fit
The first four questions determine whether AI is the right solution at all. Get these wrong, and no amount of good technology will save the project.
1. Do You Have Enough Data?
AI tools need input to produce output. A chatbot needs a knowledge base of questions and answers. A sales forecasting tool needs 12-24 months of historical sales data. An email drafting tool needs examples of your company's writing style and common customer inquiries.
"We'll figure out the data later" is the most expensive sentence in AI projects. If your data lives in email threads, sticky notes, and one person's memory, the first step is data collection, not AI implementation. Our guide to cleaning messy data walks through that process. Good answer: "We have 18 months of customer service transcripts in Zendesk and a 40-page FAQ document." Red flag: "Our best employee just knows the answers."
2. Is This Process Worth Automating?
Not every repetitive task deserves AI. The math is simple: multiply the time the task takes by how often it happens by your hourly labor cost. If the annual savings don't exceed the first year of AI costs (software, integration, training), the project doesn't make financial sense.
A staffing agency screened 200 resumes per week at 3 minutes each: 10 hours weekly, about $15,000/year in labor. An AI screening tool at $8,000/year with $3,000 setup made clear sense. An accounting firm ran the same math for quarterly tax letters and found payback wouldn't come until year two. They passed. The AI ROI calculator helps run these numbers quickly.
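The back-of-envelope math above is easy to sketch. A minimal example using the staffing agency's figures from this section; the function name and the 50-week working year are illustrative assumptions, not part of any vendor tool:

```python
def first_year_payback(hours_per_week, hourly_cost, tool_annual_cost,
                       setup_cost, weeks_per_year=50):
    """Annual labor savings vs. first-year AI cost (software + setup)."""
    annual_savings = hours_per_week * weeks_per_year * hourly_cost
    first_year_cost = tool_annual_cost + setup_cost
    return annual_savings, first_year_cost, annual_savings - first_year_cost

# Staffing agency: 200 resumes/week x 3 minutes = 10 hours/week of screening
savings, cost, net = first_year_payback(10, 30, 8000, 3000)
print(f"Saves ${savings:,} vs ${cost:,} first-year cost: net ${net:,}")
# prints: Saves $15,000 vs $11,000 first-year cost: net $4,000
```

If `net` is negative in year one, run the numbers again for year two before walking away; some projects, like the accounting firm's, simply have a longer payback than their owners want.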
3. Who Owns This Project?
Every AI project needs a single person who makes decisions, tracks progress, and answers questions. Not a committee. Not "the team." One name on one whiteboard.
That person doesn't need to be technical. They need authority to make choices and five to ten hours per week during the first two months, dropping to two to three hours after that. If nobody has that time, the project will stall in month one.
4. What Does Success Look Like at 90 Days?
"We want AI to improve customer service" is not a success metric. "AI handles 40% of incoming support tickets without human intervention, with a customer satisfaction score above 4.2/5" is a success metric.
Pick one or two numbers that you can measure before the project starts and after 90 days. Time per task, error rate, response time, volume handled, customer satisfaction score. If you can't define the metric, you'll never know if the project worked. And you'll keep spending money on something you can't evaluate. For a full measurement framework, see the AI ROI measurement guide.
Technical Questions
These three questions prevent the integration nightmares that kill projects after the check has been written.
5. What Needs to Connect to What?
List every system the AI tool needs to talk to: your CRM, email platform, phone system, database, accounting software, project management tool. Then ask the vendor: does your tool integrate with each of these? Natively, through an API, or through a third-party connector like Zapier?
Native integrations work best. API connections require developer time (budget $2,000-8,000). Zapier connections are cheapest but add a point of failure and sometimes lag by minutes. No integration path at all means manual data entry, which defeats the purpose. The integration checklist has the full compatibility matrix.
6. Where Does the Data Live?
Two parts. First: where is the data the AI needs? Cloud database means easy integration. Local spreadsheets mean a migration step. Paper files mean digitization before anything else. Second: where does the AI store its own data? If you handle sensitive information, cloud processing may require specific compliance certifications. Our security and privacy guide covers what to look for.
7. What Happens When It Breaks?
AI tools go down. APIs change without warning. Models get updated and start behaving differently. Your plan B determines whether a tool outage is a minor inconvenience or a business emergency.
For a customer-facing chatbot, the fallback might be routing to a human agent. For an internal document tool, staff go back to manual search for a day. For a sales forecasting tool, you pull last month's projections while the system recovers. Whatever the fallback, write it down before launch, not during a crisis at 9 AM on a Monday. Try our chatbot demo to see how fallback routing works in practice.
People Questions
Technology is the easy part. Changing how people work is where projects succeed or fail.
8. Who Will Champion This Internally?
The project owner from question three handles logistics. The champion is different: the person who gets excited about the tool, figures out creative uses, and convinces skeptical colleagues. Champions are usually mid-level staff who do the actual work the tool affects. If nobody on the team is even curious, mandated adoption produces compliance, not usage.
9. How Will You Train the Team?
One webinar is not a training plan. Our three-session training structure works because it spreads learning across two weeks: orientation, hands-on practice with real tasks, and workflow integration. Plan for three hours per person minimum.
Budget the training cost, not just the tool cost. A team of eight spending three hours each in training is 24 hours of labor. At $35/hour loaded cost, that's $840 before anyone opens the tool. Add the productivity dip during the first two weeks of adoption (tasks take 20-30% longer while people learn), and training is a real line item.
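The training line item above works the same way. A quick sketch with this section's example numbers (eight people, three hours each, $35/hour loaded cost); the helper is hypothetical:

```python
def training_line_item(team_size, hours_per_person, loaded_hourly_cost):
    """Direct labor cost of training, before anyone opens the tool."""
    return team_size * hours_per_person * loaded_hourly_cost

# Eight people x three hours x $35/hour loaded cost
print(f"Training labor: ${training_line_item(8, 3, 35)}")  # prints: Training labor: $840
```

Remember this covers only the training hours themselves; the two-week productivity dip during adoption is a separate, harder-to-pin-down cost on top.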
10. How Will You Measure Whether It Worked?
This is different from question four. Question four defines what success looks like. This question asks: can you actually measure it? If your success metric is "reduce email response time by 40%," do you know your current average response time? Can your email system generate that report?
Track the baseline for two weeks before turning on the AI tool. Without a baseline, you'll argue about whether the tool is working instead of knowing.
Vendor Questions
These last two questions protect you from vendor lock-in and unpleasant surprises.
11. What Happens to My Data If I Leave?
Ask this before signing. Some vendors let you export in standard formats with one click. Others make export difficult or charge for it. A few claim ownership of derivative data. Get the exit terms in writing: "We'll work with you" is not a commitment. "Export all data in CSV within 30 days at no charge" is. The vendor evaluation guide has the full list of contract terms to check.
12. Can You Show Me a Case Study for a Business My Size?
Vendors love showing case studies from Fortune 500 companies. A global retailer saving $2 million with AI tells you nothing about whether the tool works for a 15-person logistics company in Tampa.
Ask for a reference from a business with similar revenue, team size, and use case. If the vendor can't produce one, that's not automatic disqualification, but it means you're an early adopter for that vendor's product at your size. Price accordingly and negotiate harder on terms.
Using the Checklist
Run through all twelve questions before committing budget. Write down the answers. Any question where the answer is "I don't know" needs resolution before you proceed. Three or more unanswered questions means the project isn't ready.
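The readiness rule above is simple enough to write down. A tiny illustrative sketch, assuming you tally how many of the twelve questions still have "I don't know" as the answer:

```python
def checklist_verdict(unknown_count):
    """Apply the rule: any unknown needs resolution; three or more means wait."""
    if unknown_count == 0:
        return "ready to proceed"
    if unknown_count < 3:
        return "resolve the unknowns before committing budget"
    return "project isn't ready"

print(checklist_verdict(3))  # prints: project isn't ready
```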
The questions work for any AI project: chatbots, document processing, sales tools, scheduling automation. They also work at any budget level, from a $50/month subscription to a $50,000 custom build. Scale the diligence to the investment, but never skip the questions entirely. The AI readiness assessment pairs well with this checklist for a complete pre-project evaluation. And our first AI project guide picks up where this list leaves off.