AI tools can feel like having a knowledgeable colleague available around the clock. Ask a question, get an answer. Upload a document, get analysis. It's genuinely useful — but it comes with a catch that you need to understand clearly.
AI tools can be confidently wrong. They present information fluently regardless of whether it's accurate. This doesn't make them useless for research — far from it. But it means you need to use them as a starting point, not a source of truth.
This lesson covers what AI can reliably help with in research, what it can't, and how to use it wisely.
Explaining concepts is where AI shines. If you're entering a new field, encountering an unfamiliar concept, or trying to understand something complex, AI tools are excellent at providing clear, accessible explanations.
Example prompt: "I'm a small business owner with no finance background. Explain what EBITDA is and why it matters, in plain language."
AI is particularly good at this because it can tailor the explanation to your background and rephrase until the idea is clear.
Need to understand the differences between two approaches, products, or strategies? AI is good at structured comparisons.
Example prompt: "Compare the pros and cons of Xero vs MYOB for a small service-based business in New Zealand with 5 employees. I care about ease of use, cost, and payroll features."
If you have several documents or articles on a topic, AI can help you find themes, contradictions, and key points across them. Upload the documents and ask:
Example prompt: "I'm uploading 5 articles about remote work productivity. Summarise the key findings across all of them. Where do they agree? Where do they disagree?"
AI is useful for generating ideas and angles you might not have considered.
Example prompt: "I'm researching why staff turnover is high in our customer service team. What are the most common causes of turnover in customer-facing roles? What questions should I be asking in exit interviews?"
If you encounter statistics or data in a report, AI can help you interpret what they mean.
Example prompt: "This report says our NPS score dropped from 42 to 35 quarter-over-quarter. Is that a significant drop? What might cause that kind of change?"
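Before asking the AI to interpret a figure like this, it's worth doing the basic arithmetic yourself — the size of the change is something you can verify directly. A quick sketch of the calculation for the example above (the numbers are only those from the example prompt):

```python
# Quick check on the NPS example: absolute and relative change.
previous, current = 42, 35

absolute_drop = previous - current          # 7 points
relative_drop = absolute_drop / previous    # fraction of the previous score

print(f"NPS fell {absolute_drop} points ({relative_drop:.1%})")
# A 7-point fall is roughly a 16.7% decline quarter-over-quarter.
```

Knowing it's a ~17% relative drop, not just "7 points", gives you a sharper follow-up question to put to the AI — and a number you can check against any benchmark it quotes back.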
AI tools generate text based on patterns in their training data. They don't "look up" facts in a database. This means they can produce plausible-sounding but incorrect names, numbers, dates, and citations.
Rule of thumb: If a specific fact, number, or claim matters, verify it independently. Search for the source. Check the original. Don't treat AI output as a citation.
AI can explain what a legal concept means. It can't give you legal advice for your specific situation. The same applies to medical, financial, tax, and engineering domains. AI doesn't understand your context the way a qualified professional does.
Use AI to prepare for conversations with experts, not to replace those conversations.
Most AI models have a training data cutoff — they don't know about events after a certain date. Tools with web browsing (ChatGPT, Gemini, Perplexity) can access current information, but even these can miss very recent developments or mix current and outdated information.
Always check: "When was this information current?" If the AI doesn't specify, ask.
AI doesn't know your organisation, your market, your customers, or your history. It can give general advice, but the most valuable research insights come from applying general knowledge to specific contexts — and that's still a human skill.
Here's a workflow that makes the most of AI's strengths while protecting against its weaknesses:
Ask the AI to explain the topic, identify key concepts, and suggest what to look into further. This gives you a map of the territory.
Once you understand the basics, ask more specific questions. Upload relevant documents for the AI to analyse. Ask it to compare sources, identify gaps, or challenge assumptions.
For any facts, statistics, or claims that matter, verify them independently: search for the original source and check it yourself before relying on the AI's version.
AI can gather and organise information, but the synthesis — deciding what it means for your situation — is yours. The AI doesn't know what you know.
If your research will inform a decision or be shared with others, note where your information came from. "AI-assisted research, verified against [source]" is honest and appropriate.
Different tools suit different research needs: tools with web browsing (such as ChatGPT, Gemini, and Perplexity) are better for current information, while any capable model can explain concepts, compare options, or summarise documents you upload.
The single most important skill for using AI in research is developing a verification habit. Not everything needs to be fact-checked — if you're using AI to brainstorm ideas, the ideas themselves don't need verification. But when you're dealing with specific facts, statistics, names, dates, or claims that will inform a decision or be shared with others, always verify before relying on them.
This isn't a weakness of AI tools — it's how research has always worked. You wouldn't cite a single Wikipedia article without checking the sources. AI deserves the same healthy scepticism.
Try it yourself: AI-assisted research on a real topic of your own, using the workflow above.
1. What is the most important habit to develop when using AI for research?
Answer: Verifying specific facts and claims independently — AI can be confidently wrong, so important information should always be checked.
2. What is AI particularly good at during the research process?
Answer: AI excels at explaining concepts, structured comparisons, and synthesis — tasks that help you understand a topic quickly.
3. Why should AI not be treated as a replacement for professional advice (legal, medical, financial)?
Answer: AI lacks understanding of your specific context and can be confidently wrong — professional advice requires qualified human judgment.