- Ask AI to explain concepts and test your understanding, not to produce work you'll submit as your own.
- Use it to generate practice questions from your own notes. This is active recall at scale.
- AI "hallucinates": it confidently states things that are false. Never use a fact, quote, or citation from AI without verifying it against a primary source.
- The ethics line is clear: if you couldn't explain the reasoning behind a sentence in your own words during a conversation, you shouldn't have let AI write it.
- The more context you give, the better the output. Vague prompts produce vague answers.
The Core Distinction
AI becomes a problem when you use it to skip the thinking. It becomes an asset when you use it to extend your thinking. The difference isn't the tool; it's whether your brain is engaged.
If you paste a question from your homework and copy the answer, you haven't learned anything. You've also built a habit that will cost you in exams, interviews, and any situation where AI isn't available. If you instead ask AI to explain the concept behind the question, quiz you on it, and tell you where your reasoning breaks down, you've used the same tool to learn faster than you could alone.
Using AI as a Tutor
The most effective use of AI for learning is asking it to guide your thinking rather than replace it.
Ask for explanations, not answers
When a textbook section isn't clicking, ask: "Explain [concept] using a simple analogy." Or: "Walk me through the steps of this problem, but don't give me the final answer yet; let me try the next step." The goal is to understand the logic, not just get the result.
Use the Socratic method
Tell the AI: "I'm studying [topic]. Ask me one question at a time to test my understanding, give me feedback on my answers, and tell me when I'm wrong." This turns AI into an interactive tutor rather than a search engine.
Get feedback on your own work
Paste a paragraph you wrote and ask: "Critique this for clarity and logic. Don't rewrite it, give me three specific suggestions for improvement." You stay in control of the writing; AI helps you see the gaps.
Give it context
Vague prompts produce vague answers. "Explain psychology" will get you a textbook intro. "I'm a sophomore studying clinical psychology, explain the difference between CBT and DBT and when each is used" will get you something actually useful.
Active Recall at Scale
One of the most practical uses of AI is generating practice materials from your own notes. Paste your notes into a conversation and ask for flashcard questions, practice problems, or a short quiz. When you then answer those questions from memory, you're doing active recall, one of the most effective study techniques known, with material customized to exactly what you're studying.
Some prompts that work well:
- "Based on these notes, generate 10 challenging questions. Show the questions first, then the answers below a clear divider."
- "I'm writing an argumentative essay with this thesis. What are the three strongest counter-arguments? Where is my logic vulnerable?"
- "Create a study schedule for this syllabus, assuming I can study two hours a day and my midterm is on [date]."
The Ethics Line
Academic institutions and employers are increasingly clear about where AI assistance crosses into academic dishonesty or professional misrepresentation. The line is simpler than most people make it.
| Fine to use AI for | Use judgment here | Don't do this |
|---|---|---|
| Explaining concepts you don't understand | Brainstorming an outline you then write from | Generating text you submit as your own writing |
| Generating practice questions from your notes | Rewording a confusing sentence you drafted | Having AI write an entire essay or report |
| Checking your grammar and catching typos | Organizing research you've already gathered | Using AI-generated facts without verifying them |
A useful test: if someone asked you to explain the reasoning behind a sentence in your submitted work during a conversation, could you do it? If the honest answer is no, you've crossed the line.
AI Hallucinations
This is the most important technical fact to understand about how AI works. A language model predicts the next most likely word in a sequence; it doesn't actually retrieve or verify facts. This means it can, and does, confidently state things that are completely false: wrong dates, fabricated citations, quotes attributed to people who never said them, statistics that don't exist.
This isn't a bug that will eventually be fixed. It's a fundamental property of how these systems work. The practical consequence: never use a specific fact, date, statistic, or citation that came from an AI without finding the original source yourself and confirming it's real. See the Information Literacy page for how to vet sources quickly.
If an AI gives you a citation (a paper title, an author, a journal, a page number), search for it yourself before using it. AI-generated citations are frequently partially or entirely fabricated, and they have caused real harm to students and professionals who submitted work with fake sources they didn't check.