Does AI Make Students Dumber? The Calculator Problem in the Age of AI.

Kavitta Ghai
March 24, 2026

Does AI Make Students Dumber?

No, AI does not inherently make students dumber. But unstructured, unguided use of AI can. A 2024 study from the Wharton School at the University of Pennsylvania found that AI can make it harder for students to learn when used without pedagogical guardrails. The critical variable is not whether students use AI, but how they use it.

When AI is embedded into coursework with faculty-set guardrails — including Socratic prompting and restrictions on providing direct answers — research shows students achieve higher GPAs, stronger retention, and increased motivation. The risk educators should focus on is not AI in the classroom. It is the unstructured AI use already happening outside of it.

Does AI in education create dependency?

The concern is real, and it's not new. In a Forbes interview, Maggie McGrath put it perfectly: her eighth-grade math teacher used to watch former students pull out calculators at the grocery store to count change, even though he knew they could do the math in their heads. The tool had replaced the thinking.

That maps directly onto AI. When a student goes to a consumer AI chatbot and asks, "How do I solve this?" the AI solves it for them. That's what it's designed to do. Do that enough times, and the student stops learning how to get there on their own.

But here's what gets left out: 90% of college students are already using AI regularly. Those students are using AI without guardrails or guidance, and with no one teaching them the difference between using AI as a thinking partner and as a shortcut.

How is AI in schools different from a student using AI on their own?

The difference is guardrails, structure, and faculty control.

A generic chatbot has one objective: to answer the question. It doesn't know the course, the professor's expectations, or the pedagogy. It will write the essay and hand back a finished answer with no friction.

A purpose-built tool like Nectir works differently. Faculty set the prompts and guardrails: they can require the Socratic method, block direct answers to homework, check thesis alignment without writing the paper, and redirect students to office hours after a set number of exchanges. The AI mirrors how the professor teaches, not how a chatbot guesses it should respond.

Is there research showing structured AI improves learning?

Yes. Peer-reviewed research with Los Angeles Pacific University found that after one term of using Nectir AI with faculty-set guardrails, students saw:

  • 20% increase in GPA campuswide
  • 13% rise in average final scores
  • 36% boost in intrinsic motivation to learn

Anecdotal feedback from instructors at the California Community Colleges also pointed to increased student retention and lower drop, fail, and withdraw (DFW) rates.

How should schools think about AI dependency versus AI literacy?

Nobody argues calculators should be banned from math class. Instead, teachers teach students when and how to use them, and make sure they can do the math without one first.

AI needs the same approach. The goal is to structure it so students think critically with AI rather than outsource their thinking to it. Schools that ban AI aren't solving the dependency problem; they're pushing it off campus, where students use it anyway.

Watch the full Forbes interview where Kavitta discusses this and more.

Want to bring structured AI to your campus? Schedule a demo, and our team will walk you through how Nectir works and what it looks like at schools like yours.


This is the future of education.

Join the 45,000+ students, faculty, and staff using Nectir.