TL;DR
- When AI is purposefully designed for education, students leverage it for real learning instead of quick shortcuts.
- StudyFetch's AI learning platform powers authentic educational growth for more than four million students.
- One of the first large-scale studies of its kind, our new report analyzed 1 million anonymized student-AI conversations from more than 300,000 unique users on StudyFetch's platform between January and April 2025.
- We divided interactions into eleven categories to understand how students use AI on our platform.
- Learners rely on our AI tutor Spark.E primarily for concept explanations (41%), content summarization (23%), and step-by-step guidance (10%); direct answer requests represented just 2.6% of conversations.
- Students on StudyFetch are not delegating critical-thinking tasks to AI; instead, they are deepening their understanding and mastery of course materials.
- Notably, shortcut behaviors, like direct answer requests, declined by 80% as students gained experience and learned to use AI responsibly.
Introduction
It is a common concern in education circles that students primarily use AI for answer generation and cheating. This concern exists because students and educators were first introduced to AI as a generic productivity tool, framed as an amplified Google search rather than as a partner in education. Headlines like “Everyone is cheating their way through college”1 and “Increased AI use linked to eroding critical thinking skills”2 perpetuate this belief.
The majority of platforms available to students today are not designed to engage students in higher-order thinking aimed at mastery of learning outcomes. The design and implementation of generic AI platforms have steered usage patterns toward academic shortcuts rather than toward using AI as a thought partner for understanding material.
Our Hypothesis
When students are provided with AI platforms specifically designed to support authentic educational growth, they learn to use AI effectively: as a thought partner rather than a tool for shortcuts.
StudyFetch's platform was designed to support advanced learning and critical-thinking skills, and it offers a wide variety of AI-based study tools. Features that distinguish StudyFetch from a simple LLM include a conversational AI tutor, flashcards, quizzes, tests, games, audio recaps, video explainers, notes, insights, and more. For the purpose of this comparative analysis, we focus on our AI chatbot, which is comparable to generic chatbots like ChatGPT, Claude, or Gemini in capabilities and user experience.
Methodology
Our research analyzed one million randomized student conversations with our AI tutor Spark.E between January and April 2025, representing more than 300,000 unique users. Conversations were anonymized to protect student identities, and all personal information was removed from the data. Student data is never used to train our AI systems. This analysis focused exclusively on how students interact with AI, not the specific content of their discussions, and no original conversation text was included in any data categories or the resulting analysis.
To precisely identify patterns of student intent, we categorized each student-AI message into one of eleven interaction types:
| Intention Category | Description |
|---|---|
| Concept Explanation Requests | Messages asking Spark.E to explain course concepts, theories, or terminology in simpler terms. |
| Content Summarization | Requests to summarize or condense lecture material, readings, or notes. |
| Elaboration Requests | Students who understand the basics but want deeper insights or connections between concepts. |
| Study Strategy Questions | Students seeking advice on how to study a particular topic or prepare for tests. |
| Application Questions | Students asking how to apply concepts to new contexts or real-world situations. |
| Clarification of Instructions | Questions about assignment instructions, requirements, or rubrics. |
| Step-by-Step Problem Guidance | Students asking for walkthrough explanations of how to solve specific problems. |
| Verification Questions | Students sharing their own work/answers and asking if they're correct. |
| Quick Fact Checking | Brief interactions to verify specific facts, dates, formulas, or definitions. |
| Direct Answer Requests | Messages directly asking for specific answers without showing interest in the learning process. |
| Other | Messages that don't fit into any of the above categories. |
Note: Step-by-Step Guidance is highlighted in yellow because students explicitly request a walkthrough of the process, indicating a desire to learn how to solve the problem, even though the AI may still yield a final answer at the end. This differs from Direct Answer Requests (red), where students seek only the solution with no learning intent.
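The report does not publish its classification pipeline, but the tallying step can be sketched roughly as follows. This is a minimal illustration under our own assumptions, not StudyFetch's actual code; in particular, the keyword heuristic in `categorize` is a hypothetical stand-in for whatever classifier was actually used:

```python
from collections import Counter

# The eleven interaction types used in the analysis.
CATEGORIES = [
    "concept_explanation", "content_summarization", "elaboration",
    "study_strategy", "application", "clarification_of_instructions",
    "step_by_step_guidance", "verification", "quick_fact_checking",
    "direct_answer", "other",
]

def categorize(message: str) -> str:
    """Hypothetical classifier stub: a real system would likely use an
    LLM or a trained model; a trivial keyword heuristic stands in here."""
    text = message.lower()
    if "summarize" in text or "summary" in text:
        return "content_summarization"
    if "step by step" in text or "walk me through" in text:
        return "step_by_step_guidance"
    if text.startswith("what is the answer"):
        return "direct_answer"
    if "explain" in text:
        return "concept_explanation"
    return "other"

def intent_distribution(messages: list[str]) -> dict[str, float]:
    """Share of messages per category, as percentages of the total."""
    counts = Counter(categorize(m) for m in messages)
    total = len(messages) or 1
    return {c: 100 * counts.get(c, 0) / total for c in CATEGORIES}
```

Applied over the full corpus, a distribution like this is what Figure 1 summarizes.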
AI Chat Intention Breakdown
Figure 1. Data Source - StudyFetch analysis of 1M anonymized student–AI conversations (Jan–Apr 2025)
Direct Answer Requests
Direct answer requests (2.6% of conversations), compared with Concept Explanation (41%), Content Summary (23%), and Step-by-Step Guidance (10%).
The data strongly indicates that students using the StudyFetch platform do not treat it as a means to academic shortcuts or cheating. In 41% of conversations, students asked Spark.E to explain concepts in student-friendly language. Only 2.6% of requests explicitly asked for answers (e.g., "What is the answer to…", "Give me the solution…", "Write this essay and make it sound like I'm a first-year psych student…").
The most common inquiry types were Concept Explanation (41%), Content Summary (23%), Step-by-Step Guidance (10%), Elaboration Requests (6%), and Study Strategy (6%). Altogether, 85% of chatbot interactions reflected earnest student intentions to engage with and understand academic materials.
A smaller proportion of students used Spark.E as a fact-checking assistant: Quick Fact Checking (1%), Clarification of Instructions (2%), Verification Questions (3%), and Application Questions (1%). In total, 7% of interactions served as an educational fact-finding resource or a substitute for teacher procedural questions.
These findings reinforce that purpose-built AI tools like StudyFetch facilitate authentic learning experiences rather than encouraging academic shortcuts.
Long-Term Learning Solutions
Direct Answer Requests Over Time
Students asked for direct answers far less often after their first few chats.
Educators often worry that continued reliance on AI tools will erode students' critical-thinking capabilities. Recent research by M. Gerlich (2025)3 suggests that how AI is used dictates its impact: "AI can enhance learning outcomes when used appropriately… Reducing cognitive offloading through active engagement can mitigate the negative impact of AI tools on critical thinking."
To measure cognitive offloading on StudyFetch, we tracked each learner's conversation patterns from their first chats through their hundredth. By normalizing usage to each student's first five conversations, we isolated genuine behavioral change resulting from overall platform familiarity.
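The normalization described above can be illustrated with a short sketch (our own assumptions, not StudyFetch's actual analysis code): each student's per-session behavior rate, such as the share of messages that are direct answer requests, is expressed as a percent change from the mean rate over that student's first five sessions.

```python
from statistics import mean

def percent_change_from_baseline(session_rates: list[float],
                                 baseline_n: int = 5) -> list[float]:
    """Express each session's behavior rate as a percent change relative
    to the mean rate over the student's first `baseline_n` sessions.
    A value of -80.0 means the behavior occurs 80% less often than it
    did in the baseline window."""
    baseline = mean(session_rates[:baseline_n])
    if baseline == 0:
        raise ValueError("baseline rate is zero; cannot normalize")
    return [100 * (r - baseline) / baseline for r in session_rates]
```

For example, a student whose direct-answer rate falls from 10% of messages in early sessions to 2% later would show a change of about -80% under this normalization, matching the magnitude of decline reported below.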
Over time, students diversified their AI usage, moving away from simple answer-seeking toward a balanced mix of learning strategies. Direct answer requests dropped by 79% in early sessions and stabilized at roughly 82% below initial levels. Verification questions fell 81%, and quick fact-checking messages declined 86%, eventually stabilizing 95% below initial rates.
Learning-focused behaviors endured: concept explanations decreased only 29%, while step-by-step guidance remained steady. Notably, study-strategy questions plummeted 96%, suggesting that StudyFetch's built-in study plans, scaffolding, and practice resources teach learners to manage their studying independently.
Academic Disciplines
Figure 2. Data Source - StudyFetch Study: Subject Categories
Consistent with previous research (von Garrel & Mayer, 2023)4, our analysis revealed distinct patterns in how students use AI across different academic disciplines. Students most frequently share study materials such as documents or lecture slides when using AI for Psychology (67%), Social Sciences (64%), and Medicine (63%).
Biology and Medicine students seek concept explanations at a higher rate than students in other subjects, likely due to the complex terminology and intricate systems in these fields. Both subjects show high rates of material sharing and extended conversation lengths.
Mathematics shows the highest proportion of Step-by-Step Problem Guidance requests (51%), demonstrating students' focus on understanding solution processes rather than merely obtaining answers. It also suggests that students learning math invest less effort in conceptually understanding theorems and more in learning how to solve problems. Mathematics also has the highest average conversation length, at nine messages per conversation.
Shortcut-oriented categories (Direct Answer Requests, Verification Questions, and Quick Fact Checking) collectively remain under 5% in every discipline.
Bloom's Taxonomy
Figure 3. Data Source – StudyFetch Study: Taxonomy Categories
Educators and learning scientists have long depended on Bloom's taxonomy to define student learning and plan learning goals. The concern exists that with the introduction of AI into education, students may delegate higher-order thinking tasks to AI tools (Gonsalves, 2024)5. By offloading critical-thinking skills that are essential to cognitive development, learners risk failing to become critical consumers and users of information.
Our analysis of student conversations reveals a very different usage pattern. The vast majority of interactions (64%) focus on the "Understand" level, where students primarily seek comprehension of concepts and ideas. Only 6% of conversations target "Create" and just 4% target "Evaluate," Bloom's two highest tiers. This suggests students are using StudyFetch's AI chiefly as a learning aid to build comprehension rather than outsourcing their most cognitively demanding tasks.
Research supports this behavior: "The use of generative AI tools enables students to quickly discover and structure a wide range of information, fostering a foundational understanding that serves as an initial guide for exploring complex topics" (Gonsalves, 2024)5.
Even in higher-order categories, iterative learning happens. Gonsalves notes that generative AI fosters metacognitive cycles of questioning, feedback, and adjustment, promoting adaptability and strategic thinking.
With purpose-built educational AI tools like StudyFetch, students naturally gravitate toward using AI to enhance understanding rather than complete their work for them.
Ongoing Studies
Our winter report on 1,000+ StudyFetch students revealed that 92% of regular active users improved their grades and that study time dropped by 30%. Pilot programs with partner schools are currently replicating these findings, showing significant grade improvements and greater study efficiency in real classrooms. We continue to expand these institutional pilots across diverse learning contexts, while multiple efficacy studies remain in progress as we refine the platform.
Conclusion
Our data confirms we're making strong progress toward our vision, ensuring that every student has the opportunity for authentic learning on a platform designed specifically for educational growth. While challenges remain, we're dedicated to advancing AI in education responsibly and ethically.
In partnership with today's educators, we are creating the framework for effective AI integration in education and are actively seeking additional partnerships with experts who prioritize genuine student learning outcomes. Through all our efforts, we maintain an unwavering focus: providing every student everywhere with the tools they need for academic success in ways that enhance the learning process.
References
- Walsh, J. (2025). Everyone Is Cheating Their Way Through College. Intelligencer via New York Magazine. https://nymag.com/intelligencer/article/openai-chatgpt-ai-cheating-education-college-students-school.html
- Jackson, J. (2025). Increased AI use linked to eroding critical thinking skills. Phys.Org. https://phys.org/news/2025-01-ai-linked-eroding-critical-skills.html
- Gerlich, M. (2025, January). AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking. Social Sciences Research Network. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5082524
- von Garrel, J., & Mayer, J. (2023). Artificial intelligence in studies—use of ChatGPT and AI-based tools among students in Germany. Humanities & Social Sciences Communications, 10, 1–9. https://www.nature.com/articles/s41599-023-02304-7
- Gonsalves, C. (2024). Generative AI's Impact on Critical Thinking: Revisiting Bloom's Taxonomy. King's College London. https://kclpure.kcl.ac.uk/portal/en/publications/generative-ais-impact-on-critical-thinking-revisiting-blooms-taxo