Beyond Detection: Designing AI-Resilient Assessments with Automated Feedback Tool to Foster Critical Thinking
–arXiv.org Artificial Intelligence
Beyond Detection: Designing AI-Resilient Assessments with Automated Feedback Tool to Foster Critical Thinking and Originality

Muhammad Sajjad Akbar, University of Sydney, Australia

Compiled April 1, 2025

ABSTRACT
The growing prevalence of generative AI tools such as ChatGPT has raised urgent concerns about their impact on student learning, particularly their potential to erode critical thinking and creativity in academic contexts. As students increasingly use these tools to complete assessments, foundational cognitive skills are at risk of being bypassed, challenging the integrity of higher education and the authenticity of student work.

Current AI-generated text detection tools are fundamentally inadequate in addressing this challenge. They produce unreliable, unverifiable outputs and are highly susceptible to false positives and false negatives, especially when students apply obfuscation techniques such as paraphrasing, translation, or structural rewording. These tools rely on shallow statistical features rather than contextual or semantic understanding, making them unsuitable as definitive indicators of AI misuse.

In response, this research proposes an AI-resilient, assessment-based solution that shifts focus from reactive detection to proactive assessment design. The solution is delivered through a web-based Python tool that integrates Bloom's Taxonomy with advanced natural language processing techniques, including GPT-3.5 Turbo, BERT-based semantic similarity, and TF-IDF metrics, to evaluate the AI-solvability of assignment tasks. By analyzing both surface-level and semantic features, the tool helps educators assess whether a task targets lower-order thinking (e.g., recall, summarization), which is more easily completed by AI, or higher-order skills (e.g., analysis, evaluation, creation), which are more resistant to AI automation.
This framework empowers educators to intentionally design cognitively demanding, AI-resistant assessments that promote originality, critical thinking, and fairness. By addressing the root issue of assessment design rather than relying on flawed detection tools, this research contributes a sustainable and pedagogically sound strategy to uphold academic standards and foster authentic learning in the era of AI.

KEYWORDS
Generative AI; ChatGPT; AI-resilient; Bloom's Taxonomy; Automated Assessments; AI-solvability; Automated Feedback

1. Introduction
Integrating AI technology with innovative thinking skills in the higher education (HE) environment has grown more challenging due to rapid digital innovation and ubiquitous data availability. In applied education, innovative thinking is essential: it entails thinking creatively to come up with original solutions to issues, enhance workflows, or open up new possibilities.
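The abstract's notion of scoring a task's "AI-solvability" against Bloom's Taxonomy can be sketched in simplified form. The verb lists, the solvability weights, and the `classify_task` function below are illustrative assumptions, not the paper's actual implementation (which additionally uses GPT-3.5 Turbo, BERT-based semantic similarity, and TF-IDF features):

```python
# Hypothetical sketch: classify an assignment prompt's dominant Bloom
# level by matching its action verbs against per-level verb lists, then
# map that level to a coarse "AI-solvability" estimate. All lists and
# weights here are illustrative, not taken from the paper.

BLOOM_VERBS = {
    "remember":   {"define", "list", "recall", "identify", "name"},
    "understand": {"summarize", "explain", "describe", "classify"},
    "apply":      {"use", "implement", "solve", "demonstrate"},
    "analyze":    {"compare", "contrast", "examine", "differentiate"},
    "evaluate":   {"critique", "justify", "assess", "argue"},
    "create":     {"design", "compose", "invent", "propose"},
}

# Assumption: lower-order levels are easier for a generative model to
# answer, so they receive a higher solvability score.
AI_SOLVABILITY = {
    "remember": 0.9, "understand": 0.8, "apply": 0.6,
    "analyze": 0.4, "evaluate": 0.3, "create": 0.2,
}

def classify_task(prompt: str) -> tuple[str, float]:
    """Return (dominant Bloom level, estimated AI-solvability)."""
    words = {w.strip(".,;:?!").lower() for w in prompt.split()}
    # Count verb matches per level; the most-matched level wins.
    hits = {level: len(words & verbs) for level, verbs in BLOOM_VERBS.items()}
    level = max(hits, key=hits.get)
    return level, AI_SOLVABILITY[level]

print(classify_task("Summarize and explain the causes of World War I."))
print(classify_task("Critique and justify the proposed rubric."))
```

In this toy version, a prompt dominated by "summarize"/"explain" is flagged as lower-order (high estimated solvability), while one dominated by "critique"/"justify" is flagged as higher-order; the real tool would combine such surface cues with semantic similarity rather than relying on verb matching alone.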
Mar-30-2025
- Country:
- Europe > Lithuania
- Vilnius County > Vilnius (0.04)
- Oceania > Australia
- New South Wales > Sydney (0.24)
- Genre:
- Research Report
- Experimental Study (0.46)
- New Finding (0.46)
- Promising Solution (0.34)
- Industry:
- Education
- Assessment & Standards (1.00)
- Educational Setting > Higher Education (0.70)
- Educational Technology > Educational Software (1.00)
- Information Technology > Security & Privacy (0.93)
- Technology: