As artificial intelligence (AI) continues to transform higher education, it’s hard to ignore the promise it holds: personalised learning, smarter administrative systems, and more efficient ways to support students. AI has the potential to reshape how education is delivered, offering tailored experiences and automated support. Yet a significant challenge often goes overlooked: AI’s impact on equality and diversity in education. In particular, we must confront how unequal access to AI tools may further entrench existing inequalities.
AI is increasingly becoming a key part of the educational landscape. From chatbots helping with administrative tasks to intelligent tutoring systems guiding students through complex topics, these tools can be incredibly beneficial. In theory, AI offers the potential to level the playing field by providing personalised, immediate support for all students. But this vision hinges on one crucial factor: access. Without it, AI risks becoming a powerful tool for those who can afford it, while leaving others behind.
The issue of access is not just about whether students can get their hands on AI-powered devices, but whether they can engage with the technology in ways that genuinely enhance their learning. Students from lower-income backgrounds, those in rural areas, and those attending institutions with fewer resources are often disadvantaged when it comes to accessing the technology that powers AI tools. For these students, AI could become just another technological divide, reinforcing the gap between the haves and have-nots.
Take the issue of digital infrastructure, for example. While AI tools may be designed to help personalise and support learning, they often require reliable internet access, powerful devices, and up-to-date software. In regions where these resources are not readily available, students are excluded from these AI-enhanced experiences. If we rely on AI to enhance learning outcomes without ensuring that all students have access to the basic tools required to use it, we risk deepening the digital divide and further marginalising already disadvantaged communities.
But access isn’t just about technology—it’s also about the kind of content AI systems are designed to deliver. AI tools are not built in a vacuum; they are developed by teams with their own biases and perspectives, and this can impact the effectiveness of AI for diverse learners. For example, if an AI system is trained on data that does not reflect the cultural or socio-economic diversity of the student body, it can lead to unintentional bias in the educational content it provides. This could result in AI systems that fail to meet the needs of students from underrepresented groups, leaving them to navigate a system that doesn’t recognise their unique learning requirements.
Equally concerning is the issue of bias in AI itself. While AI is often described as an objective and neutral tool, the reality is that AI systems can perpetuate and even amplify the biases present in the data they are trained on. If AI tools are used to assess students’ work or track their progress, there’s a risk that students from marginalised backgrounds could be unfairly disadvantaged. This could happen through the reinforcement of stereotypes, through discriminatory algorithms, or simply through a failure to account for the diverse experiences and challenges these students face.
Beyond this, there’s the ethical question of surveillance. AI tools often collect large amounts of data on students’ progress, behaviour, and interactions. While this data can be valuable for improving learning outcomes, it also raises concerns about privacy, consent, and how that information is used. For students from already vulnerable or marginalised backgrounds, the constant monitoring and data collection inherent in many AI systems can feel invasive. The benefits of AI could easily be overshadowed by the risks of over-surveillance, particularly if students’ data is used in ways they cannot control or fully understand.
So, how can we address these challenges and ensure that AI doesn’t become a tool that deepens inequality? First, we need to ensure that AI is accessible to all students, regardless of their socio-economic background or geographical location. This means investing in infrastructure, particularly in areas where students currently have limited access to the necessary technology. Governments, institutions, and tech companies need to prioritise digital inclusion by providing affordable, reliable access to AI-enhanced learning tools.
Moreover, we must be intentional about the design and implementation of AI systems in education. Developers must be mindful of the biases that can creep into the data used to train AI tools, ensuring that the systems are inclusive and capable of serving the needs of diverse student populations. This means not only designing AI tools that recognise and accommodate different cultural and socio-economic contexts, but also involving diverse voices in the development process to ensure that AI reflects a broad spectrum of experiences.
Finally, we need to think about AI’s role in assessment and evaluation. In an age of advanced AI tools that can generate essays or solve complex problems, we must reconsider how we measure students’ learning. If AI is used to generate content or assist students in completing assignments, how can we ensure the integrity of the learning process? Institutions need to rethink assessment strategies, focusing on process-based evaluations that track students’ progress and engagement over time, rather than simply evaluating a final product.
AI holds great promise for improving education, but only if it is integrated in a way that promotes fairness and accessibility. By addressing the challenges of access, ensuring that AI systems are designed with diversity in mind, and reconsidering how we assess learning, we can begin to ensure that AI truly levels the playing field in higher education. The future of education depends not just on embracing technological advancements, but on making sure that every student, regardless of their background, has the tools and opportunities they need to succeed.
Professor Earle Abrahamson PhD, NTF, PFHEA, FISSOTL is Professor in the Scholarship of Teaching and Learning at the University of Hertfordshire, UK.
The author's views are their own.