Expertise, Experience & Wisdom in the Age of AI
Their roles and consequences in shaping education and learning in academia.
Over centuries, expertise, experience and wisdom have played fundamental roles in shaping education and learning in academia. In fact, expertise has often been the bedrock upon which knowledge creation, education, and scholarly progress have depended. It represents deep, specialized knowledge in specific domains, with the mastery of theories, methodologies, and accumulated scholarship that define academic disciplines. Experience, on the other hand, encompasses practical knowledge gained through direct engagement with real-world applications as well as research and teaching. It brings distinct and irreplaceable value to academia, complementing expertise and enhancing both research and education. Wisdom transcends both, involving the meta-cognitive ability to synthesize knowledge across domains, understand contexts, make sound judgments, and recognize the limitations of what we know. In my view, wisdom can be elusive; it is exhibited by only a few, and usually only after many years of experience and substantial deep expertise.
Traditionally, these three elements have worked synergistically in academia. Expertise provided the foundation, experience added a practical dimension, and wisdom guided their application. Senior academics (with many years of experience) embody this integration, serving as both repositories of knowledge and arbiters of scholarly judgment.
In this short note, I examine how developments in artificial intelligence (AI) and the associated tools are fundamentally transforming the relationship between expertise, experience, and wisdom. Though each of these forms of knowledge brings unique value, their significance and application are being shifted in profound ways by AI. It is critically important for senior leadership in higher education (or, for that matter, in any organization) to recognize this enormous transformation, which will only accelerate in the years to come.
Expertise under pressure
AI tools can now generate sophisticated analyses, write research papers, and conduct literature reviews with remarkable competence. This democratization of access to expert-level knowledge is, on the one hand, liberating but, on the other, can be threatening to staff. Students now have access to capabilities that previously required years of specialized training. This raises an even more significant question: what is the value of developing deep expertise itself?
For instance, in the legal area, AI tools are capable of performing legal research, analysing case law across vast databases to identify relevant precedents, and drafting legal briefs and documents. This can put pressure on junior and even senior lawyers who have built their reputation on their mastery of case law and legal precedent, as AI tools can now synthesize legal information comprehensively and effectively.
In medical fields such as radiology and pathology, AI diagnostic tools can achieve accuracy rates matching or even exceeding those of human specialists. AI tools can analyse medical images, suggest differential diagnoses, and recommend treatment protocols based on vast databases of medical literature. This raises questions about the traditional model of medical expertise that emphasizes years of training and pattern recognition.
In creative writing, AI tools are able to generate stories, music as well as visual art that often rival human creativity. In 2023, we witnessed strikes in Hollywood, primarily led by the Writers Guild of America, where a major point of contention was the use of AI in scriptwriting and its potential to replace human writers, affecting writers’ job security and compensation. In the academic environment, students may begin to question the value of studying traditional artistic techniques when AI can produce compelling creative work almost instantly. Academics who have spent decades developing expertise in creative processes may find their specialized knowledge of techniques and form less distinctive when AI can potentially generate quite sophisticated creative outputs.
In computer science, AI coding assistants can generate functional code, debug programs, and explain algorithms. This puts pressure on the expertise of computer scientists and software engineers whose value proposition is based on deep knowledge of programming languages, algorithms, and software architecture. With such tools, students are able to build complex applications without understanding fundamental computer science principles!
Such examples1 show that AI is challenging the fundamental assumption that expertise requires years of specialized training to develop. When AI can perform tasks that previously required deep domain knowledge, it forces a reconsideration of what expertise actually means and where its enduring value lies. In academia, this pressure will be felt most acutely by mid-career academics who have invested heavily in developing specific technical skills that AI can now replicate. It will also affect early-career academics, who will now need to think clearly about how to chart their career paths working with AI tools. At present, it may be less threatening to senior scholars, whose wisdom and judgment remain somewhat difficult to automate, at least as of now.
Experience needs to be valued more
In my view, as AI makes expert-level information more easily accessible, greater weight should be placed on experience, making it even more valuable in decisions such as recruitment. For instance, the ability to distinguish between AI-generated content that sounds authoritative and content that is actually reliable requires deep practical experience. I view experience as lived experience: for instance, having carried out different projects in different environments, tackling different constraints and requirements, and learning from them.
For instance, a cardiology professor who has spent many years as a practicing physician can teach medical students not just what a heart murmur sounds like, but how to distinguish between the subtle variations that indicate different conditions, how patients actually present in emergency situations, and how to communicate difficult diagnoses. While AI tools can analyse ECGs with high accuracy, the experienced clinician knows that a patient’s anxiety about her chest pain can matter as much as her test results, and that bedside manner affects patient outcomes as much as technical precision.
A computer science professor who has worked in a large tech company can explain why elegant code sometimes fails in software production, how team dynamics affect software projects, and the difference between academic prototypes and industry solutions. They understand technical debt, the pressure of shipping deadlines, and how user feedback changes product design and development.
An academic in journalism who has worked as a war correspondent brings experiential knowledge about source protection, ethical decision-making under pressure, and the reality of reporting in dangerous conditions. They can teach students about the split-second decisions required when interviewing trauma victims, the practical challenges of fact-checking in rapidly evolving situations, and how newsroom economics affect story selection. AI tools can help with research and writing, but experience teaches the human judgment required for ethical journalism.
A political science professor who served as a diplomat can explain the gap between formal international relations theory and actual negotiations. They understand how cultural nuances, personal relationships, and behind-the-scenes dynamics affect treaty negotiations in ways that AI tools analysing diplomatic texts cannot capture.
Academics with lived experience, who have wrestled with real research challenges in different contexts and diverse environments, are not only better positioned to critically analyse different situations and communicate them to students, but are also better able to evaluate AI-generated outputs. In fact, the experience of working with AI tools, knowing their limitations and understanding their biases, has started to become a valuable form of experience in its own right!
Wisdom even more valuable
Wisdom may become the most valuable academic asset in the age of AI proliferation. However, as noted above, not everyone achieves wisdom, which often requires many years of experience and deep expertise. Wisdom involves understanding the broader implications of research, and encompasses the ability to ask the right questions (not just generate answers), maintain perspective on what truly matters, and recognize the limits of both human and artificial intelligence.
In technology, disciplinary wisdom can involve understanding not just how to build a system but also the consequences of deploying it, the implications for maintainability, and the people who will work with it later. Wisdom can also involve seeing patterns across different systems and anticipating how technologies will evolve or interact. This wisdom includes knowing when to break established rules, when complexity is warranted versus when simplicity serves better, and how to balance competing priorities like performance, security, and usability. The deepest disciplinary wisdom often emerges when someone connects their specialized knowledge to broader patterns of human experience, which enters the realm of philosophy.
Emerging Future
In 2003, when Google search became a successful product2, it provided pointers to sources of information. This meant one could always go to the source when the required information was needed. Now AI tools such as ChatGPT3 and Copilot4 go to the information source, summarise, and present the collected information in a convenient way for humans to consume. With these tools at their fingertips, students can easily access the information they need in a form that suits them; the information can be customized and personalised to each student’s needs. Such an environment suggests a future where academic value lies not in the possession of information but in its curation and synthesis, and most of all in judgment based on critical analysis and experience. The most valuable academics of the future are likely to be those who can combine deep domain expertise with wide practical experience (including the use of AI tools) and the wisdom to make judgments in their own disciplines while maintaining high values and ethical standards.
Though this transformation is still unfolding, it is critical that people who hold leadership positions take the time to understand what is happening. They need to understand how such a transformation can affect their organization across a range of its operations, from service provision to recruitment and career development. They need to ensure that their organization continues to evolve with these transformations; otherwise, it will be left behind as the world keeps moving. But this is not about investing in AI blindly, or using AI for cost-cutting measures alone. No organization has become the jewel of its sector just by cutting costs. Rather, leaders ought to start thinking about how AI can be synergistically deployed with the workforce to create new services, which can help achieve genuine competitive advantage and hence make a difference.
About the Author
Professor Vijay Varadharajan is an Emeritus Professor at The University of Newcastle and an Honorary Professor at Macquarie University. With over 30 years of experience across the UK, USA, and Australia, he has held prominent roles such as the Global Innovation Chair Professor in Cyber Security and the Microsoft Chair Professor in Innovation. His extensive career spans both academia and industry, including leadership positions at Hewlett-Packard, British Telecom, and Microsoft. A recognized expert in Cyber Security, Cloud Computing, Internet of Things (IoT), and Big Data Security, he has served on several international advisory boards for government and technology sectors. He holds a PhD in Computer and Communication Engineering and is a Fellow of both the Australian Computer Society and Engineers Australia.
Such expertise is mainly application-, domain-, or sector-based rather than general.
Google search was officially launched in 1998. Larry Page and Sergey Brin initially developed a search engine called “BackRub” in 1996, which later evolved into Google Search.
Product of OpenAI.
Product of Microsoft Corp.