AI Guidance

Updated 12 May 2025


The purpose of this document is to set out the principles that define the Faculty’s engagement with AI. It is intended to be neither comprehensive nor definitive: this is a rapidly evolving field, and we anticipate that the document will be updated at regular intervals. It should be read in conjunction with the other documents to which it provides links, and students should consult their tutors or supervisors on any specific issues regarding the use of AI.

The Faculty cautions students against the uncritical use of generative AI. In particular, using AI tools to prepare essays or other written texts will produce work that is bland and impersonal, and that may contain rogue references (often termed hallucinations) to non-existent sources or works. It is therefore essential that students approach AI as a critical tool, much as they would any other resource. The most important tool, however, is their own historical intelligence.

Guided by the Russell Group Principles on the use of generative AI tools in education, the Faculty supports students who wish to use AI to complement their learning, while also seeking to ensure that students are aware of the limitations and biases of AI, which can result in misleading or incorrect information.

Definitions of AI for the purposes of this document

Basic AI

Where an AI tool simply automates tasks (e.g. sorting data, generating graphs, or analysing trends) and the focus is on automation and analytics rather than on creating new content.

Generative AI

Where an AI tool provides substantial intellectual contributions (e.g. generating summaries, insights, or interpretations) and the focus is on creating new content. Generative AI models such as ChatGPT work by predicting text based on the information on which they have been trained. These models (currently) have no understanding or consciousness; they simply generate the next most likely word based on probabilities.

Generative AI

Using generative AI for exploratory purposes

The Faculty recognises the value of students using generative AI, particularly for exploratory purposes and non-assessed tasks. There are good examples and suggestions of how to use generative AI tools for particular outcomes on the University website, e.g. Use of generative AI tools to support learning | University of Oxford.

 

Using generative AI in coursework and assessments

There are two important considerations when using information generated by AI: firstly, the accuracy and integrity of the information generated; secondly, how that information is then used to inform work submitted for assessment. If used, generative AI tools must support learning rather than replace independent thinking or critical analysis. These tools can enhance understanding and improve clarity, but they are not substitutes for original work or intellectual effort.

It is imperative that students understand that submitting AI-generated content as their own work for assessment constitutes a breach of academic integrity and is considered plagiarism, a disciplinary offence (see Plagiarism | University of Oxford).

Using AI responsibly

Generative AI applications may store or use the data students upload to improve their models, depending on the platform’s terms of service. Before using tools like ChatGPT or Claude, students are advised to review their privacy policies to understand how their data will be handled.

Students should not share unpublished work by others on AI platforms without the authors’ explicit permission, as doing so could violate their intellectual property rights and privacy.

Students should always exercise caution when inputting sensitive or private information into AI tools and only collect the data necessary for the task, anonymising information wherever possible.

Once shared, AI-generated content becomes the responsibility of the initiator, and any uncredited use, including over-reliance, may constitute academic misconduct under Statute XI of the University’s Code of Discipline and result in disciplinary action.

It is important to recognise, too, that AI uses considerable energy. Over-use of AI resources therefore adds to the energy footprint of the Faculty, with a consequent impact on the natural and human environment.

FAQs on the appropriate use of Basic AI and Generative AI tools in coursework and assessments

1. Can I use Basic AI tools to check my spelling and grammar?

Yes, AI tools that focus on automation and analysis rather than creating new content may be used for supporting tasks, e.g. checking for spelling and grammatical errors and improving the clarity and consistency of prose.

However, submitting assessed work created by writing assistants such as GrammarlyGO is prohibited, because these tools go beyond simple grammar checks and can rewrite sentences or even generate full paragraphs.

You are responsible for maintaining the integrity of the content of your submitted work, which should reflect your original thinking.

2. Can I use generative AI to help with brainstorming and research?

You may use generative AI tools to assist with brainstorming ideas, gathering information, organising your thoughts, or refining your research direction. However, the ideas, analysis, and final work must reflect your own critical thinking, creativity, and writing style. AI should support your process, not replace your unique insights and learning.

3. Can I use AI translation or transcription tools?

The use of AI translation tools, e.g. Google Translate, is permitted for reference purposes, such as understanding unfamiliar words or phrases or checking grammar. Translation tools can also be helpful in locating portions of text relevant to one’s research questions, but students should not rely on the accuracy of the results, which should always be checked. Consultation with supervisors is advised, as students may be penalised for inaccuracies.

Given the varying levels of accuracy of transcription tools, they should only be used with the advice of the supervisor or tutor, and their use should always be acknowledged. Students using such tools should check the accuracy of the results and be aware of the importance of appropriate training in palaeography.

4. Can I use AI as part of my research method?

Yes, you may use AI as part of your research process or method, for instance in automated transcription, topic-modelling, or other forms of text analysis. If AI is an integral part of your research method, this must be acknowledged and explained in the methods section of your essay or dissertation.

5. Do I need to acknowledge or cite my use of AI?

If you are using AI for the purposes outlined in questions 1–3, you do not need to cite it, unless your tutor has requested that you do so or the specific assessment requires it.

If AI tools contribute significantly to your work (e.g. question 4), and your tutor has approved their use or the specific assessment requires it, you must acknowledge and cite their use.

For guidance and examples on how to cite AI tools, see:

Artificial Intelligence - Managing your references - Oxford LibGuides at Oxford University

Cite Them Right

6. Who should I ask if I am unsure whether my use of AI is acceptable?

Your tutor or supervisor should be your first point of contact. They will be able to clarify what constitutes acceptable use of generative AI for your particular assessment.