Draft Staff Guide to the use of Generative Artificial Intelligence

This is a working document and has been informed by the work of the Artificial Intelligence (AI) in Teaching and Learning Working Group. It is recognised that AI is rapidly evolving, and this document should be considered as a living document. This is version 1.1, created in August 2023. 

What is Generative Artificial Intelligence? 

Generative Artificial Intelligence (AI) refers to the use of AI to create new content, including text, audio, video, computer code and music. ChatGPT is the most widely known generative AI tool, attracting significant media coverage, interest, and experimentation with the technology. 

ChatGPT was the first experience that many of us had with conversational interfaces, allowing us to ask the system to complete tasks iteratively, refining the output that was produced based on our natural language prompts. The system is trained on large data sets from three primary sources of information:

  1. information that is publicly available on the internet

  2. information that is licensed from third parties, and

  3. information that users or human trainers provide.  

Other generative AI technologies include Google Bard, DALL-E 2 and Copilot. 

ChatGPT is one of several solutions released by OpenAI. OpenAI started as a not-for-profit research lab before attracting venture capital investment, including investment from Microsoft. As a result, the tools are being integrated into existing business and personal applications such as web browsers (Microsoft Edge) and the Microsoft Office suite. As the solutions can be licensed, new plugin architectures are evolving that allow other businesses to integrate their services with the tools. Similar commercial approaches are evolving with other solutions, such as Google Bard, whose functionality will be surfaced in Google Docs. This is an important point: AI tools are, and will continue to be, embedded within applications that we use daily. 

Limitations of the tools 

The most rapid evolution in these tools has been natural language conversational interfaces, which appear to respond in an intelligent, human-like way. However, the tools lack other attributes that we consider human, including morality, critical thought, and common-sense judgement. The tools provide output based on patterns contained within training data, which may contain bias or be incomplete, out of date or, in some cases, entirely false. 

Some of the current limitations of Large Language Model (LLM) AI tools include:

  • The tools do not understand what the words they produce mean. 

  • The tools will often generate arguments that are wrong. 

  • The tools will often generate false references and quotations. 

  • Content generated is not checked for accuracy. 

  • The tools can distort the truth and emphasise the strength of an opposing argument. 

  • The tools do not perform well on subjects that do not have a lot of public online discourse. 

  • The content generated is based on an historical data set which is fixed in time. 

  • Generated content can include harmful bias and reinforce stereotypes. These biases can be reinforced through further human interaction with the model. 

  • The models are trained on a data set from a Western, English-speaking perspective, which again reinforces particular viewpoints. 

Prompting AI tools effectively is likely to be a useful digital skill, but users should also understand the limitations and remain open, curious, and critical when making judgements about the accuracy of the content generated. 

Using AI tools at Ulster 

Information for all staff 

At Ulster, we believe that these tools will be a part of our personal and professional lives, and we wish to explore their use in ethical, transparent, and reasonable ways. It is entirely appropriate to use generative AI as part of your work, provided you do not claim work generated by AI as your own. 

You may wish to use the tools as part of your workflow to check ideas, or to seek guidance if you have questions. The technologies are constantly evolving, and many of us are discovering new ways to use the tools, often informed by our own professional contexts. 

Some of the technologies we license, including Blackboard and Microsoft solutions, will have generative AI functionality in future releases. 

Information for staff involved in teaching and assessment 

The wider HE discourse has naturally focused on assessment and concerns about academic integrity. Ulster has a long history of active learning pedagogies combined with authentic assessment design, and the current AI in assessment discussions can help us refocus on assessment design that measures active learning, critical thinking, problem-solving and reasoning skills rather than written assignments measuring declarative knowledge. Personalised, reflective accounts, developed iteratively as understanding develops, are also valuable approaches, and some subject disciplines have been using video and oral presentations to measure understanding and create a more personalised approach to assessment. These diverse approaches to assessment are identified as good practice across the sector: they are more inclusive while reducing the risk of plagiarism. 

The QAA has recently published guidance on how to approach the assessment of students in a world where students have access to Generative Artificial Intelligence (AI) tools. 

There are, however, many situations where alternative assessment or assessment redesign may not be practical, or where changes may take time, and many colleagues have become curious about AI detection software. While there are tools that claim to detect AI, they demonstrate varying levels of reliability. Jisc and the QAA have provided helpful information on these detection tools: 

Jisc notes: “AI detectors cannot prove conclusively that text was written by AI.” 
Michael Webb (17/3/2023), AI writing detectors – concepts and considerations,  

Jisc National Centre for AI 

The QAA advises: “Be cautious in your use of tools that claim to detect text generated by AI and advise staff of the institutional position. The output from these tools is unverified and there is evidence that some text generated by AI evades detection. In addition, students may not have given permission to upload their work to these tools or agreed how their data will be stored.”
QAA (31/1/2023), The rise of artificial intelligence software and potential risks for academic integrity: briefing paper for higher education providers 

OpenAI, as of 24th July 2023, has disabled its own detection service following concerns about accuracy. 

Turnitin does provide an AI detection service, which is integrated within normal grading workflows. When grading a paper, instructors are presented with a prediction of the likelihood that a piece of work was generated by AI tools such as ChatGPT. 

The AI working group at Ulster had many concerns from ethical, accuracy, and privacy perspectives and initially made the decision not to enable the tool. This was very much in line with the higher education sector within the UK. However, some academic teams were experimenting with commercial detection services due to the lack of a centralised system, and the Teaching and Learning Committee made the decision to enable the tool during Semester 3 of academic year 23/24 to better understand the functionality and inform decision making. 

There have been many examples of false positives, and the system should be considered experimental at this early stage. Students do not see their score. This guide outlines Ulster’s current position as of July 2023; the key message is that, unlike an originality score, Turnitin can provide no evidence as to how the AI score was generated, making any academic integrity judgement difficult. 

Please note that, as of 7th September 2023 and following the pilot, the Turnitin AI detection tool has been disabled.

The availability of detection tools does provide the opportunity to talk to students about the use of generative AI and we encourage you to have conversations with your students and be clear about what is appropriate use within your subject context. This will differ greatly between subject disciplines. Ensure students are aware of the student guide to the use of AI, which also contains some ideas for what appropriate use may be for your context. 

Ulster’s position on the use of AI in teaching and assessment is to: 

  • Encourage a University culture that upholds the value of integrity 

  • Reinforce the expectation that work submitted for assessment is students’ own original work. 

  • Remain open to the benefits of the use of AI whilst highlighting the dangers of relying on the outputs as accurate sources of information. 

  • Develop guidance about how to accurately acknowledge the reasonable use of AI in student work. 

  • Encourage critical dialogue when AI tools are used within the curriculum.   

Experimenting with the tools 

Before experimenting with any generative AI tool, you should give some consideration to privacy. We do not know what data is being collected, by whom, or how it is applied when we use these tools. For this reason, you should not share personal or sensitive data; for instance, it would not be appropriate to ask an AI tool to perform analysis on a dataset containing student data. 

Currently, ChatGPT can be tested free online, but be careful: there are also paid-for subscriptions. You might start experimenting with the tool by asking a question such as: 

  • What are the ethical considerations of using Generative AI? 

  • Explain AI bias in a way that a child can understand 

You can get more targeted results by being more specific with your prompts. 

  • Tell me how [add query] works in 50 words.  

  • Behave as a higher education lecturer. [Add query]  

  • Write a four-paragraph summary about [add query] 

  • My Excel spreadsheet has two columns, A and B. How can I find results that are in both columns? 

  • You may also wish to test some subject-specific prompts, such as “Build an HTML website homepage with three columns and a hero image” or “Can you explain standard deviation using an Excel example?” 
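The spreadsheet prompt above is, underneath, a set-intersection question, and an AI tool will usually explain it in those terms. As a minimal sketch (the column values below are made-up sample data standing in for columns A and B), the same logic looks like this in Python:

```python
# Hypothetical sample data standing in for two spreadsheet columns.
col_a = ["apples", "pears", "plums", "cherries"]
col_b = ["plums", "grapes", "apples"]

# Values present in both columns: a set intersection.
common = sorted(set(col_a) & set(col_b))
print(common)  # ['apples', 'plums']
```

In Excel itself, an AI tool will typically suggest a formula-based approach instead, but the underlying idea is the same: test each value in one column for membership in the other.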

Acknowledging the use of Generative AI tools

Ulster’s Academic Misconduct Policy has been updated in academic year 23/24 to explicitly mention Generative AI and to include information about acknowledging its use. 


Where generative artificial intelligence (AI) tools have been used for academic purposes, they must be acknowledged appropriately to ensure that any output is not misconstrued as the author’s own work. Before students begin any piece of assessed work, they should be clear about whether the use of AI tools is authorised and appropriate, as this practice may differ across modules and programmes of study. 

Use the links below to find out more about citing and referencing AI in the Harvard style for your faculty.

If using a referencing style other than Harvard, please contact your Library Subject Team.