Redesigning Processes for Detecting Misuse of AI in Student Work


This challenge is currently closed, but if you would like to discuss a potential solution, we are still open to conversations.

Support is fully funded by the UK Government through the UK Shared Prosperity Fund. Tees Valley Innovation Challenge is delivered by Edge Innovation Ltd and Health Innovation North East and North Cumbria on behalf of the Tees Valley Mayor and Combined Authority, and the Tees Valley Business Board.

Middlesbrough College is a prominent education provider in the Tees Valley area. With AI tools such as ChatGPT and Snapchat AI now highly accessible worldwide, Middlesbrough College has made it a priority to implement effective policies and procedures that mitigate the potential negative outcomes of these tools, whilst encouraging their effective, appropriate use.

Generative AI offers huge potential across all sectors of work, especially in its ability to reduce administrative workload and accelerate content and knowledge creation. It is also an opportunity as an area of education, with the potential to make students more competitive in future job markets. However, current use of generative AI threatens the academic integrity of the College’s assessment processes and risks harming learning outcomes, as students may use it to take shortcuts in completing academic work.

Checking for AI-specific plagiarism is currently an onerous process because fully automated systems are unreliable.


People current state and future state


Curriculum Staff and Managers (current state):

Time efficiency: At present, automated AI detection is inaccurate, producing false positives while also failing to identify genuine misuse. This unreliability means staff must rely on manual processes to detect AI plagiarism and misuse. Manual checks are time-consuming, creating a significant increase in workload.

Work life balance: High administrative workloads redirect resources away from improving the quality of learning and harm work-life balance.

Confidence in using AI: The negative effects of AI use have created a poor perception of AI amongst staff, hindering acceptance of its use in any form. There is also a lack of confidence in the quality of learning outcomes as a result of AI misuse.

Curriculum Staff and Managers (future state):

Time efficiency: Automated AI detection is used in harmony with manual checks. Improved reliability allows for significant improvements in time-efficiency. 

Work life balance: An improved system allows AI to be used to improve efficiency, giving teachers more time and autonomy. 

Confidence in using AI: Confidence and trust in an established system means staff feel confident in what they are asked to do and the quality of the outcome. 


Students (current state):

AI integrated into education: Students utilise AI as a tool whilst completing time-consuming work. Currently, students may lack the skills to use AI appropriately and effectively.  

Quality of learning outcomes: Misuse of AI threatens the development of essential skills, damaging students' future employability.

Perceptions of fairness: A lack of accuracy and transparency in the current system can result in students being falsely accused of AI plagiarism, damaging the trust between students and teachers and consequently the College's learning environment. 

Consequence for AI misuse: The unreliability of the current system potentially allows students to misuse AI without consequence. This undermines efforts to discourage misuse and prevents work from being assessed fairly from student to student. Peers who do not use AI may feel cheated, further encouraging its use.

Students (future state):

AI integrated into education: Students are able to use AI in an appropriate way, improving learner outcomes.  

Quality of learning outcomes: Students' educations are authentic and robust, making them better equipped to enter employment.

Perceptions of fairness: Confidence in an established system from both staff and students creates an improved learning environment. Students do not fear false accusations of AI misuse. 

Consequence for AI misuse: Students are confident that AI misconduct is handled fairly and appropriately. Fair consequences for misuse discourage students from committing AI plagiarism.

Local Employers (current state):

Desirability of local graduates: Due to the threat generative AI poses to the integrity of students’ educations, there may be a sector-wide lack of confidence in the quality of graduates despite their qualifications.

Implementation of AI tools: Many employers have not yet successfully implemented AI tools into their industry. This could be improved with graduates educated in effective and appropriate use of AI. 

Local Employers (future state):

Desirability of local graduates: Employers are more motivated to hire College graduates because they are confident in the quality of graduates' skills and education, increasing local employment.

Implementation of AI tools: Graduates are knowledgeable in AI skills and tools, helping employers to integrate AI into their industry. 

Technology current state and future state

Current state

Accurate identification of misuse: The manual investigative process often relies on familiarity with student work and recognising when written work is atypical. This method is effective but requires the marker to be very familiar with the student's previous work and leaves room for human error, a task made harder by the increasing number of AI language models available. Automated alternatives for identifying AI plagiarism are unreliable.

Time intensity: Recognising and evidencing misuse is time-consuming for both staff and students. While automated systems exist, false positives can result in further time investment.

Evidencing misuse: Currently, misconduct investigations involve conversations with learners and additional in person assessments which provide a point of comparison to the alleged plagiarised work.  Staff need to feel confident in their conclusions before beginning this process. 

Trust: False accusations or unidentified misconduct can lead to variations in fairness on a case-by-case basis. This has the potential to damage relationships between staff and learners and create a general lack of confidence in the education provider's ability to respond to AI effectively.

Education on AI use and consequences: The current lack of a standardised process creates challenges in ensuring that students understand what constitutes misconduct and what the resulting consequences are.  

Preventative measures: It is difficult to implement effective preventative measures. Current methods include blocking access on networked computers and organising more supervised testing, which may become less effective as new AI tools become available. 

Future state

Accurate identification of misuse: A standardised process helps teachers identify misuse across all common tools available to students. Automated flagging is combined with manual verification and investigation to reduce the chance of both human and computer error.

Time intensity: An efficient system reduces manual workload while maintaining accuracy. The process should be easy for staff to use and integrate with current assessment methods.

Evidencing misuse: Improve staff confidence when challenging students by helping staff to evidence their claims of misconduct. 

Trust: High confidence amongst students and staff in a fair and equitable misconduct management process. 

Education on AI use and consequences: Robust and standardised processes allowing for higher transparency amongst students and staff. With clear guidance and processes in place, students can be properly educated on appropriate use of AI. 

Preventative measures: Reduced pressure to prevent AI use by students, instead providing effective parameters to empower students and staff to utilise AI appropriately. 
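The hybrid model described above, where automated flagging feeds a manual verification step rather than triggering action on its own, could be sketched as follows. This is purely illustrative: the `ai_score` field, the 0.8 threshold, and all class and function names are assumptions for the sketch, not part of the challenge brief or any specific detection tool.

```python
from dataclasses import dataclass, field

@dataclass
class Submission:
    student_id: str
    text: str
    ai_score: float  # hypothetical output of an automated detector (0.0-1.0), not ground truth

@dataclass
class ReviewQueue:
    """Automated flagging feeds a manual-verification queue; no submission
    is treated as misconduct on the detector's score alone."""
    threshold: float = 0.8  # assumed cut-off for routing to a human marker
    pending: list = field(default_factory=list)

    def triage(self, sub: Submission) -> str:
        if sub.ai_score >= self.threshold:
            self.pending.append(sub)  # a human marker verifies before any action is taken
            return "manual review"
        return "no action"

queue = ReviewQueue(threshold=0.8)
print(queue.triage(Submission("s001", "…essay text…", 0.93)))  # manual review
print(queue.triage(Submission("s002", "…essay text…", 0.35)))  # no action
```

The key design point is that the automated score only ever routes work into a human-led process, which is what keeps false positives from becoming false accusations.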

Finance current state and future state

Current state

Quality of learning outcomes: AI misuse affects the quality of learning as students take shortcuts with tasks. AI misconduct also affects the accuracy of testing results.

Integration of AI: Education providers must find new ways to educate students and staff on appropriate use of AI, as well as find ways to utilise AI to improve the efficiency of work. This is necessary to remain competitive as other providers have already begun to implement similar tools. 

Reputation of education providers: The legitimacy of learning outcomes could be called into question as a result of AI plagiarism and misuse, which could impact funding.

Staff costs: The need for increased staff involvement during all stages including prevention, marking, investigation and discipline increases workload and has a direct impact on the resources available to improve quality of learning.  

Staff satisfaction, recruitment and retention: Increased workload and longer working hours decrease staff satisfaction and increase staff costs. Staff retention is at its worst, with a turnover rate of 25% per annum. Recruitment is made difficult as the desirability of roles is reduced.

Transparency and trust: AI misuse damaging academic integrity creates an environment of mistrust within the college, affecting the learning environment. 

Learner employment: Employers may lose confidence in the quality of College graduates as a result of AI misconduct. There is a desire to ensure that students are educated to a standard equitable with that of students in other areas of the country.

Future state

Quality of learning outcomes: Education providers can easily identify and take action when AI is misused, using a common process that ensures the quality of learning outcomes.

Integration of AI: AI is not only taught to students as a valuable skill but is also used to improve learning materials and ways of working. Using AI as a tool for those with learning difficulties helps to improve education accessibility.

Reputation of education providers: Improving academic integrity and creating higher quality academic outcomes will improve the confidence of examining bodies, parents and investors and ensure an excellent reputation for Middlesbrough College. This could contribute towards increased intake or funding.  

Staff costs: Decreasing workload by improving the efficiency of existing systems will allow for resources to be redirected away from admin and towards exceptional education provision. 

Staff satisfaction, recruitment and retention: Better work-life balance, resulting from more efficient systems and the implementation of AI tools, will improve staff retention and make roles more desirable to prospective recruits.

Transparency and trust: Improved transparency in processes assures students, parents and staff that the institution is competent in maintaining fairness and quality despite the challenge presented by AI. This could promote a sense of pride amongst the College’s community.

Learner employment: Ensuring academic integrity and high-quality learning outcomes produces high-quality graduates whom local employers are keen to recruit.

In scope

  • Integrate tools that detect AI plagiarism across all common AI language models
  • Be focused on management of plagiarism in text
  • Have a digital element
  • Involve manual and automated steps within a time saving misuse management system
  • Consider flexibility to evolve and grow as AI understanding develops over time
  • Consider staff training or include training tools
  • Have the potential to benefit the sector as a whole with options to commercialise

Out of scope

  • Be a fully automated system, given limitations in current automated tools
  • Focus on the identification of AI generated images, audio, video
  • Require additional staff time and management
  • Require a high level of technical skill or knowledge
  • Focus on anything outside the realm of education
  • Aim to change or impact current assessment methods
  • Rely on changes to awarding body or government policy
  • Suggest the removal of written submissions/assessments

Apply for this challenge

To access this support, you must be a small or medium-sized business based in Tees Valley (Darlington, Hartlepool, Middlesbrough, Redcar & Cleveland, or Stockton-On-Tees).


Tees Valley Innovation Challenges - Application form

Thank you for your interest in this challenge. Please complete the following form to register your application for this challenge.
