The use of AI in end-point assessment

We have seen a rise in the presence and use of artificial intelligence (AI) during end-point assessment (EPA). Whilst not all the examples we’ve witnessed involved the apprentice actively using AI, there have been clear signs of AI running in the background during assessments. This has included on-screen notifications indicating that applications such as Grammarly, ChatGPT, Google Bard and Fireflies Notetakers have either been running in the background or been invited to Teams meetings during assessments.

Although we cannot always tell what these applications are being used for at the time, they can be used for purposes such as:

  • generating new ideas, prompts, or suggestions for a given topic or theme
  • generating text with specific attributes, such as tone, sentiment, or formality
  • recording and/or transcribing meetings
  • analysing, improving, and summarising text.

For this reason, any such application running during an assessment may be treated as a violation of exam conditions. It could also compromise the integrity of NCFE assessments under our Regulations for the Conduct of EPA, the relevant stipulations of which are outlined below:

  • Recording or copying any elements of assessments, either digitally or in written format, by anyone other than NCFE representatives is strictly forbidden. This includes recording or copying any questions that may be asked during an assessment, either verbally or in written format.
  • Independent training partners must ensure any device used to complete the EPA is free from any material/additional facilities that would give the apprentice an unfair advantage, such as retrievable information.
  • Access to the internet or any other form of digital resource during live assessment is strictly prohibited, unless otherwise stated in individual standards.
  • All work produced by the apprentice must be authentic. By submitting evidence for assessment, the partner is confirming the authenticity of the apprentice’s work. Any information taken from the internet or other sources should be cited and referenced accordingly.

Please remind apprentices to ensure that all AI software or apps are turned off during assessments and are not invited to any meetings. Where we identify the use of AI, our independent end-point assessors (IEPAs) will discuss this with the apprentice to ensure it is removed before the assessment begins; if AI is introduced part way through an assessment, the IEPA may stop the assessment. Where IEPAs are unsure, and do not wish to disadvantage the apprentice or raise concerns that may affect their performance during the assessment, they may instead wait until after the assessment and refer their concerns to the Quality and Compliance team for further investigation.

Any use of AI during an assessment could lead to a malpractice case being raised, as stipulated in our NCFE Regulations for the Conduct of EPA.