No. You cannot and you should not use AI to generate evidence for training activities that you submit in your e-portfolio.
The NSHCS work-based assessment standards make it very clear:
- that training activities are activities that you undertake, and
- that evidence submitted for training activities must show that you successfully completed the activities and must include your reflections on the experience of undertaking them.
The purpose of training activities is to learn the skills that you will need to qualify and work as a Clinical Scientist. Training activities are part of your work-based learning and most of them, especially in your specialist modules, are about practising and demonstrating practical and applied skills. AI cannot use the equipment for you, so it should not generate the evidence that you have undertaken the activities.
Training activities are concrete activities that you must carry out in your workplace. The evidence you submit needs to demonstrate that you have undertaken the activities. As we outline in our guide ‘Developing evidence to demonstrate achievement of training activities’, evidence should be authentic, valid, reliable and current. You will very likely generate this evidence in the course of completing a training activity, for example in the form of reports and images, so additional work may not be needed. Evidence that you have successfully completed a training activity is not a traditional academic essay. The reflection on training activities needs to be yours and needs to show what you have learned from the activities over time.
If you use AI to generate evidence that you have undertaken training activities, the evidence is not authentic, valid or reliable (even if AI can make it seem like it is). If you use AI to generate reflective writing about your training activities, the reflections are not yours and they are not evidence of learning (even if AI can make it seem like they are).
AI can be a great tool to stimulate learning, as we outline below, but it should not be used to generate evidence that you have undertaken training activities or to generate reflections about training activities.
The generation of evidence of activity, and the reflection upon that activity, stimulate learning. That is why you are required to work in this way as part of your work-based assessment. If you use AI to generate evidence of activity or reflections upon your activity, you are effectively avoiding learning, which may impact on your ability to work safely as a healthcare scientist. Using AI to generate evidence of activity or reflections upon activity is therefore unprofessional behaviour and does not align with the standards expected of an HCPC registrant.
What do the work-based assessment standards say about training activities, evidence and reflection?
Standard 1.1, ‘The purpose of training activities’, states that:
‘Training activities require trainees to undertake activities in the workplace and submit evidence that they have completed the activity to their e-portfolio. The evidence is then assessed to ensure the trainee has achieved the training activity successfully.’
Within standard 1.2, ‘Completing training activities’, there is a section entitled ‘Generating evidence for training activities’, which states that:
‘The purpose of evidence is to demonstrate to an assessor that a trainee has successfully completed a training activity.’
‘Evidence should be personal to the trainee and specific to their experience including dated, concrete examples of their own observations, experiences, reflections, or practice. Wherever possible evidence should include items generated by the trainee in the course of their practice, demonstrating that the trainee is learning to do the job in their workplace.’
This standard makes it very clear that the evidence must be yours. It must be personal, specific and concrete, not content generated by AI.
Standard 1.3, ‘Types of training activity’, sets out what your evidence should include for each of the three types of training activity.
Evidence submitted for observational training activities (OTAs) must include:
- Evidence that the observation or experience has been undertaken by the trainee.
- Reflections on the observation or experience, including contextualisation to the trainee’s own practice, at two points in time.
Evidence submitted for entrustable training activities (ETAs) must include:
- Evidence that the activity has been undertaken by the trainee repeatedly, consistently, and effectively over time, in a range of situations. This may include occasions where the trainee has not successfully achieved the outcome of the activity themselves.
- Reflection at multiple points in time on the trainee’s practice of this activity.
Evidence submitted for developmental training activities (DTAs) must include:
- Evidence that the activity has been undertaken by the trainee.
- Reflection on the activity at one or more points in time after the event including reflection on what has been learned from the activity and/or reflection on how it can be used to develop the trainee’s practice.
This guidance makes it crystal clear that evidence submitted for each of the different types of training activity must be authentic, personal and context-specific, and that it must articulate what you have learned. Working in this way stimulates your learning and helps to make you a safe healthcare scientist. The assessment of your evidence and of your reflections is an act of assurance by an expert healthcare scientist that you have learned and are on the way to becoming a safe and competent clinical scientist.
What are the consequences if I am found to have used AI to generate evidence and reflections for training activities?
The consequences are the same as if you had plagiarised and submitted work to your e-portfolio that is not yours. If AI has been used to generate your evidence and reflections upon a training activity, the evidence and reflections are not yours.
So, if it is discovered that you have used AI to generate evidence and reflections for a training activity – for example, where Turnitin indicates a high AI score that cannot easily be explained – your assessor can return it to you and not sign it off. This is a failed submission. Read more about the AI writing score in Turnitin.
The National School reserves the right to invoke its Training and Assessment Misconduct Policy where there are allegations or incidents of training or assessment misconduct by trainees involving AI.
Can I use AI to enhance my learning on the STP?
Yes. AI is a rapidly changing tool in our digital toolbox that you can use at the right time and in the right way to stimulate and support your learning.
As well as a tool, AI can also be thought of more dynamically as a guide or tutor. A 2025 article in Forbes magazine begins with the following sentence: ‘Artificial intelligence is moving beyond the role of a tool and beginning to act more like a colleague.’
This is a helpful way of thinking about how you might use AI to enhance your learning on the STP. A colleague can stimulate your thinking. A colleague can be a critical friend. A colleague can provide you with information. A colleague will not do your training activities for you and it follows that a colleague will not generate your evidence and your reflections for you, just as AI should not.
A wide range of AI tools and services can be used like colleagues or ‘tutors’ in ways that benefit learning. For example, Google’s NotebookLM, ChatGPT’s study mode and Anthropic’s Claude can each be used to generate outputs that can enhance your learning, such as summaries of articles or other sources that you point them to, flashcards about particular topics, structured quizzes, mind maps, visualisations and multimedia summaries of sources or topics.
AI tools can be prompted to stimulate critical thinking, to pose questions and to provide counter-arguments. For example, you might prompt an AI service to give you feedback on your reflective writing by asking it to apply a particular model of reflection to your writing. When AI is used to generate content in a range of formats and media, personalised to different learning needs, and to provoke thinking about tasks and content, it can support learners to learn effectively and autonomously.
We have a choice in our use of AI that is neatly summed up by David Gurteen in this article, Chatbots as Critical Thinking Partners:
‘These tools don’t just provide answers – they provoke questions, encouraging users to engage critically with the content AI generates. Our responsibility lies in how we choose to use the technology: as a shortcut to evade effort or as a springboard for inquiry and exploration.’
When you use AI to stimulate your thinking and reflection, AI can enhance your learning. When you use AI to evade effort – for example, to generate your evidence in the workplace – you are using AI to avoid learning. It is vitally important that the next generations of healthcare professionals reach qualification and registration without having avoided the learning that will keep patients safe.
What if I use AI within my training activity, as a tool used within my specialty?
If, within your training activity, you are using AI as a tool of your specialty, one that helps with the task you have to practise, then this is absolutely fine. The NSHCS work-based assessment standards make this clear within standard 1.2, ‘Completing training activities’:
‘Generative artificial intelligence can be used to complete a training activity where it is appropriate to the activity being undertaken and acceptable to the workplace, i.e. where it is used as a tool to do the job. Its application should be clearly acknowledged. Generative artificial intelligence should not be used solely to generate personal items of evidence such as reflections.’