Is AI good or bad? The case for and against AI in L&D

Written by Kayleigh Tanner | 10 December 2024

With AI set to continue as the hot topic in L&D as we head towards 2025, you’ve probably heard a lot of arguments both for and against the sector’s most talked-about tech. For all the people who are 100% Team AI for its efficiency, there are just as many with major concerns about ethics and data security... and even more who, quite frankly, have no clue what to think.

Whichever camp you fall into, we’re going to be exploring both sides of the argument to decide once and for all: is AI good or bad? And specifically: is AI good or bad for L&D?

 

Why is AI bad?

 

In their 2024 report, AI in L&D: Intention and Reality, Donald Taylor and Egle Vinauskaite asked L&D professionals about their main concerns around the use of AI in L&D.

The top 5 barriers to using AI, in order, are:

  1. Data privacy and security concerns
  2. Lack of trust in AI outputs
  3. Lack of skills in L&D team
  4. Integration issues with other systems
  5. Regulatory or compliance barriers

Barriers to using AI in L&D (Source: AI in L&D: Intention and Reality)

 

Let’s take a look at each of these concerns in a little more detail.

 

"Does AI keep my data secure?"

Problem: As with any software solution, there will always be concerns about privacy and data security. HR and L&D teams handle sensitive employee data, so any unauthorised access or data breach from an AI tool could spell disaster – especially if you’re operating under data protection regulations such as the GDPR.

Solution: The most important thing any L&D team with data security concerns can do is to choose trusted AI vendors with robust data encryption and transparent privacy policies – bonus points if your chosen tools explicitly comply with your region’s data regulations.

 

"Can I trust AI’s outputs?"

Problem: It’s no secret that not all AI tools are made equal. Even the best-known tools, such as ChatGPT, can produce ‘hallucinations’ (in other words, false or inaccurate outputs). It goes without saying that L&D teams should be focused on delivering accurate, high-quality information, so any inaccuracies in your AI output could damage your entire content offering.

Solution: Your L&D team should ‘peer review’ any AI output to test and validate the content it generates, and SMEs can help ensure its quality. Learning to write better prompts for your generative AI tools will also help you avoid the common ‘garbage in, garbage out’ problem: asking the right questions with the right level of detail significantly improves your AI content. Finally, encouraging users to rate your AI outputs will help you spot issues and improve your solution over time.
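Rating-driven review can be surprisingly lightweight. Here’s a minimal sketch – assuming a 1–5 star rating scale and invented output IDs – of how collected learner ratings could flag AI outputs for SME review:

```python
from statistics import mean

def flag_low_rated_outputs(ratings_by_output, threshold=3.5, min_ratings=3):
    """Return output IDs whose average learner rating falls below the threshold.

    ratings_by_output maps an output ID to a list of 1-5 star ratings.
    Outputs with fewer than min_ratings ratings are skipped, so a single
    grumpy review doesn't send content straight to review.
    """
    flagged = []
    for output_id, ratings in ratings_by_output.items():
        if len(ratings) >= min_ratings and mean(ratings) < threshold:
            flagged.append(output_id)
    return flagged

ratings = {
    "course-intro": [5, 4, 5],    # well received
    "quiz-gdpr": [2, 3, 2, 1],    # consistently low: send to SME review
    "video-onboarding": [4],      # too few ratings to judge yet
}
print(flag_low_rated_outputs(ratings))  # → ['quiz-gdpr']
```

The threshold and minimum-rating count are judgement calls your team would tune over time.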

 

"What if my L&D team doesn’t have any AI skills?"

Problem: In small businesses, L&D teams are almost always overstretched, leaving very little time to learn new skills (no, the irony isn’t lost on us!). Developing AI skills isn’t easy, especially if your team aren’t already proficient coders, which can make introducing AI to your learning tech stack feel especially daunting.

Solution: The good news is that you probably don’t need to build your own AI tools! Plenty of L&D vendors are already developing them, meaning all you have to do is choose the best ones for your business – no coding required. You can also adopt AI gradually instead of jumping in at the deep end, and many vendors will happily support your team’s upskilling so you get the most out of your AI tools.

 

"What if AI doesn’t integrate with my existing systems?"

Problem: Many L&D teams worry that any AI tools, which are likely very new to the market, won’t integrate with their existing systems, such as the LMS, HRIS or external content library. If this is the case, it can create more manual work for the L&D team, making processes more inefficient and frustrating.

Solution: Even the newest AI tools often come with API-based integrations, allowing you to connect them to your existing systems without needing entirely new learning ecosystems. If they don’t, automation tools like Zapier and Make can help bridge those gaps for you. Alternatively, consider working backwards – find out which AI tools your current vendor is integrated with, and start there if you’re keen to dip your toe in the water with the support of a trusted tech supplier.
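Much of that integration ‘glue’ boils down to field mapping: translating the record one tool emits into the shape another expects. The sketch below shows the idea in Python – every field name here is hypothetical, so check your own vendors’ API documentation for the real shapes:

```python
# Hypothetical example: mapping a record from an AI content tool onto
# the fields a (made-up) LMS course-import endpoint accepts.

def to_lms_payload(ai_course):
    """Translate an AI content tool's course record into an LMS-shaped payload."""
    return {
        "title": ai_course["name"],
        "description": ai_course["summary"],
        "duration_minutes": ai_course["length_mins"],
        "tags": ai_course.get("topics", []),
        "source": "ai-content-tool",  # lets you audit AI-generated courses later
    }

ai_course = {
    "name": "GDPR essentials",
    "summary": "A 10-minute refresher on data protection basics.",
    "length_mins": 10,
    "topics": ["compliance", "data-privacy"],
}
payload = to_lms_payload(ai_course)
print(payload["title"])  # → GDPR essentials
```

This is exactly the kind of mapping tools like Zapier and Make let you configure without writing code at all.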

 

"Will AI adhere to my company’s compliance restrictions?"

Problem: Compliance training is always important, but for industries like healthcare, finance or government, it’s crucial that every single employee understands the ins and outs of your sector’s regulations. The concern here is that AI tools may inadvertently produce non-compliant outcomes, or that the tools themselves may not be approved in the context of your industry’s strict legal regulations.

Solution: Again, humans + AI = compliant training. Some AI tools are already being built to comply with regulations such as GDPR and HIPAA, and many come with inbuilt audit logs and reporting for easier compliance visibility. Your human experts should also establish clear review processes to validate AI-generated output, and your legal team should evaluate your AI tools against regulatory standards to ensure they can be configured for compliance over time.

 

Why is AI good?

 

The benefits of AI for L&D

At 5app, we’re strong believers in the benefits of AI for L&D. We could wax lyrical for days about why AI is good for enhancing learning programmes, but let’s take a look at some of the very best AI benefits:

Personalised learning experiences
AI gives L&D teams the ability to easily build personalised learning experiences with tailored content, learning paths and recommendations. Instead of waiting for the L&D manager to manually review everyone’s learning activity and performance, AI does it all for you by automatically analysing a learner’s progress to recommend highly relevant content to help them achieve their goals.
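As an illustration, the simplest possible recommendation rule might match a learner’s weakest assessed topic against tagged content. The topic names and catalogue below are made up for the example, and real AI recommenders are far more sophisticated – but the principle is the same:

```python
def recommend(learner_scores, catalogue, limit=2):
    """Recommend content targeting a learner's weakest topic.

    learner_scores maps topic -> latest assessment score (0-100);
    catalogue maps content title -> list of topic tags.
    """
    weakest = min(learner_scores, key=learner_scores.get)
    matches = [title for title, tags in catalogue.items() if weakest in tags]
    return matches[:limit]

scores = {"compliance": 85, "data-privacy": 55, "leadership": 72}
catalogue = {
    "GDPR essentials": ["compliance", "data-privacy"],
    "Handling a data breach": ["data-privacy"],
    "Coaching 101": ["leadership"],
}
print(recommend(scores, catalogue))  # → ['GDPR essentials', 'Handling a data breach']
```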

Scalability for large, diverse teams
For small L&D teams, scaling learning is daunting. AI makes it significantly easier to deliver high-quality learning to teams of all shapes and sizes, and can even be used to translate and localise content for diverse global audiences. Instead of creating brand-new learning programmes every time your team grows, AI can handle it for you, freeing you up to focus on creating great learning experiences.

Time and cost efficiency for small L&D teams
There just aren’t enough hours in the day for a small L&D team to do everything they want to do, which is where AI can really help. AI can do the heavy lifting on tedious, repetitive admin tasks (like filling in forms, transferring users and building learning paths), saving your team precious time and money that can be spent on the more people-focused tasks L&D professionals enjoy.

Real-time feedback and assessment
Managers and L&D teams just don’t have time to provide feedback on learning activities and assessments in the moment… but AI does! The moment a learner completes an assessment, AI can immediately analyse their results and provide personalised feedback and guidance. And it’s not just for multiple-choice questions – today’s AI is sophisticated enough to analyse free-text responses, image-based assessments, videos and more, as well as pointing learners towards useful content to further improve their understanding of the topic.
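To make the idea concrete, here’s a minimal sketch of instant, personalised feedback for a multiple-choice assessment (the question IDs and guidance pointers are invented for the example; free-text and media grading would need a far richer model):

```python
def instant_feedback(answers, answer_key, guidance):
    """Score a completed quiz and return per-question feedback immediately.

    answers and answer_key map question IDs to chosen/correct options;
    guidance maps question IDs to follow-up content for wrong answers.
    """
    feedback = {}
    correct = 0
    for qid, chosen in answers.items():
        if chosen == answer_key[qid]:
            correct += 1
            feedback[qid] = "Correct!"
        else:
            feedback[qid] = f"Not quite - revisit: {guidance[qid]}"
    score = round(100 * correct / len(answers))
    return score, feedback

score, fb = instant_feedback(
    answers={"q1": "b", "q2": "a"},
    answer_key={"q1": "b", "q2": "c"},
    guidance={"q1": "Module 1", "q2": "Module 2: Reporting a breach"},
)
print(score)  # → 50
```

The learner gets their score and a pointer to relevant content the moment they submit – no waiting for a manager to mark anything.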

Learning programmes that improve over time
Launching your original learning programme takes enough work – let alone gathering learner feedback, analysing your data and updating your learning afterwards. AI makes light work of improving your learning programmes because it does so automatically, based on real-time information. For instance, if one course is especially well received, it can be recommended to more learners. If everyone scored 100% in last year’s compliance quiz, AI can generate new questions to ensure everyone is properly tested next time.

 

Why does L&D need to know about AI?

Even if you have zero intention of using AI for L&D anytime soon, that doesn’t mean you can bury your head in the sand and pretend it doesn’t exist. Whether you think AI is good or bad, it’s very much here to stay, and for most businesses, it’s a case of when, not if, you start to use it.

That doesn’t mean you need to spend weeks or months upskilling your team to become AI experts, but it does mean that you need to at least know how L&D teams are using AI to improve learning, whether that’s generating content faster, analysing learner data and trends or building personalised learning.

AI is already disrupting academic learning, with many schools and universities struggling to come up with fair rules around the use of generative AI tools like ChatGPT. There are plenty of videos showing students using AI to complete quizzes and exams, and it’s not difficult to see how this could impact corporate L&D too.

At the very least, L&D professionals should understand:

  • What AI is being used for
  • What it could potentially be used for
  • Who is already using AI in the business
  • Why it could be useful
  • Why it could present challenges
  • How to overcome those challenges

The AI space is moving fast, so make it a priority to stay up to date with the latest AI news, developments and tools to make sure you don’t fall behind.

 

The ethics of AI for L&D

Many of the concerns around AI are related to ethics. For instance, a factory worker using AI to complete their health and safety training? Bad. A salesperson using AI to generate personalised email copy at speed? Fine! But there are plenty of grey areas, all of which bring nuance that needs to be carefully considered. 

In fact, more and more businesses are looking at creating AI policies to ensure that AI is used ‘properly’ for work. The lines will differ from company to company, and it’s not something that L&D can decide alone. It requires a conversation across the entire leadership team to determine when using AI is and isn’t appropriate, and this needs to be an ongoing conversation, as new AI solutions surface over time.

AI challenges and potential solutions:

Challenge: Learners using AI tools to complete assessments and quizzes.
Solution: Introduce clear policies to forbid the use of AI in workplace assessments, and consider adding checks (such as flagging unusual answering patterns or response times).

Challenge: Using AI to generate learning content, such as elearning courses or microlearning video scripts.
Solution: Ensure that human SMEs review all AI-generated content for accuracy, and edit content where necessary to fit the brand voice.

Challenge: Bias in learning recommendations, such as suggesting more leadership courses for male employees.
Solution: Regularly audit AI’s decision-making processes and diversify the data used to train AI tools.

Challenge: AI-generated content may not always be accessible or inclusive.
Solution: Choose AI tools that adhere to accessibility standards, such as WCAG, and ensure generated content supports text-to-speech, screen readers and language translation.

Challenge: Learners may have concerns about what AI tools will do with their data and learning records.
Solution: Clearly communicate to learners how AI will engage with their data, and allow learners to opt out of AI-powered learning activities.
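One of the checks mentioned above – flagging unusual response times – can be sketched very simply: compare each learner’s completion time to the cohort and surface strong outliers for human review. The names and thresholds below are illustrative, and a flag should prompt a conversation, never an automatic accusation:

```python
from statistics import mean, stdev

def flag_fast_completions(times_by_learner, z_cutoff=-2.0):
    """Flag learners who finished far faster than their cohort.

    A strongly negative z-score means an unusually quick completion -
    worth a human look, not proof of AI use.
    """
    times = list(times_by_learner.values())
    if len(times) < 3:
        return []  # not enough data to define 'unusual'
    mu, sigma = mean(times), stdev(times)
    if sigma == 0:
        return []  # everyone took the same time
    return [
        learner for learner, t in times_by_learner.items()
        if (t - mu) / sigma < z_cutoff
    ]

times = {
    "alice": 600, "bob": 620, "cara": 580, "dev": 610,
    "ed": 590, "fay": 605, "dana": 95,  # dana finished in 95 seconds
}
print(flag_fast_completions(times))  # → ['dana']
```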

 

The verdict: is AI good or bad?

No matter what you think about AI, you’re almost certainly going to make a call in the next year or so about whether or not to use it – and if you do, how you’ll use it and how you’ll ensure it improves your L&D offering.

Despite the challenges, we wholeheartedly believe in AI’s vast potential to transform L&D for the better. Organisations of all sizes can benefit from automated, more efficient processes, the rapid generation of personalised learning content and superior data analysis – helping us all do more with less and create learning that works harder with no extra human effort.

AI will never fully replace human L&D teams, but the combination of human intelligence and artificial intelligence allows us all to benefit from the best of both worlds.