Putting the “AI” in rail: How will AI impact the rail industry and its employers?

The transformative power of artificial intelligence (“AI”) has been hitting the headlines recently. There have been plenty of claims about the impact AI will have across various sectors and industries, and the rail industry is no exception. Whilst AI is unlikely to replace frontline staff anytime soon, it is still likely to have a significant effect on the rail industry and its people. In this article we pose a number of key questions for employers in the rail industry.

Use of AI in recruitment

The rail industry has been making great strides in diversifying its workforce, although there remains some way to go. Progress ranges from attracting more women to historically male roles, with organisations such as Women in Rail doing great work in this area, to encouraging the younger generation and those from diverse backgrounds to join the rail workforce via organisations such as The 5% Club. The Equality, Diversity and Inclusion Charter for rail has over 200 signatories from across the rail industry, promoting positive change. A recent rail industry top table dinner hosted at our offices saw around 40% female-identifying attendees, with wider diversity from an ethnic and experience perspective as well. So progress is being made, but that must not lead to complacency.

However, we have seen in other industries that valiant efforts to diversify can be undermined when AI is thrown into the mix. If the AI tools used in recruitment are machine learning tools that learn from previous employee data, they can reproduce and reinforce existing bias and discrimination. For example, Amazon’s machine learning recruitment tool taught itself what a “good” CV looked like; because it was trained on data from a male-dominated workforce, it ended up favouring male CVs over female CVs. Employers in the rail industry using such tools should be alert to the bias that may be inherent in them.
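To make that mechanism concrete, the short Python sketch below is a purely hypothetical illustration on synthetic data (not a description of Amazon’s or any other vendor’s tool): it trains a simple classifier on “historical” hiring decisions that rewarded a gender-correlated proxy feature, and the resulting model then scores two otherwise identical candidates differently.

```python
# Hypothetical sketch only: shows how a model trained on skewed historical
# hiring data can reproduce that skew. Not any real recruitment system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Synthetic "historical" CV data: one genuine signal (experience) and one
# proxy feature correlated with gender (e.g. wording common on CVs from a
# male-dominated workforce).
experience = rng.normal(5.0, 2.0, n)
gender_proxy = rng.integers(0, 2, n)  # 1 = proxy present, 0 = absent

# Past hiring decisions rewarded the proxy as well as experience.
hired = (0.3 * experience + 2.0 * gender_proxy + rng.normal(0.0, 1.0, n) > 3.0).astype(int)

model = LogisticRegression().fit(np.column_stack([experience, gender_proxy]), hired)

# Two candidates identical in every respect except the proxy feature:
candidates = np.array([[5.0, 1.0], [5.0, 0.0]])
print(model.predict_proba(candidates)[:, 1])  # the proxy=1 candidate scores noticeably higher
```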

Use of generative AI in the rail industry

While the rail industry has been adopting and adapting to AI in various guises for years, from helping to predict maintenance requirements – both for rolling stock and infrastructure – to improving efficiencies, it is now “generative AI”, in the form of ChatGPT, Bard and Microsoft Bing, that is taking much of the limelight. These are examples of language processing tools trained on massive amounts of text data, capable of understanding and generating human-like prose. What might this mean for the rail industry, and what should employers consider?

Key considerations for employers in the rail industry on the use of generative AI:

  • Which areas of your workforce will be permitted to use generative AI, and which will be prohibited from doing so? Will the dividing line primarily fall between safety-related and non-safety-related roles, and then be broken down further by function?
  • Are the senior people “at the top” aware of, and trained on, the potential of generative AI, and able to consider how and when it could and should be used?
  • If you do want to use generative AI in certain areas, do you need to consider any collective bargaining agreements in the event that the decision to bring types of AI into the workplace is a bargaining issue?
  • Have you already introduced guidelines, or do you intend to do so? If so, will you also roll out training? There is little point in policies sitting on the shelf gathering dust; employees need to be actively trained on them and understand how they work in practice.
  • Have you considered setting out very clear guidance prohibiting employees from entering confidential information into generative AI platforms? There is a real risk of breaches of confidentiality to which staff should be alerted.
  • Have you considered banning the use of generative AI for certain roles or types of work product? Think about the marketing roles in the rail industry and rail's presence on social media. Creating posts or articles that an employee intends to pass off as their own, when there is a real risk that the same or similar content could be generated for another user, could cause reputational damage and embarrassment.
  • Will you establish a process for employees to report any concerns or issues related to the use of generative AI?
  • Do you need to consider adapting your performance processes or targets for those employees who use generative AI in their roles?

The sooner rail organisations grapple with these questions, the better, enabling them to take a proactive rather than reactive stance to issues that might occur. This article has focused on the employment issues, but there are myriad complexities to consider from a data protection, intellectual property, infringement and copyright perspective. For a previous article covering those areas, please click here.

If you have any questions on the use of generative AI in your workplace please contact Anne Pritam, Leanne Raven, Darren Fodey or your usual Stephenson Harwood contact.