Unlocking faster decision-making with LLMs

How improving data quality, access, and management enabled a leading US university to harness AI for streamlined HR processes.

Teams at a leading US university were facing a constant struggle: simple HR-related queries took far too long to answer, and the answers were often based on inaccurate or incomplete data accessible only to colleagues with technical expertise.

Recognising the potential of AI technologies and Large Language Models (LLMs) to enhance decision-making and operational efficiency, the university saw an opportunity to explore the power of AI while also solving a real-world problem.

Partnering with Equal Experts, the university launched a three-month proof of concept project to transform how it accessed and used HR data with artificial intelligence. Through cloud infrastructure improvements, data engineering best practices and natural language LLM solutions, we supported the university in breaking down barriers to information and streamlining complex HR queries.

The project successfully demonstrated how the university could use AI and LLMs to modernise its HR data management – enabling all users to extract insights effortlessly and make faster, data-driven decisions while also providing a foundation for future LLM adoption across the university.

Outcomes

3 months

to develop proof of concept LLM

Instant insights

eliminating delays and bottlenecks

AI platform beginnings

a foundation for expanding AI use

About the client

Our client is a leading US university with a strong focus on research. A non-profit organisation, it aims to pursue groundbreaking research while supporting its students through higher education and college life.

Industry
Higher Education / Not-for-profit
Location
USA
Length of project
3 months

Challenge

Fragmented, inaccurate data slowing critical decision-making

Human resources decisions affect every staff member in an organisation. Teams need to be able to rely on HR data to answer vital questions quickly every day – how many employees are nearing retirement? Who needs a performance review in the coming months? How many vacation days does a team member have left for the year?

But when this information is locked across fragmented, siloed systems with inconsistent definitions and duplicated data, answering simple questions can quickly become a time-consuming and resource-intensive task.

For our client, this challenge presented an ideal opportunity. By using HR data, it could assess its capacity to integrate modern AI technologies within a practical context and solve a real organisational challenge at the same time.

As part of a proof of concept, the university aimed to ingest raw data into a cloud-based data warehouse (Snowflake), improve the data quality and design a robust data model. This would support the organisation to investigate and prototype the use of LLMs to provide natural language access to HR data, simplifying complex data querying and making vital HR data more accessible to users across the university.

Solution

An AI-powered proof of concept focused on data best practices and natural language processing

Working closely with the university’s HR, data and technology teams, we took an agile, iterative approach to solve the challenge within a tight three-month timeline.

We began by understanding the university’s requirements and data landscape, taking a deep dive into HR data systems, pain points and success criteria. From this, we tackled the underlying data issues systematically, integrating raw HR data into a cloud-based Snowflake data warehouse and building a robust data model for efficient querying. Data quality issues were also addressed by unifying column names, standardising formats, and removing duplicate records.
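The cleaning steps described above – unifying column names, standardising formats and removing duplicates – could be sketched as follows. This is a minimal illustration using pandas; the column names, date formats and values are invented for the example, not taken from the university's actual HR systems.

```python
import pandas as pd

# Illustrative raw HR extract with inconsistent column names,
# mixed date formats and a duplicated record (all hypothetical).
raw = pd.DataFrame({
    "Employee ID": ["E01", "E02", "E02", "E03"],
    "Hire Date": ["01/15/2019", "2020-03-02", "2020-03-02", "07/04/2021"],
    "DEPT": ["HR", "Finance", "Finance", "hr"],
})

def clean_hr_data(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    # Unify column names: trimmed, lower-case, snake_case.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    # Standardise formats: parse each date string and emit ISO dates.
    df["hire_date"] = df["hire_date"].map(
        lambda s: pd.to_datetime(s).date().isoformat()
    )
    # Standardise categorical values so 'hr' and 'HR' agree.
    df["dept"] = df["dept"].str.upper()
    # Remove duplicate records, keeping the first occurrence.
    return df.drop_duplicates(subset="employee_id").reset_index(drop=True)

clean = clean_hr_data(raw)
print(clean)
```

In the real project this work happened inside the Snowflake warehouse rather than in a local script, but the same principle applies: consistent names, formats and deduplication before any model sees the data.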

With access to quality data in place, we turned to the LLM component, evaluating and prototyping an AI-driven solution. We investigated several LLMs, including Anthropic's Claude and Mistral models, selecting Mistral as the baseline due to its performance. For complex query generation, we found strategic prompt engineering to be more effective and efficient than fine-tuning. We developed a prompt strategy that guides the LLM in generating accurate SQL queries from natural language inputs, avoiding costly fine-tuning, and integrated the LLM into a user-friendly interface, allowing users to query data themselves and reducing reliance on technical expertise.
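A prompt strategy like the one described typically grounds the model in the schema, shows worked examples and constrains the output to a single SQL statement. The sketch below assembles such a prompt; the table names, columns and example question are hypothetical stand-ins, not the university's actual data model, and the call to the model itself is omitted.

```python
# Hypothetical HR schema for illustration only.
SCHEMA = (
    "Table employees(employee_id, name, dept, hire_date, retirement_date)\n"
    "Table leave_balances(employee_id, year, vacation_days_remaining)"
)

# One worked example (few-shot) to anchor the expected output shape.
FEW_SHOT = [
    ("How many employees work in Finance?",
     "SELECT COUNT(*) FROM employees WHERE dept = 'FINANCE';"),
]

def build_sql_prompt(question: str) -> str:
    """Assemble a prompt that steers an LLM towards valid SQL:
    schema grounding, a worked example, and tight output instructions."""
    examples = "\n".join(f"Q: {q}\nSQL: {sql}" for q, sql in FEW_SHOT)
    return (
        "You translate questions about HR data into SQL.\n"
        "Use only the tables and columns listed below. "
        "Return a single SQL statement and nothing else.\n\n"
        f"{SCHEMA}\n\n"
        f"{examples}\n\n"
        f"Q: {question}\nSQL:"
    )

prompt = build_sql_prompt("Who is nearing retirement this year?")
print(prompt)
```

The completed prompt would be sent to the chosen model (Mistral, in this project) and the returned SQL executed against the warehouse, with the results surfaced in the user interface.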

By ensuring that stakeholders were thoroughly engaged throughout the process, including through demonstrations and feedback sessions, we were able to develop the solution iteratively, ensuring it would meet the real-world requirements of the end users. We also implemented infrastructure as code, using Terraform to provision cloud infrastructure for repeatability and scalability into the future. Comprehensive documentation, including detailed technical guides and architecture diagrams, will also enable future engineers to build on the proof of concept, supporting long-term value for our client.

Results

Enhancing HR efficiency, data quality and decision-making with a scalable AI solution

The three-month proof of concept successfully validated our client’s ability to integrate AI and LLMs as powerful tools for data management and decision-making.

It resulted in:

  • Enhanced data accessibility: Non-technical users can now query HR data in natural language quickly and efficiently, significantly reducing reliance on technical teams and freeing specialists to focus on initiatives that add greater business value.
  • Improved data quality: Standardised and cleaned data have improved the reliability of analyses and reports.
  • Efficient infrastructure: Automation and infrastructure as code have streamlined deployment processes, enabling scalability and consistency.
  • Empowered decision-making: Faster access to accurate HR data supports better decision-making across the university.

Beyond solving the immediate challenges of accessing and utilising HR data for decision-making, the project also laid the foundation for AI-driven innovation across the university. By embedding data ingestion, modelling and cleaning alongside the establishment of the required cloud infrastructure, the organisation can quickly and effectively extend the LLM solution to other data sets and departments, offering long-term value beyond the initial HR data use case.

