What to consider when choosing an LLM

Copilot is here, and our customers are already hitting the ground running with staffing AI. As firms embark upon their generative AI journeys and incorporate Copilot into their workflows, one of their most frequently asked questions is, “Which LLM (Large Language Model) should I choose to integrate with Copilot?”

There’s no one right answer; which LLM is best for you and your team depends on your business goals and what you want to achieve with Copilot. Regardless of your decision, you’ll still be able to leverage generative AI to craft stronger messages, send more pitches, and generate profile summaries, screening questions, and more — all without leaving Bullhorn. 

Let’s break down what an LLM is, which LLMs integrate with Copilot, what the Bullhorn experts have to say about LLMs, and how you can decide which one is right for your recruitment business.

What is an LLM?

An LLM is a deep learning model that has been pre-trained on vast amounts of data and is used to power generative AI.

LLMs use tokens to understand and respond to prompts. A token can be thought of as a piece of a word, though tokens also include spaces and punctuation marks; each one is roughly four characters of typical English text. When you ask AI a question, it looks at each token – that is, each word, space, and piece of punctuation – to understand and respond to your question. You can learn more about tokens – and how to count them – here. Different LLMs also have different token limits, which cap the combined length of the prompt you input and the response the LLM generates; a lower token limit means you’ll need to input shorter prompts.
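
For a concrete sense of how this works, here’s a minimal sketch using tiktoken, OpenAI’s open-source tokenizer library; the model name and prompt are just examples.

```python
# Minimal sketch of counting tokens with OpenAI's open-source tiktoken
# library (pip install tiktoken); the model name and prompt are examples only.
import tiktoken

prompt = "Write a short outreach message for a senior Java developer role in Boston."

# Look up the tokenizer that matches the model you plan to call.
encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")

tokens = encoding.encode(prompt)
print(f"{len(tokens)} tokens")  # roughly len(prompt) / 4 for typical English text
print(tokens[:5])               # each token is an integer ID for a word piece
```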

Why do I need to choose an LLM to integrate with Copilot?

“Copilot is the vehicle through which users access what the LLM can do,” said Nicole Krensky, Director of Product Marketing, Automation & AI at Bullhorn. Once you choose your LLM, you will connect that service with Copilot using an API. Copilot then pushes the information in your database, like candidate resumes, skills, education, and work history, along with the prompt you build or select, to the LLM to generate your desired output.
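
To make this concrete, here’s an illustrative sketch of what a chat-completion request to an LLM looks like using the OpenAI Python SDK. Copilot handles this exchange for you behind the scenes, so the candidate fields and prompt below are hypothetical stand-ins, not Bullhorn’s actual payload.

```python
# Illustrative sketch only: Copilot makes this kind of call for you.
# The candidate fields and prompt are hypothetical examples.
from openai import OpenAI

client = OpenAI(api_key="YOUR_API_KEY")  # the key you supply during enablement

candidate = {
    "name": "Jane Doe",
    "skills": "Java, Spring, AWS",
    "work_history": "8 years of back-end development",
}

prompt = (
    "Write a brief, friendly pitch email for this candidate:\n"
    f"Skills: {candidate['skills']}\n"
    f"Experience: {candidate['work_history']}"
)

response = client.chat.completions.create(
    model="gpt-4",  # the model you chose to integrate
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```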

“Think of it as powering the engine in the background,” added Ben Carter, Senior Director, Automation & AI at Bullhorn.

Which LLMs integrate with Copilot?

Bullhorn currently supports Azure OpenAI and OpenAI LLMs. Customers have seen success with the following models: 

  • OpenAI GPT-3.5-Turbo
  • OpenAI GPT-4
  • Azure OpenAI GPT-3.5-Turbo
  • Azure OpenAI GPT-4 

OpenAI is a leading AI research lab known for developing advanced AI technologies aimed at broad applications across industries. Azure OpenAI is a collaboration between Microsoft and OpenAI that integrates OpenAI’s models with Azure’s cloud computing platform, offering scalable AI solutions to businesses. In short, OpenAI provides its series of models directly through its own API, while Azure OpenAI delivers those same models as a Microsoft-hosted service within Azure.

Each LLM has its own instructions for getting started, so we recommend reviewing the OpenAI Quickstart Guide, Microsoft’s guide to creating and deploying an Azure OpenAI Service resource, and other help content from those providers. As part of the enablement process, you’ll need to provide details from your LLM such as the model, API key, and token limit.
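
As a rough illustration, these are the kinds of details an Azure OpenAI resource exposes. The sketch below uses the AzureOpenAI client from the OpenAI Python SDK, and every value shown is a placeholder for your own resource’s settings.

```python
# Sketch of the connection details you'd gather from an Azure OpenAI resource;
# all values below are placeholders for your own resource's settings.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE-NAME.openai.azure.com/",
    api_key="YOUR_AZURE_OPENAI_KEY",
    api_version="2024-02-01",            # the API version your resource supports
)

response = client.chat.completions.create(
    model="YOUR-GPT-4-DEPLOYMENT-NAME",  # Azure uses your deployment name, not "gpt-4"
    messages=[{"role": "user", "content": "Summarize this candidate profile..."}],
    max_tokens=500,                      # stay within the model's token limit
)
print(response.choices[0].message.content)
```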

What are the differences between these LLMs?

In addition to the differences between OpenAI and Azure OpenAI, GPT-3.5-Turbo and GPT-4 vary in a few ways.

Both GPT-3.5-Turbo and GPT-4 models are optimized for chat and work well for traditional completion tasks. Both can understand and generate natural language, but GPT-4 can solve more difficult problems with greater accuracy. On the other hand, GPT-3.5-Turbo models are significantly faster than GPT-4 models and are offered at a lower price point. However, the increased price of GPT-4 models unlocks a connection to the internet that allows you to use responses from online sources. And while the token limits of GPT-3.5-Turbo models are lower than those of GPT-4 models, both OpenAI and Azure OpenAI offer GPT-3.5-Turbo options with higher token limits if you’d like to input longer prompts.
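
If you want to sanity-check whether a prompt fits a given model, a simple token-budget check like the sketch below can help. The context-window sizes listed are the commonly documented ones for these model families and may differ for your specific deployment.

```python
# Rough token-budget check before sending a prompt; the context-window sizes
# below are the commonly documented ones for these model families and may
# differ for your specific deployment.
import tiktoken

CONTEXT_WINDOWS = {
    "gpt-3.5-turbo": 4_096,
    "gpt-3.5-turbo-16k": 16_384,   # higher-limit GPT-3.5-Turbo option
    "gpt-4": 8_192,
    "gpt-4-32k": 32_768,
}

def fits(prompt: str, model: str, response_budget: int = 500) -> bool:
    """Return True if the prompt plus the expected response fits the model's window."""
    encoding = tiktoken.encoding_for_model(model)
    return len(encoding.encode(prompt)) + response_budget <= CONTEXT_WINDOWS[model]

print(fits("Draft screening questions for a data engineer role...", "gpt-3.5-turbo"))
```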

Which LLM is right for my organization?

With Copilot, selecting the right LLM depends on factors like:

  • The number of users you have
  • The number of requests those users might make per day
  • How much data you have
  • What response speed you’re looking for

Consult with your IT team, research solutions favored by industry experts, or read comparative analyses of your options to determine the most suitable large language model for your needs. When choosing a third-party vendor, follow best practices and ensure that all of your security concerns have been addressed before you start using these technologies. Both OpenAI and Azure OpenAI have robust data security protections; you can review OpenAI’s data usage policies for Consumer Services and Microsoft’s data, privacy, and security documentation for Azure OpenAI Service.

“[Think about] which one will be most suitable for the tasks you’re undertaking with Copilot,” said Carter. “No one passes a driving test and buys a Ferrari. The whole point of Copilot Starter is that it enables you to dip your toe in and see if your data is even in the right place to start using generative AI.”


To learn more about Copilot, visit our Copilot FAQ.
