Generative AI platform for customer service

Agent Copilot is a new conversational AI solution from Got It AI that brings guard-railed generative AI to customer service and sales operations.

Designed to empower agents with quick, accurate answers from complex knowledge sources such as insurance plans, financial products, and manufacturer catalogues, Agent Copilot uses TruthChecker AI, a feature designed to catch and avoid inaccuracies in responses generated by Large Language Models (LLMs).

Businesses can customize their Agent Copilot by choosing from a selection of LLMs including ChatGPT, GPT-4, LLaMA2, MosaicML, and FLAN-UL2. The choice of LLM can even be based on the business's specific knowledge base and the measured performance of the LLM against the documents in it.

Agent Copilot can be configured with a variety of data sources including PDFs, web pages, documents, and presentations, enabling support of full multi-turn dialogues against the compiled knowledge base. Finally, for organizations with data privacy concerns, enterprise-specific fine-tuned LLMs can be installed on-premises.
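The retrieve-then-answer flow described above can be sketched as a minimal retrieval-augmented pipeline. Everything here is an illustrative assumption — the keyword-overlap retriever, the prompt format, and all function names are stand-ins, not Got It AI's implementation:

```python
# Minimal sketch of answering questions against a compiled knowledge base.
# The scoring scheme and prompt layout are illustrative assumptions only.

def tokenize(text: str) -> set[str]:
    """Lowercase word set; good enough for a toy keyword-overlap retriever."""
    return set(text.lower().split())

def retrieve(question: str, knowledge_base: list[str], top_k: int = 2) -> list[str]:
    """Rank knowledge-base passages by word overlap with the question."""
    q = tokenize(question)
    ranked = sorted(knowledge_base, key=lambda p: len(q & tokenize(p)), reverse=True)
    return ranked[:top_k]

def build_prompt(question: str, passages: list[str], history: list[str]) -> str:
    """Assemble an LLM prompt; prior turns enable multi-turn dialogue."""
    context = "\n".join(passages)
    turns = "\n".join(history)
    return f"Context:\n{context}\n\nHistory:\n{turns}\n\nQuestion: {question}\nAnswer:"

# Toy knowledge base drawn from an insurance-plan scenario (invented data):
kb = [
    "The Silver plan covers dental cleanings twice per year.",
    "The Gold plan includes vision coverage and dental cleanings.",
    "Claims must be filed within 90 days of treatment.",
]
passages = retrieve("Does the Silver plan cover dental cleanings?", kb)
prompt = build_prompt("Does the Silver plan cover dental cleanings?", passages, [])
```

In a real deployment the prompt would be sent to the chosen LLM, and a fact-checking layer such as TruthChecker would vet the response before it reaches the agent.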

Below are hallucination rates measured on an identical set of questions over the same set of documents for one of Got It AI's customers:

  • OpenAI GPT-4 (>175B parameters): 8.39%
  • OpenAI ChatGPT-3.5 Turbo (175B parameters or less): 13.99%
  • Google FLAN-UL2 (20B parameters): 14.08%
  • MosaicML (30B parameters): 30.07%
  • Meta LLaMA-2 (13B parameters): 36.76%
  • Meta LLaMA-2 (70B parameters): 25.45%
  • Got It AI LLM (less than 1B parameters): 8.45%
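Selecting an LLM from measured per-knowledge-base performance, as described earlier, can be as simple as picking the candidate with the lowest measured hallucination rate. The figures below come from the list above; the selection rule itself is an assumption for illustration:

```python
# Pick the candidate LLM with the lowest hallucination rate measured
# against a customer's own knowledge base (rates from the list above, in %).
measured_rates = {
    "OpenAI GPT-4": 8.39,
    "OpenAI ChatGPT-3.5 Turbo": 13.99,
    "Google FLAN-UL2": 14.08,
    "MosaicML": 30.07,
    "Meta LLaMA-2 (13B)": 36.76,
    "Meta LLaMA-2 (70B)": 25.45,
    "Got It AI LLM": 8.45,
}
best_model = min(measured_rates, key=measured_rates.get)
```

On this customer's documents, the selection lands on GPT-4 at 8.39%, with Got It AI's sub-1B model close behind at 8.45%.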


"Providing accurate information to the agent is crucial in customer service and sales operations. The risks associated with incorrect information are simply too high," said Peter Relan, Chairman of Got It AI.

"With Agent Copilot, we offer businesses the opportunity to choose the most suitable LLM for their needs, including Got It AI's LLM, which matches GPT-4's in accuracy for the Agent Copilot task, ensuring the highest level of accuracy and efficiency in their customer interactions.

"Furthermore, with an additional guard rail called Got It AI's TruthChecker, hallucination rates of the base model can be decreased by 50% to 90% when responding with information Agent Copilot retrieves from a knowledge base. For our own LLM and OpenAI's LLMs, that level of fact checking leads to near human-level accuracy."
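The quoted 50% to 90% reduction can be made concrete with simple arithmetic: a reduction scales the base hallucination rate down proportionally. The example base rate is GPT-4's measured 8.39% from the list above:

```python
# Applying the quoted 50%-90% reduction range to a base hallucination rate.
def reduced_rate(base_rate_pct: float, reduction_pct: float) -> float:
    """Hallucination rate remaining after a TruthChecker-style guard rail."""
    return base_rate_pct * (1 - reduction_pct / 100)

low_end = reduced_rate(8.39, 50)   # 50% reduction: about 4.20% remaining
high_end = reduced_rate(8.39, 90)  # 90% reduction: about 0.84% remaining
```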

Agent Copilot is due to become available in September 2023 with the following features:

  • Knowledge Base connectors to custom data sources and documents (structured and semi-structured), including PDFs, slide decks, and web pages
  • Choice of LLMs: FLAN-UL2, LLaMA2, MosaicML, ChatGPT, GPT-4
  • TruthChecker AI fine-tuned for detecting hallucinations in multi-turn responses 
  • Product UI or Microservice API for customer specific UIs 
  • Fluid conversational dialogue with hallucination detection, mitigation & disambiguation 
  • Active learning, caching & fine tuning to achieve highest accuracy 
  • On-prem deployment with Open Source LLMs
  • Integration into other Agent, CX and CRM solutions