BizAI: The Swiss Army Knife of Enterprise AI
Fisent Technologies has unveiled BizAI, which it describes as an Applied GenAI Process Automation solution. According to the company, BizAI lets enterprises select the right LLM for each process they want to automate and shift those processes to different models as needed, adjusting for variables such as content, structure, size, and data latency.
This flexibility reflects a growing trend in the enterprise sector, where businesses are no longer content with a one-size-fits-all approach to AI implementation.
According to a recent CB Insights survey, 97% of enterprises using LLMs work with multiple AI developers: 34% use two, 41% use three, and 22% use four or more. This diversification reflects the rapid advancement of AI technology and the emergence of new, well-funded challengers in the market.
Adrian Murray, Founder and CEO of Fisent, explained the rationale behind BizAI's model optionality: "Different AI models are designed and trained for specific tasks based on their architecture, capabilities, and the nature of the data with which they were trained. Where Gemini excels at audio transcription relative to other models, GPT has an advantage over others for text-based analysis, and the LLMs developed on internal data bring proprietary intelligence to the fold. BizAI allows all of these options to be harnessed to automate and optimize processes."
The platform's key features include support for initial model selection based on use case fit, seamless switching between models, flexibility in host selection and management, the ability to process various content types, and integration with existing business process management systems.
Fisent has already implemented BizAI in several real-world scenarios. For a health insurance company processing varied invoices from providers, BizAI utilizes GPT-4 to analyze complex care notes and expedite the review process. In contrast, for a high-volume credit dispute process, BizAI employs the more cost-efficient Llama 3.1 model to classify supporting evidence and reduce adjudication time.
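The deployments above boil down to routing each business process to the model that best fits its accuracy and cost profile. The sketch below illustrates that idea in Python; the ProcessTask fields, model identifiers, and route_model() logic are assumptions for illustration and do not represent Fisent's actual API.

```python
# Hypothetical per-process model routing, loosely modeled on the use cases
# described above. All names here are illustrative, not BizAI's real interface.
from dataclasses import dataclass


@dataclass
class ProcessTask:
    process_type: str    # e.g. "invoice_review" or "credit_dispute"
    content: str         # document text extracted upstream
    monthly_volume: int  # rough volume, used here as a cost signal


def route_model(task: ProcessTask) -> str:
    """Pick a model per process: accuracy-heavy work goes to a larger model,
    high-volume classification goes to a cheaper one."""
    if task.process_type == "invoice_review":
        # Complex care notes benefit from a stronger reasoning model.
        return "gpt-4"
    if task.process_type == "credit_dispute" and task.monthly_volume > 10_000:
        # High-volume evidence classification favors a cost-efficient model.
        return "llama-3.1"
    return "default-model"  # fallback when no routing rule matches


if __name__ == "__main__":
    task = ProcessTask("invoice_review", "Care notes: ...", monthly_volume=500)
    print(route_model(task))  # -> gpt-4
```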
As businesses continue to navigate the rapidly evolving AI landscape, solutions like BizAI that offer adaptability and optimization across multiple models may become increasingly crucial for maintaining a competitive edge and operational efficiency.
According to Fisent, BizAI is designed to:
• Support initial model selection based on use case fit
• Facilitate seamless switching between models if better options emerge or as initial choices evolve
• Offer flexibility in host selection and management for both foundation models (when available) and proprietary LLMs
• Enable models to process any content source or type
• Integrate seamlessly with any application or business process management (BPM) layer
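To make the list above concrete, here is a minimal configuration sketch showing how per-process model choice, fallback switching, host selection, supported content types, and a BPM integration hook might be declared. The keys, model names, hosts, and webhook URLs are assumptions chosen for illustration, not BizAI's real schema.

```python
# Illustrative per-process configuration; the schema is hypothetical.
PROCESS_CONFIG = {
    "invoice_review": {
        "model": "gpt-4",                    # initial selection by use case fit
        "fallback_model": "gemini-1.5-pro",  # switch target if the primary underperforms
        "host": "hosted-cloud",              # hosted foundation model
        "content_types": ["pdf", "scanned_image", "free_text"],
        "bpm_webhook": "https://bpm.example.com/hooks/invoice-review",
    },
    "credit_dispute": {
        "model": "llama-3.1",                # cost-efficient, high-volume classification
        "fallback_model": "gpt-4o-mini",
        "host": "on-prem-gpu-cluster",       # proprietary / self-hosted deployment
        "content_types": ["email", "free_text"],
        "bpm_webhook": "https://bpm.example.com/hooks/credit-dispute",
    },
}


def model_for(process_name: str) -> str:
    """Resolve the currently configured model for a given process."""
    return PROCESS_CONFIG[process_name]["model"]


if __name__ == "__main__":
    print(model_for("credit_dispute"))  # -> llama-3.1
```

Keeping the model choice in configuration rather than code is what makes switching cheap: changing the "model" or "host" value re-routes a process without touching the surrounding BPM integration.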