Oracle recently announced a partnership with Cohere to develop generative AI services aimed at helping companies automate their business processes. Oracle already offers its enterprise customers Oracle Cloud Infrastructure (OCI), while Cohere provides foundational large language models (LLMs) and expertise in natural language processing (NLP). Combining these capabilities in an intelligent Conversational Cloud offering will enable OCI customers to deploy strategic generative AI solutions for improved service delivery.
Potential Implementations in Healthcare
Imagine a hospital that wants to automate its medical coding process. Medical coding involves translating diagnoses, procedures, and treatments into standardized codes for billing and record-keeping purposes; a diagnosis of type 2 diabetes without complications, for example, maps to ICD-10 code E11.9. Traditionally, this process is time-consuming and prone to errors when done manually.
With Oracle’s generative AI services, the hospital could leverage the large language models developed by Cohere. These LLMs are trained on vast amounts of medical data and have a deep understanding of medical terminology and coding rules.
Here’s how the implementation process could work:
- Training the model: The hospital provides its own medical coding data to the generative AI service. This data includes historical medical records, coding guidelines, and billing information. The hospital’s data is combined with the pre-trained LLMs from Cohere to create a customized model specific to the hospital’s coding requirements.
- Model deployment: The hospital deploys the generative AI model on Oracle Cloud Infrastructure (OCI).
- Integration with applications: Oracle’s generative AI services are directly integrated into the hospital’s existing applications, such as electronic health record systems or billing software.
- Automated medical coding: When a new medical record needs to be coded, the hospital’s application sends the relevant information to the generative AI model running on OCI, which returns suggested codes for billing and record-keeping (a minimal sketch of this step follows the list).
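To make that last step concrete, here is a minimal sketch of what the inference call might look like, assuming the hospital’s application reaches a Cohere model through Cohere’s Python SDK. The model name, prompt wording, and endpoint handling are illustrative assumptions, not a documented Oracle or Cohere workflow:

```python
# Hypothetical sketch: asking a Cohere-hosted LLM to suggest ICD-10 codes
# for a clinical note. Model choice, prompt format, and endpoint are
# illustrative assumptions only.
import cohere

# In production this client would point at the OCI-hosted, fine-tuned model.
co = cohere.Client("YOUR_API_KEY")

clinical_note = (
    "Patient presents with polyuria and fatigue. A1C of 8.1%. "
    "Assessment: type 2 diabetes mellitus without complications."
)

prompt = (
    "You are a medical coding assistant. Suggest the most likely "
    "ICD-10-CM codes for the following note, one code per line with a "
    "short rationale.\n\n"
    f"Note: {clinical_note}\n\nCodes:"
)

response = co.generate(
    model="command",   # assumed base model; a fine-tuned model ID would go here
    prompt=prompt,
    max_tokens=200,
    temperature=0.2,   # low temperature favors consistent, rule-bound output
)

# Suggested codes would still be reviewed by coding staff before billing.
print(response.generations[0].text.strip())
```

In practice the hospital would substitute its fine-tuned model, route the call through its billing or EHR integration, and keep a human reviewer in the loop before codes are submitted.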
Generative AI on Oracle’s Supercluster for Security and Speed
OCI customers also have access to Oracle’s Supercluster capabilities, which are well suited to supporting enterprise-grade generative AI solutions. Supercluster provides heightened security, enhanced performance, and increased value by optimizing resource utilization and reducing costs.
The integration of Cohere’s models into Oracle’s cloud applications, including Oracle Fusion Cloud Applications and NetSuite, will enable customers to deploy generative AI quickly and securely.
We Can See a Pattern Here
Every Conversational Cloud provider must now have at least one “native” implementation of a foundational LLM to power natural language input and results based on generative AI. Oracle has selected Cohere. For Microsoft, it is OpenAI, with GPT-3.5 running inside Azure to power Bing search, Copilots for the Office 365 productivity suite, and the Digital Contact Center. Salesforce recently teamed up with Google to promote what it calls “hyper-efficient data sharing,” which ultimately lets Salesforce customers pursue a “Bring Your Own AI Model” approach while abiding by data residency and security mandates.
You’ll see similar approaches among the leaders in the Conversational Cloud, specifically in the worlds of customer experience, contact centers, and workforce engagement. In quick succession, Opus Research is attending global gatherings of customers, partners, and fellow analysts held by NICE, Verint, and Genesys, respectively. We’ll compare and contrast competing approaches toward widespread implementation of domain-specific, generative AI-infused use cases in our next post.