How Highmark Health and Google Cloud are using Gen AI to streamline medical claims and improve care: 6 key lessons




Among the many educational and deeply engaging panel discussions on enterprise AI integration featuring industry leaders at VentureBeat's Transform 2025 conference this week was one between Google Cloud Chief Technology Officer (CTO) Will Grannis and Richard Clarke, Highmark Health's senior vice president and chief data and analytics officer.

Their session, "The new AI stack in healthcare: Multi-model, architected for multi-modal environments," offered a pragmatic look at how the two organizations are deploying AI across the US healthcare system (Highmark is based in western Pennsylvania).

The collaboration has reached thousands of employees and turned them into active users of generative AI, all without compromising on complexity, regulation, or clinical trust.

So how did Google Cloud and Highmark go about it? Read on to find out.

A partnership built on engineered foundations

An integrated payer-provider serving more than 6 million members, Highmark is using Google Cloud's AI models and infrastructure to modernize legacy systems, increase internal efficiency, and improve patient outcomes.

What sets this initiative apart is its framing of AI not as just another technology layer, but as a fundamental change in how the business operates.

Richard Clarke, Highmark's chief data and analytics officer, stressed the importance of building infrastructure early. Highmark still operates legacy systems, including a claims platform coded in COBOL, Clarke noted, but the company has integrated those systems with cloud-based AI models. The result: up to 90% system interoperability, enabling smooth transitions and real-time insight into complex administrative processes.

Google Cloud CTO Will Grannis echoed that this kind of groundwork takes time. "It can take three, four, five years," he said.

From proof of concept to daily use

More than 14,000 of Highmark's 40,000+ employees regularly use the company's internal generative AI tools, built on Google Cloud's Vertex AI and Gemini models.

These tools are applied across a range of use cases, from personalized member communications to document retrieval for claims processing.

Clarke highlighted an example involving credentialing and contract review. Previously, an employee would have to search a large number of systems to verify a provider's training.

Now, generative AI produces the requirements, pulls the relevant data, and cross-checks it, completing the task with citations and contextual recommendations.

What is driving this high adoption rate? A combination of structured prompt libraries, active training, and user feedback loops. "We don't just throw tools out there and hope people use them," Clarke said. "We show how much it improves their work, and then we scale based on what's working."

Architecting beyond the chat interface

One of the most forward-looking themes from the session was the move toward multi-agent systems capable of completing tasks end to end, beyond conversation. Grannis described this as a shift from quick-response conversational models toward task synthesis and automation.

Grannis urged attendees to think less in terms of a chat interface. These agents coordinate multiple models, potentially across different functions, from conducting research to executing workflows.

Highmark is currently piloting single-purpose agents for specific workflows, with the long-term goal of moving them toward greater autonomy within its support systems. This would reduce the need for multiple interfaces and connectors and allow for better-informed, centralized management.
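The idea of chaining single-purpose agents into one end-to-end workflow can be sketched as below. This is a minimal illustration only: the agent names, the credentialing scenario, and every return value are assumptions for the sake of example, not Highmark's actual pilots.

```python
# Illustrative sketch: single-purpose agents coordinated by an orchestrator.
# All names and data here are hypothetical, not Highmark's implementation.

def credential_lookup_agent(provider: str) -> dict:
    """Stand-in for an agent that gathers credentialing data from source systems."""
    return {"provider": provider, "credentials": ["board-certified"]}

def summary_agent(record: dict) -> str:
    """Stand-in for an agent that summarizes findings for a reviewer."""
    return f"{record['provider']}: {', '.join(record['credentials'])}"

def orchestrate(provider: str) -> str:
    """Chain the single-purpose agents into one end-to-end workflow."""
    return summary_agent(credential_lookup_agent(provider))

print(orchestrate("Dr. Lee"))  # Dr. Lee: board-certified
```

The point of the pattern is the one described in the session: each agent does a single job, and the orchestrator, not the user, moves work between them.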

Task-first, not model-first

Both speakers stressed a key mental shift for enterprises: stop starting with the model. Instead, start with the task and select or orchestrate models accordingly.

For example, Highmark uses Gemini 2.5 Pro for long, research-intensive queries and Gemini Flash for fast, real-time interactions. In some cases, even classic deterministic models are used when they are better suited to the task, such as translating patient communications into many languages. As Grannis put it: "Your workflows are your IP. Think about accomplishing a task, and think about orchestrating models to do it."
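A minimal sketch of what task-first routing might look like in practice. The model names come from the session; the task categories, routing table, and function names are illustrative assumptions, not Highmark's implementation.

```python
# Hypothetical task-first model router. Gemini 2.5 Pro and Gemini Flash are the
# models named in the session; the routing rules themselves are assumptions.

TASK_ROUTES = {
    "research": "gemini-2.5-pro",       # long, research-intensive queries
    "realtime": "gemini-2.5-flash",     # fast, low-latency interactions
    "translation": "deterministic-nmt", # placeholder for a classic deterministic model
}

def route_task(task_type: str) -> str:
    """Pick the model suited to the task; default to the fast model."""
    return TASK_ROUTES.get(task_type, "gemini-2.5-flash")

print(route_task("research"))  # gemini-2.5-pro
```

The design choice mirrors the speakers' advice: the workflow definitions (the routing table) stay in the enterprise's hands as its IP, while individual models remain swappable.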

To support this flexibility, Google Cloud is investing in model routing capabilities and open standards. Its recent Agent2Agent protocol initiative, contributed to the Linux Foundation, is designed to promote interoperability and stability across multi-agent environments.

Practical recommendations for enterprise leaders across sectors

For those looking to replicate Highmark's success, the panelists offered concrete guidance:

  1. Lay the foundation early: Invest now in data preparation and system integration. AI deployment can take years, and the payoff depends on early groundwork.
  2. Avoid building your own foundation models: Unless your business is building models, it is cost-prohibitive. Focus on orchestration and fine-tuning for specific use cases.
  3. Adopt a platform mentality: Centralize model access and usage tracking. Create a structure that supports experimentation without sacrificing governance.
  4. Start with tasks, not tools: Determine the outcome first, then match it with the model or agent architecture that fits it best.
  5. Measure and share: Internal adoption grows when employees see practical results. Track usage, capture success stories, and keep approved guidelines and prompt libraries up to date.
  6. Design for action, not just insight: The future of the enterprise is agentic, not static. Build agents that can take real-world actions in your systems safely, securely, and reliably.

Looking forward

While the partnership between Highmark and Google Cloud is still evolving, it already offers a model for others looking to build secure, scalable, widely used AI systems in healthcare and beyond.

As Clarke put it: "This isn't about flashy features; it's about what helps people do their jobs better."

Enterprise leaders who missed the session can take comfort in this takeaway: generative AI success is not reserved for those with the biggest budgets, but for those with flexible platforms and strategic structures.



