The great AI agent acceleration: Why enterprise adoption is happening faster than anyone predicted

The conversation around artificial general intelligence (AGI) may dominate the headlines coming out of Silicon Valley companies like OpenAI, Meta and xAI. But for enterprise leaders on the ground, the focus is on practical applications and measurable results. At VentureBeat's Transform 2025 event in San Francisco, a clear picture emerged:

Companies like Intuit, Capital One, LinkedIn, Stanford University and Highmark Health are quietly putting AI agents into production, solving concrete problems and seeing tangible returns. Here are the four biggest takeaways from the event for technical decision-makers.

1. AI agents are moving into production faster than anyone expected

Enterprises are now deploying AI agents in customer-facing applications, and the trend is accelerating. A VentureBeat survey of roughly 2,000 industry professionals conducted just before VB Transform found that 68% of enterprise companies (those with 1,000+ employees) had already adopted agentic AI – a figure that looked high at first glance. (In fact, I worried it was too high to be reliable, so I presented the survey results as preliminary and noted that the high adoption rate may reflect VentureBeat's specialized readership.)

However, new data confirms this rapid shift. A KPMG survey released on June 26, one day after our event, found that 33% of organizations are now deploying AI agents – a surprising threefold increase from 11% in the previous two quarters. This independent data point validates the trend VentureBeat identified in its pre-Transform survey several weeks earlier.

This acceleration is fueled by tangible results. Ashan Willy, CEO of New Relic, noted a remarkable 30% quarter-over-quarter growth in customers monitoring AI applications, driven largely by the adoption of AI agents. Companies are deploying agents to automate the workflows their customers need help with. Intuit, for example, has embedded invoice generation and reminder agents in its QuickBooks software. The result? Businesses using the feature get paid five days faster and are 10% more likely to be paid in full.

Even developers are feeling the shift. Scott White, product lead for Anthropic's Claude.ai, described how, despite not being a professional programmer, he now builds software himself. "That wasn't possible six months ago," he said, highlighting the power of tools like Claude Code. Similarly, OpenAI's head of product for the API platform, Olivier Godement, detailed how customers like Stripe and Box are using the company's Agents SDK to build multi-agent systems.
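For readers wondering what a "multi-agent system built on the Agents SDK" looks like in practice, here is a minimal sketch. It assumes the publicly documented Python interface of OpenAI's open-source Agents SDK (the Agent and Runner classes and the handoffs parameter); it is an illustration of the pattern, not code shown at the event, and the agent names and prompts are hypothetical.

# pip install openai-agents   (assumes OPENAI_API_KEY is set in the environment)
from agents import Agent, Runner

# Two specialist agents, each with narrow instructions
billing_agent = Agent(
    name="Billing agent",
    instructions="Answer questions about invoices and payments.",
)
support_agent = Agent(
    name="Support agent",
    instructions="Help with product troubleshooting.",
)

# A triage agent that hands each request off to the right specialist
triage_agent = Agent(
    name="Triage agent",
    instructions="Route the request to the billing or support agent.",
    handoffs=[billing_agent, support_agent],
)

result = Runner.run_sync(triage_agent, "Why was my last invoice higher than usual?")
print(result.final_output)

The pattern is the point: instead of one monolithic prompt, each agent owns a narrow task, and a routing agent decides which one handles a given request.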

2. The hyperscaler race has no clear winner as multi-cloud, multi-model strategies prevail

The days of betting on a single large language model (LLM) provider are over. A consistent theme throughout Transform 2025 was the move to multi-model and multi-cloud strategies. Whether it's a powerful proprietary model or a fine-tuned open-source alternative, enterprises want to pick the best tool for the job.

As Armand Ruiz, VP of AI platform at IBM, explained, the company's development of a model gateway – which routes applications to the most efficient and performant LLM for a specific use case – is a direct response to customer demand. IBM started out offering its own models to enterprise customers, then added open-source support and ultimately realized it needed to support all models. That same desire for flexibility was echoed by Zoom CTO Xuedong Huang, who described the company's three-tiered model approach: supporting proprietary models, offering its own fine-tuned model and letting customers build their own fine-tuned versions.

This trend is creating a powerful but constrained ecosystem, in which the GPU compute needed to generate tokens is in limited supply. As Dylan Patel of SemiAnalysis and fellow panelists Jonathan Ross of Groq and Sean Lie of Cerebras pointed out, this puts pressure on the profitability of the many companies serving up tokens, even as token prices continue to fall. Enterprises are getting smarter about matching different models to different tasks to optimize for both cost and performance, and they are not relying on Nvidia chips alone but looking at more customized hardware – a shift reflected in the emergence of specialized memory and storage solutions for AI from companies like Solidigm.

3. Enterprises are focused on solving real problems, not chasing AGI

While tech leaders like Elon Musk, Mark Zuckerberg and Sam Altman talk about the dawn of superintelligence, enterprise practitioners are rolling up their sleeves and solving immediate business problems. The conversations at Transform were grounded in reality.

Take Highmark Health, the nation's third-largest integrated health insurance and provider company. Its chief data officer, Richard Clarke, said it is using LLMs for practical applications such as multilingual communication, to better serve its diverse customer base, and streamlining medical claims. In other words, it is using the technology to deliver better services today. Similarly, Capital One is building teams of agents that mirror the functions of the company, handling tasks like risk evaluation and auditing, including agents that help its car dealership clients connect customers with the right loans.

The travel industry is also taking a pragmatic turn. The CTOs of Expedia and Kayak discussed how they are actively adapting to the new search paradigms enabled by LLMs. Users can now ask ChatGPT for a hotel with an "infinity pool," and travel platforms must support that kind of natural-language discovery. The focus is on serving the customer, not on the technology for its own sake.

4. The future of AI teams is small, agile and augmented

The age of AI agents is also changing how teams are built. The consensus is that small, agile "squads" of three to four engineers are most effective. Varun Mohan, CEO of Windsurf, the fast-growing agentic IDE, argued at the event that this small-team structure allows for rapid testing of product hypotheses and avoids the bureaucracy that slows down larger groups.

This shift means "everyone is a builder" and, increasingly, "everyone is a manager" of AI agents. As speakers from Atlassian and other companies noted, engineers are now learning to manage fleets of agents. The required skills are evolving as well, with clear communication and strategic thinking needed to guide these autonomous systems.

This is supported by a growing acceptance of sandboxed development. Andrew Ng, a leading voice in AI, advised attendees to leave security, governance and observability until the end of the development cycle. While that may seem counterintuitive for large enterprises, the idea is to enable rapid iteration within a managed environment in order to prove value quickly. That thinking was reflected in our survey, which found that 10% of organizations adopting AI had no dedicated AI security team – suggesting a willingness to prioritize speed at this early stage.

Taken together, these takeaways paint a clear picture of an AI landscape moving past broad experimentation and into large-scale deployment. The conversations at Transform 2025 focused on how companies are deploying AI agents today and the hard lessons they are learning along the way. Many have already gone through one or two big pivots since they began experimenting with generative AI a year or two ago – a good argument for starting early.

For a deeper analysis of these themes and more from the event, you can listen to my latest podcast with independent AI developer Sam Witteveen. We have also posted the main-stage talks from VB Transform here, and full article coverage from the event is here.

Listen to VB Transform Takeaways Podcast with Matt Marshall and Sam Witteveen:

https://www.youtube.com/watch?v=padg1lqgvo8

Editor's note: At our readers' request, we have opened early-bird registration for VB Transform 2026 – just $200. This is where the enterprise AI conversation happens, and you will want to be in the room. Secure your spot now.

