AWS doubles down on infrastructure as strategy in the AI race with SageMaker upgrades




AWS is looking to extend its market position with updates to SageMaker, its machine learning and AI model training and inference platform, adding new observability capabilities, connected coding environments and GPU cluster performance management.

However, AWS continues to face competition from Google and Microsoft, whose AI offerings also include many features designed to accelerate model training and inference.

SageMaker, which became a unified hub for connecting data sources and accessing machine learning tools in 2024, will add features that give AWS customers more control over the compute allocated to model development and more insight into model performance.

Other new features include connecting local integrated development environments (IDEs) to SageMaker, so locally written AI projects can be deployed on the platform.

Ankur Mehrotra, general manager of SageMaker, told VentureBeat that many of these new updates originated with customers themselves.

“When developing generative AI models, it’s really hard to find out what went wrong in that layer of the stack when something fails or isn’t working as expected,” Mehrotra said.

SageMaker HyperPod observability lets engineers examine the different layers of the stack, such as the compute layer or the network layer. If anything goes wrong or a model slows down, SageMaker can alert them and publish the relevant metrics on a dashboard.

Mehrotra pointed to a real issue his own team faced while training new models, in which training code began stressing the GPUs and caused temperature fluctuations. Without the latest tools, he said, developers would have needed weeks to identify the source of the issue and fix it.
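For teams that want to pull the same signals programmatically rather than only watch a dashboard, the pattern looks roughly like the sketch below. It assumes the cluster's metrics land in CloudWatch; the namespace, metric name and dimension used here are hypothetical placeholders, not the names documented for HyperPod observability.

```python
import datetime
import boto3

# Minimal sketch: polling a GPU temperature metric for a HyperPod cluster.
# The namespace, metric name and dimension below are hypothetical placeholders;
# check the HyperPod observability documentation for the names your cluster
# actually publishes.
cloudwatch = boto3.client("cloudwatch")

now = datetime.datetime.utcnow()
response = cloudwatch.get_metric_statistics(
    Namespace="AWS/SageMaker/HyperPod",   # hypothetical namespace
    MetricName="GPUTemperature",          # hypothetical metric name
    Dimensions=[{"Name": "ClusterName", "Value": "my-training-cluster"}],
    StartTime=now - datetime.timedelta(hours=1),
    EndTime=now,
    Period=300,
    Statistics=["Average", "Maximum"],
)

# Print the last hour of datapoints in time order to spot a heat spike.
for point in sorted(response["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Average"], point["Maximum"])
```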

Connected IDEs

SageMaker already offered AI developers two ways to build and run models. It provided fully managed IDEs, such as Jupyter Lab and Code Editor, to run training code seamlessly through SageMaker. It also let engineers use their local IDEs, with all the extensions they had installed, and run their code on their own machines.

However, Mehrotra said, code written in a local IDE only runs locally, which posed a significant problem for developers who wanted to scale their work.

AWS introduced a new secure remote execution capability that lets customers keep working in the IDE of their choice, whether local or managed, and connect it to SageMaker.

“So this capability now gives them the best of both worlds: if they want, they can develop locally in a local IDE, but for the actual task execution they can benefit from the scalability of SageMaker,” Mehrotra said.
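One existing way to see the "develop locally, scale on SageMaker" idea in code is the SageMaker Python SDK's @remote decorator. The sketch below illustrates that general pattern, not the new remote-IDE connection feature itself; the instance type and requirements path are placeholder choices.

```python
# Sketch of the "write code locally, execute it as a SageMaker job" pattern
# using the SageMaker Python SDK's @remote decorator. Instance type and the
# requirements file path are placeholders.
from sagemaker.remote_function import remote


@remote(instance_type="ml.m5.xlarge", dependencies="./requirements.txt")
def train(epochs: int = 3) -> float:
    # Logic written and tested locally; when called, it runs as a SageMaker
    # job on the instance type requested above and returns the result.
    loss = 0.0
    for epoch in range(epochs):
        loss = 1.0 / (epoch + 1)  # stand-in for a real training loop
    return loss


if __name__ == "__main__":
    final_loss = train(epochs=5)  # executes remotely, result comes back locally
    print(f"final loss: {final_loss}")
```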

More flexible compute

AWS launched SageMaker HyperPod in December 2023 as a tool to help customers manage clusters of servers for training models. Similar to providers such as CoreWeave, HyperPod lets SageMaker customers direct unused compute capacity where it is needed. HyperPod allows GPU usage to be scheduled based on demand patterns, helping organizations balance resources and costs effectively.

However, AWS said many customers wanted the same service for inference. Many inference tasks happen during the day, when people are actually using models and applications, while training is usually scheduled for off-peak hours.
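As a rough illustration of how a team might carve one HyperPod cluster into capacity for both kinds of work, the sketch below creates a cluster with separate training and inference instance groups through the boto3 SageMaker client. All names, instance types, S3 URIs and role ARNs are placeholders, and the sketch does not capture the scheduling or task-prioritization features described above.

```python
# Minimal sketch: one HyperPod cluster with instance groups an organization
# could devote to training and inference workloads. Names, instance types,
# the S3 URI and the role ARN are placeholders; field names may vary by
# SDK version.
import boto3

sagemaker = boto3.client("sagemaker")

lifecycle = {
    "SourceS3Uri": "s3://my-bucket/lifecycle/",
    "OnCreate": "on_create.sh",
}

response = sagemaker.create_cluster(
    ClusterName="shared-genai-cluster",
    InstanceGroups=[
        {
            "InstanceGroupName": "training",
            "InstanceType": "ml.p5.48xlarge",
            "InstanceCount": 4,
            "ExecutionRole": "arn:aws:iam::111122223333:role/HyperPodRole",
            "LifeCycleConfig": lifecycle,
        },
        {
            "InstanceGroupName": "inference",
            "InstanceType": "ml.g5.12xlarge",
            "InstanceCount": 2,
            "ExecutionRole": "arn:aws:iam::111122223333:role/HyperPodRole",
            "LifeCycleConfig": lifecycle,
        },
    ],
)
print(response["ClusterArn"])
```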

https://www.youtube.com/watch?v=as1eu_kkgci

Even on the inference side, Mehrotra said, developers can prioritize which inference tasks HyperPod should focus its compute on.

Laurent Sifre, co-founder and CTO at AI agent company H, said in an AWS blog post that the company used SageMaker HyperPod when building out its agentic platform.

“This seamless transition from training to inference streamlined our workflow, reduced time to production and delivered consistent performance in live environments,” he said.

AWS and the competition

Amazon may not offer the splashiest foundation models of its cloud provider competitors, Google and Microsoft. Instead, AWS has focused on providing the infrastructure backbone enterprises need to build AI models, applications or agents.

In addition to SageMaker, AWS offers Bedrock, a platform designed specifically for building applications and agents.

SageMaker has been around for years, initially serving as a way to connect disparate machine learning tools to data lakes. As the generative AI boom began, AI engineers started using SageMaker to help train language models. But Microsoft is pushing hard for its Fabric ecosystem, which 70% of the Fortune 500 has adopted, to become a leader in the data and AI acceleration space. Google has quietly made inroads in enterprise AI adoption with Vertex AI.

Of course, AWS has the advantage of being the most widely used cloud provider. Any update that makes its many AI infrastructure platforms easier to use will always be a benefit.



