
Vibe-coding startup Windsurf launches in-house AI models


On Thursday, Windsurf, a startup that develops popular AI tools for software engineers, announced the launch of its first family of AI software engineering models, SWE-1 for short. The new models, SWE-1, SWE-1-lite, and SWE-1-mini, are trained to handle the "entire software engineering process," not just coding.

The launch of Windsurf's in-house AI models may come as a shock, given reports that OpenAI has agreed to a $3 billion deal to acquire Windsurf. The launch suggests, however, that Windsurf is trying to expand beyond developing applications into building the models that power them.

According to Windsurf, SWE-1, the largest and most capable model of the bunch, performs competitively with Claude 3.5 Sonnet, GPT-4.1, and Gemini 2.5 Pro on the company's internal programming benchmarks. However, SWE-1 falls short of frontier models such as Claude 3.7 Sonnet on software engineering tasks.

Windsurf says SWE-1-lite and SWE-1-mini will be available to all users on its platform, free or paid. SWE-1, meanwhile, will be available only to paid users. Windsurf did not immediately announce pricing for the SWE-1 models, but claims they are cheaper to serve than Claude 3.5 Sonnet.

Windsurf is best known for tools that let software engineers write and edit code through conversations with an AI chatbot, an experience known as "vibe coding." Other popular vibe-coding startups include Cursor, the largest in the space, as well as Lovable. Most of these startups, including Windsurf, have traditionally relied on AI models from OpenAI, Anthropic, and Google to power their applications.

In a video announcing the SWE models, Windsurf's head of research, Nicholas Moy, underscored the company's newest efforts to differentiate its approach. "Today's frontier models are optimized for coding and have made massive strides over the last few years," he said. "But they're not enough for us ... Coding is not software engineering."

Windsurf notes that while other models are good at writing code, they struggle to work across the multiple surfaces that programmers routinely move between, such as terminals, IDEs, and the internet. The startup says SWE-1 was trained using a new data model and a "training recipe" that covers incomplete states, long-running tasks, and multiple surfaces.

The startup describes SWE-1 as an "initial proof of concept," suggesting that more AI models may follow.


