Agentic Coding Arrives in Apple's Xcode with Anthropic and OpenAI Agents

TechCrunch · 25 days ago

AI-generated summary: Apple's Xcode 26.3 now supports agentic coding, allowing developers to integrate AI agents such as Anthropic's Claude Agent and OpenAI's Codex directly into the development environment for more complex automation and access to up-to-date documentation.



Agentic coding comes to Apple’s Xcode with agents from Anthropic and OpenAI

Apple is bringing agentic coding to Xcode. On Tuesday, the company announced the release of Xcode 26.3, which will allow developers to use agentic tools, including Anthropic’s Claude Agent and OpenAI’s Codex, directly in Apple’s official app development suite.

The Xcode 26.3 Release Candidate is available to all Apple developers today from the developer website and will hit the App Store a bit later.

This latest update comes on the heels of the Xcode 26 release last year, which first introduced support for ChatGPT and Claude within Apple’s integrated development environment (IDE) used by those building apps for iPhone, iPad, Mac, Apple Watch, and Apple’s other hardware platforms.

The agentic coding integration lets the AI models tap into more of Xcode's features, so they can carry out their tasks and handle more complex automation.

The models will also have access to Apple’s current developer documentation to ensure they use the latest APIs and follow the best practices as they build.

At launch, the agents can help developers explore a project and understand its structure and metadata, then build the project, run its tests, and fix any errors they find.


To prepare for this launch, Apple said it worked closely with both Anthropic and OpenAI to design the new experience. Specifically, the company said it did a lot of work to optimize token usage and tool calling, so the agents would run efficiently in Xcode.

Xcode leverages MCP (Model Context Protocol) to expose its capabilities to the agents and connect them with its tools. That means Xcode can now work with any outside MCP-compatible agent for tasks like project discovery, code changes, file management, previews and snippets, and access to the latest documentation.
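The MCP layer can be made concrete with a small sketch. MCP messages are framed as JSON-RPC 2.0, and an agent discovers what a server offers by sending the standard `tools/list` request, `{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}`. The request shape is part of the MCP specification, but the tool names in the illustrative response below are hypothetical stand-ins; Apple has not published the actual list of tools Xcode exposes.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {"name": "build_project", "description": "Build the current scheme and report diagnostics"},
      {"name": "run_tests", "description": "Run the project's test suite"},
      {"name": "search_documentation", "description": "Look up current Apple developer documentation"}
    ]
  }
}
```

Because discovery is standardized this way, any MCP-compatible agent can enumerate and call Xcode's tools without Xcode-specific integration code.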

Developers who want to try the agentic coding feature should first download the agents they want to use from Xcode’s settings. They can also connect their accounts with the AI providers by signing in or adding their API key. A drop-down menu within the app allows developers to choose which version of the model they want to use (e.g. GPT-5.2 vs. GPT-5.1-mini).

In a prompt box on the left side of the screen, developers can tell the agent what sort of project they want to build or what changes they want to make to the code, using natural language commands. For instance, they could direct Xcode to add a feature that uses one of Apple's provided frameworks, and describe how it should appear and function.

As the agent starts working, it breaks down tasks into smaller steps, so it’s easy to see what’s happening and how the code is changing. It will also look for the documentation it needs before it begins coding. The changes are highlighted visually within the code, and the project transcript on the side of the screen allows developers to learn what’s happening under the hood.

This transparency could particularly help new developers who are learning to code, Apple believes. To that end, the company is hosting a “code-along” workshop on Thursday on its developer site, where users can watch and learn how to use agentic coding tools as they code along in real-time with their own copy of Xcode.

At the end of its process, the AI agent verifies that the code it created works as expected. Armed with the results of its tests on this front, the agent can iterate further on the project if need be to fix errors or other problems. (Apple noted that asking the agent to think through its plans before writing code can sometimes help to improve the process, as it forces the agent to do some pre-planning.)

Plus, if developers are not happy with the results, they can easily revert their code back to its original at any point in time, as Xcode creates milestones every time the agent makes a change.
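The verify-and-iterate loop with milestones described above can be sketched in a few lines. This is purely an illustration of the workflow the article describes, not Apple's implementation; every function and name here is a hypothetical stand-in, with the build/test step and the agent's patching both stubbed out.

```python
import copy

def agent_fix(project, failures):
    # Stand-in for the AI agent proposing a patch for the failing tests.
    return {**project, "patched": project.get("patched", 0) + 1}

def run_tests(project):
    # Stand-in for Xcode building the project and running its test suite;
    # here the tests "pass" once the agent has patched twice.
    return [] if project.get("patched", 0) >= 2 else ["testExample failed"]

def agent_loop(project, max_iterations=5):
    milestones = []  # a snapshot is saved before every change, enabling revert
    for _ in range(max_iterations):
        failures = run_tests(project)
        if not failures:
            return project, milestones  # code verified as working
        milestones.append(copy.deepcopy(project))  # milestone before the edit
        project = agent_fix(project, failures)
    return project, milestones

final, history = agent_loop({"name": "MyApp"})
```

The milestone list is what makes the easy revert possible: restoring `history[0]` returns the project to its state before the agent touched anything.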


© 2025 TechCrunch Media LLC.