DataCentreNews India - Specialist news for cloud & data centre decision-makers

Cloudflare expands Agent Cloud for AI software agents

Tue, 14th Apr 2026

Cloudflare has expanded its Agent Cloud with new tools for developers building and running AI agents, adding compute, storage, sandboxing and model access for long-running workloads.

The update targets software agents that can write code, use tools and complete multi-step tasks, rather than simply answer prompts in chatbot form. It includes Dynamic Workers, Artifacts, the general availability of Sandboxes, a new framework called Think, and wider model access, including OpenAI's GPT-5.4 and Codex.

Cloudflare is presenting the release as part of a shift in software development, as more developers test systems that can autonomously generate code, call APIs and manage longer workflows. It argues that conventional infrastructure based on persistent virtual machines or containers is too expensive when large numbers of agents run at once.

Compute shift

At the centre of the update is Dynamic Workers, a runtime for AI-generated code designed to start in milliseconds, execute short tasks such as data transformation or API calls, and then shut down instead of remaining active like a traditional container.

The approach is intended to lower the cost of running large volumes of agent tasks while isolating workloads in a sandboxed environment, a key issue for developers deploying model-generated code.
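The economics behind that approach can be sketched with some simple arithmetic. The figures and function names below are hypothetical, not Cloudflare pricing; the point is only the structural difference between billing for wall-clock uptime and billing for milliseconds of actual execution:

```python
# Illustrative sketch (not Cloudflare's API or pricing): comparing an
# always-on container's billing model with the ephemeral, per-invocation
# model that Dynamic Workers describes. All rates are invented.

def always_on_cost(hours: float, rate_per_hour: float) -> float:
    """A persistent container bills for wall-clock time, idle or not."""
    return hours * rate_per_hour

def ephemeral_cost(invocations: int, ms_per_task: float,
                   rate_per_cpu_second: float) -> float:
    """An ephemeral runtime bills only for the milliseconds each task runs."""
    return invocations * (ms_per_task / 1000.0) * rate_per_cpu_second

# 10,000 short agent tasks of 50 ms each over a day amount to ~500
# CPU-seconds of billed time, versus 24 hours for a container that sat
# mostly idle between tasks.
burst = ephemeral_cost(10_000, 50, 0.00002)
idle_heavy = always_on_cost(24, 0.05)
```

With these made-up rates the ephemeral path is two orders of magnitude cheaper, which is the shape of the argument Cloudflare is making about running agents at scale.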

Cloudflare also made Sandboxes generally available for cases where an agent needs a full Linux environment. These sandboxes provide a shell, file system and background processes, allowing agents to clone repositories, install packages and run build tasks in a persistent environment.
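The defining property of such a sandbox is that state written by one command is visible to the next, unlike a one-shot execution. The sketch below simulates that contract with a temporary directory and a POSIX shell; the class and method names are invented for illustration and are not the Sandboxes API:

```python
# Conceptual sketch only (invented names, not Cloudflare's SDK): a working
# directory that persists across shell commands, the way an agent's sandbox
# outlives any single command it runs. Assumes a POSIX /bin/sh.
import shutil
import subprocess
import tempfile

class SandboxSession:
    """A persistent working directory shared by every exec() call."""

    def __init__(self):
        self.root = tempfile.mkdtemp(prefix="sandbox-")

    def exec(self, cmd: str) -> str:
        # Each command runs in the same root, so files accumulate over time.
        out = subprocess.run(cmd, shell=True, cwd=self.root,
                             capture_output=True, text=True, check=True)
        return out.stdout

    def close(self):
        shutil.rmtree(self.root)

sb = SandboxSession()
sb.exec("echo hello > note.txt")        # state written in one step...
result = sb.exec("cat note.txt").strip()  # ...is readable in the next
sb.close()
```

A real sandbox adds isolation, background processes and resource limits on top of this basic contract, but the persistence of the file system between commands is what lets an agent clone a repository in one step and build it in another.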

The distinction reflects a broader pattern in AI tooling: some tasks can be handled with brief bursts of code execution, while others require a more durable operating environment resembling a developer workstation.

Storage layer

Another part of the rollout is Artifacts, a Git-compatible storage layer for code and file repositories created or maintained by agents. It is designed to support very large numbers of repositories while remaining accessible through standard Git clients.

The move suggests Cloudflare sees source control and software artefact management as an emerging bottleneck in agent-led development. If autonomous tools begin generating and revising code at scale, repository infrastructure could face heavier and more frequent activity than systems built mainly for human developers.
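"Git-compatible" implies speaking Git's object model, in which every file is stored under a hash of its content. The snippet below reproduces the blob ID scheme a standard Git client uses, with only the standard library; it is background on the Git format, not a claim about Artifacts' internals:

```python
# Background sketch: how a standard Git client derives a file's object ID.
# Git hashes a "blob <length>\0" header plus the raw content with SHA-1,
# which is why any Git-compatible store must agree on these IDs.
import hashlib

def git_blob_oid(content: bytes) -> str:
    """Compute the object ID `git hash-object` would assign to this content."""
    header = f"blob {len(content)}\0".encode()
    return hashlib.sha1(header + content).hexdigest()

empty_oid = git_blob_oid(b"")          # Git's well-known empty-blob ID
hello_oid = git_blob_oid(b"hello\n")
```

Because IDs are pure functions of content, identical files across millions of agent-generated repositories can deduplicate to a single stored object, which is one reason a Git-compatible layer can plausibly scale to that volume.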

Alongside the infrastructure changes, Cloudflare introduced Think, a framework within its Agents SDK for longer-running, multi-step tasks. It is intended to support agents that persist across sessions and continue work over time rather than respond to a single prompt and terminate.
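The pattern Think targets can be sketched as a task that checkpoints after each step so a later session resumes where the last one stopped. The step names and state shape below are invented for illustration and are not the Agents SDK API:

```python
# Conceptual sketch (invented names, not the Think API): a multi-step task
# that records completed steps so a restarted agent resumes mid-task
# instead of starting over from a fresh prompt.
import json

STEPS = ["plan", "generate_code", "run_tests", "open_pr"]

def run_steps(state: dict) -> dict:
    """Advance through the remaining steps, checkpointing each completion."""
    for step in STEPS:
        if step in state["done"]:
            continue                 # finished in an earlier session; skip
        state["done"].append(step)
        state["checkpoint"] = json.dumps(state["done"])  # durable state
        if step == "run_tests":
            return state             # simulate the session ending mid-task
    return state

state = run_steps({"done": []})  # first session stops after run_tests
state = run_steps(state)         # a later session resumes and finishes
```

The essential move is that progress lives in persisted state rather than in the lifetime of a single process, which is what distinguishes a long-running agent from a respond-and-terminate chatbot.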

Model access

The expanded model catalogue includes OpenAI's GPT-5.4 and Codex, following Cloudflare's acquisition of Replicate. Developers will be able to choose between proprietary and open-source models through the same platform instead of managing multiple model providers separately.

That reflects a broader push among infrastructure vendors to abstract the model layer, making it easier for customers to switch providers as pricing, reliability and performance change. For developers, model portability has become a practical concern as new systems arrive in rapid succession.
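In code, that abstraction usually means one call site targeting a common interface, so swapping providers is a configuration change rather than a rewrite. The sketch below shows the shape of the pattern; the class and model names are placeholders, not any vendor's SDK:

```python
# Illustrative only (invented names): the model-portability pattern in which
# application code depends on an abstract client, and the concrete provider
# is chosen in one place.
from typing import Protocol

class ModelClient(Protocol):
    def complete(self, prompt: str) -> str: ...

class ProprietaryModel:
    def complete(self, prompt: str) -> str:
        return f"[proprietary] {prompt}"   # stand-in for a hosted API call

class OpenSourceModel:
    def complete(self, prompt: str) -> str:
        return f"[oss] {prompt}"           # stand-in for a self-hosted model

def build_client(provider: str) -> ModelClient:
    """The only place a provider switch touches."""
    registry = {"proprietary": ProprietaryModel, "open-source": OpenSourceModel}
    return registry[provider]()

reply = build_client("open-source").complete("summarise this diff")
```

When pricing or performance shifts, only the string passed to `build_client` changes, which is the practical portability the article describes.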

Matthew Prince, co-founder and Chief Executive Officer, Cloudflare, framed the announcement as part of a wider change in software development. "The way people build software is fundamentally changing. We are entering a world where agents are the ones writing and executing code," he said. "But agents need a home that is secure by default, scales to millions instantly, and persists across long-running tasks. We've spent nine years building the foundation for this with Cloudflare Workers. Today, we are making Cloudflare the definitive platform for the agentic web."

OpenAI also linked the launch to the growing use of coding models in business settings. "Cloud agents are quickly becoming a foundational building block for how work gets done, and with Cloudflare, we're making it dramatically easier for developers to deploy production-ready agents powered by GPT-5.4 and Codex to run real enterprise workloads at scale," said Rohan Varma, Product, Codex, OpenAI.

Market shift

Cloudflare's latest move places it more directly in competition with cloud providers and developer platform companies trying to become the default infrastructure layer for AI software. The contest is no longer only about hosting models, but also about who manages execution, state, storage and security as agents begin acting more like software workers than chat interfaces.

For developers, the significance will depend on whether these tools reduce the cost and complexity of running agents in production. Cloudflare's argument is that the next wave of AI software will require infrastructure built around short-lived execution, persistent environments and repository systems that can handle machine-generated code at much greater volume.

Developers using the service will be able to access OpenAI and open-source models through one platform while pairing them with persistent Linux environments, Git-compatible storage and frameworks for long-running tasks.