Atlas v3: The Only AI SDK You Need for Laravel
Atlas v2 did what it was supposed to do. It gave AI features in Laravel a real home. Agents, tools, pipelines, all the organizational structure that was missing. But it sat on top of Prism PHP, which meant every AI request went through someone else's provider layer. Atlas handled the architecture. Prism handled the connection.
That split worked until I needed more. Video generation. Real-time voice. Providers like ElevenLabs and Cohere. Each one meant waiting on Prism's roadmap, and I don't have time to wait for someone else to decide when a feature ships.
So I did what I always end up doing. I built it myself.
Why a Full Rewrite
I tried patching around Prism's limitations. Adding workarounds. Finding ways to extend what was there. But the dependency itself was the constraint. It wasn't a bug I could fix. It was an architecture I'd outgrown.
v3 is a ground-up rewrite of the codebase. Atlas now owns its entire provider layer. No Prism. Direct API integration with seven providers, support for any OpenAI-compatible API, and the ability to build custom drivers for anything else. Nobody's release cycle blocks me anymore.
What It Unlocked
v2 handled text, images, audio, and embeddings through Prism. No real-time voice, video, music or sound effects.
v3 supports thirteen modalities, and the API is consistent across all of them:
Atlas::text('anthropic', 'claude-sonnet-4-20250514')
->instructions('You are a helpful assistant.')
->message('Summarize this document')
->asText();
Atlas::image('openai', 'dall-e-3')
->instructions('A minimalist logo for a developer blog')
->asImage();
Atlas::video('openai', 'sora')
->instructions('A cat walking through a garden')
->asVideo();
Atlas::embed('openai', 'text-embedding-3-small')
->fromInput('Search query here')
->asEmbeddings();Pick a modality, give it instructions, get the output type you asked for. Text, image, audio, speech, music, sound effects, video, voice, embeddings, reranking, vision, structured output.
The one I'm most proud of is real-time voice. Bidirectional conversations where the browser connects directly to the provider for audio while your Laravel server handles tool execution and transcripts. The AI talks to a user, decides it needs to look something up, calls a tool on your server, and keeps going. All in real time.
You can set up a voice session directly:
Atlas::voice('openai', 'gpt-4o-realtime')
->instructions('You are a friendly customer support agent.')
->withTools([...])
->createSession();Or if you already have an agent defined:
Atlas::agent('support')->asVoice();

You can use the same agent with text or voice. The fact that this just works still catches me off guard sometimes. Nobody else in the Laravel ecosystem is doing this right now.
Atlas can also query what providers actually have available:
Atlas::provider('openai')->models();
Atlas::provider('elevenlabs')->voices();

Build a UI where users pick their own model or voice from live data instead of a hardcoded dropdown. Small thing, but it wasn't possible before.
What Else Is New
Persistence. v2 was stateless. v3 ships with an optional eight-table schema. Conversations, messages, execution steps, tool calls, assets, voice calls. Full Eloquent models with branching and polymorphic ownership. Don't need it? Don't publish the migrations.
I know I said Atlas should remain stateless, but for a foundation package, solid observability for the most common use cases felt worth shipping.
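Since the persistence layer is plain Eloquent, querying it looks like any other Laravel model work. Here's a rough sketch of what that might look like; the class names, namespace, and relationship names below are assumptions for illustration, not the package's confirmed API:

```php
<?php

// Hypothetical model and relationship names, assuming an Atlas\Models
// namespace and relations mirroring the eight-table schema.
use Atlas\Models\Conversation;

// Pull the most recent conversation with its messages and the tool
// calls made during each execution step.
$conversation = Conversation::query()
    ->with(['messages', 'steps.toolCalls'])
    ->latest()
    ->first();

foreach ($conversation->messages as $message) {
    echo "{$message->role}: {$message->content}" . PHP_EOL;
}
```

Because these are ordinary Eloquent models, branching and polymorphic ownership should compose with the rest of your app's queries the same way any other relation does.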
Middleware. Pipelines are gone, replaced by four layers (agent, step, tool, provider) plus lifecycle events. Full observability without touching your agents or tools.
The documentation goes deep on all of this.
What Got Removed
Agent decorators, MCP support (it depended on prism-php/relay), and the AnonymousAgent / AgentDefinition split. Nothing got cut without a reason. Everything that's gone either has a better replacement or wasn't pulling its weight. Yes, MCP will come back; I just had to prioritize other things.
Why This Matters to Me
I didn't build Atlas because I wanted to maintain an open source AI package. I built it because I have projects I need to ship. Things I can't talk about just yet, but every time I sat down to build, I was blocked. The tools available couldn't do what I needed, and I can't afford to wait for someone else to get there.
Atlas exists out of necessity. It's the AI SDK for Laravel I need for what I'm building next.
I'm excited to see how everyone receives v3. Check out Atlas on GitHub and let me know what you think. I've got some big projects coming that I'm looking forward to sharing.
Building AI features in Laravel?
I'm writing about everything I learn building Atlas and shipping AI in production. New patterns, new modalities, real problems.
