As part of Adobe MAX 2025, Adobe and Google Cloud have announced a strategic expansion of their collaboration to integrate Google’s AI models — Gemini, Veo, and Imagen — directly into Adobe applications. This aims to usher in a new era of AI-assisted creation, offering greater precision, control, and variety of models for creative professionals, brands, and marketing teams.
The partnership translates into something very tangible for users: access to Google’s models from Firefly, Photoshop, Adobe Express, Premiere, and other Adobe apps, as well as through the GenStudio platform in the enterprise space. In a later phase, the companies anticipate that enterprise clients will be able to customize and deploy “branded” models via Adobe Firefly Foundry, trained on their own IP and run on Vertex AI with Google’s data commitments (ensuring client data isn’t reused to retrain base models).
What does the partnership mean in practice?
The announcement goes beyond a marketing agreement: it unites the top models in the industry with the professional editing tools that creatives already master, all within a single environment.
- In Adobe apps (Firefly, Photoshop, Express, Premiere…): the ability to select the most suitable model for each task — for example, Imagen for photo-realistic generation, Veo for video, or Gemini for multimodal workflows — and continue editing with pixel-level precision within familiar software.
- In the enterprise (GenStudio + Vertex AI): access to Google’s models with governance and data security over corporate data, content pipeline automation, and in the future, customized models via Firefly Foundry to generate on-brand assets at scale.
- In go-to-market strategies: Adobe and Google Cloud coordinate sales and adoption efforts to bring these capabilities to clients worldwide, focusing on measurable creative productivity (less time on repetitive tasks, more consistency, and controlled versioning).
For teams, the major operational shift is having AI options integrated within the workflow. Instead of toggling between portals and models, AI choices are embedded into the flow: select the model, generate content, edit with familiar tools, and publish or orchestrate across established channels.
Why now (and why together)
Both Adobe and Google Cloud bring proven components to the table. Adobe has evolved Firefly into a comprehensive AI studio, infused Creative Cloud with generative features (Generative Fill, Generative Upscale, Object Mask, Harmonize…), and extended GenStudio into content supply chains. Google, for its part, has scaled Vertex AI with a multimodel offering (Gemini, Imagen, Veo) and enterprise data commitments, along with global infrastructure optimized for AI.
Together, the two companies aim to achieve two objectives:
- Provide true model choice without disrupting the creative workflow.
- Enable secure customization (branded models with Foundry) at production scale, with full traceability and rights management.
The convergence of creativity and AI requires exactly this: precision tools for professionals — who retain control — and governed systems for companies — integrating data, processes, and policies.
Implications for daily creative work
- Higher quality from the first draft. Choosing Imagen or Gemini as the engine behind Generative Fill or within Firefly can reduce iterations when aiming for photo-realism or stylistic consistency.
- Native AI video. Veo opens text-to-video and image-to-video pathways within workflows integrated with Premiere, shortening the gap between storyboard and edit.
- Less friction for small teams. Those working against tight deadlines (social media, e-commerce, editorial) gain speed with templates and appropriate models for each project, all within the familiar Adobe environment.
- Pixel-level control. AI does not replace professional judgment: Photoshop, Premiere, and Lightroom continue to provide fine-tuned tools to perfect the piece, guided by AI as an assistant.
What it offers to the business (marketing, content, branding)
- Branded models: with Firefly Foundry, models trained on a brand’s own IP and designed for safe commercial use, built to generate on-brand series for campaigns, retail media, performance, and multi-format content.
- Execution on Vertex AI with data commitments: a clear separation between client data and base models, plus customization options that avoid leaking knowledge to third parties.
- Operationalizing content: GenStudio integrates planning, production, activation, and measurement; AI stops being just a plugin and becomes a production system with KPIs for time, cost, and performance.
An expanding ecosystem (and connecting with YouTube)
The Google alliance comes shortly after Adobe and YouTube announced a creation space for Shorts within Premiere Mobile (“Create for YouTube Shorts”), enabling direct editing and publishing on Google’s platform. The fit is clear: Google models integrated into Adobe workflows, editing and publishing without leaving the mobile environment, and channels with broad reach. For creators, less friction; for brands, more control and consistency in execution.
Voices behind the partnership
Shantanu Narayen, Adobe CEO, summarized the spirit of the agreement: “This alliance combines Adobe’s creative DNA with Google’s AI models to empower creators and brands through Firefly, Creative Cloud, and Firefly Foundry”. For Thomas Kurian, CEO of Google Cloud, the key is integrating Google’s models into the Adobe ecosystem so that creators, professionals, and brands have tools and platforms to accelerate creation and make possible what was previously infeasible.
Risks and cautions
Both companies emphasize that these are forward-looking statements: regional availability, the initial model catalog, and deployment speed may vary. For enterprises, success depends on:
- Data governance and roles within GenStudio/Vertex.
- Clearing rights on the IP used to train branded models.
- Team training in prompting, best practices, and quality control.
- Impact metrics (content time-to-impact, cost per variation, campaign lift).
Market signals
The model-agnostic integration within Adobe — now with Gemini, Veo, and Imagen — confirms a trend: creators choose the model based on the task, not vice versa. And in business, the combined GenStudio + Vertex AI + Foundry transition suggests moving from siloed tools towards orchestrated platforms where AI is at the heart of the creative and operational flow.
Frequently Asked Questions
Will I be able to choose Google’s model within Photoshop, Firefly, or Premiere?
Yes. The partnership envisions access to Google models within Adobe apps and GenStudio, allowing creators to select the best model for their goal (image, video, multimodal) and continue editing with precision tools.
What guarantees are there regarding my data when customizing branded models?
For enterprise clients, Google’s models run on Vertex AI with data commitments (client data is not used to train Google’s foundation models). Firefly Foundry customization starts from the client’s own IP and Firefly models designed for safe commercial use.
When will it be available and where?
Adobe and Google Cloud describe a global rollout with joint go-to-market activities. Availability may vary by region and product, and some features will arrive gradually. Keep an eye on Adobe MAX updates and the official Newsroom.
How does this connect with the new creation space for YouTube Shorts?
The alliance with YouTube introduces “Create for YouTube Shorts” in Premiere Mobile — offering direct editing and publishing. Combined with Google models in Adobe, it reduces friction between creation and publication within the Google ecosystem.