The new joint solution enables department-specific AI inference, combining efficiency, security, and low cost.
NetApp and Intel have announced the launch of NetApp AIPod Mini, an integrated solution designed to facilitate access to artificial intelligence (AI) for specific business units within companies. Aimed at simplifying the adoption of AI inference models in departments without technical expertise or large budgets, AIPod Mini will be available in the summer of 2025 and promises to transform the approach to AI in business environments.
AI Inference Within Everyone’s Reach
According to Dallas Olson, Chief Commercial Officer of NetApp, “our mission is to unlock the potential of artificial intelligence for every team, at every level, without barriers of complexity or cost.” AIPod Mini presents itself as an affordable and easily deployable alternative that allows rapid customization of AI applications to meet the needs of diverse areas such as legal, sales, retail, or manufacturing.
Thanks to pre-packaged workflows based on RAG (Retrieval-Augmented Generation), the solution combines generative models with private business data to deliver contextual, actionable responses while protecting data privacy and security.
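To make the RAG pattern concrete, here is a minimal sketch of the idea: retrieve the most relevant private documents for a query, then inject them as context into a prompt for a generative model. Everything here is illustrative, not NetApp's implementation; a real deployment would use a proper embedding model and an LLM rather than the toy bag-of-words retrieval shown.

```python
# Minimal RAG sketch (illustrative only): retrieve relevant private
# documents, then ground the model's prompt in them.
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy "embedding": a term-frequency bag-of-words vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    # Rank company documents by similarity to the query; keep the top k.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    # Retrieved private documents become context, so the generative model
    # answers from company data rather than from its training memory alone.
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Contract renewals for the legal team are due every March.",
    "Retail pricing is updated nightly from the sales feed.",
    "Factory sensors report maintenance alerts to the ops dashboard.",
]
print(build_prompt("When are contract renewals due?", docs))
```

Because retrieval and generation run where the data lives, the private documents never leave the local environment, which is the privacy property the article highlights.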
Intel Xeon 6 and NetApp ONTAP: Power and Security
The solution is built on Intel Xeon 6 processors with Intel AMX (Advanced Matrix Extensions), which accelerate inference tasks with energy efficiency and high scalability. For data storage and management, it relies on NetApp’s all-flash platform and its ONTAP system, recognized for its cyber resilience, version control, traceability, and embedded regulatory compliance.
According to Greg Ernst, corporate vice president of Intel for the Americas, “this combination puts the power of AI into the hands of business users without them having to deal with oversized infrastructure or unnecessary technical complexity.”
Three Key Pillars: Cost, Simplicity, and Security
NetApp AIPod Mini has been designed with three main objectives:
- Affordable: with an entry price tailored to departmental budgets, it offers enterprise-level performance without unnecessary additional costs.
- Simple: its pre-validated design allows for rapid implementation, frictionless integrations, and customization without overhead.
- Secure: by processing data locally, it protects privacy and facilitates regulatory compliance through NetApp ONTAP’s governance capabilities.
Use Cases Tailored to Each Sector
The solution is designed for localized AI inference, covering specific cases such as:
- Draft automation and document retrieval in legal teams
- Personalization of shopping experiences and dynamic pricing in retail
- Predictive maintenance and supply chain optimization in manufacturing
This modular, data-centric approach contrasts with generic, poorly adapted AI, which, according to a Harvard Business School study, can limit return on investment when not adequately customized.
Availability and Launch Partners
NetApp AIPod Mini will be available in the summer of 2025 through distributors and strategic partners, including Arrow Electronics, TD SYNNEX, Insight Partners, CDW (USA and UK), Presidio, and Long View Systems. These integrators will provide support and specialized services to ensure successful implementation.
With this initiative, NetApp and Intel take a significant step toward bringing artificial intelligence to all levels of the enterprise, enabling functional teams to leverage their own data for real results, without the need to rely on complex infrastructures or advanced data science knowledge.
Source: NetApp