Lexar launches the first “AI storage core” for next-generation edge devices

Lexar has introduced what it describes as the industry’s first AI storage core, a solution specifically designed for the new wave of edge AI devices: AI PCs, smart vehicles, advanced cameras, and robotic systems. The company aims to address an increasingly evident problem: traditional storage is becoming a bottleneck for real-time AI.

According to Gartner forecasts, AI PCs are expected to reach 143 million units by 2026, representing over half of the global PC market. In this context, simply having “more capacity” is no longer enough; more bandwidth, lower latency, greater durability, and much more flexibility are needed to move models, data, and entire systems between devices.

Lexar’s new storage core responds to this scenario with three clear pillars: high performance, high reliability, and high flexibility.


A module of up to 4 TB designed for AI, not just for storage

Unlike conventional memory cards or SSDs, Lexar’s AI storage core is conceived as a removable module of up to 4 TB, designed to:

  • handle multimodal AI workloads in real time,
  • support extremely random I/O patterns,
  • and survive in environments where heat, vibration, or dirt are the norm, not the exception.

The brand’s proposal is to turn the module into a kind of “intelligence cartridge”: it not only stores data but also facilitates moving models, runtime environments, operating systems, and security configurations between compatible machines, thanks to its PCIe boot support. In other words, it’s possible to boot Windows and applications directly from the storage core without a traditional internal SSD.


Three key innovations for the AI era

1. High performance: optimized for LLM, 8K video, and random workloads

Lexar promises sequential read and write speeds significantly higher than those of traditional memory cards, but the focus goes beyond linear performance.

The storage core is optimized for:

  • small 512 B blocks, crucial for typical random I/O in databases, LLMs, and inference engines,
  • and close collaboration between the host and the module through layers like SLC Boost and Read Cache, designed to improve loading of language models, generative image workflows, and other real-time AI tasks.

In other words, it’s not just about copying large files faster, but about consistently responding to thousands or millions of small operations per second, which is critical when deploying complex models directly on the device.
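The gap between sequential throughput and random small-block access described above can be illustrated with a generic micro-benchmark. This is not a Lexar API; the file name and sizes are placeholder assumptions, and because it reads an ordinary cached file it mostly demonstrates the access pattern and per-operation overhead, not raw device speed (a real device test would open the target with `O_DIRECT` to bypass the page cache):

```python
import os
import random
import time

# Placeholder scratch file standing in for the device under test.
PATH = "testfile.bin"
BLOCK = 512      # the small block size highlighted in the article
BLOCKS = 4096

# Create a scratch file to read from.
with open(PATH, "wb") as f:
    f.write(os.urandom(BLOCK * BLOCKS))

fd = os.open(PATH, os.O_RDONLY)

def timed_reads(offsets):
    """Issue one 512 B pread per offset and return the elapsed time."""
    start = time.perf_counter()
    for off in offsets:
        os.pread(fd, BLOCK, off)
    return time.perf_counter() - start

sequential = [i * BLOCK for i in range(BLOCKS)]
shuffled = sequential[:]
random.shuffle(shuffled)  # same offsets, random order

t_seq = timed_reads(sequential)
t_rand = timed_reads(shuffled)

os.close(fd)
os.remove(PATH)

print(f"sequential: {BLOCKS / t_seq:,.0f} reads/s")
print(f"random:     {BLOCKS / t_rand:,.0f} reads/s")
```

On real hardware the random figure drops far more sharply than on a cached file, which is exactly the workload shape that databases, LLM weight loading, and inference engines generate.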

2. High reliability: from laptops to autonomous vehicles

The second pillar is robustness. Leveraging Longsys’s integrated packaging technology, the module offers:

  • protection against dust, water, shocks, and radiation,
  • and in future models, support for an extended temperature range of -40°C to 85°C.

This type of specification is common in automotive, industrial robotics, and outdoor equipment, but less so in consumer-oriented storage solutions. Lexar aims to cover use cases from AI PCs and mobile workstations to autonomous driving systems, logistics robots, and outdoor professional cameras.

3. High flexibility: hot-swappable and device-to-device collaboration

The hot-swappable design allows inserting or removing the storage module while the system is running, without shutdown or restart.

This enables several scenarios:

  • switching AI models or OS versions simply by replacing the module,
  • migrating entire work environments between a PC with AI, a lab setup, and a field device,
  • or physically separating sensitive data when the device is not in use.
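As a rough illustration of what hot-swap support implies for host software, here is a minimal Linux sketch that polls `/sys/block` for block devices appearing or disappearing. This is a generic assumption about how a host might notice a swapped module, not Lexar’s actual integration; production code would subscribe to udev events rather than poll:

```python
import os
import time

def block_devices():
    """Return the set of block device names currently visible to the kernel."""
    try:
        return set(os.listdir("/sys/block"))
    except FileNotFoundError:
        # Non-Linux or restricted environment: nothing to report.
        return set()

def watch(interval=1.0, cycles=None):
    """Report block devices that appear or vanish between polls.

    `cycles=None` runs forever; a number limits the polling loop.
    """
    known = block_devices()
    n = 0
    while cycles is None or n < cycles:
        time.sleep(interval)
        current = block_devices()
        for dev in current - known:
            print(f"device inserted: {dev}")
        for dev in known - current:
            print(f"device removed: {dev}")
        known = current
        n += 1
```

Swapping the module while `watch()` runs would surface an inserted/removed pair, after which the host could remount it or re-point the model loader at the new contents.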

To prevent performance degradation under sustained workloads, Lexar has developed a co-designed thermal solution that maintains temperature control even during prolonged inference sessions or high-resolution video capture.


Five use cases where Lexar aims to make a difference

The company highlights five major domains where its AI storage core intends to become a key component of the architecture:

AI PCs

In computers with dedicated NPU and GPU for AI, storage can be the weakest link if not up to the task. Lexar’s module is designed to:

  • accelerate model loading and switching,
  • enhance the fluidity of LLM workflows and content generation,
  • and enable full portability: users can carry their entire environment on the module and insert it into different mobile workstations.

AI Gaming

Next-gen games are starting to incorporate AI-driven NPCs, dynamic environments, and near real-time content generation, which increases I/O demands.

The storage core offers:

  • high IOPS and fast random reads, reducing load times and micro-stutters,
  • support for high-speed frame rendering and adaptive AI systems that respond to player actions with low latency.

AI Cameras

In professional cameras and smart video systems, the module aims to provide:

  • continuous 4K/8K video recording,
  • real-time AI processing for tracking subjects, scene analysis, and image enhancement,
  • and mechanical and environmental durability suitable for outdoor shoots, action sports, or industrial environments.

Autonomous Driving

In connected vehicles and assisted or autonomous driving, the storage core must process multiple data streams simultaneously from:

  • cameras,
  • radars,
  • LiDAR sensors, and other subsystems.

The combination of impact resistance, future high-temperature support, and sustained performance aims to ensure stable operation even in demanding automotive conditions.

Robotics with AI

In warehouse robots, industrial arms, or autonomous mobile units, space and thermal dissipation are limited. The compact design and robustness of the module allow:

  • integration into factory, logistics, or outdoor robots,
  • updating their “intelligence” simply by changing the module (e.g., different security configurations or specialized models),
  • and supporting continuous operations in environments with vibrations, dust, or rapid temperature changes.

A strategic move in the race for “intelligent storage”

The announcement of Lexar’s AI storage core reflects a broader trend: as AI shifts from the cloud to the edge, storage ceases to be just a data container and becomes a strategic component of on-device AI architecture.

Where a fast SSD once sufficed, now the requirements include:

  • sustained bandwidth for multimodal models and data,
  • very low latency for real-time decision making,
  • physical resilience suitable for vehicles, robots, and field cameras,
  • and logical portability to transfer complete systems between devices.

With this initiative, Lexar positions itself as a key player in the emerging “smart storage for AI” segment, ahead of a landscape where endpoints — not just data centers — will be the true battleground for AI computing in the coming years.
