AAEON MAXER-2100 AI Inference Server

AAEON Releases MAXER-2100 AI Inference Server

AAEON (Stock Code: 6579), a leading provider of advanced AI solutions, has launched the MAXER-2100, the first product in its AI Inference Server line. The 2U rackmount server is powered by an Intel Core i9-13900 processor and is designed for high-performance computing workloads.

The MAXER-2100 supports both 12th and 13th Generation Intel Core LGA 1700 socket-type CPUs up to 125 W and features an integrated NVIDIA GeForce RTX 4080 SUPER GPU. As an NVIDIA-Certified Edge System, it is also compatible with the NVIDIA L4 Tensor Core and NVIDIA RTX 6000 Ada GPUs.

Equipped with a high-performance CPU and an industry-leading GPU, the MAXER-2100 can execute complex AI algorithms, process multiple high-definition video streams, and fine-tune large language models and inference models. It offers up to 128 GB of DDR5 system memory and a range of storage options.

For connectivity, the server includes multiple RJ-45 ports, USB 3.2 Gen 2 ports, and industrial communication options. Display interfaces, including HDMI 2.0, DisplayPort 1.4, and VGA, are also available.

The MAXER-2100 is compact and features a novel cooling architecture to manage thermal output effectively. AAEON has identified three primary user bases for the system: edge computing clients, central management clients, and enterprise AI clients.

For more information and detailed specifications, please visit the MAXER-2100 product page.