QNAP Mustang-F100 interface cards/adapter Internal
Product information
Item number: CIEN2-69961884
Manufacturer: QNAP
Manufacturer no.: MUSTANG-F100-A10-R10
EAN/GTIN: 842936100887
Search terms: application accelerator, application accelerators
As QNAP NAS evolves to support a wider range of applications (including surveillance, virtualization, and AI), you not only need more storage space on your NAS, but also greater processing power to optimize targeted workloads. The Mustang-F100 is a PCIe-based accelerator card built on the programmable Intel® Arria® 10 FPGA, providing the performance and versatility of FPGA acceleration. It can be installed in a PC or a compatible QNAP NAS to boost performance, making it a perfect choice for AI deep learning inference workloads.

OpenVINO™ toolkit
The OpenVINO™ toolkit is based on convolutional neural networks (CNN) and extends workloads across Intel® hardware to maximize performance.

It can optimize pre-trained deep learning models from frameworks such as Caffe, MXNet, and TensorFlow into an intermediate representation (IR), and then execute the inference engine heterogeneously across Intel® hardware such as CPUs, GPUs, the Intel® Movidius™ Neural Compute Stick, and FPGAs.
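
For illustration, the conversion step might look like the following minimal Python sketch, which shells out to the Model Optimizer of a legacy (pre-2022) OpenVINO installation. The script name mo_tf.py is the legacy TensorFlow entry point; the model file frozen_model.pb and the output directory are hypothetical placeholders, and the exact flags depend on your OpenVINO version:

    import subprocess

    # Convert a hypothetical pre-trained TensorFlow frozen graph into
    # the OpenVINO intermediate representation (model.xml + model.bin).
    subprocess.run(
        [
            "python3", "mo_tf.py",               # legacy Model Optimizer for TensorFlow
            "--input_model", "frozen_model.pb",  # hypothetical pre-trained model
            "--data_type", "FP16",               # reduced precision, common for FPGA targets
            "--output_dir", "./ir_model",        # where model.xml / model.bin are written
        ],
        check=True,
    )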

Get deep learning acceleration on Intel-based Server/PC
You can install the Mustang-F100 in a PC or workstation running Linux® (Ubuntu®) to gain computational acceleration for applications such as deep learning inference, video streaming, and data center workloads. As an ideal acceleration solution for real-time AI inference, the Mustang-F100 can also work with the Intel® OpenVINO™ toolkit to optimize inference workloads for image classification and computer vision.
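
As a rough sketch of what running inference through the card could look like with the legacy OpenVINO Inference Engine Python API (pre-2022 releases), assuming the IR files from the conversion step above and random data in place of a real image; the HETERO:FPGA,CPU device string asks the runtime to run supported layers on the FPGA and fall back to the CPU for the rest:

    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()

    # Load the IR produced by the Model Optimizer (hypothetical paths).
    net = ie.read_network(model="ir_model/model.xml",
                          weights="ir_model/model.bin")

    # Run supported layers on the Mustang-F100's FPGA, everything else on CPU.
    exec_net = ie.load_network(network=net, device_name="HETERO:FPGA,CPU")

    # Build a dummy input matching the network's NCHW input shape.
    input_name = next(iter(net.input_info))
    n, c, h, w = net.input_info[input_name].input_data.shape
    image = np.random.rand(n, c, h, w).astype(np.float32)

    # Synchronous inference; the result maps output names to arrays.
    result = exec_net.infer(inputs={input_name: image})
    print({name: out.shape for name, out in result.items()})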

QNAP NAS as an Inference Server
The OpenVINO™ toolkit extends workloads across Intel® hardware (including accelerators) and maximizes performance. When used with QNAP's OpenVINO™ Workflow Consolidation Tool, an Intel®-based QNAP NAS makes an ideal inference server that helps organizations quickly build an inference system. Providing a model optimizer and an inference engine, the OpenVINO™ toolkit is easy to use and flexible, enabling high-performance, low-latency computer vision and improved deep learning inference. AI developers can deploy trained models on a QNAP NAS and install the Mustang-F100 to achieve optimal inference performance.
More information:
Design
Internal: Yes
Product colour: Black, Grey
Cooling type: Active
Number of fans: 2 fan(s)
Logistics data
Harmonized System (HS) code: 84733020
Packaging data
Quantity: 1
Operational conditions
Operating temperature (T-T): 5 - 60 °C
Operating relative humidity (H-H): 5 - 90 %
Power
Power consumption (typical): 60 W
Ports & interfaces
Host interface: PCIe
Expansion card form factor: Low-profile
Expansion card standard: PCIe 3.0
Weight & dimensions
Width: 169.5 mm
Depth: 68.7 mm
Height: 33.7 mm
Features
Chipset: Intel Arria 10 GX1150 FPGA
Other search terms: interface card, interface cards, interface adapter, interface adapters, IO port card, IO port cards, I/O port card, I/O port cards, expansion card, expansion cards, expansion board, expansion boards, adapter card, adapter cards
An overview of the conditions
Price: from € 2,704.82* (valid from 500 pieces)
Staggered prices
Order quantity     Net           Gross         Unit
1 piece            € 2,824.09*   € 3,473.63    per piece
from 2 pieces      € 2,808.32*   € 3,454.23    per piece
from 5 pieces      € 2,754.48*   € 3,388.01    per piece
from 10 pieces     € 2,723.16*   € 3,349.49    per piece
from 500 pieces    € 2,704.82*   € 3,326.93    per piece
* Prices with asterisk are net prices excl. statutory VAT.
Our offer is only aimed at companies, public institutions and freelancers.