Multitronic Oy
Korsholmsesplanaden 38
65100 Vasa
Web: www.multitronic.fi
Phone: 06 - 319 77 00
E-mail: info@multitronic.fi

Accelerator Card

Manufacturer: QNAP SYSTEMS
ID: MUSTANG-V100-MX8-R10
Price: 1547,90 €
Not available at the moment. Estimated delivery: Unknown
Product specification
Description
In-store stock
Delivery
Price history
Connectivity
Internal: Yes
Host interface: PCIe
Weight & dimensions
Height: 23.2 mm
Width: 169.5 mm
Depth: 80 mm
Operational conditions
Operating relative humidity range: 5 - 90%
Operating temperature: 5 - 55 °C
Package contents
Quantity: 1
Design
Number of fans: 1
Cooling type: Active
Performance
Chipset: Intel Movidius Myriad X MA2485
Product colour: Black
EAN: 4713291730896
Warranty: 2 years
Source: Icecat.biz

As QNAP NAS evolves to support a wider range of applications (including surveillance, virtualization, and AI), you not only need more storage space on your NAS, but also need the NAS to have greater power to optimize targeted workloads. The Mustang-V100 is a PCIe-based accelerator card using an Intel® Movidius™ VPU that drives the demanding workloads of modern computer vision and AI applications. It can be installed in a PC or compatible QNAP NAS to boost performance, making it a perfect choice for AI deep learning inference workloads.

OpenVINO™ toolkit
The OpenVINO™ toolkit is based on convolutional neural networks (CNNs); it extends workloads across Intel® hardware and maximizes performance.

It can optimize pre-trained deep learning models from frameworks such as Caffe, MXNet, and TensorFlow into an Intermediate Representation (IR) binary, then execute the inference engine heterogeneously across Intel® hardware such as CPUs, GPUs, the Intel® Movidius™ Neural Compute Stick, and FPGAs.
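As a rough sketch of the convert-then-infer flow described above, the commands below use OpenVINO's Model Optimizer (`mo`) and the bundled `benchmark_app` tool. File names and paths are placeholders, and exact flags vary between OpenVINO releases, so treat this as illustrative rather than definitive:

```shell
# Convert a pre-trained Caffe model into OpenVINO IR files (.xml + .bin).
# "model.caffemodel" is a placeholder; FP16 precision suits VPU targets.
mo --input_model model.caffemodel --data_type FP16 --output_dir ir/

# Run the converted IR model on the Mustang-V100's Myriad X VPUs.
# The HDDL device plugin schedules inference across the card's eight VPUs.
benchmark_app -m ir/model.xml -d HDDL
```

The same IR files can be targeted at a CPU or GPU instead by changing the `-d` device argument, which is the heterogeneity the paragraph above refers to.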

Get deep learning acceleration on Intel-based servers and PCs
You can insert the Mustang-V100 into a PC or workstation running Linux® (Ubuntu®) to gain computational acceleration for optimal performance in applications such as deep learning inference, video streaming, and data center workloads. As an ideal acceleration solution for real-time AI inference, the Mustang-V100 can also work with the Intel® OpenVINO™ toolkit to optimize inference workloads for image classification and computer vision.

QNAP NAS as an Inference Server
The OpenVINO™ toolkit extends workloads across Intel® hardware (including accelerators) and maximizes performance. When used with QNAP's OpenVINO™ Workflow Consolidation Tool, an Intel®-based QNAP NAS makes an ideal inference server that helps organizations quickly build an inference system. Providing a model optimizer and inference engine, the OpenVINO™ toolkit is easy to use and flexible enough for high-performance, low-latency computer vision that improves deep learning inference. AI developers can deploy trained models on a QNAP NAS for inference, and install the Mustang-V100 to achieve optimal inference performance.

- Half-height, half-length, single-slot compact size
- Low power consumption: approximately 2.5 W per Intel® Movidius™ Myriad™ X VPU
- Supports the OpenVINO™ toolkit; ready for AI edge computing
- Eight Intel® Movidius™ Myriad™ X VPUs can execute eight topologies simultaneously

MT Vasa: 0 pcs
Estimated delivery: Unknown
Jakobstad: 0 pcs
Estimated delivery: Unknown
Seinäjoki: 0 pcs
Estimated delivery: Unknown
Jyväskylä: 0 pcs
Estimated delivery: Unknown
Lappeenranta: 0 pcs
Estimated delivery: Unknown
Mariehamn: 0 pcs
Estimated delivery: Unknown
iT Vasa: 0 pcs
Estimated delivery: Unknown
Store pickup
Estimated delivery time: Unknown
Free
Posti parcel locker
Estimated delivery time: Unknown
Free
Postal parcel
Estimated delivery time: Unknown
Free
Matkahuolto bus parcel
Estimated delivery time: Unknown
Free
To-the-door parcel
Estimated delivery time: Unknown
2,90 €
Matkahuolto local parcel
Estimated delivery time: Unknown
6,20 €
Home parcel
Estimated delivery time: Unknown
11,70 €