NVIDIA Tesla A100 80G Deep Learning GPU Computing Graphics Card OEM Version

Original price: $12,000.00. Current price: $7,900.00 (34% off).

Delivery

We ship to all 50 states and Washington, DC. All orders are shipped with a UPS tracking number. Shipping is always free on orders over US $200. During sale periods and promotions, delivery may take longer than normal.

Return

Elessi will accept exchanges and returns of unworn and unwashed garments within 30 days of the date of purchase (14 days during sale periods), on presentation of the original till receipt at any store where the corresponding collection is available within the country of purchase. Your return will usually be processed within one to one and a half weeks. We will send you a Return Notification email once the return has been completed. Please allow 1-3 business days for the refund to reach the original form of payment once the return has been processed.

Help

Give us a shout if you have any other questions or concerns. Email: contact@mydomain.com. Phone: +1 (23) 456 789.

Guaranteed Safe Checkout

  • Free Worldwide Shipping
  • 100% Guaranteed Satisfaction
  • 30-Day Money-Back Guarantee

SPECIFICATIONS

The figures below cover all four A100 variants: A100 40GB PCIe, A100 80GB PCIe, A100 40GB SXM, and A100 80GB SXM. This listing is the 80GB card.

FP64: 9.7 TFLOPS
FP64 Tensor Core: 19.5 TFLOPS
FP32: 19.5 TFLOPS
Tensor Float 32 (TF32): 156 TFLOPS | 312 TFLOPS*
BFLOAT16 Tensor Core: 312 TFLOPS | 624 TFLOPS*
FP16 Tensor Core: 312 TFLOPS | 624 TFLOPS*
INT8 Tensor Core: 624 TOPS | 1,248 TOPS*
GPU Memory: 40GB HBM2 (40GB models) | 80GB HBM2e (80GB models)
GPU Memory Bandwidth: 1,555GB/s (40GB PCIe/SXM) | 1,935GB/s (80GB PCIe) | 2,039GB/s (80GB SXM)
Max Thermal Design Power (TDP): 250W (40GB PCIe) | 300W (80GB PCIe) | 400W (40GB and 80GB SXM)
Multi-Instance GPU: Up to 7 MIGs @ 5GB (40GB models) | Up to 7 MIGs @ 10GB (80GB models)
Form Factor: PCIe | SXM
Interconnect (PCIe): NVIDIA® NVLink® Bridge for 2 GPUs: 600GB/s**; PCIe Gen4: 64GB/s
Interconnect (SXM): NVLink: 600GB/s; PCIe Gen4: 64GB/s
Server Options (PCIe): Partner and NVIDIA-Certified Systems™ with 1-8 GPUs
Server Options (SXM): NVIDIA HGX™ A100-Partner and NVIDIA-Certified Systems with 4, 8, or 16 GPUs; NVIDIA DGX™ A100 with 8 GPUs

* With sparsity.
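
Once the card is installed, the memory size and TF32 capability listed above can be sanity-checked from software. The sketch below is a minimal example, assuming a CUDA-enabled PyTorch build on the host; the device index 0 is an assumption, and the reference values in the comments (compute capability 8.0, roughly 80 GiB of memory, 108 SMs) come from NVIDIA's published A100 specifications, not from this listing.

```python
# Minimal sketch: confirm an A100 80GB is visible and enable TF32 math,
# which is what the "Tensor Float 32 (TF32)" row above refers to.
# Assumes a CUDA-enabled PyTorch install; device index 0 is an assumption.
import torch

def check_a100(device_index: int = 0) -> None:
    if not torch.cuda.is_available():
        raise RuntimeError("No CUDA device visible to PyTorch")

    props = torch.cuda.get_device_properties(device_index)
    total_gib = props.total_memory / (1024 ** 3)

    print(f"Name:               {props.name}")
    print(f"Compute capability: {props.major}.{props.minor}")   # A100 reports 8.0
    print(f"Total memory:       {total_gib:.1f} GiB")           # roughly 80 GiB for the 80GB card
    print(f"Multiprocessors:    {props.multi_processor_count}") # 108 SMs on A100

    # Allow TF32 Tensor Core math for FP32 matmuls and convolutions
    # (supported on Ampere-class GPUs such as the A100).
    torch.backends.cuda.matmul.allow_tf32 = True
    torch.backends.cudnn.allow_tf32 = True

if __name__ == "__main__":
    check_a100()
```

If the reported memory is closer to 40 GiB, the installed card is the 40GB variant rather than the 80GB model in this listing.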