AI Training Calculator

Estimate VRAM usage and find the best GPU configuration for your model, whether you are running inference or fine-tuning.

VRAM Breakdown

Estimated total: 21.2 GB

Component            Size       Share
Model Weights        16.0 GB    75%
KV Cache             0.5 GB     3%
Activations          3.2 GB     15%
Framework Overhead   1.5 GB     7%
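The figures above are consistent with an 8B-parameter model served in FP16 (8e9 params x 2 bytes = 16.0 GB of weights). Below is a minimal sketch of how such a breakdown can be computed, assuming Llama-3-8B-like shapes (32 layers, 8 KV heads, head dim 128), a 4096-token FP16 KV cache, activations estimated at roughly 20% of weights, and a flat 1.5 GB framework overhead. These constants are assumptions, not the calculator's published formulas:

```python
# Hypothetical sketch of the VRAM breakdown above; all constants are assumed.
GB = 1e9  # decimal gigabytes, matching the page's "16.0 GB" for 8e9 FP16 params

def vram_breakdown(params: float,
                   bytes_per_param: int = 2,       # FP16/BF16 weights
                   n_layers: int = 32,             # Llama-3-8B-like shape (assumed)
                   n_kv_heads: int = 8,
                   head_dim: int = 128,
                   context_len: int = 4096,
                   kv_bytes: int = 2,              # FP16 KV cache
                   activation_frac: float = 0.20,  # rough heuristic: ~20% of weights
                   overhead_gb: float = 1.5):      # flat framework/CUDA overhead
    weights = params * bytes_per_param / GB
    # K and V per layer per token: n_kv_heads * head_dim values each, kv_bytes wide
    kv_cache = 2 * n_layers * n_kv_heads * head_dim * kv_bytes * context_len / GB
    activations = weights * activation_frac
    total = weights + kv_cache + activations + overhead_gb
    return {"weights": weights, "kv_cache": kv_cache,
            "activations": activations, "overhead": overhead_gb, "total": total}

print(vram_breakdown(8e9))
# weights 16.0, kv_cache ~0.54, activations 3.2, overhead 1.5 -> total ~21.2 GB
```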

GPU Requirements

GPU          VRAM / Card   Cards Needed   Total VRAM   Utilization
A100 40GB    40 GB         1x             40 GB        53%   (recommended)
B200         192 GB        1x             192 GB       11%
H200         141 GB        1x             141 GB       15%
H100 NVL     94 GB         1x             94 GB        23%
A100 80GB    80 GB         1x             80 GB        27%
H100 SXM5    80 GB         1x             80 GB        27%
L40S         48 GB         1x             48 GB        44%
RTX 4090     24 GB         1x             24 GB        88%

Utilization is the estimated 21.2 GB requirement as a share of each configuration's total VRAM.
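The per-GPU numbers reduce to two formulas: cards needed = ceil(required VRAM / VRAM per card), and utilization = required VRAM / total VRAM. Here is a sketch under assumptions: the catalog values come from the table above, but the "recommended" selection rule is a guess (tightest fit that still leaves roughly 20% headroom, which reproduces the A100 40GB pick while skipping the 88%-full RTX 4090):

```python
import math

# Hypothetical reconstruction of the table's arithmetic; only the catalog and
# the two formulas come from the page, the recommendation rule is assumed.
GPUS = {  # VRAM per card in GB, from the table above
    "A100 40GB": 40, "B200": 192, "H200": 141, "H100 NVL": 94,
    "A100 80GB": 80, "H100 SXM5": 80, "L40S": 48, "RTX 4090": 24,
}

def gpu_requirements(required_gb: float):
    rows = []
    for name, vram in GPUS.items():
        cards = math.ceil(required_gb / vram)  # e.g. ceil(21.2 / 40) = 1
        total = cards * vram
        util = required_gb / total             # e.g. 21.2 / 40 = 53%
        rows.append((name, vram, cards, total, util))
    return rows

def recommend(rows, headroom: float = 0.80):
    # Assumed rule: tightest fit that keeps utilization under ~80%; this picks
    # the A100 40GB (53%) and skips the RTX 4090 (88%), matching the page.
    fitting = [r for r in rows if r[4] <= headroom]
    return max(fitting, key=lambda r: r[4])

rows = gpu_requirements(21.2)
print(recommend(rows)[0])  # -> A100 40GB
```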