Forget Moore's Law, here's (Jensen) Huang's Law?
NVIDIA continues its dominance in the GPU space
Moore's Law says the number of transistors on a chip doubles roughly every two years; Huang's Law states that the performance of GPUs will more than double every two years (up for debate).
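To put "more than double every two years" in perspective, here's a quick back-of-envelope sketch in Python. The growth factors are my own placeholders (Huang's Law doesn't come with an official number, so the 3x per period is just an assumed stand-in for "more than double"), and the function name is mine too.

```python
# Back-of-envelope: compound growth under Moore's Law vs. Huang's Law.
# Assumption: Moore's Law = 2x every 2 years; Huang's Law = ~3x every 2 years
# (the 3x is a placeholder for "more than double", not an official figure).

MOORE_FACTOR = 2.0   # transistor-count multiplier per 2-year period
HUANG_FACTOR = 3.0   # assumed GPU performance multiplier per 2-year period

def cumulative_gain(factor_per_period: float, years: float, period_years: float = 2.0) -> float:
    """Total multiplier after `years`, compounding once every `period_years`."""
    return factor_per_period ** (years / period_years)

for years in (2, 6, 10):
    print(
        f"After {years:>2} years: "
        f"Moore ~{cumulative_gain(MOORE_FACTOR, years):,.0f}x, "
        f"Huang ~{cumulative_gain(HUANG_FACTOR, years):,.0f}x"
    )
```

Even a modest bump above 2x per period compounds into a large gap within a decade (2x per period gives ~32x after 10 years, while 3x gives ~243x), which is why the distinction between the two "laws" matters at all.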
In my last post, I highlighted how the H100 GPU constraints would have to be overcome sooner rather than later, and with Jensen Huang unveiling the B200 and GB200 yesterday, it seems like that's happening sooner than expected!
In fact, it's coming to fruition a little too fast, with the B200 supposedly delivering 10x faster LLM inference and 7.5x lower energy consumption. It seems like Huang's Law is here to stay year over year, reinforced by NVIDIA's claim of a 30x speedup for resource-intensive applications (e.g. the 1.8T-parameter GPT-MoE) compared to the previous H100 generation.
They're not going to be cheap, probably around $40k USD apiece, so I expect we'll all need to collect our spare change, plus probably a premium (because GPU scarcity), if we want to get our hands on one.