Grok-1 converted to PyTorch fp16 (638GB lol)
https://huggingface.co/hpcai-tech/grok-1 (I'm not the author!)
Maybe someone can quantize this 638 GB monster? (see the sketches below)
Although to cram it into a somewhat reasonable personal computer (128 GB RAM + 2x3090 = 176 GB total) you'd need to get under roughly 176/638 × 16 ≈ 4.4 bpw, and in practice a bit less to leave headroom for activations and KV cache.
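The budget arithmetic as a quick Python sketch, in case anyone wants to plug in their own hardware (the 638 GB fp16 size is from the repo above; the memory figures are just this example machine):

```python
# Back-of-the-envelope: what bits-per-weight fits a given memory budget?
FP16_SIZE_GB = 638  # size of the fp16 checkpoint
FP16_BITS = 16

def max_bpw(budget_gb: float) -> float:
    """Largest bits-per-weight that squeezes the checkpoint into budget_gb."""
    return budget_gb / FP16_SIZE_GB * FP16_BITS

print(max_bpw(128 + 2 * 24))  # 128 GB RAM + 2x3090 -> ~4.4 bpw
print(max_bpw(128))           # system RAM only    -> ~3.2 bpw
print(max_bpw(2 * 24))        # VRAM only          -> ~1.2 bpw
```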
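For the quantization itself, here's a minimal sketch of 4-bit loading via transformers + bitsandbytes (NF4 with double quantization lands around 4.1 effective bpw, just under the budget). Big caveats: I haven't tested whether the hpcai-tech repo's custom modeling code plays nicely with bitsandbytes, and bitsandbytes wants the quantized weights on GPU, so on a 2x3090 box you'd realistically want a GGUF-style quant with CPU offload (llama.cpp) instead; this just shows the idea:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# NF4 + double quantization: ~4.1 effective bits per weight
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "hpcai-tech/grok-1",
    quantization_config=bnb_config,
    device_map="auto",       # shard layers across available GPUs
    trust_remote_code=True,  # the repo ships custom modeling code
)
```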