AI Recommended CPU Only Build for Local LLM

I've been wanting to host my own LLM — the large 70B-class models — at minimal cost. GPUs, while I'm inclined to invest in them eventually, are quite costly, but the thing is I don't really need super fast inference; as long as I have a machine that can slowly churn through data throughout the day, that's fine.

I've seen it mentioned on Reddit multiple times that in this case the most cost-effective option might be a server-grade CPU with enough memory bandwidth (a high max number of memory channels), so I did some research, consulted Perplexity, and this is the build I'm thinking of now:

  1. CPU: AMD EPYC 7282
  2. Supermicro H11DSI motherboard
  3. Cooler: Arctic Freezer 4U SP3
  4. RAM: 8 x 16 GB DDR4 RDIMM
  5. Boot Drive: Crucial P3 Plus 1TB NVMe
  6. Power Supply: EVGA SuperNOVA 750 GT

All this comes to roughly $1,200 with tax, I think (?). That should be enough memory to run a Mistral MoE model, maybe?
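For what it's worth, here's the back-of-envelope math I did to sanity-check the "memory bandwidth is what matters" idea. CPU inference is roughly memory-bandwidth bound, so tokens/sec ≈ achievable bandwidth ÷ bytes read per token. All the numbers below are assumptions (theoretical 8-channel DDR4-3200 bandwidth, ~60% efficiency, Mixtral-8x7B-style parameter counts at 4-bit), not measurements — and it's worth double-checking that the specific EPYC SKU can actually drive all 8 channels at full rate, since some lower-end parts have reduced effective bandwidth:

```python
# Rough estimate: CPU inference speed from memory bandwidth.
# All numbers are assumptions, not benchmarks.

GB = 1e9  # decimal GB for simplicity

# 8 channels of DDR4-3200: 8 * 25.6 GB/s theoretical peak,
# assuming the CPU SKU can actually use all 8 channels at full rate.
peak_bandwidth = 8 * 25.6 * GB
effective_bandwidth = 0.6 * peak_bandwidth  # assume ~60% achievable in practice

# Mixtral-8x7B-style MoE: ~46.7B total params, ~12.9B active per token.
total_params = 46.7e9
active_params = 12.9e9
bytes_per_param = 0.5  # 4-bit quantization ≈ 0.5 bytes/param

# Each generated token has to stream the active weights through memory once.
bytes_per_token = active_params * bytes_per_param
tokens_per_sec = effective_bandwidth / bytes_per_token
print(f"~{tokens_per_sec:.0f} tok/s under these assumptions")

# Do the quantized weights fit in 128 GB of RAM?
weights_gb = total_params * bytes_per_param / GB
print(f"quantized weights ≈ {weights_gb:.0f} GB of 128 GB")
```

Under those assumptions it works out to a handful-to-tens of tokens per second, and the quantized weights fit in 128 GB with plenty of headroom — slow but workable for my use case, if the math holds up.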

And then I'm thinking of adding one GPU at a time, kind of like a gift for myself after each paycheck.

Does this build make sense? I've never built a computer before, so I wanted some confirmation that this could work lol.

Also, if anyone has a recommendation for a case that could fit all this, it would be much appreciated. Thanks in advance, and I hope this post + the comments help other budget-constrained people run their own local LLMs.