As per the title...
I'm spec'ing a machine for running (not training; I have another machine for that) an LLM. I have a GPU for the heavy lifting, but I'm having trouble finding information on how much of the budget should go to the CPU for this kind of system.
Does the CPU just handle normal system tasks (disk, network, standard OS processes, etc.), or is it more involved in running an LLM even when a GPU is doing the inference?
What would be a good choice of CPU for a mid-range, locally hosted, private chatbot? Low(ish) TDP would be a bonus.
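For context, here's roughly how I expect to run it; a minimal sketch assuming llama-cpp-python with everything offloaded to the GPU (the model path is just a placeholder for whatever GGUF I end up using). Mainly wondering what the CPU is left doing in a setup like this.

    # Minimal sketch, assuming llama-cpp-python and full GPU offload;
    # "model.gguf" is just a placeholder path.
    from llama_cpp import Llama

    llm = Llama(
        model_path="model.gguf",  # placeholder model file
        n_gpu_layers=-1,          # offload all layers to the GPU
        n_ctx=4096,               # context window for the chatbot
    )

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Hello!"}],
        max_tokens=128,
    )
    print(out["choices"][0]["message"]["content"])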
Thanks