Recommendations for External GPU Docks for Home Lab Use - Lemmy
(lemmy.ml)
from WQMan@lemmy.ml to selfhosted@lemmy.world on 28 Jun 20:01
https://lemmy.ml/post/32409272
#selfhosted
I haven’t used any but have researched it some:
Minisforum DEG1 looks like the most polished option, but you'd have to add an M.2-to-OCuLink adapter and cable.
ADT-Link makes a wide variety of kits as well, with varying PCIe generations and varying included equipment.
What do you mean, connect to the GPU's GPIO?
What @mierdabird@lemmy.dbzer0.com said, but the adapters aren't cheap. You're going to end up spending more than the 1060 is worth.
A used desktop to slap it in, which you turn on as needed, might make more sense. Doubly so if you can find one with an RTX 3060, which would open up 32B models with TabbyAPI instead of ollama. Some people configure them to wake on LAN and boot straight into an LLM server.
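
For the wake-on-LAN part, a minimal sketch of what the "turn it on as needed" script could look like, assuming the desktop's NIC and BIOS have WoL enabled; the MAC address and broadcast address here are placeholders, not from any specific setup:

```python
# Minimal wake-on-LAN sender: builds the standard "magic packet"
# (6 bytes of 0xFF followed by the target MAC repeated 16 times)
# and broadcasts it over UDP.
import socket

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    payload = bytes.fromhex("FF" * 6 + mac.replace(":", "").replace("-", "") * 16)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(payload, (broadcast, port))

wake("AA:BB:CC:DD:EE:FF")  # placeholder MAC of the LLM box
```

Pair it with an autostarted ollama/TabbyAPI service on the desktop and the box only burns power while you're actually running models.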