tinwhiskers@lemmy.world to Free Open-Source Artificial Intelligence@lemmy.world · English · 11 months ago
New technique to run 70B LLM Inference on a single 4GB GPU (ai.gopubby.com)
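The linked article isn't reproduced here, but its title suggests layer-by-layer (offloaded) inference: rather than holding every transformer layer in GPU memory at once, load one layer's weights at a time, push the activations through it, then free it before loading the next, so peak memory is roughly one layer instead of the full 70B model. The sketch below is a toy illustration of that idea under that assumption, not the article's actual code; the sizes, the `load_layer` helper, and the dict standing in for on-disk shards are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

N_LAYERS, DIM = 8, 16  # toy sizes standing in for ~80 layers x large dims

# A plain dict plays the role of weight shards saved on disk (hypothetical).
disk = {i: rng.standard_normal((DIM, DIM)) / np.sqrt(DIM)
        for i in range(N_LAYERS)}

def load_layer(shards, i):
    """Fetch a single layer's weights, simulating a read from disk."""
    return shards[i]

def layered_inference(shards, x):
    """Run input through all layers while keeping only one layer resident."""
    h = x
    for i in range(N_LAYERS):
        w = load_layer(shards, i)  # load just this layer's weights
        h = np.tanh(h @ w)         # forward pass through the layer
        del w                      # free it before loading the next layer
    return h

x = rng.standard_normal(DIM)
out = layered_inference(disk, x)
print(out.shape)  # (16,)
```

Because each layer only needs its input activations, the result is identical to running the fully loaded model; the trade-off is extra I/O latency per layer, which is presumably what the article optimizes.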