cm0002@programming.dev to LocalLLaMA@sh.itjust.works · English · 2 months ago
Running Local LLMs with Ollama on openSUSE Tumbleweed (news.opensuse.org)
BetaDoggo_@lemmy.world · English · 2 months ago
It also sets the context length to 2k by default, IIRC, which breaks a lot of tasks and gives a bad first impression to users who are likely trying local models for the first time.
brucethemoose@lemmy.world · English · 2 months ago
Yes, and it’s hard to undo, and not obvious!
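For reference, the 2k default can be overridden per request. Here is a minimal sketch, assuming a local Ollama server on its default port (11434) and an already-pulled model (the name `llama3` is just a placeholder), which passes `num_ctx` through the `options` field of the REST API:

```python
import requests

# Request a completion with a larger context window than Ollama's
# 2k-token default by passing num_ctx in the request options.
# Assumes Ollama is listening on localhost:11434 and that the model
# "llama3" has already been pulled (ollama pull llama3).
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Summarize this long document: ...",
        "options": {"num_ctx": 8192},  # raise the context window to 8k tokens
        "stream": False,               # return one JSON object instead of a stream
    },
    timeout=300,
)
response.raise_for_status()
print(response.json()["response"])
```

The same setting can also be baked into a Modelfile (`PARAMETER num_ctx 8192`) or set interactively inside `ollama run` with `/set parameter num_ctx 8192`, which avoids repeating it on every request.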