Irdial@lemmy.sdf.org to Linux@lemmy.ml • How can I use a local LLM on Linux to generate a long story?
Ollama provides a Python API, which may be useful here. You could have it generate the story in chunks: ask it to produce a list of key points after each chunk, then pass those points into the subsequent prompts as context. Maybe…
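A minimal sketch of that idea, assuming the `ollama` Python package, a running local Ollama server, and a model name like `llama3` (all assumptions; adjust to your setup). The generation function is injectable so the chaining logic can be tested without a server:

```python
def generate_story(outline, num_chunks, generate=None):
    """Generate a story chunk by chunk, carrying key points forward.

    `generate` is any callable taking a prompt string and returning text;
    by default it calls a local Ollama model (assumed setup).
    """
    if generate is None:
        import ollama  # pip install ollama; needs a running Ollama server

        def generate(prompt):
            # "llama3" is a placeholder model name; use whatever you've pulled.
            return ollama.generate(model="llama3", prompt=prompt)["response"]

    chunks = []
    key_points = outline  # seed the first prompt with the outline
    for i in range(num_chunks):
        chunk = generate(
            f"Key points so far:\n{key_points}\n\n"
            f"Write part {i + 1} of {num_chunks} of the story."
        )
        chunks.append(chunk)
        # Summarize what happened so far, to pass into the next prompt.
        key_points = generate(
            f"List the key plot points of this passage:\n{chunk}"
        )
    return "\n\n".join(chunks)
```

This keeps each prompt short regardless of how long the story gets, at the cost of losing detail that didn't make it into the key points.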
It’s awesome to see a project written in Zig!