- SeekStorm - sub-millisecond full-text search library & multi-tenancy server in Rust · zhenbo_endle@lemmy.ca · 3 months ago
- Rust and Neovim - A Thorough Guide and Walkthrough (rsdlt.github.io) · zhenbo_endle@lemmy.ca · 7 months ago
- Mozilla-Ocho/llamafile: Distribute and run LLMs with a single file. (github.com) · zhenbo_endle@lemmy.ca · 7 months ago
- Local LLaMA Server Setup Documentation (github.com) · zhenbo_endle@lemmy.ca · 7 months ago
- Running Local LLMs, CPU vs. GPU - a Quick Speed Test (dev.to) · zhenbo_endle@lemmy.ca · 7 months ago