Uillem – an offline, containerized LLM interface

  • I noticed that running LLMs offline on a personal computer is now possible, but every solution I found required installing dependencies on the host machine. So I created a containerized solution that makes it easy to swap out the model in use: https://github.com/paolo-g/uillem
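
    To make the idea concrete, here is a minimal sketch of the containerized, swap-a-model approach. It is an illustration only, not Uillem's actual code: the llama.cpp server image, flags, and paths below are assumptions.

        #!/usr/bin/env python3
        # Sketch: serve a local model from a container so the host needs no
        # inference dependencies installed. The image tag and flags are
        # assumptions, not taken from the Uillem repo.
        import subprocess
        import sys
        from pathlib import Path

        def serve(model_path: str, port: int = 8080) -> None:
            model = Path(model_path).resolve()
            subprocess.run(
                [
                    "docker", "run", "--rm",
                    "-p", f"{port}:8080",
                    # Mount only the model's directory, read-only; swapping
                    # models is just passing a different file.
                    "-v", f"{model.parent}:/models:ro",
                    "ghcr.io/ggerganov/llama.cpp:server",  # assumed image
                    "-m", f"/models/{model.name}",
                    "--host", "0.0.0.0", "--port", "8080",
                ],
                check=True,
            )

        if __name__ == "__main__":
            # e.g.: python serve.py ~/models/mistral-7b-instruct.Q4_K_M.gguf
            serve(sys.argv[1])

    Because the model lives on the host and is only mounted into the container, trying a different model never means reinstalling anything on the host.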

  • Nice, love seeing Paolo post this! He's a great guy; we used to work together, and I'm excited to see where he takes this.

  • This is pretty neat! Now I just need a good library of models to plug in, haha.

  • Very impressive, can't wait to give it a try!

  • Cool