I am mostly running local models with Ollama on a Mac Mini with 32 GB of on-chip memory. This morning I am experimenting with function calling; yesterday it was a RAG app with a simple web UI. A rough sketch of the function-calling setup is below.
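For anyone curious, this is roughly the shape of it: a minimal sketch, assuming a local Ollama server on the default port, that prompts a model to emit a JSON "function call" and then dispatches it in Python. The model name (mistral) and the get_weather helper are placeholders for illustration, not necessarily what I ran.

```python
# Minimal prompt-based function calling against a local Ollama server.
# Assumptions: Ollama is running on localhost:11434 and a model named
# "mistral" is pulled; get_weather() is a hypothetical stand-in tool.
import json
import requests

def get_weather(city: str) -> str:
    # Hypothetical tool the model can "call"; replace with a real lookup.
    return f"Sunny and 22 C in {city}"

TOOLS = {"get_weather": get_weather}

PROMPT_TEMPLATE = """You can call one function: get_weather(city: str).
Respond ONLY with JSON like {{"function": "get_weather", "arguments": {{"city": "..."}}}}.

User request: {question}"""

def call_model(question: str) -> dict:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "mistral",
            "prompt": PROMPT_TEMPLATE.format(question=question),
            "stream": False,
            "format": "json",  # ask Ollama to constrain output to JSON
        },
        timeout=120,
    )
    resp.raise_for_status()
    # The generate endpoint returns the model output under "response".
    return json.loads(resp.json()["response"])

if __name__ == "__main__":
    call = call_model("What's the weather in Lisbon?")
    result = TOOLS[call["function"]](**call["arguments"])
    print(result)
```

Smaller local models don't always follow the JSON instruction perfectly, so in practice you want a retry or validation step around the parse.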
Historically I have worked in NLP; LLMs have made many of the older techniques obsolete, and that has been fun.
I have also been using the OpenAI APIs for about two years, and now I experiment with Anthropic’s Claude 2 and Google Bard as well. Nothing important; I just want a solid intuition for what the different commercial models can do.