I use it to help me write text.
I don't use any tools or wrappers; I run it straight from the command line:
./main -f ~/Desktop/prompts/multishot/llama3-few-shot-prompt-10.txt -m ~/Desktop/models/Meta-Llama-3-8B-Instruct-Q8_0.gguf --temp 0 --color -c 1024 -n -1 --repeat_penalty 1.2 -tb 8 --log-disable 2>/dev/null
I prefer `main` to the new `llama-cli` because when I search my shell history for "llama" I want to match commands that use "llama" models, not "mistral" ones, for example.
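Those flags pack a fair bit of meaning, so here is a sketch of a tiny wrapper that assembles the same invocation and prints it for a dry run. The `build_cmd` name and the dry-run printing are my own invention; the flags and paths are the ones from the command above.

```shell
#!/bin/sh
# Hypothetical helper: build the llama.cpp `main` invocation used above.
# Flag meanings:
#   --temp 0            greedy sampling, so output is deterministic
#   -c 1024             context window of 1024 tokens
#   -n -1               generate until the model emits end-of-sequence
#   --repeat_penalty 1.2  discourage verbatim repetition
#   -tb 8               8 threads for prompt (batch) processing
#   --log-disable       suppress llama.cpp's log output
build_cmd() {
  prompt="$1"
  # Default model path is the one from the command above; override via $MODEL.
  model="${MODEL:-$HOME/Desktop/models/Meta-Llama-3-8B-Instruct-Q8_0.gguf}"
  printf '%s' "./main -f $prompt -m $model --temp 0 --color -c 1024 -n -1 --repeat_penalty 1.2 -tb 8 --log-disable"
}

# Dry run: print the command instead of executing it.
build_cmd "$HOME/Desktop/prompts/multishot/llama3-few-shot-prompt-10.txt"
```

Keeping the flags in one script also sidesteps the history-search problem: one memorable name per model family.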
I've not used it myself, but I see a lot of people referencing Ollama. It uses llama.cpp (and maybe more).