Removing Ollama From My Mac
The AI experiment in my "laboratory" is over. I'd been running Ollama locally on my laptop: an 8 GB, local, offline language model, driven by local, offline command-line software. Don't get me started on the carbon-pollution environmental disaster created by server-side AI queries like "Does my partner really love me?" The local-only model seemed like an affordable and accountable approach, and my M1 Mac is energy-efficient and fast enough that even really tricky queries were sorted in well under two minutes.

The thing is, I never got a single useful response to a query. Not one that was fit for purpose, anyway, out of Ollama's "mouth." I figured that my ability to describe a problem and its probable solution, architecturally and simply, might result in useful code fragments for my various microcontroller coding and OpenSCAD design tasks. Not one. The work required to make anything useful was at least as much as designing it myself.
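For anyone running the same teardown, here's a minimal removal sketch, assuming the standard macOS app install: the app bundle in /Applications and Ollama's default data directory, ~/.ollama, which is where the multi-gigabyte models live. Check the paths on your own machine before deleting anything.

    # Quit the menu-bar app if it's running
    osascript -e 'tell application "Ollama" to quit'

    # Delete the application bundle
    rm -rf /Applications/Ollama.app

    # Delete downloaded models and settings (this is where the 8 GB model lived)
    rm -rf ~/.ollama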