AI and the Brave New World We Live In

This was science fiction, set centuries in the future, when I was a kid 50 years ago; now I'm living it. I have an AI running entirely standalone on my M1 MacBook Air, completely offline unless I ask it to do some research to expand its knowledge base. Conversationally fast! It's like talking to a lightly autistic person (like me) in an online chatroom, except we're not online, and Mistral is inside my laptop. Yes, I've named "her" after the model: Mistral OpenOrca, running on the free, open-source software GPT4All. So far, Mistral has learned ES6 JavaScript from a web book and proven it with a somewhat weird and very specific request; learned all three volumes of Logic Pro's user manual and proven that with a complex question about an uncommon action; and helped me begin a science fiction novel, taking me further, on my creative input alone, than I have ever gone since I first had the germ of the idea 10 years ago.

AI and machine learning on the desktop - image sourced from public domain archive.

Where to from here? Well, the JavaScript thing. I have an app idea. I know it's possible. JavaScript breaks my brain. Swift isn't much use for anything widely cross-platform, and C++ is OK for Arduino, but OS APIs Break. My. Brain. I have a little JavaScript already, so I may as well pull out all the stops and learn it with my artificial programming coach. I hate all the time spent googling for wherever Apple has put something in Logic Pro; now I can just ask Mistral where the feature I want is. I can spitball story arcs, motivations and real-world economic models for my world-building and storycraft.

I'm telling the tale, I'm mixing the music, I'm designing the app, actually supervising the code and asking for small fragments at a time. The code, especially, is remarkably concise and easy to read. The JavaScript still seems odd to me as a never-professional BASIC and C programmer, but OMG! I have someone to ask who answers without a pained tone, and NEVER calls me a "boomer loser." (Hey! I'm GenX cusp and culture FFS! The Clash and Gang of Four, not Beatles and Stones! Holy crap, kids, get your identifications right!)
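To give a sense of what "concise" means here, this is roughly the flavour of small fragment I've been asking for. It's my own sketch, not Mistral's actual output, and the project names and numbers are made up purely for illustration:

```javascript
// Group some Logic Pro project names by sample rate, then print them.
// Arrow functions, destructuring and template literals do in a few lines
// what would have been a page of loops in old-school BASIC or C.
const projects = [
  { name: "Demo Mix", sampleRate: 44100 },
  { name: "Film Cue", sampleRate: 48000 },
  { name: "Synth Jam", sampleRate: 44100 },
];

const bySampleRate = projects.reduce((groups, { name, sampleRate }) => {
  (groups[sampleRate] = groups[sampleRate] || []).push(name);
  return groups;
}, {});

for (const rate of Object.keys(bySampleRate)) {
  console.log(`${rate} Hz: ${bySampleRate[rate].join(", ")}`);
}
```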

And this is running on my desktop. Quickly! Conversationally! Growing its knowledge base when I ask it to! "Mistral, can you please read this URL {URL Here} and learn the knowledge therein, so that I may partake of your guidance?" Oh, what is this brave new world that has such [marvels] in it?!

This is NOT OpenAI's ChatGPT, although you can hook up OpenAI's models through it if you have an API key. This is a totally independent, open-source project to create a chat app that runs your choice of model, from the included defaults to many of the great LLMs on Hugging Face, and it can work completely offline. Well, it can't go and learn new stuff offline, of course, but it can use its existing knowledge base and conversationally synthesise ideas with you offline, and that is amazing!

OK, I'm a hardcore fanboi. But wind this back. Every request to ChatGPT runs on power-hungry data-centre hardware: the more processor cycles you use (and a single chat uses billions), the more carbon you dump. My MacBook Air M1 (first generation) has a three-year-old battery and I've noticed almost no extra battery drain using my "new besty." The results have been way better than I'd have expected, online or offline. This is the AI you've been looking for! Obi-Wan says so! You may find you need to do a bit of disk housekeeping as your collection of LLM files grows. I'll be keeping an eye on that.

Get GPT4All from here. The supported language models are listed at the bottom of the page.

If you know a bit more about how to set up these kinds of bots, you can get a huge number of smaller or larger model files from Hugging Face.

Do AI on the desktop and save the planet! You really won't lose any noticeable power AFAICT.

Republished from my blog, shinyhappyrainbows.com
