Gus Mueller on Apple and AI

Gus Mueller argues that Apple needs to support developers running LLMs locally on the Mac, rather than simply try to catch up on their own.

There isn’t one specific quote I can pull from Gus’ post that makes his point, so do go and read the whole thing. But he also said:

So I can run models locally on my M1 Mac, and while it’s not as fast as running it on Anthropic or OpenAI’s servers, it was still usable. Which is mind blowing to me. I honestly never expected to see this tech in my lifetime. (Yes, LLMs get a lot wrong, but they also get so many things right and help me out with tedious coding chores).

I prefer running LLMs locally. I ran them on my Intel-based Mac, and now on this M4-based Mac it is so much nicer. The models are getting smaller and more capable, the tooling is steadily improving, and the chips in our computers are amazing. So let us run them locally! Natively! Securely!

I have more to write about LLMs. They are amazing and infuriating, an abomination and a revelation, a coup and a boon, a bubble and a burst – all at the same time.
