As my usage of LLMs has been increasing lately, I find myself more and more frustrated with Siri, specifically on the Mac.
As a Mac user, I have this incredible wealth of GPU and CPU power, which in turn allows me to run LLMs locally.
A few weeks ago, before a trip out of the country for my daughter's spring break, I set up a local instance of DeepSeek and made sure I could connect to it via Tailscale running on my Mac.
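For the curious, a setup like that can be sketched in a few commands. This assumes Ollama as the model runtime and a distilled DeepSeek model; the post doesn't say which runner or model size was actually used, so treat these as illustrative:

```shell
# Install Ollama, one common way to run local models on a Mac.
# (This is an assumption -- the runtime isn't named in the post.)
brew install ollama

# Pull a distilled DeepSeek model sized for an M1's unified memory.
ollama pull deepseek-r1:8b

# Serve it locally; Ollama listens on port 11434 by default.
ollama serve

# With Tailscale running on the Mac and MagicDNS enabled, other devices
# on the tailnet can hit the API by the machine's Tailscale hostname:
#   curl http://my-mac.tailnet-name.ts.net:11434/api/generate \
#     -d '{"model": "deepseek-r1:8b", "prompt": "Hello"}'
```

The nice part of pairing this with Tailscale is that nothing is exposed to the public internet; the model is only reachable from your own devices.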
Why did I do this? Two reasons.
The first was because I could and there's something inherently cool and fun about running these models locally. It's a joy to play around with this stuff.
The second was a tinge of paranoia. What if I wasn't able to access the models I usually use from out of the country? LLMs are so useful for so many things, I really don't want to lose access now that I know about them. Yes, I could route all requests through my VPN, but … still, what if I couldn't?
So I can run models locally on my M1 Mac, and while it's not as fast as running them on Anthropic's or OpenAI's servers, it's still usable. Which is mind-blowing to me. I honestly never expected to see this tech in my lifetime. (Yes, LLMs get a lot wrong, but they also get so many things right and help me out with tedious coding chores.)
A week or so ago I was grousing to some friends that Apple needs to open things up on the Mac so other LLMs can step in where Siri is failing. In theory we (developers) could do this today, but I would love to see a blessed system where Apple provides APIs to other LLM providers.
Are there security concerns? Yes, of course there are, there always will be. But I would like the choice.
The crux of the issue in my mind is this: Apple has a lot of good ideas, but they don't have a monopoly on them. I would like some other folks to come in and try their ideas out. I would like things to advance at the pace of the industry, and not Apple's. Maybe with a blessed system in place, Apple could watch and see how people use LLMs and other generative models (instead of giving us Genmoji that look like something Fisher-Price would make). And maybe open up the existing Apple-only models to developers. There are locally installed image processing models that I would love to take advantage of in my apps.
I'm glad I'm not the only one thinking about this. Ben Thompson writes at the end of Apple AI’s Platform Pivot Potential:
This doesn’t necessarily preclude finally getting new Siri to work; the opportunity Apple is pursuing continues to make sense. At the same time, the implication of the company’s differentiation shifting to hardware is that the most important job for Apple’s software is to get out of the way;
This passage isn't the crux of the article, but it really resonated with me, and I hope it does with some folks inside Apple as well.
…
(Update) Manton Reece is thinking along the same lines in Apple's response to AI:
I’m not sure Apple knows what a big risk they are taking by letting OpenAI and others lap them in the AI race. It’s a risk that will pay off if they can execute. Just as likely, though, we are seeing such a disruption in computing that Apple is vulnerable for the first time in a decade.