I get what you mean, but there’s currently nothing planned or in the works to run local AI on phones, and they’re still way too demanding. They can’t even handle a 7B model if we’re talking about RAM usage alone.
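For a rough sense of why 7B is out of reach, here's a back-of-envelope sketch of the RAM needed just to hold the weights (illustrative arithmetic only; real usage adds KV cache and runtime overhead on top):

```python
def model_ram_gb(params_billion: float, bytes_per_param: float) -> float:
    # RAM in GiB needed to hold the raw weights of a model
    return params_billion * 1e9 * bytes_per_param / (1024 ** 3)

print(round(model_ram_gb(7, 2), 1))    # fp16 weights  -> 13.0 GiB
print(round(model_ram_gb(7, 0.5), 1))  # 4-bit quantized weights -> 3.3 GiB
```

Even aggressively quantized, that's a big chunk of a typical phone's 8 GB of shared RAM.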
Future-proofing, I guess, but I’m also sure there are things other than language models that can benefit from a dedicated processing unit: photo processing, smaller models, Lens, etc.
Actually, speech-to-text is done locally on newer Pixel devices. So are audio recognition, camera processing, and a lot more AI features.
AI isn’t just ChatGPT, and basically every phone released in the last four years has a dedicated AI chip.