This isn’t a joke, though it almost seems like one. It uses Llama 3.1, and supposedly the conversation data stays on the device and gets forgotten over time (through what the founder calls a rolling “context window”).
The implementation is interesting, and you can see the founder talking about earlier prototypes and project goals in interviews from several months ago.
iOS only, for now.
Edit: Apparently, you can build your own for around $50 that runs on ChatGPT instead of Llama. I’m sure you could also figure out how to switch it to the LLM of your choice.
LMFAO. The audacity of calling the token limit a “rolling context window” like it’s a desirable feature and not just a design aspect of every LLM…
Yeah that part tripped me up.
“Rolling context window”? You mean one of the universal properties of LLMs in their current state? The same one Google is flexing about in its latest AI endeavors, precisely because theirs is so big?
It’s hilarious to say that’s a privacy feature. It’s like calling amnesia a learning opportunity.
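For anyone unfamiliar with why this is funny: a "rolling context window" is just the standard practice of appending messages and evicting the oldest ones once the model's token budget is exceeded. A minimal sketch of the idea (the 8-token budget and whitespace "tokenizer" are illustrative assumptions, not how any real product works):

```python
from collections import deque

MAX_TOKENS = 8  # illustrative budget; real models allow thousands of tokens

def count_tokens(message: str) -> int:
    # Stand-in tokenizer: one token per whitespace-separated word.
    return len(message.split())

class RollingContext:
    """Keeps recent messages; older ones fall out -- the 'forgetting'."""

    def __init__(self, max_tokens: int = MAX_TOKENS):
        self.max_tokens = max_tokens
        self.messages: deque[str] = deque()

    def add(self, message: str) -> None:
        self.messages.append(message)
        # Evict oldest messages until the budget fits again.
        while sum(count_tokens(m) for m in self.messages) > self.max_tokens:
            self.messages.popleft()

    def prompt(self) -> str:
        return " ".join(self.messages)

ctx = RollingContext()
ctx.add("hello there friend")       # 3 tokens
ctx.add("how are you today")        # 4 tokens (total 7, fits)
ctx.add("tell me a story")          # 4 tokens (total 11, evict oldest)
print(ctx.prompt())                 # first message has been "forgotten"
```

Point being: every LLM chat app with a finite context does some version of this eviction, so presenting it as a deliberate privacy feature is marketing spin.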
These claims make me think this is worse than the Rabbit R1, or whatever it’s called. Although it’s very difficult to be worse, considering that CEO turned out to be a full-on crypto scammer.
Check the edit for instructions on how to build your own. The DIY project is even called “Friend,” so this “friend” is likely a modified version of it (running Llama instead of ChatGPT).
I would certainly feel better about it if I had full control over the encryption endpoints, at a minimum.