They mean the 16 Pro, right? Because in all likelihood, the 16 will use the A17 Pro or something equivalent.
I’m also not confident that a phone chip will be strong enough to run an LLM like ChatGPT within the next 5 years. I’m aware of LLMs that can run on phones right now, but they’re slow and incapable of tasks that require even a moderate level of reasoning. I’m not optimistic about the capabilities of these exclusive features, if they’re coming at all.
Why not just use ChatGPT in the browser?
Latency? Privacy? Integration with on-device data? Server costs?