Your phone already knows more about you than anyone else does. If your AI agent could access everything on your phone, it could tailor its responses to you. Think about how much YouTube, Netflix and Twitter know about your preferences and habits. Imagine integrating all that data into an AI agent. It would know you better than anyone.
By keeping everything local and using open-source software, EXO never handles your data - it stays on your device, it's owned by you, and you can delete it at any time. This avoids the regulatory hurdles of handling user data. For cloud platforms like OpenAI or Anthropic, attempting the same thing would be very difficult. Even Apple is having trouble building out its Apple Intelligence ecosystem. A cloud provider would need explicit permission from every service, and as a user you'd need to trust it with all of your data.
Local models don't need to match the performance of closed-source models. They just need to be close enough. A model that's slightly less powerful but tailored with all your data will perform better than a closed-source one that knows little about you. For example, imagine asking "What should I watch tonight?" - a generic model might suggest popular shows like Succession or Wednesday, but a personalized model that knows you've watched and liked every Korean drama on Netflix, regularly watch K-pop videos on YouTube, and follow Korean directors on X would immediately recommend relevant Korean content you haven't seen yet. This personalization makes the AI dramatically more useful, even if its raw capabilities are slightly lower. Fortunately, we're already there - open-source models from DeepSeek and Meta are close to the performance of the best closed-source models.
At WWDC 2024, Apple announced iPhone Mirroring. The basic idea is that your iPhone display appears in a window on your Mac, and you can use your trackpad and keyboard to control your iPhone.
We've integrated this directly into the EXO Desktop App to make it as frictionless as possible: it takes one click to give the EXO Desktop App control of your phone. It works by controlling the iPhone Mirroring instance on your Mac. A vision language model takes the phone screen as input and outputs commands to be executed on the phone. It's available today in the EXO Desktop App (early alpha - version 0.0.3-alpha) on the releases page of the EXO GitHub repo.
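The observe-act loop described above can be sketched as follows. EXO's actual command schema isn't spelled out here, so the `TAP`/`SWIPE`/`TYPE` vocabulary, the function names, and the `DONE` stop token are all illustrative assumptions, not the real implementation:

```python
from dataclasses import dataclass

@dataclass
class Action:
    """A structured command to execute on the mirrored iPhone window."""
    kind: str
    args: tuple

def parse_command(line: str) -> Action:
    """Parse one line of (assumed) VLM output into a structured action.

    Hypothetical command grammar: "TAP x y", "SWIPE x1 y1 x2 y2",
    or "TYPE some text".
    """
    parts = line.strip().split()
    kind = parts[0].upper()
    if kind == "TAP":
        x, y = map(int, parts[1:3])
        return Action("tap", (x, y))
    if kind == "SWIPE":
        x1, y1, x2, y2 = map(int, parts[1:5])
        return Action("swipe", (x1, y1, x2, y2))
    if kind == "TYPE":
        return Action("type", (" ".join(parts[1:]),))
    raise ValueError(f"unknown command: {line!r}")

def control_loop(model, capture_screen, execute, max_steps=10):
    """Repeat observe -> decide -> act until the model signals it is done.

    `model` takes a screenshot of the iPhone Mirroring window and returns
    one command line; `execute` performs the parsed action on the window.
    """
    for _ in range(max_steps):
        screenshot = capture_screen()   # pixels of the mirrored window
        reply = model(screenshot)       # VLM proposes the next command
        if reply.strip().upper() == "DONE":
            break
        execute(parse_command(reply))
```

With stub functions in place of the real model and screen capture, a run of `control_loop` simply parses each proposed command and hands it to the executor, which is what makes the one-click control hand-off possible.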
Getting started with personalization is simple - just click the iPhone icon in the EXO Desktop App. This starts an iPhone Mirroring session and takes control of the iPhone Mirroring window.
After enabling it, you'll be prompted on your iPhone to allow screen control. We've made this prompt prominent because this is still an experimental feature. The entire process takes about two minutes.
We've already built integrations with several popular apps, allowing EXO to access your YouTube watch history, Netflix viewing data, and X likes. This gives the AI valuable context about your interests and preferences while keeping your data private and secure on your own device.
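One way to picture "private and secure on your own device" is a simple on-device store that the user fully controls. EXO's actual storage format isn't documented here, so this SQLite sketch - the table layout, function names, and the idea of per-integration deletion - is an assumption for illustration only:

```python
import sqlite3

def open_store(path=":memory:"):
    """Open (or create) a local-only event store; nothing leaves the device."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS events (
        source TEXT,   -- hypothetical integration name, e.g. 'youtube'
        kind   TEXT,   -- e.g. 'watch', 'like'
        title  TEXT,
        ts     TEXT)""")
    return db

def record(db, source, kind, title, ts):
    """Append one event gathered by an integration."""
    db.execute("INSERT INTO events VALUES (?, ?, ?, ?)",
               (source, kind, title, ts))
    db.commit()

def delete_all(db, source):
    """User-initiated deletion: drop everything from one integration."""
    db.execute("DELETE FROM events WHERE source = ?", (source,))
    db.commit()
```

Because the store is an ordinary local file, "you can delete it any time" is literal: removing the file, or calling a deletion routine like the one above, erases the data with no third party involved.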
This approach points toward a future where AI can serve as a trusted extension of ourselves - an EXOcortex. For your EXOcortex to be maximally useful and trusted, it should know everything you know. For that, we need more integrations. We're asking the community which integrations you'd like to see next. This could be anything from your ChatGPT chat history to your photo album - we want your suggestions. Visit our Discord community to join the discussion, or make a suggestion by clicking the button below:
Request a New Data Integration