*By Carlo Versano*

Apple instructed Facebook to remove an app that lets users redirect their mobile data through a VPN managed by Facebook servers, saying the software violated new rules Apple put in place to limit the data developers can collect.

The iPhone maker's demand to remove Onavo Protect, an app ostensibly designed to protect user privacy, for tracking users too broadly is a blow to Facebook as the social media giant grapples with new controversies related to its ad model, privacy, and the distortion of the platform by bad actors. The story was first reported late Wednesday by the [Wall Street Journal](https://www.wsj.com/articles/facebook-to-remove-data-security-app-from-apple-store-1534975340).

Apple said in a statement that it "made it explicitly clear that apps should not collect information about which other apps are installed on a user's device for the purposes of analytics or advertising/marketing and must make it clear what user data will be collected and how it will be used." Facebook told the Journal, "We've always been clear when people download Onavo about the information that is collected and how it is used."

The company also removed another app, mostly out of use since 2012, that it said may have mishandled the personal data of about 4 million users. The "myPersonality" app is the second casualty of Facebook's app auditing process, which it instituted amid the fallout from Cambridge Analytica.

Meanwhile, Facebook's partnership lead Dan Rose, one of the company's first executives, [announced](https://www.facebook.com/drose/posts/10105190309509931) Wednesday that he is leaving the company. His departure comes after communications chief Elliot Schrage [vacated his post](https://variety.com/2018/digital/news/facebook-elliot-schrage-departure-1202846683/) in July after the Cambridge Analytica scandal, and chief security officer Alex Stamos [stepped down](https://www.businessinsider.com/alex-stamos-is-leaving-facebook-2018-3) at the start of this month.
