Google has released TensorFlow Privacy to the public. The open source tool is designed to help developers keep user data anonymous, even from the artificial intelligence (AI) models trained on it.
TensorFlow Privacy is based on the theory of differential privacy. The approach is similar to the one used in Gmail's Smart Compose, which predicts what you're going to type next by learning from other people's emails while preventing sensitive data from being exposed.
To clarify further, differential privacy is a statistical technique that aims to maximise the accuracy of queries from statistical databases while measuring, and limiting, the privacy impact on individuals whose information is in the database. In other words, the AI in TensorFlow Privacy does not memorise personal data but instead learns from collective patterns, so sensitive data does not show up in a stranger's email.
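To give a sense of the idea, here is a minimal sketch of the classic Laplace mechanism, the textbook example of differential privacy. This is plain illustrative Python, not TensorFlow Privacy's actual implementation (which applies noise during model training); the function names `laplace_noise` and `private_count` are made up for this example.

```python
import math
import random

def laplace_noise(scale):
    # Sample from a Laplace(0, scale) distribution via the
    # inverse CDF of a uniform draw on (-0.5, 0.5).
    u = random.random() - 0.5
    sign = -1 if u < 0 else 1
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon=1.0):
    # A counting query has sensitivity 1: adding or removing one
    # person's record changes the count by at most 1. Adding Laplace
    # noise with scale 1/epsilon therefore gives epsilon-differential
    # privacy: the noisy answer barely depends on any one individual.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Example: how many users are over 30? The true answer is 4, but the
# released value is perturbed, so no single person's presence can be
# inferred from it.
ages = [23, 35, 41, 29, 52, 38]
noisy_answer = private_count(ages, lambda a: a > 30, epsilon=0.5)
```

A smaller `epsilon` means more noise and stronger privacy; TensorFlow Privacy applies the same trade-off to the gradients of a machine learning model rather than to a simple count.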
Google is hoping developers will use TensorFlow Privacy in other machine learning tools. To that end, the company has also made it easy to use in order to speed up the adoption rate.
The tool is currently available on GitHub. If you'd like to learn more about TensorFlow Privacy, Google has also published a technical paper on the subject, which explains the workflow in greater detail.
(Source: Engadget, Google AI, Medium // Image source: Medium)