Privacy
**Privacy** practices in Responsible AI involve considering the potential implications of using sensitive data. This includes not only respecting legal and regulatory requirements, but also considering social norms and typical individual expectations. For example, what safeguards need to be put in place to ensure the privacy of individuals, considering that ML models may remember or reveal aspects of the data they have been exposed to? What steps are needed to ensure users have adequate transparency and control over their data?
Learn more about ML privacy through PAIR Explorables' interactive walkthroughs:

- [How randomized response can help collect sensitive information responsibly](https://pair.withgoogle.com/explorables/anonymization/)
- [How Federated Learning Protects Privacy](https://pair.withgoogle.com/explorables/federated-learning/)
- [Why Some Models Leak Data](https://pair.withgoogle.com/explorables/data-leak/)
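Randomized response, one classic technique for collecting sensitive data with plausible deniability, is simple enough to sketch in a few lines. The snippet below is an illustrative simulation only (the function names are our own, not from any library): each respondent tells the truth with probability `p_truth` and otherwise answers with a fresh coin flip, and the aggregate true rate can still be recovered from the noisy responses.

```python
import random


def randomized_response(true_answer: bool, p_truth: float = 0.5) -> bool:
    """Answer truthfully with probability p_truth; otherwise flip a fair coin.

    No single response reveals the respondent's true answer, since any
    "yes" could have come from the coin flip.
    """
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5


def estimate_true_rate(responses, p_truth: float = 0.5) -> float:
    """Recover the population's true "yes" rate from noisy responses.

    Observed rate = p_truth * true_rate + (1 - p_truth) * 0.5,
    so we invert that linear relationship.
    """
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_truth) * 0.5) / p_truth


if __name__ == "__main__":
    random.seed(0)
    true_rate = 0.3
    answers = [random.random() < true_rate for _ in range(100_000)]
    responses = [randomized_response(a) for a in answers]
    print(f"estimated true rate: {estimate_true_rate(responses):.3f}")
```

Individual answers are noisy by design, but with enough respondents the aggregate estimate converges on the true rate, which is the core trade-off the PAIR walkthrough on anonymization explores interactively.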
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated (UTC): 2025-07-27.