There are several different ways to quantify privacy. Entropy has been particularly prevalent in discussions around web-based fingerprinting. Other measurement techniques, such as k-anonymity, differential privacy, and probably others, may be better or worse suited for evaluating a given API's impact on user privacy. I think we should add a section discussing the merits of each and when each is applicable to a given issue.
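For concreteness, here is a minimal sketch of how two of these measures could be computed for a single fingerprinting attribute. The helper names and the sample data are purely illustrative, not part of any concrete proposal:

```python
import math
from collections import Counter

def entropy_bits(values):
    """Shannon entropy (in bits) of an observed attribute distribution.

    Higher entropy means the attribute exposes more identifying
    information, which is the usual framing in fingerprinting analyses.
    """
    counts = Counter(values)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def k_anonymity(values):
    """Smallest group size among users sharing the same attribute value.

    A reported value is k-anonymous if at least k users share it; the
    minimum over all values is the guarantee the attribute as a whole offers.
    """
    counts = Counter(values)
    return min(counts.values())

# Hypothetical sample: screen widths reported by a small user population.
observed = [1920, 1920, 1366, 1920, 1366, 2560, 1440, 1920]
print(f"entropy: {entropy_bits(observed):.2f} bits")
print(f"k-anonymity: k = {k_anonymity(observed)}")
```

The two measures answer different questions (average information revealed vs. worst-case group size), which is part of why a section comparing them would be useful.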
@bslassey This is a great idea. I had suggested in the first meeting that we highlight the severity of the respective risks so that we know the level of privacy exposure. However, I noticed the team did a great job on the Self-Review Questionnaire, which helps assess that exposure: https://www.w3.org/TR/security-privacy-questionnaire/#threats. A section on tested techniques would be a good add-on to help users get more out of the threat model.