### Table 1. Privacy and accuracy factors

"... In PAGE 17: ... We applied uniform distributions with the range [50%, 100%] for the initial distortion and generated 200 range-sum queries with sizes ranging from (50, 100) to (1,000, 2,000). The experimental results are shown in Table 1. In the table, the average privacy-factor and accuracy-factor values over the 200 queries show that the zero-sum method yields better accuracy on data sets with adjusted distortion, as well as satisfactory privacy.... ..."
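The setup described in this excerpt can be sketched roughly as follows; the distortion range, data values, and the relative-error "accuracy factor" below are illustrative assumptions, not the paper's exact definitions.

```python
import random

# Hypothetical sketch: distort each value by a uniform factor drawn from
# [50%, 100%], then measure how far a range-sum query on the distorted data
# drifts from the true answer. All names and formulas are illustrative.
def distort(data, low=0.5, high=1.0, seed=0):
    rng = random.Random(seed)
    return [v * rng.uniform(low, high) for v in data]

def range_sum(data, lo, hi):
    return sum(data[lo:hi])

data = [float(i % 97) for i in range(2000)]
noisy = distort(data)

true_ans = range_sum(data, 50, 100)       # a small query of size 50
noisy_ans = range_sum(noisy, 50, 100)
rel_error = abs(true_ans - noisy_ans) / true_ans  # crude "accuracy factor"
```

Since every value is scaled by a factor in [0.5, 1.0], the relative error of any range sum stays within 50% in this toy model; averaging it over many queries of varying size mirrors the experiment the excerpt describes.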

### Table 2: Results of privacy provided by the GDTMs

2003

"... In PAGE 12: ... Clearly, the above measure to quantify privacy is based on how closely the original values of a modified attribute can be estimated. Table 2 shows the privacy provided by our GDTMs, where for each ordered pair [α1, α2], α1 represents the privacy level for the attribute age, and α2 represents the privacy level for the attribute salary. These values are expressed as percentages.... In PAGE 12: ...Table 2: Results of privacy provided by the GDTMs Based on the results shown in Table 2, one may claim that our GDTMs could be restrictive in terms of privacy. Indeed, TDP may sometimes be restrictive, since the variance of a single attribute always yields a 0% privacy level, even though the individual data records look very different from the original ones.... ..."

Cited by 24
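The privacy notion in this excerpt, estimation closeness, can be sketched with a toy measure; this is not the paper's exact GDTM formula, and the attribute values below are invented for illustration.

```python
import statistics

# Illustrative sketch: quantify the privacy of a modified attribute as the
# average relative distance between the original values and an adversary's
# best point estimate reconstructed from the modified data.
def privacy_level(original, modified):
    # Naive adversary: takes the modified value itself as the estimate.
    errs = [abs(o - m) / abs(o) for o, m in zip(original, modified) if o]
    return 100.0 * statistics.mean(errs)  # expressed as a percentage

age_orig = [25, 31, 42, 58, 63]
age_mod  = [27, 30, 45, 55, 66]   # hypothetical distorted ages
level = privacy_level(age_orig, age_mod)
```

Note how this measure behaves the way the excerpt warns: if a transformation lets the adversary recover the original values exactly (as when a preserved statistic pins them down), every error is zero and the privacy level collapses to 0%, even if the records superficially look different.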

### Table 1: Definitions of anonymity and privacy based on opaqueness.

2004

"... In PAGE 14: ... The first step is to represent the system in question as a set of configurations parameterized by the relevant attributes. Given an intuitive notion of a security property, we specify the property in terms of attribute opaqueness (see Table 1 in the case study for examples). The second step is to define an observational equivalence relation ≈ on configurations: C ≈ C′ if and only if an observer cannot distinguish between C and C′.... In PAGE 27: ...5, our technique is still applicable if the chosen process specification formalism provides a suitable observational equivalence relation that abstracts away from probabilities. We give brief informal descriptions of several common anonymity properties with references to relevant protocols in Table 1. We also show how they correspond to atomic properties (or combinations thereof) from the taxonomy of section 4.... In PAGE 28: ...have to do with hiding information about the endpoints of conversations (i.e., protecting their identities), privacy has to do with hiding information about the message attribute. In Table 1, privacy is formally defined as the requirement that this attribute is 2-value opaque. The corresponding... ..."

Cited by 22
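The k-value opaqueness requirement mentioned above can be sketched as a simple check; the configurations, attribute names, and observation function below are toy assumptions, not the paper's formal process-calculus machinery.

```python
from collections import defaultdict

# Hedged sketch of k-value opaqueness: an attribute is k-value opaque if,
# for every observable behaviour, at least k distinct attribute values are
# consistent with that observation (the observer cannot narrow it further).
def is_k_value_opaque(configs, attr, obs, k):
    # configs: list of dicts; attr: attribute name; obs: function mapping a
    # configuration to what the observer sees (its observational class).
    values_per_observation = defaultdict(set)
    for c in configs:
        values_per_observation[obs(c)].add(c[attr])
    return all(len(vals) >= k for vals in values_per_observation.values())

configs = [
    {"sender": "A", "payload": 0, "trace": "t1"},
    {"sender": "B", "payload": 0, "trace": "t1"},
    {"sender": "A", "payload": 1, "trace": "t2"},
    {"sender": "B", "payload": 1, "trace": "t2"},
]
obs = lambda c: c["trace"]  # the observer sees only the trace
```

Here `sender` is 2-value opaque (each trace is consistent with both A and B), while `payload` is not: each trace pins the payload down to a single value, so an observer learns it outright.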

### Table 1. Privacy of constraints measured by the minimum (min), median (med) and average (avg) of the numbers of consistent constraint matrices, averaged over 10 instances.

"... In PAGE 13: ... Similar results appear for communication costs. Table 1 reports the min, med, and avg parameters that measure the privacy of constraints. Larger values mean higher privacy.... ..."
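The aggregation this caption describes is straightforward to sketch; the per-instance counts below are invented, standing in for the number of constraint matrices still consistent with what an agent observed.

```python
import statistics

# Sketch of the summary statistics in the caption: for each of 10 problem
# instances, assume a count of constraint matrices consistent with the
# agent's observations; more candidate matrices means more privacy.
consistent_counts = [12, 40, 7, 55, 23, 31, 9, 18, 44, 27]  # hypothetical

privacy_min = min(consistent_counts)
privacy_med = statistics.median(consistent_counts)
privacy_avg = statistics.mean(consistent_counts)
```

Reporting min alongside med and avg matters: the minimum captures the worst-case instance, where the observer has narrowed the constraints the most, while the median and average describe typical protection.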

### Table 10: Comparison within each condition for the proportion of products purchased from sites annotated with privacy icons. The high p-value indicates that the null hypothesis cannot be rejected.

2007

"... In PAGE 20: ... Participants within each condition did not purchase from a significantly greater number of sites with better privacy policies when purchasing the vibrator, as compared to the batteries. These proportions are detailed in Table 10. Instead, Figure 5 indicates that there are larger clusters of purchases made at the high-privacy sites for both batteries and vibrators.... ..."

Cited by 1
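The kind of comparison the excerpt reports, whether the share of high-privacy purchases differs between the two products, can be sketched as a two-proportion z-test; the purchase counts below are invented for illustration and are not the study's data.

```python
import math

# Hedged sketch: two-proportion z-test on the share of purchases made at
# privacy-icon-annotated sites for two products. A high p-value means the
# null hypothesis of equal proportions cannot be rejected.
def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                     # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z(14, 24, 12, 24)  # e.g. 14/24 vs 12/24 high-privacy buys
```

With counts this close, the test yields a large p-value, matching the pattern in the excerpt: no significant difference in high-privacy purchasing between the two product conditions.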

### Table 2 (Absence of) Dissemination Impact on Privacy Concern

"... In PAGE 5: ... We find no significant effect on privacy concerns across any of the six categories. Table 2 shows the difference between means of privacy concerns for each condition. The control condition comes close to significance with a p-value of .... ..."

### Table 5. Effect of Expectations of Privacy on Workplace Use of Email (H2 and H2a)

WORKUSE regression (whole sample): F, Beta, t-value, p-value.

2002

Cited by 2