Results 1 - 3 of 3
Read What You Trust: An Open Wiki Model Enhanced by Social Context
"... Abstract—Wiki systems, such as Wikipedia, provide a multitude of opportunities for large-scale online knowledge collaboration. Despite Wikipedia’s successes with the open editing model, dissenting voices give rise to unreliable content due to conflicts amongst contributors. From our perspective, the ..."
Abstract
-
Cited by 4 (1 self)
- Add to MetaCart
(Show Context)
Abstract—Wiki systems, such as Wikipedia, provide a multitude of opportunities for large-scale online knowledge collaboration. Despite Wikipedia’s successes with the open editing model, dissenting voices give rise to unreliable content due to conflicts amongst contributors. From our perspective, the conflict issue results from presenting the same knowledge to all readers, without regard for the importance of the underlying social context, which both reveals the bias of contributors and influences the knowledge perception of readers. Motivated by the insufficiency of the existing knowledge presentation model for Wiki systems, this paper presents TrustWiki, a new Wiki model that leverages social context, including social background and relationship information, to present readers with personalized and credible knowledge. Our experiment shows that, with reliable social context information, TrustWiki can efficiently assign readers to their compatible editor community and present credible knowledge derived from that community. Although this new Wiki model focuses on reinforcing the neutrality policy of Wikipedia, it also casts light on other content reliability problems in Wiki systems, such as vandalism and minority opinion suppression.
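The abstract gives no implementation details, but the community-assignment step it describes can be pictured with a minimal sketch: represent each reader and each editor community by a social-context feature vector and assign the reader to the most similar community. All names, vectors, and the similarity measure below are hypothetical illustrations, not taken from the paper.

```python
# Hypothetical sketch (not from the paper): assigning a reader to the most
# compatible editor community by cosine similarity over social-context
# feature vectors (e.g., encoded background and relationship attributes).
from math import sqrt


def cosine(a, b):
    """Cosine similarity of two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


def assign_community(reader_vec, community_centroids):
    """Return the editor community whose centroid best matches the reader."""
    return max(community_centroids,
               key=lambda name: cosine(reader_vec, community_centroids[name]))


# Toy data: two editor communities with different social-context centroids.
centroids = {
    "community_A": [0.9, 0.1, 0.4],
    "community_B": [0.2, 0.8, 0.5],
}
print(assign_community([0.85, 0.2, 0.3], centroids))  # -> community_A
```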
Design and Implementation of FAITH, an Experimental System to Intercept and Manipulate Online Social Informatics
"... Abstract — Social informatics is the core of Facebook’s business and is its most valuable asset which consists of the social graph and the private data of over 500 million users. However, without secure methods of managing this data, Facebook has become vulnerable to privacy risks and devaluation. I ..."
Abstract
-
Cited by 4 (0 self)
- Add to MetaCart
(Show Context)
Abstract—Social informatics is the core of Facebook’s business and its most valuable asset, consisting of the social graph and the private data of over 500 million users. However, without secure methods of managing this data, Facebook has become vulnerable to privacy risks and devaluation. In Facebook’s model, users are asked upon access to grant applications the permissions they request, without sufficient knowledge of the applications’ intentions. As a result, if they are deceived, users risk the exposure of sensitive and personal data. This paper presents a system dubbed FAITH (Facebook Applications: Identification, Transformation & Hypervisor) to mitigate or eliminate these issues by enhancing the management of social data. First, FAITH allows users to adjust the visibility of their social informatics for each individual application, depending on how much they trust the application. Users can configure FAITH to let untrusted applications run with the least privileges (the least amount of social informatics) to minimize potential privacy leaks. Second, FAITH logs the activities of applications to assist users in making more secure decisions. Users can closely monitor each activity performed by applications and adjust their privacy settings accordingly. Third, FAITH allows users to transform their social graph so that different applications see different social graphs, preventing the friendship inflation caused by applications. The implementation of FAITH needs only the resources and tools that Facebook makes available to the public and requires no further cooperation from the social network. FAITH is a prototype system: its design and concept can be extended to secure other OSNs (Online Social Networks). Currently, FAITH contains thirteen Facebook social applications and has been officially released for public use, with approximately two hundred monthly active users at the time of writing. Index Terms—Facebook; social network sites; privacy protection; Facebook applications
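FAITH’s per-application visibility control amounts to a least-privilege filter over the user’s social data, paired with an activity log. The sketch below is a hypothetical illustration of that idea only; the grant table, field names, and functions are invented here and are not FAITH’s actual code or API.

```python
# Hypothetical sketch (not FAITH's actual code or API): a least-privilege
# visibility filter that exposes to each application only the social data
# fields the user has explicitly granted it, and logs every access.

FULL_PROFILE = {
    "name": "Alice",
    "birthday": "1990-01-01",
    "friends": ["Bob", "Carol"],
    "photos": ["p1.jpg", "p2.jpg"],
}

# Per-application grants as a user might configure them (illustrative only).
APP_GRANTS = {
    "trusted_calendar_app": {"name", "birthday", "friends"},
    "unknown_quiz_app": {"name"},  # untrusted apps get the least privilege
}

ACTIVITY_LOG = []  # application activity, modelled here as a simple list


def visible_profile(app_id, profile=FULL_PROFILE, grants=APP_GRANTS):
    """Return only the profile fields granted to this application; log the access."""
    allowed = grants.get(app_id, set())  # unknown applications see nothing
    ACTIVITY_LOG.append((app_id, sorted(allowed)))
    return {k: v for k, v in profile.items() if k in allowed}


print(visible_profile("unknown_quiz_app"))   # -> {'name': 'Alice'}
print(ACTIVITY_LOG)                          # -> [('unknown_quiz_app', ['name'])]
```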
SmartWiki: A Reliable and Conflict-Refrained Wiki Model Based on Reader Differentiation and Social Context Analysis
2013
"... Wiki systems, such as Wikipedia, provide a multitude of opportunities for large-scale online knowledge collaboration. Despite Wikipedia’s successes with the open editing model, dissenting voices give rise to unreliable content due to conflicts amongst contributors. Frequently modified controversial ..."
Abstract
- Add to MetaCart
(Show Context)
Wiki systems, such as Wikipedia, provide a multitude of opportunities for large-scale online knowledge collaboration. Despite Wikipedia’s successes with the open editing model, dissenting voices give rise to unreliable content due to conflicts amongst contributors. Frequent modification of controversial articles by disagreeing editors leads to inconsistent information being incorporated into knowledge bases. To address this well-known issue, Wikipedia administrators are able to intervene and lock overheated controversial articles; however, in doing so, they introduce their own biases. These actions in turn undermine both the desirable neutrality and freedom policies of Wikipedia. In this paper we present an open Wiki model, called SmartWiki, which brings readers closer to reliable information while still allowing all editors to contribute freely. From this perspective, the conflict issue results from presenting knowledge in an identical way to all readers, without regard for differences in their knowledge-seeking motivations and social context. This in turn negatively impacts the knowledge perception of readers. To address this, SmartWiki considers two types of readers: “value adherents,” who prefer simpler and more compatible viewpoints, and “truth diggers,” who crave deeper understanding of a topic, including both sides of controversies. Social context, in the form of reader and contributor social background and relationship information, is then embedded in both knowledge representations to present …
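As a rough illustration of the reader differentiation this abstract describes, the hypothetical sketch below selects which article versions to present: a “value adherent” sees only the version from their most compatible community, while a “truth digger” sees all competing versions. The reader-type labels are taken from the abstract, but the data structures and function are assumptions made for illustration.

```python
# Hypothetical sketch (structure and names invented here, not taken from the
# paper): choosing which article version(s) to show based on reader type.

def present_article(reader_type, reader_community, revisions):
    """revisions maps an editor community name to that community's article version."""
    if reader_type == "value_adherent":
        # Show only the version from the reader's most compatible community.
        return {reader_community: revisions[reader_community]}
    if reader_type == "truth_digger":
        # Show every community's version so both sides of a controversy appear.
        return dict(revisions)
    raise ValueError(f"unknown reader type: {reader_type}")


revisions = {
    "community_A": "Article text reflecting viewpoint A ...",
    "community_B": "Article text reflecting viewpoint B ...",
}
print(present_article("value_adherent", "community_A", revisions))
print(present_article("truth_digger", "community_A", revisions))
```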