Constructs of Privacy Online

Online privacy is generally surrounded by doubt and scepticism: putting trust in a machine and allowing details of your private life to be absorbed is a daunting thought. Who can see it? How safe is it? Which sites can be trusted? These are questions most of us ask. When we are online we want to feel secure, yet there are many threats that can affect our online behaviour, making us cautious.

Dourish and Anderson suggest that privacy online can be determined by social and cultural circumstances, including group formation, group identity, group membership, and the behaviours deemed acceptable within the group. They defined privacy as a social practice encompassing aspects such as solitude, confidentiality, and autonomy.

The researchers challenged existing models of privacy and security with regard to Human-Computer Interaction concerns, citing areas where prior research has shown the failings of technological and design solutions in addressing usability needs and anticipating the real-world practices, preferences, and behaviours of end users.

Dourish and Anderson outlined three theoretical approaches to understanding privacy and security that have failed to account for users' actual information habits and practices.

  • First, they considered the existing model of privacy as an economic rationality, where the risk and reward of exchanging information takes on a value that can be prioritised (e.g., the benefit of online purchasing versus the costs of disclosing financial information) and the user is positioned as an entity able to calculate the impact of such a risk.
  • Second, they presented the model of privacy as a practical action, whereby security measures are considered in the context of everyday life and how people achieve privacy in real settings. Viewing privacy and security as a practice in action makes them ongoing measures rather than static or fixed ideals. It also forces considerations of privacy and security beyond technical and computer systems and toward considerations of the context: that is, the people, organisations, entities, or even the physical space involved.
  • Third, the researchers presented the model of privacy as a discursive practice, where the use of language constitutes how privacy and security are characterised. Depending on the social or cultural circumstances, the language used to describe a risk will take on a clear perspective of whether the action is acceptable and secure, or unacceptable and insecure (e.g., co-workers choosing to share passwords as a display of teamwork, contradicting company policy).
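The first model above, economic rationality, is often formalised as an expected-utility calculation. The following is a minimal, hypothetical sketch of that calculus; the function name and example values are illustrative, not taken from the paper:

```python
def disclosure_payoff(benefit, breach_cost, breach_probability):
    """Expected net value of disclosing information under the
    economic-rationality model: benefit minus expected breach cost."""
    return benefit - breach_cost * breach_probability

# An online purchase: convenience worth 10, a potential breach costing 500,
# with an estimated 1% chance of occurring.
payoff = disclosure_payoff(benefit=10, breach_cost=500, breach_probability=0.01)
# payoff = 10 - 500 * 0.01 = 5.0, so a purely "rational" user discloses.
```

Dourish and Anderson's critique is precisely that real users rarely reason this way: the inputs to such a calculation are unknowable in practice, and disclosure decisions are shaped socially rather than computed individually.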

For the most part, the first model, privacy as economic rationality, has dominated information system design. However, Dourish and Anderson (2006) reframe privacy and security in line with the latter two models, which are more inclusive of social aspects, and position them as a collective rather than an individual practice. In terms of how groups or cultures interpret risk, the researchers focus on risk as an aspect of danger assessment, or the moral value shared by the collective.

This perspective reinforces the requirement for information technology systems to acknowledge that individuals may be part of a collective, and consequently to support them, where necessary, both as individuals and as a collective.

Another aspect of collective practice is the dynamic relationships within the collective itself, and how secrecy and trust are expressed and group identity is formed. The social practices of users (how they manage, share, or withhold information) are positioned as markers of group membership, which in turn dictate trust and information sharing.

The researchers then argue that information system design must recognise the need to selectively share information and should negotiate a continuum of public versus private boundaries rather than giving information an inherent value of one over the other.
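One way to read this recommendation in code is that information should carry a graduated audience scope that users negotiate per item, rather than a binary public/private flag. This is a hypothetical sketch; the class and level names are assumptions for illustration, not an API from the paper:

```python
from enum import IntEnum

class Audience(IntEnum):
    """Graduated sharing scopes along a public/private continuum,
    rather than a binary public/private flag."""
    ONLY_ME = 0
    CLOSE_FRIENDS = 1
    GROUP = 2
    PUBLIC = 3

def visible_to(item_scope: Audience, viewer_relationship: Audience) -> bool:
    """A viewer sees an item if their relationship to the owner is at
    least as close as the item's scope requires (lower = closer)."""
    return viewer_relationship <= item_scope

# A photo shared with the group is visible to a close friend,
# but an item scoped to close friends is hidden from the public.
visible_to(Audience.GROUP, Audience.CLOSE_FRIENDS)   # True
visible_to(Audience.CLOSE_FRIENDS, Audience.PUBLIC)  # False
```

The design choice the sketch illustrates is that "public" and "private" become endpoints of a negotiable range, with membership boundaries in between, rather than inherent values of the information itself.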

Finally, Dourish and Anderson (2006) presented their system design solutions. These include visual feedback on system performance, so that users are aware of the potential consequences of their actions. They also suggest integrating a system's configuration panels (i.e., the visual control panels that manage privacy settings) with the relevant user actions, so that users understand the constraints their security preferences place on the system's performance.
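As a rough illustration of that second suggestion, a settings change might immediately report its consequences back to the user instead of being applied silently. This is a hypothetical sketch under the assumptions above; the class and message wording are mine, not the authors':

```python
class PrivacySettings:
    """Hypothetical settings panel that couples each configuration
    change to visible feedback about its effect on the system."""

    def __init__(self):
        self.share_location = False

    def set_share_location(self, enabled: bool) -> str:
        self.share_location = enabled
        # Feedback ties the setting to its consequence, so the user
        # sees the constraint their preference places on functionality.
        if enabled:
            return "Location sharing ON: nearby-friends features enabled."
        return "Location sharing OFF: nearby-friends features unavailable."

panel = PrivacySettings()
print(panel.set_share_location(False))
```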

Dourish, P. & Anderson, K. (2006). Collective Information Practice: Exploring Privacy and Security as Social and Cultural Phenomena. Human-Computer Interaction, 21(3), 319-342.

http://www.dourish.com/publications/2006/DourishAnderson-InfoPractices-HCIJ.pdf



Andrea M Lewis
Andrea Lewis is a psychologist and Managing Director of Ad Hoc Global Ltd. With a foundation in equity research, she has been leading product development in technology and digital media for over 12 years and leads UX due diligence assessments, research, and strategy.