The current context of surveillance: An overview of some emerging phenomena and policies
This article takes a sociological approach to identifying a number of phenomena and processes which, it is posited, profoundly affect the dynamics of contemporary surveillance. Further, an overarching framework for these phenomena and processes is proposed, to provide a basis for the development of relevant policies in this field. Special reference is made to the increasing significance of human agency in contemporary societies, the socialisation of technological innovation processes, and collective technological responsibility. The notion of context of meaning is used to analyze the relationship between human agency, security technologies, surveillance systems and privacy, in terms of the dynamic relationship between “dangers - social regimes - social risks”.
1. Human agency, security, surveillance and privacy
This article aims to provide insights into some aspects of the new framework of surveillance. It is not an attempt to provide a further contribution to the theory of surveillance itself, which has already been the subject of important and systematic theoretical analyses and treatises (Lyon, 2007; Ball, Haggerty and Lyon, 2012), but rather, to identify, within a sociological approach, certain phenomena and processes which profoundly affect the dynamics of contemporary surveillance, and to propose an overarching framework for these phenomena and processes, which can serve as a basis for the development of policies in this field. In conjunction with this, we will consider surveillance as a specific site for the manifestation of broader change processes.
Special reference is made to the increasing significance of human agency in contemporary societies and to the management of technological innovation processes. An initial question is that concerning the management of the relationship between the dynamics of surveillance and the needs of citizens, so as to encompass both their demand for expression, communication and movement as well as their need for privacy.
Within the framework of the project 1, based on an analysis of papers and academic research in the field of surveillance - and, more generally, in the fields of technological innovation and governance in contemporary societies - specific attention has been paid to the identification of significant processes and phenomena, with particular regard to the situation within advanced Western societies.
It is important, at the outset, to acknowledge the conventionality and the limitations of the proposed approach. The aim is not to explain the "totality" of social phenomena related to surveillance, but rather to shed light on particular (inter-related) aspects of the ways in which a person, in the context of current ICT, may be subjected to threats to his/her identity and rights, in order to generate proposals for both research and action. Two perspectives need to be considered: the promotion of an innovation process able to produce an effective synthesis between technology and society; and the protection of citizens from the imbalances usually prompted by innovation processes.
A fundamental set of phenomena and processes identified within the field of surveillance concerns the significance of human agency in contemporary societies. It is useful to emphasize here that, as shown by a wide array of authors who have examined in depth the turn from modernity to late modernity (Giddens, 1991; Beck, 1992; Bauman, 2000), the dynamics of mass society have produced a highly diversified society - or plurality - in terms of behaviours, lifestyles, identities and capacities, instead of producing standardization and massification.
A rapid weakening of the capacity of social structures to impose themselves on individuals can be observed. In turn, due to factors such as increasing education, broadening access to rights, and the escalating diffusion of powerful technologies at affordable costs, individuals increasingly show stronger cognitive and interpretative capacities, a sharper tendency to be independent in their choices, a greater propensity to express their own orientations and personal attitudes autonomously, and greater agency and power in general (Quaranta, 1986).
Technologies - especially those related to transport, knowledge production and circulation (particularly the web) - have multiplied the effects of this process. This tendency has manifested itself in various ways: not least, the increased desire of individuals to satisfy ever increasing expectations more independently and freely than in the past. Examples include: the desire to consume a large array of available goods; enjoyment of aesthetic pleasures; use of games and access to other entertainment opportunities; access to banking services; use of communication technologies (from mobile phones to computers); increasing mobility (for example, domestic and international tourist flows) (Zavrsnik, 2010; Sheller, Urry, 2004; Lyon, 2003a).
It seems appropriate, within this framework, to speak of an increasing “human subjectivity” at mass level, reflecting a large-scale increase in the importance, complexity and density of the cognitive, intellectual and emotional dimensions of individuals (LSC, 2010). It may be said that new forms of human agency are emerging and producing a “surplus” of human energy (or human vitality and dynamism) at various levels. Individuals are more “capable” today than a few decades ago; their action field is broader, and they are able to act beyond territorial boundaries and to overcome everyday life constraints.
A typical phenomenon in this respect is that of “self-disclosure”: the propensity to give information about oneself and one's feelings (Lasch, 1979; Ben-Ze’ev, 2004; Koskela, 2004), which is expressed through popular social networks. This surplus of human energy is also reflected in the availability of more opportunities for people (e-commerce, e-banking, tourism, travel, etc.) (Zavrsnik, 2010; Sheller and Urry, 2004; Lyon, 2003a). These opportunities can also be used to commit crimes, in the forms of cyber-crime or e-crime, and, in general, allow criminals and terrorists access to increasingly sophisticated and potentially dangerous material and technological devices (Beck, 2002; Ball and Webster, 2003; Wall, 2007; Bigo, 2004; Stenson, 2008; Metcalf, 2007).
In short, this energy creates both new opportunities and new problems. This broader and constantly evolving field of action has great etiological importance in broadening and shaping the arena of social control and surveillance. In fact, traveling, access to communication technologies, the desire to communicate and express oneself, multiplied on a large scale, increase the likelihood that people "get into trouble" and suffer abuse of their physical integrity or personal identity (especially involving personal data on the web and electronically).
As a result of this increased human subjectivity, a new demand for privacy emerges amongst those who act and expose themselves on the social scene more than others (Strickland and Hunt, 2005; Cannataci, 2009 and 2010). Obviously, an increase in security demands as well as in security services and technologies also emerges. Demands for security are grounded in “objective facts” (terrorism, identity theft, job loss, illegal immigration, etc.), but they also show a strong, and perhaps dominant, cognitive component (related to information, expectations, fears, symbols, myths, and social representations) which is managed by the many actors who exercise influence and power in the field of surveillance. Therefore, these demands are partly artificially induced. Feelings of insecurity are seized upon by politicians, sensationalised by the media, and exploited to the full by private firms (Monahan, 2010). It is within this framework that surveillance strategies and policies are developed with the aim of preventing crime and protecting citizens.
2. A dynamic context of meaning: dangers, social regimes, risks
Establishing links between these phenomena gives rise to a specific context of meaning, that is, a particular social situation where human interaction is governed by specific rules of differing types, which call for identification (Searle, 1969). This notion of context of meaning, following an abductive approach, enables a coherent means of understanding the mutual connections between ostensibly disparate phenomena (Peirce, 1931-1935).
In this case, the context of meaning comprises the protection of the human person within a field of information and communication technologies and with respect to the risks associated with the continuing development of human subjectivity. This social situation, for the purposes of this proposal, is shaped and regulated by four constitutive phenomena and their relationships:
a. an increase in human subjectivity, i.e. an increase in the surplus of human agency/energy/vitality/dynamism at various levels, leading to greater activity and social exposure;
b. privacy, i.e. a form of defence of the self, at various levels: from the ethological level, linked to the defence of one’s own personal territory, to the psychological and gradually to the legal;
c. security, i.e. a set of services and technologies for protection from threats to communities, people and things;
d. surveillance, i.e. an institutional / authoritative or economic project or programme aimed at monitoring and, simultaneously, managing human groups in order to protect and/or influence them.
In response to the question of how this context works, an examination of these phenomena leads us to their interconnection through the dynamics of social risks. It is important to clarify the nature of the concepts of danger, social regime and risk (d’Andrea and Quaranta, 1996). Dangers can be defined as events or processes that are potentially out of control and threaten individuals, communities and/or social groups. Social regimes can be defined as the set of laws, institutions, policies and regulatory activities that, as a whole, increase the capacity to handle various dangers. Due to the effects of functioning social regimes, dangers can become risks: that is, once they are identified, known and made objects of systematic action, dangers become socially managed and controlled.
The place of fire in human civilization illustrates this notion well. At the dawn of humanity, fire was largely pure ‘danger’: as an uncontrolled and unmanaged phenomenon, which could cause further fires and other types of threat. Gradually, fire has been domesticated, and the importance of the discovery and management of fire in human evolution is accepted (Goudsblom, 1987). Therefore, within the course of time, fire becomes a more controlled risk, through social regimes including technologies, specialised bodies (firemen), insurance, information, education, behavioral rules, etc.
Within the terms of this research, it is possible to identify numerous dangers requiring the protection of citizens. There are everyday dangers affecting individuals and communities, such as violations of privacy in commercial transactions and online stalking. There are dangers which are the specific object of security and surveillance systems, such as crime and terrorism. There are also numerous social regimes: operating at the general regulatory and political levels (top-down), through legislation and specific control policies; and at a more contextual and local level (bottom-up), through the regulation of privacy within the work or neighbourhood setting. Therefore, apparent dangers may become controllable risks through knowledge, public debate and collective actions aimed at managing, reforming and adapting them.
It is worth noting here that this is not a linear process but a complex one, always dynamic and never clear-cut, as each ‘social regime’ can give rise to further dangers (d’Andrea and Quaranta, 1996). In fact, security and surveillance technologies, created mainly to meet public needs, in turn create new problems related to personal freedoms, civil rights and privacy protection. Since the rise of the “information society” in the '90s, resulting in widespread digitization and expanded data mining capacities, there is an even more pressing need to consider the dangers created by the specific use of security systems and surveillance policies.
Examples of damage at the biographical and social level created or fostered by security and (especially “smart”) surveillance have been drawn from our research 2, in areas such as Counter Terrorism, Law Enforcement, Border Control, E-Government, Cyber-Space, Consumers and Mobile Devices.
Work within EU countries has emphasized that although technology cannot guarantee security, security is impossible without technology. This leads to an approach that promotes the use of technology in ways that minimize or reduce damage to respect, privacy, social justice, community cohesion and the integration of minorities (Wright and Friedewald, 2010). Nevertheless, sociological research identifies numerous harms arising from the application of surveillance for purposes of counter-terrorism or national security, especially post-9/11. These, notably, include: financial targeting and exclusion; errors and misidentification; discrimination or racism; and breaches in data security.
Lyon’s (2004) term ‘categorical suspicion’ can be used to describe the risk assessment and classification strategies applied to post-9/11 financial tracking procedures, which have been identified as targeting particular and under-privileged sections of the population. For example, a concentration on international wire transfers from the US resulted in a focus upon a population comprising largely migrants, students and the unemployed. Furthermore, the sizeable, informal, cash economy of migrant workers in the US is described as being less involved with crime and terrorism than with the labour flexibility associated with global capitalism (Amoore and De Goede, 2005). The result is seen to be greater concentration and restrictions upon an already vulnerable population whilst other risks are displaced and harder to control.
In turn, such targeted populations are also increasingly exposed to the harm that can arise from errors or misidentification (‘false positives’) and ‘function creep’, or the use of data beyond its original purpose, as well as from breaches in data security. This increased vulnerability to potential harm is compounded by the use of racial or ethnic profiling encountered in surveillance measures for counter-terrorism purposes. This was witnessed in the post-9/11 detention of Arab ‘suspects’ in the US and the FBI’s gathering of information about Middle Eastern students, as well as in the greater latitude afforded by US courts to the use of race and ethnicity in national security contexts than in other border control and law enforcement cases (Guzik, 2009). This tendency towards discrimination against particular racial and ethnic groups has also been noted in studies within the UK, France, Sweden and the Netherlands (Pantazis and Pemberton, 2009; Mythen et al., 2009). Guzik (2009) warns of ‘discrimination by design’, in which discrimination is or will be codified within data mining algorithms.
These concerns have been highlighted also with regard to mainstream law enforcement activities. A significant shift in focus from detection towards anticipating and pre-empting criminal behaviour has resulted in recent developments involving aggregate data analysis (using sophisticated data algorithms), which now allow police to single out people deemed as showing ‘risky profiles’ for potential trouble (Haggerty, 2012). This form of ‘social sorting’ creates concerns about the exacerbation of inequalities and the potential ‘criminalisation’ and targeting of vulnerable groups before any crime has been committed (if it is ever to be committed at all). Lyon (2003) describes how coded categories are not innocuous but rather allow or prevent access to opportunities, affect eligibility for resources and benefits and bestow credentials or generate suspicion. In his words, “They make a real difference. They have ethics, politics” (Lyon, 2003a, p. 27).
Closed-Circuit Television (CCTV) has also been identified as a powerful tool in ‘social sorting’, where knowledge from the camera can contribute to the construction and reinforcement of a condemnatory gaze on the powerless (Coleman and Sim, 2000; Koskela, 2002; Coleman, 2003). Pallitto and Heyman (2008) illustrate social sorting using the example of the PAL scheme, which allows pre-registered, fee-paying car drivers to pass swiftly through the US/Mexican border using a swipe card (or smart card) that uses electronic technology to recognise them. Similar systems allow registered, frequent air travellers to pass through airport controls more swiftly and may use biometric data for recognition. These instances of pre-registered rights or privileges accentuate the contrasting otherness of those waiting in long queues and subject to the routine checks and interventions applied at border controls.
Where otherness is established, condemnatory potential is usually close at hand. In their article on Australian border control, Wilson and Weber (2008) relate how a discursive criminalization of asylum seekers has occurred, reinforced by a securitisation of borders that applies a range of surveillance technologies, which, in turn, amplify the refugee label of deviance and risk.
However, Lyon (2004) also points out how surveillance is subject to cultural differences, not only in the selection of techniques for surveillance but also in responses to them (Lyon, 2004; EU CONSENT Project, 2011). He calls for more detailed comparative research to achieve the serious and subtle analysis necessary to understand how power relationships emerge in relation to surveillance. As Marx (2012) reminds us, surveillance is neither good nor bad, but context and comportment make it so. However, we should maintain an awareness that threats to data privacy do not arise only from the use of force and state power, but may also appear in the service of benign ends (Marx, 2012).
Citizens’ attitudes to surveillance vary between different states, but degrees of scepticism, mistrust and objection are evident. As part of its Surveillance Project, Queen’s University in Canada conducted an international survey of citizens’ attitudes to a broad range of surveillance activities in nine countries (The Surveillance Project, 2008). With respect to E-Government, the findings showed that most citizens did not consider themselves knowledgeable about laws protecting information in government departments. Amongst those considered knowledgeable, there was scepticism about the effectiveness of those laws, with about half of the citizens assuming that they were ineffective. Levels of trust in the government achieving the right balance between national security and individual rights were low, with trust in private companies, such as banks, shops and credit card companies, being higher. Nevertheless, between a third and a half of citizens objected to the government sharing their information with the private sector. Additional concerns have been raised about the digital divide, which may be due to differences in generation, economic or educational status and geographical location.
3. The socialisation of technologies and approaches: an open question
As this brief review has shown so far, there is still a significant gap between security and surveillance technologies and the needs of the individuals and communities they serve. It can be said that these technologies are still little "socialised": where socialisation constitutes (in this field) the processes of construction of a relationship between science, technology and society (d'Andrea, Quaranta and Quinti, 2005; Quaranta, 2007; Bijker and d'Andrea, 2009; Mezzana, 2011). These processes can be dual-natured:
– an adaptation of science and technology to the needs and expectations of society and of its members;
– a search for identity 3, or the ability of scientific and technological research (STR) to exert greater control over itself and over the social dynamics (political, cultural, organisational, or communicational) increasingly embedded in the research.
The socialisation of STR is a composite and multidirectional process. Areas have been identified in which actors involved in STR construct relationships between science, technology and society, both in terms of adaptation and identity. These “areas of STR socialisation” include: research practice conducted by research groups (and the pressures and opportunities faced by them); scientific communication at all levels; the evaluation of science and technology; scientific mediation (the management, coordination, promotion and programming of scientific activities); innovation (the use of scientific and technological research results to foster social and economic development); the governance of scientific and technological research; gender issues in research activity; and the critical relationship between Western STR and other forms of knowledge (d'Andrea, Quaranta and Quinti, 2005; Quaranta, 2007; Bijker and d'Andrea, 2009; Mezzana, 2011).
Our argument so far suggests that security technologies and surveillance strategies and arrangements have developed quickly, but their embeddedness in society is still weak. Indeed, whilst they are certainly social constructions (Monahan, 2011; Norris, Moran et al., 1998; Lyon, 2007), they are most often under-socialised. Sociological research in this field has highlighted that these technologies, strategies and arrangements are often developed without any real interaction with the different users or beneficiaries (Albrechtslund and Nørgaard Glud, 2010). There is still little public control and assessment of the technologies and their impacts, and there is considerable heterogeneity in the assessment instruments between continental and even different national contexts (Lips et al., 2009). Legislation is still lagging behind operational and technical features (United Nations, 2010; Cannataci, 2010). Technologies are used with little regard to gender issues or without foreseeing the potential for negative impacts on women (McCahill, 2011). Security technologies and surveillance strategies and arrangements are little known to the public and of little interest to them, and there is low citizen trust in governmental protection of personal data (The Surveillance Project, 2008; Samatas, 2005).
This complex situation creates ongoing damage to both individuals and communities. This can occur through explicit decision-making by private entities applying a "predatory" orientation towards the personal data of individuals for commercial purposes (Bellia, 2005; Manzerolle and Smeltzer, 2011). In the main, however, these technologies and strategies work in an "acephalous" manner, without explicit planning and without centralized control. Their impact occurs through the accumulation and interweaving of action by different actors in different contexts (Martin, Van Brakel et al., 2009).
Therefore, the emergence on the social scene of large masses of bodies/minds and individuals who are more aware, competent and keen to "express themselves" becomes a huge and ambiguous phenomenon, which has both positive and negative aspects and is far from being effectively managed.
A potential governance system could include forms of "self-control", based on consensus and the exercise of individual power to direct public choices, which have in some cases been tested and experienced. For example, since the 1960s, many forms of grassroots democracy or joint governance between governments and citizens' organizations have been enacted (Quaranta, 1981). The spirit of some of these experiences is now reinforced by the participatory nature of Web 2.0, which can help support new forms of large-scale social mobilization (Beinin and Vairel, 2011). But in most cases, controls over human subjectivity are imposed from without and from above, in the political and economic arena: with elites without replacement; a lack of participatory channels; the rise to power of technocratic figures at national and transnational levels; and the excessive power of financial organizations. The danger of hidden authoritarianism is quite evident and often develops in a headless and dispersed form, even within the field of surveillance, which can be, and often is, a powerful control tool at the service of authoritarian interests. The increasingly technological nature of surveillance enables even stricter control to be exerted within the most mundane of everyday activities (Lyon, 2007).
However, with regard to security and surveillance, there are also counter-intuitive processes: occurring, for example, in the cases of compliance, negotiation, and resistance by the people.
Most of the time and in most contexts, people comply with surveillance (PINs, CCTV cameras, loyalty cards, etc.) (Lyon, 2007). But in some cases compliance is questioned. Several forms of negotiation (Declich, 2011) 4 can take place: for example, in industrial relations (Ball and Wilson, 2000); in specific residential areas; and in forms of redefinition, collaboration and co-construction between citizens and those who design and/or propose surveillance technologies (Albrechtslund and Nørgaard Glud, 2010). There are also stronger forms of reaction toward intrusive forms of surveillance, both at individual and collective levels. At the individual level, many kinds of responses (noncompliance, resistance, neutralization) have been detected (Marx, 2003; McCahill, 2011). At the collective level, through the mobilization of several kinds of civil society organizations, people address the issue of surveillance and privacy to varying degrees of intensity and in different ways (Deibert, 2003).
These phenomena require careful consideration as part of a broader phenomenology of innovation management in this field: defined by some authors as the multi-actor character of surveillance management at any level (Lyon, 2007). This may involve international, governmental, private, non-governmental, research and communication organisations. Such management, therefore, is often the result of a complex "arrangement" process of a social, political and legal nature, involving diverse (often hidden) agendas and policies (Fussey, 2007), and which varies according to different contexts (Lyon, 1994; Coleman and McCahill, 2011; Marx, 2012).
This suggests that we are facing a new "social construct": a specific context of meaning, which seems to systematically connect phenomena such as human agency, security technologies, surveillance systems and the need for privacy. Such phenomena seem to have entered into a relationship through a complex and permanent dynamic: linking dangers, social regimes and social risks. This dynamic is, as already highlighted, still poorly “socialised” and governed. In this regard, the development of appropriate solutions for managing the delicate relationship between innovation and society in this field requires the avoidance of several potential pitfalls:
- of laissez-faire, or indifferent delegation to security technologies and to surveillance managers;
- of "conflation" (Archer, 2003), or a reductionist approach (for example, believing that what is to be protected is only the privacy of individuals, and not their desire to live, to communicate and to expand their field of experience);
- of an a-systemic approach, which does not take into account the complexity of the issues and actors in the field, and hence the plurality and flexibility of the strategies and policies needed.
4. Sociological knowledge and technological responsibility: towards new regimes in the field of surveillance
There are several possible lines of action available to cope with potential pitfalls.
A first reflection concerns scientific knowledge and, in particular, the role of sociology. It would be important to promote more systematic assessment of the social mechanisms and forces which, in this new context, can serve to ensure a democratic governance and human and civil rights protection in modern societies. However, this can only be effective on the basis of an accurate interpretation of the phenomena and processes in place: including the problematic issues, the open questions and the existing potential. This requires constant attention and a capacity for scientific analysis.
The increasing complexity of contemporary societies requires public policies based on scientific research (i.e. evidence-based policies) in order to better manage the complex relationships between technological and social dynamics (Bijker and d'Andrea, 2009). In this regard, many authors have noted the gap, in both directions, between decision making and research, requiring the development of specific knowledge brokerage strategies to allow the effective penetration of knowledge into the decision-making system (Bijker, Caiati and d'Andrea, 2012). However, prior to this, the production of relevant and reliable knowledge, based on multidisciplinary and multi-perspective research, is essential. In this context, the significance of the social sciences, and of sociology in particular, is crucial. They permit the study of phenomena and processes that would not otherwise be captured: such as the relations between actors and their cognitive structures; elements of conflict and tension; existing dangers; and current trends, risks and the social impacts of possible measures to be taken. These are just a few examples of how such an approach could be applied to the field of innovation studies and the relationship between security, surveillance and privacy.
A further reflection concerns the kinds of social regimes to be promoted, given the observation of a regulatory lag of differing natures: cognitive, political, legal, technological, informative and educational. Due to the nature and scope of the problems found, single or isolated actions cannot be effective. Instead, it is essential to act simultaneously on several fronts - legislative, technological, educational and informative, as well as that of social resistance - so as to promote diverse regimes which, together, can manage security and surveillance in a way that maximises benefit to the community.
Potential regimes can be promoted in areas such as: knowledge (mapping existing dangers and identifying possible solutions); law (developing and/or modifying norms at national and transnational level); specific public policies (establishing authorities of guarantee and control); social action (mobilising civil society organizations to negotiate on how technologies are set up and used, or promoting civic watchfulness activities "here and now" in specific territorial and organizational contexts, etc.); technology (developing and disseminating privacy protection software, developing technological solutions in agreement with the various stakeholders and users, etc.); education and information (promoting information campaigns, educational activities in schools, etc.).
These varying types of regime are already in place, in more or less extensive and coordinated ways, involving different actors (EU and international organisations, governments, banks, businesses, civic organizations, etc.). However, they are, inevitably, precarious and provisional, especially in a social and economic context characterized by extreme complexity (Spier, 2005). This complexity should not only be analyzed and understood, but also, to some extent, “practised” through the elaboration and implementation of various regimes.
The third reflection concerns orientation, which should gradually be shared by all of the actors in this field, albeit differently. Scientific capability and social regimes should be part of a new and greater collective responsibility concerning the management of these new technologies in relation to their effects on society: what, elsewhere, has been called "technological responsibility" (Mezzana, 2011) at various levels (from the transnational to local). At the same time, it is important to promote a “common citizenship” (Quaranta, 2007) based on a consideration of knowledge and technology as a collective good to be preserved, enhanced and exploited for the benefit of all and therefore linked to the dimension of rights. Within this framework, protecting the rights of citizens facing invasive forms of surveillance should be crucial.
Finally, security and surveillance should be viewed not as limiting but as supporting the current growth of human energy. We cannot expect security and surveillance to be eliminated, and a certain amount of visibility is inevitable - and indeed gradually increasing with the dynamics of human agency - in our democratic societies. However, following Lyon (2003b) and other scholars, it is important to develop different technological systems that are subject to greater public control and whose damages are minimised, or reduced to below a given threshold of (negotiated) social acceptability.
Responses to this need must include: the systematic application of the social sciences, and of sociology in particular; the continuous promotion and adaptation of social regimes; the adoption of a perspective linked to technological responsibility; and the promotion of a new, common citizenship within the field of surveillance studies.
1 The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013).
2 These are the areas of investigation of the project.
3 For references to the notion of identity used here, see Luckmann, 1982.
4 Negotiation in this research and intervention field deserves further study. Negotiation is understood here as an activity involving two or more persons or groups who interact to resolve an issue on which there is no agreement (Declich, 2011). It is a typical and constitutive capacity of social action which, especially in complex situations characterised by strong and rapid change, can help to involve many agents and coordinate many views and orientations: to create common representations and interpretations of a phenomenon or situation, coordinate emotions and motivations, set rules, activate and support powers and resources, manage conflicts of various kinds, and verify and monitor the performance of an organisation or a programme (Cacace, 2010; Quaranta, 2012).
Albrechtslund, A and Nørgaard Glud, L (2010) ‘Empowering Residents: A Theoretical Framework for Negotiating Surveillance Technologies’, Surveillance & Society 8(2): 235-250.
Amoore, L and De Goede, M (2005) ‘Governance, risk and dataveillance in the war on terror,’ Crime, Law & Social Change 43: 149–173.
Archer, M (2003), Structure, Agency and the Internal Conversation (Cambridge: Cambridge University Press).
Ball, K and Webster, F (eds.) (2003), The Intensification of surveillance: crime, terrorism and warfare in the information age (London: Pluto Press).
Ball, K and Wilson, D (2000) ‘Power, control and computer-based performance monitoring: A subjectivist approach to repertoires and resistance’, Organization Studies 21(3): 539-65.
Ball, K, Haggerty, K and Lyon, D (eds.) (2012), Routledge Handbook of Surveillance Studies (London: Routledge).
Bauman, Z (2000), Liquid Modernity (Cambridge: Polity Press).
Beck, U (1992), Risk Society: Towards a New Modernity (London: Sage Publications).
Beck, U (2002) ‘The Terrorist Threat: World Risk Society Revisited’ Theory, Culture & Society 19(4): 39-55.
Beinin, J, and Vairel F (eds) (2011), Social Movements, Mobilization, and Contestation in the Middle East and North Africa (Stanford: Stanford University Press).
Bellia, P (2005) ‘Spyware and the Limits of Surveillance Law’ Berkeley Technology Law Journal 20: 1283-1344.
Ben-Ze’ev, A (2004), Love Online: Emotions on the Internet (Cambridge: Cambridge University Press).
Bigo, D (2004) ‘Global (In)security: The Field of the Professionals of Unease Management and the Ban-opticon’, Traces: A Multilingual Series of Cultural Theory, 4 (Solomon, J and Sakai, N eds) (Hong Kong: University of Hong Kong Press).
Bijker, W and d‘Andrea, L (eds) (2009), Handbook on the Socialisation of Scientific and Technological Research, Social Sciences and European Research Capacities (SS-ERC) Project, European Commission, Rome.
Bijker, W, Caiati, G and d’Andrea, L (2012), Knowledge brokerage for environmentally sustainable sanitation, Position Paper and Guidelines from the EU-FP7 BESSE project, available at: http://www.besse-project.info/besse
Cacace, M (2010), Guidelines for gender equality programmes in science, ASDO, PRAGES project, 7th Framework Programme for Technological Research and Development of the European Commission, available at: http://www.retepariopportunita.it/Rete_Pari_Opportunita/UserFiles/Progetti/prages/pragesguidelines.pdf
Cannataci, J A (2009) ‘Privacy, Technology Law and religions across cultures’, Journal of Information Law & Technology, 1.
Cannataci, J A (2010), ‘Squaring the Circle of Smart Surveillance and Privacy’, Fourth International Conference on Digital Society, February 2010.
Coleman, R (2003) ‘Images from a Neo-liberal City: The State, Surveillance and Social Control’, Critical Criminology 12: 21-42.
Coleman, R and McCahill, M (2011), Surveillance and Crime (London: Sage Publications).
Coleman, R and Sim, J (2000) ‘You’ll Never Walk Alone: CCTV Surveillance, Order and Neo-liberal Rule in Liverpool City Centre’, British Journal of Sociology 51(1): 623-639.
d’Andrea L, Quaranta, G and Quinti, G (2005), Manuale sui processi di socializzazione della ricerca scientifica e tecnologica (Roma: CERFE).
d’Andrea, L and Quaranta, G (1996), Civil society and risk. Contribution for a general theory, Workshop CERFE-Amsterdam School of Social Research, Amsterdam, February 26 1996.
Declich, G (2011), Guidelines on Gender Diversity in S&T Organisations, project WHIST, 7th Framework Programme for Technological Research and Development of the European Commission, available at: http://www.retepariopportunita.it/Rete_Pari_Opportunita/UserFiles/whist/whist_gl_def_ok_28112011.pdf
Deibert, R J (2003) ‘Black code: Censorship, surveillance, and the militarisation of cyberspace’, Millennium-Journal of International Studies 32(3): 501-530.
EU CONSENT Project (2011), http://www.consent.law.muni.cz/storage/1326010004_sb_consentbrochure_electronicversion_27_7_11.pdf
Fussey, P (2007) ‘An interrupted transmission? Processes of CCTV implementation and the impact of human agency’, Surveillance & Society 4(3): 229-256.
Giddens, A (1991), Modernity and Self-Identity. Self and Society in the Late Modern Age (Cambridge: Polity Press).
Goudsblom, J (1987) ‘The Domestication of Fire as a Civilizing Process’, Theory, Culture & Society 4(2): 457-476.
Guzik, K (2009) ‘Discrimination by Design: Data Mining in the United States’ “War on Terrorism”’, Surveillance & Society 7(1): 3-20.
Haggerty, K (2012), ‘Surveillance, Crime and the Police: The Changing Landscape of Police Surveillance’, in Ball, K, Haggerty, K and Lyon, D (eds.) Routledge Handbook of Surveillance Studies (London: Routledge).
Koskela, H (2002) ‘Video Surveillance, Gender, and the Safety of Public Urban Space: ‘Peeping Tom’ Goes High Tech?’, Urban Geography 23(3): 257-278.
Koskela, H (2004) ‘Webcams, TV Shows and Mobile phones: Empowering Exhibitionism’, Surveillance & Society 2(2/3): 199-215.
Lasch, C (1979), The culture of narcissism: American life in an age of diminishing expectations (New York: W.W. Norton).
Lips, M, Taylor, J et al. (2009) ‘Identity Management, Administrative Sorting and Citizenship in the New Modes of Government’, Information, Communication and Society 12: 715-734.
LSC (2010), Internal report contextualising cultural differences and privacy, project CONSENT, 7th Framework Programme for Technological Research and Development of the European Commission.
Luckmann, T (1982), ‘Individual action and social knowledge’, in Von Cranach M, Harré, R (eds) The analysis of action (Cambridge: Cambridge University Press).
Lyon, D (ed) (2003a), Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination (London and New York: Routledge).
Lyon, D (2003b), ‘Surveillance Technology and Surveillance Society’, in Misa, T J, Brey, P and Feenberg, A (eds) Modernity and Technology (Cambridge, MA: The MIT Press).
Lyon, D (2004) ‘Globalizing Surveillance: Comparative and Sociological Perspectives’, International Sociology 19(2): 135-149.
Lyon, D (2007), Surveillance Studies: An Overview (Cambridge: Polity Press).
Manzerolle, V and Smeltzer, S (2011) ‘Consumer Databases, Neoliberalism, and the Commercial Mediation of Identity: A Medium Theory Analysis’, Surveillance & Society 8(3): 310-322.
Martin, A, Van Brakel, R, et al. (2009) ‘Understanding resistance to digital surveillance: Towards a multi-disciplinary, multi-actor framework’, Surveillance & Society 6(3): 213-232.
Marx, G (2003) ‘A tack in the shoe: Neutralizing and resisting the new surveillance’, Journal of Social Issues 59(2): 369-90.
Marx, G (2012), ‘Preface’ in Ball, K, Haggerty, K and Lyon, D (eds.) Routledge Handbook of Surveillance Studies (London: Routledge).
McCahill, M (2011), The Social Impact of ‘New Surveillance’ Technologies: An Ethnographic Study of the Surveilled. ESRC Impact Report, RES-062-23-0969 (Swindon: ESRC).
Metcalf, C (2007) ‘Policing operation ore’, Criminal Justice Matters 68(1): 8-9.
Mezzana, D (ed) (2011), Technological responsibility. Guidelines for a shared governance of the processes of socialization of scientific research and innovation, within an interconnected world, SET-DEV, 7th Framework Programme for Technological Research and Development of the European Commission (Roma: Consiglio Nazionale delle Ricerche), available at: http://www.set-dev.eu/images/pdf/setdev-pg2110728
Monahan, T (2010), Surveillance in the Time of Insecurity (New Brunswick: Rutgers University Press).
Monahan, T (2011) ‘Surveillance as Cultural Practice’, The Sociological Quarterly 52(4): 495-508.
Mythen, G, Walklate, S and Khan, F (2009) ‘I’m a Muslim, but I’m Not a Terrorist: Victimisation, Risky Identities and the Performance of Safety’, British Journal of Criminology 49(6): 736-754.
Norris, C, Moran, J, et al. (eds) (1998), Surveillance, Closed Circuit Television and Social Control (Brookfield: Ashgate).
Pallitto, R and Heyman, J (2008) ‘Theorizing Cross-Border Mobility: Surveillance, Security and Identity’, Surveillance & Society 5(3): 315-333.
Pantazis, C and Pemberton, S (2009) ‘From the ‘Old’ to the ‘New’ Suspect Community: Examining the Impacts of Recent UK Counter-Terrorism Legislation’, British Journal of Criminology 49(5): 646-666.
Peirce, C S (1931-1935), The Collected Papers of Charles S. Peirce (8 Vols.) (Cambridge MA: Harvard University Press).
Quaranta, G (1981), Governabilità e democrazia diretta (Bari: De Donato).
Quaranta, G (1986), L’era dello sviluppo (Milano: Franco Angeli).
Quaranta, G (2007) ‘Knowledge, responsibility and culture: food for thought on science communication’, JCOM, 6(4), available at: http://jcom.sissa.it/archive/06/04/Jcom0604%282007%29C01/Jcom0604%282007%29C05/Jcom0604%282007%29C05.pdf
Quaranta, G (2012), Unpublished lecture, School of Sociology and Social Sciences, Rome, March 3 2012.
Samatas, M (2005) ‘Studying Surveillance in Greece: Methodological and Other Problems Related to an Authoritarian Surveillance Culture’, Surveillance & Society 3(2/3): 181-197.
Searle, J R (1969), Speech Acts (Cambridge: Cambridge University Press).
Sheller, M and Urry, J (2004), Tourism Mobilities: Places to Play, Places in Play (London: Routledge).
Spier, F (2005) ‘How Big History Works: Energy Flows and the Rise and Demise of Complexity’, Social Evolution & History 4(1): 87-135.
Stenson, K (2008), ‘Surveillance and Sovereignty’, in Deflem, M and Ulmer, J T (eds) Surveillance and Governance: Crime Control and Beyond (Bingley UK: Emerald Publishing).
Strickland, L S and Hunt, L E (2005) ‘Technology, Security, and Individual Privacy: New Tools, New Threats, and New Public Perceptions’, Journal of the American Society for Information Science and Technology 56(3): 221-234.
The Surveillance Project (2008), An International Survey on Privacy and Surveillance: Summary of Findings (Kingston CA: Queen’s University).
United Nations Twelfth Congress on Crime Prevention and Criminal Justice (2010), Recent developments in the use of science and technology by offenders and by competent authorities in fighting crime, including the case of cybercrime. A/CONF.213/1.
Wall, D S (2007), Cybercrime: The Transformation of Crime in the Information Age (Cambridge: Polity Press).
Wilson, D and Weber L (2008) ‘Surveillance, Risk and Preemption on the Australian Border’, Surveillance & Society 5(2): 124-141.
Wright, D, Friedewald, M et al. (2010) ‘Sorting out smart surveillance’, Computer Law & Security Review 26(4): 343-354.
Zavrsnik, A (2010) ‘Technical surveillance of everyday life - ‘post-disciplinary’ theoretical perspectives’, Revija Za Kriminalistiko in Kriminologijo 61(2): 178-190.