Function creep occurs when personal data, collected and used for one purpose and to fulfil one function, migrate to other purposes that intensify surveillance and privacy invasion beyond what was originally understood and considered socially, ethically and legally acceptable. In the case of Oyster cards in the UK, data that begin life in the commercial sphere of public transit are increasingly required in police inquiries. Function creep usually happens quietly, as a matter of administrative convenience. Indeed, because new technologies permit increasing amounts of data interchange, and because organisational efficiency is frequently seen as a top priority, the human consequences of function creep are all too often unknown, ignored or downplayed (Pattison, 2008: p101).
In a European democracy, most surveillance has to be authorised by law - whether that surveillance relates to contagious diseases or to countering the threat of terrorism. If surveillance relates to identifiable individuals, Article 8 of the European Convention on Human Rights (given effect in the UK by the Human Rights Act) becomes engaged, and this requires specific legislation to be enacted in order to ensure the lawfulness of any surveillance that interferes with private and family life (Harvey, 2005).
Usually, the surveillance legislation contains its own mechanism for individual protection (e.g. the conditions needed for authorisation of a surveillance activity), and often this legislation identifies a regulator whose role is to ensure that the rules that relate to a surveillance activity are followed (Pounder, 2007). In addition, if personal data are captured as a result of a surveillance activity, data protection legislation becomes engaged, subject to any exemption (Clarke, 2008).
It can be seen that the protective mechanisms that apply to surveillance can be spread over a minimum of three separate pieces of legislation (data protection, human rights and the surveillance legislation itself), each mechanism having its own characteristics (Pounder, 2006). Thus, in cases where surveillance has been unnecessarily invasive, individuals could face a confused picture of three possibly divergent routes of redress (Pounder, 2007).
Technologies are critical to surveillance, but two important things must also be remembered. One, 'human surveillance' of a direct kind, unmediated by technology, still occurs and is often yoked with more technological kinds. Two, technological systems themselves are neither the cause nor the sum of what surveillance is today. We cannot simply read surveillance consequences off the capacities of each new system. For the surveillance society to be properly understood, we have to understand how technologies work, how they are used (this is an interactive process, involving in-house personnel as well as technology consultants and operatives), and how they influence the working of the organisation. Moreover, we need to understand these things clearly enough to influence policy and practice, as our later discussion of impact assessments suggests (Loveday, 2007: p332).
A further concern regarding technologies is that many argue anxieties about the surveillance society may be allayed by technical means. Certainly, some so-called privacy-enhancing technologies (PETs) serve well to curb the growth of technological surveillance, and their use should be encouraged where appropriate. But these are at best only ever part of ...