Challenges and Opportunities of Lifelog Technologies: A Literature Review and Critical Analysis

Tim Jacquemard, Peter Novitzky, Fiachra O’Brolcháin, Alan F. Smeaton, Bert Gordijn


In a lifelog, data from various sources are combined to form a record from which one can retrieve information about oneself and the environment in which one is situated. It could be considered similar to an automated biography. Lifelog technology is still at an early stage of development. However, the history of lifelogs so far shows clear academic, corporate and governmental interest. Therefore, a thorough inquiry into the ethical aspects of lifelogs could prove beneficial to the responsible development of this field. This article maps the main ethically relevant challenges and opportunities associated with the further development of lifelog technologies as discussed in the scholarly literature. Mapping the current debate also allowed us to identify further challenges and opportunities that have so far gone unmentioned. Some of these are partly explained by a blind spot in the current debate: whilst the debate focuses mainly on lifelogs held by individuals, lifelogs held by governmental institutions and corporations pose idiosyncratic ethical concerns as well. We provide a brief taxonomy of lifelog technology to show the variety of uses for lifelogs. In addition, we provide a general approach to alleviating the ethical challenges identified in the critical analysis.

1. Introduction

The increase in digitized activities has produced a surge of digital personal data such as financial transactions, electronic mail, forum posts and visited websites. Simultaneously, the possibilities to extract digital information from the physical world have soared, not least since everyday objects are increasingly connected to the Internet and equipped with sensing devices. These developments facilitate digital collections of information about individuals, so-called lifelogs.

Lifelogs, and the activity of lifelogging, constitute a new and evolving field. Consequently, a generally accepted definition of lifelogging has yet to crystallize. We use the following working definition: a lifelog is a “form of pervasive computing consisting of a unified, digital record” (Dodge & Kitchin, 2007, p. 2) of an individual and the physical and digital environment in which that person is situated while lifelogging, built from multimodally captured data which are gathered, stored, and processed into meaningful and retrievable information accessible through an interface.1,2 In a lifelog, data from various sources are gathered and processed to form a record from which one can retrieve information about oneself and the environment in which one is situated when lifelogging. It could be considered similar to an automated biography. The storing, processing and organizing of data happen by default, requiring no active input from the lifelogger. There is also a sense of ubiquity, as data for lifelogs are obtained partly through wearable devices such as smartphones and similar gadgets. The amount and variety of information that can be stored in a lifelog have increased significantly. Moreover, the burden of recording a lifelog has lessened with the increasing immersion of ICT devices in everyday life, such as the smartphone, which can capture information about physical as well as digital activities without human intervention (Goggin 2005).

Lifelogging has been an intriguing idea for some time. The idea surfaced in non-fiction literature before the technology was considered even remotely feasible. In 1945 Vannevar Bush, an American engineer, discussed a technology called the ‘Memex’, which would be a mechanized personal memory supplement (Bush 1945). Current technology has long surpassed the idea of the Memex. Gordon Bell and Steve Mann are well known for gathering personal information to create individual databases of their lives. Mann started wearing a camera in the 1980s as a precursor to what he calls lifeglogs, which can be conceived of as a different term for a lifelog (Mann 2004a). From 2001 to 2007 Bell, who coined the term ‘lifelog’ around 2001, digitized all sorts of information about himself, such as books he had read, music he had listened to, memos he had written and photographs from a wearable camera, all for the Microsoft project MyLifeBits (Bell & Gemmell 2009, p. 29).

The US Defense Advanced Research Projects Agency (DARPA) has a track record of developing lifelog-like programs or similar programs that capture personal information. Both ‘LifeLog’, a “system that captures, stores, and makes accessible the flow of one person’s experience in and interactions with the world” (DARPA/IPTO 2003), and ‘Total Information Awareness’ (TIA), a data mining program to combat terrorism, were programs aimed at collecting as much information as possible about a person. Both projects were withdrawn within two years as a result of controversy: they were deemed too intrusive on privacy and an infringement of civil liberties (DARPA/IPTO 2003; DARPA 2003). However, shortly afterwards DARPA established similar programmes, such as ‘ASSIST’ (Advanced Soldier Sensor Information System and Technology) in 2004, a project to equip soldiers in a combat zone with sensors in order to augment their memory with digital reports (Schlenoff, Weiss & Potts Steves 2011; Shachtman 2004). There is also commercial interest in lifelog technology. For instance, Microsoft developed a camera especially designed for lifelogging, the SenseCam (Nokia 2007; Microsoft 2011).3 Today, lifelog cameras are roughly the size of a postage stamp.4

Since lifelog technology is still at an early stage of development, a thorough inquiry into the ethical aspects of lifelogs could prove beneficial to the responsible development of this field. Hence, this article maps the main ethically relevant challenges and opportunities associated with the further development of lifelog technologies as discussed in the scholarly literature. By identifying the challenges and opportunities in the ethical debate, we aspire to aid the responsible development of lifelog technology. This literature review offers researchers in the field of lifelogging an instrument to gain insight into the challenges and opportunities currently identified and the sources addressing them. By elucidating questions surrounding lifelogging, we hope to advance the ethical debate, which will, in turn, aid the technology’s responsible development. First, we clarify the method used to select the relevant sources. Next, we present the results of the review. Our main finding is that the debate on lifelogging distinguishes insufficiently between the various applications of lifelog technologies. We then bring more clarity to the concept of lifelogging to provide insight into its many domains. In the last part, we critically discuss the result of focusing on a particular domain by highlighting an issue concerning the choice of keeping lifelogs that has previously gone undebated, and we present our general approach to alleviating ethical concerns. We propose control as a central value for alleviating these concerns. Moreover, we provide some general recommendations arising from this approach.

2. Method

The Google Scholar database was used to find relevant sources. The first limitation was selecting only English language material. The search results were judged by reading the abstract; the whole article was skimmed in the case of missing abstracts. The second limitation was that the content had to involve ethical considerations on contemporary lifelog technology as discussed here. This latter criterion ruled out Bush’s article about the Memex, which was still a far cry from current technologies. Sources discussing a radically different technology from what we consider lifelogs to be, such as a lifelog in the sense of a weblog, were also left out of this literature review.5 The third limitation, partly arising from the previous one, was to restrict the search to articles from 2001 onwards, the year in which the MyLifeBits project started, creating the idea of lifelog technology as we currently know it. The fourth limitation entailed that sources had to elaborate on ethics. Therefore, sources merely mentioning applications to research ethics committees or acknowledging potential ethical issues without explaining or explicitly addressing them in their research are omitted from this literature review, as they do not constitute an advancement of the ethical debate.






Lifelog ethic
Lifelog moral
Lifelog normative
MyLifeBits ethic
MyLifeBits moral
MyLifeBits normative
SenseCam ethic
SenseCam moral
SenseCam normative
Cyborglog ethic
Cyborglog moral
Cyborglog normative
Steve Mann lifelog

Table I
We searched for the terms “lifelog” and “ethic”, specifying that the words must occur anywhere in the article.6 This yielded 348 results, 30 of which were used in this review (Allen 2008; Byrne, Kelly & Jones 2010; Clowes 2012; Clowes 2013; Dib 2008; Dib 2012; Dodge & Kitchin 2007; Dodge 2007; Del Giudice & Gardner 2009; Jayaram 2011; Kang et al. 2011; Kelly et al. 2013; Koops 2011; Lemos 2010; Mann 2004a; Mann 2005a; Moreno 2004; Murata 2011; O’Hara & Hall 2008; O’Hara, Tuffield & Shadbolt 2009; O’Hara 2010a; O’Hara 2010b; O’Hara 2012; Price 2010a; Price 2010b; Rawassizadeh & Min Tjoa 2010; Rawassizadeh 2011; Sonvilla-Weiss 2008; Sweeney 2004; Van den Eede 2011). Then we changed the term “ethic” to “moral”, which yielded one used source (Van Dijck 2012), and “moral” to “normative”, which yielded no sources that met our criteria. For the next queries, we conducted the same searches, replacing “lifelog” with “MyLifeBits”, “MyLifeBits” with “SenseCam” and “SenseCam” with “cyborglog”, and combined each of these terms with “ethic”, “moral” and “normative” as described in the first search. The last search replaced “cyborglog” and “normative” with “Steve Mann” and “lifelog”. The results are listed in Table I, which gives the number of results yielded by a search term under ‘total’, the number of results which did not occur in previous queries under ‘new’, and the sources used in this literature review under ‘useful’.7

The first queries, “lifelog” combined with “ethic”, “moral” and “normative”, are self-explanatory. The second term, “MyLifeBits”, was chosen because MyLifeBits was an early and pivotal lifelog project; Gordon Bell, who was the subject of this research, coined the term ‘lifelog’. The query “MyLifeBits” and “ethic” provided 4 used sources (Bannon 2006; Bannon 2011; Curry 2007; Turkle 2011) and the query “MyLifeBits” and “moral” 3 (Hall, Johansson & de Léon 2013; Katz & Gandel 2008; Lahlou 2008). The third term, “SenseCam”, was chosen because the SenseCam is one of the first devices designed especially for lifelogging and has been worn by prolific researchers such as Gordon Bell and Cathal Gurrin. The query “SenseCam” and “ethic” yielded only one result (Weber 2010). The fourth term, “cyborglog”, was chosen because of the importance of Steve Mann, considered a pioneer of lifelog technology; ‘cyborglog’ and ‘lifeglog’ are the terms he uses for technology similar to lifelogs. The query “cyborglog” and “ethic” yielded 2 usable results (Mann 2005b; Mann, Fung & Lo 2006). The terms “Steve Mann” and “lifelog” were chosen because the term “cyborglog” seemed less commonly accepted than “lifelog”. This yielded 4 results which were used in this literature review (Mann 2004b; Nack 2007; Sellen & Whittaker 2010; Werkhoven 2005). Before we started this endeavour, we identified the main projects, names and technologies. By choosing the most prominent projects, names and technologies within lifelogging, we hoped to provide an overview of the current academic debate which can be used as an instrument to further the ethical debate. All queries were conducted on the 8th and 9th of April 2013. Snowballing yielded 7 further results (Bailey & Kerr 2007; Bell & Gemmell 2009; Cheng, Golubchik & Kay 2004; Van Dijck 2005; O’Hara et al. 2006; Smith, O’Hara & Lewis 2011; Turkle 2008).
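The combinations of search terms described above can be sketched as follows. This is an illustrative reconstruction of the query plan, not the actual tooling used by the authors; the term lists are taken directly from the Method section.

```python
from itertools import product

# Core terms, each paired with every ethics-related qualifier,
# as described in the Method section.
core_terms = ["lifelog", "MyLifeBits", "SenseCam", "cyborglog"]
qualifiers = ["ethic", "moral", "normative"]

queries = [(term, qualifier) for term, qualifier in product(core_terms, qualifiers)]
# The final search replaces both terms: "Steve Mann" combined with "lifelog".
queries.append(("Steve Mann", "lifelog"))

for term_a, term_b in queries:
    # Each pair was entered into Google Scholar, with the words
    # required to occur anywhere in the article.
    print(f'"{term_a}" AND "{term_b}"')

# 4 core terms x 3 qualifiers + 1 extra query = 13 searches in total,
# matching the 13 rows of Table I.
assert len(queries) == 13
```

Enumerating the queries this way makes the search protocol reproducible: anyone can re-run the same 13 searches against the database at a later date.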

Table II
3. Results

The searches resulted in 52 relevant sources (23 journal articles, 11 book chapters, 8 conference papers, 6 workshop papers, 1 book, 1 column in a scientific journal, 1 talk, and 1 working paper) after discounting the overlapping entries. Table II shows the sources and their year of publication. The debate was started by the aforementioned DARPA project ‘LifeLog’ (Moreno 2004; Sweeney 2004) and the researchers Mann and Bell (Mann 2004a; Mann 2004b; Cheng, Golubchik & Kay 2004).8

3.1. Challenges

We distinguished 8 challenges from this accumulated literature (see Table III), which we will elaborate on in decreasing order of frequency of occurrence in the academic debate.






Infringements on privacy
Deleterious influences on perception
Shortcomings of the technology
Impeding forgetting
Uncertainties
Impairing social interaction
Psychological and health risks
Issues concerning the protection of research subjects

Table III
3.1.1. Infringements on privacy

Lifelogs are said to be detrimental to privacy. However, privacy is often ill-defined or not defined at all, making it unclear what the authors mean by the term ‘privacy’. This is arguably the case in the following sources: Byrne, Kelly & Jones 2010; Cheng, Golubchik & Kay 2004; Del Giudice & Gardner 2009; Price 2010a; Price 2010b; Sweeney 2004; Rawassizadeh & Min Tjoa 2010; Rawassizadeh 2011; Smith, O’Hara & Lewis 2011; Werkhoven 2005. Most sources presuppose an intuitive idea of privacy as control over personal information. Some have explicated their concept of privacy (Allen 2008; Mann 2005a; Jayaram 2011). Others aim to redirect a misconception about privacy with regard to lifelogs.9 Still others offer an elaborate discussion in order to provide recommendations for developers (Lahlou 2008).

If we take privacy to be at least partly a matter of control over, and access to, personal information and of monitoring, consent should be considered a related challenge. Several consent issues have been addressed without consent being explicitly mentioned.10 The non-consensual logging of third parties is an obvious challenge (Allen 2008; Bailey & Kerr 2007; Cheng, Golubchik & Kay 2004; Del Giudice & Gardner 2009; O’Hara 2010a; O’Hara 2012; Sonvilla-Weiss 2008). It might become impossible to stay off the grid (Sonvilla-Weiss 2008). Another issue is the freedom to choose whether to keep a lifelog. There might be considerable societal pressure to keep one (Allen 2008; O’Hara 2010a; O’Hara 2012). A lifelog could become a prerequisite for showing good intentions, since the absence of a lifelog could be interpreted as signifying an intention to hide malign behaviour. Also, the consequences of sharing information are unclear. Although one might be able to choose the information one wants to share, one has little influence over how self-publicized information is used and interpreted (Bailey & Kerr 2007; Murata 2011). For example, videos can be edited to use only certain parts. One also has little insight into the retention and subsequent use of the data.

Another related challenge is surveillance. The relation between citizens and authorities or companies may be affected by lifelogs, as lifelogs could be a source of information for states (Allen 2008; Bailey & Kerr 2007; Del Giudice & Gardner 2009; Dodge & Kitchin 2007; Lemos 2010; Moreno 2004; O’Hara, Tuffield & Shadbolt 2009; Rawassizadeh & Min Tjoa 2010; Rawassizadeh 2011; Sonvilla-Weiss 2008; Weber 2010). Consequently, citizens are vulnerable to pernicious surveillance by either governmental institutions or corporations (Bailey & Kerr 2007; Del Giudice & Gardner 2009; Dodge & Kitchin 2007; Rawassizadeh & Min Tjoa 2010; Rawassizadeh 2011; Sonvilla-Weiss 2008; Weber 2010). Indeed, lifelogs can turn citizens into recreational spies and lead them to reveal confidential information (Allen 2008; Dodge & Kitchin 2007). Recreational spies, meaning people who investigate others without it being their profession, might have little awareness of the legal and moral interests of their targets and lack the professional ethics which professional investigators are assumed to possess (Allen 2008, p. 20).

3.1.2. Deleterious influences on perception

Lifelogs have been ascribed a potentially deleterious influence on our perception of the past, our memories, and the present, with three specific examples.11 First, there is a blurring of past and present. The longevity of digitized information renders information about the past as readily available as information about the present. Consequently, the past will be judged by the standards of the present and vice versa (Allen 2008; O’Hara 2010a; O’Hara 2010b; Rawassizadeh & Min Tjoa 2010; Rawassizadeh 2011). A related challenge is the amount of information created: trivial data might marginalize important information (Allen 2008; Katz & Gandel 2008). The source of information changes as well: lifelogs produce information without a social community (Curry 2007), which could foster a solipsistic view of the world and of oneself. Moreover, lifelogs might lead to epistemological uncertainties, because data are easily manipulated and therefore not always to be trusted (O’Hara et al. 2006; O’Hara 2010b; Weber 2010).

Second, lifelogs have difficulty capturing subjective experiences and are able to capture only concrete information. This might limit interest in subjective interpretations. Therefore, values such as truth might become overstated in memories, narrowing the use of memory (O’Hara 2010a; O’Hara 2012; Dib 2008; Van Dijck 2005; Van Dijck 2012). Memories are also relevant to the composition of identity (Dib 2008). By leaving the archiving of information to devices, we affect our control over personal information and the way we perceive ourselves and others. An additional challenge concerns one’s assessment of past behaviour. Lifelogs provide retrospection on decisions made in the past. However, the right decision might be more obvious in hindsight with lifelog information than it was at the time, leading to callous judgements about the past (Bannon 2011; Del Giudice & Gardner 2009; O’Hara 2010a; O’Hara 2012).

Third, lifelogs could influence one’s perception of the present. A loss of interest in information that cannot be archived in a lifelog could occur (Turkle 2008; Turkle 2011). Also, lifelogs might have a similar effect on perception as the photo camera, which made people look at reality as potential photo opportunities (Van Dijck 2005). Even our existence could be affected; the ability to obtain information from anywhere at any time and the source of information, could change people’s understanding of being present (Weber 2010). This challenge is based on the idea that our perception of the world is based on information rather than objective facts (Weber 2010). Lifelog technology would change information and therefore possibly our perception.

3.1.3. Shortcomings of the technology

The functioning of lifelogging technologies has been questioned. Some of these challenges are practical: the inconvenience of wearing devices; unintentional lifelogging (i.e. lifelogging without being aware that one is lifelogging); the distress caused by the loss of data; the practical limitations of the devices; and the inconvenience imposed on others of knowing that they are being recorded (Bell & Gemmell 2009; Byrne, Kelly & Jones 2010; Mann 2004a; Rawassizadeh & Min Tjoa 2010; Rawassizadeh 2011). O’Hara (2010a) argued against these challenges, stating that if the technology does not function according to standards, people will refrain from using it. Thus, these challenges would only be an issue for developers and not of significant ethical concern.

However, there are other, more intricate challenges. One of them is that lifelogs might be unable to capture relevant information. The physical world is too complex for all aspects of reality to be measured. Therefore, lifelogs gather only bits of information instead of providing an integrated overview of reality (Curry 2007; Del Giudice & Gardner 2009). Lifelogs are said to be intrinsically limited in capturing information (Dodge & Kitchin 2007). They only gather empirical information; they are unable to capture subjective experience. Other sources also mention the impossibility of capturing context in which information gets its meaning (Bannon 2011; Del Giudice & Gardner 2009).

In addition, the idea of ‘memory retrieval’ is questioned (Bannon 2006; Curry 2007; Moreno 2004; Nack 2005; Sellen & Whittaker 2010; Van Dijck 2005; Van Dijck 2012). Memories are dissimilar to data, as they are subjective revisions of the past. In contrast to a photo, which is taken once at a certain point in time, a memory is constructed anew whenever it is prompted. This process differs from one occasion to the next; hence the memory changes. Correspondingly, ‘the sharing of experiences’ seems equally far-fetched: because experiences are subjective interpretations, the genuine sharing of any experience might be impossible (Del Giudice & Gardner 2009). In contrast, some consider the ability of lifelogs to mirror reality of lesser importance than their effect on perceptions of representation and temporality, because an absolute distinction between objective reality and subjective interpretation is troublesome (Dib 2012).

3.1.4. Impeding forgetting

The desirability of one of the objectives of lifelogs, namely capturing the events of a person’s life, can be questioned because forgetting can be important (Allen 2008; Bannon 2006; Bannon 2011; Byrne, Kelly & Jones 2010; Clowes 2012; Dodge & Kitchin 2007; Dodge 2007; Koops 2011; Murata 2011; Nack 2005; Sonvilla-Weiss 2008; Van Den Eede 2011; Van Dijck 2012). Various reasons have been identified to support this line of thinking. The first is the ‘clean slate’ argument: it should be possible to forget the past to allow persons to move beyond their past deeds (Koops 2011). This also has positive societal effects. For example, expunging records, such as financial and criminal records, can have a positive effect on productivity, as such records limit one’s eligibility for loans and jobs. Second, forgetting aids self-development, because people should be able to change their opinions without this change being held against them (Koops 2011). In a broader sense, people could feel limited by the constant awareness that their deeds may be remembered at all times (Koops 2011). Third, the recalling of events could impair reconciliation between people. Again this has societal implications, as shown in South Africa with the establishment of the Truth and Reconciliation Commission in 1995 after Apartheid (Bannon 2006). Fourth, non-forgetting might not be the enhancement one would hope for; the influx of memories could render one apathetic, while the details obscure one’s potential for abstract thought (O’Hara 2010a). Fifth, forgetting is an intrinsic part of controlling one’s memory. Lifelogs would trigger memories one would prefer were forgotten (Murata 2011). Subsequently, one loses control of one’s life story, as one cannot choose what to forget (Clowes 2012; Murata 2011). After all, one is unable to choose which information is used and which is left unused. Finally, non-forgetting could hinder intellectual growth: data about past behaviour might be used to personalize services, thereby confirming and entrenching past and current behaviour (Murata 2011). A final critique is more abstract. It holds that biological and technological memory are interwoven, making them difficult to separate. Both storing and deleting personal information, i.e. remembering and forgetting, will have intricate effects, both beneficial and harmful, on an individual’s memory and on society, effects which we may not always notice (Van den Eede 2011). Therefore we have to critically assess the merits of both remembering/storing and forgetting/deleting.

3.1.5. Uncertainties

The current early stage of lifelog development poses challenges because several variables limit our ability to assess the consequences of the technology once it is used by individuals. The inability to fully regulate the technology legally before it has matured is an example of such a challenge. Thus, it remains unclear how stakeholders, such as companies, authorities, or fellow citizens, may legitimately use the technology (Allen 2008; Bell & Gemmell 2009; Cheng, Golubchik & Kay 2004; Dodge 2007; Bailey & Kerr 2007; Del Giudice & Gardner 2009; Koops 2011). The uncertainty about regulation also obscures how the technology will function in society (Bailey & Kerr 2007). There are further reasons why the functioning of a lifelog is uncertain: the control one has over the functioning of a lifelog and the information it produces (Bailey & Kerr 2007; Dodge & Kitchin 2007; Dodge 2007); the influence of a lifelog on identity (O’Hara 2010b; Clowes 2012; Moreno 2004); and the interplay between biological memory and the lifelog (Clowes 2012). These variables can pose challenges to users and developers, but as yet it is uncertain whether they will.

3.1.6. Impairing social interaction

Social interaction can be negatively affected by lifelogs (Allen 2008; Bell & Gemmell 2009; Murata 2011; O’Hara et al. 2006; O’Hara, Tuffield & Shadbolt 2009; O’Hara 2010b; Sonvilla-Weiss 2007). The disappearance of face-to-face encounters, i.e. of a physical human presence in obtaining and spreading information, has to be faced (O’Hara et al. 2006; O’Hara, Tuffield & Shadbolt 2009; O’Hara 2010b). Moreover, as mentioned, lifelogs could hinder social forgetfulness and thereby impair social bonds (Allen 2008; Murata 2011). Lifelogs will affect our set of social norms to the extent that a redefinition of those norms is likely to be required (Bell & Gemmell 2009; Sonvilla-Weiss 2008). A further challenge is that lifelogs may lead to a decrease in particular human emotions when dealing with others. People might become more dependent on lifelogs to memorize; in order to do so, a lifelog retrieves information without a social context or subjective experience. This loss might affect social interaction, as this information is conveyed without human emotions such as compassion and empathy. As a result, society as a whole could develop characteristics similar to autism or schizophrenia, because people would use this dehumanized information for interaction (Murata 2011). Finally, the disappearance of others and their replacement by lifelogs as the source of information, which leaves less space for subjective interpretations, might influence one’s identity (Murata 2011). The result is that lifelogs can affect or change who one is.

3.1.7. Psychological and health risks

Lifelogs have been ascribed possible negative effects on health. Some mention cognitive laziness (Del Giudice & Gardner 2009): people will not use their own memory but rather rely on their lifelogs. This could harm the capacity to remember. The human brain is malleable; it adjusts to external conditions. When parts of the brain are left unused, they might lose their functionality. Thus, an artificial memory is not necessarily an enhancement of the brain, because it could reduce biological memory (Murata 2011). In addition, a technologically rather than biologically or socially constructed personal identity or awareness of the self might lead to autism or schizophrenia (Murata 2011). Another challenge is a lifelog becoming a cause of pathological rumination by facilitating ponderings in sufferers of bipolar and unipolar depression (Allen 2008; Rawassizadeh & Min Tjoa 2010; Rawassizadeh 2011). Also, the recalling of events can be harmful: if, for example, a memory of the event that led to a post-traumatic stress disorder were carelessly evoked, it could lead to a deterioration of the disorder (Allen 2008).

3.1.8. Issues concerning the protection of research subjects

‘The protection of research subjects’ is the only challenge that takes trials into account (Byrne, Kelly & Jones 2010; Price 2010b; Kelly et al. 2013; Sweeney 2004). This challenge has not been elaborated except by Kelly et al. (2013), who discuss an ethical framework for trials with wearable cameras and advance an elaborate account of the ethical challenges brought forward by lifelogging. The other sources only mention the problem without specifying it.




Citizen empowerment
Personalized services
Valuable information
Health benefits
Behaviour modification
Shaping identity

Table IV

Table V
3.2. Opportunities

Table IV shows the number of sources from our literature survey which identify a particular opportunity. We distinguished 6 opportunities, which we will elaborate on in decreasing order of frequency of occurrence in the academic debate. The difference between the number of challenges identified (101) and opportunities (30) is not necessarily an indication of widespread opposition to the further development of the field. In fact, some challenges are identified precisely in order to streamline development and integration. Smith, O’Hara & Lewis (2011), for example, identify more challenges than opportunities even though they propose lifelog software.

3.2.1. Citizen empowerment

Lifelogs may empower citizens against undesirable behaviour by the authorities. The sources invariably mention sousveillance: citizens monitoring the authorities (Allen 2008; Bell & Gemmell 2009; Mann 2004a; Mann 2004b; Mann 2005a; Mann 2005b; Mann, Fung & Lo 2006; O’Hara, Tuffield & Shadbolt 2009; Rawassizadeh 2011; Weber 2010). Sousveillance is a reversal of surveillance, in which the authorities watch citizens. Mann, who coined the term, interprets it broadly, covering both the secret taping of police brutality and questionnaires from management handed to shoppers about their staff (Mann 2002). The latter example is in-band sousveillance, organized within an organization. Relevant to lifelogs is out-of-band sousveillance: sousveillance by people outside the organisation. Lifelogs could record the behaviour of the authorities, and these records can be shared. This way, the authorities are better controlled, because the behaviour of officials is increasingly made visible.

Equiveillance, balancing surveillance and sousveillance, is another opportunity identified by Mann. This concept seems to entail that the adverse effects of surveillance would be cancelled out by sousveillance (Mann 2004a). A related opportunity is the ability to provide information about one’s innocence to the authorities to refute accusations (O’Hara, Tuffield & Shadbolt 2009).

3.2.2. Personalized services

By using lifelog data, software can be developed to increasingly accommodate the needs of specific users or groups, such as aids to memory, information retrieval, recommendation systems, educational tools, research tools, policy information, organisational information, and information for historical studies (Bell & Gemmell 2009; Kang et al. 2011; Mann 2004a; Mann 2004b; Mann 2005a; Mann, Fung & Lo 2006; O’Hara, Tuffield & Shadbolt 2009; Rawassizadeh 2011).

3.2.3. Valuable (non-medical) insights

Lifelogs can offer valuable insights as well as emotionally valuable information (Allen 2008; Bell & Gemmell 2009; Kang et al. 2011; Mann 2004b; Nack 2005). Lifelog data can serve as a basis for reflection on oneself or on society as a whole, yielding deeper personal or collective understanding. Increased self-understanding through lifelog information can positively influence self-control (Hall, Johansson & de Léon 2013). Mann (2004b) considers lifelogs to have artistic potential, producing art and culture. The development of lifelog technology itself could also provide valuable information, as it incites a rethinking of what constitutes a human being (Nack 2005). Finally, a lifelog could gather or conserve emotionally valuable information about a loved one (Bell & Gemmell 2009).

3.2.4. Health benefits

Lifelogs might benefit health or improve medical practice. A case in point is the improvement of therapeutic tools (Allen 2008; Bell & Gemmell 2009; O’Hara, Tuffield & Shadbolt 2009). The ability to measure the patient’s behaviour could lead to better diagnoses, improved therapies and beneficial lifestyle changes (Bell & Gemmell 2009). Another opportunity is telemedicine (O’Hara, Tuffield & Shadbolt 2009). Physiological signals do not necessarily have to be measured in the hospital, which makes it possible to provide some medical assistance from a distance, thereby enhancing patients’ independence (O’Hara, Tuffield & Shadbolt 2009). Moreover, the vast amount of information one might collect from subjects could be used to improve medical studies (Bell & Gemmell 2009).

3.2.5. Behaviour modification

By increasing people’s knowledge about their own behaviour and feeding this lifelog-derived knowledge back to them, lifelogs may help people improve their performance at some task (Rawassizadeh 2011) or change their behaviour to their benefit. Lifelogs could also play a role in the prevention of criminal behaviour: the threat of being visible may make criminals think twice before committing a crime (Allen 2008; Bell & Gemmell 2009). There is also a more abstract discussion about the interplay between organic memory and artificial memories such as lifelogs, in which it is suggested that lifelogs may extend the mind, i.e. that lifelogs would be considered part of the human mind (Clowes 2013). This may be described as a phenomenological position on lifelogs.

3.2.6. Control of identity

Identities can be constructed and imposed more easily as a consequence of using lifelogs. Some are constructed formally by authorities, such as one’s financial profile or identity card data, and some informally by, for example, a friend’s views as to one’s trustworthiness. The concept of identity in this sense, as used by O’Hara, Tuffield & Shadbolt (2009), is quite thin: it consists of certain properties and characteristics ascribed to the individual by another entity, which can use this information. A lifelogger has more control over those externally imposed identities. The lifelogger has a vast database of information at hand which could be used to create a new identity or to contest identities that have been ascribed to oneself by others. Without lifelogs, one would have less information at one’s disposal to do this (O’Hara, Tuffield & Shadbolt 2009).
