Dodge, M. & Kitchin, R. 2005. The ethics of forgetting in an age of pervasive computing. CASA Working Papers 92. London: Centre for Advanced Spatial Analysis.
Dodge, M. & Kitchin, R. 2007. ‘Outlines of a world coming into existence’: pervasive computing and the ethics of forgetting. Environment and Planning B: Planning and Design, 34(3), pp. 431-445.
Dodge, M. 2007. Do we need an ethics of forgetting in a world of digital ‘memories for life’? Position paper for Designing for Forgetting and Exclusion conference.
Feldman, M.S. & March, J.G. 1981. Information in Organizations as Signal and Symbol. Administrative Science Quarterly, 26(2), pp. 171-186.
Fernandez, K. C., Levinson, C. A. & Rodebaugh, T. L. 2012. Profiling: Predicting Social Anxiety From Facebook Profiles. Social Psychological and Personality Science.
Fleming, N. 2013. I'm recording everything for the first living archive [Online]. NewScientist.com. Available from: http://www.newscientist.com/article/mg21829110.300-im-recording-everything-for-the-first-living-archive.html [Accessed on: 10-04-2013]
Del Giudice, K. & Gardner, M. 2009. The Message of the Pensieve: Realizing Memories through the World Wide Web and Virtual Reality. Unpublished. Available from: http://web.mit.edu/comm-forum/mit6/papers/Delgiudice.pdf.
Goggin, G. 2005. “Have Fun and Change the World”: Moblogging, Mobile Phone Culture and the Internet [Online]. Blogtalk. Available from: http://incsub.org/blogtalk/?page_id=119 [Accessed on: 01-02-2012].
Golbeck, J., Robles C. & Turner K. 2011. Predicting personality with social media. Conference on Human Factors in Computing Systems - Proceedings, pp. 253-262
Hall, L., Johansson, P., & de Léon, D. 2013. Recomposing the Will: Distributed motivation and computer mediated extrospection. In: Vierkant, T., Clark, A. & Kiverstein, J. (Eds.). Decomposing the will. Oxford: Oxford University Press: Philosophy of Mind Series.
Heersmink, R., van den Hoven, J., van Eck, N.J. & van den Berg, J. 2011. Bibliometric mapping of computer and information ethics. Ethics and Information Technology, 13(3), pp. 241-249.
Helfert, M., Walshe, R. & Gurrin, C. 2013. The Impact of Information Quality on Quality of Life: An Information Quality Oriented Framework. IEICE Transactions, 96-B(2), pp. 404-409.
Jayaram, M. 2011. The Business of Privacy: From Private Anxiety To Commercial Sense? A Broad Overview of Why Privacy Ought To matter To Indian Businesses. NUJS Law Review, 4(4), pp. 567-594.
Kang, J., Shilton, K., Estrin, D., Burke, J.A. & Hansen, M. 2011. Self-Surveillance Privacy. UC Los Angeles: UCLA School of Law, pp. 1-43.
Kelly, P., Marshall, S.J., Badland, H., Kerr, J., Oliver, M., Doherty, A.R. & Foster, C. 2013. An Ethical Framework for Automated, Wearable Cameras in Health Behavior Research. Am J Prev Med, 44(3), pp. 314-319.
Koops, B.J. 2011. Forgetting footprints, shunning shadows: A critical analysis of the 'right to be forgotten' in big data practice. SCRIPTed, 8(3), pp. 229-256.
Lemos, A. 2011. Locative Media and Surveillance at the Boundaries of Informational Territories. In: Firmino, R., Duarte, F. & Ultramari, C. (eds.) ICTs for Mobile and Ubiquitous Urban Infrastructures: Surveillance, Locative Media and Global Networks, IGI Publishing, pp. 129-149.
Mann, S. 2002. Sousveillance [Online]. Wearcam.org. Available from: http://wearcam.org/sousveillance.htm. [Accessed on: 05-04-2012].
Mann, S. 2004a. “Sousveillance” Inverse Surveillance in Multimedia Imaging. MULTIMEDIA '04 Proceedings of the 12th annual ACM international conference on Multimedia, pp. 620-627.
Mann, S. 2004b. Continuous lifelong capture of personal experience with EyeTap. Proceedings of the 1st ACM workshop on Continuous archival and retrieval of personal experiences. ACM SIGMM. New York, NY.
Mann, S. 2005a. Sousveillance and cyborglogs: A 30-year empirical voyage through ethical, legal and policy issues. Presence: Teleoperators and Virtual Environments, 14(6), pp. 625-646.
Mann, S. 2005b. Equiveillance: The equilibrium between Sur-veillance and Sous-veillance.
Mann, S., Fung, J. & Lo, R. 2006. Cyborglogging with Camera Phones: Steps Toward Equiveillance. Proceedings of the ACM Multimedia 2006.
Mayer-Schönberger, V. 2009. Delete: The Virtue of Forgetting in the Digital Age. Princeton: Princeton University Press.
Mehdizadeh, S. 2010. Self-Presentation 2.0: Narcissism and Self-Esteem on Facebook. Cyberpsychology, Behavior, and Social Networking, 13(4), pp. 357-364.
Memoto AB. 2013. Memoto Automatic Lifelogging Camera [Online]. Available from: http://memoto.com/ [Accessed on: 15-01-2013].
Microsoft. 2011. Microsoft Research SenseCam [Online]. Available from: http://research.microsoft.com/en-us/um/cambridge/projects/sensecam/ [Accessed on: 30-01-2012].
Moreno, J. 2004. DARPA On Your Mind. Cerebrum, 6(4), pp. 92-100.
Murata, K. 2011. The right to forget/be forgotten. In: CEPE 2011: Crossing Boundaries.
Nack, F. 2005. You Must Remember This. IEEE Multimedia, 12(1), pp. 4-7.
O'Hara, K., Morris, R., Shadbolt, N., Hitch, G.J., Hall, W. & Beagrie, N. 2006. Memories for Life: A Review of the Science and Technology. Journal of the Royal Society Interface, 3(8), pp. 351-365.
O’Hara, K. & Hall, W. 2008. Trust on the Web: Some Web Science Research Challenges.
O'Hara, K., Tuffield, M. & Shadbolt, N. 2009. Lifelogging: Privacy and Empowerment with Memories for Life. Identity in the Information Society, 1(2), pp. 2-3.
O’Hara, K. 2010a. Narcissus to a Man: Lifelogging, Technology and the Normativity of Truth. In: Berry, E. et al. (eds.) Second Annual SenseCam Symposium. Available from: http://eprints.ecs.soton.ac.uk/21904/.
O'Hara, K. 2010b. Arius in Cyberspace: Digital Companions and the Limits of the Person. In: Yorick Wilks (ed.), Close Engagements with Artificial Companions: Key social, psychological, ethical and design issues, Amsterdam: John Benjamins.
O'Hara, K. 2012. The Technology of Collective Memory and the Normativity of Truth. In: Goldberg, D., McCarthy, N. & Michelfelder, D. (eds.) Philosophy and Engineering: Reflections on Practice, Principles and Process, Berlin: Springer-Verlag.
Price, B.A. 2010a. Towards Privacy Preserving Lifelogging. Proceedings of the second annual SenseCam symposium.
Price, B.A. 2010b. Challenges in Eliciting Privacy and Usability Requirements for Lifelogging. Conference’10.
Rawassizadeh, R. & Min Tjoa, A. 2010. Securing Shareable Life-logs. The Second IEEE International Conference on Information Privacy, Security, Risk and Trust. Available from: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5590506.
Rawassizadeh, R. 2011. Towards sharing life-log information with society. Behaviour & Information Technology, pp. 1-11.
Schacter, D.L. 2001. The Seven Sins of Memory: How the Mind Forgets and Remembers. Boston & New York: Houghton Mifflin Company.
Schlenoff, C., Weiss, B. & Potts Steves, M. 2011. Lessons Learned in Evaluating DARPA Advanced Military Technologies [Online]. Tech Beat. Available from: http://www.nist.gov/customcf/get_pdf.cfm?pub_id=906654 [Accessed on: 30-01-2012].
Sellen, A.J. & Whittaker, S. 2010. Beyond total capture: a constructive critique of lifelogging. Communications of the ACM, 53(5), pp. 71-72.
Shachtman, N. 2004. Pentagon Revives Memory Project [Online]. Wired.com. Available from: http://www.wired.com/politics/security/news/2004/09/64911 [Accessed on: 30-01-2012].
Smith, A., O'Hara, K. & Lewis, P. 2011. Visualising the Past: Annotating a Life with Linked Open Data. Web Science Conference 2011, June 14th-17th 2011, Koblenz Germany.
Sonvilla-Weiss, S. 2008. (IN)VISIBLE: Learning to Act in the Metaverse. Vienna: Springer.
Sweeney, L. 2004. Navigating Computer Science Research Through Waves of Privacy Concerns: Discussions among Computer Scientists at Carnegie Mellon University. ACM Computers and Society, 34(1), pp. 1-19.
TED. 2011. Hasan Elahi: FBI, here I am! [Online]. Available from: http://www.ted.com/talks/hasan_elahi.html [Accessed on: 23-05-2012].
Thurm, S. & Kane, I.Y. 2010. Your Apps Are Watching You. The Wall Street Journal [Online]. Available from: http://online.wsj.com/article/SB10001424052748704694004576020083703574602.html?mod=WSJ_hps_sections_tech [Accessed on: 23-05-2012].
Turkle, S. 2008. Inner History. In: Turkle, S. (ed.) The Inner History of Devices. Cambridge: The MIT Press.
Turkle, S. 2011. Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic Books.
Valentino-De Vries, J. 2011. iPhone Stored Location in Test Even if Disabled [Online]. The Wall Street Journal. Available from: http://online.wsj.com/article/SB10001424052748704123204576283580249161342.html?mod=WSJ_article_comments#articleTabs%3Dcomments [Accessed on: 23-05-2012].
Van Den Eede, Y. 2011. Technological remembering/forgetting: A Faustian bargain? Empedocles: European Journal for the Philosophy of Communication, 2(2), pp. 167-180.
Weber, K. 2010. Mobile Devices and a New Understanding of Presence. In: Workshop paper from SISSI2010 at the 12th annual ACM International conference on Ubiquitous Computing, p. 101.
Werkhoven, P. 2005. Experience machines: capturing and retrieving personal content. In: Bruck, P.A., Buchholz, A., Karssen, Z. & Zerfass, A. (eds.) E-content: Technologies and Perspectives for the European Market. New York, Heidelberg, Berlin: Springer.
1 Similar to lifelogs, the term ‘pervasive computing’ lacks a clear-cut definition, as the technology is also at an early stage of development. However, certain characteristics widely ascribed to pervasive computing are applicable to lifelogs:
“Embedded”: integrated in the environment
“Context aware”: ability to recognize individual users and situations
“Personalized”: they can be made to conform to individual preferences
“Adaptive”: they can change in response to the user
“Anticipatory”: they can change without interference (Aarts & Marzano 2003, 14)
“Autonomous”: they record data about the wearer’s life in an independent way, with no selection of what to record and what not to record
2 This definition is a modification of the definition as formulated by Dodge & Kitchin: “A life-log is conceived as a form of pervasive computing consisting of a unified, digital record of the totality of an individual’s experiences, captured multimodally through digital sensors and stored permanently as a personal multi-media archive” (2007, p. 2).
3 The SenseCam has been rebranded as the Vicon Revue. The Vicon Revue surpasses conventional cameras in that it combines various sensors, sensing both the environment and the wearer, in order to capture environmental information and react to it. Moreover, interfaces have been designed to query and present this information. The Vicon Revue, in combination with intelligent software, can therefore be considered one of the first primitive lifelogs.
4 Memoto is a lifelog camera which has GPS and organizes the photos it autonomously takes (Memoto 2013).
5 There are various reasons why a large number of sources were not included; a few examples: the term ‘lifelog’ in the source signifies a radically different technology, such as a weblog; the source refers to lifelog devices or projects without making them the focus of inquiry; the source mentions the existence of ethical issues only to state that these fall outside the scope of its research; applications to Research Ethics Committees are mentioned as a formal procedure but ethical concerns are not discussed; or the terms ‘life’ and ‘log’ appear closely together in the text without reference to the technology, leading to false positives.
6 We use the search term ‘ethic’ rather than ‘ethics’ or ‘ethical’ in order to capture variations such as ‘ethical’ and ‘ethics’. The same applies to the term ‘moral’, which also captures variations such as ‘morality’ and ‘morals’.
7 We acknowledge that our literature review may not exhaustively cover all sources on the ethical debate on lifelogs. Our review differs from studies such as that of Heersmink et al. (2011), which used software to map the relations between key terms in the field of computer and information ethics as they appear in particular databases. Their endeavour provided the academic field with insight into the frequency with which combinations of terms occur in selected journals. A disadvantage of this approach is that it offers little insight into the debate beyond the terms used. Our aim is to provide more depth by referencing specific sources and briefly explaining the core of their arguments. This way, researchers can use our literature review to find sources that point them to challenges and opportunities. This approach is, however, more demanding, making it unfeasible to incorporate as wide a range of search terms as Heersmink et al. (2011) did. We have therefore excluded terms such as ‘privacy’, ‘surveillance’, and ‘autonomy’. In addition, we want to stress that by limiting research to particular databases and journals, one can never be sure to have included all important sources; this applies equally to us and to Heersmink et al. (2011). The concern can be somewhat alleviated through snowballing, as influential articles will often be referred to by others. More importantly, the aim of this research is not necessarily to gather all relevant sources. We aim to set an agenda both for engineers, who may become aware of ethical issues previously unknown to them, and for ethicists, who may discover underdeveloped areas within the current debate.
8 Cheng, Golubchik & Kay (2004) took part in the CARPE 2004 workshop chaired by Jim Gemmell, who is a partner of Bell on the MyLifeBits project.
9 Kang et al. (2011) explicitly adopted an account of privacy with control as its central value. Some considered privacy a public good rather than an individual interest (O’Hara et al. 2006; O’Hara, Tuffield & Shadbolt 2009; O’Hara 2010a; O’Hara 2012). Bailey & Kerr (2007) tried to redress the idea of privacy as an individual interest trumped by waivers and consents.
10 Bailey & Kerr (2007) mention this explicitly when they consider the lack of clarity about the consequences of sharing information.
11 Sources mentioning challenges within this category are: Allen 2008; Bannon 2011; Curry 2007; Del Giudice & Gardner 2009; Nack 2005; O’Hara et al. 2006; O’Hara 2010a; O’Hara 2010b; O’Hara 2012; Rawassizadeh & Min Tjoa 2010; Rawassizadeh 2011; Sonvilla-Weiss 2008; Turkle 2011; Weber 2010.
12 Only Jayaram (2011) stresses the importance of privacy for businesses. Murata (2011) considers the threat to intellectual growth posed by companies holding so much information with which to confirm previously established information. Del Giudice & Gardner (2009) consider the distance that could arise between management and staff if lifelogs were used. Others only briefly mention surveillance without much elaboration: Bailey & Kerr 2007; Del Giudice & Gardner 2009; Dodge & Kitchin 2007; Rawassizadeh & Min Tjoa 2010; Rawassizadeh 2011; Sonvilla-Weiss 2008; Weber 2010.
13 Chen and Jones (2012) have examined private individuals’ motivations for lifelogging. According to their research, these motivations include: “re-live the remote or recent past”; “memory backup”; “telling and passing life stories”; “re-use”; “evidences”; “collection and archiving”; and “learning about unknown early age” (Chen & Jones 2012).
14 One is uncertain about how much of one’s privacy has been waived, leaving one at risk if the data are ineffectively secured, handled sloppily, used for malign purposes, or shared with commercial or surveillant intent. This concern is heightened if the information depicts a faulty state of affairs or if the information inferred from the data is false.
15 Even mental states such as mental illnesses or undesirable character traits, such as narcissism and/or social anxiety, can be identified from posts on websites (Fernandez, Levinson & Rodebaugh 2012; Golbeck, Robles & Turner 2011; Mehdizadeh 2010).
16 We discuss the fact that we cannot be fully informed about the functioning of a technology because the technology evolves. There are other relevant variables that complicate a distinction between public and private information over time. The lifelogger and the environment in which they lifelogged can change. The lifelogger or the person recorded may consider data captured in the past inappropriate and harmful at present. Moreover, the information stored in a lifelog can become outdated. Lifestyles, social positions, behaviour, and beliefs change. Lifeloggers might be unaware of these changes and lifelogs might not capture them. When opportunities to correct this information are lacking, this can lead to incorrect profiles. This proves especially troublesome if information is used or spread by the lifelogger or third parties, such as corporations, governmental institutions, and hackers with nefarious intentions. Moreover, the (symbolic) value of information changes, meaning that information might obtain different and unforeseen connotations over time (Rawassizadeh 2011). Previously accepted or unenforceable yet unhealthy behaviour at an early stage of life might be punished at a later stage, for example, through higher insurance quotes, lowered coverage, or social exclusion.
17 There are numerous examples of devices gathering and sharing information without people’s knowledge; the following are a random pick: a smartphone taking photos (Chen 2011); a smartphone with serious security issues (Diaz 2008); software sharing information without people’s awareness or consent (Thurm & Kane 2010); and a smartphone storing its whereabouts even when location services are disabled (Valentino-De Vries 2011).
18 External devices can be added by the user, such as camera lenses, heart monitor gear, and covers. These devices are accompanied by software.
19 This approach does undermine the usefulness of lifelogs for sousveillance, as that opportunity relies on evidence, which would become less trustworthy. We therefore have to weigh the worth of sousveillance against the benefits of this approach. We think the balance favours our approach: data such as photos, videos, or audio can already be easily manipulated outside the lifelog, and we would still consider the wearing of cameras a form of sousveillance. Moreover, as Brin (1998, 31) states, issues of reliability will be alleviated by the fact that in most cases there will be not one camera but multiple cameras.
20 An obvious weakness of both approaches is that they provide little control over lifelogs worn by other people. Combining data from multiple lifelogs would probably yield a more reliable and detailed picture; this, however, applies equally to both approaches (Allen 2008). In favour of both approaches, combining lifelogs requires coordinated effort, resources, and time, which limits its relevance for everyday use by citizens, who, contrary to authorities, have fewer resources available to them.
21 This is not to say that forgetting is an imperfection per se. We agree that forgetting can be beneficial. However, forgetting as a random computerized process cannot be said to have the same function as forgetting by human beings.
22 This can be especially troublesome if institutions, such as governmental or financial institutions, obtain lifelog information and circulate wrongful information. Spreading false information without the necessary checks and balances can severely impair someone’s opportunities, as happened in the Netherlands to a businessperson who was wrongfully arrested several times because databases somehow could not correct the erroneous information (ANP 2009). Identity fraud could either be facilitated by lifelogs, as one could steal more information, or impaired, as one could hand over more information to correct the situation.
23 More information on Google Glass can be found at: http://www.google.com/glass/start/.
24 The quality of lifelog information influences the benefits that can be reaped from it, providing a prima facie reason to allow lifelogs. Hence lifelogs should provide information of a sufficient standard of quality. If they reliably do so, they can contribute to a higher quality of life (Helfert, Walshe & Gurrin 2013).