Challenges and Opportunities of Lifelog Technologies: A Literature Review and Critical Analysis




4.3. Control


Following this short discussion of the choice of whether or not to keep a lifelog, we propose our own approach to alleviating ethical concerns about that choice as it is made by private individuals for reasons affecting lifestyle. We hold that the three ethical challenges identified above can be alleviated by providing users with extensive control over the functioning of their lifelogs. Our approach can be viewed as a reaction to proposals to build in forgetting, the only abstract approach to alleviating ethical concerns uncovered in this literature review that has received widespread attention (Allen 2008; Bannon 2006; Dodge & Kitchin 2005; Dodge & Kitchin 2007; Mayer-Schönberger 2009; O'Hara, Tuffield & Shadbolt 2009; O’Hara 2012).

Let us briefly elaborate on the concept of built-in forgetting before we present our own point of view. Dodge and Kitchin (2005; 2007) were the first to propose a built-in technological variant of biological forgetting. Their account of forgetting, which is based on the imperfections of memory identified by Schacter (2001), is not solely the omission of information but also imitates remembering incorrectly. Their built-in mechanism for technological forgetting is realized through techniques such as temporarily blocking information retrieval, tweaking details, and recording data imperfectly while leaving the broader picture intact (Dodge & Kitchin 2005). They assert that software which preserves the main storylines while randomly forgetting more specific information would still leave us with more information than we would have had without a lifelog. The result of incorporating forgetting into the fabric of a lifelog is that we could no longer be confident that the retrieved information is true or that no elements were left out; forgetting renders information less reliable. This would reduce issues with surveillance and enhance our control over our past (Dodge & Kitchin 2007, 18). We accept that forgetting is contrary to the raison d’être of lifelogs (O'Hara, Tuffield & Shadbolt 2009; O’Hara 2012) and that by building in forgetting we would be throwing out the baby with the bath water. However, we do support the premise that it would alleviate some of the ethical concerns discussed in this literature review; hence we propose a different approach, albeit along similar lines.
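To make the mechanism concrete, the following is a minimal sketch, in Python, of how the kind of forgetting Dodge and Kitchin describe might be implemented. The record structure, field names, and probabilities are illustrative assumptions on our part rather than part of their proposal.

```python
import random
from copy import deepcopy

def forget(records, p_drop=0.05, p_perturb=0.15, p_block=0.10):
    """Return a copy of the lifelog with imperfect, memory-like recall:
    some detail fields are omitted, some are slightly perturbed, and some
    records are temporarily blocked from retrieval, while each record's
    broad summary is left intact. All parameters are illustrative."""
    recalled = []
    for record in records:
        r = deepcopy(record)
        if random.random() < p_block:
            continue                      # temporarily not retrievable
        for field in list(r.get("details", {})):
            roll = random.random()
            if roll < p_drop:
                del r["details"][field]   # omit a specific detail
            elif roll < p_drop + p_perturb and isinstance(r["details"][field], (int, float)):
                r["details"][field] *= random.uniform(0.9, 1.1)  # tweak a detail
        recalled.append(r)                # the broad "summary" field is untouched
    return recalled

day = [{"summary": "Lunch with Anna in town",
        "details": {"gps_lat": 53.3498, "gps_lon": -6.2603, "duration_min": 62}}]
print(forget(day))
```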



Our solution to alleviating ethical concerns regarding the choice to keep a lifelog, for lifelogs held by private individuals for reasons affecting lifestyle, is to maximize user control over lifelogs. This way, we obtain a similar effect without suffering the same loss of functionality. More importantly, we can alleviate the three challenges regarding the choice of keeping a lifelog mentioned above. When users can select and manipulate the information within a lifelog with a minimum of effort and skill, that information becomes less reliable, since important elements could have been changed, deleted or never lifelogged at all. Moreover, these manipulations should be untraceable, as the ability to retrieve the changes would undermine the goal of this feature. As a result, lifelogs would become less useful to society as evidence.19 We hold that this would even lessen the reliability of lifelog information for the lifeloggers themselves. This assumption is plausible because the changes are made over time, can be numerous, and are likely to be buried among other data. For instance, we would have to remember manipulated locations, deleted photos, or modified transcripts by heart over the course of years, among all the other photos, locations and transcripts, in order to persuade ourselves that the information is reliable. Moreover, the ability to control what information is and is not logged means that we can curate what information becomes part of the lifelog, making it more apparent that lifelogs are a patchwork of stored and manipulated elements of a life. Furthermore, control also involves being informed about the functioning of the lifelog: only with sufficient information can we control lifelogs. Therefore we have to take into account the competence of users as well as provide transparency about the functioning of the device. Hence, the three challenges mentioned can be alleviated using this approach.20 A further advantage of this approach is that we do not propose to mimic the imperfections of our own memory,21 which would leave us in the awkward position of needing a technological fix in order to diminish the usability of a lifelog. For these reasons, we recommend that user control over lifelog devices, information, and data be extended as far as possible and be one of the focal points when designing lifelogs for individual consumers for purposes affecting lifestyle.
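By way of contrast with built-in forgetting, the following is a minimal sketch of what maximal user control could look like at the storage level, assuming a simple keyed record store of our own devising: recording can be paused, and edits and deletions are applied in place with no change history, so manipulations leave no trace within the lifelog itself. The class and method names are hypothetical.

```python
class UserControlledLifelog:
    """Sketch of a lifelog store that maximizes user control: records can
    be edited or removed in place, and no audit trail or version history
    is kept, so changes are not retrievable afterwards."""

    def __init__(self):
        self._records = {}            # record_id -> dict of fields
        self._capture_enabled = True

    def set_capture(self, enabled: bool):
        """Let the lifelogger pause or resume recording at any time."""
        self._capture_enabled = enabled

    def log(self, record_id: str, record: dict):
        if self._capture_enabled:
            self._records[record_id] = record

    def edit(self, record_id: str, **changes):
        """Overwrite fields in place; previous values are discarded."""
        self._records[record_id].update(changes)

    def delete(self, record_id: str):
        """Remove the record entirely; nothing is kept for later recovery."""
        self._records.pop(record_id, None)


lifelog = UserControlledLifelog()
lifelog.log("2014-03-01T12:30", {"location": "cafe", "photo": "img_0042.jpg"})
lifelog.edit("2014-03-01T12:30", location="library")   # untraceable correction
lifelog.delete("2014-03-01T12:30")                      # unretrievable deletion
```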

4.3.1. Recommendations


This approach leads to multiple recommendations suitable for lifelogs held by private individuals for reasons affecting lifestyle. Below we identify ten recommendations, which state that we should be able to choose and change the information within a lifelog; be informed about the functioning of the lifelog; be informed appropriately; and be protected from third parties accessing our information. These recommendations follow directly from the issues discussed above and the approach suggested.

As this is not a ‘value in design’ project, our ability to provide detailed recommendations following our approach is curtailed by a lack of information. Exact design recommendations are untenable when the focus of inquiry is a technology in general rather than a specific lifelog (device). In different contexts, the perceived harms and benefits and the (weight of) ethical norms vary. Notwithstanding this limitation, it is possible to provide some general recommendations as rules of thumb. These recommendations are neither sufficient nor final, as particular technologies require more specific recommendations. This list is aimed at the development, use, and regulation of lifelog technology by private consumers for reasons of leisure; other goals of lifelogs need other sets of recommendations. For example, lifelogs for memory therapy are more useful when the lifeloggers have little control, so that their therapists can rely on the information. Moreover, a patient may want to hide the device for fear of stigmatization, as the device may reveal the disease to others. Furthermore, we stress that these recommendations are not meant to thwart innovation. They provide instruments to establish the consumer and societal trust in lifelogs necessary for the adoption of the technology; the recommendations should therefore strengthen innovation in this field. For this reason too, we should be wary of trading off ethical considerations against other interests when developing technology, and must avoid shortcutting ethical concerns for short-term benefits. In addition, the recommendations cannot be met solely by technological fixes. Whilst most of them do require advances in technology, others are directed at other disciplines; one of the challenges concerns the communication of information, which has mostly to do with communication studies and psychology. We consider the development of lifelogs to require a multidisciplinary effort. Finally, the design recommendations are not sufficient to solve the previous issues or address all of them; at most they alleviate the harms. They are certainly not a cure for all the challenges lifelogs pose. Moreover, some recommendations have more significance for particular groups, such as competent adults, than for others, such as children.



  1. Ability to manipulate data: The lifelogger should be able to manipulate the information contained in the lifelog. This can take different forms. One of them is the ability to correct erroneous information: lifelogs not only gather and store data but also process these data into meaningful and retrievable information. This processing can be erroneous, and the lifelogger should have the opportunity to correct it, as the information may spread to systems and platforms which offer less room for correction.22 The lifelogger should also be able to annotate and change information, as they may feel that it provides an unfair image of the self or that its availability is contrary to their conception of appropriateness.

  2. Ability to remove information: The lifelogger should be able to remove data, and these deletions should not be retrievable. The ability to remove data and information is essential for control; part of this is that the lifelogger may come to consider some data or information inappropriate. If the data were indiscriminately kept in a lifelog, on back-ups, or copied to other databases, the lifelogger’s control over the dataset would be severely diminished.

  3. Control over sources of information: The lifelogger should be able to stop recording with a minimum of effort. This chiefly concerns devices gathering information about others. Both Memoto and the SenseCam lacked a pause or stop function in their first design, and the Memoto (2013) still lacks this feature; this illustrates that although some recommendations seem obvious, they may not be. In addition, the lifelogger should be able to take off or stop using a lifelog device without any problem. This may be problematic as lifelog devices are likely to become more embedded in the fabric of everyday life. Consider, for example, the potential evolution of glasses into smart glasses, such as Google Glass. There is a relationship of dependence on these products, as people with vision loss need glasses in order to correct their vision. Because the lifelogger needs the glasses for vision, this favours wearing them whenever a choice has to be made between wearing and not wearing.23 It gives the lifelogger an unfair advantage, as it undermines the weight of others’ objections.

  4. Control over information: The lifelogger should have the opportunity to determine both the data and the information contained in a lifelog to the greatest extent possible. The choice of what information is appropriate for storage is subjective: depending on the lifelogger, some information may be contrary to religious convictions, cause embarrassment, or cause anxiety. As there may be some social pressure to keep a lifelog, we should avoid as much as possible forcing people to choose between losing all lifelog functionality and generating unwanted information. The challenge of determining information is distinct from being able to select data, as the same data can be processed in various ways to create different kinds of information. To offer a simple example, heart rate can be used to obtain information about stress levels as well as about health, exercise and excitement. A heart rate monitor can therefore be deemed both appropriate and inappropriate as a source of data, as information about health might be desired while information about excitement and stress levels is unwanted. In addition, developments that derive information from existing data or new data sources may make novel information retrievable in a lifelog; lifeloggers should be able to choose whether they want this information to appear (a sketch of such consent-gated derivation is given after this list). Furthermore, lifelogs should collect only the minimum of necessary data. Collecting unnecessary data increases the risk of creating undesirable information; minimizing it also minimizes the stakes when ethical concerns such as privacy have to be traded off against functionality. Finally, we should minimize data about others. Depending on the use of the lifelog, we have to assess the importance of gathering information about others and find ways to reduce it, for example by blurring people in photos who have not consented to being recorded.

  5. Informed about data stored: The lifelogger should be informed about the data stored by lifelogs and the information obtained from them. A lifelog should provide optimal transparency about what data and information it contains; any attempt to obscure this transparency may be deemed unethical. Even a summary of what information can potentially be obtained from the lifelogger’s set-up of lifelogging devices may prove beneficial, as it might provide some clarity about what information third parties could obtain. This is again challenging, as the information should be conveyed and presented in a way that is understandable to most or all lifeloggers, and should be accessible without much effort.

  6. Informed about reliability of information: The lifelogger should be informed about any manipulation of information (retrieval) for purposes such as advertising. Control over a lifelog is partly determined by the quality of its information. The ability to refer to information provides justification and meaning for one’s choices; information has, for that reason, a symbolic value that should not be underestimated. Hence, it is unlikely that lifelog data would have no influence on the decisions we make in daily life. The extent to which lifelogs influence beliefs affects the responsibility of developers to explicate the built-in normative standards and to represent information fairly. The manipulation of data for the purpose of making a profit could therefore be unethical.24

  7. Informed about ownership of data: The ownership of data, including activities arising from ownership such as the licensing of lifelog data, should be determined and communicated clearly to the lifelogger to avoid confusion. It should be clear what happens when the lifelogger wants to remove the lifelog altogether or passes away, or when the company changes ownership or files for bankruptcy. In addition, limitations on the purposes for which data may be used should be formulated and communicated clearly to the lifelogger. Comprehensive but understandable ‘terms of use’ and/or ‘legal terms’, similar to plain-language statements for research trials, are paramount but difficult to achieve given the multitude of lifeloggers with different levels of competence; even competent persons can have difficulty providing consent. Finally, we do not exclude a philosophical discussion of the issues of ownership, as current regulation can be ethically insufficient or unethical; compliance with legal regulations is not enough to render lifelogs ethical.

  8. Informing others: The lifelogger should inform others when he/she is recording their information. This can be achieved by the lifelog device indicating that it is lifelogging. Instead of being integrated into the fabric of everyday life to the extent that its functioning goes unnoticed, a device which is lifelogging may have to signal this to bystanders. This can also be important to lifeloggers themselves, as it would leave no doubt about whether they are in fact lifelogging.

  9. Appropriateness of information feedback: The information obtained from a lifelog can be fed back to the lifelogger in various forms. It is important to deliver this information appropriately, which largely depends on the motivation for creating it and on the target audience. Children, for example, are a vulnerable population. Even if we do not consider legally regulating the use of lifelogs by children for purposes of leisure, we could at least consider adjusting the content and the way it is presented. One could think of metadata at various levels of abstraction, using negative descriptions such as ‘no alcohol consumed’ or ‘maintaining a healthy lifestyle’, or positive descriptions such as ‘at school’ or ‘within proximity of the house’, instead of raw photos, GPS coordinates or other data (see the location sketch after this list). The functionality of lifelogs should be adjustable to the competence of the lifelogger.

  10. Security: Data should be stored securely, as security is essential for access to and control over our information and should be considered an integral aspect of the technology (a minimal sketch of encryption at rest follows below). The possibility of third parties, such as governmental institutions or companies, accessing this information should be explained to lifeloggers so that they are aware of the risks of storing their data.
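To make recommendation 4 concrete, the following is a minimal sketch of consent-gated derivation: the same heart-rate stream can yield several kinds of information, and only the kinds the lifelogger has opted into are ever computed. The thresholds, labels, and function names are invented for the example and are not clinical values.

```python
from statistics import mean

# Hypothetical derivations from one data source (heart rate, in bpm).
# Thresholds and labels are illustrative assumptions only.
DERIVATIONS = {
    "exercise":   lambda hr: "active workout" if mean(hr) > 120 else "no workout detected",
    "stress":     lambda hr: "elevated stress" if mean(hr) > 95 else "calm",
    "excitement": lambda hr: "excited" if max(hr) - min(hr) > 40 else "steady",
}

def derive_information(heart_rate_samples, consented_kinds):
    """Compute only the kinds of information the lifelogger opted into;
    unconsented derivations are never produced, which also minimizes the
    information that has to be retained."""
    return {kind: DERIVATIONS[kind](heart_rate_samples)
            for kind in consented_kinds if kind in DERIVATIONS}

samples = [118, 126, 135, 142, 138, 129]
print(derive_information(samples, consented_kinds={"exercise"}))
# -> {'exercise': 'active workout'}; stress and excitement are never derived
```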
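Recommendation 9 can be illustrated in a similar way. The sketch below maps raw GPS coordinates to the kind of coarse, audience-appropriate descriptions mentioned above (‘at school’, ‘within proximity of the house’) instead of exposing the raw data; the coordinates, place names, and distance threshold are invented for illustration.

```python
from math import hypot

# Hypothetical known places (latitude, longitude); purely illustrative.
HOME   = (53.3382, -6.2591)
SCHOOL = (53.3430, -6.2540)

def _roughly_near(position, place, threshold=0.005):
    """Crude closeness test on raw coordinates; a real system would use
    a proper geodesic distance."""
    return hypot(position[0] - place[0], position[1] - place[1]) < threshold

def describe_location(position, audience="child"):
    """Feed back an abstract description instead of raw GPS data,
    adjusted to the target audience."""
    if audience == "child":
        if _roughly_near(position, SCHOOL):
            return "at school"
        if _roughly_near(position, HOME, threshold=0.01):
            return "within proximity of the house"
        return "away from usual places"
    return position   # a competent adult may opt to see the raw coordinates

print(describe_location((53.3428, -6.2542)))        # -> 'at school'
print(describe_location((53.30, -6.20), "adult"))   # raw coordinates
```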
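Finally, recommendation 10 admits a simple technical illustration: a minimal sketch of encryption at rest using the third-party Python cryptography package (its Fernet interface), assuming the key itself remains under the lifelogger’s control; key management, access control, and threat modelling are deliberately left out.

```python
from cryptography.fernet import Fernet  # third-party package: pip install cryptography

# The key must stay under the lifelogger's control; storing it next to the
# data would defeat the purpose of encrypting at rest.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b'{"time": "2014-03-01T12:30", "location": "cafe", "heart_rate": 84}'

encrypted = cipher.encrypt(record)          # what actually gets written to disk
with open("lifelog_record.bin", "wb") as f:
    f.write(encrypted)

with open("lifelog_record.bin", "rb") as f:
    decrypted = cipher.decrypt(f.read())    # only possible with the key
assert decrypted == record
```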

5. Conclusion

The history of lifelogs so far shows a clear corporate and governmental interest in the technology, and there appears to be consumer interest as well. Although the technology is still at an early stage of development, considerable ethical significance has been attached to the development of lifelogs.

First, we provided insight into the current ethical debate on lifelogs by identifying challenges and opportunities. The terminology of challenges and opportunities was chosen because it clearly distinguishes areas of opportunity and of need. By identifying the challenges and opportunities in the debate so far, we hope to inspire others to identify challenges and opportunities that have thus far barely been debated. The identification of challenges and opportunities provides an instrument that can aid the further development of this technology. Our prediction is that some of these newly identified challenges and opportunities will partly arise from blind spots in the current debate regarding users and motivations. Whilst the current debate focuses mainly on lifelogs held by individuals, lifelogs held by governmental institutions and corporations pose idiosyncratic ethical concerns as well. The recent history of lifelogs creates an urgent need to scrutinize the consequences of those entities holding them.

Second, we identified various goals of lifeloggers. By clarifying the concept of lifelogs we provided the conceptual precision necessary to scrutinize the ethical debate we presented. This revealed a wide variety of potential domains of application. The current ethical debate does not distinguish sufficiently between the various domains to which lifelogs can be applied, which is problematic because the challenges and opportunities, and their weight, depend on the domain.

Third, we addressed areas of ethical interest which, according to our research, have yet to be further developed. Despite the rich academic debate on lifelogs for private individuals, our discussion identified some previously untouched challenges concerning the choice of keeping a lifelog, with regard to lifelogs held by private individuals for reasons affecting lifestyle, and provided recommendations to alleviate these concerns.

Fourth, we presented a new approach to alleviating ethical concerns regarding the choice of keeping lifelogs, which offers the benefits of the built-in forgetting proposed by Dodge and Kitchin (2007) while avoiding the objections brought forward against it. Moreover, we advanced ten recommendations that follow from this approach and can be used as abstract rules of thumb to guide development.

An ethically desirable maturation of the technology requires meeting further challenges and reaping more opportunities. We suggest that further research should identify more specific avenues toward this goal by identifying clearly the uses of lifelogs.

Acknowledgement:

This research was partly supported by the Science Foundation Ireland under grant 07/CE/I1147.



Literature:

Aarts, E. & Marzano, S. 2003. The New Everyday: Views on Ambient Intelligence. Rotterdam: 010 Publishers.

Allan, A. & Warden, P. 2011. Got an iPhone or 3G iPad? Apple is recording your moves [Online]. O’Reilly Radar. Available from: http://radar.oreilly.com/2011/04/apple-location-tracking.html [Accessed on: 15-01-2013].

Allen, A.L. 2008. Dredging up the past: Lifelogging, Memory, and Surveillance. The University of Chicago Law Review, 75(1), pp. 47-74.

ANP. 2009. Minister: excuus aan slachtoffer identiteitsfraude [Online]. Nrc.nl. Available from: http://vorige.nrc.nl/binnenland/article2368828.ece/Minister_excuus_aan_slachtoffer_identiteitsfraude [Accessed on: 08-01-2013]

Bailey, J. & Kerr, I.R. 2007. Seizing Control? The Experience Capture Experiments of Ringley & Mann. Ethics and Information Technology, 9(2), pp. 129-139.

Bannon, L. 2006. Forgetting as a Feature not a Bug. CoDesign. 2(1), pp. 3-15.

Bannon, L. 2011. Reimagining HCI: Toward a More Human-Centered Perspective. Interactions, 18(4), pp. 50-57.

Bell, G. & Gemmell, J. 2009. Total Recall: How the E-Memory Revolution Will Change Everything. New York: Dutton.

Bell, G. & Gemmell, J. 2007. A Digital Life. Scientific American. Available from: http://www.scienceandsociety.org/web/Library_files/A.Digital.Life.pdf. [Accessed on 28-05-2012].

Bush, V. 1945. As we may think. The Atlantic Monthly, 176(1), pp. 101-108.

Brin, D. 1998. The Transparent Society: Will Technology Force Us to Choose Between Privacy and Freedom? Cambridge: Perseus Books Group.

Byrne, D., Kelly, L. & Jones, G.J.F. 2010. Multiple Multimodal Mobile Devices: Lessons Learned from Engineering Lifelog Solutions. In: Alencar, P. & Cowan, D. 2012. Handbook of Research on Mobile Software Engineering: Design, Implementation and Emergent Applications. IGI Publishing.

Chen, B.X. 2011. Creepy Bug Gives Some iPhones Unwanted FaceTime [Online]. Wired.com. Available from: http://www.wired.com/gadgetlab/2011/04/creepy-iphone-bug/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+wired%2Findex+%28Wired%3A+Index+3+%28Top+Stories+2%29%29 [Accessed on: 23-05-2012].

Chen, Y. & Jones, G.J.F. 2012. What do people want from their lifelogs? 6th Irish Human Computer Interaction Conference.

Cheng, W.C., Golubchik, L. & Kay, D.G. 2004. Total recall: are privacy changes inevitable? Proceedings of the 1st ACM Workshop on Continuous Archival and Retrieval of Personal Experiences, pp. 86-92.

Clowes, R.W. 2012. Hybrid Memory, Cognitive Technology and Self. The proceedings of the AISB and IACAP World Congress 2012.

Clowes, R.W. 2013. The Cognitive Integration of E-Memory. Review of Philosophy and Psychology, 4(1), pp 107-133.

Curry, M.R. 2007. Being there then: Ubiquitous computing and the anxiety of reference. International Review of Information Ethics, 8(8), pp. 13-19.

DARPA. Advanced Soldier Sensor Information Systems and Technology (ASSIST) [Online]. Available from: http://www.darpa.mil/Our_Work/I2O/Programs/Advanced_Soldier_Sensor_Information_Systems_and_Technology_(ASSIST).aspx [Accessed on: 30-01-2012].

DARPA/IPTO. 2003. BAA # 03-30 LifeLog Proposer Information Pamphlet [Online]. Available from: http://realnews247.com/lifelog.htm [Accessed on: 30-01-2012].

DARPA. 2003. 'Terrorism Information Awareness' (TIA) Program formerly known as 'Total Information Awareness' [Online]. Available from: http://www.iwar.org.uk/news-archive/tia/total-information-awareness.htm [Accessed on: 30-01-2012].

Diaz, J. 2008. Huge iPhone Security Flaw Puts All Private Information at Risk [Online]. Gizmodo. Available from: http://gizmodo.com/5042332/huge-security-iphone-flaw-puts-all-private-information-at-risk [Accessed on: 30-01-2012].

Dib, L. 2008. Memory as Concept in the Design of Digital Recording Devices. Altérités, 5(1), pp. 38-53.

Dib, L. 2012. The Forgetting Dis-ease: Making Time Matter. differences: A Journal of Feminist Cultural Studies, 23(3), pp. 43-73.

Dijck, J. van. 2005. From shoebox to performative agent: the computer as personal memory machine. New Media & Society, 7(3), pp. 311-332.

Dijck, J. van. 2012. Mediated Memories in the Digital Age. Stanford: Stanford University Press.
