Trust in Robots - Trusting Robots



1. Autor, D. H. (2015): "Why Are There Still So Many Jobs? The History and Future of Workplace Automation", Journal of Economic Perspectives, 29(3), pp. 3-30.
2. Bauer, W. (Hrsg.), Bender, M., Braun, M., Rally, P., Scholtz, O. (2016): Lightweight Robots in Manual Assembly - Best to Start Simply!, Fraunhofer IAO, Stuttgart.
3. Bischof, B., Glück, T., and Kugi, A. (2017): Combined Path Following and Compliance Control for Fully Actuated Rigid Body Systems in 3D Space, IEEE Transactions on Control Systems Technology, 25(5), pp. 4806-4811.
4. Bösch, V. et al. (2017): Thesenpapier Arbeitsorganisation im Zeitalter der Digitalisierung [Position paper on work organization in the age of digitalization], Verein Industrie 4.0 Österreich.
5. Buchner, R., Wurhofer, D., Weiss, A., and Tscheligi, M. (2013). Robots in Time: How User Experience in Human-Robot Interaction Changes over Time. In ICSR 2013: Proceedings of the 5th International Conference on Social Robotics, pp. 138-147.
6. Davenport, T. H., & Kirby, J. (2016). Just How Smart Are Smart Machines?. MIT Sloan Management Review, 57(3), 21.
7. Desai, M., Kaniarasu, P., Medvedev, M., Steinfeld, A. and Yanco, H. 2013. Impact of robot failures and feedback on real-time trust. The 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI '13), 251-258.
8. Duffy, B. R. (2003). Anthropomorphism and the social robot. Robotics and Autonomous Systems, 42(3), 177-190.
9. Du Toit and Burdick (2012): Robot Motion Planning in Dynamic, Uncertain Environments, IEEE Transactions on Robotics, 28(1), pp. 101-115.
10. Engen, V., Pickering, J. B., & Walland, P. (2016). Machine agency in human-machine networks; impacts and trust implications. In International Conference on Human-Computer Interaction (pp. 96-106). Springer International Publishing.
11. Frey, C. B. and Osborne, M. A. (2013): The Future of Employment: How Susceptible are Jobs to Computerization?, University of Oxford.
12. Gray, K., & Wegner, D. M. (2012). Feeling robots and human zombies: Mind perception and the uncanny valley. Cognition, 125(1), 125-130.
13. Hancock, P. A., Billings, D. R., Schaefer, K. E., Chen, J. Y. C., de Visser, E. J. and Parasuraman, R. 2011. A Meta-Analysis of Factors Affecting Trust in Human-Robot Interaction. Human Factors, 53, 5, 517-527.
14. Hawes, N., Fäulhammer, T., Zillich, M., Vincze, M., Aldoma Buchaca, A. et al. 2016: "The STRANDS Project: Long-Term Autonomy in Everyday Environments"; IEEE Robotics & Automation Magazine, 23, 2016.
15. Hoff, K. A. and Bashir, M. 2015. Trust in Automation: Integrating Empirical Evidence on Factors That Influence Trust. Human Factors, 57, 3, 407-434.
16. Huber, A., Weiss, A., Rauhala, M. (2016). The Ethical Risk of Attachment: How to Identify, Investigate and Predict Potential Ethical Risk in the Development of Social Companion Robots. In HRI 2016: Proceedings of the 11th International Conference on Human-Robot Interaction (altHRI track).
17. Latour, B. (1996). On actor-network theory: A few clarifications. Soziale Welt, 369-381.
18. Leite, I., Martinho, C., and Paiva, A. (2013). "Social robots for long-term interaction: A survey," International Journal of Social Robotics.
19. Lohani, M., Stokes, C., McCoy, M., Bailey, C. A. and Rivers, S. E. 2016. Social Interaction Moderates Human-Robot Trust-Reliance Relationship and Improves Stress Coping. The 11th ACM/IEEE International Conference on Human Robot Interaction (HRI '16), 471-472.
20. Lorenz, T., Weiss, A., and Hirche, S. (2015). Synchrony and Reciprocity: Key Mechanisms for Social Companion Robots in Therapy and Care. International Journal of Social Robotics.
21. Luhmann, N. 1989. Vertrauen: Ein Mechanismus der Reduktion sozialer Komplexität [Trust: A mechanism for the reduction of social complexity] (3. Auflage). Stuttgart: Enke.
22. McColl, D. and Nejat, G. 2014. Recognizing Emotional Body Language Displayed by a Human-like Social Robot. International Journal of Social Robotics, 6, 2, 261-280.
23. McColl, D., Hong, A., Hatakeyama, N., Nejat, G. and Benhabib, B. 2016. A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI. Journal of Intelligent & Robotic Systems, 82, 1, 101-133.
24. McColl, D., Jiang, C., Nejat, G. 2017. Classifying a Person's Degree of Accessibility from Natural Body Language During Social Human-Robot Interaction. IEEE Transactions on Cybernetics, 47, 2, 524-538.
25. Muir, B. M. 1987. Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 5, 527-539.
26. Nagl, W., Titelbach, G., Valkova, K. (2017): Digitalisierung der Arbeit: Substituierbarkeit von Berufen im Zuge der Automatisierung durch Industrie 4.0 [Digitalization of work: Substitutability of occupations in the course of automation through Industrie 4.0]. Projektbericht, IHS, Wien.
27. Orlikowski, W. J. (2000). Using technology and constituting structures: A practice lens for studying technology in organizations. In Resources, co-evolution and artifacts (pp. 255-305). Springer, London.
28. Orlikowski, W. J., & Scott, S. V. (2008). Sociomateriality: challenging the separation of technology, work and organization. Academy of Management Annals, 2(1), 433-474.
29. Otterbacher, J., & Talias, M. (2017). S/he's too Warm/Agentic!: The Influence of Gender on Uncanny Reactions to Robots. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction (pp. 214-223). ACM.
30. Rammert, W. (2012). Distributed agency and advanced technology. Or how to analyze the constellation of collective inter-agency. In Passoth, J. H., Peuker, B., & Schillmeier, M. (Eds.). Agency without actors?: new approaches to collective action. 58. Routledge.
31. Robinette, P., Li, W., Allen, R., Howard, A. M. and Wagner, A. R. 2016. Overtrust of Robots in Emergency Evacuation Scenarios. The 11th ACM/IEEE International Conference on Human Robot Interaction (HRI '16), 101-108.
32. Rose, J., & Truex III, D. (2000). Machine agency as perceived autonomy: an action perspective. In Organizational and social perspectives on information technology (pp. 371-388). Springer US.
33. Rose, J., & Jones, M. (2005). The double dance of agency: a socio-theoretic account of how machines and humans interact. Systems, Signs & Actions, 1(1), 19-37.
34. Salem, M., Lakatos, G., Amirabdollahian, F., Dautenhahn, K. 2015: Would You Trust a (Faulty) Robot? Effects of Error, Task Type and Personality on Human-Robot Cooperation and Trust. In Proceedings of the 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2015).
35. Schlund, S., Schnabel, U., Stahlecker, T., Schwarz-Kocher, M. (2017): Arbeit in der Industrie 4.0 in Baden-Württemberg [Work in Industrie 4.0 in Baden-Württemberg], Allianz Industrie 4.0 Baden-Württemberg.
36. Spath, D. (Hrsg.), Ganschar, O., Gerlach, S., Hämmerle, M., Krause, T., Schlund, S. (2013): Produktionsarbeit der Zukunft - Industrie 4.0 [Production work of the future - Industrie 4.0], Fraunhofer IAO, Stuttgart.
37. Stadler, S., Weiss, A., and Tscheligi, M. (2014). I Trained this Robot: The Impact of Pre-Experience and Execution Behavior on Robot Teachers. In RO-MAN 2014: Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication.
38. Sung, J. Y., Guo, L., Grinter, R., & Christensen, H. (2007). "My Roomba is Rambo": intimate home appliances. UbiComp 2007: Ubiquitous Computing, 145-162.
39. Thunberg, S., Thellman, S., & Ziemke, T. (2017). Don't Judge a Book by its Cover: A Study of the Social Acceptance of NAO vs. Pepper. In Proceedings of the 5th International Conference on Human Agent Interaction (pp. 443-446). ACM.
40. Van de Perre, G., Cao, H.-L., De Beir, A., Gomez Esteban, P., Lefeber, D. and Vanderborght, B. 2017. Generic Method for Generating Blended Gestures and Affective Functional Behaviors for Social Robots. Autonomous Robots, First Online (July 2017).
41. Vincze, M., Fischinger, D., Bajones, M., Wolf, D., Suchi, M., Lammer, L., Weiss, A., Pripfl, J., Körtner, T., Gisinger, C. 2016: "What Older Adults Would Like a Robot to Do in Their Homes - First Results from a User Study in the Homes of Users"; 47th International Symposium on Robotics (ISR), Munich, 2016.
42. Vincze, M. 2017. "Vertrauenswürdige Roboter" [Trustworthy robots]; Elektrotechnik und Informationstechnik (e&i), 134(6).
43. Wang, N., Pynadath, D. V. and Hill, S. G. 2016. Trust Calibration within a Human-Robot Team: Comparing Automatically Generated Explanations. The 11th ACM/IEEE International Conference on Human Robot Interaction (HRI '16), 109-116.
44. Yang, X. J., Wickens, C. D. and Hölttä-Otto, K. 2016. How users adjust trust in automation: Contrast effect and hindsight bias. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 60, 1, 196-200.