Michael Shulman's Shared Notes


@Dang.Liu2021

  • Takeaway: We're more "ambivalent" towards highly intelligent robots, and Americans more so than Chinese. But ambivalent means holding both positive and negative beliefs simultaneously - so Americans both like and dislike robots more than Chinese. So basically, "no fixed answer exists as to whether mindful AI robots are more favorable than mindless AI robots, and whether robots are more favorable in East Asian versus Western cultures" (p. 6).
  • Metadata::

  • Author(s):: Jianning Dang, Li Liu
  • Title:: Robots are friends as well as foes: Ambivalent attitudes toward mindful and mindless AI robots in the United States and China
  • Type:: Article
  • Publication:: Computers in Human Behavior
  • Abstract::
    • In light of the ongoing and rapid development of innovative technologies, two intriguing issues arise: do people have more positive or more negative attitudes toward robots with high (versus low) mental capabilities, and do attitudes toward robots differ between Western and East Asian cultures? Past work on these topics has produced contradictory results. Inspired by the perspective that attitudes are ambivalent rather than bipolar, we argue that these controversial findings stem from people's ambivalent attitudes toward robots. To test the assumption that ambivalent attitudes toward robots differ by type of robots and by cultural background, we conducted an experimental study. By manipulating the level of robot mind and recruiting both American and Chinese participants, we examined how robot mind and culture influence ambivalent attitudes toward robots. We simultaneously measured participants' perceptions of robots as “ally” or “enemy”. The results revealed that robots with high (versus low) mental abilities elicited more ambivalent attitudes and that American participants reported more ambivalence toward robots than Chinese participants. These findings enhance our understanding of human–robot interaction and provide guidance for modulating people's attitudes toward robots.
  • Topics::
  • Date:: 2021
  • Date added:: November 12th, 2020
  • Citekey:: Dang.Liu2021
  • Zotero links:: Local library, Web library, Dang, Liu (2021) Computer....pdf 🔗
  • URL:: http://www.sciencedirect.com/science/article/pii/S0747563220303599
  • Tags:: Ambivalent attitudes, artificial intelligence, Culture, Human–robot interaction, Mind, ZoteroImport
  • Includes brief overview of attitudes towards robots.
  • Whoa - the lit review says that Western and Eastern cultures differ on how unique they consider humans to be - so Westerners are more likely to highlight differences between humans and robots, and less likely to accept them. But on the other hand, some studies show that US participants have less negative attitudes towards robots than Asian participants. (Other studies find no difference, so these results aren't definite.) The authors critique that these studies tend to measure polarized attitudes, and suggest the mixed results arise because people are just ambivalent.
  • "Attitudinal ambivalence" is when you have both positive and negative attitudes towards something at the same time (Conner & Sparks, 2002).
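  • The notes don't show how ambivalence was scored. A standard measure in this literature - sketched here as an assumption, not necessarily the exact index Dang & Liu used - is the Griffin similarity-intensity formula, which is high only when positive and negative ratings are both strong and similar in magnitude:

```python
def griffin_ambivalence(positive, negative):
    """Griffin similarity-intensity index of attitudinal ambivalence.

    Assumes `positive` and `negative` are separate ratings of the same
    object (e.g., each on a 1-7 scale). The score rises with the
    intensity of the two components and falls with the gap between them.
    """
    return (positive + negative) / 2 - abs(positive - negative)

# Strong mixed feelings -> high ambivalence
print(griffin_ambivalence(6, 6))  # -> 6.0
# Clearly positive, barely negative -> low (negative) ambivalence
print(griffin_ambivalence(7, 1))  # -> -2.0
```

    This captures the paper's point: someone who both likes and dislikes robots scores high, which a single bipolar like-dislike scale would miss.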
  • Method - gave them descriptions of mindful or mindless robots, asked them to respond to a survey with 4 questions about them. (Eh.)
  • Highlights:
    • Dang, Liu (2021) Computers in Human Behavior- Robots are friends as well as foes
      • Page 1:
      • The results revealed that robots with high (versus low) mental abilities elicited more ambivalent attitudes and that American participants reported more ambivalence toward robots than Chinese participants.
      • Page 2:
      • ambivalent attitudes that underly people's hesitance to use technology agents (Stein, Newell, Wagner, & Galliers, 2015).
      • Page 2:
      • AI refers to "a growing resource of interactive, autonomous, self-learning agency, which enables computational artifacts to perform tasks that otherwise would require human intelligence to be executed successfully" (Taddeo & Floridi, 2018, p. 751).
      • Page 2:
      • Robots are machines that are programmed by AI and operate semi or fully autonomously to perform tasks done traditionally by humans (Clarke, 2019).
      • Page 2:
      • cognitive evaluations of robots have been operationalized as warmth/competitiveness and competence judgments about or perceived believability of robots (Bergmann, Eyssel, & Kopp, 2012; Demeure, 2011; Demeure, Niewiadomski, & Pelachaud, 2010; Fraune et al., 2017).
      • Page 2:
      • Some researchers have found that more negative attitudes are associated with robots that are perceived to have a higher level of mind. For example, people perceive human-like robots as having the ability to feel, which engenders a greater sense of eeriness (Gray & Wegner, 2012; MacDorman & Chattopadhyay, 2016).
      • Page 2:
      • Popular opinion holds that Western and East Asian cultures differ in their views about what is human, which then determines their distinct attitudes toward robots (Geraci, 2006; MacDorman, Vasudevan, & Ho, 2008). Specifically, people in Western cultures hold static beliefs that regard humans as unique and are more likely to highlight the distinction between humans and non-human entities; conversely, people in East Asian cultures have a dynamic perspective that views all things as having a spirit and are less likely to regard humans as particularly special as is more common in Western cultures (Kaplan, 2004; Kitano, 2007).
      • Page 2:
      • However, some cross-cultural studies have challenged this popular belief, showing that US participants have fewer negative attitudes toward robots than Japanese and Chinese participants (Bartneck, Nomura, Kanda, Suzuki, & Kennsuke, 2005; Bartneck, Suzuki, Kanda, & Nomura, 2007).
      • Page 2:
      • an individual can have positive and negative attitudes simultaneously toward an object, which is known as attitudinal ambivalence (Conner & Sparks, 2002).
      • Page 4:
      • Mindful AI robots were described as follows: "Mindful AI robots can feel the outside world like humans, experience various emotions in social interaction, and express their own experiences and emotions. They can independently plan ahead, use strategies to solve problems, and use natural language for communication." Mindless AI robots were depicted as follows: "Mindless AI robots have limited ability to feel the outside world and they cannot experience and express various emotions in social interaction. They execute actions according to human instructions and have a limited level of autonomy. They cannot communicate with natural language." This paradigm was demonstrated as an effective manipulation of robot mind (Gray & Wegner, 2012; Waytz et al., 2014).
      • Page 4:
      • Responses to images of robots were measured by four adapted items
      • Page 6:
      • The results suggest that people strongly view mindful robots as both allies and enemies simultaneously, and that Americans simultaneously like and dislike robots more than Chinese people. Stated differently, no fixed answers exist as to whether mindful AI robots are more favorable than mindless AI robots, and whether robots are more favorable in East Asian versus Western cultures.
      • Page 7:
      • Specifically, human-like characteristics (e.g., a human-like appearance and expressive faces) of robots blur the difference between them and humans and elicit more ambivalent attitudes. For example, compared with Korean participants, US participants seem to prefer machine-like robots to human-like robots (Lee & Šabanović, 2014).