eprintid: 47864
rev_number: 10
eprint_status: archive
userid: 1482
importid: 105
dir: disk0/00/04/78/64
datestamp: 2023-06-14 12:55:51
lastmod: 2024-04-16 09:35:24
status_changed: 2023-07-18 14:04:42
type: article
metadata_visibility: show
creators_name: Makovi, Kinga
creators_name: Sargsyan, Anahit
creators_name: Li, Wendi
creators_name: Bonnefon, Jean-François
creators_name: Rahwan, Talal
creators_idrefppn: 076374645
creators_halaffid: 1002422 ; 441569
title: Trust within human-machine collectives depends on the perceived consensus about cooperative norms
ispublished: pub
subjects: subjects_ECO
abstract: With the progress of artificial intelligence and the emergence of global online communities, humans and machines increasingly participate in mixed collectives in which they can help or hinder each other. Human societies have had thousands of years to consolidate the social norms that promote cooperation, but mixed collectives often struggle to articulate the norms that hold when humans coexist with machines. In five studies involving 7,917 individuals, we document the way people treat machines differently than humans in a stylized society of beneficiaries, helpers, punishers, and trustors. We show that a different amount of trust is gained by helpers and punishers when they follow norms over not doing so. We also demonstrate that the trust gain of norm-followers is associated with trustors' assessment of the consensual nature of cooperative norms over helping and punishing. Lastly, we establish that, under certain conditions, informing trustors about the norm consensus over helping tends to decrease the differential treatment of both machines and people interacting with them. These results allow us to anticipate how humans may develop cooperative norms for human-machine collectives, specifically by relying on already extant norms in human-only groups. We also demonstrate that this evolution may be accelerated by making people aware of their emerging consensus.
date: 2023-05
date_type: published
publisher: Springer Nature
id_number: 10.1038/s41467-023-38592-5
official_url: http://tse-fr.eu/pub/128118
faculty: tse
divisions: tse
language: en
has_fulltext: FALSE
doi: 10.1038/s41467-023-38592-5
view_date_year: 2023
full_text_status: none
publication: Nature Communications
volume: vol. 14
number: 3108
place_of_pub: London
refereed: TRUE
issn: 2041-1723
oai_identifier: oai:tse-fr.eu:128118
harvester_local_overwrite: publish_to_hal
harvester_local_overwrite: pending
harvester_local_overwrite: creators_idrefppn
harvester_local_overwrite: publisher
harvester_local_overwrite: place_of_pub
harvester_local_overwrite: volume
harvester_local_overwrite: creators_halaffid
oai_lastmod: 2024-04-15T14:46:05Z
oai_set: tse
site: ut1
publish_to_hal: TRUE
citation: Makovi, Kinga, Sargsyan, Anahit, Li, Wendi, Bonnefon, Jean-François and Rahwan, Talal (2023) Trust within human-machine collectives depends on the perceived consensus about cooperative norms. Nature Communications, vol. 14 (3108).