
      Do conversations end when people want them to?

Proceedings of the National Academy of Sciences

Read this article at: ScienceOpen | Publisher | PubMed

          Abstract

Do conversations end when people want them to? Surprisingly, behavioral science provides no answer to this fundamental question about the most ubiquitous of all human social activities. In two studies of 932 conversations, we asked conversants to report when they had wanted a conversation to end and to estimate when their partner (who was an intimate in Study 1 and a stranger in Study 2) had wanted it to end. Results showed that conversations almost never ended when both conversants wanted them to and rarely ended when even one conversant wanted them to, and that the average discrepancy between desired and actual durations was roughly half the duration of the conversation. Conversants had little idea when their partners wanted to end and underestimated how discrepant their partners’ desires were from their own. These studies suggest that ending conversations is a classic “coordination problem” that humans are unable to solve because doing so requires information that they normally keep from each other. As a result, most conversations appear to end when no one wants them to.


Most cited references (5)


          Toward a mechanistic psychology of dialogue.

          Traditional mechanistic accounts of language processing derive almost entirely from the study of monologue. Yet, the most natural and basic form of language use is dialogue. As a result, these accounts may only offer limited theories of the mechanisms that underlie language processing in general. We propose a mechanistic account of dialogue, the interactive alignment account, and use it to derive a number of predictions about basic language processes. The account assumes that, in dialogue, the linguistic representations employed by the interlocutors become aligned at many levels, as a result of a largely automatic process. This process greatly simplifies production and comprehension in dialogue. After considering the evidence for the interactive alignment model, we concentrate on three aspects of processing that follow from it. It makes use of a simple interactive inference mechanism, enables the development of local dialogue routines that greatly simplify language processing, and explains the origins of self-monitoring in production. We consider the need for a grammatical framework that is designed to deal with language in dialogue rather than monologue, and discuss a range of implications of the account.

            Human Self as Information Agent: Functioning in a Social Environment Based on Shared Meanings

            A neglected aspect of human selfhood is that people are information agents. That is, much human social activity involves communicating and discussing information. This occurs in the context of incompletely shared information—but also a group's store of collective knowledge and shared understanding. This article elucidates a preliminary theory of self as information agent, proposing that human evolution instilled both abilities and motivations for the various requisite functions. These basic functions include (a) seeking and acquiring information, (b) communicating one's thoughts to others, (c) circulating information through the group, (d) operating on information to improve it, such as by correcting mistakes, and (e) constructing a shared understanding of reality. Sophisticated information agents exhibit additional features, such as sometimes selectively withholding information or disseminating false information for self-serving reasons, cultivating a reputation as a credible source of information, and cooperating with others to shape the shared worldview in a way that favors one's subgroup. Meaningful information is thus more than a resource for individual action: It also provides the context, medium, and content within which the individual self interacts with its social environment.

              When chatting about negative experiences helps—and when it hurts: Distinguishing adaptive versus maladaptive social support in computer-mediated communication.


                Author and article information

Journal
Proceedings of the National Academy of Sciences (Proc Natl Acad Sci USA)
ISSN: 0027-8424 (print), 1091-6490 (electronic)
March 01 2021
March 09 2021
Volume: 118
Issue: 10
Article number: e2011809118
Article
DOI: 10.1073/pnas.2011809118
PMID: 33649209
                © 2021

                Free to read

                https://www.pnas.org/site/aboutpnas/licenses.xhtml

