Study Finds Strong Links Between Trust and Social Media Use

A recent study finds a strong correlation between the extent to which users trust Facebook and the intensity of their Facebook use. The study also identifies factors that contribute to that trust.

“We looked at both trust and distrust, testing for them separately,” says Yang Cheng, an assistant professor of communication at North Carolina State University and first author of the study.

Broadly speaking, trust is when you expect a person or entity to behave in a positive way, whereas distrust is when you expect a person or entity to behave in a negative way. But in the context of this study, it’s also fair to think of trust as more cognitive in nature (the way you think about an entity) and distrust as more intuitive (the way you feel about an entity).

To begin addressing issues of trust and social media use, the researchers conducted a survey of 661 social media users in the United States.

Survey questions addressed a variety of issues, including:

  • The extent to which study participants trust Facebook;
  • The extent to which they distrust Facebook;
  • Information trustworthiness, or the extent to which they think items posted on Facebook are true;
  • Information elaboration, or the extent to which they think about the consequences of misinformation on Facebook;
  • Self-efficacy, or how good participants think they are at avoiding misinformation;
  • Prescriptive expectancy, or the extent to which they think Facebook should be pro-active about addressing misinformation; and
  • Intensity of Facebook use, or the extent to which they use and rely on Facebook.

The researchers found that trust was very strongly correlated with the intensity of Facebook use. Distrust, however, was not.
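
To make the statistical idea concrete, here is a minimal sketch, not drawn from the study itself, of how a bivariate Pearson correlation between survey scales could be computed in Python. The column names and scores below are hypothetical, and the paper’s own analysis, which models several antecedents of trust and distrust together (per the abstract), is more involved than simple pairwise correlations.

    # Minimal illustrative sketch, not the authors' code. Column names and
    # response values are hypothetical; real scales would average multiple
    # Likert items per respondent.
    import pandas as pd
    from scipy.stats import pearsonr

    df = pd.DataFrame({
        "trust":              [4.1, 3.5, 2.2, 4.8, 3.0, 3.9],
        "distrust":           [1.9, 2.4, 3.8, 1.2, 2.9, 2.1],
        "self_efficacy":      [4.5, 3.2, 2.0, 4.9, 3.1, 4.0],
        "facebook_intensity": [4.3, 3.6, 2.1, 4.7, 2.8, 4.1],
    })

    # Pairwise Pearson correlations with intensity of Facebook use.
    for col in ["trust", "distrust", "self_efficacy"]:
        r, p = pearsonr(df[col], df["facebook_intensity"])
        print(f"{col} vs. intensity: r = {r:.2f}, p = {p:.3f}")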

“This is an important lesson for communicators: you need to cultivate trust,” Cheng says.

But what builds trust?

The characteristic most strongly correlated with trust was self-efficacy.

“In other words, the better you think you are at sorting misinformation from accurate information, the more likely you are to trust Facebook,” Cheng says. “And the more you trust Facebook, the more likely you are to be a high-intensity Facebook user. Unfortunately, thinking that you are better than other people at identifying misinformation does not mean you are actually better than other people at identifying misinformation.”

The other variable that was positively correlated with trust in Facebook was information trustworthiness, or the extent to which people thought posts on Facebook were true.

“While our work highlights the importance of building trust, it also highlights the challenge this poses for a company like Facebook,” Cheng says. “Facebook can promote media literacy, but actual media literacy is not necessarily related to self-efficacy. And Facebook has not shown that it can ensure the posts on its platform are true. If people don’t trust Facebook, they’re less likely to spend as much time there, or to engage as fully with content on the site. And it remains unclear how much control Facebook has over the variables that contribute to trust in the platform.”

The other variables the researchers examined were both negatively correlated with trust in Facebook. In other words, the more people thought about the consequences of misinformation shared online, the less they trusted Facebook. And the more people thought Facebook should proactively work to limit misinformation, the less they trusted Facebook.

Again, neither of those variables is inherently within Facebook’s control. However, one could hypothesize that increased efforts by Facebook to reduce misinformation on its platform could weaken the negative correlation between those variables and trust in Facebook.

The paper, “Encountering Misinformation Online: Antecedents of Trust and Distrust and Their Impact on the Intensity of Facebook Use,” is published in Online Information Review.

-shipman-

Note to Editors: The study abstract follows.

“Encountering Misinformation Online: Antecedents of Trust and Distrust and Their Impact on the Intensity of Facebook Use”

Authors: Yang Cheng, North Carolina State University; Zifei Fay Chen, University of San Francisco

Published: Dec. 4, 2020, Online Information Review

DOI: 10.1108/OIR-04-2020-0130

Abstract: This study focused on the impact of misinformation on social networking sites. Through theorizing and integrating literature from interdisciplinary fields such as information behavior, communication, and relationship management, this study explored how misinformation on Facebook influences users’ trust, distrust, and intensity of Facebook use. Based on data from an online survey with 661 participants in the U.S., results showed that information trustworthiness and elaboration, users’ self-efficacy of detecting misinformation, and prescriptive expectancy of the social media platform significantly predicted both trust and distrust toward Facebook, which in turn jointly influenced users’ intensity of using this information system. Theoretical and practical implications were discussed.
