Open access peer-reviewed chapter

Interpersonal Trust within Social Media Applications: A Conceptual Literature Review

Written By

Kevin Koidl and Kristina Kapanova

Submitted: 02 August 2021 Reviewed: 25 February 2022 Published: 03 May 2022

DOI: 10.5772/intechopen.103931

From the Edited Volume

The Psychology of Trust

Edited by Martha Peaslee Levine


Abstract

Interpersonal trust within social media applications is a widely discussed topic. The debate ranges from trusting the application itself, in terms of security and privacy, to trusting the content and the underlying content delivery algorithms. Several trust-related phenomena have surfaced in recent years, known as filter bubbles, echo chambers and fake news. Responsibility for addressing these phenomena is often pushed either to the regulator or directly to the provider of the social media application. Interpersonal trust within social media applications, however, is a more complex topic: it is not limited to the application or the content but must also include the behaviour of the user. To broaden the debate beyond the prevalent focus on the application and content, this paper presents a conceptual literature review of interpersonal trust within social media, with the goal of deepening the understanding of the complex interplay between user behaviour and interpersonal trust. Based on this review, modalities of interpersonal trust are identified and presented. To extend these findings, an information-dense, word-embedding-based analysis using unsupervised machine learning techniques is presented.

Keywords

  • social media
  • trust
  • truth
  • literature review
  • machine learning

1. Introduction

Social media is an important part of interpersonal communication and essential for building and maintaining lasting and meaningful relationships. Recently, policymakers have challenged social media for promoting and spreading content that is not truthful, often referred to as fake news, which has led to a crisis of trust [1]. In addition, the global pandemic has moved many physical interactions online, with developments such as remote work and online learning having a significant impact on trust-based interpersonal interactions.

At the core of this crisis lies the question of responsibility. Technology providers tend to push responsibility to the users by claiming that the application only facilitates the transaction and cannot be responsible for the nature or purpose of the content. This is rejected by policymakers, who tend to argue that personal information is misused and sold for content targeting. On the one hand, it can be argued that social media providers should protect the interests of their users and ensure that their personal information is not used to target them with potentially false or harmful information. On the other hand, it can be argued that users should become more aware of such information and not ‘trust’ everything they see. This includes making their own background checks and spending the time to investigate the source and intention of a message. This trust-related debate therefore raises the question of what responsibility the user holds in validating the trustworthiness of information spread on social media applications.

In addition to this reduced, transaction-based point of view between the user and the social media application, interpersonal trust has to be investigated. The recent debate on echo chambers and filter bubbles points to the fact that users tend to trust content from trusted peers more than content from unknown users. Furthermore, users tend to focus more on the engagement their own content receives than on the content they engage with themselves [2]. Content engagement, for example via likes, shares, comments and emoticon reactions, is used to gauge trust. The main assumption is that content with high engagement is most likely content that can be trusted [3]. However, it is easy to conclude that content engagement is not suited to assessing whether content is true, false or misleading, mainly because any reaction can be fabricated (e.g. by false accounts). To compound this challenge, the underlying content distribution algorithms of social media applications react strongly to content that receives increased engagement: they assume that highly engaged content is interesting to more users, and therefore spread it wider and faster. This affects information diffusion and the role of the users. Indeed, social media algorithms give more visibility to content with higher engagement while reducing the visibility of content with less engagement (e.g. posts in Facebook groups). Trust, therefore, cannot be assessed by assuming that content is trustworthy because of its level of engagement. Based on this assessment, two possible viewpoints can be introduced: the first points to the social media application screening content, and the second points to the user needing to trust their own ability to judge content. Furthermore, a good trust model needs to take into account several aspects, including the level of trustworthiness a user ascribes to both the content and the user sharing it. Moreover, it can be argued that if users do not trust their own ability to assess information, they might prefer a regulator to decide. This, however, points to the challenges of censorship and how political bias within screening teams should be handled.
In addition to social media related interpersonal trust, the recent global pandemic has led to increased online usage, with several physical social interactions moving online, most notably online working and online learning. However, there is to date no indication that the pandemic and its shift towards online activity has had a significant impact on social media based interpersonal interactions, or that it has changed trust dynamics in social networks. Pandemic-related online technologies are mostly focused on live and real-time video conferencing without the need for a social network or any related social technologies that create a social activity overlay. There are instances of trust-related aspects, such as companies trusting their workers less due to a lack of control and insight. This has led to the increased development of surveillance technology for online workers, which, however, is not directly related to the interpersonal trust covered in this article. It remains an interesting and ongoing question how the pandemic, once it is over, will change the dynamics of online and social media related interpersonal trust.

This article is organised as follows. First, a discussion of interpersonal trust within social media applications is provided. This is followed by a conceptual literature review resulting in the identification of modalities of trust in social media. Then, a brief information-dense word embedding analysis, based on unsupervised machine learning techniques, is provided to illustrate how prominent the terminology around interpersonal trust is within the state of the art on social media applications. Finally, a discussion is provided.


2. Interpersonal trust in social media applications

Trust is a complex construct that is defined from many different perspectives, which makes it difficult to categorize conceptually. In this section, we provide an overview of definitions and categories of trust in order to frame the conceptual literature review discussed in Section 3.

2.1 Trust and trustworthiness

A common opinion is that connected people trust one another’s content. Trust, however, is a far more complex concept that takes several aspects of human dynamics into account. Different definitions of trust relate to the real-world relationships between people, and the trust aspects of a relationship accordingly vary with those relationships.

From a general perspective, it can be argued that modern societies are becoming increasingly complex due to technologies that provide instant access to a large amount of information and services. Based on this, it can be argued that trust is a key concept that enables all members of a complex society to deal with this high level of complexity. Only by trusting the technology to do the job right is it possible to ‘give away’ control to technology. The same argument applies to non-technical processes, such as financial and regulatory processes, in which trust is given to a central bank and/or government. Trust is hence an essential fabric of our society, without which its complexities would not be manageable and existing within a complex society would not be possible.

On a more specific level, trust can be defined as a derivation of reciprocity, learned when people cooperate with others, as in associations and other forms of voluntary organizations [1]. In addition to this definition, trust is composed not only of personal values (e.g. personal happiness) but also of political and economic values.

The argument of complexity reduction, central to trust, is aligned with the concept of trusted agents. A trusted agent completes tasks on behalf of a person and can be a person, a governmental agency or a technology. However, trusting an agent is not easy, mainly because of the role of risk in trust [4, 5]. The more an individual trusts, the more risk the individual is willing to take. In this context, O’Hara [6] argues that for technology to increase the quality of life, it must assist in increasing trustworthiness throughout society. However, it can be argued that social media applications, as some of the most used technologies for social interactions within societies, specifically interactions around content, are not designed to increase trustworthiness. They typically revert to simplified, low-risk substitutes for trust, such as ratings, recommendations and engagement. The risk argument is essential: without risk there is no trust. Conversely, any action that holds no risk requires no trust. The result of risk minimization within social interactions in social media applications is a significantly decreased impact of such applications on the quality of social interactions and, with that, on the quality of life throughout societies [6]. As mentioned above, trust as a concept is complex because it is based on a person’s beliefs and attitudes. It is therefore challenging to understand which properties of social interactions increase trustworthiness and, specifically, how these properties can be utilized in a mostly automated social media application.

From an economic point of view, an essential viewpoint especially when discussing publicly traded social media companies, trust is partially a product of people’s capacity to assess the trustworthiness of their potential partners. People, as homo economicus, often calculate the costs and projected outcomes of their decisions to trust. From a rational perspective, trusting involves expectations about interaction partners based on calculations that weigh the costs and benefits of certain courses of action to either the trustors or the trustees [7]. In this context, Weber et al. [8] note that in some cases people display a willingness to trust people they do not know and will never meet or see. A more technology-related view is taken by Friedman et al. [9], in which an end-user must first trust the atmosphere, technology and human community combined, and only then are the interacting partners positioned to trust any particular online interaction with other people. In addition, trust can depend on non-rational factors, such as love or altruism, and may involve a loose confluence of diverging interests. In extreme cases, trust is even necessary when people are in desperate situations from which they cannot extricate themselves [10], for example, when two parties have an asymmetrical dependency in a trusting relation: one is dependent on the other, but not the other way around [10]. Lewis and Weigert [11] argue that trust, from a sociological perspective, should be viewed as a property of collective units (such as groups and collectives), and not of isolated individuals. As a collective attribute, trust is applicable to the relations among individuals rather than to their psychological states taken individually [11].

It is not clear, however, what role social media applications play in increasing or decreasing interpersonal trust and what implications this has for a society that can only function if trust exists. Several research studies have shown the wide effect that social media have on the creation of trust. Researching trust within online interactions is nonetheless a complex task: replicating physical interaction, with its set of interpersonal cues, in the context of online exchange may be a feasible method to promote online trust. In line with Beldad et al. [10], we postulate that the infusion of social presence in websites for online transactions may increase users’ trust in online organizations. The problem in establishing trust online is thus how to do so in light of uncertainty about both the magnitude and the frequency of risk and potential harm [9]. A related assumption is the inclination to view trustors and trustees symmetrically, under the premise that each party interprets the other’s actions similarly [8].

In the context of online trust, the functions of ongoing image and reputation management are important to discuss. Potential partners have the burden not only of creating trust but also of maintaining it, and this process involves the duty of presenting themselves as trustworthy persons [12]. This corresponds to Goffman’s presentation of self theory, which proposes that people are constantly engaged in managing and controlling the impressions they make on others to attain their goals [13]. Specifically, in interhuman relationships, trust can be viewed as a product of people’s capacity to assess the trustworthiness of their potential partners. More specifically, trust can therefore be considered as the reflected trustworthiness of the trustees as subjectively entertained in the judgment of the trustors [14]. A further view on trust is offered by Zand [15], who views trust as a concept that increases vulnerability to others whose behaviour one cannot control. Online trust can accordingly be defined as an attitude of confident expectation, in an online situation of risk, that one’s vulnerabilities will not be exploited, an argument that trust in offline settings is applicable to trust in an online environment [16]. Following the argument of increased vulnerability, Zand [15] argues that trust can be viewed as the willingness of people to be vulnerable to the actions of others based on the expectation that the latter will perform a particular action important to the former, irrespective of the ability to monitor and control the latter. The idea of being vulnerable when trusting points to the realization that while uncertainties and ambiguities abound in all forms of exchanges and transactions, risks creep underneath. Doney et al. [17] extend this argument by relating the sources of risk to vulnerability and/or uncertainty about an outcome. Trust can therefore be regarded as people’s behavioural reliance on others under a condition of risk [18].

The connection between risk and trust has been highlighted by Rousseau et al. [19]. If trust is considered one of the major concepts of an online/offline human social relationship, two specific aspects characterize a trust relationship between humans: risk and interdependence. Risk concerns the intention of the other party, which is not certain beforehand, while interdependence concerns the related interests of the two parties. These two conditions are needed to consider a human relationship a trust relationship, and changes in these two factors may change the level of trust [19].

In relation to risk, Koller [20] and Lewis and Weigert [11] ask whether we trust because there are risks or take risks because we trust. The first question emphasizes that risks determine trust, while the second supposes that trust is an antecedent of risk-taking behaviour in any relationship, with the form of risk-taking depending on the situation and/or context, according to Mayer et al. [21]. People’s level of trust in their interaction partners is positively related to the perceived risks present in the situation, meaning that an increase in risk perceptions could result in an augmentation of people’s degree of trust [20, 21].

There are several more specific viewpoints on what makes a person trustworthy and, with that, different perspectives on how this can be achieved online in comparison to offline. Sztompka [14], for example, employs three criteria in estimating the trustworthiness of a person: reputation, performance and appearance. Mayer et al. [21] describe trustworthiness as occurring when the transactional partners (1) have the required skills, competencies, and characteristics that enable them to exert influence within a specific domain (competence criterion), (2) are believed to do good to trustors (no egocentric motive), and (3) are perceived to adhere to a set of principles that trustors consider acceptable, a definition of integrity [10].

In digital media studies, authenticity has often been discussed in relation to online identity and self-presentation [22]. Social media especially has changed the possibilities for self-presentation, mainly because users present themselves in a flattened spatial and temporal context [23]. Social media has changed the nature of interpersonal relationships in two ways: space and time. Time, because the Internet reduces the barriers of time through asynchronous communication; space, because social information can spread to a very wide set of interested users [24].

This relates to Goffman’s impression management framework, which addresses the challenges of separating backstage and frontstage identity performances [10, 25], in this case the user’s online and offline personas. Social media allows users to perform strategic authenticity by revealing personal information, displaying symbolic connections, and responding to their audiences immediately and regularly. This controlled selection, along with monitoring self-disclosures [23] and the constant redaction of profiles [26], helps users to perform authenticity for multiple audiences by presenting themselves in different ways based on different strategic personas. Based on [4], authenticity stems from the construction of identity: Giddens explains that an authentic person is one who knows herself and is able to reveal that knowledge to the other. Based on this, it can be argued that social media applications influence how individuals build and express the overarching biographical narrative upon which their authenticity claims rest [27].

A trust dimension related to authenticity is reputation, which is often represented online as reviews and ratings. It can be argued that the ever more popular review websites that feature user-generated opinions (e.g., TripAdvisor and Yelp) are increasingly gamed to increase monetary value through opinion spam (e.g. fraudulent reviews) [28]. Ratings and reviews are therefore a weak indicator of trust, simply because it is impossible to gauge whether the person who produced the rating or review has behaved with goodwill, and there is no incentive for them to do so. Trust is further reduced by decreased anonymity and increased violations of privacy, which undermine personal autonomy [9]. Further reasons for reduced trust in online reputation are emotional bias and deceptive opinion spam, both of which are highly subjective and mostly motivated by reasons other than validating reputation. Moreover, there is a high risk of reviews and ratings being purchased and therefore false. Should a person be identifiable and therefore relatable to a real person, their history of comments, reviews and ratings creates a collection of records that indicates the user’s performance in prior transactions, which can increase trustworthiness [29].

Among other online applications built around social interactions, video calling applications can be mentioned. In relation to trust, these applications overlap slightly with social media apps, specifically if the social media app focuses on short-form video. Within such apps, technology enhancements such as facial filters and background filters can distort how a person actually looks and where the person actually is. The pandemic-related surge in online social applications covers two sides of this: short (recorded) videos posted on a social media platform, and live transmission for meetings or learning, which takes place either on a social media platform via a live feature or directly via one of the many freely available video conferencing tools.

Trust aspects of real-time video are understudied but, due to the added visual dimension, are similar to those of posted content. Content can thus be viewed broadly, not only as posts containing images and text but also as videos, all of which, as argued above, are subject to commenting, sharing, rating, etc., which are all prone to being used to validate trust. With real-time video, however, interactions change drastically, with trust being assessed based on multimodal content experiences, such as image, voice and speech in real time [REF]. A more worrying development in relation to trust and video-based content, live or recorded, has become known as deep fakes [REF]. This AI-empowered technology allows not only the changing of the look and feel of an individual within a video, including voice and speech, but also the fake replication of a person as if it were that person [REF]. Deep fake technologies will continue to evolve and become easier and cheaper to produce. To ensure the authenticity of such videos and interactions, which can still be categorized under the umbrella of fake news, technologies for validating the authenticity of content will need to evolve [REF].

The following sub-section takes a closer look at methods to measure and model trust-related interpersonal interaction within social media applications.

2.2 Measuring and modelling interpersonal trust in social media applications

Trust has several properties that are usually used to define models of trust or to measure it. As concerns social media applications [30], the most interesting ones are: dynamic, propagative, non-transitive, asymmetric, and composable. Dynamic means that trust can change: it can increase, decrease, or decay with time. There are two approaches to updating a trust value: event-driven, where all trust data are updated when an event happens, and time-driven, where trust is periodically updated. Trust is propagative, which means that if a generic user A trusts B, and B trusts C, A can derive some trust in C; this is the basic concept of a recommendation system. However, trust is not transitive, which means that if A trusts B, and B trusts C, this does not imply that A trusts C. The composable property means that the propagation of trust (and distrust) can follow long social chains, which allows a member A to create a trust connection with a member D who was not directly connected to A. When several social chains recommend different values of trust for member D, A needs to compose the trust information. Finally, trust is asymmetric, which means that the level of trust between two members is not necessarily the same: a member A may have a certain level of trust in a member B, but the level of trust from B to A can be lower or higher. We can also note a strong correlation between trust and similarity: users with trust relations are likely to be similar, and this similarity is called homophily. Based on these properties, several models have been proposed, usually classified by the propagative characteristic. The techniques used include statistical and machine learning techniques, heuristics-based techniques, and behaviour-based techniques.
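To make the propagative, composable and asymmetric properties described above more concrete, the following minimal Python sketch illustrates them on a toy trust graph. The graph values, the multiplication-based propagation along chains and the averaging-based composition are illustrative assumptions only; they are not a model proposed in the reviewed literature.

```python
# Toy directed trust graph. Asymmetry: trust from A to B need not equal trust from B to A.
direct_trust = {
    "A": {"B": 0.9, "C": 0.4},
    "B": {"D": 0.8},
    "C": {"D": 0.5},
    "D": {},
}

def propagated_trust(source, target, graph, path=None):
    """Yield one trust estimate per acyclic chain from source to target.

    Trust decays along a chain (here: the product of edge weights), reflecting
    that propagated trust is weaker than a direct relation.
    """
    path = path or [source]
    if target in graph.get(source, {}):
        yield graph[source][target]
    for neighbour, weight in graph.get(source, {}).items():
        if neighbour in path or neighbour == target:
            continue
        for downstream in propagated_trust(neighbour, target, graph, path + [neighbour]):
            yield weight * downstream

def composed_trust(source, target, graph):
    """Compose the estimates from several chains into one value (simple mean here)."""
    estimates = list(propagated_trust(source, target, graph))
    return sum(estimates) / len(estimates) if estimates else None

# Composes the chains A -> B -> D and A -> C -> D into a single estimate for D.
print(composed_trust("A", "D", direct_trust))
```

Running the sketch composes the two chains from A to D into a single trust estimate, while the absence of any edge back towards A reflects the asymmetric property; real systems replace the mean with more elaborate composition rules.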


3. Conceptualization of interpersonal trust in social media

Trust in technology differs significantly from interpersonal trust. Specifically, the technical environment is one of the building blocks helping to establish trust between people and can therefore be framed as a socio-technical research challenge. Trustworthy environments are intertwined with social aspects, and together they build the trust that results in the usage of an online application. The following overview serves as a high-level conceptualization of how trust is modelled and realized within social media applications (Figure 1). This overview has been developed based on the literature review summarised in the tables below the illustration. Its main categories are defined as modalities of trust in interpersonal interactions within social media applications: perceived risk, perceived reputation, perceived authenticity, and perceived complexity reduction. Each modality is mapped to concepts derived from the literature review.

Figure 1.

Modalities of trust in social media.

Socio-technical research related to interpersonal trust in social media applications that extends this literature is summarised in the following tables.

Perceived risk (verification, privacy)

Dwyer et al. [31]. "Trust and privacy concern within social networking sites: a comparison of Facebook and MySpace". A study describing the impact of trust and Internet privacy concern on the use of social networking sites for social interactions; comparison of Facebook and MySpace.
Fogel and Nehmad [32]. "Internet social network communities: risk-taking, trust, and privacy concerns". Risk-taking, trust and privacy attitudes on social networks (MySpace, Facebook) among 205 college students, using scales and ANOVA.
Dhami et al. [33]. "Impact of trust, security and privacy concerns in social networking: an exploratory study to understand the pattern of information revelation in Facebook". The study explores the impact of security, trust and privacy concerns on the willingness to share information on social networking sites. Using an online questionnaire, empirical data were collected from 250 Facebook users of different age groups.
Paramarta et al. [34]. "Impact of user awareness, trust, and privacy concerns on sharing personal information on social media: Facebook, Twitter, and Instagram". A questionnaire-based online survey of 340 social media users conducted over a period of 2 months. The research shows that user awareness, trust, and privacy concerns have a positive and significant effect on sharing personal data on social media.
Sharif et al. [35]. "Antecedents of self-disclosure on social networking sites (SNSs): a study of Facebook users". This study investigates how self-disclosure on social networking sites (SNSs) leads to connectedness and increases trust, specifically in relationship building. It investigates the antecedents of self-disclosure under the lens of the technology acceptance model (TAM). The research is quantitative, and the data were collected from 400 Pakistani Facebook users with a variety of demographic characteristics.
"Social media never shake the role of trust building in relieving public risk perception". This paper introduces three surveys covering two groups over 6 months and their risk perception on social media. Results show that social media information provision reshapes risk perception by increasing self-reported knowledge, reducing trust, and making users more fearful.
"Fake news detection and social media trust: a cross-cultural perspective". This paper studies how fake news is detected by users from a risk perspective and how it impacts the trustworthiness of the social media interaction. The cross-cultural study was conducted in Spain and Lebanon and uses structural equation modelling to explore these factors and present them within a behavioural model.

Perceived reputation (aspects of reputation)

Abdul-Rahman and Hailes [36]. "Supporting trust in virtual communities". A trust model grounded in real-world social trust characteristics and based on a reputation mechanism, or word-of-mouth.
Matsuo and Yamamoto [37]. "Community gravity: measuring bidirectional effects by trust and rating on online social networks". Effects from trust to rating (and reputation) within the Japanese community site @cosme are analyzed. A theoretical model is presented with a measure of community gravity, which measures how strongly a user might be attracted to the community.
Zacharia and Maes [38]. "Trust management through reputation mechanisms". Investigation of two complementary reputation mechanisms that rely on collaborative rating and personalized evaluation of the various ratings assigned to each user.
Chen and Fong [39]. "Social network collaborative filtering framework and online trust factors: a case study on Facebook". Trust discussed through relation models (weight of ties) and reputation attributes.
Rosen et al. [40]. "CouchSurfing: belonging and trust in a globally cooperative online social network". A study of engagement activities in an online resource exchange community exploring elements of belonging, connectedness, and trust and how to develop them.
Jiang and Wang [41]. "Generating trusted graphs for trust evaluation in online social networks". Work presenting a user-domain-based trusted acquaintance chain discovery algorithm that uses the small-network characteristics of online social networks; a model for calculating trust.
Ott et al. [28]. "Estimating the prevalence of deception in online review communities". A generative model of deception used to explore the prevalence of deception in six popular online review communities: Expedia, Hotels.com, Orbitz, Priceline, TripAdvisor, and Yelp.
"A survey of trust in social networks". A review of existing definitions of trust and social trust in the context of social networks.
"Fake it till you make it: reputation, competition, and Yelp review fraud". A study exploring the potential of fake reviews and the consequences for the reputation of users.
Li et al. [42]. "Static and dynamic structure characteristics of a trust network and formation of user trust in an online society". A study investigating the characteristics and formation of the online social trust network of Epinions.com, a general consumer review site. User activeness had a larger effect on trust formation in online social networks, indicating a "diminishing returns" phenomenon. This phenomenon contrasts with the Matthew effect (i.e., the more reputation a person has, the more likely he or she is to be trusted) in real-world social networks.

Perceived authenticity (personas, anonymity, information quality)

Henderson and Gilding [43]. "'I've never clicked this much with anyone in my life': trust and hyperpersonal communication in online friendships". A qualitative study with 17 Internet users about the foundations of trust in online friendships, drawing on Sztompka's theoretical framework.
Duguay [44]. "Dressing up Tinderella: interrogating authenticity claims on the mobile dating app Tinder". Tinder's framing of authenticity within mobile dating, using Giddens' conceptualization and Callon's sociology of translation; identifies both human and technological influences on the construction of authenticity in digital media.
McGloin and Denes [45]. "Too hot to trust: examining the relationship between attractiveness, trustworthiness, and desire to date in online dating". A study examining how the enhancement of a dating profile picture influences perceptions of trustworthiness.
Djafarova and Rushworth [46]. "Exploring the credibility of online celebrities' Instagram profiles in influencing the purchase decisions of young female users". A study investigating the impact of Instagram on source credibility. Non-traditional celebrities (YouTubers, bloggers and "Instafamous" profiles) are perceived as more credible than traditional celebrities.
Amin and Khan [47]. "Online reputation and stress: discovering the dark side of social media". This study examines social media users' concern for online reputation and its relationship with stress, moderated by social media dependency and trust issues. It was conducted on university students in India with a sample size of 350. Using structural equation modelling, the relationship between 'concern for online reputation' and 'social media stress' was tested and found to be positive. The results also suggest a positive moderating role of social media dependency in this relationship, leading to a decrease in trust.
Ryu and Han [48]. "Social media influencer's reputation: developing and validating a multidimensional scale". This study identifies the dimensions and items in the existing literature that can effectively measure a social media influencer's reputation, verified by trust-based relevance. Based on in-depth interviews with 30 experts and empirical findings from 557 adults, the study identified dimensions that shape a user's perception of a social media influencer and developed a scale. The results showed that the influencer reputation scale comprises four distinctive dimensions: communication skills, influence, authenticity, and expertise.

Perceived complexity reduction

Friedman et al. [9]. "Trust online". An exploration of the nature of trust and how and where it flourishes online, from the perspectives of technology and the human community (interpersonal cues).
"The emergence of trust networks under uncertainty—implications for Internet interactions". An application of experimental sociological research to different types of computer-mediated social interactions, with particular attention to "trust networks" (networks of those one views as trustworthy).
Sheldon [49]. "'I'll poke you. You'll poke me!' Self-disclosure, social attraction, predictability and trust as important predictors of Facebook relationships". Self-disclosure, predictability and trust surveyed using scales; results supported by uncertainty reduction theory.
Adali et al. [50]. "Measuring behavioral trust in social networks". Algorithmically quantifiable measures of trust that can be determined from the communication behaviour of the actors (behavioural trust) in social communication networks are presented and validated on the Twitter network.
Beldad et al. [10]. "How shall I trust the faceless and the intangible? A literature review on the antecedents of online trust". A literature review covering empirical studies on people's trust in and adoption of computer-mediated services (online and offline trust).
Lankton and McKnight [51]. "Do people trust Facebook as a technology or as a 'person'? Distinguishing technology trust from interpersonal trust". Two second-order factor structures that represent alternative ways to model the three interpersonal and three technology trust beliefs were tested on data collected from 362 university-student Facebook users.
Rosen et al. [40]. "CouchSurfing: belonging and trust in a globally cooperative online social network". A study of engagement activities in an online resource exchange community exploring elements of belonging, connectedness, and trust and how to develop them.
Habibi et al. [52]. "The roles of brand community and community engagement in building brand trust on social media". A model depicting how consumers' relationships with the elements of a social media based brand community influence brand trust.
Anderson and Simester [53]. "Reviews without a purchase: low ratings, loyal customers, and deception". A comprehensive review of how low ratings and rating dynamics affect reputation.
Huber et al. [54]. "Fostering public trust in science: the role of social media". This study leverages a 20-country survey to examine the relationship between social media news use and trust in science, a topic viewed as too complex to understand fully and hence requiring a high level of trust. Results show a positive relationship between these variables across countries. Moreover, the between-country variation in this relationship is related to two cultural characteristics of a country, individualism/collectivism and power distance.
Mourey and Waldman [55]. "Past the privacy paradox: the importance of privacy changes as a function of control and complexity". This paper introduces three studies that provide initial evidence of an alternative explanation in which the subjective importance of trusting in privacy within social media varies as a function of who is in control of managing privacy and the extent to which managing privacy is perceived to be easy or difficult. When privacy is complex to manage, individuals perceive privacy to be more important when they control privacy management but less important when a social network/company controls it. This changing importance predicts an individual's intentions to disclose private information and moderates established effects that risk-benefit trade-off tolerance and trust in a company's expertise (but not benevolence) have on disclosure.
Gierth and Bromme [56]. "Attacking science on social media: how user comments affect perceived trustworthiness and credibility". This paper introduces two exploratory studies investigating the effects of science-critical user comments attacking Facebook posts containing scientific claims. The claims were about one of four controversial topics (homeopathy, genetically modified organisms, refugee crime, and childhood vaccinations). The user comments attacked the claims based on the thematic complexity, the employed research methods, the expertise, or the motivations of the researchers. The results reveal that prior attitudes determine judgments about the user comments, the attacked claims, and the source of the claim. After controlling for attitude, people agree most with thematic complexity comments, but the comments differ in their effect on perceived claim credibility only when the comments are made by experts.

To extend the categorisation and the introduction of modalities of trust in social media applications, we introduce a machine learning based analysis of terminology usage in relation to the trust modalities introduced above. This analysis uses word embeddings within social media related publications. The main objective of the following section is to assess the popularity of trust related aspects in social media publications.


4. Analysis of modalities via information-dense word embeddings with unsupervised machine learning techniques

To assess the popularity of the discussed topic, the following section treats trust and social media as a topic that is becoming increasingly important, especially after recent trust breach cases such as Cambridge Analytica. In addition, the emerging field of autonomous networks and the requirement for trust evaluation (e.g. in ad-hoc networks) indicates the need for increased scientific production in the field of interpersonal trust in social media applications. To gain a more comprehensive overview of the debate, an in-depth review of trust-related keywords is presented below. The extraction of facts, knowledge and relationships from this growing body of literature requires a generalized approach, such as machine learning based text mining over a collection of abstracts. To achieve this we relied on natural language processing techniques, such as doc2vec embeddings, applied to abstracts of scientific papers containing the keywords ‘trust’ and ‘social networks’. We focus on abstracts since they represent a compressed view of the informational content of a paper, according to Atanassova et al. [57]. The decision to analyse only abstracts was also supported from a processing point of view: abstracts are typically short (usually about 300 words) and available as part of the metadata, so access to them is relatively easy. The processing covered publications up to 2020. The main rationale for this cut-off was to exclude discussions of the impact of the pandemic, which in the view of the authors would distort the topic of this paper. Future work, once the pandemic is over, can apply the same approach to assess and compare the impact of the pandemic on the interpersonal trust related debate in social media.

4.1 Data collection

From Google Scholar we collected 560 unique articles in English which had ‘trust’ and ‘social network’ in their keywords. The selection focused on articles within the research field of computer science and related fields such as computational sociology. The article dates ranged from 1980 to 2020, spanning four decades, so the analysis can be considered pre-pandemic. From each article we extracted and processed the article name, the publication year and the abstract. Figure 2 shows a histogram of the number of collected articles per year.

Figure 2.

The figure shows the publication year of the collected and analysed articles.

4.2 Preprocessing

To prepare the abstract texts for natural language processing, we tokenized each document, treating the abstract of each article as a separate document. The result of this pre-processing was a bag-of-words representation consisting, for each token (a non-stop word, hence any term that holds meaning), of a token-id and a token-count stored as tuples, which together formed the text corpus for further analysis. In a second step all tokens were normalized. This resulted in 4623 unique words in the processed corpus (the size of the vocabulary), with a vector size of 300 (set manually). The following subsection discusses the overall findings.
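The following sketch, using Python and the gensim library, outlines the preprocessing and embedding step described above. The toy abstracts, the stop-word list and all hyperparameters other than the vector size of 300 are illustrative assumptions; only the general pipeline (tokenization, stop-word removal, doc2vec training) follows the description in the text.

```python
from gensim.models.doc2vec import Doc2Vec, TaggedDocument
from gensim.utils import simple_preprocess

# A toy stop-word list; the actual list used in the paper is not reported.
STOP_WORDS = {"the", "a", "an", "and", "of", "in", "to", "is", "for", "on", "with"}

def tokenize(abstract_text):
    # Lowercase, strip punctuation and drop stop words (keep tokens that hold meaning).
    return [t for t in simple_preprocess(abstract_text) if t not in STOP_WORDS]

# Two toy abstracts standing in for the 560 collected abstracts.
abstracts = [
    "Trust and privacy concern within social networking sites and online communities.",
    "A survey of trust models and reputation mechanisms in online social networks.",
]

# Each abstract becomes one tagged document of the corpus.
corpus = [TaggedDocument(words=tokenize(a), tags=[i]) for i, a in enumerate(abstracts)]

model = Doc2Vec(vector_size=300,  # as reported in the text
                min_count=1,      # assumption; kept low so the toy corpus is not pruned away
                epochs=40)        # assumption
model.build_vocab(corpus)
model.train(corpus, total_examples=model.corpus_count, epochs=model.epochs)

print(len(model.wv))  # vocabulary size; 4623 unique words for the real corpus
```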

4.3 Findings

After the data processing stage we performed a frequency analysis of the words collected from the abstracts. The plot below (Figure 3) shows the 10 most frequent words, with ‘trust’, ‘social’, ‘network’, and ‘user’ being the most frequent.

Figure 3.

Most frequent words.

It has to be noted that word frequency analysis disregards important relations between words. To mitigate this effect, we selected the 20 most common bi-grams and tri-grams in the data set. In the bi-gram case (see Figure 4) we can see that ‘trust’ co-occurs in ‘social trust’, ‘trust model’, ‘trust network’, ‘based trust’, ‘trust management’, ‘trust social’, ‘trust reputation’, ‘trust relationships’, ‘trust-based’, ‘trust distrust’, and ‘trust evaluation’. In the case of tri-grams (see Figure 5), ‘trust’ appears in ‘trust news media’, ‘context-aware trust’, ‘trust reputation systems’, ‘trust social commerce’, ‘trust social media’, and ‘trust social networks’. In the case of ‘context-aware trust’, it is interesting to note that the notion of trust is related to the specific context a user finds himself/herself in; trust values therefore differ depending on the context.

Figure 4.

Top 20 bigrams.

Figure 5.

Top 20 trigrams.
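A plain Python sketch of the bi-gram and tri-gram counting behind Figures 4 and 5 is given below. The input is assumed to be the list of tokenized abstracts produced by the preprocessing step; the toy data shown here is illustrative only.

```python
from collections import Counter

def ngram_counts(tokenized_abstracts, n):
    # Count all consecutive n-token sequences over the whole corpus.
    counts = Counter()
    for tokens in tokenized_abstracts:
        counts.update(zip(*(tokens[i:] for i in range(n))))
    return counts

# Toy input; in practice this is the output of the tokenization step above.
tokenized_abstracts = [
    ["trust", "model", "online", "social", "network"],
    ["social", "trust", "reputation", "systems", "online"],
]

print(ngram_counts(tokenized_abstracts, 2).most_common(20))  # top 20 bi-grams
print(ngram_counts(tokenized_abstracts, 3).most_common(20))  # top 20 tri-grams
```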

To further analyse the relationships between words, the corpus is used for a word embedding analysis, in which semantically similar words are mapped to proximate points in geometric space. As shown in Figure 6 below, the words semantically most similar to ‘trust’ are ‘application’, ‘context’, ‘prediction’, ‘recommend’, and ‘collaborative’. In the case of ‘trustworthy’, the most similar words are ‘applicable’, ‘approach’, ‘exist’ and ‘relation’.

Figure 6.

Word vector representation based on semantically close words to “trust” from the doc2vec model.
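The nearest-neighbour query behind Figure 6 can be reproduced with the trained doc2vec model from the earlier sketch, as shown below; the neighbours listed in the text are the results reported for the full corpus and will not necessarily be returned on other data.

```python
# Query the word vectors of the trained doc2vec model (see the training sketch above)
# for the terms semantically closest to 'trust' and 'trustworthy'.
for word in ("trust", "trustworthy"):
    if word in model.wv:  # skip words that were pruned from the vocabulary
        print(word, model.wv.most_similar(word, topn=5))
```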

To gain a deeper understanding of the topics covered in the publications, we applied topic modelling to the data set. For this, the latent semantic analysis (LSA) technique was applied, resulting in the clusters indicated in Figure 7. The largest number of articles was clustered around the topic ‘trust social network’.

Figure 7.

t-SNE visualization of the word clusters from the scientific abstracts. The trained model was reduced to a vector of 50, with a concatenation of context vectors and a max vocabulary size of 1000.
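As an illustration of the topic-modelling step, the sketch below applies LSA via truncated SVD to a TF-IDF representation of the abstracts and projects the result with t-SNE for plotting. The vectorizer settings, the toy corpus and the choice of scikit-learn are assumptions; the paper reports only that LSA and a t-SNE visualisation were used.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.manifold import TSNE

# Toy documents standing in for the 560 collected abstracts.
toy_topics = [
    "trust model for online social networks and reputation systems",
    "privacy concerns and self disclosure on social networking sites",
    "fake news detection and credibility of social media content",
    "recommendation systems using trust propagation between users",
]
abstracts = toy_topics * 10

tfidf = TfidfVectorizer(max_features=1000, stop_words="english")
X = tfidf.fit_transform(abstracts)

# LSA: reduce the TF-IDF matrix to a low-dimensional topic space
# (the figure caption mentions a reduction to 50 dimensions).
lsa = TruncatedSVD(n_components=min(50, X.shape[1] - 1), random_state=0)
topic_vectors = lsa.fit_transform(X)

# t-SNE projection to two dimensions for a scatter plot such as Figure 7.
xy = TSNE(n_components=2, random_state=0).fit_transform(topic_vectors)
print(xy.shape)
```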

Finally, as illustrated in Figure 8, term clustering was applied to the corpus after normalisation, resulting in ‘trust’ and ‘online’ being the most used terms.

Figure 8.

Term clustering of the corpus.
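One possible way to reproduce a term clustering such as the one in Figure 8 is sketched below, clustering the word vectors of the doc2vec model from the earlier sketch with k-means. The paper does not name the clustering algorithm or the number of clusters, so both are assumptions here.

```python
import numpy as np
from sklearn.cluster import KMeans

# Cluster the word vectors of the trained doc2vec model (see the earlier sketch).
words = list(model.wv.index_to_key)
vectors = np.array([model.wv[w] for w in words])

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(vectors)  # 5 clusters: assumption
for cluster_id in range(kmeans.n_clusters):
    members = [w for w, label in zip(words, kmeans.labels_) if label == cluster_id]
    print(cluster_id, members[:10])  # print a sample of terms per cluster
```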


5. Conclusions and future work

Studying the implications of trust online is challenging due to the complexity of the topic. In relation to the literature review specifically, it has proven very difficult to identify publications that focus on interpersonal trust online, and on social media in particular; most publications at the intersection of trust and online interaction relate to security. However, the advent of and growing discussion around fake news has proven to be a good reference point for identifying relevant papers. A further difficulty in the categorisation has been the global pandemic, which has shifted the dynamic of online application usage towards real-time interaction and away from posted content based on recordings or text. The pandemic was still ongoing during the writing of this paper, so a concluding investigation into pandemic-related trust in online interpersonal interactions remains valuable future work. Early work on pandemic-related trust implications can be found in Dwyer et al. [58]. Moreover, the advent of deep fakes, a technique that dynamically creates fake content, possibly also in live interactions, is an important topic to review in future work. Overall, it can be concluded that this work, both the literature review introducing modalities of trust and the extended review of the abstracts of research papers, provides a solid foundation for further, more focused investigations into the implications of trust in online interactions, specifically in relation to social media and related online technologies. Moreover, this paper has laid the foundation for a deeper and more comprehensive literature and state-of-the-art review of interpersonal trust as a foundational dimension, in addition to the current debate that focuses mostly on the application or the user behaviour. For this, the paper introduced a comprehensive overview and identified key aspects related to interpersonal trust and truthfulness.


Acknowledgments

This work is supported by the ADAPT Centre for Digital Content Technology, which is funded under the Science Foundation Ireland Research Centres Programme (Grant 13/RC/2106) and is co-funded under the European Regional Development Fund.

References

  1. 1. Fisher C. What is meant by ‘trust’ in news media? In: Otto K, Köhler A, editors. Trust in Media and Journalism. Wiesbaden: Springer VS; 2018
  2. 2. Koidl K, Conlan O, Reijers W, Farrell M, Hoover M. The BigFoot initiative: An investigation of digital footprint awareness in social media. In: Proceedings of the 9th International Conference on Social Media and Society (SMSociety ’18). New York, NY, USA: ACM; 2018. pp. 120-127. DOI: 10.1145/3217804.3217904
  3. 3. Fletcher R, Park S. The impact of trust in the news media on online news consumption and participation. Digital Journalism. 2017;5(10):1281-1299. DOI: 10.1080/21670811.2017.1279979
  4. 4. Giddens A. Modernity and Self-Identity: Self and Society in the Late Modern Age. Cambridge, UK: Polity Press; 1991
  5. 5. Luhmann N. Trust and Power. Chichester: John Wiley; 1979
  6. 6. O’Hara K. A General Definition of Trust. Southampton, GB: University of Southampton; 2012. p. 19
  7. 7. Lane C. Introduction: Theories and issues in the study of trust. In: Lane C, Bachmann R, editors. Trust within and between Organizations. Oxford: Oxford University Press; 1998. pp. 31-63
  8. 8. Weber JM, Malhotra D, Murnighan JK. Normal acts of irrational trust: Motivated attributions and the trust development process. Research in Organizational Behavior. 2005;26:75-101
  9. 9. Friedman B, Kahn PH Jr, Howe DC. Trust online. Communications of the ACM. 2000;43(12):34-40
  10. 10. Beldad A, de Jong M, Steehouder M. How shall I trust the faceless and the intangible? A literature review on the antecedents of online trust. Computers in Human Behavior. 2010;26(5):857-869
  11. 11. Lewis JD, Weigert A. Trust as a social reality. Social Forces. 1985;63(4):967-985
  12. 12. Haas DF, Deseran FA. Trust and symbolic exchange. Social Psychology Quarterly. 1981;44(1):3-13
  13. 13. Goffman E. The Presentation of Self in Everyday Life. New York: Anchor Books/Doubleday; 1959
  14. 14. Sztompka P. Trust: A Sociological Theory. Cambridge: Cambridge University Press; 1999
  15. 15. Zand DE. Trust and managerial problem solving. Administrative Science Quarterly. 1972;17(2):229-239
  16. 16. Corritore CL, Kracher B, Wiedenbeck S. Online trust: Concepts, evolving themes, a model. International Journal of Human-Computer Studies. 2003;58:737-758
  17. 17. Doney PM, Cannon JP, Mullen MR. Understanding the influence of national culture on the development of trust. Academy of Management Review. 1998;23(3):601-620
  18. 18. Currall SC, Judge TA. Measuring trust between organizational boundary role persons. Organizational Behavior and Human Decision Processes. 1995;64(2):151-170
  19. 19. Rousseau DM, Sitkin SB, Burt RS, Camerer C. Not so different after all: A cross-discipline view of trust. Academy of Management Review. 1998;23(3):393-404
  20. 20. Koller M. Risk as a determinant of trust. Basic and Applied Social Psychology. 1988;9(4):265-276
  21. 21. Mayer RC, Davis JH, Schoorman FD. An integrative model of organization trust. Academy of Management Review. 1995;20(3):709-734
  22. 22. Marwick AE. Online identity. In: Hartley J, Burgess J, Bruns A, editors. A Companion to New Media Dynamics. Malden, MA: Blackwell; 2013. pp. 355-364
  23. 23. Boyd D. Social network sites as networked publics: Affordances, dynamics, and implications. In: Papacharissi Z, editor. A Networked Self: Identity, Community, and Culture on Social Network Sites. New York: Routledge; 2011. pp. 39-58
  24. 24. Sutcliffe AG, Gonzalez V, Binder J, Nevarez G. Social mediating technologies: Social affordances and functionalities. International Journal of Human Computer Interaction. 2011;27(11):1037-1065
  25. 25. Hogan B. The presentation of self in the age of social media: Distinguishing performances and exhibitions online. Bulletin of Science Technology Society. 2010;30(6):377-386. DOI: 10.1177/0270467610385893
  26. 26. Papacharissi Z. Without you, I’m nothing: Performances of the self on Twitter. International Journal of Communication. 2012;6:1989-2006. DOI: 1932-8036/20120005
  27. 27. Duguay S. “He has a way gayer Facebook than I do”: Investigating sexual identity disclosure and context collapse on a social networking site. New Media and Society. 2014. DOI: 10.1177/1461444814549930 [online first]
  28. 28. Ott M, Cardie C, Hancock JT. Estimating the prevalence of deception in online review communities. In: Proceedings of International World Wide Web Conference 2012 (IW3C2). 2012
  29. 29. Houser D, Wooders J. Reputation in auctions: Theory, and evidence from eBay. Journal of Economics and Management Strategy. 2006;15:353-369. DOI: 10.1111/j.1530-9134.2006.00103.x
  30. 30. Sherchan W, Nepal S, Paris C. A survey of trust in social networks. ACM Computing Surveys (CSUR). 2013;45(4):47
  31. 31. Dwyer C, Hiltz S, Passerini K. Trust and Privacy Concern within Social Networking Sites: A Comparison of Facebook and MySpace. 2007
  32. 32. Fogel J, Nehmad E. Internet social network communities: Risk-taking, trust, and privacy concerns. Computers in Human Behavior. 2009;25(1):153-160
  33. 33. Dhami A, Agarwal A, Chakraborty TK, Singh BP, Minj J. Impact of trust, security and privacy concerns in social networking: An exploratory study to understand the pattern of information revelation in Facebook. In: 2013 3rd IEEE International Advance Computing Conference (IACC). Ghaziabad; 2013. pp. 465-469. DOI: 10.1109/IAdCC.2013.6514270
  34. 34. Paramarta V, Jihad M, Dharma A, Hapsari I, Sandhyaduhita P, Hidayanto A. Impact of user awareness, trust, and privacy concerns on sharing personal information on social media: Facebook, Twitter, and Instagram. In: 2018 International Conference on Advanced Computer Science and Information Systems (ICACSIS). 2018
  35. 35. Sharif A, Soroya SH, Ahmad S, Mahmood K. Antecedents of self-disclosure on social networking sites (SNSs): A study of Facebook users. Sustainability. 2021;13(3):1220. DOI: 10.3390/su13031220
  36. 36. Abdul-Rahman A, Hailes S. Supporting trust in virtual communities. In: Proceedings of the 33rd Hawaii International Conference on System Sciences-Volume 6—Volume 6 (HICSS ’00). Vol. 6. Washington, DC, USA: IEEE Computer Society; 2000. p. 6007
  37. 37. Matsuo Y, Yamamoto H. Community gravity: Measuring bidirectional effects by trust and rating on online social networks. In: WWW’09—Proceedings of the 18th International World Wide Web Conference. 2009. pp. 751-760. DOI: 10.1145/1526709.1526810
  38. 38. Zacharia G, Maes P. Trust management through reputation mechanisms. Applied Artificial Intelligence. 2000;14(9):881-907
  39. 39. Chen W, Fong S. Social network collaborative filtering framework and online trust factors: A case study on Facebook. In: 2010 Fifth International Conference on Digital Information Management (ICDIM). 2010
  40. 40. Rosen D, Lafontaine P, Hendrickson B. CouchSurfing: Belonging and trust in a globally cooperative online social network. New Media & Society. 2011;13(6):981-998
  41. 41. Jiang W, Wang G. SWTrust: Generating trusted graph for trust evaluation in online social networks. In: IEEE 10th International Conference on Trust, Security and Privacy in Computing and Communications. Changsha; 2011. pp. 320-327. DOI: 10.1109/TrustCom.2011.251
  42. 42. Li Y, Cao H, Zhang Y. Static and dynamic structure characteristics of a trust network and formation of user trust in an online society. Social Networking. 2018;7:197-219. DOI: 10.4236/sn.2018.74016
  43. 43. Henderson S, Gilding M. ‘I’ve never clicked this much with anyone in my life’: Trust and hyperpersonal communication in online friendships. New Media & Society. 2004;6(4):487-506
  44. 44. Duguay S. Dressing up Tinderella: Interrogating authenticity claims on the mobile dating app Tinder. Information, Communication and Society. 2016;20(3):351-367
  45. 45. McGloin R, Denes A. Too hot to trust: Examining the relationship between attractiveness, trustworthiness, and desire to date in online dating. New Media & Society. 2016;20(3):919-936
  46. 46. Djafarova E, Rushworth C. Exploring the credibility of online celebrities’ Instagram profiles in influencing the purchase decisions of young female users. Computers in Human Behavior. 2017;68:1-7
  47. 47. Amin F, Khan MF. Online reputation and stress: Discovering the dark side of social media. FIIB Business Review. 2021;10(2):181-192
  48. 48. Ryu EA, Han E. Social media influencer’s reputation: Developing and validating a multidimensional scale. Sustainability. 2021;13(2):631
  49. 49. Sheldon P. “I’ll poke you. You’ll poke me!” self-disclosure, social attraction, predictability and trust as important predictors of Facebook relationships. Cyberpsychology: Journal of Psychosocial Research on Cyberspace. 2009;3(2):Article 1. Retrieved from: https://cyberpsychology.eu/article/view/4225/3267
  50. 50. Adali S, Escriva R, Goldberg M, Hayvanovych M, Magdon-Ismail M, Szymanski B, Wallace W, Williams G. Measuring behavioral trust in social networks. In: 2010 IEEE International Conference on Intelligence and Security Informatics. 2010
  51. 51. Lankton NK, McKnight DH. Do people trust Facebook as a technology or as a “person”? Distinguishing technology trust from interpersonal trust. In: AMCIS 2008 Proceedings. 2008. p. 375
  52. 52. Habibi M, Laroche M, Richard M. The roles of brand community and community engagement in building brand trust on social media. Computers in Human Behavior. 2014;37:152-161
  53. 53. Anderson E, Simester D. Reviews without a purchase: Low ratings, loyal customers, and deception. Journal of Marketing Research. 2014;51(3):249-269
  54. 54. Huber B, Barnidge M, Gil de Zúñiga H, Liu J. Fostering public trust in science: The role of social media. Public Understanding of Science. 2019;28(7):759-777
  55. 55. Mourey JA, Waldman AE. Past the privacy paradox: The importance of privacy changes as a function of control and complexity. Journal of the Association for Consumer Research. 2020;5(2):162-180
  56. 56. Gierth L, Bromme R. Attacking science on social media: How user comments affect perceived trustworthiness and credibility. Public Understanding of Science. 2020;29(2):230-247
  57. 57. Atanassova I, Bertin M, Larivière V. On the composition of scientific abstracts. Journal of Documentation. 2016;72(4):636-647
  58. 58. Dwyer C, et al. What Factors Influenced Online Social Interaction During the COVID-19 Pandemic? 2021
