Disinformation, misinformation, and fake news in social media : emerging research challenges and opportunities

Author(s)

    • Shu, Kai

Bibliographic Details

Disinformation, misinformation, and fake news in social media : emerging research challenges and opportunities

Kai Shu [and three others], editors

(Lecture notes in social networks)

Springer, [2020]

Machine-readable data file (remote file)

Held by 1 university library

Notes

Includes bibliographical references and index

Description based on online resource; title from digital title page (viewed on June 30, 2020)

Contents

  • Intro
  • Acknowledgment
  • Contents
  • Mining Disinformation and Fake News: Concepts, Methods, and Recent Advancements
  • 1 Information Disorder
  • 1.1 Fake News as an Example of Disinformation
  • 2 The Power of Weak Social Supervision
  • 2.1 Understanding Disinformation with WSS
  • 2.2 Detecting Disinformation with WSS
  • 3 Recent Advancements: An Overview of Chapter Topics
  • 4 Looking Ahead
  • 4.1 Explanatory Methods
  • 4.2 Neural Fake News Generation and Detection
  • 4.3 Early Detection of Disinformation
  • 4.4 Cross Topics Modeling on Disinformation
  • References
  • Part I User Engagements in the Dissemination of Information Disorder
  • Discover Your Social Identity from What You Tweet: A Content Based Approach
  • 1 Introduction
  • 2 Related Work
  • 3 Method
  • 3.1 Word Embedding
  • 3.2 Bi-LSTM
  • 3.3 Attention
  • 3.4 Final Classification
  • 4 Experiments
  • 4.1 Dataset
  • 4.2 Hyperparameter Setting
  • 4.3 Baselines
  • 4.4 Results
  • 4.5 Transfer Learning for Fine-Grained Identity Classification
  • 4.6 Case Study
  • 4.7 Error Analysis
  • 5 Discussion and Conclusion
  • References
  • User Engagement with Digital Deception
  • 1 Methods and Materials
  • 1.1 Attributing News Sources
  • 1.2 Inferring User Account Types: Automated Versus Manual
  • 1.3 Predicting User Demographics
  • 1.4 Measuring Inequality of User Engagement
  • 1.5 Predicting User Reactions to Deceptive News
  • 1.6 Data
  • 2 Who Engages with (mis) and (dis)information?
  • 2.1 The Population Who Engage with Misinformation and Disinformation
  • 2.2 Automated Versus Manual Accounts
  • 2.3 Sockpuppets: Multiple Accounts for Deception
  • 2.4 Demographic Sub-populations
  • 3 What Kind of Feedback Do Users Provide?
  • 3.1 Across Multiple Platforms
  • 3.2 Across User-Account Characteristics
  • 4 How Quickly Do Users Engage with (mis) and (dis)information?
  • 4.1 Across Multiple Platforms
  • 4.2 Across User-Account Characteristics
  • 4.3 Demographic Sub-populations
  • 5 Discussion and Conclusions
  • References
  • Characterization and Comparison of Russian and Chinese Disinformation Campaigns
  • 1 Introduction
  • 2 Literature Review
  • 2.1 Russia Internet Research Agency Data
  • 2.2 Chinese Manipulation of Hong Kong Narrative
  • 3 Data
  • 4 Characterization and Comparison
  • 4.1 Network
  • 4.2 History of Accounts
  • 4.3 Geography of Accounts
  • 4.4 Calculating Content Marketshare Over Time
  • 4.5 Bot Analysis
  • 4.6 Multi-media Analysis
  • 4.7 State Sponsored Accounts
  • 5 How Many Similar Actors are Left?
  • 6 Conclusion
  • References
  • Pretending Positive, Pushing False: Comparing Captain Marvel Misinformation Campaigns
  • 1 Introduction
  • 2 Related Work
  • 3 Data Description and Methods
  • 4 Results
  • 4.1 Diffusion on Twitter
  • 4.2 Originating and Responding Twitter Communities
  • 4.3 Types of Actors
  • 5 Discussion and Conclusions
  • References
  • Bots, Elections, and Social Media: A Brief Overview
  • 1 Introduction
