Fake news classifier using LSTM

Introduction

Fake news has been a concern all over the world, and social media has only amplified the phenomenon. It affects society on a large scale because such stories are crafted to sway the decisions of a crowd in a particular direction. Since manually verifying the legitimacy of news is hard and costly, the field has drawn great interest from researchers. Different approaches to identifying fake news have been examined, such as content-based classification, social context-based classification, image-based classification, sentiment-based classification, and hybrid context-based classification.

What is LSTM?

Long short-term memory (LSTM)[1] is an artificial neural network used in the fields of artificial intelligence and deep learning. Unlike standard feedforward neural networks, LSTM has feedback connections. Such a recurrent neural network (RNN) can process not only single data points (such as images), but also entire sequences of data (such as speech or video). For example, LSTM is applicable to tasks such as unsegmented, connected handwriting recognition,[2] speech recognition,[3][4] machine translation,[5][6] robot control,[7][8] video games,[9][10] and healthcare.[11] LSTM has become the most cited neural network of the 20th century.[12]

The name of LSTM refers to the analogy that a standard RNN has both "long-term memory" and "short-term memory". The connection weights and biases in the network change once per episode of training, analogous to how physiological changes in synaptic strengths store long-term memories; the activation patterns in the network change once per time-step, analogous to how moment-to-moment changes in electric firing patterns in the brain store short-term memories.[13] The LSTM architecture aims to provide a short-term memory for RNNs that can last thousands of timesteps, hence "long short-term memory".[1]
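The gating mechanism behind that long-lasting memory can be sketched as a single forward time step. This is a minimal NumPy illustration, not the repository's implementation: the parameter shapes, variable names, and random demo inputs are all assumptions chosen for clarity. The key point is that the cell state is updated additively (forget gate times old state, plus input gate times candidate), which is what lets information and gradients persist across many timesteps.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step (hypothetical minimal formulation).

    x: input vector; h_prev: previous hidden state (the "short-term memory");
    c_prev: previous cell state (the long-lasting memory track).
    W, U, b hold stacked parameters for the four gates:
    input (i), forget (f), cell candidate (g), output (o).
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b        # stacked pre-activations, shape (4n,)
    i = sigmoid(z[0:n])               # input gate: how much new info to write
    f = sigmoid(z[n:2*n])             # forget gate: how much old state to keep
    g = np.tanh(z[2*n:3*n])           # candidate cell update
    o = sigmoid(z[3*n:4*n])           # output gate: how much state to expose
    c = f * c_prev + i * g            # additive update -> memory can persist
    h = o * np.tanh(c)                # new hidden state
    return h, c

# Tiny demo with random parameters (sizes are arbitrary for illustration).
rng = np.random.default_rng(0)
d, n = 3, 4                           # input dim, hidden dim
W = rng.normal(size=(4 * n, d)) * 0.1
U = rng.normal(size=(4 * n, n)) * 0.1
b = np.zeros(4 * n)

h = np.zeros(n)
c = np.zeros(n)
for t in range(5):                    # run over a short input sequence
    x = rng.normal(size=d)
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)               # both vectors of length n
```

In practice a fake-news classifier would feed embedded word sequences through such a recurrence (via a deep-learning framework's LSTM layer) and pass the final hidden state to a sigmoid output for the real/fake decision.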

Source: https://en.wikipedia.org/wiki/Long_short-term_memory