NLP Project Portfolio

This repository contains a collection of Natural Language Processing (NLP) projects showcasing diverse applications and techniques:

Fact Verification with Wikipedia

Description: This project tackles fact-checking claims against Wikipedia as the knowledge source. A neural network model classifies each claim as supported, refuted, or lacking sufficient information, based on the evidence found in Wikipedia articles. The model combines GloVe word embeddings with LSTM layers for text representation and classification; a minimal sketch follows the technology list below.

Key Technologies:

  • TensorFlow
  • GloVe Embeddings
  • LSTM (Long Short-Term Memory)
  • Natural Language Inference (NLI)
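
For orientation only, a claim/evidence classifier in this style can be sketched in Keras as follows. The vocabulary size, sequence length, and layer sizes are assumptions for illustration, not values taken from the project code.

```python
# Minimal sketch of a GloVe + LSTM claim classifier (assumed dimensions, not
# the project's exact architecture). Inputs are padded token-id sequences for
# a claim concatenated with its retrieved Wikipedia evidence.
import numpy as np
import tensorflow as tf

VOCAB_SIZE = 20_000   # assumed vocabulary size
EMBED_DIM = 100       # e.g. glove.6B.100d vectors
NUM_CLASSES = 3       # supported / refuted / not enough information

# In the real project this matrix is filled with GloVe vectors looked up per
# vocabulary word; random values stand in for them here.
embedding_matrix = np.random.normal(size=(VOCAB_SIZE, EMBED_DIM)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(
        VOCAB_SIZE, EMBED_DIM,
        embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
        mask_zero=True, trainable=False),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```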

Quora Insincere Questions Classification with BERT

Description: This project demonstrates how to fine-tune a pre-trained BERT model with TensorFlow and TensorFlow Hub for binary text classification. The task is to label questions from the Quora Insincere Questions dataset as sincere or insincere, and the fine-tuned model identifies toxic or misleading questions with high accuracy. A hedged sketch of the model setup follows the technology list below.

Key Technologies:

  • TensorFlow
  • TensorFlow Hub
  • BERT (Bidirectional Encoder Representations from Transformers)
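
A classifier of this kind can be assembled from TensorFlow Hub's BERT preprocessing and encoder modules roughly as follows. The module handles, dropout rate, and learning rate are common defaults, not necessarily the ones used in the project notebook.

```python
# Sketch of a BERT binary classifier built from TF Hub modules (assumed
# handles and hyperparameters; the project notebook may differ).
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401  registers the ops used by the preprocessor

PREPROCESS_URL = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
ENCODER_URL = "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4"

question = tf.keras.layers.Input(shape=(), dtype=tf.string, name="question_text")
encoder_inputs = hub.KerasLayer(PREPROCESS_URL, name="preprocessing")(question)
bert_outputs = hub.KerasLayer(ENCODER_URL, trainable=True, name="bert")(encoder_inputs)

x = tf.keras.layers.Dropout(0.1)(bert_outputs["pooled_output"])
insincere = tf.keras.layers.Dense(1, activation="sigmoid", name="insincere")(x)

model = tf.keras.Model(question, insincere)
model.compile(optimizer=tf.keras.optimizers.Adam(2e-5),  # small LR for fine-tuning
              loss="binary_crossentropy",
              metrics=["accuracy"])
```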

Neural Part-of-Speech Tagging

Description: This project explores neural networks for Part-of-Speech (POS) tagging. It combines GloVe word embeddings with recurrent architectures (LSTM, GRU, and their combinations) to predict the grammatical role of each word in a sentence, and is trained and evaluated on the dependency treebank dataset. A minimal sketch follows the technology list below.

Key Technologies:

  • TensorFlow
  • GloVe Embeddings
  • LSTM (Long Short-Term Memory)
  • GRU (Gated Recurrent Unit)
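
A sequence-labelling model along these lines can be sketched as below; the vocabulary size, tagset size, and layer sizes are assumptions rather than values from the project notebook. Swapping the LSTM for a GRU, or stacking both, gives the architecture variants being compared.

```python
# Sketch of a GloVe + BiLSTM POS tagger (assumed sizes). Each input is a
# padded sequence of token ids; the output is one tag distribution per token.
import numpy as np
import tensorflow as tf

VOCAB_SIZE = 10_000   # assumed vocabulary size
EMBED_DIM = 100       # e.g. glove.6B.100d vectors
NUM_TAGS = 46         # assumed tagset size (Penn Treebank style)

embedding_matrix = np.random.normal(size=(VOCAB_SIZE, EMBED_DIM)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(
        VOCAB_SIZE, EMBED_DIM,
        embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
        mask_zero=True, trainable=False),
    # Replace the LSTM with a GRU, or stack both, to compare architectures.
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
    tf.keras.layers.TimeDistributed(
        tf.keras.layers.Dense(NUM_TAGS, activation="softmax")),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```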

Cryptocurrency Tweet Sentiment Analysis with BERT and RoBERTa

Description: This project uses transformer models (BERT and RoBERTa) to analyze sentiment in cryptocurrency-related tweets. Both pre-trained models are fine-tuned on a dataset of annotated tweets to classify them as positive, negative, or neutral, and their performance is compared with classification reports and confusion matrices. A sketch of the fine-tuning setup follows the technology list below.

Key Technologies:

  • Transformers (BERT, RoBERTa)
  • TensorFlow
  • Python
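
The fine-tuning loop for both models can be sketched with the Hugging Face Transformers TensorFlow classes roughly as follows. The checkpoints, label scheme, and example tweets are illustrative assumptions, not the project's actual data or settings.

```python
# Sketch of fine-tuning BERT and RoBERTa for 3-way tweet sentiment
# (assumed checkpoints and placeholder data).
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

tweets = ["Bitcoin just hit a new all-time high!",
          "Selling everything, this market is a disaster.",
          "ETH gas fees look about the same as yesterday."]
labels = tf.constant([2, 0, 1])   # 0 = negative, 1 = neutral, 2 = positive

for checkpoint in ("bert-base-uncased", "roberta-base"):
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = TFAutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=3)

    enc = tokenizer(tweets, padding=True, truncation=True, max_length=64,
                    return_tensors="tf")
    model.compile(optimizer=tf.keras.optimizers.Adam(2e-5),
                  loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  metrics=["accuracy"])
    model.fit(dict(enc), labels, epochs=1, batch_size=2)  # fine-tune on the tweet set
```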

How to Explore:

Each project folder contains detailed README files, code implementations, and relevant data (if applicable). Please refer to the individual READMEs for specific instructions on how to run and experiment with each project.

Feel free to explore, learn, and adapt these projects to your own NLP endeavors!
