Question and Answering telegram bot -- to increase productivity and efficiency within customer service related jobs.
Updated Mar 13, 2023 - Python
Using state-of-the-art Transformers for text classification and deep CNNs for image classification
Applying zero-shot learning to a classification task.
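Zero-shot classification is commonly framed as natural language inference: each candidate label is turned into a hypothesis ("This text is about X.") and scored for entailment against the input. A minimal sketch of that framing, with a toy word-overlap scorer standing in for a real NLI model (in practice you would use something like Hugging Face's `pipeline("zero-shot-classification")` backed by a DistilBERT/MNLI checkpoint):

```python
import re

# Words contributed by the hypothesis template itself, not by the label.
STOPWORDS = {"this", "text", "is", "about"}

def entailment_score(premise: str, hypothesis: str) -> float:
    """Toy scorer: fraction of hypothesis content words present in the premise.
    A real zero-shot pipeline would use an NLI model's entailment probability."""
    prem = set(re.findall(r"[a-z]+", premise.lower()))
    hypo = set(re.findall(r"[a-z]+", hypothesis.lower())) - STOPWORDS
    return len(prem & hypo) / len(hypo) if hypo else 0.0

def zero_shot_classify(text, labels, template="This text is about {}."):
    """Score each candidate label as an NLI hypothesis; return the best label."""
    return max(labels, key=lambda lab: entailment_score(text, template.format(lab)))

print(zero_shot_classify("this football match was thrilling",
                         ["football", "cooking"]))  # -> football
```

The key point is that the label set is supplied at inference time, so no task-specific fine-tuning is needed; only the hypothesis template and the scorer are fixed.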
Deep learning for Natural Language Processing
Advanced RAG pipeline using Re-Ranking after initial retrieval
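The retrieve-then-rerank pattern named above has two stages: a cheap, recall-oriented retriever narrows the corpus to top-k candidates, and a slower, more precise re-ranker reorders them. Both scorers below are toy stand-ins (a real pipeline would typically use BM25 or a bi-encoder for retrieval and a cross-encoder for re-ranking):

```python
import re

def tokens(text):
    return re.findall(r"[a-z]+", text.lower())

def retrieve(query, docs, k=3):
    """Stage 1: cheap recall-oriented scoring (raw word overlap), keep top k."""
    q = set(tokens(query))
    return sorted(docs, key=lambda d: len(q & set(tokens(d))), reverse=True)[:k]

def rerank(query, candidates):
    """Stage 2: more precise scoring (here: overlap normalized by doc length,
    standing in for a cross-encoder relevance score)."""
    q = set(tokens(query))
    def score(d):
        t = tokens(d)
        return len(q & set(t)) / len(t) if t else 0.0
    return sorted(candidates, key=score, reverse=True)

docs = [
    "DistilBERT is a distilled version of BERT",
    "Re-ranking improves retrieval precision",
    "Bananas are rich in potassium",
    "BERT retrieval and ranking with cross-encoders",
]
query = "BERT re-ranking retrieval"
top = rerank(query, retrieve(query, docs))
print(top[0])  # -> Re-ranking improves retrieval precision
```

Because the expensive scorer only sees k candidates instead of the whole corpus, the pipeline keeps re-ranking quality without paying its cost on every document.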
A sentiment-analysis web application for customer reviews; positive, negative, and neutral opinions are highlighted.
I will perform a text classification task using various Transformer models such as BERT, DistilBERT, and ELECTRA, mostly from the Hugging Face community
Thesis Project
1. Fine-tune DistilBERT on NLI and identify salient or toxic features that the model learnt. 2. Sample annotation techniques and production of silver labels using EDA and back translation.
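EDA (Easy Data Augmentation) produces extra "silver" training examples by lightly perturbing sentences. A toy sketch of one EDA operation, random word swap (back translation, the other technique mentioned, would instead round-trip the sentence through a machine-translation model and is out of scope for a stdlib sketch):

```python
import random

def random_swap(sentence: str, n_swaps: int = 1, seed: int = 0) -> str:
    """One EDA operation: swap n random pairs of words.
    Returns a perturbed copy of the sentence for augmentation."""
    rng = random.Random(seed)  # seeded for reproducible augmentation
    words = sentence.split()
    for _ in range(n_swaps):
        i, j = rng.sample(range(len(words)), 2)  # two distinct positions
        words[i], words[j] = words[j], words[i]
    return " ".join(words)

original = "the model learns toxic features"
augmented = random_swap(original, n_swaps=1, seed=0)
print(augmented)
```

The augmented sentence keeps the same bag of words, which is the point: the label is assumed to survive the perturbation, so the (augmented text, original label) pair can be added to training as a silver example.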
🏥 Dr.Jarvis is a medical transcript classifier that helps patients get their symptoms diagnosed in real time on a Streamlit-powered web app. Trained with SVM, KNN, and Random Forest models from scikit-learn.
Deploying a pretrained DistilBERT model with SageMaker
Kaggle Competition
A notebook for a Medium article about text classification with Hugging Face DistilBERT and TensorFlow 2.0
DistilBERT trained on the GoEmotions dataset using Hugging Face
Sentence Classification with BERT
A simple supervised learning solution for classifying candidates according to their professional seniority level (entry, mid, manager, CXO, etc.).
This repository contains the Romanian version of DistilBERT.
I performed sentiment analysis to determine whether 50,000 IMDb movie reviews are positive, negative, or neutral. I employed various NLP approaches, including lexicon-based approaches, machine learning models, PLM models, and hybrid models, and assessed the performance of each type of model.
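Of the approaches listed, the lexicon-based one is simple enough to sketch: score a review by counting words from positive and negative word lists and map the net score to a label. The two tiny lexicons below are illustrative stand-ins; real lexicons such as VADER or AFINN are far larger and assign weighted scores:

```python
import re

# Toy lexicons for illustration; real sentiment lexicons are much larger.
POSITIVE = {"great", "excellent", "wonderful", "enjoyable", "masterpiece"}
NEGATIVE = {"boring", "awful", "terrible", "dull", "waste"}

def lexicon_sentiment(review: str) -> str:
    """Net lexicon score -> positive / negative / neutral label."""
    words = re.findall(r"[a-z]+", review.lower())
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(lexicon_sentiment("A wonderful, enjoyable film"))  # -> positive
```

Lexicon methods need no training data, which is why they serve as a baseline against the machine-learning and PLM models in comparisons like the one described above; their weakness is negation and context ("not great" scores positive here).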