🤗 Dockerized BERT-Multi-Label-Classifier Inferer 🤗
Updated Aug 30, 2021 - Jupyter Notebook
A study on encoding English sentences into TensorFlow tensors using a pre-trained BERT model from the Hugging Face library.
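The multi-label decision step an inferer like the one above typically applies can be sketched in plain Python. The function names and the 0.5 threshold here are illustrative assumptions, not taken from the repo; a real system would get the logits from a fine-tuned BERT head rather than hard-coding them:

```python
import math

def sigmoid(x):
    """Logistic function: maps a raw logit to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def infer_labels(logits, label_names, threshold=0.5):
    """Multi-label decision step: unlike softmax classification, each
    label is kept independently when its sigmoid probability clears
    the threshold, so zero, one, or several labels can fire."""
    probs = [sigmoid(z) for z in logits]
    return [name for name, p in zip(label_names, probs) if p >= threshold]

# Illustrative logits for three hypothetical labels
labels = ["sports", "politics", "tech"]
print(infer_labels([2.1, -1.3, 0.4], labels))  # → ['sports', 'tech']
```

The key design point is the per-label sigmoid instead of a shared softmax: the labels are not mutually exclusive, so each one gets its own independent yes/no decision.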
Welcome to our Smart Content Accumulator website! We have developed a powerful tool that streamlines the process of obtaining summarized content from any article. With just a URL and a click of a button, our website generates concise and meaningful summaries, saving you valuable time and effort.
Using state-of-the-art Transformers for text classification and deep CNNs for image classification.
Applying zero-shot learning to a classification task.
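A common way to apply zero-shot learning to classification is to recast each candidate label as an NLI hypothesis ("This text is about {label}.") and pick the label whose entailment score is highest. Below is a minimal sketch of that scoring step; the stub `fake_entailment` stands in for a real NLI cross-encoder (e.g. one loaded via Hugging Face), and all names are illustrative:

```python
def zero_shot_classify(text, candidate_labels, entailment_fn):
    """Score each candidate label by how strongly the NLI model says the
    text entails the hypothesis 'This text is about <label>.'"""
    scores = {
        label: entailment_fn(text, f"This text is about {label}.")
        for label in candidate_labels
    }
    best = max(scores, key=scores.get)
    return best, scores

# Stub entailment scorer -- a real system would run an NLI model here.
def fake_entailment(premise, hypothesis):
    return 0.9 if "goal" in premise and "sports" in hypothesis else 0.1

label, scores = zero_shot_classify(
    "The striker scored a late goal.", ["sports", "finance"], fake_entailment
)
print(label)  # → sports
```

Because the label set only appears inside the hypothesis string, new classes can be added at inference time with no retraining, which is exactly what makes the approach "zero-shot".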
Deep learning for Natural Language Processing
Advanced RAG pipeline using Re-Ranking after initial retrieval
Q&A System using BERT and Faiss Vector Database
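The retrieval half of such a Q&A system reduces to nearest-neighbour search over passage embeddings. The sketch below uses brute-force maximum-inner-product search in place of a real Faiss index (a `faiss.IndexFlatIP` performs the same operation at scale); the 2-d vectors are made up purely for illustration:

```python
def top_k(query_vec, index_vecs, k=2):
    """Brute-force maximum-inner-product search: the operation a Faiss
    flat inner-product index performs over BERT passage embeddings.
    Returns the indices of the k best-matching passages."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    scored = sorted(
        ((dot(query_vec, v), i) for i, v in enumerate(index_vecs)),
        reverse=True,
    )
    return [i for _, i in scored[:k]]

passages = [[0.1, 0.9], [0.8, 0.2], [0.7, 0.6]]  # toy 2-d "embeddings"
print(top_k([1.0, 0.0], passages, k=2))  # → [1, 2]
```

In the full pipeline, the retrieved passages would then be fed to a BERT reader model that extracts the answer span.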
HLE-UPC at SemEval-2021 Task 5: Toxic Spans Detection
A sentiment-analysis web application for customer reviews; positive, negative, and neutral opinions are highlighted.
I will perform a text classification task using various Transformer models such as BERT, DistilBERT, and ELECTRA, mostly from the Hugging Face community.
Thesis Project
A Python program that classifies medical transcripts based on their medical specialty.
1. Fine-tune DistilBERT on NLI and identify some salient or toxic features that the model learnt. 2. Sample annotation techniques and production of silver labels using EDA and back-translation.
Revisiting Reddit troll hunting with some new NLP techniques (DistilBERT, Multi-Sample Dropout, OOF predictions, Group K-fold on subreddit, and some fancy pre-processing).
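Group K-fold on subreddit means every post from a given subreddit lands in exactly one fold, so out-of-fold (OOF) predictions are never contaminated by same-community leakage. A minimal sketch of that grouping logic follows; a real pipeline would typically use scikit-learn's `GroupKFold`, and the round-robin assignment here is a simplifying assumption:

```python
def group_kfold(groups, n_folds):
    """Assign each distinct group to exactly one fold (round-robin over
    the sorted group names), so no group is ever split across the
    train/validation boundary of any fold."""
    unique = sorted(set(groups))
    fold_of = {g: i % n_folds for i, g in enumerate(unique)}
    folds = [[] for _ in range(n_folds)]
    for idx, g in enumerate(groups):
        folds[fold_of[g]].append(idx)
    return folds

# Toy data: each sample's group is the subreddit it came from.
subreddits = ["cats", "news", "cats", "gaming", "news"]
folds = group_kfold(subreddits, n_folds=2)
print(folds)  # → [[0, 1, 2, 4], [3]]
```

The guarantee worth testing is the invariant itself: no subreddit appears in more than one fold, which is what keeps the OOF predictions honest.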
🏥 Dr.Jarvis is a medical transcript classifier that helps patients get their symptoms diagnosed in real time on a Streamlit-powered web app. Trained with SVM, KNN, and Random Forest models from scikit-learn.
Fine-tuning a 🤗 Transformer model for a soft-skill NER task.
Fine-tuning a Transformer (DistilBERT, but generic to other models) for multi-class text classification (sentiment analysis on IMDb).
Deploying a pretrained distilBERT model with SageMaker