An extremely straightforward A-to-Z guide to abstractive text summarization by fine-tuning the RoBERTa model (using Hugging Face) on the Amazon Fine Food Reviews dataset.

Photo by abillion on Unsplash

RoBERTa (Robustly Optimized BERT Approach), implemented in PyTorch, modifies key design choices in BERT: it removes BERT’s next-sentence prediction pretraining objective and trains with much larger mini-batches and learning rates. These changes let RoBERTa improve on the masked language modeling objective compared with BERT, which leads to better downstream task performance.
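To see what the masked language modeling objective looks like in practice, here is a toy, pure-Python sketch of RoBERTa-style *dynamic* masking (a fresh random ~15% of tokens is masked each time a sequence is seen, unlike BERT's static masking). This is an illustration only, not the Hugging Face implementation; the `MASK` token string and the 15% probability mirror common defaults.

```python
import random

MASK = "<mask>"  # RoBERTa's mask token string (assumed for illustration)

def dynamic_mask(tokens, mask_prob=0.15, rng=None):
    """Return (masked_tokens, labels) for a masked-LM training step.

    Each call draws a fresh random mask, so the same sentence gets
    different masked positions across epochs (dynamic masking).
    A label of None means that position is not scored by the loss.
    """
    rng = rng or random.Random()
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(MASK)   # hide the token from the model
            labels.append(tok)    # the model must predict the original
        else:
            masked.append(tok)
            labels.append(None)   # unmasked positions contribute no loss
    return masked, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, labels = dynamic_mask(tokens, rng=random.Random(42))
```

In the real pipeline this happens on subword IDs inside the data collator, and a small fraction of selected tokens is also replaced with random tokens or left unchanged; the sketch keeps only the core idea.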

Anubhav

IIIT-Delhi | NLP enthusiast 👨🏻‍💻 | Machine Learning | LinkedIn: https://www.linkedin.com/in/anubhav-singh-03a3a3184/
