Transformers: Implementing NLP Models in 3 Lines of Code




An introduction to the transformers library for implementing state-of-the-art models for different NLP tasks

Figure 1. Transformers | Image by author

Using state-of-the-art Natural Language Processing models has never been easier. Hugging Face [1] has developed a powerful library called transformers, which allows us to implement and use a wide variety of state-of-the-art NLP models in a very simple way. In this blog, we are going to see how to install and use the transformers library for different tasks such as:

  • Text Classification
  • Question-Answering
  • Masked Language Modeling
  • Text Generation
  • Named Entity Recognition
  • Text Summarization
  • Translation

So before we start reviewing each of the implementations for the different tasks, let's install the transformers library. In my case, I am working on macOS; when trying to install directly with pip I got an error, which I solved by first installing the Rust compiler as follows:

$ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

After that, I installed transformers directly with pip as follows:

$ pip install transformers

Great, with the two previous steps the library should be installed correctly. So let's start with the different implementations, let's go for it!
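To double-check that the installation succeeded, a quick sanity check (assuming a standard Python environment) is to import the library and print its version:

# If this prints a version number, transformers is installed correctly
import transformers
print(transformers.__version__)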

Text Classification

The text classification task consists of assigning a given text to a specific class from a given set of classes. Sentiment analysis is the most commonly addressed text classification problem.

To use a text classification model through the transformers library, we only need two arguments, task and model, which specify the type of problem to be addressed and the model to be used, respectively. Given the great diversity of models hosted in the Hugging Face repository, we can start playing with some of them. Here you can find the set of models for text classification tasks.

In Figure 2 we can see the implementation of the bert-base-multilingual-uncased-sentiment model for sentiment analysis.
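A minimal sketch of that implementation with the pipeline API might look like the following (the nlptown/ namespace on the model ID and the example sentence are assumptions, not taken from the figure):

from transformers import pipeline

# Build a sentiment-analysis pipeline backed by a specific model from the Hugging Face Hub
# (assuming the model ID nlptown/bert-base-multilingual-uncased-sentiment)
classifier = pipeline(
    task="sentiment-analysis",
    model="nlptown/bert-base-multilingual-uncased-sentiment",
)

# Returns a label (for this model, a 1-5 star rating) together with a confidence score
print(classifier("Using the transformers library is really easy!"))

Note that sentiment-analysis is the pipeline alias for the more general text-classification task, so the same few lines apply to other text classification models as well.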
