Introduction:

Being in the NLP field, one of the burning topics right now is the transformer architecture. Using attention, pooling, and an entirely new architecture, transformer-based models have been pushing new improvements every year. These models are generally quite tough to understand and implement from scratch, so people look for a good library that already has them implemented. One such library is Hugging Face's transformers, which I have recently started to explore. transformers supports both PyTorch and TensorFlow. To install it on Linux, you can just type

pip install transformers

and it will download and install everything. Now, I will go through the quick tour part of the documentation and try out a couple of examples from it.

Quick tour:

From the quick tour, I have decided to try out the summarization task. I had a Microsoft-related text downloaded earlier on my PC. Now, I will try out summarization with the easily usable pipeline structure. A pipeline ...
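As a quick illustration, here is a minimal sketch of what such a pipeline call might look like, assuming transformers is installed. The example text is just a short placeholder, not the Microsoft article I mentioned, and the generation parameters (max_length, min_length) are arbitrary choices for the demo.

# Minimal sketch of a summarization pipeline with transformers.
# The text below is a placeholder; in practice you would load your own file.
from transformers import pipeline

# Creating the pipeline downloads a default summarization model on first use.
summarizer = pipeline("summarization")

text = (
    "Microsoft was founded by Bill Gates and Paul Allen in 1975. "
    "It develops, licenses, and supports a wide range of software "
    "products, services, and devices, and is headquartered in "
    "Redmond, Washington."
)

# The pipeline returns a list of dicts with a "summary_text" key.
summary = summarizer(text, max_length=60, min_length=20, do_sample=False)
print(summary[0]["summary_text"])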