Scrapy is a Python framework for large-scale web scraping. It gives you all the tools you need to efficiently extract data from websites, process it as you want, and store it in your preferred structure and format. If you start writing code for every little task you perform, you will eventually end up creating your own scraping framework. Scrapy is that framework.
Writing Custom Spiders
A spider is a program that downloads content from websites or a given URL. When extracting data at a larger scale, you need to write custom spiders for different websites, since there is no "one size fits all" approach in web scraping owing to the diversity of website designs. You would also need to write code to convert the extracted data into a structured format and store it in a reusable format such as CSV, JSON, or Excel. That's a lot of code to write; luckily, Scrapy comes with most of this functionality built in.