Python 3 Web Scraping to Scrape Coronavirus Data into a CSV (.csv) File Using BeautifulSoup4 and the Worldometer Website Full Project For Beginners


Welcome folks, today in this blog post we will be scraping live coronavirus data from the Worldometer website and saving it as a CSV file in Python. The full source code of the application is shown below.

Get Started

 

 

In order to get started you need to install the following libraries using the pip commands shown below.

 

pip install requests

 

pip install lxml

 

pip install bs4

 

pip install pandas

 

After installing these libraries, make an app.py file and copy and paste the following code into it.

 

app.py

 

# Import required modules
import requests
import bs4
import pandas as pd

# Make a request to the webpage
url = 'https://www.worldometers.info/coronavirus/country/india/'
result = requests.get(url)

# Create the soup object
soup = bs4.BeautifulSoup(result.text, 'lxml')

# Search for div tags having the maincounter-number class
cases = soup.find_all('div', class_='maincounter-number')

# List to store the number of cases
data = []

# Find the span inside each div and get its text
for i in cases:
    span = i.find('span')
    data.append(span.string)

# Display the number of cases
print(data)

# Create the dataframe
df = pd.DataFrame({"CoronaData": data})

# Name the rows
df.index = ['TotalCases', 'Deaths', 'Recovered']

# Export the data to a CSV file
df.to_csv('Corona_Data.csv')
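 

The spans on the page typically contain comma-separated figures with surrounding whitespace (for example " 44,145,709 "). If you would rather store plain integers in the CSV, here is a minimal optional sketch of how the data list could be cleaned before building the dataframe. This is an extra refinement under that assumption, not part of the original script.

# Optional: convert the scraped strings into integers
# (assumes each entry looks like " 44,145,709 " -- adjust if the page layout changes)
clean = [int(s.strip().replace(',', '')) for s in data if s]

df = pd.DataFrame({"CoronaData": clean})
df.index = ['TotalCases', 'Deaths', 'Recovered']
df.to_csv('Corona_Data.csv')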

 


 

Now execute the above script by typing the command below.

 

python app.py
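 

If the request ever comes back empty or blocked (some sites respond with a 403 to the default requests user agent), a common workaround is to send a browser-style User-Agent header with the request. The snippet below is only a sketch with a generic placeholder value, not something the original tutorial requires.

# Optional: retry the request with a browser-style User-Agent header
headers = {'User-Agent': 'Mozilla/5.0'}  # generic value, adjust as needed
result = requests.get(url, headers=headers)
print(result.status_code)  # 200 means the page was fetched successfully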

 

 

 

As you can see, it has scraped the live coronavirus figures from the website and saved them inside the Corona_Data.csv file.
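
As a quick sanity check, you can read the generated file back with pandas. This is a small optional snippet, assuming the script above ran in the same folder.

import pandas as pd

# Read the exported CSV back; the first column holds the row labels
check = pd.read_csv('Corona_Data.csv', index_col=0)
print(check)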
