Question
My page scraper is only saving the last line to the CSV file, so only "Website" and the Goodyear URL appear. How do I write everything to the CSV file? Thanks. This is Python, by the way.
import requests
import csv
from datetime import datetime
from bs4 import BeautifulSoup

# download the page
myurl = requests.get("https://en.wikipedia.org/wiki/Goodyear_Tire_and_Rubber_Company")

# create BeautifulSoup object
soup = BeautifulSoup(myurl.text, 'html.parser')

# pull the class containing all tire name
name = soup.find(class_='logo')

# pull the div in the class
nameinfo = name.find('div')

# just grab text in between the div
nametext = nameinfo.text

# print information about goodyear logo on wiki page
#print(nameinfo)

# now, print type of company, private or public
#status = soup.find(class_='category')
#for link in soup.select('td.category a'):
#    print link.text

# now get the ceo information
#for employee in soup.select('td.agent a'):
#    print employee.text

# print area served
#area = soup.find(class_='infobox vcard')
#print(area)

# grab information in bold on the left hand side
vcard = soup.find(class_='infobox vcard')
rows = vcard.find_all('tr')
for row in rows:
    cols = row.find_all('th')
    cols = [x.text.strip() for x in cols]
    print cols

# grab information in bold on the right hand side
vcard = soup.find(class_='infobox vcard')
rows = vcard.find_all('tr')
for row in rows:
    cols2 = row.find_all('td')
    cols2 = [x.text.strip() for x in cols2]
    print cols2

# save to csv file named index
with open('index.csv', '') as csv_file:
    writer = csv.writer(csv_file)
    # actually write to the file
    writer.writerow([cols, cols2, datetime.now()])  # append time stamp
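The reason only the last row appears: `cols` and `cols2` are reassigned on every pass through the loops, and `writer.writerow(...)` runs only once, after the loops finish, so it sees just the final `th`/`td` values ("Website" and the Goodyear URL). The fix is to call `writerow` once per table row, inside a single loop. Here is a minimal sketch of that pattern using only the `csv` module; `scraped_rows` is a hypothetical stand-in for the `(th, td)` pairs you would pull from each `tr` in `vcard.find_all('tr')` (also note the file should be opened with mode `'w'`, not `''`):

```python
import csv
from datetime import datetime

# Hypothetical stand-in for the scraped infobox: in the real script each
# pair would come from one <tr>, e.g. row.find('th') and row.find('td').
scraped_rows = [
    ("Type", "Public"),
    ("Industry", "Tire manufacturing"),
    ("Website", "goodyear.com"),
]

with open('index.csv', 'w', newline='') as csv_file:
    writer = csv.writer(csv_file)
    # write one CSV row per table row, INSIDE the loop --
    # in the original, writerow ran once after the loops, so only
    # the last values were ever saved
    for heading, value in scraped_rows:
        writer.writerow([heading, value, datetime.now()])  # append time stamp

with open('index.csv', newline='') as f:
    print(sum(1 for _ in f))  # -> 3
```

In the scraper itself, the same idea means merging the two `for row in rows:` loops into one: grab the `th` and `td` from the same row, then `writerow` there, rather than printing in one place and writing in another.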