Hi,

Thanks a lot for this great package. It's awesome and really worth all the stars it gets.

I was wondering how I can feed multiple local HTML files to the scraper instead of keywords. Basically, I just want to use the parser module of the package to scrape HTML files that are already downloaded to the local file system.

I see there is a scrape_from_file config option, but it takes a single file and still needs a keyword array (even though the scraper doesn't use the keywords). Maybe something like a scrape_file_loop instead of the keyword scraping loop, where I can assign an array of files or URLs that are ready to parse.

Is there any way I can achieve this?
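In the meantime, the workaround I'm considering is to drive the existing single-file scrape_from_file option from my own loop: list the saved HTML files, then start one scrape per file with a single placeholder keyword so the keyword array is non-empty. A minimal sketch is below; `runScraper` is a hypothetical stand-in for however the package's scrape function is normally invoked, and the folder path is made up for illustration.

```js
// Workaround sketch: loop over local HTML files and run the scraper once per
// file via the existing scrape_from_file option. runScraper() is a
// hypothetical placeholder for the package's real scrape entry point.
const fs = require('fs');
const path = require('path');

const htmlDir = './saved_pages'; // hypothetical folder with downloaded pages

async function runScraper(config) {
  // Replace this stub with the package's actual scrape call, passing
  // `config` the same way you would for a normal keyword-based run.
  console.log('Would scrape with config:', config);
  return {};
}

(async () => {
  const files = fs.readdirSync(htmlDir).filter((f) => f.endsWith('.html'));

  for (const file of files) {
    const config = {
      scrape_from_file: path.join(htmlDir, file), // one local file per run
      keywords: ['placeholder'],                  // ignored here, but must be non-empty
    };
    const results = await runScraper(config);
    console.log(`Parsed ${file}:`, results);
  }
})();
```

Something like this works, but a built-in scrape_file_loop would avoid restarting the scraper for every file.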
pankajjha-cd changed the title from "Scrape the page using html sourcecode" to "Parse multiple local html files instead of keywords" on Jun 4, 2020