
I've scraped millions and millions of URLs with GScraper, what's next?

I have GScraper running 24 hours a day and I'm getting millions of deduplicated URLs from various GSA footprints and general keywords. I don't know what the proper way to manage this is:

1. Import them into GSA SER via Options - Advanced - Tools - Import URLs (Identify platform and sort in), or
2. Create a project pointed at bing.com and import the target URLs

If I use method 1 with 50 projects running, it takes months to get through 1 million URLs on my dedi (8 GB RAM, 1 Gbps line), and I'm getting millions of new URLs every single day from GScraper.
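In case it helps, here is roughly what I mean by pre-processing the lists before importing: dedupe across all the scraped files and split the result into smaller batches that SER can work through one at a time. This is just a sketch in Python; the scraped/ folder, the output file names and the 100k batch size are placeholders I made up, not anything GScraper or SER produces.

    # Sketch: dedupe URLs across all scraped list files and split the
    # result into smaller batch files for incremental import into SER.
    # Assumes one URL per line in plain-text files under scraped/.
    # Note: the seen-set holds every unique URL in memory, so very large
    # lists may need a different approach (e.g. sort -u on disk).
    import glob

    BATCH_SIZE = 100_000  # URLs per output file (placeholder value)

    def flush(batch, batch_no):
        # Write one batch of unique URLs to its own file.
        with open(f"batch_{batch_no:04d}.txt", "w", encoding="utf-8") as out:
            out.write("\n".join(batch) + "\n")

    seen = set()
    batch, batch_no = [], 0

    for path in glob.glob("scraped/*.txt"):
        with open(path, encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if url and url not in seen:
                    seen.add(url)
                    batch.append(url)
                    if len(batch) >= BATCH_SIZE:
                        flush(batch, batch_no)
                        batch, batch_no = [], batch_no + 1

    if batch:
        flush(batch, batch_no)

That way each project only gets a manageable chunk at a time instead of the whole raw dump.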

@ron, any good ideas on how to properly manage this process?
