Regularly save scraping progress
OK, I'm quite annoyed, so instead of creating a rage thread, I'll channel this rage into something useful.
Over the last couple of days I was scraping contextual URLs for my latest project. At ~95% scraping progress my computer crashed. After turning it back on, I realised that the SER scraper doesn't save the URLs to your custom .txt file in real time. It only saves once the run is finished or aborted.
So how about making it save every ~1-5%? That way, instead of losing the whole list, you only lose the last 5% or so.
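To illustrate the idea (this is just a sketch, not SER's actual code): instead of holding every scraped URL in memory until the run ends, the scraper could append results to the output file in small batches. The `BATCH_SIZE` value and function name here are assumptions for illustration.

```python
# Sketch of batched checkpointing: append scraped URLs to disk every
# BATCH_SIZE results, so a crash loses at most one batch.
BATCH_SIZE = 100  # hypothetical interval, not a SER setting

def scrape_with_checkpoints(url_source, out_path="scraped_urls.txt"):
    buffer = []
    for url in url_source:
        buffer.append(url)
        if len(buffer) >= BATCH_SIZE:
            # flush the batch to disk and clear the in-memory buffer
            with open(out_path, "a", encoding="utf-8") as f:
                f.write("\n".join(buffer) + "\n")
            buffer.clear()
    if buffer:
        # flush any final partial batch at the end of the run
        with open(out_path, "a", encoding="utf-8") as f:
            f.write("\n".join(buffer) + "\n")
```

With this pattern, a crash mid-run only costs the URLs still sitting in the unflushed buffer, never the whole list.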
Comments
Did you mean 'feed'? If so, I was scraping using GSA footprints + a KW list.
Edit: Where does SER save the URLs if no custom .txt file is selected?
I'm scraping again now and it's updating my custom file every ~100 URLs, just as you said it would.
Thanks for the help Sven.