Hey, a quick question.
I'm about to make the fresh-URL flow from GScraper a bit more consistent. What I was thinking is that I could dedicate one of the global folders (like Failed), just load all my URLs there, and have projects use them automatically.
Question is this: do the URLs need to be sorted by engine, or can I just dump a text file full of URLs into that folder?
Oh, and another question about driving GSA!
When I'm importing URLs into a project, how do I know it has already gone through all of them? I know I should disable all the search engines, and perhaps the global lists, but how do I tell when the project is done with my list?