I built 100k verified URLs :D
So guys, I have been learning a lot about GSA SER over the past 2-3 months, and I'm happy to say that I think I have pretty much wrapped my head around everything in this game - SER projects, proxies, scraping, footprints, site lists, etc. etc.
So far, running only test projects, I have built 100k verified URLs, 5% of which are contextual links.
This is my own list that I scraped and verified myself. Would you say this is a good achievement for a newbie?
Time to use this list to rank at least a small site and start making the monies!
Comments
Instead choose "submitted per day" and set the number to maybe 50.
Once you run it for a little while, you'll see the submitted-to-verified ratio and can set the above number more accurately.
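To put rough numbers on that, here's a quick sketch (plain Python, with made-up example figures, nothing SER-specific) of backing a submitted-per-day cap out of the ratio you observe:

```python
# Rough sketch: work out a "submitted per day" cap from the
# submitted-to-verified ratio a project shows after a test run.
# The figures below are placeholders, not recommendations.

def submitted_per_day(target_verified_per_day: int,
                      submitted: int,
                      verified: int) -> int:
    """Submissions per day needed to hit a verified-links target,
    given the submitted/verified counts observed so far."""
    if verified == 0:
        raise ValueError("no verified links yet - run the project longer first")
    ratio = submitted / verified          # e.g. 400 submitted / 100 verified = 4.0
    return round(target_verified_per_day * ratio)

# Example: aiming for ~50 verified links/day with a 4:1 ratio
print(submitted_per_day(target_verified_per_day=50, submitted=400, verified=100))  # -> 200
```

So with a 4:1 ratio, hitting ~50 verified a day means capping submissions somewhere around 200.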
I don't use an indexer any more either. Check how many of your contextuals are indexed naturally after you have built them. It's usually around 60% for me, so I don't bother indexing. But others disagree.
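If you want to spot-check indexation yourself, here's one way to do it outside of SER. This is just a sketch and assumes you have a Google Custom Search API key and search-engine ID (API_KEY and CX below are placeholders); site: counts from the API are only a rough signal, so treat the percentage as approximate:

```python
# Standalone sketch (not SER functionality): estimate what share of a list
# of verified URLs Google reports as indexed, via the Custom Search JSON API.
import time
import requests

API_KEY = "YOUR_API_KEY"   # placeholder
CX = "YOUR_ENGINE_ID"      # placeholder; engine must be set to search the whole web

def looks_indexed(url: str) -> bool:
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CX, "q": f"site:{url}"},
        timeout=15,
    )
    resp.raise_for_status()
    total = int(resp.json().get("searchInformation", {}).get("totalResults", "0"))
    return total > 0

def indexed_share(urls: list[str]) -> float:
    hits = 0
    for url in urls:
        if looks_indexed(url):
            hits += 1
        time.sleep(1)  # go easy on the API quota
    return hits / len(urls) if urls else 0.0

# Example: sample ~100 of your verified contextual URLs and check them
# print(f"{indexed_share(sample_urls):.0%} indexed")
```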
If you are looking for speed, the best way is to import URLs directly into projects, because then SER will process each URL in sequence with no dups, which is more efficient.
But "use URLs from global site lists if enabled" still works well and is more hands off so you can just let it run with that setting. But it picks URLs randomly so there is a chance it will pick URLs that have already been processed for each project.
I usually import directly, but I also set projects to post from the global site lists as a backup in case a project runs out of targets before I can import more.
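On the dups point: if you are importing your own scraped lists, it can save time to dedupe them before feeding them to a project. A minimal sketch, nothing SER-specific, with example file names:

```python
# Minimal sketch: dedupe a scraped target list before importing it.
# File names are just examples. Dedupes by exact URL by default;
# switch to keying on the domain if you only want one target per site.
from urllib.parse import urlparse

def dedupe_urls(in_path: str, out_path: str, by_domain: bool = False) -> None:
    seen = set()
    kept = []
    with open(in_path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            key = urlparse(url).netloc.lower() if by_domain else url
            if key not in seen:
                seen.add(key)
                kept.append(url)
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(kept) + "\n")

# Example: one target per domain for contextual projects
# dedupe_urls("scraped_targets.txt", "targets_deduped.txt", by_domain=True)
```

For contextual projects I'd key on the domain so you only keep one target per site.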
Hope it helps.