This stresses me out; it makes no sense.
I set a project's platforms based on the highest volume of verifications and the highest verified percentage (from the "show stats, list 1 vs list 2" section).
In stats terms this has over 200K verified links.
My site lists have > 6,000,000 verified links.
When running a project to get 1,000 verified links, it's a piece of cake: a couple of hours and done, every time.
When running a project to get 5,000 verified links, the VpM is 80 for the first couple of hours, then drops right down to 3 and never recovers. So in real terms a 5,000-verified-link project races up to around 2,300 and then crawls along at almost nothing for days.
There are no limits on submission rates, and it's set to submit to almost anything: no PR filter, no follow filter, no excluded IPs/sites, no keyword-in-URL/domain filter...
What the hell... why can't it just race to 5,000 given there are hundreds of thousands of URLs stored locally? The same thing happens when scraping is involved; both approaches show the same race-then-crawl behaviour.
I pick sites with an average success rate of 70% and tens of thousands of submissions/verifications to maximise the dataset, yet here I am looking at, for example, submissions = 6363, verifications = 224 (roughly 3.5%).
Are the list comparison stats BS? I notice some entries where the verification figure is higher than the submission figure on smaller numbers, yet there is still a positive percentage attached to it.
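For what it's worth, here's a minimal sketch of how I read those percentages, assuming the list-comparison figure is simply verified ÷ submitted × 100 (an assumption on my part; I don't know the exact formula the stats screen uses). It also shows how a small sample where verified exceeds submitted still produces a "positive" percentage:

```python
# Sketch only: assumes the stats percentage is verified / submitted * 100.
def verification_rate(submitted: int, verified: int) -> float:
    """Verified links as a percentage of submitted links."""
    if submitted == 0:
        return 0.0
    return verified / submitted * 100

# Figures from my running project vs. what the list stats suggested:
print(verification_rate(6363, 224))   # ~3.5% observed
print(verification_rate(1000, 700))   # ~70% expected from the list stats

# On a small sample, verified can exceed submitted and the percentage
# is still "positive" (here it's over 100%).
print(verification_rate(40, 55))      # 137.5
```

If that's really all the percentage means, then on small sample sizes it says very little about what a project will actually achieve.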
Comments
Dunno what makes the bot do that.
If I specifically take a chunk of such a list and import the URLs from the clipboard, it'll post fine. Why so confused, GSR bot, why?