Help me to understand.
My settings allow 3 verifications per URL per day.
There are 500 links in the task.
The project is also set to take links ONLY from the verified list.
The verified list contains more than 10,000 successful placements.
It should therefore add 1,500 URLs per day, but the project added only 700 and then stopped.
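The expected figure above comes from simple multiplication; a quick sanity check of the numbers stated in the question (a back-of-the-envelope sketch only, not tied to any actual GSA SER setting names):

```python
# Hypothetical throughput check based on the settings described above.
VERIFICATIONS_PER_URL_PER_DAY = 3   # per-URL verification limit from settings
LINKS_IN_TASK = 500                 # number of links in the task

expected_per_day = VERIFICATIONS_PER_URL_PER_DAY * LINKS_IN_TASK
actual_per_day = 700                # what the project actually produced

print(expected_per_day)                    # 1500
print(expected_per_day - actual_per_day)   # 800 URLs per day short
```

So the project is delivering fewer than half of the theoretical maximum, which is the gap the replies below try to explain.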
Comments
Beyond that, it's down to the site list and working sites. You say you have 10,000 working sites but have you removed the duplicate domains? When was the last time you checked that all these sites actually work?
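The duplicate-domain check mentioned above can be done outside the tool as well; a minimal sketch, assuming the site list is just a collection of URLs (the example URLs are made up):

```python
# De-duplicate a site list by domain, keeping the first URL seen per domain.
from urllib.parse import urlparse

urls = [
    "https://example.com/wiki/page1",
    "https://example.com/articles/page2",   # same domain as above -> dropped
    "https://another-site.org/wiki/home",
]

seen = set()
deduped = []
for u in urls:
    domain = urlparse(u).netloc.lower()
    if domain not in seen:
        seen.add(domain)
        deduped.append(u)

print(len(deduped))  # 2 unique domains
```

A 10k-entry list with heavy domain duplication would inflate the "working sites" count without adding real placement targets.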
Compared to my own link numbers, 10k working sites in article and wiki is extremely high. I didn't think there were that many working sites in these 2 modules. If there are, I'd better get scraping lol
Engines like Gnuboard can go down very quickly. A site may have worked yesterday but fail today because your IP got blocked, for example.
Even things like failed captchas can affect the number of verified links.
Also, I see you have SER engines and Web 2.0 selected: do these engines actually give you any links?