Verified lists from servers 1 and 2 get a 40% success rate on server 3 - why?
howdy,
I've noticed this often enough that I've stopped just living with it and am putting the problem in question format.
Today I took 70K verified comments/links from 2 boxes and set them to post on a 3rd box with the exact same parameters. It posted to only about 20K of them. That's really bad given that these are verified comments/trackbacks/etc. posted 1-2 days ago - not the kind of URLs that go down in large numbers.
Also - the "do not post to the same site twice" parameters - do they apply to inner URLs or to whole domains? If I import 50 different URLs on the SAME domain that can be commented on, do "5 max posts per" and "2 max accounts per" apply per URL you're submitting to, or to the entire domain? I.e. is it 5x2 = 10 comments total across the 50 imported URLs, or 10 comments on each of the 50 URLs?
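To spell out the two readings I mean (just a throwaway Python sketch - the variable names are mine, nothing here comes from SER itself):

```python
# Hypothetical illustration of the two possible readings of
# "max posts per site = 5" and "max accounts per site = 2".
urls = [f"http://example.com/post-{i}" for i in range(50)]  # 50 inner URLs, one domain

MAX_POSTS_PER = 5
MAX_ACCOUNTS_PER = 2

# Reading A: limits apply per *domain*
#   -> 5 posts x 2 accounts = 10 comments total across all 50 URLs
total_per_domain = MAX_POSTS_PER * MAX_ACCOUNTS_PER           # 10

# Reading B: limits apply per *inner URL*
#   -> 10 comments on each of the 50 URLs
total_per_url = MAX_POSTS_PER * MAX_ACCOUNTS_PER * len(urls)  # 500

print(total_per_domain, total_per_url)  # 10 vs 500
```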
What's up? What can I do to improve the rates?
Comments
Are your results the same when you import the targets directly into the project?
That way you can rule out whether importing from a file is the problem, or maybe the proxies as @trevor_bandura suggested, or maybe even something else.
Point is, sometimes you have to try different things to find the source of the problem.
^^Everybody inflates the value of their verified list. Over 90% are duplicate domains and URLs. And then with the remaining 10% probably half the links don't exist (over time). So 5% seems like the true residual over time.
**The kicker** - You use your verified sitelist for links, and you keep perpetuating the cycle of the same links.
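If you want to sanity-check your own list, a rough sketch of that dedup math (assumes a plain txt with one URL per line; the filename is just a placeholder):

```python
# Rough check of how many unique URLs / root domains a "verified" list really contains.
from urllib.parse import urlparse

def root_domain(url: str) -> str:
    """Return the hostname with a leading 'www.' removed."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

with open("verified_urls.txt", encoding="utf-8", errors="ignore") as f:
    urls = [line.strip() for line in f if line.strip()]

unique_urls = set(urls)
unique_domains = {root_domain(u) for u in unique_urls}

print(f"total lines:    {len(urls)}")
print(f"unique URLs:    {len(unique_urls)}")
print(f"unique domains: {len(unique_domains)}")
```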
@trevor_bandura - I'm scraping free proxies on the 2 servers that got the 70K verified, and using 300-400 squids on 100 threads to post to them on my master server - so the rate should be at least as high.
@gooner - testing this now, but I don't get why an unused verified global folder with a 70K txt of URLs would load into a new GSR project any differently than importing targets from a file into a new project. Same thing to the bot, I'd think - just importing a list from a txt.
@brandon - no, these were 70K verified over the past 2-3 days. They wouldn't have gone bad that quickly, nor were there previous verifieds in that file since it was a fresh install. That's why it's confusing.
@ron - same thing, I guess, if directed at me. This was a fresh list, if it wasn't - I wouldn't be wondering.
I do know Xrumer well and use it - I learned it before I knew about GSR, so I understand that list quality goes bad quickly, etc.
That's why this is such a puzzle. The private squids are used for nothing but scraping - and if they're not blacklisted on Google, I can't imagine them being blacklisted on crappy tier-3 comment/trackback sites or global blacklists. Tested a few and they seem all clear.
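If anyone wants to spot-check proxies the same way, a minimal sketch (placeholder host/port values, assumes the Python requests library, nothing SER-specific):

```python
# Quick liveness check for a handful of proxies.
# The host/port entries are placeholders - swap in your own squids.
import requests

proxies_to_test = [
    "http://203.0.113.10:3128",
    "http://203.0.113.11:3128",
]
TEST_URL = "http://httpbin.org/ip"  # any lightweight endpoint works

for proxy in proxies_to_test:
    try:
        r = requests.get(TEST_URL, proxies={"http": proxy, "https": proxy}, timeout=10)
        print(f"{proxy} -> OK ({r.status_code}), saw IP: {r.json().get('origin')}")
    except requests.RequestException as exc:
        print(f"{proxy} -> FAILED ({exc.__class__.__name__})")
```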
Thanks for the suggestions - testing on a new project and through an unused "global list" as an import point now.
Also going to test the full list on one of the two servers - see what that does.
But if you let SER post from a specific folder, it can try to post to the same URL repeatedly = better success rate.
SER is multi-threaded software; perfectly good URLs will fail sometimes, that's the nature of the beast - so better to give it multiple opportunities to make a successful post.
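To put rough numbers on that (a generic retry illustration with made-up rates, not SER's actual internals):

```python
# Even if a single attempt on a perfectly good URL only succeeds, say,
# 60% of the time (timeouts, proxy hiccups, etc.), several passes over
# the same target push the combined rate way up.
import random

ATTEMPTS = 3
PER_ATTEMPT_SUCCESS = 0.60  # made-up single-attempt success rate

def try_post(url: str) -> bool:
    """Stand-in for one posting attempt; randomly fails like a flaky target."""
    return random.random() < PER_ATTEMPT_SUCCESS

def post_with_retries(url: str, attempts: int = ATTEMPTS) -> bool:
    """Succeed if any one of the attempts goes through."""
    return any(try_post(url) for _ in range(attempts))

# Theoretical combined rate: 1 - (1 - p)^n
combined = 1 - (1 - PER_ATTEMPT_SUCCESS) ** ATTEMPTS
print(f"single attempt: {PER_ATTEMPT_SUCCESS:.0%}, {ATTEMPTS} attempts: {combined:.1%}")
# -> single attempt: 60%, 3 attempts: 93.6%
```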
Not as low as 30 or 40%, but definitely lower than usual.
@sven could an update be responsible?