Food For Thought - Stop Complaining
I see everyone blaming Sven for timeout issues and sock errors, and complaining that it's SER that's wrong and not their proxies.
Then they downgrade SER and say "downgraded, still nothing. you need to fix this ASAP".
Sock errors come down to your proxies. News flash: if you're using shared proxies, stop.
I personally use 100 dedicated from proxy-n-vpn, 100 dedicated from buyproxies and 30 shared from buyproxies.
As you can see, I have a lot more dedicated than shared. Shared proxies are supposedly hammered by up to two other people, though if you ask me, the providers resell their shared proxies to more than three users; they can't be proven to be selling to more than three anyway.
So you run the risk of your proxies being used by random people you don't even know, in SEO tools abusing Google at max threads.
Then when your shared proxy, which five other people are using, can't connect to a scraped site that takes five minutes to load because it has 14,000 comments on it, you complain.
Buy dedicated proxies (buyproxies is really good). Set the right ratio of proxy:threads. Configure your timeout correctly, and last, but not least: stop complaining.
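If a rule of thumb helps, here is a minimal Python sketch for sanity-checking the proxy:thread ratio before hitting start. The 3.5 threads-per-dedicated-proxy figure is only inferred from setups reported in this thread (e.g. 200 dedicated proxies on 700 threads); it is an assumption, not an official SER number.

# Rough sanity check for the proxy:thread ratio before starting SER.
# ASSUMPTION: ~3.5 threads per dedicated proxy, inferred from setups
# reported in this thread, not an official GSA SER recommendation.
def max_safe_threads(dedicated_proxies, threads_per_proxy=3.5):
    return int(dedicated_proxies * threads_per_proxy)

for proxies in (10, 30, 100, 200):
    print(proxies, 'proxies -> cap threads around', max_safe_threads(proxies))

On those numbers, 10 proxies caps out around 35 threads, which is why running hundreds of threads on a handful of proxies drowns you in sock errors.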
Comments
@yashar your theory made sense, so I broke out the PayPal and bought a bunch of "dedicated" proxies from buyproxies, then removed all of my semi-dedicated ones.
Using only dedicated proxies, I'm still getting a crapload of sock errors right off the bat, scrolling continuously down the log. Still just as many sock errors as before.
It was worth a try, but it did not help one bit in my case. Don't consider this a complaint; I have just ignored the errors up to now. I simply felt obligated to let everyone know about my lack of results.
I switched from 200 shared proxies from newipnow to 100 from buyproxies and 100 from proxy-n-vpn.
My LpM went from 80 to 250+.
Oh, and having 10 dedicated proxies isn't going to be of any help if you run a shitload of threads. I didn't think it'd be that hard of a concept.
Yashar,
I'm not running that many projects.
I previously used ten semi-dedicated proxies and increased all the way to 30 semi-dedicated, and that did not get rid of the sock errors.
Now I'm only running 100 threads with 10 new dedicated proxies, and the sock errors are continuous. They started immediately, as soon as I hit the start button with the brand-new proxies. So I do not know how these dedicated proxies could have been "used up by others" if I just started using them (and they are supposed to be dedicated only to me).
Maybe I'll try lowering the threads to 50 and see if that helps.
What would you keep your html timeout at in my case? (I'm at 150 seconds now)
What would be the ideal custom time between search queries? (I'm at 180 seconds now)
I'm running 200 dedicated proxies on 700 threads.
Since footprints like "leave a comment" are used by more engines than just general blogs, a page that was scraped with that footprint but isn't recognized by GSA will come back as "no engine matches".
Find some better, more specific footprints and edit them with the SER footprint editor like I did; there's a quick sanity-check sketch below.
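If you want to sanity-check a footprint outside of SER, here's a minimal Python sketch: fetch the scraped URL and see which footprints actually appear in the page source. The footprints below are made-up examples for illustration, so substitute the real ones from SER's footprint editor.

import urllib.request

# Made-up example footprints; replace with ones from SER's footprint editor.
GENERIC = ['leave a comment']        # matches many engines, hence "no engine matches"
SPECIFIC = ['powered by wordpress']  # hypothetical engine-specific marker

def matching_footprints(url, footprints, timeout=30):
    # Download the page and report which footprints appear in its source.
    html = urllib.request.urlopen(url, timeout=timeout).read()
    text = html.decode('utf-8', 'ignore').lower()
    return [fp for fp in footprints if fp in text]

print(matching_footprints('http://example.com/', GENERIC + SPECIFIC))

The more specific the footprint that matches, the more likely SER is to recognize the engine instead of reporting "no engine matches".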
You can use a lower number, but 10 is not enough for hundreds of threads.
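For what it's worth, the arithmetic behind that, using the two setups reported in this thread (the per-proxy figures are just ratios from those posts, not hard limits):

# Threads-per-proxy ratio for the two setups mentioned in this thread.
for proxies, threads in ((200, 700), (10, 100)):
    print(threads, 'threads /', proxies, 'proxies =', round(threads / proxies, 1), 'per proxy')

The 200-proxy rig runs 3.5 threads per proxy; the struggling 10-proxy setup runs 10 per proxy, nearly three times the load on each IP.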