@Ron I was running 2 projects and now just one. In the first days there were approx. 200 verified links per project; after that no more than 100 verified (I just checked the diagram). I used different platforms for the first 3 days, afterwards just contextual ones.
Use as many search engines as possible and fire away. If you get banned, get new proxies from your provider's support and increase the search engine wait time. Repeat this until you find a good ratio!
I think my question relates to this thread, so I'll just ask it here.
Please tell me: Startrip said not to use SER for scraping. How do I stop SER from scraping? I don't understand how that works.
I started thinking about this because I just got 30 private proxies, and they're burning left and right. I'm running 250 threads with a timeout of 150 seconds.
Should I change some settings in Proxies configuration -> Options as well?
I just got GScraper running on another VPS, so I think I don't need to scrape anything with GSA. Or do I?
Also, my LpM seems to have dropped below 10 lately.
Sometimes it feels a little overwhelming how much stuff there is to know.
So, is GSA automatically scraping for something? How can I disable it?
Lately my LpM has dropped below 10, and my 30 private proxies are burning left and right.
I only use Google as a search engine.
I DO have GScraper running on another VPS, so I don't need to scrape anything with GSA.
Select project > Edit > Options > go to the search engine box > uncheck all search engines.
For anyone else whose proxies are burning out, try this.
Go to Proxies configuration -> Options
Then try with these settings:
threads: 10
timeout: 60
My proxies are so rock-solid right now I can't believe it!
I also unchecked the option that disables private proxies when they're detected to be down. I've found that if I test a proxy a couple of seconds after it goes red, it's working again, so I don't want to have to keep re-enabling those proxies all the time.
Hope this helps!
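If you want to sanity-check proxies outside of SER with the same conservative settings, here is a minimal Python sketch, assuming a proxies.txt file with one ip:port per line and plain HTTP proxies (the test URL is just a placeholder). It checks each proxy with 10 threads and a 60-second timeout, mirroring the settings above.

# Rough proxy check outside SER: 10 threads, 60-second timeout.
# Assumes proxies.txt holds one ip:port per line and the proxies speak plain HTTP.
import concurrent.futures
import requests

TEST_URL = "https://www.google.com/"  # placeholder target
THREADS = 10
TIMEOUT = 60

def check(proxy):
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        r = requests.get(TEST_URL, proxies=proxies, timeout=TIMEOUT)
        return proxy, r.ok
    except requests.RequestException:
        return proxy, False

def main():
    with open("proxies.txt") as f:
        proxy_list = [line.strip() for line in f if line.strip()]
    with concurrent.futures.ThreadPoolExecutor(max_workers=THREADS) as pool:
        for proxy, alive in pool.map(check, proxy_list):
            print(f"{proxy}: {'OK' if alive else 'failed'}")

if __name__ == "__main__":
    main()

Anything that keeps failing here is worth raising with your proxy provider rather than blaming SER.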
I am having similar issues. This last time, I loaded fresh dedicated proxies into GSA.
I started the process using all US engines.
GSA reported them as blocked immediately.
Could this be a faulty test by GSA that keeps it from using them correctly?
I turned off all testing in the proxies options, tested them in Scrapebox and they are fine.
Here are my settings.
30 Dedicated proxies
Set to 20 threads to be safe
Timeout 150
Anyone have any idea why this is occurring?
When I turn the proxies off, GSA posts great.
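For what it's worth, you can cross-check GSA's verdict independently. Below is a rough Python sketch, assuming plain HTTP proxies (the proxy address and query are placeholders): it fetches a Google search results page through the proxy and treats an HTTP 429/503 or a redirect to Google's /sorry/ CAPTCHA page as blocked.

# Cross-check whether Google has really blocked a proxy: fetch a search
# results page through it and treat HTTP 429/503 or a redirect to the
# /sorry/ CAPTCHA page as "blocked". Proxy address and query are placeholders.
import requests

def google_blocked(proxy, timeout=30):
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        r = requests.get(
            "https://www.google.com/search",
            params={"q": "test"},
            headers={"User-Agent": "Mozilla/5.0"},
            proxies=proxies,
            timeout=timeout,
        )
    except requests.RequestException:
        return None  # proxy unreachable; not necessarily banned
    if r.status_code in (429, 503) or "/sorry/" in r.url:
        return True
    return not r.ok

if __name__ == "__main__":
    for p in ["1.2.3.4:8080"]:  # replace with your dedicated proxies
        print(p, "->", google_blocked(p))

If this comes back clean while GSA still flags the proxies, the problem is more likely the test or the settings than the proxies themselves.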
Proxies -> Options
Your proxy settings could be waaaay off. My proxies became super tight after doing those tweaks.
Maybe the popular engines do not like search strings?
BTW @Artsi, your advice is good. It didn't solve my problem, but I can definitely see how the standard settings will set you wayyy off.
Also, is it normal for the submitted number to drop?
Without proxies it goes up nicely; with proxies it keeps dropping without verifications going up.
I'm running my 30 private proxies at about 300 threads. I've tested even bigger numbers, and they were still good.
Does GSA just disable your proxies, or does it say they are not working?
I don't know if it's good practice or not, but I unchecked "disable non-working private proxies", since they tend to work whenever I test them.
GSA says they are not working; since then I have played around with enabling/disabling them.
I swapped to another proxy company, and they seem to be doing better.
I am going to try bumping the threads up now and see if that blocks the proxies.