
Google proxy block

245 Comments

  • spunko2010 Isle of Man
    @PerryM Sven is on holiday for a week, so you will be waiting until then to find out if there is an issue :)
  • Yes, that's what I had guessed.

    I am adding verified links with the scraped proxies so I'm still in business.
  • edited June 2013
    @spunko2010 Thanks. I've been on top of SER for 11 months; my settings are optimized beyond optimization - I've followed all the suggestions from the SER advisors etc., so I don't want to go down the settings-change road if possible. It's only in the last week that this issue cropped up for me as well. I think this thread is going to grow quickly and Sven will address it eventually. Maybe Google has always had this proxy block and Sven just added it to the log, but I now get it for every Google engine selected, every time, so I just deselected them all. That's not going to be good for my Google rankings, though.
  • Could this proxy block issue be the new Penguin 2.0 Google algorithm detecting spam faster?
  • AlexR Cape Town
    @Ozz - I think you're right about the 2-step ban; that's what I've been seeing. What I also saw when I manually placed some proxies in my browser on my VPS made me wonder about the search string.

    1) Block 1 - a warning with a captcha.
    2) Block 2 - a full block, but only on searches. The normal Google page loads fine, but when you do a search you get a banned message that can last a few hours. What's interesting is that the proxy still passes the Google search-string test, since the Google page shows, yet on an actual search it gets blocked. That's why I think many people are seeing "passed" proxies that are actually blocked: SER finds the string, but on a search the proxy is blocked by Google.

    Great idea about a banned proxy holding box!
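A minimal sketch of what a search-level check like that could look like, in Python with the requests library (the proxy format, query, and block markers below are assumptions for illustration, not SER's actual test logic):

```python
# Test a proxy against an actual Google SEARCH, not just the homepage,
# since the block described above only triggers on search requests.
# Assumes "host:port" HTTP proxies; the markers are guesses at Google's
# "unusual traffic" / sorry-page responses.
import requests

BLOCK_MARKERS = ("unusual traffic", "/sorry/", "captcha")

def proxy_search_ok(proxy: str, query: str = "test", timeout: int = 15) -> bool:
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        resp = requests.get("https://www.google.com/search",
                            params={"q": query},
                            proxies=proxies, timeout=timeout)
    except requests.RequestException:
        return False              # dead/unreachable proxy
    if resp.status_code == 503:   # Google's usual rate-limit status
        return False
    body = resp.text.lower()
    return not any(marker in body for marker in BLOCK_MARKERS)
```

A proxy that passes a homepage string test can still return False here, which matches the "passed but actually blocked" behaviour described above.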
  • spunko2010 Isle of Man
    edited July 2013
    I did ask in another thread for Sven to give us the chance to enter the CAPTCHA from Google manually within SER, just like when you set Ask User for other CAPTCHAs, but he said no. Maybe if others want this feature they can request it too? More requests = more likely it happens. Personally, I think if that feature is integrated and works, it sets SER ahead of all others.

    https://forum.gsa-online.de/discussion/4510/google-captcha-prompt-idea
  • Ozz
    edited July 2013
    I'm thinking it would be better to give the proxies that were asked to answer a captcha some time to rest (= penalty box).
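A minimal sketch of such a penalty box in Python (the class, names, and the two-hour cooldown are illustrative assumptions, not SER's internals):

```python
# Bench a proxy when it triggers a captcha and release it only after a
# cooldown, instead of hammering it again right away.
import time

class PenaltyBox:
    def __init__(self, cooldown_seconds: int = 2 * 3600):  # assumed 2h rest
        self.cooldown = cooldown_seconds
        self.benched = {}  # proxy -> timestamp of the captcha event

    def bench(self, proxy: str) -> None:
        self.benched[proxy] = time.time()

    def usable(self, proxy: str) -> bool:
        hit = self.benched.get(proxy)
        if hit is None:
            return True
        if time.time() - hit >= self.cooldown:
            del self.benched[proxy]  # rested long enough, release it
            return True
        return False
```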
  • Count me in, as I never experienced this behavior with the proxies before. It has to be rectified somehow.
  • spunko2010 Isle of Man
    Seems some of the Google blocks have been removed from my proxies. But I am still seeing Startpage blocks :S
  • I also believe Google has put stricter rules in place to discourage/slow down scraping.

    My proxies are getting banned faster than I have ever experienced.
  • spunko2010 Isle of Man
    Are you using proxy-hub? I wonder if their IP ranges have been flagged...
  • I use various proxy providers, all with the same results. Setting delays doesn't seem to help (it just slows down scraping). The only answer is to have a larger number of proxies available.

  • I still see tons of proxy blocks with Google, yet everything is green when testing with the GSA SER proxy checker. I've run the same test for a week, so I don't think the proxy checker is catching these blocks.
  • I'm seeing them every now and then too.
  • I use Proxy Hub with semi-private proxies and until this week they have always been OK, but this last week pretty much two-thirds of them have been banned by Google.

    I think @jpvr90 might be right and the answer is to get more proxies.
  • spunko2010 Isle of Man
    How many proxies are you running? I have 50 semi-private.
  • I am running 60 semi-dedicated proxies too.
  • @Spunko2010 30 semi-private, but I'm thinking I need at least 50

    I've been testing Rightproxy.com to see if their managed list helps with searching, but they are nothing more than slightly higher-quality public proxies: the list is just refreshed more quickly, so you get all the same issues as with public proxies.

    I ran them all day yesterday and achieved a max LPM of about 50 with 500 threads. When I logged into the server this morning at about 8am, the LPM was something like 12, so I switched back to using semi-private proxies, and my LPM is now 160 even though I've only been running 150 threads because of the issues with Google blocking.


  • edited July 2013
    You're fine with 20 private proxies, seriously - even with 10. I'm running 2 copies of SER with 10 private proxies each; they get a nice big list to digest every couple of days.

    I don't know how often I have to repeat it: use SER for posting, not for scraping. There are other programs for that, and you're wasting a lot of resources and money if you use SER for scraping. I did it too and burned a huge number of proxies... go Scrapebox + public proxies.

    This is what happens if you use SER for posting and feed it high-quality lists:

    [screenshot: SER stats showing the LPM reached with 10 proxies]

    Of course it doesn't stay there for long, but you get the point. And this is done with only 10 proxies, as you can see.

    I don't wanna brag here (well, maybe a little), but maybe this makes some people understand how crucial it is to pre-scrape.

    Best Regards
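As a rough sketch of that pre-scrape workflow, assuming Python and example file names: harvest URLs with an external scraper (Scrapebox/GScraper), dedupe by domain, then import the trimmed list into SER so the posting proxies never touch a search engine.

```python
# Dedupe a harvested URL list by domain before feeding it to SER.
# File names are examples; Scrapebox can also dedupe on its own.
from urllib.parse import urlparse

def dedupe_by_domain(in_path: str, out_path: str) -> int:
    seen, kept = set(), []
    with open(in_path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            domain = urlparse(url).netloc.lower()
            if domain and domain not in seen:
                seen.add(domain)
                kept.append(url)
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(kept))
    return len(kept)

# e.g. dedupe_by_domain("scrapebox_harvest.txt", "ser_import.txt")
```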


  • What's the best program to use for scraping to ultimately feed SER?
  • The answer is already on this page bro, c'mon.
  • I know SB, but are there any other options?
  • GScraper or Scrapebox, look no further.

  • royalmice WEBSITE: https://asiavirtualsolutions.com | SKYPE: asiavirtualsolutions
    The problem I have with GScraper is that it starts off great but then quickly burns up the proxies and the scrape becomes a crawl.
    I normally use a large list of footprints (at least a few hundred) with about 20 keywords and 30 private proxies. It starts off at a few thousand URLs/min and after a while slows down to a couple hundred/min.

    Has anyone tried GScraper with public proxies?
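To put numbers on why the scrape slows to a crawl: a few hundred footprints times ~20 keywords is thousands of distinct queries per run, all hitting the engines through the same 30 proxies. A sketch of that expansion, assuming plain-text input files (names are examples):

```python
# Expand footprints x keywords into the query list a scraper would run.
# 300 footprints x 20 keywords = 6,000 queries from one small setup.
from itertools import product

def build_queries(footprints_path: str, keywords_path: str, out_path: str) -> int:
    with open(footprints_path, encoding="utf-8") as f:
        footprints = [line.strip() for line in f if line.strip()]
    with open(keywords_path, encoding="utf-8") as f:
        keywords = [line.strip() for line in f if line.strip()]
    queries = [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(queries))
    return len(queries)
```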
     
  • I wanted to try and buy GScraper, but heck, I'm not impressed. Their blog, which supposedly mentions a download of the free version, doesn't load. So the free version is just there for marketing; I can't even get my hands on it.

    Next, it turned out to be $68 instead of $38, with no discounts. Their proxies went to $66/mo instead of $48, and I'm seeing so many threads on BHW and other forums about its recent deterioration.

    I can't buy software that requires spending countless hours hunting for a download link that, in the end, leads nowhere. No working free trial or free-version download link can be found on their website, just links to a blog that hasn't loaded in the past 2-3 weeks when I tried.

    I have SB and I'm quite happy with it.
  • I used GScraper with their proxy service; 6 months ago it was really good (it could scrape up to 20k URLs/min), but now it is utter crap and way overpriced.
    Now I use a public proxy list service + SB; I get better results and it's a lot cheaper.
  • A public proxy list service - can you tell me where you get that list?
  • edited July 2013
    Why even bother scraping lists yourself? Why don't we pool our money and buy the lists from ScrapeBrokers and Freshlistbank? That way, we'd have good pre-scraped lists to feed into SER every month.
  • @Username I'd assume those lists are oversold and spammed to death.
  • spunko2010 Isle of Man
    I'm happy paying for 100 private proxies every month and just using SER for scraping too.