
How many proxies do you use?

I have 25 proxies from SquidProxies, but GSA seems to be saying they are all blocked by Google and Yahoo (I also use them for ScrapeBox). I know there are people doing fine, so what am I doing wrong? I have the custom wait time between search engine queries set to 100 seconds.
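For context, that custom wait amounts to a per-proxy cooldown between queries. A minimal sketch of that rotation logic (Python, with placeholder proxy addresses; an illustration of the idea, not GSA's actual implementation):

```python
import time
from itertools import cycle

WAIT_SECONDS = 100  # the custom wait between search-engine queries

class ProxyRotator:
    """Round-robin proxies, reusing one only after its cooldown has passed."""

    def __init__(self, proxies, wait=WAIT_SECONDS):
        self.proxies = list(proxies)      # placeholder addresses, e.g. "1.2.3.4:8080"
        self.pool = cycle(self.proxies)
        self.wait = wait
        self.last_used = {}               # proxy -> timestamp of its last query

    def next_proxy(self, now=None):
        """Return the next rested proxy, or None if all are still cooling down."""
        now = time.time() if now is None else now
        for _ in range(len(self.proxies)):
            proxy = next(self.pool)
            if now - self.last_used.get(proxy, 0.0) >= self.wait:
                self.last_used[proxy] = now
                return proxy
        return None
```

With 25 proxies and a 100-second cooldown this permits at most one query every ~4 seconds overall, so if proxies still get banned at that pace, the wait setting itself is probably not the bottleneck.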

Comments

  • I'm using 10 private proxies at the moment with 100 threads, but I only have 3 projects running right now, and only one of those tiers is all-out, anything-goes. I am not running ScrapeBox on the same proxies.

    That said, I am planning to add 30 shared proxies to that soon.
  • grax1 Professional SEO, UK | White Label SEO Provider
    I used to use 10 private proxies from proxy-hub.com and changed to 30 semi-private from buyproxies.org. In my opinion it's better to have more semi-private than fewer private, but that's only my point of view. Try using more search engines - I always use 100+ when scraping with SER.
  • I usually just select all English search engines.
  • I agree with @grax1 about proxies: lots of semi-private proxies work better than a few private ones.
  • edited November 2013
    @tsaimllc You're not doing anything wrong, IMO. Google and Yahoo are just good at banning proxies (they are by far the most popular search engines, after all)!

    Try filtering out Google and Yahoo - try your luck with some of the many other search engines.

    I think you should be fine with 25 proxies; however, I'd personally recommend trying a different proxy provider, as I've had trouble with Squid Proxies in the past.



  • tsaimllc I got 20 private and 20 semi-private proxies and got banned pretty fast as well. However, now I am using those proxies for SB and hope to gather a decent list much faster.
  • Brandon Reputation Management Pro
    I have 200 private proxies from buyproxies.org.
  • gooner SERLists.com
    edited November 2013
    @rayban - In your position I would use SB for scraping. SER burns proxies up too fast, and unless Sven makes it possible to specify how many threads to use for scraping only (with a separate amount for posting), you will always need a silly number of proxies. For my setup I would need at least 200, maybe more.

    But instead I've got 100 private proxies split over 3 VPSes, and I scrape only with ScrapeBox; that works well.
  • @brandon So you pay $300 just on proxies?
  • gooner SERLists.com
    I doubt that; you can get good deals on bulk buys.
  • I just saw their site; it's $150 for 100.
  • gooner SERLists.com
    You can get better deals; I pay $120 for 100.
  • edited November 2013
    300 dedicated and 400 semi-dedicated for posting - so far so good.

    And our own engine for scraping (1000+ anon. public proxies per day).
  • gooner SERLists.com
    @mmtj - Wow, you must be building an incredible number of links.
  • edited November 2013
    Yes, we have a big volume - roughly 80,000-120,000 contextual links per day, though we distribute that over several servers.
  • Brandon Reputation Management Pro
    Sorry, I misspoke. I am using semi-dedicated proxies from buyproxies.org, 200 total; I think my monthly bill is about $140.
  • I just purchased 50 semi-private proxies from buyproxies. I will test them, and am debating which is better:
    50 semi-private and 25 private = about $73
    100 semi-private = $75
  • Do semi-private proxies die faster, or is it just me? I have already had 11 or so die and they aren't even a week old. Or is it just buyproxies? I didn't have this problem with private Squid proxies.
  • gooner SERLists.com
    You can almost guarantee other people are using them for scraping the same as you, so yes, they will die quicker than private ones.
  • edited November 2013
    I have 50 private proxies from buyproxies.org and everything works perfectly -
    however, ALL scraping is strictly done with SB:
    45-60 seconds waiting time
    4 connections for scraping
    33-44 threads for live checks in SB, or 88 threads for (re-)verifications in SER (everything else stopped during re-verifications at 88 threads)

    Earlier I had 30 semi-private proxies PLUS 50-100+ public proxies, and they ALL died too often because of massive hammering by other co-users; at that time I also mostly used SER for scraping.
  • gooner SERLists.com
    edited November 2013
    Yeah, @hans51 is right.

    Using SB for scraping makes a HUGE difference. The reason is that in SB you can set how many threads to use for scraping, but in SER you can only set the total threads. This means that if you set high threads your proxies will likely die much quicker, and if you set low threads you won't get as many links as you might want.
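The separate caps described here - a fixed number of connections for scraping, independent of posting - can be sketched with two semaphores. A hypothetical illustration only (the `scrape()`/`post()` bodies are placeholders, not real SER or SB calls):

```python
import threading

# Scraping and posting capped independently within one process, so heavy
# posting traffic can't push scraping concurrency up and burn the proxies.
SCRAPE_SLOTS = threading.Semaphore(4)    # max simultaneous scrape connections
POST_SLOTS = threading.Semaphore(50)     # max simultaneous posting connections

def scrape(query):
    return ("scraped", query)            # placeholder for a search-engine query

def post(target):
    return ("posted", target)            # placeholder for a link submission

def run_scrape_task(query):
    with SCRAPE_SLOTS:                   # blocks while 4 scrapes are in flight
        return scrape(query)

def run_post_task(target):
    with POST_SLOTS:
        return post(target)
```

Worker threads would call `run_scrape_task` or `run_post_task`; each pool throttles only its own kind of work, which is the behavior SB offers and SER (per this thread) did not.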
  • He needs to add a feature to set how many threads per task (I'm sure this has been asked).
    Well, as long as buyproxies keeps replacing them and doesn't limit how many changes per month. With Squid Proxies I think I can only change them once per month (or maybe that only applies to random changes, not dead proxies).
  • So quick question on all this.

    I scraped a huge list yesterday from SB. 

    But now I have to wait forever to import it into SER?

    Is there a faster way, or does SER have to check every domain to see whether it matches a platform it supports?

    I am a bit confused on how this SER thing is working internally.

    And as for my SB procedure: I just pulled the footprints from SER and then added my keywords. I added only 3 keywords and then scraped my list one footprint category at a time, i.e. articles, blogs, etc.

    Should I be doing something different? Should I be alive-checking all my links even if I scraped them today? Is there a better way to reduce my list before importing it into SER?
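The footprint-times-keyword procedure described above can be sketched as follows. The footprints and keywords here are illustrative examples, not actual values pulled from SER:

```python
from itertools import product

# Cross each platform category's footprints with a keyword list to build
# the ScrapeBox query list, one category at a time (articles, blogs, ...).
footprints = {
    "articles": ['"powered by article script"', '"submit an article"'],
    "blogs": ['"leave a comment" "powered by wordpress"'],
}
keywords = ["fishing", "golf", "travel"]

def build_queries(category):
    """Return one search query per (footprint, keyword) pair in a category."""
    return [f"{fp} {kw}" for fp, kw in product(footprints[category], keywords)]
```

With only 3 keywords the query count stays small (footprints x 3 per category); adding keywords multiplies the number of scrape queries, which is why the thread keeps coming back to proxy cooldowns.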
  • gooner SERLists.com
    You can just import them directly into your projects and try to post like that.

    Or you can set up a dummy project, import the scraped list into that, and on your real projects choose "post from global list - verified", so your real projects will build links quickly from the dummy project's verified links.

    Best to test all options and see what works well for you; most experienced users have developed their own way of doing things, each a little different, I imagine.
  • edited December 2013
    I'd like to ask everyone on this forum for their opinion.
    I have 10 private proxies from cheapprivateproxies.org and I run GSA SER for 3 projects. What are good settings for:
    1. Threads to use?
    2. HTML timeout?
    3. Custom wait time between search engine queries?
    I run GSA on a VPS with these specs:
    Memory: 1 GB
    Disk Space: 60 GB
    OS: Windows
    Bandwidth: Unlimited
    24/7/365 Support
    99.9% Uptime Guarantee
    (screenshot attached)
    Please give me your best advice.
  • edited December 2013
    @amosnainggolan What are your VPS CPU specs?

    I can run up to 250 threads on a 4-core @ 2.0 GHz VPS.

  • I switched from buyproxies.org to cheapprivateproxies this month and I still regret it.
    The proxies get banned quickly, and they do not even replace them like the guys at buyproxies do.

    Buyproxies beats them in quality, replacement and everything else except price.
    But quality has its price, and buyproxies provides quality proxies. Try them yourself next month and you will see.
  • @aulia: my VPS specs are:
    Memory: 1 GB
    Disk Space: 60 GB
    OS: Windows
    Bandwidth: Unlimited
    24/7/365 Support
    99.9% Uptime Guarantee

    I use public proxies, with threads set to 70 and HTML timeout set to 50.
    What is your opinion?
  • capricious: I think the best private proxies are at buyproxies.org, because buyproxies.org supports all clients throughout the week, and if we have a problem with our proxies we can submit a ticket to get a new list of private proxies.
    I ordered 10 proxies at cheapprivateproxies.org; after checking them all, I was surprised to see only 2 proxies working and the rest all dead.