Proxy checking system needs an update

I can't go over 300 threads - the software becomes unresponsive or crashes.
This could be solved with an option that disables GUI updates - I just need the number counters, such as success : 32,232 - failed : 53,321.

Also, I don't want web proxies to be checked automatically.
Please add an option for this too.

I have a 680k proxy list, so more threads is better for me :)
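The counter-only idea above can be sketched roughly as follows. This is a minimal Python illustration, not GSA SER's actual code: worker threads update shared counters, and only a one-line summary would ever be rendered, instead of redrawing a GUI row per proxy.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class ProxyCheckStats:
    """Thread-safe success/failure counters; updating these is far cheaper
    than a per-proxy GUI update when checking a 680k list."""

    def __init__(self):
        self._lock = threading.Lock()
        self.success = 0
        self.failed = 0

    def record(self, ok):
        with self._lock:
            if ok:
                self.success += 1
            else:
                self.failed += 1

    def summary(self):
        with self._lock:
            return "success : {:,} - failed : {:,}".format(self.success, self.failed)

def check_proxy(proxy):
    # Placeholder check: a real checker would make a request through the
    # proxy with a short timeout. Here, addresses ending in :8080 "work".
    return proxy.endswith(":8080")

def run_checks(proxies, threads=300):
    stats = ProxyCheckStats()
    with ThreadPoolExecutor(max_workers=threads) as pool:
        for proxy in proxies:
            pool.submit(lambda p: stats.record(check_proxy(p)), proxy)
    # Leaving the with-block waits for every worker to finish.
    return stats
```

For example, `run_checks(["1.2.3.4:8080", "5.6.7.8:3128"]).summary()` returns `"success : 1 - failed : 1"` under the placeholder check.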

Comments

  • OzzOzz
    edited May 2013
Do yourself a favor and buy some (semi-)dedicated proxies. I'm sure you won't have that many headaches with them.

10 semi-dedicated > 680k public proxies (which also need to be tested, and that takes a lot of time; 675k of them will be dead anyway).
  • @Ozz, I don't agree. I use 30 dedicated proxies for submissions, but use public proxies for scraping search engines.

Why? Because I'm getting more targets and a higher LPM with public proxies.

@Sven, can you add something like: "Use only SOCKS proxies for search engines"?

I found that it's even better to scrape with public SOCKS proxies.
  • OzzOzz
    edited May 2013
Really?? I think you're the exception to the rule, then.

I'm curious which SEs you're using. My belief is that they won't work that well for Google, as they tend to get banned within <2 hours.
Do you use private lists of public proxies, or do you just scrape them from public sources with SER?
  • Scrape from public sources.

    I use 8 SEs (the largest ones).

    My private proxies get temporarily banned in 3-5 hours if I use them.

    I'm running 300 threads.
Thanks, I'm going to do some testing myself.
I also use a 1-second delay for search engines and a 1-second proxy timeout.
  • I'm also thinking about proxy groups.

    For example "Proxies for submissions" and "Proxies for search engines".

I can obtain thousands of private proxies, but the owner restricts any posting; he only allows SE scraping.

    So it's better to have separate groups for posting and scraping.
I purchased 4 private proxies and all 4 were permanently banned from Google -
not even a captcha.
I also tried them with GSA SER, and within several minutes they failed.

With public proxies I'm able to do a lot better :D
Why didn't you just go back to the provider and get new ones?
  • OzzOzz
    edited May 2013
yavuz, what kind of "query time" were you using with proxies for SEs?
  • Everyone should be using private for posting and public for scraping.
"Everyone should be using private for posting and public for scraping."

And the reason for that is???
@Ozz, actually I wasn't scraping any SEs. I have my own software for scraping, and I import URLs from there.

They replaced them with new ones, but the 4 private proxies only got about 12 Mbit of bandwidth,
so I told them to cancel and refund.
It's good practice for all spammers to maintain the overall health of their private proxies. Plus, public proxies work equally well for scraping - YMMV.
  • Btw, if you need proxies for scraping you can try a Russian service: http://anonym.to/?http://buy.fineproxy.org/eng/

I don't know how many people use them, but I tried the 1000 EU package with Scrapebox and it worked like a charm. Nice speed and a lot of proxies to use.

That's why it would be nice to add separate posting/scraping proxy groups to GSA SER.
  • edited May 2013
My proxies also aren't scraping anything -
30 semi-dedicated proxies from buyproxies.org.
So I tested public proxies for >12 hours, and I won't recommend doing that at all. They worked better than expected, but they fail hard for Google and Yahoo, IMO.
That test was far from perfect, though, as I used global site lists as well; as long as you're satisfied with the results, everything is fine :)
Nonetheless, I believe most people are better off using private proxies for scraping, provided the query time is adjusted well.

ron, for example, just uses SE scraping to get new targets every day without any global lists, and he's doing fine. He wouldn't get any submissions if his proxies were banned from searching.

I also noticed higher CPU usage, but that could be caused by the lower query time of 1 second.
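The proxy-group idea raised in this thread could look roughly like the sketch below. This is a hypothetical Python model, not GSA SER's actual configuration: the group names, proxy addresses, and the next_proxy helper are all illustrative, and they only show the separation of posting proxies from scraping proxies.

```python
import itertools

# Hypothetical groups: private proxies for posting, public SOCKS for scraping.
PROXY_GROUPS = {
    "submissions": ["10.0.0.1:3128", "10.0.0.2:3128"],
    "search_engines": ["203.0.113.5:1080", "203.0.113.6:1080"],
}

# One round-robin cursor per group, so load spreads evenly within a group.
_cursors = {name: itertools.cycle(plist) for name, plist in PROXY_GROUPS.items()}

def next_proxy(task):
    """Pick the next proxy from the group matching the task type, so
    search-engine bans never touch the posting proxies."""
    group = "search_engines" if task == "scrape" else "submissions"
    return next(_cursors[group])
```

With this separation, a scraping ban on a public proxy leaves the private submission proxies untouched, which is the point of having the two groups.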