  • NetMajom https://buywebgraphic.com | skype: nszmmp
    Can you add a filter option in the next release to automatically delete a proxy if its speed is under ...?

    Right now I only found manual removal under the "Remove" button.

    Where can I set up an automatic proxy test (with custom test sites) every X minutes? I only found the manual test under the "Test" button.

  • Sven www.GSA-Online.de

    In the options you have to add a custom test. Once you check it, all further tests are done against it.

    An option to filter by a certain speed is not really required, as you can just lower the timeout values and skip all proxies that are slower.

  • @sven you forgot to reply to my question earlier. Here it goes:

    1. I have 3 copies of GSA SER running on 3 different machines. Do I need to buy one Proxy Scraper license per GSA SER copy?

    I'm not so sure how this works. I'm using a verified list and I need good proxies primarily to pass reCAPTCHA. Can the software do this?

    Does the software scrape and export the proxies, and I import them manually into GSA SER?

    Or do I install a copy next to GSA SER and it automatically keeps adding new proxies to GSA SER without me doing anything?

    My need is a continuous supply of fresh proxies for GSA SER to solve reCAPTCHAs (via Captcha Breaker).
  • You can export them to an FTP server and then import them in GSA from there.
  • Thanks @andrzejek. Is this process automated once I have set it up? Like, will GSA automatically get the new list every X seconds (or any other time period)?
  • Yes
  • @sven is there an option to test the proxies every X minutes?
  • @andrzejek I just took a look at GSA SER and I don't see any FTP info to enter; do you happen to know where it is? :P

    Also, are you having good results using the proxies to solve reCAPTCHAs? :)

    Appreciate your help!
  • Sven www.GSA-Online.de

    @Dikkill yes, there is an interval you can set to test proxies every X minutes.

    @hardcorenuker

    There is no FTP server in SER, of course. You have to scrape proxies on one PC (it's a one-PC license) and provide the found proxies to all SER instances if you do not plan to buy Proxy Scraper for all your running SER licenses. The proxies can be provided by automatic export to a webserver, by email, FTP server or whatever else.

  • I said "You can export them to ftp server and then import in gsa from there"

    Export to ftp in proxy scrapper and then import in gsa automatically each X minutes from your source
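
    For illustration only, the FTP leg of that setup looks roughly like the sketch below. The host, credentials and file name are hypothetical, and SER's own file/URL proxy source does the equivalent of this on its own schedule once configured:

    # Minimal sketch (hypothetical FTP host, credentials and file name):
    # download the plain-text host:port list that Proxy Scraper exported.
    from ftplib import FTP

    def fetch_proxy_list(host, user, password, remote_file):
        lines = []
        with FTP(host) as ftp:
            ftp.login(user=user, passwd=password)
            # retrlines calls the callback once per text line of the file
            ftp.retrlines("RETR " + remote_file, lines.append)
        # keep only non-empty entries that look like host:port
        return [line.strip() for line in lines if ":" in line]

    proxies = fetch_proxy_list("ftp.example.com", "user", "secret", "proxies.txt")
    print("fetched", len(proxies), "proxies")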
  • Hi guys, I just bought Proxy Scraper and these are my settings:

    "Activate internal proxy server" is checked on.

    Use proxies with tag: I only selected "google" because I want proxies that can solve reCAPTCHAs.

    From my understanding, the internal proxy server is meant for use when GSA SER (or any other software) is running on the same physical machine as Proxy Scraper?

    Once I had this set, I went to GSA SER's proxy settings and set these:

    I checked "automatically search for new proxies every X mins" and entered the IP 127.0.0.1, which is where GSA SER accesses the new proxies every X minutes?

    I also checked "Disable proxies when detected to be down" for both public and private.

    With the above settings I am assuming this GSA SER copy will automatically get the proxies from Proxy Scraper every X minutes?

  • Once the above was done, I took a look at Captcha Breaker and, to be honest, I'm getting lots of hard-to-solve reCAPTCHAs from the proxies. I have already set "google" proxies as the filter. Do I need such a filter if I am not scraping?
  • Sven www.GSA-Online.de
    If you use Proxy Scraper's internal proxy inside SER as the only proxy solution, you should not use the options to disable proxies. This would cause problems, as 127.0.0.1 actually represents all proxies in Proxy Scraper's list; it picks one randomly for each request sent to it.
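
    A quick way to see that behaviour from any HTTP client is a sketch like the one below. It assumes the internal proxy server is still on its default 127.0.0.1:8080 and uses httpbin.org/ip purely as an example echo service; each request should report a different exit IP because Proxy Scraper relays it through a randomly picked proxy:

    # Assumes Proxy Scraper's internal proxy server listens on 127.0.0.1:8080
    # (the default). httpbin.org/ip just echoes the IP a request arrived from.
    import requests

    PROXIES = {
        "http": "http://127.0.0.1:8080",
        "https": "http://127.0.0.1:8080",
    }

    for _ in range(3):
        try:
            r = requests.get("http://httpbin.org/ip", proxies=PROXIES, timeout=15)
            print(r.json()["origin"])
        except requests.RequestException as exc:
            # public proxies fail often; a dead pick just means retrying
            print("request failed:", exc)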
  • @sven it seems the public proxies I find die really fast. They are reported as OK in the Scraper, but when I import them into GSA SER, almost 90% are dead according to the proxy tester. Just curious: how frequently should GSA SER pull the latest proxies from the FTP (supplied by Proxy Scraper)?

    Are we talking about a 5-10 minute interval, or is 1-2 hours OK?
  • NetMajom https://buywebgraphic.com | skype: nszmmp
    edited May 2015
    @hardcorenuker I experienced this problem too in all GSA products when I wanted to use these scraped IP addresses. When I import these IPs into ScrapeBox, 99.99% are always dead. I use only freshly scraped proxies, not old ones, anonymous and fast ones only.

    You must understand that public proxies are not always reliable, and this scraper is a very new GSA product, but Sven updates the software very often, and we can trust that this scraper will become a reliable scraping tool in the future.
  • Sven www.GSA-Online.de
    @hardcorenuker dead in what test? Testing against Google will indeed show most found proxies as dead.
  • edited May 2015
    Just bought GSA Proxy Scraper. I'm new to scraping proxies and would appreciate the manual; when is it going to be available?
    Meanwhile, can anyone offer advice on setting it up to run automatically with GSA SER?
  • @sven when I tested again with Google search in GSA SER, most of them came up red, but just minutes ago they were reported as OK in Proxy Scraper. Am I doing anything wrong?
  • Sven www.GSA-Online.de
    @hardcorenuker the tests are almost identical in Search Engine Ranker and Proxy Scraper. What happens if you test them again in the Scraper once they are red in SER?
  • Sven www.GSA-Online.de
    @GreyGable I'm working hard on writing something up for the Scraper. But it's not really a complicated program; all default options should be fine for most users.
  • Will there be integration for GSA SER and Indexer and all the other GSA tools @Sven?
  • Sven www.GSA-Online.de

    Well, you can already use it in any other tool that offers proxy input:

    a) Enable the internal proxy server in Proxy Scraper and use it in the software as 127.0.0.1:8080 (unless you changed it). This will use a random proxy from Proxy Scraper's list for each request.

    b) Enable auto-export of proxies to a text file. In the GSA tools you can then set things up to import that file at intervals (see the sketch below).
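
    For option b), a rough sketch of what the consuming side does is shown below: re-read the exported text file every so often. The path and interval are placeholders, and GSA tools with a file-based proxy provider do this re-reading themselves; the sketch only illustrates the flow:

    # Placeholder path and interval; GSA tools that accept a file-based proxy
    # provider re-read the exported list themselves, this only shows the idea.
    import time
    from pathlib import Path

    EXPORT_FILE = Path(r"C:\proxies\proxies.txt")  # wherever auto-export writes to
    REFRESH_SECONDS = 600                          # e.g. every 10 minutes

    def load_proxies(path):
        if not path.exists():
            return []
        return [line.strip() for line in path.read_text().splitlines() if line.strip()]

    while True:  # Ctrl+C to stop
        proxies = load_proxies(EXPORT_FILE)
        print("loaded", len(proxies), "proxies from", EXPORT_FILE)
        time.sleep(REFRESH_SECONDS)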

  • any API?
  • Sven www.GSA-Online.de
    An API for what? What do you have in mind?
  • To control it with commands, HTTP GET/POST like in CB, or JSON. Something that makes it easier to control from other programs as well.
  • Sven www.GSA-Online.de
    For other tools you can simply use the internal webserver option; it delivers a new IP on each request. Isn't that enough?
  • edited May 2015
    Oh, I checked out the auto export and it's looking good too. Can I also send proxies to Proxy Scraper to check them and get back the result? And can it auto-upload an HTML file every 60 minutes that always gets refreshed with rechecked IPs? And is there any discount if I already own more of your products? :P
  • I'd like to know if there is a way to exclude tags when exporting. For example, I want to export all kinds of proxies except those with the google tag; I want to use those exclusively for other projects. Is this currently possible?
  • s4nt0s Houston, Texas
    edited May 2015
    @spiritfly - If you go to Settings > Export options and set up an export job, you will be prompted with this window on the last step, which lets you choose the specific proxies you want: http://i.imgur.com/WGV4R0t.png

    You can set up multiple export jobs with different options, so for one project you can export all proxies, for another just google, etc.
  • edited May 2015
    Sorry to be so thick, but I can't find out how to import proxies from Proxy Scraper into SER.
    I've tried Configure proxies > Add Proxy > Import from file, but I get "No proxies found" when opening my file from Proxy Scraper. Is this because SER asks for a file with host:port:login:password?
    How do I get SER to import the proxy.txt file from PS every X minutes?
  • Sven www.GSA-Online.de
    @hypofely yes, you can do all that. No coupon from us, sorry.
  • Sven www.GSA-Online.de
    @spiritfly you are right, the export option will export proxies with at least one of the chosen tags. If you want to exclude proxies with a certain tag, I have to add another option. This will be in the next update.
  • Sven www.GSA-Online.de
    @GreyGable make sure the place you save to/read from is actually accessible by both programs and that the format is plain text, not CSV.
  • NetMajom https://buywebgraphic.com | skype: nszmmp
    edited May 2015
    @sven

    I saw your update and I have some questions (highlighted):

    1.10 - new: improved proxy information (ok)
         - new: ability to exclude proxies with a domain for exports
                GScraper doesn't seem to like such proxies and refuses the import
                of files with non-IP-based proxies (I don't see this function; where can I find it?)
         - new: export filter options to exclude proxies with certain tags (can I filter / delete the hostname-based IPs? Where can I find that function?)
         - new: improved saving of proxies (ok)

    Thx

  • Sven www.GSA-Online.de
    edited May 2015

    >ability to exclude proxies with a domain for exports

    That's the second checkbox on the filter dialog.

    >can I filter / delete the hostname-based IPs? Where can I find that function?

    I don't understand what you mean by that. On the same export filter dialog you can set what to export (e.g. google only), but now you can also define to export everything BUT e.g. google.

  • NetMajom https://buywebgraphic.com | skype: nszmmp
    I don't see it: >ability to exclude proxies with a domain for exports

    [screenshot]

    The second question: there are some types of proxies that GScraper and ScrapeBox don't accept, like these (examples):

    rrcs-24-173-143-138.se.biz.rr.com:8008
    static-183-16-53-5.metrosg.ru:8080
    cpe-174-100-99-52.neo.res.rr.com:47197
    c-98-203-116-108.hsd1.fl.comcast.net:8089

    Those are the hostname-based proxies. Can I remove them too in the new update?
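
    For anyone who wants to strip such hostname-based entries from an already exported list themselves, a rough illustration is below. This is not how Proxy Scraper filters internally, just one way to keep only numeric IPv4 host:port entries:

    # Illustration only: keep numeric IPv4 host:port entries and drop
    # hostname-based proxies like the examples listed above.
    import re

    IPV4_PROXY = re.compile(r"^\d{1,3}(?:\.\d{1,3}){3}:\d{1,5}$")

    def keep_ip_proxies(lines):
        return [line.strip() for line in lines if IPV4_PROXY.match(line.strip())]

    sample = [
        "rrcs-24-173-143-138.se.biz.rr.com:8008",  # hostname-based -> dropped
        "183.16.53.5:8080",                        # numeric -> kept
    ]
    print(keep_ip_proxies(sample))  # ['183.16.53.5:8080']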
  • s4nt0s Houston, Texas
    @NetMajom - go to the automatic export tab and set up an export job. It's on the "define filter parameters" window:

    [screenshot]
  • NetMajom https://buywebgraphic.com | skype: nszmmp
    Thank you...
  • @s4nt0s I know about that feature and it's extremely helpful and well thought out. But I'm afraid there isn't a way to exclude tags, for example to export all proxies EXCEPT google-tagged ones. Since there is no way to tag all proxies except the google-passed ones, there is no way to export them without the google-passed proxies.

    And this is very important, because I want to use google-passed proxies only for Google scraping and the rest of them for Bing or whatever. But I definitely don't want to burn google-passed proxies on Bing and other things.
  • s4nt0s Houston, Texas
    @spiritfly - Ya, I see what you mean now. Good thing Sven understood what you meant and added "exclude tags" to today's update.
  • edited May 2015
    @s4nt0s where is that "define filter parameters" window located? I followed the steps you mentioned, but I end up with far fewer options than you have.
    Running v1.11.

    Edit: Never mind, just found it :).
  • s4nt0s Houston, Texas
    edited May 2015
    Dikkill - haha, ya, you don't see it until that last step. Glad you found it :)
  • edited May 2015
    @sven it would be cool to include a "loader" like FreeCap or WideCap, where you can assign rules to specific programs (proxy chaining etc., no SEO purpose). This would open a new sales road into non-SEO markets, and it would open up the GSA proxy-server function to programs that don't have proxy support. With a loader you can tunnel all programs through a local port or a set of proxies even if they don't support proxies.
  • Sven www.GSA-Online.de
    @TOPtActics yeah, this means I have to hook all the network APIs when loading. I don't think this is easy stuff, but I can have a look at it.
  • edited May 2015
    I keep getting an error on the trial.

    It says it cannot save the proxies to the file destination (which is a Dropbox folder). This is for the automatic export.

    However, when I manually export the list via the button in the main interface, it works fine.

    I'm on 1.11.
  • Sven www.GSA-Online.de
    Can you paste the exact error message?
  • Yeah, I'm looking through it.

    Oddly, it hasn't given me the warning again though.
  • edited May 2015
    Figured it out.

    When I hit the Add button and add a set of proxies from the clipboard, it goes through and scans them, and when finished a popup comes up saying: "Unable to save proxies to file!"

    But when the system is running normally (meaning it is automatically pulling proxies itself without me hitting the Add button), it saves them to the file just fine.
  • Also, in

    Settings -> Provider

    you can enter a provider manually and it asks for a URL or file.

    However, setting a local file (a file on my desktop loaded with proxies I want to test) in this field does not work.

    I enter C:\Users\Administrator\Desktop\New Text Document.txt and it accepts it, but it never pulls from it during a run.
  • Sven www.GSA-Online.de

    "Unable to save proxies" refers to the global list of proxies. They are saved in the appdata folder. If that's not working, then something in your OS is restricting access to it.

    Same for the file provider: are you sure it has access to the file? It is working for me. Maybe simply start the program as admin to see if everything works then.


  • Sven www.GSA-Online.de
    Skip my previous message about the save-proxy issue... I just found the problem and will fix it ASAP.
  • Hello Sven,
    can I import proxy sources? In the trial version I can't find anything, only one URL.
    Or is there no need to import sources; will GSA update the sources?
    Thanks
    Peter
  • I bought the tool and entered the registration information. When I enter it, the program says "Please Restart".
    After restarting, the tool is not registered!
    Please help!
  • Sven www.GSA-Online.de

    @pbsolution for license issues please email GSA directly. I'm sure it's some copy/paste error.

    Updating sources is, however, not really required, as I am doing this all the time and adding many, many of them.

  • OK, found the error: my first name contains a German umlaut. I used the alternative name instead.
    Thanks
  • wcwong8 Malaysia
    Hello

    What are the best settings to use with GSA Ranker?

    Thanks
  • wcwong8 Malaysia
    I really do not know how to use Proxy Scraper, please help.
  • Sven www.GSA-Online.de

    @wcwong8

    Option 1) Enable the internal proxy in Proxy Scraper and use it in SER (127.0.0.1:8080).

    Option 2) Create an export in Proxy Scraper and add a proxy provider in SER to read the proxies from a file.

  • Hello Sven,
    I'm a little bit disappointed with this tool. It doesn't really find more proxies than the proxy scraper integrated into GSA Ranker. Or am I doing something wrong?
    Peter