Proxy issue and some more questions

edited December 2013 in GSA Search Engine Ranker
I bought GSA SER not long ago and have gathered my first issues/questions in one topic:

1. I have public proxies saved periodically to a file on my hard drive. I want GSA to pick up only those proxies periodically - no other sources, no testing (I do that myself). But I can see GSA getting proxies from those URLs all the time instead of from the file. How can I change this?

2. Are the question captchas I answer myself saved somewhere for GSA's future use? I saw that I can save unanswered ones, but where do I move them after I solve them?

3. Can I post only to my own list of URLs? I mean, should I always delete the search engines in every project?

4. Is there a way to easily add new URLs to GSA for all engines, to be used in all projects? I harvest them on a daily basis and want GSA to post only to the new ones. Or should I rather clean them in ScrapeBox (removing dupes and URLs already in my global lists) and add the new batch to a random project?

5. Is it possible to post to a URL not identified by GSA? And is GSA good at identifying engines? Many URLs that worked in my previous posting software are marked as not identified. Or could the issue be that I pass the full URL while the script is located in the root or a subfolder - can GSA identify those?

6. Related to the above: does GSA follow redirects? I get an error posting to a phpBB URL because there is an "accept" link - from /profile.php?mode=register it should move to /profile.php?mode=register&agreed=true, but it doesn't, and that causes an error while posting.

7. I created a few projects, but in the global folder I only see files named sitelist_scripttype and sitelist_Unknown_scripttype. There are no verified/submitted/failed/identified lists - do I have to set that up manually somehow?


  • edited December 2013
    1. Don't use public proxies, they suck. Use private proxies; 10 would be enough for starters.

    2. I don't know why you would want to answer each captcha yourself. Consider investing in DC if you want to use SER at an optimal level.

    3. GSA SER can scrape on its own using the keyword list you give it.

    4. Yes, global lists. You will have to remove dups manually (look up SER TOOL in search).

    5. Not sure, probably not.

    6. Not sure what you mean.

    7. You need to configure SER to save submitted/verified/failed. You can save them anywhere you want.
  • edited December 2013
    1. I will never be rich enough to buy 100-500 fully dedicated proxies just for scraping Google, so I have to enable public ones, as I do in all the SEO software I use. I pointed it at the file and ticked "automatically search for new proxies" (does that mean getting them from the file?), but it's not working.

    2. I use DBC; I meant the question captchas.

    3. I was not asking about that.

    5/6. It's simple: there is a forum where you have to accept the terms, and the registration address is not the typical one - can GSA cope with that? By the way, what's with the formatting on this forum - how do I add new lines here?
  • SvenSven
    Accepted Answer

    1. You can set it up to import proxies from a file when adding a new proxy provider. Then set things up to delete all proxies before adding new ones and to skip testing them.
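    For the file import, proxy lists are conventionally stored one proxy per line. A minimal example (addresses made up; proxies that require authentication are often stored as host:port:user:pass instead):

    ```text
    123.45.67.89:8080
    98.76.54.32:3128
    ```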

    2. Questions and their answers are saved once an answer is detected as a correct solution (successful submission), and they are reused for further submissions if the same question is asked again.

    3. Yes, disable all search engines and the options below that box. Then import your URL list and you are done.

    4. Import to projects (you can select more than one and right click->import) or import to site lists.
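    For the daily-harvest workflow from question 4, the dedupe step (drop duplicates and anything already in your global list, keep only genuinely new URLs) can be done outside SER with a short script. This is a hypothetical helper, not part of SER; file handling and names are up to you:

    ```python
    # Sketch: keep only the URLs from today's harvest that are not already
    # known, so SER is fed nothing but fresh targets.

    def merge_new_urls(harvested, global_list):
        """Return URLs from `harvested` that are not in `global_list`,
        with duplicates inside the harvest removed (order preserved)."""
        seen = {u.strip().lower() for u in global_list}
        fresh = []
        for url in harvested:
            u = url.strip()
            if u and u.lower() not in seen:
                seen.add(u.lower())
                fresh.append(u)
        return fresh

    if __name__ == "__main__":
        harvested = ["http://a.com/forum", "http://b.com/blog",
                     "http://a.com/forum", "http://c.com/wiki"]
        global_list = ["http://b.com/blog"]
        # only a.com/forum and c.com/wiki are new
        print(merge_new_urls(harvested, global_list))
    ```

    The surviving URLs can then be imported into one or more projects as described above.
    
    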

    5. If you think a URL is not identified, then you can import it as URL#engine name. But usually it should identify everything correctly.
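    To illustrate the URL#engine format, an import file might look like this (URLs are made up; the part after # must match one of SER's engine names, e.g. the phpBB engine from the question above):

    ```text
    http://example.com/forum/profile.php?mode=register#phpBB
    http://example.org/community/profile.php?mode=register#phpBB
    ```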

    6. It follows redirects, but not if it's a simple link on the page rather than a redirect sent in the HTTP header.
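    The distinction can be sketched as follows. This is an illustration of header-based redirect handling in general, not SER's actual code: a 3xx status with a Location header is a redirect a client can follow automatically, while an "I agree" anchor in the HTML body (like phpBB's link to /profile.php?mode=register&agreed=true) is just markup and is not followed:

    ```python
    from urllib.parse import urljoin

    def next_hop(url, status, headers):
        """Return the target of an HTTP header redirect, or None.

        Only 3xx responses carrying a Location header count as redirects.
        A link inside the response body is not a redirect, so a client
        that only follows header redirects will never click it.
        """
        if 300 <= status < 400 and "Location" in headers:
            # Location may be relative; resolve it against the request URL.
            return urljoin(url, headers["Location"])
        return None
    ```

    With a real header redirect, next_hop("http://site/profile.php?mode=register", 302, {"Location": "/profile.php?mode=register&agreed=true"}) resolves to the agreed-true URL; with a 200 response whose body merely contains the accept link, it returns None, which matches the failure described in question 6.
    
    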

    7. You have to enable the different site list types and folders.
