[Help] How to get Target URLs with GSA Only

Should I keep trying to get Target URLs with GSA?

I have been experimenting with using GSA SER and its Search Engine functionality to get Target URLs. Prior to this testing, all I ever did was scrape for targets with GScraper or Scrapebox, and split those URLs between my projects.

Does anyone feel that it is even worthwhile to use GSA SER Search Engine functionality to get Target URLs anymore?
I ask because I am running into several issues.

The first problem: if I use my private proxies only for submitting and verifying, I am fine. Once I add searching, the proxies get burned out in a flash. Do I just have bad proxies? To work around this, I started using public proxies pulled from within GSA SER just for search, to acquire target URLs. It works, but it is slow, and the overhead (additional threads for continual proxy loading and testing) is not desirable.

Should I just give up on GSA search? Here are my settings:
Hardware
CPU = 13.6 GHz quad-core
RAM = 4 GB dedicated
Disk = 30 GB
Bandwidth = unmetered, 1 Gbps

Data > Keywords = Large list of English keywords
Collect keywords from target sites = yes
Use collected keywords to find new target sites = yes
Put keyword in quotes when used in search queries = yes

Options > How to get Target URLs
Search Engines to use = Check by Language > English
Always use keyword to find target sites = no
Add stop words to query with a chance of 20% (see the sketch after this list)
Use URLs from global site list if enabled = no
Use URLs linking on same verified URL = yes
Analyse and post to competitor backlinks = yes
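
For illustration, here is a minimal sketch of what these keyword options amount to when a search query is assembled. This is my own approximation, not GSA's actual code: the footprint and stop-word lists are made-up placeholders, and only the keyword quoting and the 20% stop-word chance mirror the settings above.

```python
import random

# Placeholder footprints and stop words -- NOT GSA's real lists.
FOOTPRINTS = ['"powered by wordpress"', "inurl:guestbook"]
STOP_WORDS = ["the", "and", "with", "from"]

def build_query(keyword, quote_keyword=True, stop_word_chance=0.20):
    """Combine one keyword with one footprint into a search query."""
    kw = f'"{keyword}"' if quote_keyword else keyword  # "Put keyword in quotes" = yes
    query = f"{kw} {random.choice(FOOTPRINTS)}"
    if random.random() < stop_word_chance:             # stop word appended ~20% of the time
        query += f" {random.choice(STOP_WORDS)}"
    return query

print(build_query("garden furniture"))
# e.g. "garden furniture" inurl:guestbook with
```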

Settings > Options
Threads = 160
Proxy list > Options > Threads = 70 (Timeout = 5 sec)
Use proxies = yes
Configure > List > 0 Private Proxies
Configure > Options > Search for new proxies every 15 minutes
Configure > Options > Only on less than 20 active proxies
Search Engines = Public
Custom time to wait between search engine queries = 5 seconds (see the arithmetic after this list)
Disable banned proxies = yes
Submission = Private
Skip for identification = no
PR Checking = Public
Verification = Private
Stop projects on no active proxies = no
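
As a quick sanity check on that 5-second wait (the comments below come back to it), here is some back-of-the-envelope arithmetic. It assumes the wait is the minimum interval between queries from one proxy/IP to one search engine, which is how the setting is described further down.

```python
# How hard one proxy/IP hits one search engine at a given wait setting,
# assuming the wait is the interval between queries to the same engine.
for wait_seconds in (5, 60, 120):
    per_hour = 3600 / wait_seconds
    print(f"wait={wait_seconds:>3}s -> {per_hour:4.0f} queries/hour, "
          f"{per_hour * 24:6.0f}/day to the same engine")

# wait=  5s ->  720 queries/hour,  17280/day to the same engine
# wait= 60s ->   60 queries/hour,   1440/day to the same engine
# wait=120s ->   30 queries/hour,    720/day to the same engine
```

At 5 seconds, each proxy fires hundreds of queries per hour at the same engine, which is well into ban territory.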


Comments

  • goonergooner SERLists.com
    edited January 2014
    I think "Custom time to wait between search engine queries = 5 seconds" is the problem. I've seen other people recommend that setting but whenever i've tested that my proxies are dead in a day. Try 120 maybe
  • I just adjusted it from 5 to 60 and changed Search Engines to Private.
  • SvenSven www.GSA-Online.de
    edited January 2014
Yes, 5 seconds is not a good idea. It means the same proxy/IP queries the same search engine every 5 seconds. Of course you get banned quickly for that.
  • gooner Sven I have it running smoothly now, thanks to you guys. What I did was allocate the public proxies acquired by GSA to search only, and set the wait time in the 60-second range. I refresh my public proxy list every 20 minutes and give proxy testing 100 threads so the list gets sorted out as fast as possible (a rough sketch of that kind of test loop is below). My projects lag at that interval, but so what: I get to use GSA search to find targets on autopilot now, and I am happy. I am building up my global site lists this way, using GSA SER search with a set of sub-projects that scrape for traditional T1 and T2 targets in various languages. As a side benefit, these sub-projects build backlinks to the links I have built for a traditional T1 project pointing to a money site that I am experimenting with spamming links to. Win Win Win Win. Cheers!
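
For reference, the proxy-testing overhead gooner describes might look something like this generic sketch. This is not GSA's internals; the test URL, timeout, and proxy-list loader are arbitrary stand-ins.

```python
import concurrent.futures
import urllib.request

TEST_URL = "http://www.example.com/"  # arbitrary test target
TIMEOUT = 5                           # matches the 5-second proxy timeout above

def proxy_alive(proxy):
    """Return True if an HTTP GET through the proxy succeeds in time."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy}))
    try:
        opener.open(TEST_URL, timeout=TIMEOUT)
        return True
    except Exception:
        return False

def refresh(proxies, threads=100):
    """Test every proxy in parallel and keep only the working ones."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=threads) as pool:
        return [p for p, ok in zip(proxies, pool.map(proxy_alive, proxies)) if ok]

# Re-test the whole list every 20 minutes, as described above:
#   import time
#   proxies = load_public_proxy_list()  # hypothetical source
#   while True:
#       proxies = refresh(proxies)
#       time.sleep(20 * 60)
```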