
SUPER LOW Amount of NEW Targets to Post to...

Running at 200 threads on 2 cores and 2 GB of RAM.

- 400k 1-2 word keywords, split into files of 1,000 each, to scrape for targets.
- 35+ projects.
- "avoid posting to domains posted before" enabled
- NO global list used to scrape for new targets
- 2-3LPM
- posting only to Articles, Directories, Wiki, SN, SB, Microblog
- PR2+
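
For reference, the keyword-splitting step above (400k keywords into files of 1,000 each) can be sketched in a few lines of Python. The file naming scheme here is my own assumption, not anything GSA prescribes:

```python
# Split a large keyword list into chunks of 1,000 per file,
# as described in the setup above. File names are hypothetical.
def split_keywords(keywords, chunk_size=1000):
    """Yield (filename, chunk) pairs, chunk_size keywords per file."""
    for i in range(0, len(keywords), chunk_size):
        chunk = keywords[i:i + chunk_size]
        yield ("keywords_%04d.txt" % (i // chunk_size + 1), chunk)
```

Writing each chunk out is then just a loop over the yielded pairs, one file per chunk.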

After 10.5 hours -> ONLY 20 VERIFIED...

Running the same settings on 2 licensed copies of GSA SER, and both copies show similar stats.

When I remove "avoid posting to domains posted before", my GSA SER FLIES at 150-200 LPM with 50k submissions and 20k+ verifies per DAY.  But then it just posts to the same 50-60 domains in the verified list, which is BAD, because post-Penguin 2.0 you need a more diverse set of domains to post to.  Hence I'm trying to scrape new targets, but it's finding only about 40 new targets per day...
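A quick way to quantify that lack of diversity is to count distinct hostnames in an exported verified-URL list. A minimal sketch (hostname-level only; grouping by registered domain would additionally need a public-suffix list):

```python
from urllib.parse import urlparse

def unique_domains(urls):
    """Return the set of hostnames in a list of verified URLs.
    Many verifies spread over few hostnames is exactly the
    post-Penguin diversity problem described above."""
    return {urlparse(u).hostname for u in urls if urlparse(u).hostname}
```

Comparing `len(unique_domains(urls))` against `len(urls)` shows how concentrated the verified list really is.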

Anyone getting the same low amount of new targets to post to?


  • Yeah, I'm having a very hard time finding new sites to post to through searching.  But of course, if I use global lists, it flies like no tomorrow.
  • Same here. One thing I notice is GSA doesn't actually use keywords.  If you watch your log, stop it, right-click a search engine query, copy the URL, and paste it into a browser, it literally never has a keyword attached.  I'm almost thinking we're all posting to the same domains.  Unless I'm missing something, I've literally never seen a keyword, and I've run this check dozens of times to see what's going on. @sven?
  • use "always use keywords" then
  • @Laubster...that's strange to hear!

    I'm going to test what @ozz suggested to "use keywords only" and see if I get more targets!!  :)
  • Yeah, not many keywords are used in searches. But I assume doing what @Ozz suggested would decrease targets, since it'll still search with footprints anyway. I think I tried it, and the results either weren't overwhelming or were the same.
  • I've tried both with keywords and without, and I'm still getting horrible search results.
  • s4nt0ss4nt0s Houston, Texas
    edited July 2013
    It can only pull the results that the search engines feed it. If you want more results, you'll need to modify the .ini files and add more footprints.

    If you know how to create footprints, you can beef up your scripts quite a bit. 
  • edited July 2013
    I know the power results are in the custom footprints, but I don't even know where to start regarding how to find/identify them...

    And even when I feed lists from Scrapebox I'm not getting crazy results...
  • edited July 2013

    In which ini file are footprints located?

    Not a pro at finding custom footprints, but I could at least give it a shot. I got 38.5K submitted today, but I need more.
  • What you need to do is a couple things 

    1. Go through your verified submissions and look for common "footprints" that occur on every site in a platform. For example, if you're posting to Drupal, lots of those will have inurl footprints such as


    And a variation would be inurl:/node/ "user" "password", because some drupal sites have the user login box in a sidebar.

    Just an example - 

    Make sure to do this from actual verified submissions, so you know the sites you're scraping will have a higher chance of a successful submission than with stock footprints.

    2. When you're scraping target sites, you'll invariably end up finding other spammers' backlinks. So install the backlink checker add-on in Scrapebox (it now uses the SEOmoz API to pull backlinks), hire a VA for 2.50-5.00 an hour from Fiverr to create about 20-40 free SEOmoz accounts, throw those into the backlink checker, and pull all the backlinks SEOmoz will allow per URL (1k). Then throw those links into GSA, where you can import and identify platforms.
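
The footprint mining in step 1 can be roughly automated: count URL path segments across your verified URLs and keep the ones shared by most of them as inurl: candidates. A sketch under my own assumptions (the threshold and function names are illustrative, not part of any tool mentioned here):

```python
from collections import Counter
from urllib.parse import urlparse

def common_path_segments(verified_urls, min_share=0.5):
    """Count path segments (e.g. 'node', 'user') across verified
    URLs and keep those appearing in at least min_share of them,
    as candidates for inurl: footprints per step 1 above."""
    counts = Counter()
    for u in verified_urls:
        # A set per URL so repeated segments in one path count once.
        segments = {s for s in urlparse(u).path.split("/") if s}
        counts.update(segments)
    threshold = len(verified_urls) * min_share
    return [seg for seg, c in counts.most_common() if c >= threshold]
```

The surviving segments still need a human sanity check before being turned into search queries, since numeric IDs and generic words will slip through.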

  • s4nt0ss4nt0s Houston, Texas
    @s4nt0s - The footprints are located in every .ini after the line "search term=".

    Open up a project or create a new one and then in the platform selection window double click on a platform to open up and see the script.
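
    To list the footprints without opening each script by hand, one could scan the .ini text for the "search term=" lines mentioned above. A hedged sketch (the sample value in the test is illustrative, not an actual GSA engine file):

```python
# Pull every "search term=" value out of a platform .ini's text.
# Point this at your own engine files; nothing here is GSA-specific
# beyond the "search term=" key named in the post above.
def extract_search_terms(ini_text):
    """Return the value of every line starting with 'search term='."""
    terms = []
    for line in ini_text.splitlines():
        if line.strip().lower().startswith("search term="):
            terms.append(line.split("=", 1)[1].strip())
    return terms
```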

    @yougotmymoney - Awesome idea with the SEOmoz API + backlink checker add-on. ;)