No Targets to Post

Today I'm seeing this in my GSA campaigns.

No targets to post to (maybe blocked by search engines, no engines allowing url extraction chosen, no scheduled posting)

Goal of Tier 1 Project:

Find Article, Web 2.0, and Wiki targets that are PR 1 or higher with 100 or fewer outbound links, found through English search engines and only on .com and other such generic TLDs.

Summary of Project Settings:

  1. Articles, Web 2.0, and Wikis
  2. Using Proxies (don't appear to be blocked as I don't see a ton of socket or download errors)
  3. 306,414 keywords used (a big list of English keywords plus a few niche-specific ones)
  4. 131/852 search engines (English checked)
  5. Always use keywords to find target sites (with 300k keywords, I shouldn't be close to done)
  6. Use Global Sites enabled (Identified; Verified; Submitted)
  7. 100 Outbound; PR 1 or Above; Skip PR N/A
  8. Skip sites/urls with bad words (pretty decent list)
  9. Skip sites from the following languages (all selected, effectively leaving just .com and such; see the sketch after this list)
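
A minimal sketch of that TLD restriction, assuming the filtering happens outside GSA on a scraped URL list before import (Python; the TLD set and file names are placeholders I made up):

    from urllib.parse import urlparse

    # Keep only generic TLDs -- my stand-in for "skip all languages so
    # only .com and such remain". The set and file names are made up.
    GENERIC_TLDS = {"com", "org", "net", "info", "biz"}

    def keep_url(url):
        host = urlparse(url).hostname or ""
        return host.rsplit(".", 1)[-1].lower() in GENERIC_TLDS

    with open("scraped_urls.txt") as src, open("filtered_urls.txt", "w") as dst:
        for line in src:
            url = line.strip()
            if url and keep_url(url):
                dst.write(url + "\n")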

QUESTION:

Is GSA really the issue here, or is the issue that I'm being very specific, and as such there just aren't going to be many targets unless I go scrape on my own? GSA seems like it will scrape over time and find sites to post to using that massive keyword list.

But it has stopped finding many new sites: one per project per day if I'm super lucky.

Maybe the 100-outbound-links filter is irrelevant here, given that I am not doing blog comments?

Any help is greatly appreciated. I have read quite a few of the FAQs and threads regarding performance.

Comments

  • SvenSven www.GSA-Online.de
    What do you see in the log?
  • I'd be happy to send or PM you the full log file, but I'm attaching just a sample here.

    Sample from the log:

    17:05:52: [ ] Attention! No targets to post to (maybe blocked by search engines, no engines allowing url extraction chosen, no scheduled posting)
    17:05:55: [ ] E-Mail Verification finished.
    17:05:56: [ ] Verifying finished (1 check done)
    17:06:00: [ ] Loaded 0/40 URLs from site lists
    17:06:52: [ ] 000/020 [Page 031] results on Lexxe for Article Friendly Ultimate with query "Author Login" "Contact Us" "About Us" "Submit Article" "Free Signup" Home "Search Site" lanosterol
    17:06:56: [ ] 000/010 [Page 003] results on Startpage GB for Catalyst Web CMS with query "ForumRetrieve.aspx" lanosterol
    17:06:59: [ ] 000/020 [Page 045] results on Lexxe for TikiWiki with query lanosterol "Theme: Coelesce"
    17:07:00: [ ] Loaded 0/30 URLs from site lists
    17:07:04: [ ] 000/020 [Page 031] results on Lexxe for Article Dashboard with query "We reserve the right to include advertising on pages with your articles" "Disclaimer" "Author Terms of Service" "Publisher Terms of Service" lanosterol
    17:07:06: [ ] 000/020 [Page 036] results on Lexxe for Article Directory Pro with query "Submit Articles" "Make us your home page" "RSS Feeds" "Add us to favorites" Pristis pectinatus
    17:07:08: [ ] 000/020 [Page 011] results on Lexxe for OSQA with query "questions tags users badges unanswered" "ask a question" lanosterol
    17:07:10: [ ] 000/010 [Page 003] results on Startpage LR for Catalyst Web CMS with query "ForumRetrieve.aspx" lanosterol
    17:07:11: [ ] 000/010 [Page 003] results on Startpage US for Zendesk with query lanosterol support ".com/entries/"
    17:07:15: [ ] 000/010 [Page 003] results on Startpage BS for Article Directory Pro with query "Submit Articles" "Add us to favorites" "RSS Feeds" "Make us your home page" lanosterol
    17:07:21: [ ] 000/010 [Page 004] results on Startpage DM for MoinMoin with query "GPL licensed" "MoinMoin Powered" lanosterol
    17:07:22: [ ] 000/010 [Page 005] results on Startpage ZA for MediaWiki with query "wiki" "This page was last modified on" lanosterol
    17:07:26: [ ] Project on pause for 349 more minutes as 44 submissions have been reached.
    17:07:26: [ ] 000/010 [Page 002] results on Startpage CA for MediaWiki with query lanosterol inurl:"wiki/index.php?title=" wiki
    17:07:28: [ ] 000/010 [Page 005] results on Startpage US for DokuWiki with query "DokuWiki supports some simple markup language" lanosterol
    17:07:31: [ ] 000/010 [Page 003] results on Startpage BZ for MediaWiki with query inurl:"%D0%A3%D1%87%D0%B0%D1%81%D1%82%D0%BD%D0%B8%D0%BA:" wiki lanosterol
    17:07:46: [ ] Loaded 1/30 URLs from site lists
    17:07:46: [-] 1/1 filter "woman" matches domain of http://www.iamwomanglobal.biz/business/linkedin-business-friendship-and-health-cardiff-womens-business-club-social-media/
    17:08:00: [ ] Loaded 0/40 URLs from site lists
    17:08:52: [ ] 000/020 [Page 030] results on Lexxe for AltoCMS-LiveStreet with query "Powered by LiveStreet CMS" lanosterol
    17:09:00: [ ] 000/020 [Page 041] results on Lexxe for Catalyst Web CMS with query "ForumRetrieve.aspx" genus Dipladenia
    17:09:01: [ ] Loaded 0/27 URLs from site lists
    17:09:01: [ ] 000/010 [Page 005] results on Startpage ZA for MediaWiki with query inurl:"Special:Whatlinkshere" lanosterol
  • Update:

    If you are wondering why it's using "lanosterol", it's because that is one of the 300k keywords it can choose from the huge list I am using. Maybe once it goes through all the iterations it can, it will pick another. But I've seen this one for quite some time. It's like it's stuck on that keyword for all its queries.
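
    I don't know how GSA schedules keywords internally, but if I end up scraping on my own, something like this would put a fresh random keyword in every footprint query instead of repeating one like "lanosterol" (a sketch; footprints.txt and keywords.txt are placeholder file names):

        import random

        with open("footprints.txt") as f:
            footprints = [line.strip() for line in f if line.strip()]
        with open("keywords.txt") as f:
            keywords = [line.strip() for line in f if line.strip()]

        # Pair every footprint with a freshly drawn keyword so no single
        # keyword dominates consecutive queries.
        def next_queries(n=10):
            return [f"{random.choice(footprints)} {random.choice(keywords)}"
                    for _ in range(n)]

        for query in next_queries():
            print(query)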
  • I did see this in the log...

    17:59:29: [-] 1/1 language en-us detected and skipped

    Which is very odd, given that English sites are exactly what I want to post to.
    I am wondering if I messed up the "Skip sites from the following languages" setting
    and accidentally picked English? But I do have posts on a few sites, so that doesn't entirely make sense.

    Either way, the line above is odd: it's skipping exactly the kind of site I want to post to.
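
    GSA's language detector isn't something I can see into, but a rough external equivalent, assuming the third-party langdetect package (my choice, not what GSA uses), would be:

        # pip install requests langdetect
        import requests
        from langdetect import detect

        def page_language(url):
            html = requests.get(url, timeout=10).text
            # Crude: detects on the raw HTML; stripping tags first would be better.
            return detect(html)

        print(page_language("http://example.com"))  # e.g. "en"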
  • Update:

    I did have English checked to skip!
    Kind of a big mistake. But even after I unchecked it and saved, I still see sites being skipped for being English.

    I've always been under the impression that GSA's running projects auto-refresh after such changes.

    QUESTION:

    Does the project need to be STOPPED and STARTED for those types of changes to take effect? When I entered the list of keywords and saved, it started using them without me pausing or restarting the project.
  • SvenSven www.GSA-Online.de
    The project should detect the change after a short time and reload the settings, but a stop/start would do it faster.
  • I did a stop/start yesterday. I also imported a pretty decent-sized list of identified sites; my Identified list is now topping 8k.

    It could be that my # of submissions a day is set very low. I'm trying to be ultra-conservative.

  • My Submissions per day for this project are: 0 +/- 5

    I am using the Global site lists: Identified, Submitted, and Verified. Although given this is the first project, I wouldn't think the last two selections matter much unless a lower tier or another project could take advantage of them.

    Sometimes it will reach the daily limit with just 2 submissions.

    But I did check the Target URLs to post, and the list really has grown. Not that big, mind you, but it went from 2 sites to 88 sites after scraping and importing more. I believe that is because the project uses Articles, Web 2.0, and Wikis only, and unless I scrape and find new sites of that class, they won't show up as new targets to post to.

    Tasks:

    * increase the # of submissions before pausing
    * scrape for those specific site types and grow the list so GSA can use it

    I have to get the Identified list bigger so GSA can post to more sites that fit the criteria for this project (a quick dedupe sketch follows below).
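
    For example, to grow the Identified list without stuffing it full of duplicate pages from the same domains, a quick dedupe pass before importing could look like this (a sketch; file names are placeholders):

        from urllib.parse import urlparse

        seen = set()
        with open("scraped_urls.txt") as src, open("deduped_urls.txt", "w") as dst:
            for line in src:
                url = line.strip()
                host = urlparse(url).hostname
                # Keep the first URL seen for each domain.
                if host and host not in seen:
                    seen.add(host)
                    dst.write(url + "\n")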
  • One thing I have noticed since using a combination of Private/Public proxies for searching...

    11:49:53: [!] Proxy 139.0.19.157:8080 blocked on google AS

    This is just one example, but I see a lot of these running through the logs. This proxy isn't one of mine, so I know it has to be a public proxy used for searching. But this could be getting in the way.

    I've seen so many videos going back and forth on the best method. Some say to use public proxies for searching and let GSA find those, then private ones for PR checking. Or split it up some other way... it goes on forever. Confusing!

    Given that I'm not searching for a million sites a second, maybe I should be using private proxies 100% of the time so I can just let GSA do its job and keep moving (a quick log-counting sketch follows below).
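
    To see whether it really is the public proxies getting blocked, I could count those log lines per proxy (a sketch; the log file name is a placeholder):

        import re
        from collections import Counter

        # Matches lines like:
        # 11:49:53: [!] Proxy 139.0.19.157:8080 blocked on google AS
        pattern = re.compile(r"Proxy (\S+) blocked on (\S+)")

        blocked = Counter()
        with open("gsa_log.txt") as log:
            for line in log:
                match = pattern.search(line)
                if match:
                    blocked[match.group(1)] += 1

        for proxy, count in blocked.most_common(10):
            print(proxy, "blocked", count, "times")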
  • Stats:

    12 Submitted; 4 Verified

    In 4 days of running. I could create links manually faster than this. Given that I have Articles, Web 2.0, and Wikis selected, and I am using English search engines and posting to English-language sites, I still find it hard to believe there are just 4 places on the entire Internet to post a link.

    My feeling is that I haven't got this right; I must be missing something here. GSA can't just be for posting crap spam links. I've fixed a few things, but the speed at which it finds places to post and then runs verifications is terrible.

    Honestly, I can create 10 links myself from high-PR sites by hand before breakfast.

    Given the number of positive reviews, I am under the impression it is user error on my part, or a misunderstanding of some setting I have checked, that is killing the tool's usefulness.
  • Here's what I find odd... in my stats I have this:

    Article-Wordpress Article.....: 2933

    Yet I have only 12 submitted and 4 verified sites? I certainly have Articles selected. Could it be that many of them don't meet my criteria?
  • This morning I did a quick scrape and grabbed ~1400 WordPress Article sites.

    Did an "import" target URLs to the project and it started posting pretty fast.  Could be that I'm letting GSA search which is slow and when I imported a list of sites, it started going a lot faster.

    Last I checked, the verified count was still a measly 4. I have it set to verify automatically.

  • goonergooner SERLists.com
    Wordpress is very hard to get links from and very hard to scrape accurately, because it's difficult to tell the difference between one of the millions of standard Wordpress sites and the few that actually accept article submissions.
  • edited September 2014

    @gooner - that is a great point, and one that makes complete sense now that you explain it. Just because you get 10k sites in a scrape doesn't at all mean GSA will be able to use any given percentage of them (a rough pre-check sketch follows below).
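
    One rough pre-check I can run on a scraped WordPress list before importing: request each site's registration page and drop the ones that say registration is closed. This is only a heuristic (open registration still doesn't guarantee article-submission rights), and the function is my own sketch:

        # pip install requests
        import requests

        def registration_open(base_url):
            """Rough check: does this WordPress site serve an open signup form?

            WordPress exposes its signup form at /wp-login.php?action=register;
            sites with registration disabled answer with a "registration is
            currently not allowed" message, and non-WordPress sites usually 404.
            """
            try:
                resp = requests.get(
                    base_url.rstrip("/") + "/wp-login.php?action=register",
                    timeout=10)
            except requests.RequestException:
                return False
            return resp.ok and "not allowed" not in resp.text.lower()

        print(registration_open("http://example.com"))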

  • Question:

    Always use keywords to find target sites

    I've heard so many varying answers on this one. Some say if you don't, GSA will always run the same generic queries to find new sites.

    Some say you use it for niche-specific sites but otherwise to turn it off.

    Others say: why are you ever using GSA to search? It's slow and inefficient; it should be a post-only tool. Go scrape, import a target list, and let GSA sort it out and post.

  • Great question Mda1125, I'm looking for some experts to chime in on your question!