
Question about how GSA finds target site

Obviously SER finds target sites using the keywords that I have provided.

But what about the sites that are not tailored to specific keywords, like article directories and web 2.0s? How does it find those kinds of sites if left on its own?

The reason I am asking is that with around 100 keywords and only article directories enabled I am getting a very low submission rate (0.1 LPM).

Comments

  • SvenSven www.GSA-Online.de
    For those it uses either generic keywords (stop words) or just the plain footprints.
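Sven's point about stop words and footprints can be sketched roughly like this. This is a hypothetical illustration of the general technique (combining engine footprints with generic filler keywords to form search queries), not SER's actual code; all names and footprints below are made up:

```python
# Illustrative sketch: build search queries for engines that have no
# user-supplied keywords, by pairing each footprint with generic stop
# words, or using the footprint on its own. Footprints are examples only.

STOP_WORDS = ["the", "about", "with"]           # generic filler keywords
FOOTPRINTS = [
    '"powered by wordpress"',                   # example platform footprint
    'inurl:"/submit-article"',                  # example URL footprint
]

def build_queries(footprints, stop_words):
    """Return the plain footprint plus one query per footprint/stop-word pair."""
    queries = []
    for fp in footprints:
        queries.append(fp)                      # plain footprint query
        for word in stop_words:
            queries.append(f"{fp} {word}")      # footprint + stop word
    return queries

queries = build_queries(FOOTPRINTS, STOP_WORDS)
print(len(queries))  # 2 footprints * (1 plain + 3 stop words) = 8
```

Each query is then sent to the selected search engines, which is why even keyword-less engines can still turn up targets.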
  • bestimtoolzbestimtoolz High PR WEB 2.0 posting service - affordable !
    Why are you using GSA SER for scraping? It's a waste of its potential. Have you tried GScraper or Hrefer for scraping?
  • @Sven OK. I left it running last night and checked in the morning after it ran for 12 hours. It seems it stopped building during the night, having built only 70 links with only 10 verified. It was showing the message "2014-08-29 19:07: No targets to post to (maybe blocked by search engines, no engines allowing url extraction chosen, no scheduled posting)".

    What could be the problem? The proxies are working fine for searching, so we can rule that out.
  • SvenSven www.GSA-Online.de
    what engines do you use?
  • edited August 2014
    Article engines only.

    I set the project status to email verify only. In the log it shows that it's checking various emails with verifications waiting. Then it says e-mail verification finished. But none seem to be actually verified and when I go to the URL list there are still submissions awaiting account verification.

    Also I happen to be getting a lot of "proxy might be blocked" messages in the log, even though they successfully pass a google test.
  • edited August 2014
    I set up a completely new test project with articles, web2s and wikis with zero filters and 2000 keywords. Let's see what happens now.
  • edited September 2014
    Hey Sven,

    I ran that test project, and when I came back I saw that the new test project had the same problem (lots of "awaiting account verification" messages). In over 12 hours it has been able to verify only 150 links. The number of submitted goes up to 1000, then back down to 500, and varies like that. I think that happens because the accounts never get successfully verified, so the submitted number keeps rising and falling.

    I also had a similar project on a YouTube video that had ALL engines selected. I got a good 8-14 LPM out of that. HOWEVER, when I clicked Show URLs on that project, it ALSO had many "awaiting account verification" messages for Articles, Wikis, Social Networks and Social Bookmarks. This wasn't a problem with guestbooks, blog comments, trackbacks, etc.

    I simply don't know what the problem is, but here's a list of potential problems:

    - I use public proxies (but they are anonymous and google-passed and always fresh - scraped by No Hands Proxies). Maybe it tries to verify the same account from different public proxy IPs and that's why the account check fails.

    - There might be a problem with the emails that I am using, but they are all Yahoo emails and they successfully pass the email check. I don't know if they are spam-filter disabled, but that shouldn't be a problem with Yahoo emails.

    What's really weird is that if I set the status of the project to "Active (Verify Emails Only)" it seems to verify the emails according to the log, but no URLs actually get verified.


    Help me out here Sven! What other information do you need?
  • SvenSven www.GSA-Online.de
    Public proxies... well, I'm not so sure, but that is the cause of all evil.
  • edited September 2014
    Ha ha. Well Sven, my guess is that that's probably the problem as well! If the same account is trying to get verified from different public proxy IPs (even if they are anonymous), I guess that might look fishy. What do you think?
  • SvenSven www.GSA-Online.de
    SER tries to use the same proxy for each email account.
  • Then what might be the problem?

    I will get my hands on some private proxies and see if the problem persists.
  • Got 10 private proxies. Scraped my own list. 

    Results: 25 LPM on average, 10 VPM on average, 10% article links, 20% social networks, 20% blog comments, 60% dofollow, 12000 verified URLs in 24 hours

    Good start for first time GSA SER user. What do you think Sven? :D
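For context on the LPM/VPM figures in this thread: LPM (links per minute) and VPM (verified links per minute) are simply a count divided by elapsed minutes. A minimal sketch, with illustrative numbers rather than the poster's actual totals:

```python
def per_minute(count, hours):
    """Average links (or verifications) per minute over a run of `hours`."""
    return count / (hours * 60)

# Illustrative: 36000 submitted links over a 24-hour run
print(per_minute(36000, 24))  # 25.0 LPM
```

The same formula applies to verified counts; a large gap between LPM and VPM (as discussed above) usually points at verification failures rather than submission problems.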
  • SvenSven www.GSA-Online.de
    yep sounds good
  • edited September 2014
    Hey Monie,

    You said 10% article links, 20% social networks, 20% blog comments, but what was the other 50%?
    And what did you use for scraping?


  • Hi Jual,

    I used Scrapebox to scrape. I think the remaining links were evenly distributed among various link types; I don't remember right now. I will post the current status of the test project later.
  • ok thanks :)