
How many keywords needed?

edited October 2012 in Need Help
I think Ozz has said he normally puts 10k+ keywords (a lot!) into a project.

My question is whether GSA works in such a way that putting in, say, only 100 keywords would limit its ability to find sites to put links on. If you're planning to run a project for weeks or months on end, would this affect GSA's ability to find links in any way? Would it slow it down? Would putting in many more keywords make it find more sites, and find them more quickly?

Comments

  • Ozz
    edited October 2012
    Of course it has. Once all 100 keywords have been used with all search engines, you won't get many target URLs anymore. Remember that the results of all SEs are duplicated to some extent, and the results won't differ that much from day to day within the maximum of 1000 results per SE.
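
    A back-of-the-envelope sketch of that ceiling (every number here is an assumption for illustration, not a measurement):

        # 100 keywords queried against 5 SEs at 1000 max results per SE,
        # with an assumed 90% overlap between engines and queries.
        keywords, engines, max_results = 100, 5, 1000
        overlap = 0.90
        unique_targets = keywords * engines * max_results * (1 - overlap)
        print(round(unique_targets))  # -> 50000 unique targets at best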

    You can try to add some phrases like "best", "cheapest", "top", etc. to every keyword and you might get some different results.

    You can spin them with {{best|top|cheap|awesome} Keyword1,Keyword...,Keyword100}
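
    A minimal sketch of building that kind of spin automatically for a plain keyword list (the modifier set and keywords below are placeholders):

        # Wrap every keyword in the same {a|b|c} modifier spin and join the
        # results with commas for pasting into the keyword field.
        modifiers = ["best", "top", "cheap", "awesome"]
        keywords = ["dog training", "cat food"]  # placeholder keywords
        spun = ["{%s} %s" % ("|".join(modifiers), kw) for kw in keywords]
        print(", ".join(spun))
        # -> {best|top|cheap|awesome} dog training, {best|top|cheap|awesome} cat food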
  • Is there an option anywhere that tells you when all your keywords have been used?
  • AlexR Cape Town
    @Ozz - How do you generate 10,000 keywords?! Surely they are all so similar that 90% generate duplicate results (leaving about 1,000 keywords that generate unique target URLs), so it only uses resources to find duplicate target URLs? I tried in SB with this many and almost every time it generated 90% to 95% duplicate target URLs. So I guess either you have a very smart way of generating 10,000 keywords that generate unique target URLs, or you're getting duplicate target URLs. ;-)
  • That's cool to know that even keywords are spinnable.

    You can spin them with {{best|top|cheap|awesome} Keyword1,Keyword...,Keyword100}

    That makes it a lot easier.

  • edited October 2012
    Ozz,

    When I tried to import my keywords using spintax, it looked like this - see here http://screencast.com/t/eAGnsiAqFim

    Is this the way it's meant to be?

    Then when I try to save the project, it says, "The spin spintax in keywords might be wrong. It produces content with {} spins in it"

  • Ozz
    edited October 2012
    Test the spin with the "test" button. If everything looks fine in the "keyword" field, then it should work.
    I did this spin on a former project and don't use it these days, so I can't test it myself right now.
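
    If you want to double-check outside of SER, here is a hedged offline sketch of the same test - resolve the spin a few times and make sure no {} braces survive (how SER itself resolves spins is an assumption here):

        import random
        import re

        def resolve_spintax(text):
            """Replace the innermost {a|b|c} group with a random option
            until no spin groups remain."""
            pattern = re.compile(r"\{([^{}]*)\}")
            while True:
                m = pattern.search(text)
                if not m:
                    return text
                pick = random.choice(m.group(1).split("|"))
                text = text[:m.start()] + pick + text[m.end():]

        sample = "{best|top|cheap|awesome} dog training"
        for _ in range(3):
            print(resolve_spintax(sample))
        # Leftover braces would mean a malformed spin - the same thing the
        # "produces content with {} spins in it" warning above complains about.
        assert "{" not in resolve_spintax(sample)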
  • Good idea. I tested it and it seemed like none of the spins had the added "best", "top", etc. that I tried to put in.
  • AlexR Cape Town
    What's the best way to generate keywords to use? I am using the Scrapebox keyword tool, but when I scrape the URLs I'm having trouble with it generating too many duplicate target URLs.
  • You can try to use Google's keyword tool first and expand those keywords with Scrapebox.
  • Yes, using Google's keyword tool and then putting the results into Scrapebox helps a lot. It gives me many more keywords.
  • AlexR Cape Town
    That is what I did. I used the Google keyword tool and then expanded it with SB. I then add in the GSA footprints, but I get too many duplicate target URLs when I run the keywords that are generated through the SB URL harvester. Any ideas? I figure others are having the same issue.
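
    A minimal sketch of trimming those duplicates before SER ever sees them: collapse the harvest to one URL per host (the file names below are placeholders for whatever your harvester exports, one URL per line):

        from urllib.parse import urlparse

        seen = set()
        unique = []
        with open("harvested.txt") as f:          # placeholder export file
            for line in f:
                url = line.strip()
                if not url:
                    continue
                host = urlparse(url).netloc.lower()
                if host.startswith("www."):
                    host = host[4:]
                # Keep only the first URL seen for each host.
                if host not in seen:
                    seen.add(host)
                    unique.append(url)

        with open("harvested_unique.txt", "w") as out:
            out.write("\n".join(unique))
        print(len(unique), "unique hosts kept")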
  • Ozz
    edited October 2012
    If you only want to find new targets, you can also add some generic keywords to your main keywords.

    For example:
    generic keywords -> able, possible, be, a, soon, ... ||| ab, ac, ad, ae, .., ba, bc, bd, ... ||| 01, 02, 03, ...
    main keyword -> dog, cat

    => keywords for search query -> dog able, cat able ||| dog ab, cat ab ||| dog 01, cat 01

    If you want to do that, then google for some Scrapebox tips on that topic and search for some generic keyword lists (most common words, most common names, ...).
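    A minimal sketch of that expansion (the word lists below are placeholders):

        # Pair every main keyword with every generic token to widen the
        # search-query pool, as in the dog/cat example above.
        import itertools

        generic = ["able", "possible", "be", "soon", "ab", "ac", "01", "02"]
        main = ["dog", "cat"]
        queries = [f"{kw} {g}" for kw, g in itertools.product(main, generic)]
        print(queries[:4])  # -> ['dog able', 'dog possible', 'dog be', 'dog soon']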

  • I think that some people expect to be able to find more sites than actually exist. Use a little logic in your thinking: there are only so many website pages on the internet about "blue widgets". When you break them down even further by platform (like SER does), there are even fewer. So, at some point, it doesn't matter what you search for (best blue widgets, top blue widgets, blue widgets able, who sells blue widgets, ...) - you are going to get many of the same pages back in the search results.

    This is especially pronounced if your niche is fairly narrow. There are going to be fewer keywords (and fewer search results) to choose from. You are going to have to either live with this or use generic keywords and be happy with the links you get.

    Be realistic in what you expect!
  • AlexR Cape Town
    @DavidA2 - that's exactly my point! It seems that after a certain point you start generating duplicate target URLs, so it doesn't make sense to use a massive keyword list; it just consumes resources scraping targets, only to find that most have already been parsed. Once you have run the basic set of keywords generated by the Google keyword tool across multiple SEs, you should in most instances have found most of the possible niche/keyword target URLs. There is no need to expand the keywords endlessly. That's why I am trying to focus more on getting the links verified on the target sites it has found (and also why I am asking for a feature to resubmit failed target URLs on a per-project basis, rather than a global basis).
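
    One hedged way to see where that point of diminishing returns sits: count how many new hosts each extra keyword batch actually contributes (scrape_batches below is a hypothetical stand-in for your harvester export):

        from urllib.parse import urlparse

        def scrape_batches():
            """Hypothetical stand-in: yields one list of harvested URLs
            per keyword batch."""
            yield ["http://a.com/x", "http://b.com/y"]
            yield ["http://a.com/z", "http://c.com/w"]  # a.com already seen

        seen = set()
        for n, urls in enumerate(scrape_batches(), 1):
            fresh = {urlparse(u).netloc for u in urls} - seen
            seen |= fresh
            print("batch %d: %d new hosts" % (n, len(fresh)))
        # When "new hosts" flatlines, extra keywords are only re-finding
        # targets you already have.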