
Using the same 100k keyword list for all projects?

edited February 2013 in Need Help
Hi there, I found a great list of the top 100k searches and used it for all my projects (around 6), but now it's getting really slow, with a lot of "already parsed" URLs.
Is this because I'm using the same list?
Even if I clear the caches, I still get "already parsed".

Comments

  • Same problem happened to me: my GSA SER is also running slow, and I'm also using a 100k generic keyword list with 156 English-language search engines selected in the search engine options.

    [screenshot of project options]

    The options below are set so I post to any PR and avoid posting to the same domain. That's all...

    I use the scheduler with 10- and 30-minute waiting times.

    Does anyone have a solution?
  • I guess you could reduce the number of search engines you use. Some people recommend only 5-10; others recommend choosing 5 Google SEs (different countries).

    I found that unchecking "always use keywords to find target sites" helps a lot.
    You could also try checking "Submitted" in your URLs from global site lists...

    I was reaching 120 LPM doing this... but then my proxies got flagged by Google! Damn, this is a complex game.
  • Which 5 are people using? I have also read that somewhere, as I am also using 156 engines, all English-language. And what does that 500-600 mean? I am using 0 to 100. (Does it mean it will give between 500 and 600 verified links?) Can you give a link to that 100k keyword list? I am also using one, but I got it from some other forum, so it may be different.
  • And what do the referrer and indexer options do? Is the indexer function similar to SEO Indexer?
  • @prodak I read in the threads that it's recommended to keep the keyword lists different across all projects.
  • @prodak Can you share that 100k searches list?
  • You would be better served to split the keyword list into chunks and load a different part into each project. Otherwise all your projects are doing the same searches (assuming the projects use the same engines). That is wasted work for SER!

    This is even more true if you are building global site lists, because the URLs found in one project are loaded into the GSL and can then be used by all your other projects (without SER having to do any searches).

    The other point to consider is that if you are using the same KW list that hundreds of others are (even if it is a large list), you just end up with the same sites as everyone else. There are plenty of keyword scrapers available; scrape your own keywords and load them into your projects.
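The chunking advice above can be sketched as a small script. This is just a minimal sketch, assuming a plain one-keyword-per-line text file; the function names and output filenames are made up for illustration, not anything from SER itself:

```python
def chunk_keywords(keywords, n_projects):
    """Deal keywords round-robin into n_projects disjoint chunks,
    so no two projects ever run the same search query."""
    return [keywords[i::n_projects] for i in range(n_projects)]

def split_keyword_file(path, n_projects):
    """Read a one-keyword-per-line file and write one chunk file per
    project (hypothetical filenames: keywords_project_1.txt, ...)."""
    with open(path, encoding="utf-8") as f:
        keywords = [line.strip() for line in f if line.strip()]
    for i, chunk in enumerate(chunk_keywords(keywords, n_projects), start=1):
        with open(f"keywords_project_{i}.txt", "w", encoding="utf-8") as out:
            out.write("\n".join(chunk))
```

Then you would import a different `keywords_project_N.txt` into each of your 6 projects instead of the full 100k list. Round-robin dealing keeps the chunks roughly equal in size even if the source list isn't evenly divisible.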