
Any easy way to import 100k keywords in GSA SER?


Comments

  • emptee http://prjacker.com
    No offence.. but you're probably doing it wrong..

    If you're wanting to scrape with SER, you're better off using something else (ScrapeBox, GScraper, etc..)

    If you're wanting to use the keywords for anchor text.. well.. do you really want/need 100k keywords? LSI is one thing.. but I can't think of any reasonable way of checking over each of those keywords to make sure they're what you want..

    Just my 2c..

    Michael
  • Hey Michael,
    Thanks for your response. Yeah, I got some info by reading through threads - software like http://www.getspintax.com/ or ScrapeBox seems best for this....
  • One more thing I need to know: how do I get 100k keywords in the first place? Do I also use ScrapeBox for that, or some other source? (Sketch below.)
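    For anyone wondering the same thing later: a common trick is seeding Google's unofficial suggest endpoint with a handful of base terms and collecting what comes back - the same idea behind ScrapeBox's keyword scraper. A rough Python sketch, assuming that endpoint still answers like this (it's unofficial and it rate-limits hard):

        import json, time, urllib.parse, urllib.request

        def suggestions(term):
            # Unofficial Google Suggest endpoint; returns [query, [suggestions, ...]]
            url = ("https://suggestqueries.google.com/complete/search"
                   "?client=firefox&q=" + urllib.parse.quote(term))
            with urllib.request.urlopen(url) as resp:
                return json.loads(resp.read().decode("utf-8", "replace"))[1]

        seeds = ["polycarbonate", "gibberellins", "maize"]
        keywords = set(seeds)
        for seed in seeds:
            for letter in "abcdefghijklmnopqrstuvwxyz":  # fan out: "maize a", "maize b", ...
                keywords.update(suggestions(seed + " " + letter))
                time.sleep(1)                            # be polite or get blocked
        print(len(keywords), "keywords collected")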
  • andrzejek Polska
    edited July 2015
    The best way is to create your own keyword list... well, you can always use these dictionaries, but personally I see no point - you just get duplicated results all the time. On the other hand, if you need all the domains and you don't care how long the scrape takes... go for it. It all depends on what results you want to achieve.

    I personally use pretty random words, for example:

    Polycarbonate
    Resistant
    tendence
    rainbow
    Gibberellins
    Gravitropism
    Maize
    Shoots

    Right now I'm running a scrape with 433k words like that, in different languages, plus 33 footprints. That comes to

    13 856 000 queries * 10 pages per query =
    138 560 000 actual queries.
    With, say, a 30-second pause per query and 500 threads, that's
    1 000 queries per minute, which works out to
    roughly 96 days of parsing...
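
    In code, the same back-of-the-envelope calc looks like this, so you can plug in your own numbers (a rough sketch - neither SER nor Hrefer does this for you):

        # Scrape-budget estimate using the numbers above.
        queries = 13_856_000      # keyword * footprint combinations
        pages   = 10              # result pages fetched per query
        threads = 500
        pause_s = 30              # pause per query, per thread

        fetches = queries * pages                # 138,560,000 actual queries
        per_minute = threads * 60 // pause_s     # 1,000 queries per minute
        days = fetches / per_minute / (60 * 24)
        print(f"{fetches:,} fetches -> {days:.0f} days of parsing")  # ~96 days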

    It's just an example, but you should consider these things before you start scraping. Or consider another way of getting domain lists ;)

    Regards

    Btw, that's why I love Hrefer - it removes duplicates while it scrapes...


    84 779 926 duplicate domains filtered out in 4 hours... after 90 days of parsing, how would I even delete all the duplicates?

    Extrapolated, that's roughly

    46 440 000 000 duplicate domains after 90 days.
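
    If you end up post-processing your own lists instead, the dedupe-as-you-go idea is only a few lines - a sketch of the concept (file names are placeholders, and this has nothing to do with Hrefer's internals):

        from urllib.parse import urlparse

        def first_url_per_domain(lines):
            seen = set()          # for lists that outgrow RAM, shard by
            for line in lines:    # domain hash or switch to a Bloom filter
                url = line.strip()
                domain = urlparse(url).netloc.lower()
                if domain and domain not in seen:
                    seen.add(domain)
                    yield url

        with open("scraped_urls.txt") as src, open("unique_domains.txt", "w") as dst:
            for url in first_url_per_domain(src):
                dst.write(url + "\n")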

    ~O)