
Best method for gathering keywords?

So currently I'm using Keyword Map Pro along with Market Samurai to gather all my keywords. It seems to work okay this way, but Market Samurai will only list 801 keywords at a time. Should I invest in Scrapebox or a similar program that will give me much better results? I would like to get a bigger list of results for each keyword than 801.

Comments

  • You should invest in Scrapebox, not just for the keyword scraper, but for all the other cool stuff that it does too.

    Scrapebox will give you as big a list as you care to generate - you can scrape from a set of keywords, then scrape a new set from the results of the previous scrape, with the list expanding exponentially.
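
    The expanding scrape described above is essentially a breadth-first loop: each round's results become the next round's seeds. Here is a minimal, hypothetical sketch of that idea in Python; `scrape_suggestions` is an assumed placeholder for whatever keyword source you use (it is not a real Scrapebox call).

    ```python
    def scrape_suggestions(keyword):
        """Placeholder: return related keywords for a seed keyword.
        In practice this would query a suggest source or parse a tool's export."""
        return [f"{keyword} {suffix}" for suffix in ("tips", "guide", "review")]

    def expand_keywords(seeds, rounds=2):
        """Feed each round's results back in as the next round's seeds,
        so the list grows with every pass."""
        seen = set(seeds)
        frontier = list(seeds)
        for _ in range(rounds):
            next_frontier = []
            for kw in frontier:
                for suggestion in scrape_suggestions(kw):
                    if suggestion not in seen:
                        seen.add(suggestion)
                        next_frontier.append(suggestion)
            frontier = next_frontier
        return sorted(seen)

    keywords = expand_keywords(["keyword research"])
    print(len(keywords))  # 1 seed grows to 13 unique keywords after 2 rounds
    ```

    Each extra round multiplies the frontier, which is why the list can grow exponentially until the tool (or your patience) gives out.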
  • MrXMrX Germany
    Is anyone using names as keywords?
  • @2Take2 - Thanks for your post. I just bought Scrapebox the other day and I'm still learning how to use it. I've been using the 100K most searched list for over a year now, and I'm thinking I should probably start looking for new targets, so I decided to invest in SB. Are you trying to stay niche relevant with your keyword lists? How many keywords are you putting into each project? And last question: are you using SB to scrape targets for GSA to submit to?

    Thanks,
    George
  • 2Take22Take2 UK
    edited October 2013
    Hi, @gtsurfs, you're welcome.

    I always add some niche keywords, but (depending on what you are doing) for *most* platforms you are better off using generic keywords as well.

    The number of keywords and footprints that I use varies, but generally I try to use as many as I can without crashing the program - you'll need to experiment a bit.

    Yes, sometimes I use Scrapebox for GSA SER target lists, but I'm finding that I'm doing it less and less these days.

  • One thing I've never understood. Perhaps let me ask you, @2Take2, as you're far more experienced with SER than me. Isn't scraping in Scrapebox going to give you the same results again and again, whether you scrape today, tomorrow, or after a week? I mean, there's no drastic change, I guess, as rankings are maintained for a considerable time in the Google SERPs. Doesn't letting SER scrape make more sense then? :)

    Cheers.
  • 2Take22Take2 UK
    edited October 2013
    @Pratik - With approx. 150,000 new sites added to the internet every day, I don't think that running out of targets is going to be a problem.

    If you use the same keywords and footprints every time, then yes, as you say, the more often you scrape the more duplicates you will probably get.

    If you don't, then it probably wouldn't be a noticeable problem, especially if you are scraping from multiple search engines, and not just scraping the top 10 results per keyword.
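
    The duplicate problem discussed above is usually handled by keeping a master set of everything already scraped and only keeping what's new from each run. This is a generic sketch of that bookkeeping, not anything tool-specific; the example URLs are made up.

    ```python
    def dedupe_new_results(master, new_results):
        """Return only results not already in the master set, and add them to it."""
        fresh = [url for url in new_results if url not in master]
        master.update(fresh)
        return fresh

    master = set()
    run1 = ["http://a.example", "http://b.example"]
    run2 = ["http://b.example", "http://c.example"]  # overlaps with run1

    print(dedupe_new_results(master, run1))  # both URLs are new
    print(dedupe_new_results(master, run2))  # only http://c.example survives
    ```

    Varying keywords, footprints, and search engines between runs shrinks the overlap in the first place; the master set then catches whatever duplicates remain.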

