Keyword Lists
Satans_Apprentice
SERLists.com
in Need Help
What's the story with keyword lists for scraping? Should it be niche specific, or a generic list?
If it's generic, can someone point me to a list?
Comments
So my question is, when you get a huge keyword list, are you putting that same KW list into every project, or just using the tools and having SER identify and sort them then and there.
Also, are you then using any/all of the 3 options of "collect kws from sites", "use collected kws to find new sites", "put kws in quotes"
I never have these checked because of the aforementioned thoughts, but I assume this is why I have such a low LPM.
I used to create a 3,000 to 5,000 keyword list for my niche, but I stopped bothering with that altogether. It was too difficult to find links.
Now I'm using a 100,000-keyword list that is split into files of 250 keywords each, and I use it for every project. I never run out of targets.
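That splitting step is easy to script yourself. Here is a minimal Python sketch, assuming a plain-text master list with one keyword per line; the file names and the `split_keywords` helper are my own illustration, not what any SER tool actually produces:

```python
from pathlib import Path

def split_keywords(master_file, out_dir, chunk_size=250):
    """Split a master keyword list (one keyword per line) into
    numbered files of chunk_size keywords each."""
    keywords = [k.strip() for k in Path(master_file).read_text().splitlines() if k.strip()]
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for i in range(0, len(keywords), chunk_size):
        chunk = keywords[i:i + chunk_size]
        # e.g. keywords_0001.txt, keywords_0002.txt, ...
        (out / f"keywords_{i // chunk_size + 1:04d}.txt").write_text("\n".join(chunk) + "\n")
    # return totals so the caller can sanity-check the split
    return len(keywords), (len(keywords) + chunk_size - 1) // chunk_size
```

A 100,000-keyword list would come out as 400 files of 250 keywords, ready to drop into a folder that a project can pull from.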
For me, the important point is that you will rank your site based on contextual references to your money site. You have complete control over the creation of contextual properties. Then add in some high-PR links and you have a winner.
The links that can be found with niche specific keywords, like blogs, are fairly unimportant in the big scheme of things - just my honest opinion.
P.S. Do a search for the 100,000 most-used search terms. What could possibly get you more results than the most searched terms on the internet?
@tsaimllc - You are overthinking it. Just put the keywords into the project and hit start. Job done. @ron explained the reasoning earlier in this thread.
Here is the macro I stick in the keyword field:
%spinfolder-C:\Users\Administrator\Dropbox\kwspins%
Obviously you need the correct path, so fix that. I found early on that grabbing the keywords out of Dropbox seemed to make SER faster, so I have stuck with that.
First, there's no drag on the system with a large list. Second, I used the SER Editing Tool from @kaykay to bust that list up into files of 250 each, because then I know I am using a fresh list all the time. SER grabs a new set of 250 keywords each time it does its thing - I believe that when a project is started, it grabs a new file. Since SER looks to the project file to determine which keywords to use, and every one of my projects has that token, it's fresh meat for each project each time.
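For anyone curious what that token is doing conceptually: each time it is evaluated, a spinfolder-style macro picks one file at random from the folder. A rough Python sketch of that selection logic, under my own assumptions (the folder path and `pick_keyword_file` name are illustrative, not SER internals):

```python
import random
from pathlib import Path

def pick_keyword_file(folder):
    """Pick one keyword file at random from a folder - roughly what a
    spinfolder-style macro does each time a project needs keywords."""
    files = sorted(Path(folder).glob("*.txt"))
    if not files:
        raise FileNotFoundError(f"no .txt keyword files in {folder}")
    return random.choice(files)

def load_keywords(folder):
    """Return the keyword list from one randomly chosen file."""
    return pick_keyword_file(folder).read_text().splitlines()
```

With 400 files of 250 keywords each, every project start draws a different slice of the master list instead of hammering the same keywords over and over.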