Changing Keywords - are you doing it?
Hey,
I'm using the 100k keyword list from this forum for all of my projects to cast a really wide net of pages I can target. I don't go for "relevancy" since it doesn't make sense for most engines, IMHO (correct me if I'm wrong).
I don't let SER collect keywords, since most collected keywords are just long-tail stuff I don't need.
Does it make sense to use different keyword lists per project, or to refresh the list every few weeks?
I'm seeing lots of "already parsed" messages, and my LpM is stuck at 30-40 most of the time.
The first thing I'll do is uncheck some poorly performing engines, but I think I also need to work on that "already parsed" issue.
Your ideas are welcome.
Best Regards
PS: I'd build a fresh list by putting around 20 generic words (state, finance, woman, man, etc.) into Scrapebox and letting SB do its dirty work. Sound like a good idea?
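The seeding idea above can be sketched in code. This is only an illustration of the "fan out a few generic seeds" step: keyword-suggestion scrapers like Scrapebox typically append a-z to each seed and query a search engine's autocomplete with those variations. The function below just builds the query variations; it does not call any scraping service, and the function name is my own.

```python
# Hypothetical sketch: expand ~20 generic seed words into the query
# variations a keyword-suggestion scraper would typically send
# (seed alone, plus "seed a" ... "seed z").
import string

def expand_seeds(seeds):
    queries = []
    for seed in seeds:
        queries.append(seed)  # the bare seed itself
        for letter in string.ascii_lowercase:
            queries.append(f"{seed} {letter}")  # alphabet fan-out
    return queries
```

Each seed yields 27 queries, so 20 seeds give 540 starting points before the autocomplete results multiply that further.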
Comments
I have been using that one for a year and still get great results.
You could take that million-URL list, stick it in Scrapebox, de-dupe it, and break it into blocks of 100,000. Then change the block every month and repeat the cycle.
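The de-dupe-and-split workflow above is normally done inside Scrapebox, but the same thing can be sketched in a few lines of Python if you'd rather script it. The file names and function name here are hypothetical; the block size matches the 100,000 figure mentioned above.

```python
# Sketch: de-dupe a large keyword/URL list and split it into
# 100,000-line blocks, mimicking the Scrapebox workflow described above.
from pathlib import Path

def dedupe_and_chunk(src: str, out_dir: str, block_size: int = 100_000) -> int:
    """De-dupe src (one entry per line) and write numbered block files.

    Returns the number of block files written.
    """
    seen = set()
    unique = []
    for line in Path(src).read_text().splitlines():
        entry = line.strip().lower()
        if entry and entry not in seen:  # keep first occurrence only
            seen.add(entry)
            unique.append(entry)

    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    blocks = 0
    for i in range(0, len(unique), block_size):
        blocks += 1
        (out / f"block_{blocks:03d}.txt").write_text(
            "\n".join(unique[i:i + block_size]) + "\n"
        )
    return blocks
```

Swap one block file into your projects each month and you get the monthly rotation the comment describes without touching the rest of the list.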
Remember, new blogs come online every day, so the same list can keep producing fresh results over time.
That could easily be a function of how many projects you have, how many limits you have on projects, etc.
I just started some spring cleaning to remove the worthless projects that weren't bearing fruit, and my LPM went down. So don't sweat it. LPM is sexy, but results matter more.