Re: Using Lists
I am running a couple of tier 1 projects, mainly using your SerLists. I also scrape in GSA; it is not awesome, but I get by. I recently added some tier 2 projects, and I understand that I can re-run the lists through for the lower tiers. The problem is that this is a huge overhead: I am logging into my server every 12 hours to clean and replace lists for my lower tiers.
I have tried scraping for the lower tiers, and it is OK, but I do not scrape enough URLs to complete the projects; they stay active forever.
What is your advice on keeping tier 2 fueled? Re-running lists, scraping, or something else? I have tried scraping with Gscraper and Scrapebox, but it doesn't seem to improve anything and just takes up my time. Thanks!
Comments
Set your lower tiers to pull from the identified site list in the project options. That will keep them fueled, and if you do want to import any new lists, you can import them directly into the projects and SER will run the new lists before going back to the identified list.
The projects will sometimes show the message "unable to find targets", but if you keep an eye on the verified column for each project you should see it is still climbing, so you can just ignore that message.
What happens is that if you import directly into a project, SER flies through the URLs and only gets one chance per URL per project to get a verified link. But if you put them into the identified list, SER gets multiple chances at the same link, because it picks URLs at random from the identified list.
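Just to illustrate the difference (this is only a toy simulation with made-up numbers, not anything SER actually does): a direct project import is a single pass over the list, while the identified folder works more like a pool that SER keeps sampling from, so a URL that failed once can come around again later.

import random

# Toy model only: "direct" = one pass, one attempt per URL;
# "pool" = random picks from the identified list, so failed URLs get retried.
SUCCESS_RATE = 0.2                                       # assumed chance an attempt verifies
urls = [f"http://example{i}.com/" for i in range(100)]   # hypothetical targets

def attempt(url):
    return random.random() < SUCCESS_RATE

# Direct import: one chance per URL per project.
verified_direct = {u for u in urls if attempt(u)}

# Identified pool: random picks over time, so the same URL can be retried.
verified_pool = set()
for _ in range(500):                        # 500 posting attempts over time
    u = random.choice(urls)                 # random pick from the pool
    if u not in verified_pool and attempt(u):
        verified_pool.add(u)

print(f"direct import  : {len(verified_direct)}/{len(urls)} verified")
print(f"identified pool: {len(verified_pool)}/{len(urls)} verified")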
You can also select "allow posting on same domain" in the project options. For tier 2/3 etc. this is fine, as you are using multiple URLs from the higher tiers as your links, so you can post to the same site over and over and chances are each time it will carry a different tier 1 link.
I don't know if I explained that very well?
The guide is based mainly on how to get maximum speed, because everyone is obsessed with LpM, but it does require more work doing it that way.
I have all projects running from the identified list as a backup, so when I get a new list I import it first into all the projects and then into the identified folder.
The only problem is that after a few months your identified folder will be huge and full of sites that are now dead or no longer allow posting, and this will slow SER down.
So what I do is delete everything in the identified folder and start again from scratch. This is fine for me because I am always building as well as buying new lists. If you don't get many new lists, you might want to clean up your identified list as per our guide instead of deleting it completely.
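If you just want to knock the duplicates out of the identified folder between full clean-ups, a minimal sketch like the one below can do it outside of SER, since the site lists are just plain text files with one URL per line. The folder path shown is an assumption; check Options > Advanced in SER for where your site lists actually live, and close SER before touching the files.

import glob
import os

# Minimal sketch: de-duplicate each plain-text site list file in the
# identified folder. The path below is an assumption -- replace it with
# the identified folder shown in your own SER options.
SITE_LIST_DIR = r"C:\Users\me\AppData\Roaming\GSA Search Engine Ranker\site_list-identify"

for path in glob.glob(os.path.join(SITE_LIST_DIR, "*.txt")):
    with open(path, "r", encoding="utf-8", errors="ignore") as f:
        urls = [line.strip() for line in f if line.strip()]
    unique = list(dict.fromkeys(urls))      # drop duplicates, keep order
    if len(unique) < len(urls):
        with open(path, "w", encoding="utf-8") as f:
            f.write("\n".join(unique) + "\n")
        print(f"{os.path.basename(path)}: {len(urls)} -> {len(unique)} URLs")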