
Re-Using Lists

I am running a couple of tier 1 projects, mainly using your SERLists. I also scrape in GSA; it is not awesome, but I get by. I recently added some tier 2 projects, and I understand that I can re-run the lists through for the lower tiers. Problem is, this is huge overhead: I am logging into my server every 12 hours to clean and replace lists for my lower tiers.

I have tried scraping for the lower tiers... and it is OK, but I do not scrape enough URLs to complete the projects, so they stay active forever.

What is your advice on keeping tier 2 fueled? Re-running lists, scraping, or what? I have tried scraping with Gscraper and Scrapebox, but it doesn't seem to improve anything and takes up my time. Thanks!

Comments

  • goonergooner SERLists.com
    Why not import your own verified list and any lists you have bought into the identified folder, de-dup it, and then allow the lower tier projects to run from identified? (A file-level sketch of the merge and de-dup step appears after the thread.)

    That will keep them fueled, and if you do want to import any new lists, you can import them directly into projects and SER will run the new lists before going back to identified.
  • Oh yeah, I do that. Problem is, even if I import 2-3 SERLists plus my own list into my tier 2, I run through them in a day and I am out of posting targets. I do not have enough lists to sustain a project for any length of time. Thoughts?
  • goonergooner SERLists.com
    edited August 2014
    If you put all the lists into identified and let the projects run from there as a backup, it should keep going for weeks and weeks.

    The projects sometimes show the "unable to find targets" message, but if you take note of the verified column for each project you should see it is still climbing, so you can just ignore the message.

    What happens if you import directly to a project is that SER flies through the list, and it only gets one chance per URL per project to get a verified link. But if you put them in identified, SER has multiple chances to get the same link, because it picks URLs at random from the identified list. (The quick probability sketch after the thread shows why repeated random picks win out.)

    You can also select "allow posting on same domain" in the project options. For tier 2/3 etc. this is fine, as you are using multiple URLs from higher tiers as your links, so you can post to the same site over and over; chances are each time it will be a different tier 1 link.

    I don't know if I explained that very well?
  • Yeah, that makes sense. I will turn it on for my lower tiers. It also makes sense to import into identified instead of into each project. Am I safe to use the identified folder for all projects, including tier 1 and tier 2?

    As I get new lists, should I then import them into identified so that they can be accessed by all projects? I assume this is preferred; then any lists that I scrape personally I would feed to a particular project by importing target URLs.

    In the SERLists quick start guide it suggests changing your identified folder to a new blank folder each time you import a list. Is this correct? Do I re-add all the lists each time I do this?
  • goonergooner SERLists.com
    You can do that for all tiers, yes. But tier 1 tends to need much less importing, especially if you are building only 10-30 links per day, for example.

    The guide is based mainly on how to get maximum speed, because everyone is obsessed with LpM. But it does require more work doing it that way.

    I have all projects running from identified as a backup, so when I get a new list I import it first into all the projects and then into the identified folder.

    The only problem is that after a few months your identified folder will be huge and full of sites that are now dead or no longer allow posting, and this will slow SER down.

    So what I do is delete everything in the identified folder and start again from scratch. This is fine for me because I am always building as well as buying new lists. If you don't get many new lists, you might want to clean up your identified list as per our guide instead of deleting it completely. (A rough liveness-check sketch appears after the thread.)
  • Yeah, I probably will be buying all of the lists. LpM doesn't mean much for me since I am mostly tier 1; my first month averaged like 0.07 LpM and it went just fine. I still find myself running through a SERList on my tier 1 really quickly, though; it certainly does not last forever. Let us say that I use your tactic and one of my tier 1 properties runs through all of my lists... do you personally prefer scraping, or do you re-import a list at the project level again?

  • goonergooner SERLists.com
    It won't hurt to re-import the list directly. You'll definitely get more links out of it. I've tested importing a list to a project up to 10 times, and on the 10th time it was still getting new links. But that was using all engines; for contextuals only, I guess it would be worth importing it at least 3 times.
  • Gooner, you have been most helpful, I thank you for sharing your knowledge!
  • goonergooner SERLists.com
    No probs pal, you're welcome :)
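
A file-level sketch of the merge and de-dup step gooner describes early in the thread. This assumes your lists are plain-text files with one URL per line; the folder and file names here are made up for illustration (SER's real identified folder is split into per-engine sitelist files, and SER also has built-in de-dup tools), so treat it as a sketch rather than a drop-in workflow:

```python
import glob
import os

SOURCES = "new_lists/*.txt"        # hypothetical: bought/scraped lists, one URL per line
TARGET = "identified/merged.txt"   # hypothetical: a single merged sitelist file

os.makedirs(os.path.dirname(TARGET), exist_ok=True)

# Start from whatever is already in the target so re-runs stay de-duped.
seen = set()
if os.path.exists(TARGET):
    with open(TARGET, encoding="utf-8") as f:
        seen = {line.strip() for line in f if line.strip()}

added = 0
with open(TARGET, "a", encoding="utf-8") as out:
    for path in glob.glob(SOURCES):
        with open(path, encoding="utf-8") as src:
            for line in src:
                url = line.strip()
                if url and url not in seen:
                    seen.add(url)
                    out.write(url + "\n")
                    added += 1

print(f"added {added} new URLs, {len(seen)} unique in total")
```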
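A quick back-of-envelope on why running from identified beats a one-shot import, per gooner's explanation. The per-attempt success rate p is a made-up number; only the shape of the curve matters:

```python
# If one posting attempt on a URL succeeds with probability p, a direct
# import gives exactly one attempt, while random draws from identified
# retry the same URL over time. p = 0.15 is an assumed illustrative rate.
p = 0.15

for attempts in (1, 3, 5, 10):
    chance = 1 - (1 - p) ** attempts   # chance of at least one verified link
    print(f"{attempts:>2} attempts -> {chance:.0%}")

# Output: 1 -> 15%, 3 -> 39%, 5 -> 56%, 10 -> 80%
```

The same arithmetic backs up the re-import advice further down the thread: each extra pass over a list converts a share of the URLs that failed before, just with diminishing returns.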
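And a rough sketch of cleaning a stale identified list instead of wiping it, for anyone who does not buy lists often. The file name is hypothetical, and a HEAD request only proves a site still answers, not that it still accepts posts, so this is a coarse first-pass filter, not the guide's actual cleanup procedure:

```python
import urllib.request
from concurrent.futures import ThreadPoolExecutor

SITELIST = "identified/merged.txt"  # hypothetical single-file list

def alive(url: str) -> bool:
    """Return True if the site answers a HEAD request without an error status."""
    try:
        req = urllib.request.Request(
            url, method="HEAD", headers={"User-Agent": "Mozilla/5.0"}
        )
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status < 400
    except Exception:
        return False

with open(SITELIST, encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

# Check sites in parallel and keep only the ones that still respond.
with ThreadPoolExecutor(max_workers=20) as pool:
    keep = [u for u, ok in zip(urls, pool.map(alive, urls)) if ok]

with open(SITELIST, "w", encoding="utf-8") as f:
    f.write("\n".join(keep) + "\n")

print(f"kept {len(keep)} of {len(urls)} URLs")
```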