Don't just use global site lists!
I was just using global site lists, and I was getting 30 LPM on all engines using only the verified global list - terrible, right? (With 150 threads and 10 proxies!)
I went to Options -> Advanced -> Tools -> Export Site Lists -> Verified.
I then imported that verified .sl list into one project with no limits.
If I went any higher thread-wise, my PC would probably shoot off like a fucking NASA rocket to join the Mars rover.
So what I learned from this: global lists suck - verified lists rule.
Comments
From what I understand, when using the verified global site list, the project obviously picks its targets from the verified file, but it does not work its way through the list in a linear fashion - it picks targets randomly. This may contribute to the 'already parsed' messages and the reduced LPM compared to directly importing targets.
When a list is directly imported into a project, that project runs through the entire list from start to finish, so it is more reliable in that respect.
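To picture the difference, here's a toy sketch in Python - this is NOT SER's actual internals, just an illustration of why random picking from a shared list wastes attempts on duplicates while a directly imported list is consumed once, start to finish (target counts and URLs are made up):

```python
import random

# Toy sketch only - not SER's code. Shows why random picking from a
# shared list produces duplicate ("already parsed") hits, while a
# directly imported list is consumed once, start to finish.

targets = [f"http://site{n}.example/page" for n in range(1000)]

# Global-site-list style: each attempt picks a random target, so the same
# URL can come up again and has to be skipped as "already parsed".
already_parsed = set()
wasted = 0
for _ in range(1000):
    url = random.choice(targets)
    if url in already_parsed:
        wasted += 1  # an attempt that produced no new link
    else:
        already_parsed.add(url)

# Directly imported list: one linear pass, every target visited exactly once.
visited = sum(1 for _ in targets)

print(f"random picking: {wasted} of 1000 attempts were duplicates")
print(f"linear pass:    {visited} of {len(targets)} targets, zero duplicates")
```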
Selecting targets from a site list by ticking a checkbox is more automated... I'm someone who uses SER to the maximum, normally running around 200-250 projects on each of my installs with a massive verified site list. When I try to import my verified list directly into over 200 projects, SER simply freezes and crashes due to the sheer number of targets being imported into each and every project.
If you're using SER casually, without so many projects, and you don't mind loading up your targets manually, then yes, directly importing your site list into the project is a lot faster.
Importing targets directly means that SER doesn't need to load more targets between runs, and it also prevents SER from picking the same targets and causing 'already parsed' messages, which in turn slow down your link building.
Matt
Matt, the question wasn't churn and burn - I think you got this thread confused with another.
Why would there be any reason to build a global list of many platforms if you are only interested in the important ones...
You should indeed have a way of constantly building your verified list, but I don't think wasting time gathering up all platforms is a good way of doing things.
All anyone needs to do to process their raw links is this...
Set up a new project and name it 'Processing'. Select the platforms you want to start building a verified list for, then add some content/emails etc., untick all site lists and the global site list box, hit OK and you're done.
Every time you complete a scrape, deduplicate the list, then simply go to your 'Processing' project, right-click, and import the targets. Depending on how fast you want this done, pause other projects while it runs.
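If you'd rather dedupe outside of SER, a minimal sketch like this works - assuming your scrape output is a plain text file with one URL per line (the file names here are just placeholders, and SER also has its own dedupe tool under Options -> Advanced -> Tools):

```python
# Minimal order-preserving dedupe - assumes one URL per line in a plain
# text file; "raw_scrape.txt" and "deduped.txt" are placeholder names.

seen = set()
with open("raw_scrape.txt", encoding="utf-8", errors="ignore") as src, \
     open("deduped.txt", "w", encoding="utf-8") as dst:
    for line in src:
        url = line.strip()
        if url and url not in seen:
            seen.add(url)
            dst.write(url + "\n")

print(f"kept {len(seen)} unique targets")
```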
With one 'Processing' project receiving fresh targets regularly, all other projects can be set to verified lists only.
Tim: high five Matt _o/
For targets that are of value, that would be all of them except exploit, RSS, referrer, pingback and indexer - those are junk in my opinion. All the other targets have their place in structures. There are higher-value targets such as article, wiki, etc.; then there are junk links that are still of value for, say, tier 2 or 3, such as blog comments, guest posts and trackbacks.
So what I do is build those target lists, and then I can decide what goes where within my skeleton depending on what type of project I'm running, e.g. churn and burn or long-term tiered sites. Yes, you are correct in your thinking, and you explained it better than I did the first time.
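Just to make the skeleton idea concrete, here's a hypothetical sketch of it as plain data - the platform groupings are the ones from this post, but the names and structure are my own assumptions, not anything SER exposes:

```python
# Hypothetical sketch of the "skeleton" above. Platform groupings come
# from the post; the names and structure are assumptions, not SER's API.

SKELETON = {
    "tier1": ["article", "wiki"],                          # higher-value targets
    "tier2": ["blog comment", "guest post", "trackback"],  # junk, but fine lower down
    "tier3": ["blog comment", "trackback"],
    "skip":  ["exploit", "rss", "referrer", "pingback", "indexer"],  # no value at all
}

def layout_for(project_type: str) -> dict:
    """Pick what goes where depending on the project style."""
    if project_type == "churn and burn":
        # Short-lived project: throw every usable platform at one tier.
        return {"tier1": SKELETON["tier1"] + SKELETON["tier2"]}
    # Long-term tiered site: keep the full structure, minus the junk.
    return {tier: engines for tier, engines in SKELETON.items() if tier != "skip"}

print(layout_for("churn and burn"))
print(layout_for("long-term tiered"))
```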
Matt