I think people need to be more precise in what they are saying and comparing against. Everybody here has a different set-up. Most people use SER to scrape, and some use lists to get their results.
As you already know, I'm involved with lists, so that is my approach. Even with lists, I have noticed issues with a lower overall number of verified links, submitted links that never verify, and things like that. On the other hand, I'm still getting very nice diversity with contextuals.
Below is a project where I have it set to 100 contextuals per day:
You can see that for the past 7 days I have been dancing right around the 100 mark, which is my limit for the day. And you can see a wide diversity of engines creating the links.
I guess the point I am trying to make is that you need to throw more screenshots up if anyone (especially @Sven) is going to be able to narrow down the problem.
I already know that the list thing is a big deal when it comes to link production and avoiding other issues - that is why I headed in that direction. But regardless of where you get the links from, you guys have to throw up more screenshots, whether you are having SER scrape or using a list, and then make an attempt to isolate the variables.
For example, I have 20 test projects all with emails and everything, and they point to a total bullshit URL that doesn't exist. So if you want to test mediawiki as an example, only check that engine, give it a list of only mediawiki, use new emails, and let it run.
Just to show you the test projects I have setup so you don't think I am pulling your leg, here's a quick screenshot:
Listen to me: Everybody is afraid to send in backup projects to Sven because they have their sacred website info in there. You 100% avoid that issue if you set up dummy URLs and dummy test projects. Then you can actually use a test project to test one engine, or test another engine. Just set up the project properly as if it were for real. Stick in new emails, make sure the target URL cache is empty, stick in spins for the articles, etc. And remember, once you set up one test project, you can clone it. Just make sure you have new emails in each.
The key is then you can send Sven the details with one click to backup this dummy project - and say "Here Sven is a backup showing you a MediaWiki problem" or whatever you think the issue is.
If you do this, @Sven will be able to help. If you don't, then you are not really helping that much.
I think everybody should set up test projects, and actually isolate the engines where they think there is a problem. Then back it up and send it to Sven. Then he can help.
@MorphMan - I set up whatever I want to test. But let's say I wanted to test Mediawiki.
I would set up a brand new project from scratch, with new emails and new spins, and then set the project settings exactly the way I would want them as if it were a real project. So the settings in the project are all set exactly the way I would want them for a contextual project, etc.
Then I would create a new blank folder and call it mediawiki. In Main Options, I would point the Failed folder at that new folder (but leave Failed unchecked so SER doesn't write to it).
Then I would grab my verified list, copy the MediaWiki file, and paste it into that new folder.
Then I would set the new test project to use the "failed" sitelist in the options tab within that test project.
Then let it run to completion. And then you have something to analyze.
Final note: I would dedupe your verified list first at both the domain and URL levels - before copying and pasting the mediawiki file into your new folder. It will save you a ton of "already parsed" messages.
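SER can do the dedupe for you from its own options, but if you'd rather script it outside SER, here is a minimal Python sketch (the URLs below are made-up examples) that dedupes a list at the URL level and then keeps only one URL per domain:

```python
from urllib.parse import urlparse

def dedupe(urls):
    """Drop exact duplicate URLs, then keep only the first URL per domain."""
    seen_urls, seen_domains, out = set(), set(), []
    for url in urls:
        u = url.strip()
        if not u or u in seen_urls:
            continue  # exact duplicate URL
        seen_urls.add(u)
        domain = urlparse(u).netloc.lower()
        if domain in seen_domains:
            continue  # duplicate domain
        seen_domains.add(domain)
        out.append(u)
    return out

urls = [
    "http://wiki.example.com/index.php?title=Page1",
    "http://wiki.example.com/index.php?title=Page1",   # dupe URL
    "http://wiki.example.com/index.php?title=Page2",   # dupe domain
    "http://other-wiki.example.org/wiki/Main_Page",
]
print(dedupe(urls))
```

For contextual engines one URL per domain is usually all you need anyway, since you register one account per site.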
I forgot to add one thing because I am so involved in lists. Let's assume you use SER to scrape for targets. Then you skip the whole Failed folder setup and the part about copying the mediawiki file.
Instead, you just set the search engines to your normal set-up and let it rip on just mediawiki following the above example. Then you can actually see what's happening from that angle as well.
Seriously, I have nothing from you but "I have sooo low LpM"... no one ever comes up with proof that a list of URLs worked before but doesn't work now. Nothing. I keep an eye on everything, even though I don't see any issues with email verification or certain engines posting.
The only problem someone figured out on his own was that his emails got blocked (Hotmail/Outlook) and so never produced any good links.
And please stop this "GSA is dead" talk. Nothing is dead, especially not GSA, as that is the company. GSA SER, however, works as before for many (including myself). Don't get me wrong, I'm here to help, and I agree that if many have seen a low LpM, maybe there is something wrong, but you have to give me something to play with. Not some "Yes, low LpM for me as well". Come up with some URLs, engines, or anything...
And don't be shy to share the project backup. I have zero interest in spying on your niche. I just don't care.