@seoaddict As you mention, I believe the problem is with email verification. Before, out of 1000 emails I would get a minimum of 100 verified links; now I get practically none. The only contextuals I am getting verified are the ones that don't require email verification.
Are you guys getting any verified links for contextuals that require email verification?
I think people need to be more precise in what they are saying and comparing against. Everybody here has a different set-up. Most people use SER to scrape, and some use lists to get their results.
As you already know, I'm involved with lists, so that is my approach. Even with lists, I have noticed issues: a lower overall number of verified links, submitted links that never verify, and things like that. On the other hand, I'm still getting very nice diversity with contextuals.
Below is a project where I have it set to 100 contextuals per day:
You can see that for the past 7 days I have been dancing right around the 100 mark which is my limit for the day. And you can see a wide diversity of engines creating the links.
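If you want to quantify that diversity yourself instead of eyeballing a screenshot, a quick script over the verified site-list folder will do it. Below is a minimal sketch, assuming SER's usual layout of one sitelist_<Engine>.txt file per engine with one URL per line; the folder path is just a placeholder you would adjust.

```python
# Count verified URLs per engine by tallying non-empty lines in each site-list file.
# Assumes one "sitelist_<Engine>.txt" per engine, one URL per line (path is a placeholder).
from pathlib import Path

VERIFIED_DIR = Path(r"C:\GSA\site_lists\verified")  # point this at your verified folder

counts = {}
for f in sorted(VERIFIED_DIR.glob("sitelist_*.txt")):
    engine = f.stem.replace("sitelist_", "")
    with f.open(encoding="utf-8", errors="ignore") as fh:
        counts[engine] = sum(1 for line in fh if line.strip())

# Print engines from most to fewest verified URLs to see how diverse the list is.
for engine, n in sorted(counts.items(), key=lambda item: -item[1]):
    print(f"{engine:40} {n}")
```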
I guess the point I am trying to make is that you need to throw more screen shots up if anyone (especially @Sven) is going to be able to narrow down the problem.
I already know that the list thing is a big deal when it comes to link production and avoiding other issues - that is why I headed in that direction. But regardless of where you get the links from, you guys have to throw up more screen shots, whether you are having SER scrape or using a list, and then make an attempt to isolate the variables.
For example, I have 20 test projects all with emails and everything, and they point to a total bullshit URL that doesn't exist. So if you want to test mediawiki as an example, only check that engine, give it a list of only mediawiki, use new emails, and let it run.
Just to show you the test projects I have set up so you don't think I am pulling your leg, here's a quick screenshot:
Listen to me: Everybody is afraid to send in backup projects to Sven because they have their sacred website info in there. You 100% avoid that issue if you set up dummy URLs and dummy test projects. Then you can actually use a test project to test one engine, or test another engine. Just set up the project properly as if it were for real. Stick in new emails, make sure the target URL cache is empty, stick in spins for the articles, etc. And remember, once you set up one test project, you can clone it. Just make sure you have new emails in each.
The key is then you can send Sven the details with one click to backup this dummy project - and say "Here Sven is a backup showing you a MediaWiki problem" or whatever you think the issue is.
If you do this, @Sven will be able to help. If you don't, then you are not really helping that much.
I think everybody should set up test projects, and actually isolate the engines where they think there is a problem. Then back it up and send it to Sven. Then he can help.
@MorphMan - I set up whatever I want to test. But let's say I wanted to test Mediawiki.
I would set up a brand new project from scratch, with new emails and new spins, and then set the project settings exactly the way I would want them, as if it were a real project. So the settings in the project are all set exactly the way I would want them for a contextual project, etc.
Then I would create a new blank folder and call it mediawiki. I would then point the Failed folder in Main Options at that new folder.
Then I would grab my verified list, copy the MediaWiki file, and paste it into that new folder that the Failed folder now points to (leave Failed unchecked in Main Options).
Then I would set the new test project to use the "failed" sitelist in the options tab within that test project.
Then let it run to completion. And then you have something to analyze.
Final note: I would dedupe your verified list first at both the domain and URL levels - before copying and pasting the mediawiki file into your new folder. It will save you a ton of "already parsed" messages.
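SER can dedupe its site lists from its own options, but if you would rather do it outside the program before copying the file over, here is a minimal sketch of what I mean by deduping at both levels. It assumes a plain text file with one URL per line; the file names are only examples.

```python
# Dedupe a site-list file at the URL level and at the domain level.
# Assumes one URL per line; input/output file names below are examples only.
from urllib.parse import urlparse

def dedupe(in_path, url_out, domain_out):
    seen_urls, seen_domains = set(), set()
    url_lines, domain_lines = [], []
    with open(in_path, encoding="utf-8", errors="ignore") as fh:
        for line in fh:
            url = line.strip()
            if not url:
                continue
            domain = urlparse(url).netloc.lower()
            if url not in seen_urls:          # URL-level dedupe
                seen_urls.add(url)
                url_lines.append(url)
            if domain and domain not in seen_domains:  # domain-level dedupe
                seen_domains.add(domain)
                domain_lines.append(url)      # keep the first URL seen per domain
    with open(url_out, "w", encoding="utf-8") as fh:
        fh.write("\n".join(url_lines) + "\n")
    with open(domain_out, "w", encoding="utf-8") as fh:
        fh.write("\n".join(domain_lines) + "\n")

dedupe("mediawiki.txt", "mediawiki_dedup_url.txt", "mediawiki_dedup_domain.txt")
```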
Cheers @ron, I'll see if I can get something set up to do some troubleshooting, as well as something to give to Sven. There is an issue, there is no doubting that; we just need to get to the bottom of it.
@MorphMan - Thanks for doing that. I think if everyone does a little testing, we will collectively solve these problems. I'm not trying to get Sven off the hook or anything, but honestly, this software has gotten very big. If everyone divides and conquers, everything will get better much more quickly.
I forgot to add one thing because I am so involved in lists. Let's assume you use SER to scrape for targets. Then you skip the whole failed folder thing and copying the mediawiki list part.
Instead, you just set the search engines to your normal set-up and let it rip on just mediawiki following the above example. Then you can actually see what's happening from that angle as well.
OK, after a whole night of submissions to contextual targets, I can say there is something wrong with automatic verification. I disabled it entirely in my projects (about 30 projects) and let them submit overnight, and guess what? (: I switched to verify only half an hour ago and I got thousands of verified links! (: It is still verifying. Just disable auto verification, let the projects run for a longer time, and then switch to verify; you will see a huge difference (:
@thomas73 Yes, I think you may be onto something here. I just ran a brand new project set to NOT verify automatically. I ran that for a few hours, stopped the project, and then started it as verify only (Active (V)), and I have more verified links in these few hours than I've had in over a week with identical projects set to verify automatically.
@thomas73 @MorphMan
Did you guys read the entire thread? I already pointed this out earlier, and Sven asked me to send him one of my projects. Unfortunately, he was not able to find any problems.
@Robby54 Yes, I should have mentioned that you had found this before; I've been with the thread since the beginning. Sven not finding anything kind of rules it out (would have been nice to let us all know), but for sure it's producing way more verified links this way.
@Robby54, I just confirmed what you discovered, nothing else. I think it's important to reproduce the same results in more than one example, right? (:
It is strange that Sven doesn't see any problems when we all see them. Not 1, not 2, not 3, but all of us noticed a huge decrease in verified links, and as you discovered, disabling the auto-verify mechanism brings us back to the earlier results. So I think there is something wrong, and it should be fixed to work the way it did before.
Wow, this is soul destroying. I've run my projects for the past 24 hours and have just changed to verify only. Out of 127 projects and tens of thousands submitted, I currently have 23 links. It's been verifying for the past half hour. The log is green and finding the links, just not adding them to my verified column. This is the worst it's been in my two years of use.
Starting to agree with all of you. Between all the crashing, memory errors, verification issues, update issues, and bugs, SER is no longer an automated program. I literally have to sit around and babysit it all the time. Might be time for me to finally start looking for other solutions.
Seriously, I have nothing from you but "I have sooo much less LpM"... no one ever comes up with proof where a list of URLs worked before but doesn't work now. Nothing. I keep an eye on everything, even though I don't see any issues with email verification or certain engines posting.
The only problem someone figured out on their own was that his emails got blocked (Hotmail/Outlook) and so never produced any good links.
And please stop this "GSA is dead" talk; nothing is dead, especially not GSA, as that is the company. GSA SER, however, works as before for many (including myself). Don't get me wrong, I'm here to help, and I agree that if many have seen a low LpM, maybe there is something wrong, but you have to give me something to play with. Not just "Yes, low LpM for me as well." Come up with some URLs, engines, or anything...
And don't be shy to share the project backup. I have zero interest in spying on your niche. I just don't care.
LpM doesn't mean shit. I never got over 1-2 LpM on contextuals anyway. Sven, I don't see how you can't see this problem. I mean, come on. Set up a contextual project, no filters, import a list. Watch GSA take a shit and produce 30 links max. Don't use wikis, XE, and other bullshit; use the main ones: Drupal, Joomla, vBulletin, the dofollow engines. There's the problem. Simple to re-create.
Do you see all these people complaining about the same massive problem, yet you say there is no problem? You think we're just making this shit up? I'll send you a backup or whatever you want. This is the absolute worst problem I've ever seen in GSA, and it should be taken extremely seriously.
@seoaddict - Did Joomla and vBulletin used to be good? They've always been shitty for me in the 8 months I've been using SER. My master verified list has fewer than 1000 Joomla and fewer than 1000 vBulletin entries.
Drupal on the other hand is working ok for me at least.