@seoaddict As you mention, I believe the problem is with email verification. Before, out of 1000 emails I would get a minimum of 100 verified links; now I get practically none. The only contextuals I am getting verified are the ones that don't require email verification.
Are you guys getting any verified links for contextuals during email verification?
I think people need to be more precise in what they are saying and comparing against. Everybody here has a different set-up. Most people use SER to scrape, and some use lists to get their results.
As you already know, I'm involved with lists, so that is my approach. Even with lists, I have noticed issues: an overall lower number of verifieds, submitteds that never verify, and things like that. On the other hand, I'm still getting very nice diversity with contextuals.
Below is a project where I have it set to 100 contextuals per day:
You can see that for the past 7 days I have been dancing right around the 100 mark, which is my limit for the day. And you can see a wide diversity of engines creating the links.
I guess the point I am trying to make is that you need to throw more screenshots up if anyone (especially @Sven) is going to be able to narrow down the problem.
I already know that the list thing is a big deal when it comes to link production and avoiding other issues - that is why I headed in that direction. But regardless of where you get the links from, you guys have to throw up more screenshots, whether you are having SER scrape or using a list, and then make an attempt to isolate the variables.
For example, I have 20 test projects all with emails and everything, and they point to a total bullshit URL that doesn't exist. So if you want to test mediawiki as an example, only check that engine, give it a list of only mediawiki, use new emails, and let it run.
Just to show you the test projects I have setup so you don't think I am pulling your leg, here's a quick screenshot:
Listen to me: Everybody is afraid to send in backup projects to Sven because it has their sacred website info there. You 100% avoid that issue if you set up dummy URLs and dummy test projects. Then you can actually use a test project to test one engine, or test another engine. Just set up the project properly as if it were for real. Stick in new emails, make sure the target URL cache is empty, stick in spins for the articles, etc. And remember, once you set up one test project, you can clone it. Just make sure you have new emails in each.
The key is that you can then back up this dummy project and send Sven the details with one click - and say "Here Sven, this is a backup showing you a MediaWiki problem" or whatever you think the issue is.
If you do this, @Sven will be able to help. If you don't, then you are not really helping that much.
I think everybody should set up test projects, and actually isolate the engines where they think there is a problem. Then back it up and send it to Sven. Then he can help.
@MorphMan - I set up whatever I want to test. But let's say I wanted to test Mediawiki.
I would set up a brand new project from scratch, new emails, new spins, and then set the project settings exactly the way I would want them as if it were a real project. So the settings in the project are all set exactly the way I would want them for a contextual project, etc.
Then I would create a new blank folder and call it mediawiki. I would then map that folder to use the Failed folder in Main Options.
Then I would grab my verified list, copy the MediaWiki file, and paste it into that new folder that has the Failed folder pointed to it (leave Failed unchecked in Main Options).
Then I would set the new test project to use the "failed" sitelist in the options tab within that test project.
Then let it run to completion. And then you have something to analyze.
Final note: I would dedupe your verified list first at both the domain and URL levels - before copying and pasting the mediawiki file into your new folder. It will save you a ton of "already parsed" messages.
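If you want to do that dedupe outside of SER, a rough Python sketch like this would do it - the file names are just placeholders for whichever sitelist file you are testing:

```python
# Minimal dedupe sketch: keeps one URL per exact URL and one per domain.
# File names are placeholders - point them at your own sitelist export.
from urllib.parse import urlparse

INPUT_FILE = "sitelist_Article-MediaWiki.txt"
OUTPUT_FILE = "sitelist_Article-MediaWiki_deduped.txt"

seen_urls, seen_domains, kept, total = set(), set(), [], 0

with open(INPUT_FILE, encoding="utf-8", errors="ignore") as f:
    for line in f:
        url = line.strip()
        if not url:
            continue
        total += 1
        domain = urlparse(url).netloc.lower()
        # Skip exact duplicate URLs and anything on a domain we already kept.
        if url in seen_urls or domain in seen_domains:
            continue
        seen_urls.add(url)
        seen_domains.add(domain)
        kept.append(url)

with open(OUTPUT_FILE, "w", encoding="utf-8") as f:
    f.write("\n".join(kept) + "\n")

print(f"Kept {len(kept)} of {total} URLs after URL + domain dedupe")
```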
Cheers @ron, I'll see if I can get something set up to do some troubleshooting as well as something to give to Sven. There is an issue, there is no doubting that; we just need to get to the bottom of it.
@MorphMan - Thanks for doing that. I think if everyone does a little testing, we will collectively solve these problems. I'm not trying to get Sven off the hook or anything, but honestly, this software has gotten very big. If everyone divides and conquers, everything will get better much more quickly.
I forgot to add one thing because I am so involved in lists. Let's assume you use SER to scrape for targets. Then you skip the whole failed folder thing and copying the mediawiki list part.
Instead, you just set the search engines to your normal set-up and let it rip on just mediawiki following the above example. Then you can actually see what's happening from that angle as well.
OK, after a whole night of submissions to contextual targets I can say something is wrong with automatic verification. I disabled it entirely in the projects (about 30 projects) and let them submit for a night, and guess what? (: I switched half an hour ago to verify only and I got thousands of verified links! (: It is still verifying. Just disable auto verification, let the projects run for a longer time, and then switch to verify; you will see a huge difference (:
@thomas73 Yes, I think you may be onto something here. I just ran a brand new project set to NOT verify automatically. I ran that for a few hours, stopped the project and then started it as verify only (Active (V)), and I have more verified links in these few hours than I've had in over a week with identical projects set to verify automatically.
@thomas73 @MorphMan
Did you guys read the entire thread? I already pointed this out earlier and Sven asked me to send him one of my projects. Unfortunately he was not able to find any problems.
@Robby54 Yes, I should have mentioned that you had found this before; I've been with the thread since the beginning. If Sven didn't find anything, that kind of rules it out (would have been nice to let us all know), but for sure it's producing way more verified links this way.
@Robby54, I just confirmed what you discovered, nothing else. I think it's important to reproduce the same results in more than one example, right? (:
It is strange that Sven doesn't see any problems when we all see them: not 1, not 2, not 3, but all of us noticed a huge decrease in verified links. And as you discovered, disabling the auto-verify mechanism brings us back to the earlier results, so I think there is something wrong and it should be fixed to work the way it did before.
Wow, this is soul destroying. I've run my projects for the past 24 hours and have just changed to verify only. Out of 127 projects and tens of thousands submitted, I currently have 23 links. It's been verifying for the past half hour. The logs are green and finding the links, just not adding them to my verified column. This is the worst it's been in my two years of use.
Starting to agree with all of you. Between all the crashing, memory errors, verification issues, update issues, and bugs, SER is no longer an automated program. I literally have to sit around and babysit it all the time. Might be time for me to finally start looking for other solutions.
Seriously, I have nothing from you but "I have sooo low LpM"... no one ever comes up with proof where a list of URLs worked before but doesn't work now. Nothing. I keep an eye on everything even though I don't see any issues with email verification or certain engines posting.
The only problem someone figured out on their own was that their emails got blocked (Hotmail/Outlook) and so never produced any good links.
And please stop this "GSA is dead" talk; nothing is dead, especially not GSA, as that is the company. GSA SER, however, works as before for many (including myself). Don't get me wrong, I'm here to help, and I agree that if many have seen a low LpM, maybe there is something wrong, but you have to give me something to play with. Not some "Yes, low LpM for me as well". Come up with some URLs, engines or anything...
And don't be shy to share the project backup. I have zero interest in spying on your niche. I just don't care.
LpM doesn't mean shit. I never got over 1-2 Lpm on contextuals anyways. Sven, I don't see how you can't see this problem. I mean come on. Set up a contextual project, no filters, import a list. Watch GSA take a shit and produce 30 links max. Don't use wikis, XE and other bullshit, use the main ones - Drupal, Joomla, vBulletin, the DoFollow engines. There's the problem. Simple to re-create.
Do you see all these people complaining about the same massive problem, yet you say there is no problem? Do you think we're just making this shit up? I'll send you a backup or whatever you want. This is the absolute worst problem I've ever seen with GSA and it should be taken extremely seriously.
@seoaddict - Did Joomla and vBulletin use to be good? They've always been shitty for me in the 8 months I've been using SER. My master verified list has less than 1000 Joomla and less than 1000 vBulletin.
Drupal on the other hand is working ok for me at least.
I only use DoFollow contextual CMS. That's literally the only reason I use GSA. Other people that are still getting verified links may be using Wikis, blog comments, trackbacks, and shitty engines like XE... but the engines I listed above were the ones most affected, which is why I am feeling this the worst.
@seoaddict, you of course remembered to make sure that you could manually post to these 70 URLs and that they haven't modified their engine, right? Targets die all the time, and the verified URLs you get today will not all work tomorrow.
Also, I want to applaud the guy who did a test with an old version of SER against the latest on 1000 URLs. However, that didn't prove to me any decrease in SER's ability to create and verify links from back then to now. A lot of different things could have happened while going through the list, which would have made the data unreliable. Had you run the same test 50 times, I believe you would have ended up with much more reliable data. Using a list of hundreds of thousands of URLs would also be a great way to get better results, in my opinion. It'd be interesting to see a comparison of the engines you managed to post and verify links on between the two projects, though.
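For anyone who wants to do that engine-by-engine comparison, a rough Python sketch like the one below would get you started - it assumes both verified folders use SER's usual per-engine sitelist_*.txt files, and the folder paths are just placeholders:

```python
# Count verified URLs per engine in two folders and print them side by side.
# Assumes one sitelist_*.txt file per engine - adjust paths/layout to your setup.
import os
from collections import Counter

OLD_RUN = r"C:\verified_old_version"   # placeholder path
NEW_RUN = r"C:\verified_new_version"   # placeholder path

def count_per_engine(folder):
    counts = Counter()
    for name in os.listdir(folder):
        if not name.lower().endswith(".txt"):
            continue
        engine = name.replace("sitelist_", "").rsplit(".", 1)[0]
        path = os.path.join(folder, name)
        with open(path, encoding="utf-8", errors="ignore") as f:
            counts[engine] = sum(1 for line in f if line.strip())
    return counts

old, new = count_per_engine(OLD_RUN), count_per_engine(NEW_RUN)
for engine in sorted(set(old) | set(new)):
    print(f"{engine}: old={old.get(engine, 0)}, new={new.get(engine, 0)}")
```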
I must sound like a broken record, but are you currently getting verified links when SER checks emails? I remember the days when SER would go check emails and I would get a ton of verified links. Now, practically nothing.
I wish I could repeat this process showing verified links during email checking... but I have no access to an older version.
Sorry, but @seoaddict, 1000s of sites shut down each and every day "overnight".
I've been following this thread, and as a long-time SER user, I've got to say how disappointed I am and how I can't believe how ignorant some of you are being toward @sven.
Literally, think about what we did / used before SER.
Have a think about what version SER was at when you bought it and what it is now. Tell me about another piece of software that's been updated as much.
If you don't think @sven is trying his best to sort our issues then find something else that does what SER does...
@seoaddict, that was rather rude, but I understand your response as my initial reply could have been interpreted the same yet I didn't mean it to. You simply made me get the impression that your 70 urls were weeks if not months old, because I've gotten the impression from this thread that people have been seeing a problem for a while now.
I agree that there is a problem since a mate of yours is posting successfully to these urls. I would've asked into both your set ups, but it'd make more sense to just wait and see what Sven has to say. I appreciate the fact that you're providing Sven something to work with.
Look @seoaddict, send me over 5 of those URLs and I will try to manually register on them. It's worth it to be able to eliminate registration being closed as a possible cause.
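If it helps, you can also do a crude automated pass before registering by hand - something like this Python sketch, which just hits the usual default registration paths for a few engines and looks for obvious "registration closed" wording. The target list and paths are placeholders/defaults, and a 200 response is only a hint, not proof that posting works:

```python
# Crude check of whether registration still looks open on a few targets.
# Treat the output as a hint only - a page can return 200 and still be closed.
import requests

TARGETS = ["http://example.com"]  # placeholder - drop your 5 URLs in here

REGISTER_PATHS = [
    "/user/register",                                  # Drupal default
    "/index.php?option=com_users&view=registration",   # Joomla default
    "/index.php?title=Special:CreateAccount",          # MediaWiki default
]

CLOSED_HINTS = [
    "registration is closed",
    "registration disabled",
    "not allowed to register",
    "account creation has been disabled",
]

for site in TARGETS:
    for path in REGISTER_PATHS:
        url = site.rstrip("/") + path
        try:
            r = requests.get(url, timeout=15)
        except requests.RequestException as e:
            print(f"{url} -> connection problem: {e}")
            continue
        body = r.text.lower()
        closed = any(hint in body for hint in CLOSED_HINTS)
        print(f"{url} -> HTTP {r.status_code}, closed-registration wording: {closed}")
```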
WordPress and Drupal verification is my issue. Sending a backup of 90 test projects to Sven now. There are 3.7 million remaining target URLs in those projects... Should help debug.
I'm not sure if this may be causing the issues? Again, I'm just throwing it out there to see what you guys say.
I have gone through all of my emails on all projects and this is the message I get. Now, if SER cannot verify the emails, then that would be one reason for the big decrease in verifications, right?
The email accounts are all setup correctly:
All accounts are POP3 enabled
All accounts are verified (captcha unlocked)
SPAM Filter is disabled
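One thing that might also be worth doing is taking SER out of the equation and checking that the accounts actually accept a POP3 login - a few lines of Python will do it. The host, port, and credentials below are placeholders, so swap in your provider's real POP3 settings:

```python
# Quick POP3 sanity check for the project emails, independent of SER.
# Host/port/credentials are placeholders - use your provider's real settings.
import poplib

ACCOUNTS = [
    ("pop3.example.com", 995, "you@example.com", "password"),
]

for host, port, user, password in ACCOUNTS:
    try:
        conn = poplib.POP3_SSL(host, port, timeout=30)
        conn.user(user)
        conn.pass_(password)
        msg_count, mailbox_size = conn.stat()
        print(f"{user}: login OK, {msg_count} messages waiting")
        conn.quit()
    except Exception as e:
        print(f"{user}: POP3 check failed -> {e}")
```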
A lot of you guys in this thread are longtime users of SER, is this normal or shall I go away and hide under a rock?
In order to try and help with this, @gooner and I decided to run a test project last night, using only previously verified contextual URLs from another server. I have sent the project backup to Sven to take a look at.
Same problem here... I cannot get a single verified with Articles & Social Networks, except Xpression articles. I have a BIG list of Articles and Social Networks that SER was able to post to not long ago. It seems like SER cannot get a single verified on this list anymore.
I checked the email account (Hotmail) and the confirmation/activation emails are there (hundreds of them). I don't know why I can't see any links coming into the verified list.
I hope Sven will be able to take a look at it and fix it.
I actually just found this thread after having to shut down my Fiverr gig... Back on 7.xx I was building links like it was my job (it was, lol). Ever since I updated to 8.xx, I can't get verified links to build anywhere near the way they used to. I use great dedicated proxies from buyproxies.org and they work fine in WAC and Scrapebox. I even upgraded my internet speed to 120/40 to see if that was the issue, but I still have the same problem. I use my own lists and have for over a year now with GSA, and this is the first time I've had any issues. Hopefully it gets figured out, because at this point I am losing money, so I may have to look into another program or something.
If I can help in any way, or send you any info, please let me know. I love GSA and I'm comfortable with it; I don't want to have to pick up UD and relearn a new program.
All I need is a URL where SER failed to post (that probably worked in the past) but where you can submit manually using a browser. That's all I ask for.
But it's 10:30 pm here, so it's bedtime for me... I'll try to debug tomorrow.
I applaud your attempt to discredit this claim.
Amazing how common sense is not so common anymore.
100+ sites do not shut down overnight. This guy I know, Charlie, never updated from 7.66, and guess what? He can post on all the same URLs as always!
There's a problem. I'm not manually registering on sites, but if his GSA can post to them, they should work.
Do you see any difference?
My LpM on contextuals has now increased to over 70!
It's incredible, Sven has done his job again (:, thanks Sven.
I switched each project back to auto verify. I don't know what was wrong before, but I see a huge difference now; it works.