As of now, the best way to use GSA SER is to build custom site lists that aren't spammed to death and to create your own engine files. When those are used and maintained correctly, your rankings will be very hard to beat.
To take it one step further, diversify your SEO efforts. Getting your site to #1 with GSA is certainly possible with the right planning, but like anything else in life it will require concentrated effort and time.
If I wanted to rank dozens of sites and drive more traffic, I would invest some of my earnings into private networks, build my own networks, buy niche-related sites off Flippa, partner with other site owners, build community portals on social network sites, do article marketing, build YouTube channels, etc. The possibilities are endless. At the end of the day it comes down to ROI, time, and money. GSA is currently the best SEO tool on the market in my opinion because of its speed, flexibility, and amazing stability. Success really depends on the user.
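To make the "custom site lists that aren't spammed to death" part concrete, here's a minimal sketch of the sort of clean-up I mean: dedupe a raw scraped URL dump by root domain and drop anything on a blacklist of domains you've already flagged as burned. The file names and blacklist are just placeholders for illustration, not anything SER produces itself; adapt them to whatever your scraper actually outputs.

```python
# Minimal sketch: dedupe a scraped target list by root domain and drop
# domains already flagged as burned out. File names are examples only.
from urllib.parse import urlparse

def load_lines(path):
    with open(path, encoding="utf-8", errors="ignore") as f:
        return [line.strip() for line in f if line.strip()]

def root_domain(url):
    # Very rough: strips "www." and keeps the host part.
    host = urlparse(url if "://" in url else "http://" + url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

raw_urls = load_lines("scraped_targets.txt")        # hypothetical scraper output
blacklist = set(load_lines("burned_domains.txt"))   # domains you consider spammed to death

seen, clean = set(), []
for url in raw_urls:
    domain = root_domain(url)
    if not domain or domain in blacklist or domain in seen:
        continue
    seen.add(domain)
    clean.append(url)

with open("custom_site_list.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(clean))

print(f"{len(clean)} unique, non-blacklisted targets kept out of {len(raw_urls)}")
```

Nothing fancy, but filtering by domain rather than by URL is the point: one spammed-to-death domain with 10,000 scraped pages shouldn't dominate your list.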
Nice info, especially creating your own engine files; I hadn't thought of that one.
See, it's not a bad thing to open yourself up to new ideas :P
I thought I was done with SER, but you have me thinking now of more ways to work it.
But on a side note, just to defend myself on the whole gooner thing: I wouldn't put up with someone on the street talking to me like that, so it's no different here. Anyway, moving on.
@heisenberg - I base my conclusions on testing across hundreds of sites over many years, and my tests show tools like SER work and work well.
You come along and say the opposite, based on what?
If you offer up some evidence for your conclusions, then we have a discussion; if not, what should I say to you?
Not only that, but disavowing the links that you can get from SER is impossible; we are talking millions of sites. So what will G do, disavow all sites that use any of the engines that SER can post to? Not very realistic, is it?
As for defending SER/Sven, well here is a guy that produces software that allows me to replace my entire SEO outsourcing team for no monthly recurring fees. What should I say about that? Thanks is all I've got.
Well, I hardly come along and say it based on thin air now, do I?
Obviously it's based on my own findings: on my sites, my clients' sites, and the hundreds of other sites I've analyzed.
Do I feel the need to prove myself to be worthy of a discussion with you, though? No.
If SER is working for you then good for you, but I've had more than a few sites burned by it in the past, so I wasn't going to pass this thread by without chiming in with my opinion. And I know you have too, since I've seen you mention in the past that you lost your clients' rankings.
But what annoys me is that it's like a bunch of sheep in here whenever someone's site is penalized in a Penguin update: oh, it must have been your anchor text ratio.. oh, don't blame SER.. oh, you're not using it right..
Nobody here ever considers for one minute what the disavow tool is really there for, and I feel sorry for the noobs who are fed this story time and again here, so I thought I would give my opinion this time around rather than see them fed the same parroted reasons they always get...
Did Google make this wonderful disavow tool just for the good of webmasters around the world, so they could disavow their links whenever they had too high an anchor text ratio? Or could it be that it was made with the sole purpose of finding every known spam source and penalizing any site whose backlink profile is made up of xx% of these links?
People here seem to like going on about the hundreds of platforms that SER supports. It's fantastic that Sven has built a tool capable of doing what it does, but for some reason people seem to think that Google wouldn't penalize their site, since "how could they penalize all those platforms?"
It's not about penalizing all those platforms, though; it's about knowing which sites on those platforms are a known source of spam and directly penalizing any page they link to.
It shows how naive you are that your only defense to this suggestion is "So what will G do, disavow all sites that use any of the engines SER can post to?"
No, obviously not; only the ones that are known sources of spam.
So you've figured out that (although there is evidence to suggest that it works for the intended purpose) the disavow tool is mainly an attempt by Google to crowd-source a database of spammy domains/pages to use as part of their spam algo/s?
I agree, but it's hardly a groundbreaking revelation, and certainly not something that most of us don't know (or suspect) already.
However, I am intrigued as to how you have managed to obtain a tool that can filter out and post only to sites that are not on this database, seeing as nobody outside a chosen few at Google is likely to have access to it?
With regards to people coming on here complaining that their sites have been penalised, you and I both know that it is actually far more difficult to get a site slapped than most people imagine. 90% of the time it is something far less sinister, and the remaining 10% is probably down to how the tool has been used, as opposed to what the tool actually does and the sites that it posts to.
That's not to say that *if* you were posting to the same list over and over again with SER that it *couldn't* be the cause of some penalties, but that's not really how the tool is meant to be used, and is not how 99% of people would use it (at least none that I know).
However, this is just speculation, unless you have some data from your penalised sites that isolates all other possible factors for the penalty?
I guess what I'm saying is that, even if certain sites transfer negative link metrics, or are even being taken into account by the spam algo/s, provided you are constantly scraping new target sites (which has always been good practice, and good for rankings), then you are always going to remain one step ahead, especially if you (as already said by @Sweeppicker) add your own custom footprints and engines.
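On the custom footprints point, here's a rough sketch of what that looks like in practice. This is not SER's own engine or footprint format, just an illustrative Python snippet showing the general idea: combine your own platform footprints with niche keywords to generate scraping queries, so you're constantly feeding in fresh targets instead of hammering the same list. The footprints and keywords are placeholders.

```python
# Sketch only: combine custom footprints with niche keywords to generate
# scraping queries for fresh targets. Footprints/keywords are placeholders,
# not anything pulled from SER itself.
from itertools import product

footprints = [
    '"powered by wordpress" "leave a comment"',
    'inurl:guestbook "add comment"',
]
keywords = ["dog training", "puppy obedience"]

queries = [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

with open("scrape_queries.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(queries))

print(f"Generated {len(queries)} queries to feed into your scraper")
```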
I would, however, concede that as time goes on and the disavow index grows, the effectiveness of SER will probably get less and less. But the fact still remains that if you know what you are doing, you can still use it (sometimes in conjunction with other tools/methods) to easily rank pages, and as @gooner says, if there comes a time when it doesn't work at all, then we will just find something else that does to add to our arsenal (excuse the pun lol).
@heisenberg - You don't need to feel worthy of having a conversation with me, but if you come to this forum and slate a tool that a lot of people use to make a lot of money, of course those people are going to question your motives. You use a tool that doesn't post to disavowed links? Sounds like the beginning of a sales pitch!
As for being penalised, 3 of my sites out of more than 200 that use SER have been penalised - all 3 had huge amounts of links (built by outsourcers, not SER) and a high anchor text %... For me it's pretty obvious what the issue was.
Especially when you consider that I duplicated those sites, re-wrote the content to around 60% uniqueness, and all of them were on page 1 within 2 months using only SER for links. So far they have stayed there for 2-3 months and counting.
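If anyone wants to check whether they're in that high anchor text % danger zone, it's trivial to measure from a backlink export. A rough sketch, assuming a CSV with an "anchor" column (the file name and column are placeholders; adjust to whatever your link tool actually exports):

```python
# Rough sketch: anchor text distribution from a backlink export.
# "backlinks_export.csv" and the "anchor" column are assumptions.
import csv
from collections import Counter

counts = Counter()
with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchor = (row.get("anchor") or "").strip().lower()
        if anchor:
            counts[anchor] += 1

total = sum(counts.values())
for anchor, n in counts.most_common(10):
    print(f"{anchor!r}: {n} links ({n / total:.1%})")
```

If one or two money keywords dominate that list, that's the kind of profile I'm talking about.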
@sweeppicker and @2take2 (and many others on this forum) have given the noobs, as you call them, plenty of good advice about how to take their SEO one step further than SER. As for me, I use a PBN, and I have mentioned this numerous times and given advice on how to set it up too.
I feel sorry for noobs that read posts like yours, and posts on so many other forums, kidding them that the big G is so much smarter than it actually is at detecting link spam, ranking manipulation and all that good stuff. But I shouldn't complain, because information like that keeps competition lower and allows me to clean up, providing better results at half the price of traditional SEO companies. So please keep spreading the gospel.
Guys, the disavow tool works on an individual basis. It works like an isolated robots.txt file where you no-index certain links, only in this case it works on external links pointing to your domain. Whether this is going to change in the future I can't tell, but I know that they're not using the disavow files as a "database of bad links" yet.
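For anyone who hasn't actually used it: the disavow file itself is just a plain text upload, one URL or "domain:" entry per line, with "#" lines as comments. Here's a throwaway sketch for building one from a list of domains you want to cut loose (the input file name is made up for illustration):

```python
# Throwaway sketch: build a disavow.txt from a list of domains.
# Format: one URL or "domain:example.com" entry per line, "#" = comment.
# "domains_to_disavow.txt" is a hypothetical input file.
with open("domains_to_disavow.txt", encoding="utf-8") as f:
    domains = sorted({line.strip().lower() for line in f if line.strip()})

lines = ["# Domains I no longer want counted against this site"]
lines += [f"domain:{d}" for d in domains]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print(f"Wrote {len(domains)} domain entries to disavow.txt")
```

The point being: it's a per-site, per-upload file, which is exactly why it behaves like that isolated robots.txt analogy rather than some global switch.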
Now, what I can tell you is that most of the ranking drops you are experiencing have nothing to do with links. I know this might sound a bit radical, but most of the problems my clients have are caused by Panda, not Penguin.
The reason I'm telling you this is that most of the "black hatters" out there simply have no clue how wide the scope of Panda really is. It's not just about keyword stuffing; it's about measurable user metrics, and believe me, Google can measure a lot of stuff (they own Chrome, they invest billions into Firefox, they know how the user behaves when he lands on your pages... you connect the dots), and user metrics are a part of Panda. If your site sucks for the user, you might rank it for a while, heck, even a year, because the linking signals are strong, but eventually it'll go down because of Panda, not Penguin.
It doesn't have anything to do with GSA SER or the Penguin, guys. Heck, I can fix a Penguin penalty in 2 days simply by doing a tweak on my site. It has worked for me 100% of the time, but only because it's a glitch on Google's side, and I won't share it publicly on the forum (nothing to do with 301s). For those who want to know it, simply PM me, but don't even bother if you're not a reputable member here (I'm talking about the ranks of Ron, LeeG, etc.).
I also believe that disavow is not a bad-links database; it is paranoid to think like that...
Let's say the majority of links created by SER are in the disavow database. In that case, if I want to rank, I just need to run SER on my competitors' websites, and I will rank with regular on-page SEO and a couple of "natural links", because I will have told Google to penalize my competitors through negative SEO. That follows from @heisenberg's statement: "it is about knowing about any sites on those platforms that are a known source of spam and directly penalizing any page they link to".
I disavowed some links today that a client's 'marketing guy' made. They weren't on a bad site, and they weren't toxic or bad-neighbourhood links, but they weren't helping in any way, so I got rid of them. I will gain links from that same site, but in the way I want, not the way they were made by his marketing guy.
What this thread seems to miss is that there are plenty of us out there who use GSA SER fully planning and expecting to get banned - as if there were a 'safe' way to use it! But until the hammer falls you can still have fun. Sometimes it comes quickly; sometimes it doesn't come at all.
OK, I'm Chinese, and I think GSA SER is the best SEO tool in the world right now! Anyway, I will continue using GSA SER for my SEO work unless I find a new one better than SER. That's all I wanted to say!