
My GSA only builds 90-95% nofollow links? I bought a verified list for $50

Hi guys, I am not sure why this is happening. I bought a list from Serlists, followed their recommended steps, and since then my LPM is only 5-8. I am not sure what I did or what I checked off, but now I am building barely any links. Please help, I am extremely confused.

Would anybody mind looking at my GSA?


  • magically
    You are not alone!
    There are 5 of us in total facing the same issues and seeing the same patterns; please review this thread here:

  • Why don't you check out our lists? We have good-quality lists.
  • Post screenshots of your settings here, or PM me your Skype and I'll ping you when I have free time.
  • It's because all the link sellers here are selling crap. They are providing you with useless nofollow blog comments, manual-approval blog comments, indexers, pingbacks, trackbacks, exploits, and other junk. It's not a SER issue, it's the sellers.
  • Why not scrape your own list yourself using Scrapebox?
  • Jezza (Perth, Australia)
    Yeah, don't worry, me too - we probably bought the same list. I got one from Serlists and it ran like crap; barely any of the links sold to me were actually verified. Waste of $50 X(
  • Tim89
    edited December 2014
    These lists that people sell aren't quality, lol. I actually went to a couple of those sellers' websites, checked their link stats, and burst out laughing when I saw:

    Blog Comments - 60,000+
    Articles - 2,550

    I mean, come on... a list of 200,000 targets with the stats above is pretty funny. What is this, a Scrapebox blog-comment auto-approve (AA) list? Lol.
  • Wow, you only needed $7 to buy Scrapebox and scrape a list by yourself. I really don't see the point of buying a list. Of course, there are people scraping high-quality lists, but if 100+ people are using the same one...
  • You need to search for links yourself; I get 20%-30% dofollow.
  • Use the search PR filter: nothing lower than PR 3, nothing higher than PR 10, so you can still pick up PR 9 links. You'll get them.
  • 2Take2 (UK)
    edited January 2015
    Pgak - Apologies for missing this, and sorry to hear that you weren't getting the results you wanted from our lists. I don't really like to talk about LPM too much as it can be affected by pretty much every internal setting, and a ton of external factors as well. However, I would expect you to be getting a much higher LPM than what you previously stated, and if you want to post screenshots of your settings either here or to our support email, we'd be happy to take a look.

    dariobl - I respect your opinion, and I agree that the lists contain a lot of (what some people would consider) junk links, but there's also a ton of contextuals as well.

    We could just sell lists that were contextual only, but there wouldn't be any more URLs than there are now. So I guess if people want to use the 'junk' targets they can, and if they don't, well, then just don't add them to your projects. Personally, I use both, but for different parts of my strategy.

    The Stats are clearly displayed on our site....

    Jezza - Sorry to hear that you were disappointed with your purchase, but I can assure you that every URL the list(s) contain have been previously verified by SER. Whether they get verified on your rig is obviously another matter though, and could be affected by many different factors. We're happy to offer any help and / or advice though via our support, so please feel free to drop by if you want.

    Unkown717 - If you know how and have the resources, then I agree that you should scrape your own lists, with the below guide being a great starting point;

    However, if you're doing large volumes of links per day it's unlikely that you will be able to keep SER fed without buying in additional lists as well.

    The other thing is, do you really want to spend all your time scraping and processing urls, or would you rather just pay your money, load up your list, and never worry about it again? I guess it depends on how you value your time.

    Even if we weren't selling lists I'd probably be buying them from respected forum members like @trevor_bandura, and @donaldbeck instead anyway. I'd probably also be signed up with @loopline 's service as well, but then again, I'm a bit of a fiend for fresh sites to spam. lol

    Also, even though our lists are sold to 100 people, I personally use them myself and have no problems ranking.

  • magically
    edited January 2015

    "but I can assure you that every URL the list(s) contain have been previously verified by SER..."

    I would say you are so full of shit!

    How on earth can these targets have been verified in GSA SER when there are links to PDFs?


    If those links indeed ALL have been verified by GSA SER - How come the list contains shit like that?


    That wouldn't be possible - unless you are lying in order to preserve your reputation, right?

    Well... I'm looking forward to getting an explanation of those PDF links and how you got them verified :D

    Are you using a secret engine? 
  • ron
    edited January 2015
    @magically - All I can tell you (as I have been here since Day 1) is that SER scripting was very broken. It was providing links as verified that didn't even have a place to post. As in blogs that had no comment pages, etc.

    You keep wanting to blame the list guys. You have undoubtedly heard of Garbage In, Garbage Out. Every single list had this issue, including people that made their own from just using SER scraping targets.

    It's all fixed now with v9.46. I won't say that with 100% absolute authority, but it appears the bad stuff is behind all of us.

    You have to remember that if you use SER to process any list of scraped targets or just using it to get targets on the fly, if the front-end processing (SER) has an issue, that means the final results will have an issue.

    You either get it or you don't. I'm certainly not going to argue about it.  

    p.s. Just to add a few other examples... 1) targets getting constantly misclassified into the wrong engine, 2) taking 1,000 freshly verified links, rerunning them, and only getting 60 links on the second run, etc. There were so many things wrong it was like a meteor shower. This was the worst spell I have witnessed since I have been here. The last 2 months have been hell, so I feel your frustration.
  • magically

    I believe we have talked about this before as well. Of course errors happen, hence the final result can't be 100% accurate.

    However - wouldn't it make sense to use a second tool to verify the links, and remove such garbage?

    People expect quality - and a list should at least be free of such useless junk.

    That's just my 5 cents ;)
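    The second-pass cleanup being asked for here can be approximated with a few lines of scripting. A minimal sketch, assuming the list is just one URL per line; the extensions treated as junk are my assumption for illustration, not an established SER rule:

```python
# Hypothetical second-pass filter: drop URLs that point straight at
# document files (e.g. PDFs) before loading a purchased list into SER.
JUNK_EXTENSIONS = (".pdf", ".doc", ".docx", ".xls", ".ppt")

def clean_list(urls):
    """Keep only URLs whose path does not end in a junk extension."""
    kept = []
    for url in urls:
        # Strip the query string and fragment before checking the extension
        path = url.split("?", 1)[0].split("#", 1)[0].lower()
        if not path.endswith(JUNK_EXTENSIONS):
            kept.append(url)
    return kept

sample = [
    "http://example.com/blog/post-1/",
    "http://example.com/files/whitepaper.pdf",
    "http://example.com/article?id=9",
]
print(clean_list(sample))  # the .pdf entry is dropped
```

    This only catches one class of junk, of course; as ron points out below in the thread, PDFs were just one of many problems with verified lists at the time.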
  • ron
    @magically - I hear you, but the issue was having a list of 100,000 or 200,000 links. How in the world can you check all of that? Sure, you can remove PDFs, but there were 99 other problems.

    But here was the kicker...SER wasn't processing things properly on both ends - meaning that not only was it providing false targets (or targets in the wrong engine bucket) on the front-end processing of the scrapes -  but also - it wasn't even processing legit verified links properly either. It just got totally messed up. When you are in that kind of situation, there really is no recourse. We were all in the same boat.

    So much happened in a quick period of time. A lot of engines suddenly started changing login and other procedures needed to make a link. Others started using recaptcha just to battle the folks in this forum. It was like everything went bad at once. The forum (and users) grew to a very large number over the past 18 months, so the amount of websites getting slammed with spam grew exponentially. So the retaliation by the engines and platforms came quickly.

    Again, if both ends are screwed up - meaning the processing of the links, as well as the actual creation of the links - then that is the perfect storm. And that was the first time users experienced this type of scenario.

    The good news is that things are totally back to normal with v9.46. I am personally afraid to change out of it and update. So the best piece of advice I can give you is to backup all SER files on the C: drive for v9.46. There are two folders that SER writes to on the C: drive. Backup both of them. Of course, if you deploy these folders to start over, make sure you don't overwrite your project folders as those keep your current links built to date.

    And it is ok to get upset because we were all upset. Unfortunately with this business model (providing lists), the first people to get hammered and accused are the list providers. It isn't fair, but I wanted to respond to you because I didn't want you thinking we weren't trying to do anything about it.

    Ultimately, we all have to work with Sven to help him narrow down the issues. So if you see something weird going on, try to shoot him a PM and maybe give him a project backup so he can reproduce the same result - that's how we get problems fixed around here.

    Anyway, have a prosperous New Year and go make some money :)

  • magically

    I totally agree here on many of your observations!

    Actually, I just reverted back to 9.46... 9.47 messes up big time again, leading to almost no verified links (at least on my end).

    I think I will study the scripting of those engines; as a Java programmer, it should be possible to make some optimizations on my own. (I think I'm becoming too lazy) :D

    Anyway, I have nothing against those lists (Neither personal issues @2Take2) - In fact we are all in the same boat as you say;)

    It's a free choice if you are a list-guy or a scrape-dude - I too wish everyone good luck.

    On my end, I'm actually not going directly after the gold (money) - just trying to keep up with the rest of you guys in terms of rankings.

    Sven can get all the help and information he needs - I did suggest some kind of teamwork before on this forum. 

    Sharing is caring - and helping out others is crucial. We should all contribute to this process much more than we currently do. Unfortunately, there are lone wolves everywhere :P
  • The lists I'm using get 60-70% dofollow links.
  • Guys, there are more nofollow sites than dofollow ones because they're easy to verify and usually have no security - which is exactly why those easy nofollow links are near-worthless backlinks.

    So the best way is to select more dofollow engines.

    From the Edit Project window, in the platform section, right-click and uncheck nofollow; a popup appears asking about sites with both do- and nofollow links - click No. That leaves you with about 80% dofollow and 20% nofollow, which is just about right for Google to see your backlinks as real rather than spam.
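
    The do/nofollow ratios quoted throughout this thread can be measured by counting rel="nofollow" on each page's anchor tags. A rough stdlib-only sketch; fetching the live pages is out of scope here, so it takes raw HTML as input:

```python
# Hypothetical dofollow-ratio check for a verified list. Anchors without
# rel="nofollow" are counted as dofollow; this ignores meta-robots
# nofollow and other edge cases, so treat the numbers as a rough gauge.
from html.parser import HTMLParser

class RelCounter(HTMLParser):
    """Count <a> tags with and without rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.dofollow = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        rel = dict(attrs).get("rel") or ""
        if "nofollow" in rel.lower():
            self.nofollow += 1
        else:
            self.dofollow += 1

def dofollow_ratio(html):
    """Fraction of anchors on the page without rel="nofollow"."""
    parser = RelCounter()
    parser.feed(html)
    total = parser.dofollow + parser.nofollow
    return parser.dofollow / total if total else 0.0

page = '<a href="/x" rel="nofollow">spam</a><a href="/y">real</a>'
print(dofollow_ratio(page))  # 0.5
```

    Running this over a sample of pages from a purchased list would give an independent estimate of the 20-30% vs 60-70% dofollow figures claimed above.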