
Sure I'm going for quality backlinks, but LpM < 5 suggests something is wrong...

varthdaver Sydney, Australia
I know that my LpM will be low when I'm skipping domains below PR2, but the following symptoms suggest that something else is wrong:
  • LpM < 5
  • lots of "no targets to post to (maybe blocked by search engines)" warnings
  • lots of "download failed" messages
  • lots of "no engine matches" messages
  • occasional "out of memory" errors (I thought SER handled this automatically now?)
Here's my setup:
  • I have 11 projects, each with 12 search engines enabled
  • I have 200 private proxies pulled from the buyproxies special URL, but these must contain duplicates, because my buyproxies.org subscription is only for 50.
  • Also have between 500 and 1500 public proxies scraped by SER
  • Latest SERLists Red list is loaded into Identified
  • Ryan Lu's list is loaded into Failed
  • "Use URLs from global site lists if enabled" is ticked for "Identified" and "Failed" only
  • "Use URLs linking on same verified URL" is enabled
  • "Always use keywords to find target sites" is disabled
  • "Analyse and post to competitor backlinks" is enabled
  • Previously I had "At least 1 keywords must be present on anywhere" enabled.  I turned it off to see if my LpM would rise, but it didn't help.  I'd really like to turn this back on to improve the backlink quality, once we can fix this low LpM issue.
  • I am skipping < PR2 on domain, unknown PR, and OBL > 30
  • I am skipping non-English, porn, gambling & pharma sites
  • I've analysed the performance of the various engines and disabled those with low success rates
  • I perform the weekly maintenance suggested by SERLists best practice:
    • Remove all duplicate domains
    • Remove all duplicate URLs (a rough sketch of what these two steps do follows this list)
    • Verify then replace emails (10 per project)
    • Clear submitted
    • Delete Target URL Cache
    • The only thing I don't do is untick all target search engines, because I'd like to have at least some keyword-relevant backlinks in the mix!
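
For reference, a minimal Python sketch of what those two de-dup steps amount to, assuming a plain-text site list with one full URL per line (the file name is made up; SER's own tools do the same thing internally):

```python
from urllib.parse import urlparse

def dedup_site_list(path):
    seen_urls, seen_domains = set(), set()
    unique_urls, one_per_domain = [], []
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if not url or url in seen_urls:
                continue                          # "Remove all duplicate URLs"
            seen_urls.add(url)
            unique_urls.append(url)
            domain = urlparse(url).netloc.lower()
            if domain and domain not in seen_domains:
                seen_domains.add(domain)          # "Remove all duplicate domains"
                one_per_domain.append(url)
    return unique_urls, one_per_domain

urls, domains = dedup_site_list("site_list.txt")
print(f"{len(urls)} unique URLs across {len(domains)} unique domains")
```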

Comments

  • varthdaver Sydney, Australia
    Oh dear!  I still really need help with this... :(

    I'm not sure there's much point in buying any more lists if none of it works.
  • gooner SERLists.com
    First I would suggest you get rid of the public proxies, de-dup the private proxies and disable search engines.
    Save/Export/Delete your verified list to start with a fresh one. Then allow projects to post from verified and identified/failed.
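
    (A minimal sketch of that proxy de-dup, assuming one proxy per line in the usual ip:port or ip:port:user:pass format; file names are made up:)

    ```python
    from pathlib import Path

    entries = [p.strip() for p in Path("proxies.txt").read_text().splitlines() if p.strip()]
    unique = list(dict.fromkeys(entries))  # de-dup while preserving order
    Path("proxies_deduped.txt").write_text("\n".join(unique) + "\n")
    print(f"{len(entries)} entries in, {len(unique)} out")  # expect 200 -> 50 here
    ```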

    Test and let us know your results.

    There are a lot of variables in your setup. Try to eliminate as many as possible and start with a blank slate.

    If results are still crappy, post again and I'll try to help :)
  • varthdaver Sydney, Australia
    Thanks Goonsy. Actioning this now. Slightly concerned about deleting the verified list, in case I need to remove some of those links later for detox/disavowal. You've had success re-importing verified lists in such cases?
  • gooner SERLists.com
    edited August 2014
    No probs mate.

    When you remove verified links for detox you do that on a project level... So please please don't delete verified from individual projects.

    But you can delete the global verified list with no worries (you know the one in app data/roaming/gsa ser/verified).

    I would export it and save it first, in case you need it later. Or just create an empty folder like "New Verified" and in global options, point your verified folder to that one. That way you preserve the old one.
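
    (If you want to script that "export and save it first" step, here's a rough sketch; the folder names are guesses based on the description above, so adjust them to your install:)

    ```python
    import os
    import shutil
    from datetime import date

    # Guessed path, following the app data/roaming location mentioned above.
    base = os.path.join(os.environ["APPDATA"], "GSA Search Engine Ranker")
    verified = os.path.join(base, "verified")
    backup = os.path.join(base, "verified_backup_" + date.today().isoformat())

    shutil.copytree(verified, backup)                               # preserve the old list
    os.makedirs(os.path.join(base, "New Verified"), exist_ok=True)  # empty folder to point SER at
    print("Backed up to", backup)
    ```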
  • varthdaver Sydney, Australia
    Holy Shit! Without changing ANYTHING but cutting the proxy list back to my 50 paid proxies & restarting SER, I'm at 33 LpM! Insane! I haven't even loaded today's SERList yet.

    But hang on, why is it slowing down? Five minutes later it's dropped below 15 LpM and still heading south...

    Will continue with your other recommendations, Gooner.
  • gooner SERLists.com
    Have all the projects changed to blue in the status column?

    SER usually goes into verify mode when it restarts.
  • zero
    I have 11 projects, each with 12 search engines enabled

    Make sure you don't have any search engines there that provide bad results - do a manual search for some of the footprints and you'll see what's good and what's not.

    I have 200 private proxies pulled from the buyproxies special URL, but these must contain duplicates, because my buyproxies.org subscription is only for 50.

    Check my reply on your other thread for the answer.

    Also have between 500 and 1500 public proxies scraped by SER

    Don't use them, they're honestly the worst thing you can do speed wise, especially when you're trying to target high quality sites. Use your private proxies for more than searching.

    Latest SERLists Red list is loaded into Identified

    Glad to see you followed their tutorials, really good stuff in there. Make sure to read them word for word, really juicy info.

    Ryan Lu's list is loaded into Failed

    Don't see why this helps, but okay.

    "Use URLs from global site lists if enabled" is ticked for "Identified" and "Failed" only

    Now I see why this helps. Make sure you are not writing to the identified / failed folders... it'll make a huge mess.

    "Use URLs linking on same verified URL" is enabled

    What I believe this does is look for other posts, for example on someone's blog: it follows the arrows pointing to the previous/next post and attempts to post to those. If you're looking for quality, it won't help, because of the duplicate domains. Uncheck.
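
    (If that reading of the option is right, the mechanism would look roughly like this Python sketch; illustrative only, not SER's actual code:)

    ```python
    import re
    import urllib.request

    def neighbour_posts(url):
        """Collect 'previous/next post' style links from one blog post page."""
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        links = []
        for tag in re.findall(r"<a\b[^>]*>", html):
            if re.search(r"rel=[\"'](?:prev|next)[\"']", tag):
                href = re.search(r"href=[\"']([^\"']+)[\"']", tag)
                if href:
                    links.append(href.group(1))
        return links
    ```

    Note that every link found this way lives on the same domain as the verified URL, which is exactly the duplicate-domain objection.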

    "Always use keywords to find target sites" is disabled

    For some engines like trackback and blog comment, this is necessary. Don't know your engines, but maybe try this out if you're using trackback/blog comment/pingback etc.

    "Analyse and post to competitor backlinks" is enabled

    I find this slows stuff down.

    Previously I had "At least 1 keywords must be present on anywhere" enabled.  I turned it off to see if my LpM would rise, but it didn't help.  I'd really like to turn this back on to improve the backlink quality, once we can fix this low LpM issue.

    Depends on your keywords really.

    I am skipping non-English, porn, gambling & pharma sites

    Nice.

    I've analysed the performance of the various engines and disabled those with low success rates

    Low success rates may be caused by the settings above. I'd give it another go after changing some of the stuff I listed (if you disabled blog comments, for example, then you would get lots of failures because you're not using keywords to find new targets and are repeatedly using the same ones).
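
    (The engine analysis itself is just verified over submitted per engine; a toy sketch with invented numbers, since in SER you'd read the counts off the project columns:)

    ```python
    stats = {  # engine: (submitted, verified) - invented numbers
        "Article Beach": (900, 310),
        "Blog Comment": (1200, 40),
        "Wiki": (700, 260),
    }

    for engine, (submitted, verified) in sorted(stats.items()):
        rate = verified / submitted if submitted else 0.0
        flag = "  <- low success rate" if rate < 0.10 else ""
        print(f"{engine:15} {rate:6.1%}{flag}")
    ```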

    I perform the weekly maintenance suggested by SERLists best practice:

    Smart.
  • gooner SERLists.com
    edited August 2014
    @zero is spot on; there were a few very important things I missed in my first reply.

    "Use URLs linking on same verified URL" is enabled

    What I believe this does is look for other posts, for example, on someones blog. It has arrows pointing to the last/next post, and it attempts to post to those. If you're looking for quality, it won't help because of the duplicate domains. Uncheck.

    "Always use keywords to find target sites" is disabled

    For some engines like trackback and blog comment, this is necessary. Don't know your engines, but maybe try this out if you're using trackback/blog comment/pingback etc.

    "Analyse and post to competitor backlinks" is enabled

    I find this slows stuff down.

    I've analysed the performance of the various engines and disabled those with low success rates

    Low success rates may be caused by above settings, I'd give it another go after changing some of the stuff I listed above (if you disabled blog comments, for example, then you would get lots of failed comments because you're not using keywords to find new targets and repeatedly using the same ones.

    I would also add that if you are using lists and not search engines, you may as well check even the worst performing engines. Why not take that link if it is in the list already?
  • varthdaver Sydney, Australia
    Great tips, zero. And you're right about the verify mode, Gooner.  Feeling like an idiot for missing that.

    I'm hesitant to remove all search engines and just use the lists, as I'm concerned that this will reduce the contextual relevance of the backlinks.  What's the best way to ensure good contextual relevance in SER?
  • @varthdaver Import relevant keywords and use keywords to find target sites.
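    (Roughly what that means in practice: SER builds scrape queries by combining engine footprints with your keywords. A small Python sketch with example footprints and keywords:)

    ```python
    from itertools import product

    footprints = ['"powered by wordpress" "leave a comment"', '"add new comment"']
    keywords = ["sydney plumber", "emergency plumbing"]

    for fp, kw in product(footprints, keywords):
        print(f'{fp} "{kw}"')  # one search-engine query per combination
    ```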
  • gooner SERLists.com
    @varthdaver - Your contextual relevance tends to come from posting an article related to your niche with your link in it.

    A lot of article directories, wikis etc. tend to be generically themed, so it's really difficult to find a whole site that relates just to your niche; usually it's a category or sub-category of that site at best.

    I consider the page to be contextually relevant rather than the whole site. That's my approach, but I know others do it differently.
  • varthdaver Sydney, Australia
    I like your point about creating contextual relevance through the article itself, @gooner. I'm going to turn off the search engines and give it a go with just lists.

    I read elsewhere that one should avoid Exploit, Indexer, Pingback & Referrer backlinks, because they're likely to be spammy.  What say you, in light of this discussion?

  • gooner SERLists.com
    Cool, let us know how it goes... I agree with your comments about those engines. They're pretty much useless.
  • varthdaver Sydney, Australia
    Well, @gooner & @zero, I've implemented all your suggestions, but it's still dribbling along at just under 6 LpM.

    I wonder if there's anything you see in the logs that I've missed: http://pastebin.com/9AW1Kye8
  • gooner SERLists.com
    The standout thing is "PR too low". I think that's what's killing your LpM right now mate.
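
    (For future readers: a quick way to see what dominates a log dump like that pastebin is to tally the known messages. The substrings below are the ones mentioned in this thread, and the file name is made up:)

    ```python
    from collections import Counter

    patterns = ["PR too low", "no targets to post to", "download failed",
                "no engine matches"]

    counts = Counter()
    with open("ser_log.txt", encoding="utf-8", errors="ignore") as f:
        for line in f:
            for p in patterns:
                if p in line:
                    counts[p] += 1

    for msg, n in counts.most_common():
        print(f"{n:6d}  {msg}")
    ```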
  • varthdaver Sydney, Australia
    I totally agree that this is slowing down my speed. But who wants low PR backlinks? Aren't they just as useless as Pingbacks etc.? I know PR is getting outdated, but it's still a good measure for older sites, which is where the good link juice is, no? And if the homepage PR is >= 2, then you've at least got a faint hope that there's some link juice left down at the post level.

    I ask this, @gooner, because I have found your perspective and experience to be refreshingly different from mine. I'm very keen to be convinced that PR0 backlinks are still worth building, because I'm certain that SER could churn out tonnes of them! But I'm equally concerned that we'd only have to remove them all again in our next detox.
  • gooner SERLists.com
    Sure, I understand - I use SER in a slightly different way, mainly to spam my tier 1s. I figure all the juice going through those tier 1s will juice my money site too, so I'm not really concerned about PR. Any contextual link on tier 1 is fine for me.

    But I do also use PBN links and web 2.0 links on tier 1 for probably 50% of my sites - those that need the ranking boost.

    But I should say I have cut down a lot on the number of links on tier 1: usually just a few hundred, and then I use most of SER's power to spam them.

    I wouldn't recommend it for an important site though. Most of my work is for clients and I never work on their corporate site. I always build them a site for the purpose.