Sure I'm going for quality backlinks, but LpM < 5 suggests something is wrong...
varthdaver
Sydney, Australia
in Need Help
I know that my LpM will be low when I'm skipping domains below PR2, but the following symptoms suggest that something else is wrong:
- LpM < 5
- lots of "no targets to post to (maybe blocked by search engines)" warnings
- lots of "download failed" messages
- lots of "no engine matches" messages
- occasional "out of memory" errors (I thought SER handled this automatically now?)
- I have 11 projects, each with 12 search engines enabled
- I have 200 private proxies pulled from the buyproxies special URL, but these must be duplicates because my buyproxies.org subscription is only for 50.
- Also have between 500 and 1500 public proxies scraped by SER
- Latest SERLists Red list is loaded into Identified
- Ryan Lu's list is loaded into Failed
- "Use URLs from global site lists if enabled" is ticked for "Identified" and "Failed" only
- "Use URLs linking on same verified URL" is enabled
- "Always use keywords to find target sites" is disabled
- "Analyse and post to competitor backlinks" is enabled
- Previously I had "At least 1 keywords must be present on anywhere" enabled. I turned it off to see if my LpM would rise, but it didn't help. I'd really like to turn this back on to improve the backlink quality, once we can fix this low LpM issue.
- I am skipping < PR2 on domain, unknown PR, and OBL > 30
- I am skipping non-English, porn, gambling & pharma sites
- I've analysed the performance of the various engines and disabled those with low success rates
- I perform the weekly maintenance suggested by SERLists best practice:
- Remove all duplicate domains (see the dedup sketch after this list for roughly what this step does)
- Remove all duplicate URLs
- Verify then replace emails (10 per project)
- Clear submitted
- Delete Target URL Cache
- The only thing I don't do is untick all target search engines, because I'd like to have at least some keyword-relevant backlinks in the mix!
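For reference, here's roughly what the dedup step amounts to outside of SER, assuming the site list is exported as a plain text file with one URL per line. The file name is just a placeholder; SER does this internally, this is only to show the idea:
```
# Rough outside-SER equivalent of "remove all duplicate domains", assuming the
# site list has been exported as a plain text file with one URL per line.
# "identified_sitelist.txt" is a placeholder name.
from urllib.parse import urlparse

with open("identified_sitelist.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

seen_domains, kept = set(), []
for url in urls:
    domain = urlparse(url).netloc.lower()
    if domain not in seen_domains:
        seen_domains.add(domain)
        kept.append(url)

print(f"{len(urls)} URLs kept down to {len(kept)} unique domains")
# "Remove duplicate URLs" is the same idea with the full URL as the key.
```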
Comments
I'm not sure there's much point in buying any more lists if none of it works.
Save/Export/Delete your verified list to start with a fresh one. Then allow projects to post from verified and identified/failed.
Test and let us know your results.
There are a lot of variables in your setup. Try to eliminate as many as possible and start with a blank slate.
If results are still crappy, post again and I'll try to help.
When you remove verified links for a detox, you do that at the project level... so please, please don't delete the verified links from individual projects.
But you can delete the global verified list with no worries (you know the one in app data/roaming/gsa ser/verified).
I would export it and save it first, in case you need it later. Or just create an empty folder like "New Verified" and in global options, point your verified folder to that one. That way you preserve the old one.
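If you'd rather script that than click through it, something like this does the job. The path below is only a rough guess based on the folder mentioned above, so adjust it to wherever your install actually keeps the verified list:
```
# Sketch of "export and save it first": snapshot the old verified folder, then
# create an empty "New Verified" folder to point global options at.
# The path is a guess based on the location mentioned above -- adjust as needed.
import os
import shutil
from datetime import date

verified = os.path.expandvars(r"%APPDATA%\GSA Search Engine Ranker\verified")
backup   = verified + "_backup_" + date.today().isoformat()
fresh    = os.path.join(os.path.dirname(verified), "New Verified")

shutil.copytree(verified, backup)   # keep a copy in case you need it later
os.makedirs(fresh, exist_ok=True)   # fresh, empty folder for new verifieds
print("backup saved to:", backup)
print("point your verified folder at:", fresh)
```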
But hang on, why is it slowing down? 5 minutes later it's dropped below 15 LpM and still heading south...
Will continue with your other recommendations, Gooner.
SER usually goes into verify mode when it restarts.
Make sure you don't have any search engines there that provide bad results - do a manual search for some of the footprints and you'll see what's good and what's not.
I have 200 private proxies pulled from the buyproxies special URL, but these must be duplicates because my buyproxies.org subscription is only for 50.
Check my reply on your other thread for the answer.
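Short version though: if you want to confirm it yourself, dump the proxy list to a text file (one host:port per line; the file name below is a placeholder) and count the unique entries:
```
# Quick check on whether those 200 entries are really only 50 unique proxies.
# Assumes the list is a plain text file with one host:port per line;
# "proxies.txt" is a placeholder name.
with open("proxies.txt") as f:
    proxies = [line.strip() for line in f if line.strip()]

unique = sorted(set(proxies))
print(f"{len(proxies)} entries, {len(unique)} unique")
```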
Also have between 500 and 1500 public proxies scraped by SER
Don't use them; they're honestly the worst thing you can do speed-wise, especially when you're trying to target high-quality sites. Use your private proxies for more than just searching.
Latest SERLists Red list is loaded into Identified
Glad to see you followed their tutorials, really good stuff in there. Make sure to read them word for word, really juicy info.
Ryan Lu's list is loaded into Failed
Don't see why this helps, but okay.
"Use URLs from global site lists if enabled" is ticked for "Identified" and "Failed" only
Now I see why this helps. Make sure you are not writing to the identified / failed folders... it'll make a huge mess.
"Use URLs linking on same verified URL" is enabled
What I believe this does is look for other posts on, say, someone's blog: the page has arrows pointing to the previous/next post, and SER attempts to post to those as well. If you're looking for quality, it won't help because of the duplicate domains. Uncheck it.
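To illustrate what I think it's doing (this is a guess at the behaviour, not SER's actual code, and the URL is only a placeholder for a real verified post URL):
```
# Guess at the behaviour: take a page you already have a verified link on and
# collect the other links pointing back to the same domain (previous/next post,
# related posts) as extra targets. The URL is only a placeholder.
import re
import urllib.request
from urllib.parse import urljoin, urlparse

def same_domain_links(verified_url):
    html = urllib.request.urlopen(verified_url, timeout=15).read().decode("utf-8", "ignore")
    host = urlparse(verified_url).netloc
    targets = set()
    for href in re.findall(r'href="([^"#]+)"', html, re.IGNORECASE):
        absolute = urljoin(verified_url, href)
        if urlparse(absolute).netloc == host and absolute != verified_url:
            targets.add(absolute)
    return sorted(targets)

for url in same_domain_links("http://example.com/"):
    print(url)
```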
"Always use keywords to find target sites" is disabled
For some engines like trackback and blog comment, this is necessary. Don't know your engines, but maybe try this out if you're using trackback/blog comment/pingback etc.
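To show why keywords matter for those engines: as I understand it, SER pairs each engine footprint with your keywords when it builds search queries, so leaving keywords off means running the same generic footprint searches over and over. The footprints and keywords below are made-up examples, not SER's real lists:
```
# Made-up footprints and keywords, purely to show the query-building idea.
footprints = ['"leave a comment"', '"powered by wordpress"']
keywords   = ["garden sheds", "timber decking"]

queries = [f'{footprint} "{keyword}"' for footprint in footprints for keyword in keywords]
for query in queries:
    print(query)
```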
"Analyse and post to competitor backlinks" is enabled
I find this slows stuff down.
Previously I had "At least 1 keywords must be present on anywhere" enabled. I turned it off to see if my LpM would rise, but it didn't help. I'd really like to turn this back on to improve the backlink quality, once we can fix this low LpM issue.
Depends on your keywords really.
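For what it's worth, that option is basically a relevance gate on the downloaded page, something like this (keywords and sample text are made up):
```
# Roughly what the option checks: does any project keyword appear anywhere in
# the downloaded page? Broad keywords let almost everything through; narrow
# ones cut LpM but raise relevance.
def passes_keyword_filter(page_text, keywords):
    text = page_text.lower()
    return any(keyword.lower() in text for keyword in keywords)

print(passes_keyword_filter("Ideas for garden sheds and timber decking", ["garden sheds"]))  # True
print(passes_keyword_filter("Generic article directory front page", ["garden sheds"]))       # False
```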
I am skipping non-English, porn, gambling & pharma sites
Nice.
I've analysed the performance of the various engines and disabled those with low success rates
Low success rates may be caused by the settings above; I'd give those engines another go after changing some of the stuff I listed. (If you disabled blog comments, for example, you'd see lots of failed submissions because you're not using keywords to find new targets, so you keep hitting the same ones.)
I perform the weekly maintenance suggested by SERLists best practice:
Smart.
"Use URLs linking on same verified URL" is enabled
What I believe this does is look for other posts, for example, on someones blog. It has arrows pointing to the last/next post, and it attempts to post to those. If you're looking for quality, it won't help because of the duplicate domains. Uncheck.
"Always use keywords to find target sites" is disabled
For some engines like trackback and blog comment, this is necessary. Don't know your engines, but maybe try this out if you're using trackback/blog comment/pingback etc.
"Analyse and post to competitor backlinks" is enabled
I find this slows stuff down.
I've analysed the performance of the various engines and disabled those with low success rates
Low success rates may be caused by above settings, I'd give it another go after changing some of the stuff I listed above (if you disabled blog comments, for example, then you would get lots of failed comments because you're not using keywords to find new targets and repeatedly using the same ones.
I would also add that if you are using lists and not search engines, you may as well check even the worst performing engines. Why not take that link if it is in the list already?
I'm hesitant to remove all search engines and just use the lists, as I'm concerned that this will reduce the contextual relevance of the backlinks. What's the best way to ensure good contextual relevance in SER?
A lot of article directories, wikis etc. tend to be generically themed, so it's really difficult to find a whole site that relates just to your niche; usually it's a category or sub-category of that site at best.
I consider the page to be contextually relevant rather than the whole site. That's my approach, but I know others do it differently.
I read elsewhere that one should avoid Exploit, Indexer, Pingback & Referrer backlinks, because they're likely to be spammy. What say you, in light of this discussion?
I wonder if there's anything you see in the logs that I've missed: http://pastebin.com/9AW1Kye8
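(For anyone reading along, a quick way to tally those warnings from a saved copy of the log looks roughly like this; the phrases mirror the messages I listed at the top plus a rough proxy catch-all, and the file name is a placeholder.)
```
# Tally the warning types in a saved SER log dump.
from collections import Counter

phrases = [
    "no targets to post to",
    "download failed",
    "no engine matches",
    "out of memory",
    "proxy",
]

counts = Counter()
with open("ser_log.txt", encoding="utf-8", errors="ignore") as f:
    for line in f:
        lowered = line.lower()
        for phrase in phrases:
            if phrase in lowered:
                counts[phrase] += 1

for phrase, n in counts.most_common():
    print(f"{n:6d}  {phrase}")
```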
I ask about those link types, @gooner, because I have found your perspective and experience to be refreshingly different to mine. I'm very keen to be convinced that PR0 backlinks are still worth building, because I'm certain that SER could churn out tonnes of them! But I'm equally concerned that we'd only have to remove them all again in our next detox.
But I do also use PBN links and web 2.0 links on my tier 1 for probably 50% of my sites. Those that need the ranking boost.
But I should say I have cut down a lot on the number of links on tier 1. Usually just a few hundred, and then I use most of SER's power to spam them.
I wouldn't recommend it for an important site though. Most of my work is for clients and I never work on their corporate site. I always build them a site for the purpose.