
Lowest LPM - A Quick Look

Hi, I'm new to the software and have been running a few campaigns for about a week now. They're T2s and T3s; T1 was done in UD. The screenshots below are from about 12 hours since the stat reset. Maybe someone could give me some insight into my settings, because the LPM is, well, the worst, obviously. All comments that are useful to me are rewarded with the finest hunnies on the planet stopping by your house with smoke and beer, courtesy of yours truly. Thanks.

[Screenshot: LPM]
[Screenshot: Settings 1]
[Screenshot: Settings 2]
[Screenshot: Settings 3]
[Screenshot: Settings 4]
[Screenshot: Settings 5]
[Screenshot: Tier 2 - Top Half]
[Screenshot: Tier 2 - Bottom Half]
[Screenshot: Tier 3 - Top Half]
[Screenshot: Tier 3 - Bottom Half]


Comments

  • edited July 2013
Mark all proxies as private. Increase the thread count. Share a longer log file or screenshot. Disable the filter in global settings, just to give it a try. Set "when to verify" to a longer time or never.
Thanks @Dunce. All I see is the blacklist in global filters (all those shown in the graphic) and "submit backlink to blog search engines" in global indexing (all defaults selected). Where do I find the filter to disable? Thanks for the advice; this LPM has to go up.
In the image named Settings 4, uncheck the "skip submission if the URL ..." option. I'm not sure whether unchecking this will improve your LPM or bring bad results for your rankings, though; it's just a guess.
Had the same problem. I removed the OBL filter and the PR filter, and then it shot up to 5 LPM. An OBL limit of 70 should be OK, since most sites my links were posted to had fewer than 70 OBL. PR can be a problem, but in your case I don't see that set.

The next thing I would recommend is adding more keywords. GSA will either use global lists or scrape for targets. Either you import links into GSA or give it a lot of keywords and search engines to scrape with. I did both and hit 20-25 LPM the next day.

I used Scrapebox to scrape article footprints combined with a 100K keyword file (Google it and you will find one). I now have 500K non-duplicate URLs in my identified list.

For GSA scraping, I gave it a lot of keywords (about 1K), selected "find and use keywords from target sites," and selected all search engines.
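    A minimal sketch of that footprint-plus-keyword prep step, assuming a plain-text footprints file and the 100K keyword list mentioned above (the filenames and the 3,000-keyword cap are placeholders taken from the thread, not anything GSA or Scrapebox ships with):

    ```python
    # Build "footprint + keyword" search queries for Scrapebox (or any scraper).
    # Assumes two plain-text files, one footprint or keyword per line.
    from itertools import islice, product

    FOOTPRINTS_FILE = "article_footprints.txt"   # hypothetical export of article engine footprints
    KEYWORDS_FILE = "keywords_100k.txt"          # the 100K keyword list mentioned in the thread
    OUTPUT_FILE = "queries.txt"
    MAX_KEYWORDS = 3000                          # the poster said he had only run ~3,000 keywords so far

    def read_lines(path, limit=None):
        with open(path, encoding="utf-8", errors="ignore") as fh:
            lines = (line.strip() for line in fh)
            lines = (line for line in lines if line)
            return list(islice(lines, limit))

    footprints = read_lines(FOOTPRINTS_FILE)
    keywords = read_lines(KEYWORDS_FILE, MAX_KEYWORDS)

    with open(OUTPUT_FILE, "w", encoding="utf-8") as out:
        for footprint, keyword in product(footprints, keywords):
            out.write(f'{footprint} "{keyword}"\n')

    print(f"wrote {len(footprints) * len(keywords)} queries to {OUTPUT_FILE}")
    ```

    The resulting queries.txt can then be fed to whatever scraper you use; the quoting of the keyword is just one common convention, not a requirement.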
  • Tier 2 - Top Half: check the "must have exact url" option.

Apart from that, the best-known "low LPM" issue is that your proxies get banned fast on Google SEs these days (= no new target URLs). Add more proxies and/or different SE providers like Yahoo, Ask, or AOL, for instance. Yahoo, for example, doesn't seem to care about SE queries that much and won't ban you that fast. You can try to just use a collection of Yahoo SEs for a while with a lower SE query time like 20 seconds (or even less) to see if it's making a difference.
Thanks @Ozz. BTW: I have 50 semi-private proxies from Buyproxies right now.
LeeG

    Untick use identified list

    Set captcha retries to 0

Lower the number of search engines used

Don't ping the blog search engines, since you're already using another indexer

Kill the engines that give no submissions. You have complete engine sets selected, and a lot of those won't be supported by Captcha Sniper

Check your log to see if your proxies are banned. As you are using semi-private proxies, you don't know what other users will use them for.
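    A quick way to spot banned or dead proxies outside of SER's own checker is to fire a test search through each one and look for error responses. A minimal sketch, assuming a plain "ip:port" proxy file and the Python requests library; the test URL and the ban heuristic here are assumptions for illustration, not how SER itself tests:

    ```python
    # Rough proxy health check: send a test query through each proxy and flag
    # anything that errors out or gets an HTTP 403/429 back (a common sign of a ban).
    import requests

    PROXY_FILE = "proxies.txt"        # one "ip:port" per line (add user:pass@ if needed)
    TEST_URL = "https://www.google.com/search?q=test"
    TIMEOUT = 10

    with open(PROXY_FILE, encoding="utf-8") as fh:
        proxies = [line.strip() for line in fh if line.strip()]

    for proxy in proxies:
        proxy_url = f"http://{proxy}"
        try:
            resp = requests.get(
                TEST_URL,
                proxies={"http": proxy_url, "https": proxy_url},
                timeout=TIMEOUT,
            )
            if resp.status_code in (403, 429) or "unusual traffic" in resp.text.lower():
                print(f"{proxy}  BANNED? (HTTP {resp.status_code})")
            else:
                print(f"{proxy}  OK (HTTP {resp.status_code})")
        except requests.RequestException as exc:
            print(f"{proxy}  DEAD ({exc.__class__.__name__})")
    ```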
  • edited July 2013
@nitinsy, for one Tier 2 campaign I have going, for example, I used Kontent Machine and it threw in 102 keywords, it looks like. So you're saying go scrape like 1,000 and import? And I don't completely understand the next thing you said: "I used Scrapebox to scrape article footprints combined with a 100K keyword file (Google it and you will find one). I now have 500K non-duplicate URLs in my identified list." I appreciate your help. I have Scrapebox, BTW, so that's good.
@Charles_27, I got the article engine footprints from the GSA folder (check this forum, there are threads on how to do this). Then I used Scrapebox to scrape these footprints + keywords (taken from the 100K file). I have only done the first 3,000 keywords as of now. This is an ongoing process for me.

If you are using GSA to scrape, then the more keywords you have, the better. I took the 100K file and set it in the keywords field (for T2) using the spinfile macro.
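    On the "500K non-duplicate URLs" part: the dedupe step is the main thing Scrapebox or SER does before importing. A minimal standalone sketch, with hypothetical filenames, that removes duplicate URLs (and optionally keeps only one URL per domain, which is stricter):

    ```python
    # Deduplicate a list of scraped URLs before importing them into SER.
    from urllib.parse import urlparse

    INPUT_FILE = "scraped_urls.txt"     # placeholder filename
    OUTPUT_FILE = "deduped_urls.txt"
    DEDUPE_BY_DOMAIN = False            # True keeps only one URL per domain

    seen = set()
    kept = []

    with open(INPUT_FILE, encoding="utf-8", errors="ignore") as fh:
        for line in fh:
            url = line.strip()
            if not url:
                continue
            key = urlparse(url).netloc.lower() if DEDUPE_BY_DOMAIN else url.lower()
            if key and key not in seen:
                seen.add(key)
                kept.append(url)

    with open(OUTPUT_FILE, "w", encoding="utf-8") as out:
        out.write("\n".join(kept))

    print(f"kept {len(kept)} unique URLs")
    ```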
I was using newipnow for proxies, which didn't work well. Then I moved to shared Buyproxies, which I use currently (30 shared proxies).
LeeG

If you're using the Indexer in the engines you're targeting, you need to untick "verified links must have an exact URL".

@Ozz - Hey, I checked the proxies with both SB and GSA's "check proxy" feature and half are dead: 25/50. I searched but can't find where the GSA log is. I know that sounds dumb, but I can't. I right-clicked over the scrolling URLs on the main GSA screen and selected "log to file" to see if that was it, but I don't see many messages about the proxies being banned there. I could just skip using semi-private and get dedicated next time, but I thought 50 semi-private for GSA and 10 private for Ultimate Demon were going to be enough.
@LeeG hi, since this morning I've tweaked the settings so I no longer use either "submit backlink URLs to blog search engines" or indexing services at all, both set in the global indexing section. I read from @ron to do those externally. I have an Indexification account, so I'll try to do that manually. But I thought that for Tier 3 pointing to Tier 2, I had to select "verified links must have an exact URL".
LeeG

Using Indexification won't hurt.

But when I looked at the engine types you had selected, i.e. Web 2.0 etc., you had Indexer selected just above Microblog.

I had a chat with Sven on here in the past about getting those links to work, and it's that one tick box that will affect building those types of links.

     

  • @LeeG. Ok, I'm going to check that out. Thanks again for your help.
@LeeG I see what you are saying. So I unselected the Indexer in the tasks (also global and campaign) and I took off the pinger there too. I figured if I'm going to be entering all links in Indexification, that should take care of that too, I hope.
Update: Got up to 34 LPM from less than 1. This was in just 10 hours of using the tips here and on another thread on this forum. I still have to index manually, but it's way, way better now. Thanks to all who helped. :D
ron SERLists.com
    edited August 2013

@Charles_27 - "I've tweaked the settings so I no longer use either "submit backlink URLs to blog search engines" or indexing services at all, both set in the global indexing section. I read from @ron to do those externally. I have an Indexification account, so I'll try to do that manually. But I thought that for Tier 3 pointing to Tier 2, I had to select "verified links must have an exact URL"."

    I NEVER said that! I responded to your misstatement here: https://forum.gsa-online.de/discussion/comment/36917/#Comment_36917

Hi ron, I am a newbie to GSA and want to know about disabling the site lists used by GSA. I also want to know the best settings to use for Tier 3 links. Please update me soon.

    Thank you
AlexR Cape Town
    @LeeG - I didn't quite follow why you said 
    "If your using the indexer in the engines your targeting, you need to untick "verified links must have an exact URL"

    1) Can you explain this a little more?
    2) You mentioned something about this impacting some engines. Which ones get impacted? 

LeeG

Here you go, Alex. Sven explained the settings when I reported a bug that was not a bug:

    https://forum.gsa-online.de/discussion/4501/bug-sitelist-indexer-whois-or-statistics

2) No idea what you're going on about.

AlexR Cape Town
    @LeeG - Thanks
2) You said something about deselecting "verified links must have an exact URL" when selecting Web 2.0s. Why is this?
LeeG
    edited August 2013

No I never did, Alex. I said to deselect that option so you can use an engine in SER, which is covered in the above link.

    That's the only mention I made about that option, when I pointed out where the indexer engine was

    No talk about deselecting anything to use web 2's

     

The only reason I pointed out the engine that way was to avoid confusion with any indexer options in the main options tab in SER. I never took into consideration the AlexR effect.

donchino https://pbn.solutions
    edited August 2013
I was searching the forum about the "verified links must have an exact URL" option, but I didn't find a satisfying answer.

I found that my projects to the money site did not have this on, but Tier 2 had it checked, and I unchecked this option everywhere because it seemed to me like it was holding verifications back (articles, directories, bookmarks, social, video, web 2.0, wiki).

Can you please explain a bit: when and why should this be on? I thought that if it is on, then all anchors are exact URLs, or does it require all link sources to contain the exact URL? I'm afraid I am misunderstanding something here, though all verified links seem to be OK (anchor + source).

A few comments from SER professionals would be highly appreciated. @Ozz @ron @LeeG Thanks!!
donchino https://pbn.solutions
OK, I read the mouse-over help there over and over again. Am I right that if "verified links must have an exact URL" is unchecked, some verified links can include only the domain and not point to the right post, like verifying a link pointing to wordpress.com instead of wordpress.com/post/438.html? But I don't see how this could happen if I also have "Use the root/main URL in some variations" unchecked under Data.
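    If the mouse-over reading above is right, the option boils down to whether a verified link must contain the full target URL or whether a match on the root domain is enough. A tiny illustrative sketch of that distinction only (this reflects the interpretation discussed in this thread, not confirmed SER internals; the URLs are hypothetical):

    ```python
    # Illustration of the two matching rules discussed above:
    # "exact URL"  = the verified page must link to the full target URL,
    # "root match" = a link to the bare domain would also count.
    from urllib.parse import urlparse

    target = "http://example.wordpress.com/post/438.html"   # hypothetical tier URL
    found_links = [
        "http://example.wordpress.com/post/438.html",       # full target URL
        "http://example.wordpress.com/",                     # root domain only
    ]

    def exact_match(link, target):
        return link.rstrip("/") == target.rstrip("/")

    def domain_match(link, target):
        return urlparse(link).netloc == urlparse(target).netloc

    for link in found_links:
        print(link, "| exact:", exact_match(link, target), "| domain:", domain_match(link, target))
    ```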