
Low LpM - screenshots

Hey guys,

I've been trying to set up my GSA campaigns for days, but something is killing my LpM. Please take a look at my options and let me know what I'm doing wrong. I'm using Captcha Sniper, and the success rate from submissions to live links is high.

My VPS:
Windows Server 2008
Six cores, 3.2 GHz
16 GB RAM
100 Mbit shared line (usually around 20 Mbit)
I pay €35/month for it.

I have two types of campaigns:
Standard 3-tier link building
Tier 2 + Tier 3 (Tier 1 consists of expired domains)
Tier 1 links don't really count, as I only build a few per day.
Content comes from Content Machine.
I use a word list of the top 10k English words, the top 10k German words, and the keywords generated by Content Machine.
I mainly use domain/URL anchor texts.

When I start the campaigns, LpM is around 100 but soon drops to about 3. Active threads also drop below 10 after 5 minutes. I use the same options for Tier 1, Tier 2, and Tier 3. I use one Hotmail e-mail address and three custom e-mail addresses from one of my domains.



General Options:

Tier1:

Tier2 (same options as Tier1)

Tier3 (same options as Tier1)

Comments

  • Maybe turn off the "Finally ask user if" in the captcha settings? Unless you're always watching it.

    Also, it looks like you're using every single engine of each type you have checked. Find out which ones are working the best for you and only use those. These are what I use: https://forum.gsa-online.de/discussion/comment/26488/#Comment_26488

    I would also save submitted URLs and turn them on in the "Use URLs in Global..." setting on your projects.
  • General Options:
    - Maybe use your proxies for submissions for a while, in case your IP is banned everywhere. I don't think it matters that much, but it's worth a try.
    - Captcha Sniper <-- which OS are you using? There is an optimized database for 64-bit systems you can download from BHW.

    Project Options:
    - change "Pause project after XX" from '1440' to 'reached a day'
    - maybe create a 'submitted' global list and use it for your projects as well, since your verification list is still fairly limited
    - "skip site where the following word appear" <-- you filter for the word 'sex', which often appears in registration forms (gender fields)
    - "for TAGs use" -> 'Anchor Text'

    Out of curiosity: which OS are you using?


  • edited July 2013
    Thanks for your advice. I adapted everything except the engines, and this is the result after around 10 hours (7.6 LpM): http://screencast.com/t/Q5vaSaJz8

    Why do my threads always drop?

    My OS is Windows Server 2008 R2 Standard 6.1

    I haven't adapted the engines yet. Reading this forum, it seems like everybody with a similar configuration gets an LpM above 100 even before adapting engines. I guess it's easier to adapt my engines once I have a bigger sample size.

  • Yes, most of the advice wasn't LpM-related, so I didn't expect an impact. Most of the time a low LpM is caused by search engine or proxy issues, when SER isn't fed enough new target URLs.

    Another issue is when people filter too hard and everything gets skipped, but that isn't the case for you - at least not for your Tier 3 project. You should unselect the URL shorteners, though. The reason to use them is to cloak your links; the backlink itself has hardly any value, I believe.
  • Okay, I checked my proxies. They are working fine against Google and Bing, but threads stay below 10.

    I have also chosen different search engines for every campaign: only 2 Google search engines, and 7-9 search engines for the 7 projects without submission caps.

    When I verify links, threads jump to the top.
    With searching only, threads stay around 5, so I guess the search is my problem. When I start and SER works through the global list, threads are very high, but as soon as the global list is exhausted, threads go down.


  • ron SERLists.com
    edited July 2013

    Here's my opinion:

    • Have private proxies for everything
    • Turn off "finally ask user..." for captchas
    • Turn off pinger in Indexing section
    • If you are using public proxies...Don't
    • Ask "First Service" to fill captchas, not you!
    • Uncheck Also Reverify Backlinks (better to do that once a week manually - takes 5 minutes)
    • Check the last two boxes in the Submit section
    • Use anchors for tags
    • Avoid posting to same domain twice on T1's
    • I think it is foolish at this stage for you to use sitelists. You need to get your submissions up. You can always turn on sitelists later (my opinion).
    • I just saw that 95% of your links use domain anchors and 5% generic ones - that means you are not using any keywords as anchors

    I have verification disabled on all non-T1 projects, and do them once a week or whenever. It speeds up the LPM because then SER is working on submissions.

    Not sure this is an issue, but since I took the time to look at every screenshot I will say this just in case...For every contextual tier (T1,T2,T3) you should have a matching junk tier/kitchen sink (T1A,T2A,T3A). After all, you need to backlink the contextual links with a bunch of spam in order to index them and get link juice.

    T1 < T2 < T2A (your third tier) => If you are doing this, then you are doing it wrong because you are missing a T1A, which is really needed for your T1. If you have at least one junk tier for each contextual tier, your LPM will go up significantly. If you are already doing this, then please disregard (as I am not sure if you are showing everything).

    I get frustrated with people complaining about LPM when they have sitelists turned on - especially if they are new. New users have *crap* sitelists, and it is delusional to think that having them turned on will improve LPM. (In fact, getting SER to constantly check a crappy/small sitelist will undoubtedly slow SER down, IMHO.) I know this is a very sticky point for me (and I am ranting), but if you can't get a decent LPM without sitelists, then you have something wrong with your settings. Using a sitelist should be the cherry on the cake, but first get SER running well without sitelists. That is my opinion and I am sticking to it.


  • MrX Germany
    You have to make sure not only that your proxies are working, but also that they are not banned on the search engines. There is a custom string for testing somewhere here in the forum. Google will not reveal that your proxy is banned if you only check against their homepage - only if you perform a real search with it.
    I would say most of the time, low LpM is about banned proxies: you won't get more targets from the search engines, so there is nothing for SER to do (-> threads drop -> LpM drops).
  • ron SERLists.com

    If I had to pick the #1 method to improve LPM, it would be to compare verified vs. submitted on all engines and get rid of the crappy performers (see the sketch at the end of this comment). It changes everything.

    I just don't have the problems that some people here are having with bans (I use 30 buyproxies). Maybe it is because I use 120 search engines. But you must experiment with that as well, because the more engines, the fewer bans.
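
    To make the verified vs. submitted comparison concrete, here is a minimal sketch in Python. It assumes you have copied each engine's counts into a file named engine_stats.csv with the columns engine,submitted,verified - the file name and format are my own assumption, not something SER exports directly:

        import csv

        # Load hypothetical per-engine stats (columns: engine,submitted,verified).
        rows = []
        with open("engine_stats.csv", newline="") as f:
            for row in csv.DictReader(f):
                submitted = int(row["submitted"])
                verified = int(row["verified"])
                ratio = verified / submitted if submitted else 0.0
                rows.append((row["engine"], submitted, verified, ratio))

        # Worst performers first, so you know which engines to uncheck.
        rows.sort(key=lambda r: r[3])
        for engine, submitted, verified, ratio in rows:
            print(f"{engine:<30} {submitted:>8} {verified:>8} {ratio:>7.1%}")

    Engines whose ratio stays near zero after a decent sample size are the candidates to turn off.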

  • MrX Germany
    That's true: the more search engines you have selected, the longer it takes to hit the same SE with the same proxy again, and therefore the longer before you get banned.
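
    As a rough back-of-envelope model of that effect (SER's real rotation logic is internal, so treat the even-rotation assumption as mine, not documented behaviour):

        # If every (proxy, engine) combination is used evenly, the same
        # pair recurs only once every P * E search queries.
        def queries_between_repeat_hits(proxies: int, engines: int) -> int:
            return proxies * engines

        print(queries_between_repeat_hits(30, 2))    # 60 queries apart
        print(queries_between_repeat_hits(30, 120))  # 3600 queries apart

    Which is why 30 proxies spread over 120 engines see far fewer bans than the same 30 proxies hammering only a couple of Google engines.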
  • Hey guys, I followed your advice, thanks for that. Unfortunately the problem is still the same. Please focus on the low LpM problem: a 3% increase doesn't help me when I start at 3 LpM. Something is KILLING my effort, so I'm looking for the big issue.

    I checked my proxies again; they are working fine according to GSA, and I can scrape with them in Scrapebox. In the log window I often read that search engine XY blocked the proxy. If I take a break, searching works again with 200 threads. So my problem is related to searching, and perhaps to my proxies.

    I have 30 buyproxies. I tested public proxies and saw similar LpM results. 

    What do you mean by a custom string?

    Updated Screenshots:

    Tier2:

    Tier3:


    Options:


  • MrX Germany
    You can do the following to make sure your proxies are not banned:
    Click Test > All > Custom
    Paste this as the URL: https://www.google.com/search?btnG=1&pws=0&q=crap
    Paste this as the string: crap
    This performs a real search on google.com for the word "crap" and checks whether the string "crap" shows up in the results. If it does, your proxy is able to scrape; if not, your proxy is blocked from searching!
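
    If you want to run the same check outside SER, here is a minimal sketch in Python using the requests library. The proxies.txt format (one ip:port per line, unauthenticated HTTP proxies) is my assumption; adjust it to however your provider delivers the list:

        import requests

        # Same idea as SER's Test > All > Custom: perform a real Google
        # search through each proxy and look for the query word in the page.
        URL = "https://www.google.com/search?btnG=1&pws=0&q=crap"
        HEADERS = {"User-Agent": "Mozilla/5.0"}  # bare clients get rejected outright

        with open("proxies.txt") as f:  # assumed format: one ip:port per line
            proxies = [line.strip() for line in f if line.strip()]

        for p in proxies:
            try:
                r = requests.get(URL, headers=HEADERS, timeout=15,
                                 proxies={"http": f"http://{p}", "https": f"http://{p}"})
                banned = "crap" not in r.text.lower()
            except requests.RequestException:
                banned = True
            print(f"{p}: {'BANNED or unreachable' if banned else 'OK'}")

    A banned proxy typically gets Google's block page instead of real results, so the string check fails and the proxy is flagged - matching the behaviour MrX describes.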
  • Okay, 29/30 proxies are working. Unfortunately a string containing the URL wasn't working, but I used

    (with a spelling mistake) and the string: microsoft

    So the proxies seem to be fine. After setting it to run again, average threads are around 5, just like in the minute before the test.

    What else could it be?
  • AlexR Cape Town
    Start off by making sure your images show in the forum so we don't need to click to see them. You need to insert the correct image URL so it shows up in your post. That makes it much easier for people to see your screenshots.
  • Okay, will do it next time. It seems that the problem is not connected to my options.