
Low Submissions using SER List and recommended settings

I set up 4 tier 1 campaigns using the recommended settings from the SER Lists publications. I am using the latest Blue SER List and also a clean list from a buddy. I am running 100 threads, 10 email accounts, and 10 private proxies. It is set to only look for English-language PR 3+ targets and to pause after 1-5 submissions. When I turn this on, it runs for hours without finding anything. A buddy of mine who is a SER user says my settings look fine. I know LPM is not the metric to track here, but mine is around 0.08, and when I let it run overnight it barely picks up anything. Where can I turn to troubleshoot this? I have screenshots of all of my settings. Any advice or help would be appreciated. Thanks in advance!

Comments

  • So I have set up all of my lists again, cleaned everything, etc., all via the guides on SER Lists, and this thing is still doing nothing. It ran overnight on 4 campaigns and only got 1 link. I updated the application as well; no change. Here are my settings (all proxies and e-mail addresses verify just fine): https://drive.google.com/file/d/0B30UQucpbBxbTFI1bmRaZkNIb2M/edit?usp=sharing
  • cherub SERnuke.com
    Looking at your screenshots briefly, I'd make these suggestions/comments:

    You have a verified list specified in image 6, but in image 7 you have selected to use URLs from an identified list (which you do not have) rather than your verified list. Check the option to use the verified list.

    You have it set to pause after 4 submissions. I'd change this to 4 verifications.

    You only have 10 proxies but are running 100 threads. Dial it down or get more proxies.

    You have a PR3 filter. This will severely limit your posting. Additionally, with so few proxies you may quickly burn through their PR checking ability.
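As a rough sanity check on the thread-to-proxy point above (the numbers come from the thread; the 10:1 ceiling is a community rule of thumb, not an official GSA SER limit):

```python
def threads_per_proxy(threads: int, proxies: int) -> float:
    """Average number of concurrent connections hitting each proxy."""
    return threads / proxies

# The settings described in the thread: 100 threads, 10 proxies.
print(threads_per_proxy(100, 10))  # 10.0 concurrent connections per proxy

# To cut the load per proxy to ~3, either drop the thread count
# or grow the proxy pool:
print(threads_per_proxy(30, 10))   # 3.0
```

The higher that ratio, the faster each proxy hits rate limits and burns out its PR-checking ability, which is cherub's point.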
  • Alright! I "write" to my verified list directory, but when I did Tools > Import Site List, I placed the imports in the identified list directory. I believe this is what the SER List best-practices guide instructed me to do. I was also under the impression that the thread count should be proxy count × 10, but I can certainly dial it down or get more proxies. I will also lower my PR filter, per your advice.

    The problem I am having with a limit of 4 verifications, though, is that the other night I left it running and woke up to find 12 verified links. It does not seem to be very accurate there, due to the multi-threaded nature of the submission process.
  • With your changes, things have picked up a little bit - it still takes me a couple of hours to get 3-4 links.

    Also, I have tried setting it to pause after 4 verifications, which does not work because other posts keep coming through. So I changed it to pause after 4 submissions, and it does not stop either; it just keeps going. CIS Tier 1 is set to pause after 4 ± 1, and it has now counted up to 11 without stopping: http://cl.ly/image/1F2c3P1V2v1D
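The overshoot described here is what you would expect when the pause check happens at dispatch time while submissions complete concurrently. A minimal sketch of the effect, purely illustrative and not GSA SER's actual implementation:

```python
def run_batch(limit: int, concurrent: int) -> int:
    """Worst case for a check-then-increment pause counter: every
    worker snapshots the counter before any of them has finished."""
    count = 0
    # Phase 1: all workers are dispatched while count is still 0,
    # so each one passes the "pause after `limit`" check.
    admitted = sum(1 for _ in range(concurrent) if count < limit)
    # Phase 2: the admitted workers all run to completion and count.
    count += admitted
    return count

def run_sequential(limit: int, concurrent: int) -> int:
    """One submission at a time: the counter stops exactly at limit."""
    count = 0
    for _ in range(concurrent):
        if count < limit:
            count += 1
    return count

print(run_batch(4, 12))       # 12: far past "pause after 4"
print(run_sequential(4, 12))  # 4: the limit holds
```

With many threads in flight, the check and the increment are separated in time, so a "pause after 4 ± 1" setting can plausibly count up to 11 or 12 before stopping.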
  • @imburr, kindly post it in the SER Lists thread; @ron is the best guy out there to solve your issue.
  • ron SERLists.com
    edited June 2014
    @lmburr - I see a number of things:

    1) Do you own/are you using GSA-CB? Most targets have captcha...

    2) If you do, then I would get rid of "ask user". That should be set to Random, and you should set captcha to use only services, or only the 1st service.

    3) Get rid of re-verify!!! Another user just started a thread saying that, net/net, he is losing more links than he is adding. You don't need that setting; I never use it, and we recommend that people not use it.

    4) You can only have a high LPM if you have junk tiers. I see nothing but T1 projects specifying 4 links per day. If your other 2 projects have the same settings as the one you showed, then you should expect about 12 links per day, which is 12 links divided by 1,440 minutes ≈ 0.008 LPM !!!

    5) Uncheck "Skip engines with moving pages".

    6) After Google decimated everyone's PR ranking earlier this year, there are a lot fewer PR3+ targets - period. Loosen up that noose around SER's throat - try PR1+.

    7) Check: "Try to always place an URL..." Very important.

    8) I would never recommend running GSA Indexer while SER is running - it sucks the oxygen out of the internet connection. Run SER first; once you have the links, import them into the Indexer.

    9) Did you purchase GSA-Indexer? If you didn't, uncheck that. The same goes for the other indexer box. If you don't have an indexer, leave both unchecked.

    10) Very important - Uncheck Web2.0. Those platforms do not work unless you have a paid subscription to SEREngines. If you don't, then uncheck that.

    11) Social Bookmarks are a waste of time. Extremely difficult captchas. Uncheck that.

    12) Check Social Network - those create articles too.

    Well, that's a start. Good luck!

    Ron  
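Point 4 above can be sanity-checked numerically, since LPM is links per minute; a minimal sketch, with the project and per-day counts taken from the thread:

```python
def expected_lpm(projects: int, links_per_day: int) -> float:
    """Links per minute if every project exactly hits its daily cap."""
    minutes_per_day = 24 * 60  # 1440
    return projects * links_per_day / minutes_per_day

# 3 tier-1 projects capped at 4 links/day each:
print(round(expected_lpm(3, 4), 4))  # 0.0083
```

An observed LPM near 0.01 is therefore roughly what a drip-fed T1-only setup should produce, consistent with ron's point that high LPM requires junk tiers.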
  • ron SERLists.com
    @lmburr - I just gave you an exhaustive analysis, lol. It's gotta help.
  • W-O-W thanks @Ron! I was just over on another thread reading some of your other posts. Alright, let me dissect this:

    1) I do own GSA-CB, and it is set up as the ONLY CB option.
    9) I do NOT own GSA-Indexer, so I unchecked it. I do have an API key for backlinkindexer.
    10) I do purchase the SER Lists, the Blue ones - I assume SEREngines is something different? Do you recommend purchasing a subscription to SEREngines in order to capture the Web 2.0s, or should I just skip them? This is tier 1 only at this point.

    I made all of the other changes you suggested, across all of my projects. I will monitor them and keep an eye on things. Basically, what I am hoping for is to be able to turn on GSA and let it run at 4 links per day for 37 days, until I have used up 150 spins of my 3 articles, then rinse and repeat. I get hand-written articles, use WordAI, and also spin custom titles, which is a lot of work. Not having to closely monitor GSA's dripping every hour would be excellent.

    Thank you again!
  • @ron
    10) Very important - Uncheck Web2.0. Those platforms do not work unless you have a paid subscription to SEREngines. If you don't, then uncheck that.

    11) Social Bookmarks are a waste of time. Extremely difficult captchas. Uncheck that.

    So we shouldn't use these engines from the Blue List? I thought everything in the Blue List could be solved by CB.

  • ron SERLists.com
    @Vijayaraj - The Blue List was created with GSA-CB set at 3 retries, so it really did create those links. If you are using a list that has them, have GSA-CB, and have it set to 3 retries, you will likely be successful (realistically it may take more tries, because we could have been lucky on the third try). But Social Bookmarks are a low-yield ballgame. We don't even have that many in the list, because they are so hard to crack.

    However, if you don't use our list, and you use SER to scrape, you will tie SER up endlessly (Read: inefficient use of SER resources). I would never bother in a million years to scrape and process them using SER.

    Because I am going for the highest efficiency (meaning the most links), I don't turn it on in my projects (and I am one of the creators of the list, lol). I'm just saying what I do.

    As far as Web2.0, those engines do not work with SER unless you subscribe to SEREngines. We don't subscribe to that service, but even using the outdated platforms resident in the software, we still pick off a few (I have no idea how, but we do). Regardless, it is the same case. Very few targets. And in my opinion - for my purposes - they are inefficient for me to be running.

    So yes, they were all solved by GSA-CB, and yes, you will get some links out of them, but it will likely require a number of captcha retries. 


  • ron SERLists.com
    edited June 2014
    @lmburr - Since you don't have GSA-CB, I would not buy our list. Let me be very honest with you here. If you don't get GSA-CB, you are going to be miserable.

    Most targets require captcha solving - and I am talking about most engines. This whole linkbuilding business is going to be next to impossible for you. You have to get the software if you are going to be successful (with automated linkbuilding).

    Because you don't have automated captcha solving, that means you will have to solve it. So then you do want those captcha settings you originally had because you will keep getting called to solve them. And you only have 30 seconds or something like that. So you have to be sitting there nonstop while SER is trying to make links. Ouch!!!  
  • @ron he mentioned that he owns CB :)
  • I do have GSA-CB. I also have Deathbycaptcha, but I don't enable it. Thanks!
  • ron SERLists.com
    edited June 2014
    My bad, I misread it. Phew! I was going to say I never knew anyone who bought SER but not CB.

    Yeah, save your money on DBC. I would just use GSA-CB as the first solver, and then set my settings to only use the "first service".

    And by all means, grab a Blue List (not Red). It is open now and will likely close down in the next 36 hours - the last 3 lists did. Plus you get all of our settings (which are free anyway, even if you don't buy). But I would review our settings in all of those tutorials first. You will learn a lot and get off on the proper footing.
  • Yeah, I have read all of the guides with your last list - looking forward to the RankWyz doc this time :)