
Powerful VPS, low LpM, how could I improve it?

Hi,

I bought a VPS from Serverhosh: Intel Xeon E5-2670 @ 3.3 GHz, 6 cores, 4 GB RAM, 1 Gbps port, 35 GB SSD.

I have seen people report LpM at ~150, but mine tops out at around 20. I bought 10 rotating proxies from BLAZINGSEOLLC. I bought a submitted list from here:

 Here is what I did:

Checked all engines except the ones with no contextual links.
[screenshots]
Selected only Italian search engines.
[screenshots]

Copy-pasted the same project 6 times:

[screenshots]

Thank you.

Comments

  • loopline autoapprovemarketplace.com
    @Gintas
    There is too much to go over here; I'm going to look through it all in detail, do up a short video and mail you.

    But for the sake of everyone else, and you, rotating proxies are great, for SOME things.  Accuracy is not one of them. 

    If you want to scrape, rotating proxies are great. If you want to do accuracy things like posting, link checking and email checking, they aren't so great.

    Nothing against rotating proxy providers, and I can't make a blanket statement about everyone, but the providers I have tried (several of them) have functioned this way. It's not a problem; the provider is putting the effort into making sure you get IP diversity, and that's great, but you lose accuracy at that level of mass diversity.

    So for posting, verification, and email checking, I recommend using private or shared private proxies. 

    I do use rotating proxies, for scraping, but I use them for nothing else.  I use shared private proxies for everything else. 

    That's my 2 cents, do with it what you will.
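
    For anyone who wants to check their own gateway, here is a rough sketch (not a SER feature; the gateway URL and credentials are placeholders for your provider's details) that hits an IP-echo service through the proxy a few times and prints the exit IP each time:

    # check_rotation.py - rough sketch: watch how often a rotating gateway changes its exit IP.
    # The proxy URL is a placeholder; swap in your provider's gateway and credentials.
    import time
    import requests

    PROXY = "http://user:pass@rotating-gateway.example.com:8080"  # placeholder
    proxies = {"http": PROXY, "https": PROXY}

    seen = []
    for i in range(10):
        try:
            ip = requests.get("https://api.ipify.org", proxies=proxies, timeout=15).text.strip()
        except requests.RequestException as exc:
            ip = f"error: {exc}"
        seen.append(ip)
        print(f"request {i + 1}: exit IP = {ip}")
        time.sleep(5)  # space the requests out further (minutes) to spot timed rotation

    print(f"{len(set(seen))} distinct exit IPs across {len(seen)} requests")

    If the exit IP keeps changing between requests, anything that needs the same IP from registration through email confirmation to posting is at risk, which is exactly the accuracy problem above.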
  • Gintas Lithuania
    @loopline I forgot to mention, these proxies rotate every 10 minutes, not on every request. Does your recommendation still apply then?
  • shaun https://www.youtube.com/ShaunMarrs
    @Gintas Loopline put it diplomatically but that's not my style: rotating proxies are a sack of shit for submission, try to get semi-dedis or dedis if possible. As loopline said, rotating proxies can be awesome for scraping though.

    Also, to my knowledge the autoapprovemarketplace is only verified lists now, no submitted or identified any more. Not sure if that was just a typo in your OP though.

    You didn't say what the goal of that specific project was or the tier it was on, so I can't offer much advice on that, but for how I run SER there is a pretty big mismatch of engines there, along with a fair few I consider useless for various reasons.
    • On the article tab, untick the bottom option about not submitting the same article; you are using spins so they will be fine.
    • You have no captcha service selected? Not even Ask User.
    • Verifying links automatically slows you down overall; do it once every few hours. Depending on what tier it is and what I'm doing, I do it every 6, 12 or 24 hours.
    • Don't bother with GSA Indexer, it does little to nothing other than waste system resources that could be used by SER or CB.
    • Tick "retry to submit to previously failed sites" and set it to at least 5 in my opinion. It's a brand new feature I suggested, so I'm still toying with it, but I'm loving it so far and it's helping a bunch, and I'm 90% sure I'm using the same list as you.
    • You have 9 search engines selected; if you are using a list, untick them, they are slowing you down and wasting threads.
    • Your "time to wait between two posts" settings could be changed and possibly improve your rig, but again this depends on what your goal is with this project.
    • Again it's down to you, but I personally would untick the "avoid pages with xx outbound links on page" option. I have perfectly good contextual domains that I use but that have over 70 OBL due to various things.
    • In settings, turn off getting PR for verified links; it slows stuff down and offers little value.
    • I would only use proxies for submission, as you are using catch-alls for your emails and I doubt you have a reason to scrape, but again I can't see anywhere you mentioned your goals for the project.
    • Even with 10 proxies, if you aren't scraping you can use more than 200 threads.
    • Untick the box about decreasing threads above 95% system resources; if SER is spiking you that high, you need a better VPS anyway.
    • You have captchas enabled in options but not in your project. Also, Captcha Tronix should be below CB as SER works top to bottom; CT has a thread limit on it, so it will clog up very fast the way you have it set up, but in my opinion CT is a waste of time anyway. Also, your CT API key is visible...
    • Untick the Yandex TIC option; you have no PR filters on your project and in my opinion it's a waste of time anyway.
    • You are using a bad words filter. It depends on your goals and the engines you are using, but again, in my opinion there's no point 90% of the time.
    Your bottom image doesn't load for me so I can't offer feedback on that, but there are some points on the rest. Again, it's all my own opinion, and without knowing your goals or what the project is for, bits will change.
  • Gintas Lithuania
    @shaun Thank you.

    1. I replaced these 10 rotating proxies with 10 semi-dedicated ones from buyproxies.org
    2. Unticked "Do not submit same article" checkbox
    3. There are two captcha services:
    [screenshot]

    4. How can I set it to automatically verify links only every 6 hours? Just stop manually, set verify, start again? Or is it this setting?
    [screenshot]
    5. Alright, dropped GSA indexer, subscribed to indexification.com
    6. Ticked "retry to submit to previously failed sites", set it to 5.
    7. Alright, unticked all search engines, I thought they might be used for finding images or something.
    8. I think the goal is just to get as many backlinks as possible to my custom-made tier 4 web 2.0 links, testing a possibly custom linking strategy.
    9. Okay, unchecked "skip sites with more than 70 outgoing links on one page"
    10. Okay, unchecked "get PR for verified URLs" in options
    11. Alright, let's try it, left proxy only for submission
    12. Set threads to 500.
    13. Good point, unchecked "Automatically decrease threads on a CPU usage above 95%"
    14. Thanks, somehow that checkbox got unchecked, no clue. Isn't only the login visible in images? I can't see the key.
    15. Unticked "Use YandexTIC as PR"
    16. I thought maybe this bad words filter would avoid some garbage porn sites; I know they aren't good for you.
  • Gintas Lithuania
    Also, is duplicating the same project a few times a good idea to maximize resource usage, or not?

    By the way, just restarted it, and I see it's running quite a bit faster :)
  • antonearn Earth
    edited November 2016
    I unchecked "Shauns feauture" "Tick retry to submit to previously failed sites" since I just thought it wasted resources (why should SER succeed with a post the 5th or whatever time?) and my VpM improved ~500-900% Had VpM of 10, now VpM at 50-90 for dofollow contextuals.
  • shaun https://www.youtube.com/ShaunMarrs
    4 - Yea, that's it.
    5 - Check your index rates; when I used them they were a waste of cash.
    7 - Nah, that's a separate setting.
    14 - Ah, fair one, some just give you a key to put in there.
    16 - If a blog is about cooking, for example, but has auto-approve comments and SER users are hitting it, then it's going to have stuff like that from the SER users anyway, so you are massively limiting your pool of available domains.

    @antonearn There are a number of reasons. Firstly, the site could time out on the initial try for a number of reasons but be fine within a few hours. Secondly, CB might support its captcha at 50% success or something: get it wrong the first time, but on the second or third or whatever try it might work. The site may be offline completely for whatever reason but then come online again after a while. The proxy the initial thread uses may be having issues or something for the initial try, but the proxy used on the second try could work fine.

    The main idea for that feature was to be used on verification projects on a separate rig for SER, where you just push your list through it to see what's still alive and working, then use the list that rig generates for your live lists, as you know it's a decent list.
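
    If you want a crude pre-filter outside SER before pushing a list through a verification project, something like this works (just a sketch, not part of SER; urls.txt is a placeholder file with one URL per line):

    # alive_check.py - rough sketch: keep only the URLs from a list that still respond.
    # urls.txt is a placeholder: one URL per line, e.g. an exported verified list.
    import concurrent.futures
    import requests

    def is_alive(url):
        try:
            r = requests.get(url, timeout=20, allow_redirects=True,
                             headers={"User-Agent": "Mozilla/5.0"})
            return r.status_code < 400
        except requests.RequestException:
            return False

    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
        alive = [u for u, ok in zip(urls, pool.map(is_alive, urls)) if ok]

    with open("alive.txt", "w") as f:
        f.write("\n".join(alive))

    print(f"{len(alive)} of {len(urls)} URLs still respond")

    A 200 response only tells you the page is up, not that your link is still on it, so the verification project is still what builds the real live list; this just weeds out dead domains first.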
  • Yea, good points for sure, but for now I don't tick it to retry sites, as I found it slows VpM down.
  • Gintas Lithuania
    Okay, removed that bad words filter.

    So what indexing service would you recommend? Is indexification.com good?
  • Elite Link Indexer has the best success rates.
  • Gintas Lithuania
    Alright. I noticed GSA SER started to run at ~30 LpM; I restarted it and now it's at ~100 LpM again. What went wrong? Is it possible that the verification process kicked in? If so, how could I make it faster?
  • Submission, verification, email checking,
    Skip for identification,

    Are you also ticking these boxes, @shaun?
  • shaun https://www.youtube.com/ShaunMarrs
    edited November 2016
    @Gintas There's loads of stuff that could cause that, in all honesty, from target domains available, to project settings, to verification, to how your list is filtered, and a fair few others mate.

    Upload a screenshot @antonearn

    Edit - do you mean for proxies? I only use proxies for submission, the way I use SER.
  • [screenshot]

    Ok, does this look fine? 
  • antonearn Earth
    edited November 2016
    "The email address you entered is already in use or invalid."
    I've checked the emails; they're not invalid.

    So could the problem be that I have set maximum accounts per site to 10?
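
    One way to rule the mailbox itself out (a minimal sketch only; host, user and password are placeholders for the catch-all) is to log in over IMAP and see that mail is landing:

    # mailbox_check.py - rough sketch: confirm the catch-all accepts logins and has mail in it.
    # Host, user and password are placeholders for your own catch-all account.
    import imaplib

    HOST = "mail.example.com"        # placeholder
    USER = "catchall@example.com"    # placeholder
    PASSWORD = "secret"              # placeholder

    with imaplib.IMAP4_SSL(HOST) as imap:
        imap.login(USER, PASSWORD)   # raises imaplib.IMAP4.error if the credentials are bad
        status, data = imap.select("INBOX", readonly=True)
        count = int(data[0]) if status == "OK" else 0
        print(f"login OK, {count} messages in INBOX")

    If the login succeeds, the "already in use or invalid" message is more likely coming from the target sites than from the email itself, which ties back to the maximum accounts per site question above.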
  • antonearn Earth
    edited November 2016
    Also curious if you're using this setting to insert x number of additional links, to avoid broken tiers:
    Referring to this: https://forum.gsa-online.de/discussion/13811/serlists-tutorial-optimizing-tiered-linking-campaigns/p1 

    [screenshot]
  • shaun https://www.youtube.com/ShaunMarrs
    In both those posts you haven't provided enough information to answer your question...

    The screenshot depends on the type of emails you use, your total links per day and how you are getting targets.

    The text question in the second post depends on a number of factors such as proxies, email types and total accounts created per day.

    For your second question, you could have probably just duplicated your project, put max accounts per site to 5, loaded the same number of emails in, let it run, seen what happens, and got an answer...
  • Try the automated proxies by Solid Proxies; the API is in SER and they're also advertised in the software. They work great and semi-dedicated ones are dirt cheap.

