
Need your advice on which server to use

Hello guys,

I just started with GSA SER and I am using SERLib for Tier 1. I am also using GSA Captcha Breaker and GSA SEO Indexer. There are constant crashes and I have to babysit the software. It's simply a pain in the ass.

I have been using the VPS below from SolidSeoVPS for a week now. Based on your experience, should I switch to another VPS or a dedicated server to solve the problem? If so, can you please point me in the right direction?

I hope you understand how frustrating it is to run an automation system and still have to check on it every 2-3 hours, waiting for it to crash.

Thanks in advance for your support.

The VPS I currently use is below:

HighlineMax - Geek

CPU: Xeon E3/E5
Cores: 6
RAM: 6 GB ECC Dedicated
HDD: 80 GB PureSSD
Monthly Bandwidth: Unmetered
Port Speed: 1 Gbps
IP Address: 1 IPv4


Comments

  • I am using Hetzner VPS (Contabo before) but I doubt this is the core issue you are experiencing.

    All of my GSA software (SER, Indexer, Redirect, Website Contact, Keyword Research) is running smoothly and is perfectly stable.

    What kind of crashes are you experiencing?
  • @organiccastle thanks for your answer. 

    The error that causes GSA SER to crash regularly comes from SERlib. It is "Error Code 0xe0000008" or a variation of it. The add-on seems to consume all the available RAM. I am running GSA on only 20 threads, with SERlib on 5 threads, but this changes nothing. The developer told us we need at least 8 GB of RAM, but we previously tried a server with 16 GB of RAM and got the same issue.

    Ironically, we chose this add-on (SERlib) just because we were short on money. Since then, we are on our 3rd server in the same month. We paid and migrated (a waste of time and money) just to get this add-on working smoothly. But most importantly, we learned.

    We are going to run GSA by itself and find another way to build our Tier 1 links.

    Thanks in advance for your answer Bro!
  • TheDoc have you tried again after the latest SER and SERlib updates on fresh projects?

    I believe the browser functionality has been changed/updated, which may be the culprit behind your crashes.

    What number do you have set for browser based threads?

    Also, make sure to hit SERlib Update and try each of the options: "check", "force", and "test".

    Do this in addition to updating SER normally.

    Try a fresh new project with lower threads, like 5 or 10, to test.

    The 1 you see in the image is set that way just for scripting your own new engine on 1 thread.

    Maybe check the SERlib docs: https://gsaserlists.com/serlib-documentation/

    You will find there is an advanced debug mode that lets you see what's going on in the browser if you open SER this way from the command line.

    Windows Task Manager can be helpful as well, to see what processes are doing and how many resources they are using, if you can pull it up.

    Maybe try that, and post back results. . .

  • Got it. So I doubt your issues are related to SER or the VPS; they seem to come from the add-on.

    I saw there was another update just today; hopefully this will resolve your issues. The developer is on this forum, so he might be willing to provide direct support.
  • I updated everything (SERlib, GSA SER). Look: there are too many Chromium browsers open (more than 50) while only 5 threads are running. This is what's eating the RAM. 
  • If the SERlib add-on could control the number of active Chromium instances, it would help the software run smoothly.
  • AliTab GSAserlists.com

    Hello,

    The behavior you've described is indeed normal. Each thread will have approximately 10-12 instances of Chromium open, as this is characteristic of how Chrome and Chromium operate. Chrome is designed to separate the browser, the rendering engine, and the plugins by running them in distinct processes. So it's not a case of 1 thread, 1 Chromium process.

    The problem we previously faced involved certain instances not closing properly under specific conditions, leading to an accumulation that caused system crashes. To address this, I've introduced a method to forcefully shut down these instances by ending the process after confirming task completion. Currently, I'm waiting for user feedback to confirm that this solution has successfully resolved the issue for them.
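    To make that process fan-out visible, here is a rough sketch (my own, not part of SERlib or SER) of how you could tally Chromium processes and their combined memory by parsing the CSV output of the Windows `tasklist` command; the sample rows below are fabricated for illustration:

```python
import csv
import io

def summarize_chromium(tasklist_csv: str):
    """Count chrome.exe rows and sum their memory (in MB) from the CSV
    output of: tasklist /FI "IMAGENAME eq chrome.exe" /FO CSV /NH"""
    count, total_kb = 0, 0
    for row in csv.reader(io.StringIO(tasklist_csv)):
        if row and row[0].lower() == "chrome.exe":
            count += 1
            # The "Mem Usage" column looks like "123,456 K"
            total_kb += int(row[4].replace(",", "").rstrip(" K"))
    return count, total_kb / 1024

# Fabricated sample rows for illustration:
sample = ('"chrome.exe","1111","Console","1","120,000 K"\n'
          '"chrome.exe","1112","Console","1","80,000 K"')
print(summarize_chromium(sample))  # → (2, 195.3125)
```

    Run against a live system, a count of 50+ processes at 5 threads would match the 10-12 processes-per-thread figure described above.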

  • TheDoc I have started a fresh new project, have been at it for a few hours now, and I have not seen any crashes running SERlib only.

    To be fair, I'm going to let it run longer, but so far so good.

    As @AliTab mentioned, it's not 1-to-1. Even my Firefox Developer browser shows (20) or so, and all that's open is 1 browser with OpenAI.
  • Update:

    Without changing the server, just by adjusting the number of browser threads from 5 to 3, there were no crashes anymore. The LPM dropped to 0.60, and we were able to gather 715 links in 20 hours.
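    (As a sanity check, those two numbers are consistent: 715 verified links over 20 hours works out to about 0.60 links per minute.)

```python
# 715 verified links gathered over 20 hours:
links, hours = 715, 20
lpm = links / (hours * 60)  # links per minute
print(round(lpm, 2))  # → 0.6
```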

    My conclusion is this: SERlib is cheaper than other tools people use for Tier 1 links, but it eats a lot of RAM to run fast.

    Because this is my first run at this game, I don't know what the best choice is; there are other factors involved, like the quality of the backlinks. I will continue to learn.

    To everyone and @backlinkaddict a big thank you!

  • Congrats and No problemo!

    Working in a browser will use more RAM in general. Other software that automates these sites in browser mode would handle them as drip-feed campaigns over X amount of time.

    What's going on in the link manager will show your overall progress much better, I think.

    LPM is not a great measurement of "how well" you are doing.

    I see you dialed it back and got better results :)

    Also, LPM is sometimes not accurate for me. I think resetting this metric right before the project starts submitting will give more accurate results.

    A well-optimized campaign will look like this, submitting and verifying. [can't find my image so I created another]

    Nothing on the graph here, but hopefully this helps you compare on your end...

    Browser based engines usually need to be treated differently than the socket based ones.

    The goal with these is usually to get them built and have them stick, so spamming them may not give the wanted results.

    ;--------------------------------------------------------------------;

    >>>Because this is my first run at this game, I don't know what the best choice is; there are other factors involved, like the quality of the backlinks. I will continue to learn.

    These engines should have unique, quality content on them for the best stick rate and for actually getting them indexed.

    Even for content added to the same site, posting one article a day, for example, rather than 100 in a day and then stopping, will help with this.
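    The drip-feed idea above can be sketched in a few lines. This is a generic illustration (the function name and parameters are my own, not SER or SERlib functionality) of spreading a batch of posts across days instead of publishing them all at once:

```python
import datetime

def drip_schedule(total_posts, per_day, start):
    """Assign each post a publish date, at most `per_day` posts per day."""
    return [start + datetime.timedelta(days=i // per_day)
            for i in range(total_posts)]

# 10 posts at 2 per day, starting Jan 1, span five days:
dates = drip_schedule(10, 2, datetime.date(2024, 1, 1))
print(dates[0], dates[-1])  # → 2024-01-01 2024-01-05
```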

    There is always something to learn, which makes this fun!

    And yes, SERlib is cheaper than other tools; some I tried again recently want 3-5x the price and unfortunately did not get any better.

    Remember, you can also create a simple script and add any engine/site using SERlib, so if there are working sites from other places you want to get a link from, you can add them into the program with some simple commands for now.


    Here's another example of what added custom engines look like...


    So as you can see, you also have the ability to go much further.

    A simple register-and-verify-email script for a custom social bookmark would look something like this:

    [REGISTER_STEP1]

    registration success= Welcome to BibSonomy | To complete your registration please click the link in your registration mail within the next 24 hours | You successfully registered as user | Congratulations
    submit failure= This user name already exists. Please choose another name.

    open_url(https://www.bibsonomy.org/)
    wait_time(5)
    click_link_by_linktext(register for free)
    wait_time(2)
    type_into_field_selector(#registerUser\.name,%login%,0)
    wait_time(4)
    type_into_field_selector(#registerUser\.password,%password%,0)   
    wait_time(3)
    type_into_field_selector(#passwordCheck,%password%,0)   
    wait_time(2)
    type_into_field_selector(#registerUser\.email,%Your E-Mail%,0)   
    wait_time(3)
    type_into_field_selector(#registerUser\.homepage,%url%,0)
    wait_time(4)  
    type_into_field_selector(#registerUser\.realname,%name%,0)
    wait_time(2)
    solve_recaptcha(v2)
    insert_recaptcha(%recaptcha-action%,%recaptcha%,%recaptcha-callback%,)
    wait_time(3)
    click_element(#acceptPrivacy)  
    wait_time(3)
    click_element(#command > fieldset:nth-child(2) > div > div:nth-child(4) > div > button > span)
    verify submission=1

    [VERIFY_EMAIL]
    wait_time(5)
    open_email(%Your E-Mail%, %password%)
    click_email_link_text(activate, https://www.bibsonomy.org/activate*)
    verify submission=1
    first verify=1
    verify by url= https://www.bibsonomy.org/activate*
    submit success=https://www.bibsonomy.org/activate*

    So there are ways to do some interesting things in addition to the current engines in the add-on.

    It's that simple!

    Good Luck and no issue at all, glad to be of some help :)

  • @backlinkaddict Big thanks again! I will apply this information!

  • Glad to see you stuck with it and found a solution, rather than giving up!

    At this rate, you'll be more dangerous than most SER license owners before you know it!

    In addition, this is what I meant in the statistics about the LPM and VPM correlation in the image above.

    Just click on LPM at the bottom of the GUI and you can find these settings there, as well as some others ;)
