
What is happening to SER?

gooner SERLists.com
edited May 2014 in Need Help
@sven - I love SER and I love the work you do, so please understand I am not starting this thread to cause trouble for you. I am just really concerned about the direction SER is heading...

I run SER on 6 dedicated servers and 1 VPS, day in, day out, and I can tell you 100000% for sure that SER is getting slower and more resource hungry with every passing month.

I'm not a coder so I have no idea what the cause is, but surely making SER fatter and fatter by adding features all the time doesn't help?

I bought SER 7 months ago and it used to run very nicely on a VPS. Now everybody using a VPS is complaining about CPU usage and out-of-memory errors. The latest versions are almost unusable at high volumes on a VPS (I rented a VPS solely to test this theory). It's now regularly pushing my dedicated servers to 100% CPU, which is just crazy.

Then there are the out-of-memory loops that I get stuck in on an almost daily basis (even when memory usage is at 300 MB, 500 MB or whatever).

Yes, I can lower threads and use fewer projects, but the end result is fewer verified links.

The only real purpose of the software is to post links, the rest is just noise.

I would be so happy if you went back to version 7.51 or thereabouts and added to it only the things that are 100% necessary to make SER run better.

I understand you want to try to include all of the feature requests that people make. But IMHO the focus has to be on making SER run faster and post more links.

Comments

  • donchino https://pbn.solutions
    edited May 2014
    Not that I don't agree with you... but I think it depends on why each of us is using SER.

    You seem the type of link builder who needs to blast thousands of links daily on each project, with many tiers. Others might look for quality instead of quantity, and for them it is not about posting as much as possible but about making the links good quality - so for them all the features and updates are necessary.

    I think Sven is doing a good job, and he will surely optimize whatever is needed. But maybe 7.51 is too far back? People were jumping up and down clapping their hands over 8.23-8.26. Maybe @Sven can look at what changed after that...
  • gooner SERLists.com
    Hey @donchino - I used 7.51 as the example because everyone seems to like how it runs.

    I see your point about different types of users. But I haven't changed the way I do things, and I see higher use of resources. So 'something' aside from my settings, methods etc. is causing this.

    I'm not saying it's the extra features that are the problem, because I really have no idea.
    But there is a real problem that many people are experiencing, and I think everyone would have a better user experience if it could be identified and fixed.

    Also, I agree Sven does a great job :)
  • @gooner, how many projects per server do you run, and can you include what scheduler timings you use (if you still do)? I find that 50 projects per server is optimum (for me, at least), on 25 x 20 mins.

    Just going to swim against the current, but my copies of SER are still completely fine, and have been since February, when I felt that I'd nailed the optimisation side of running it.

    I have four dedis, all running well within their limits and not above 1 GB RAM. Only proxies are my downfall: 100 dedicated proxies, with 30% failing way too often. I am hesitant to buy more as I've started scraping my own lists with Gscraper (I haven't started posting from them yet, so everything I'm describing here is SER searching and posting).

    Maybe I don't get the results everyone else does, but rankings and conversions are all I care about and they're following suit. I'm in some horrendous niches; 75% are national (UK/US), 25% are global (no local) and doing very well. I'm not huge on mega spam, although I do some, but in the same breath I don't use the PR filter either.


  • edited May 2014
    Agreed with gooner. I remember SER being a lightweight app that I used to run on a dual-core setup.

    Right now, my hexacore SSD box chokes at 300 threads and is not able to post much. CPU usage swings wildly between 100% and 1%, and at some point the box freezes. SER is reaching a stage where I am not even able to tell what's new in an update.
  • "A programmer is his own worst enemy" - everyprogrammerthateverdevelopedanseotool
  • gooner SERLists.com
    @judderman - Thanks for the info. I remember we had a conversation about this a few months back and now it seems you've taken the optimization to a whole new level. Good job mate :)

    I am using 120 projects per server, so that could be an issue. I'm also running from lists, not scraping, so that could be a potential cause too.

    50 projects per server sounds reasonable, but it's not a lot for a dedi really. Or to put it another way, I would need another 4 servers. I might need to employ someone just to monitor all of my servers! haha

    I remember when we spoke about this before, I was running 120 projects with no scheduler... That's the version of SER I want back!
  • Maybe Sven should reconsider a 64-bit version, which could help a lot.
  • gooner SERLists.com
    @meph - I would love that to happen!
  • About five years down the road I expect SER v24.98 to be able to post to every other platform ever written with a near-100% success rate, at which point Matt says "fuck it" and kills link building once and for all.
  • Even if I up my scheduler to 30 projects I get a grey box around the Active status (i.e. N/A), so it's just too much for SER no matter how many proxies I use or how much I lengthen the settings. 25 projects is the max for me (scraping and posting).

    Before, when you were running 120 projects at one time, were they all actually being posted to at the same time, or did you get Active (N/A) when you hovered the mouse over the project?

    I spent way too many hours and days looking at every file and every setting, trying my hardest to slim SER down. I still have another few areas I need to tweak and test. Some servers give me 30 LPM, some give me 100 LPM, but I have varying junk/indexing tiers and varying contextual-only projects, so I don't mind the lower number.

    Have you tried running 120 projects again with very few threads?
    Do you use the same proxies across all servers? Maybe that's a problem? Different locations/logins for proxies running at the same time, causing a delay or problem? Maybe run one server at a time and see if there is a difference? Then you might need to buy separate batches of proxies per server (doh, hope not).

    50 projects per server is very few for me too (I once had 500 per server), but it seems to be the only way (for me) at the moment. Costly too.
  • gooner SERLists.com
    lol @spammasta - It's only a matter of time. I hope everyone has their plan B in place.
    Still, until then we may as well milk it for all it's worth.
  • @meph - yeah, +1 from me also. That's the best way to get both quality links and quantity... everyone is happy.
  • gooner SERLists.com
    edited May 2014
    @judderman - I actually have 50 private proxies dedicated to only one server, so yeah, it's costing a small fortune in proxies alone.

    At the moment, if I run all 120 projects SER does pretty much nothing. LPM drops to 20-30 (no matter how many threads), the log lags all the time, and sooner or later out of memory pops up.

    On the scheduler, I can run 30 projects for 10 minutes and get 150 LPM. But CPU usage is off the charts.
    I know I shouldn't be complaining about 150 LPM, but 4-5 months ago the same methods saw 300-400 LPM. Verifieds per day have dropped from 250,000 to 150,000.

    I'm like you: I look at every file, every detail and try to tweak it, but I'm starting to feel like I'm fighting a losing battle.

  • Ah OK, cool. I have run out of ideas then... You'd think that running from lists, SER would be lightning quick no matter what... unless it's the types of sites/footprints that cause a lag? You guys have obviously found the 'right' footprints to scrape, but maybe they are difficult, or getting more difficult, to post to - captchas becoming harder, captcha solvers struggling?

    http://www.gsa-online.de/download/15420-24_change.log anything in there that might cause issues? I know Sven makes changes for the better and must get a little annoyed at some people wanting changes and others wanting it left the same. I did see an improvement from 8.21 or 24 but not as much as other people were banging on about. 
  • gooner SERLists.com
    @judderman - It used to be very quick with lists. 8.23 is the last version that works well for me. It doesn't seem to have the CPU usage issues as much. But the speed is still not that great.

    The servers run pretty much all engine types, the same as they always did anyway.

    Maybe I'll go back to letting SER scrape! :)
  • steelbone Outside of Boston
    I agree with gooner... I had no choice but to get a VPS.
  • You can try pulling the content from a folder with a macro; I think that will help.
  • gooner SERLists.com
    @richardbesson - Thanks, I'll give it a shot. Do you do that with just the articles, or also the titles, descriptions etc.?
  • Trevor_Bandura 267,647 NEW GSA SER Verified List
    The low LPM and low verifieds have nothing to do with SER, but with the proxies that everyone is using.

    Almost all SER users are using the same proxy providers and pretty much all IP blocks have been spammed so hard already that they are all blacklisted. Nothing we can do about that.

    Just to prove that I'm right, take any IP from your list and search for it on Google; you'll see all the spam entries that IP has.

    Now about the CPU and memory usage: I can't really complain too much, as 8.31 is actually running pretty well for me. But then again, I'm not running nearly as many projects as many other users are.
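The manual check Trevor describes (pasting each IP into Google or SpamCop) can be scripted with a DNS blacklist lookup. A minimal sketch in Python, assuming the public bl.spamcop.net DNSBL zone; the function names are illustrative, not from any tool mentioned in this thread:

```python
# Check proxy IPs against a DNS blacklist (DNSBL). A DNSBL lookup
# reverses the IP's octets and queries them as a hostname under the
# blacklist zone; an answer means the IP is listed.
import socket

def dnsbl_query_name(ip: str, zone: str = "bl.spamcop.net") -> str:
    """Build the reversed-octet hostname used for a DNSBL lookup."""
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip: str, zone: str = "bl.spamcop.net") -> bool:
    """True if the DNSBL resolves the query name (i.e. the IP is listed)."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True   # any A record returned means the IP is on the list
    except socket.gaierror:
        return False  # NXDOMAIN: not listed (or the lookup failed)

if __name__ == "__main__":
    # IP taken from the SpamCop example link in this thread
    for proxy_ip in ["1.34.125.162"]:
        print(proxy_ip, "listed" if is_listed(proxy_ip) else "clean")
```

Feeding a whole proxy list through `is_listed` would answer Kaine's later question about bulk-checking; zones such as zen.spamhaus.org could be queried the same way.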
  • gooner SERLists.com
    Interesting @Trevor_Bandura - You could be onto something there. Let's try some new proxies.
    Thanks for the tip.
  • @Trevor_Bandura, any success using a VPN service or something?
  • Kaine thebestindexer.com
    edited May 2014
    @Trevor_Bandura

    Nice, and surely true for many, but did 7.51 verify more? The proxies are the same. SER's problem is only in verification, and if verification sucks, you don't get backlinks. Submission and verification are the only things that we really want.

    The rest has value only if those work well first.


  • Brandon Reputation Management Pro
    Interesting observation, @trevor_bandura. I just checked a bunch of my proxies and they're all listed...
  • Kaine thebestindexer.com
    edited May 2014
    I just generated a new list of sources to see.

    @Brandon, where do you check whether a URL is blacklisted?

    Is it possible to check a whole list of proxies? What's the best tool for that?
  • Brandon Reputation Management Pro
    @kaine - just copy and paste the IP into Google. You'll see Project Honey Pot and stuff like that... clearly spam databases.
  • Kaine thebestindexer.com
    edited May 2014
    Yes :)

    I'm looking for a good database to build a bot with. If we must use Google, then we must use proxies for it.

    With this, for example, it's easy and quick: http://www.spamcop.net/w3m?action=checkblock&ip=1.34.125.162

  • Kaine thebestindexer.com
    edited May 2014
    For those who want to try it: checking proxies against SpamCop (keeping only the ones that pass) weeds a few out.

    Zenno bot:

    ***DELETED**** send me a PM

    Just test it with good proxies.

    EDIT

    Of 798 proxies tested, 52 are listed.
    IPs listed in SpamCop are also listed in the StopForumSpam database.

    But with the sources we all use, the numbers listed must increase proportionately.
  • edited May 2014
  • I think the proxy thing is a stretch. If that were the case, why would it have only just started happening recently? I was getting a steady 200 LPM just the other month with the same old proxies (not the exact same ones of course, they've been refreshed, but the same provider), so I very much doubt it's that.

    I'm not complaining about the verified speeds really, because indexing is my main elephant in the room. :P

    I do question some of the additions though, such as the pictures by each project. That seems like 'bloatware' to me.

    I think Sven should only take suggestions from the big members, such as ron, gooner and other power users, not any old n00b who signs up and makes a post :P
  • Here is one piece of advice for Sven that would help all the churn-and-burn guys out there.

    Please make a standalone GSA SER version that only uses proxies and posts to imported lists. Remove all searching features, logging features, everything else except using proxies and posting to imported links. It would be a HUGE help for all of us who use GSA just to post to already-made lists, and it's probably not a huge job for you. Maybe you could even charge an extra $10-20 USD for that standalone version; nobody would complain.
  • gooner SERLists.com
    Thanks everyone for all your comments and suggestions.

    @dariobl - I would like to see either your suggestion or a 64-bit version. I think both options would solve a lot of these problems.
