
Where's My Bottleneck?

Here is a top-level view of my projects:

http://prntscr.com/3i6hxh

Submission Tab

Even when I set threads to 2000, it only uses 45% of the CPU at most.
I saw in a few videos that an HTML timeout of 140 is decent.

When I first started I used private proxies for everything. I'm not sure if I should do that or keep it like I have it; it makes sense to only use them for submission. I have 10 semi-private proxies. Do people use private proxies for everything, or is it always a mix?

I kept getting a lot of "download failed" or "IP seems to be blocked on blahblah" messages, meaning it was a bad public proxy. Where do I find good public proxies? I tried searching for sites to add, but after it added 80K, I only had about 500 that were good.
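For anyone wondering how that 80K-to-500 attrition happens: scraped public proxy lists are mostly dead on arrival, and the only fix is an aggressive liveness test before you ever hand them to SER. Here's a minimal sketch of that kind of pre-filter, assuming plain HTTP proxies in ip:port format, one per line (the file names and test URL are just placeholders):

```python
# Pre-filter a scraped public proxy list by liveness before
# importing into SER. Sketch only: assumes "ip:port" HTTP proxies,
# one per line; file names and test URL are placeholders.
import concurrent.futures
import requests

TEST_URL = "http://www.bing.com"  # any stable page works
TIMEOUT = 10                      # strict on purpose; slow = useless

def is_alive(proxy: str) -> bool:
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        return requests.get(TEST_URL, proxies=proxies,
                            timeout=TIMEOUT).status_code == 200
    except requests.RequestException:
        return False

with open("scraped_proxies.txt") as f:
    candidates = [line.strip() for line in f if line.strip()]

with concurrent.futures.ThreadPoolExecutor(max_workers=100) as pool:
    alive = [p for p, ok in zip(candidates, pool.map(is_alive, candidates)) if ok]

print(f"{len(alive)} of {len(candidates)} proxies responded")
with open("good_proxies.txt", "w") as f:
    f.write("\n".join(alive))
```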

http://prntscr.com/3i6j9o

The Data Tab

http://prntscr.com/3i6ic9

The Options Tab

I tried enabling and disabling "Send verified link to indexer services" to no avail. Only English search engines are enabled (Google, Yahoo, MSN) to reduce duplicate search results.

http://prntscr.com/3i6ih2

Options Screenshot 2

PR 0 and above then OBL 100

http://prntscr.com/3i6it6

For email verification I just selected:

- Create New
- Then 25 Mailcatch.com

I don't have an AA list enabled; it's just searching based on keywords. I didn't think my search filters were that strict, but maybe they are? It's hard to believe that an AA list would improve my LPM that much. I wanted to get niche-relevant backlinks, not just throw up links anywhere.

I'm open to thoughts, ideas, and suggestions.

Comments

  • spunko2010 Isle of Man
    edited May 2014
    You need more proxies. I'd go for at least 20 more if you can afford it. I'm running similar specs to you and I have 100+ proxies.

    Also mailcatch and those types of services are OK for T2, but you are going to come up against huge spam-filter hurdles for T1. Look around on here or other forums for bulk Yahoo/Gmail accounts. I buy them from Kelvin on here for $6/1000 accounts.
  • edited May 2014
    Thanks for the heads-up on the Gmail accounts. I was wondering why everyone was buying accounts when they were being made for free in GSA.

    You really think it's the proxies, huh? Do you just use your private proxies for everything, or do you still mix in the free ones too? I'm not really sure if I have it set up right. I figured only using them for submission made the most sense.

    I can't really justify paying $100 a month for semi-private proxies just for posting backlinks when I'm not really making any money with GSA. I could probably swing $20 a month, though, just to see if it does in fact increase my LPM. What type of LPM on average are you getting with PR0, OBL 100, and 100 proxies?
  • spunko2010 Isle of Man
    Hello

    I've got several licenses of SER though, so you could probably make do with a lot less. I would try with as many as you can afford, you can always cancel after 1 month.
  • It's not possible to get fast public proxies for posting unless you know what you are doing and you source the public proxies yourself, not by scraping other sites. Buy 5-10 private proxies from buyproxies and check how it goes (for posting). You would also need a solution for scraping, and that's where the problem is.
  • One of the settings that really slowed down my SER was PR checking - so I don't use it anymore. I noticed you're not using proxies for PR checking. Google will ban your real IP, and you'll not be able to retrieve the PR value. Without the PR, SER will not submit since you set your filters to PR=0 or above...
  • Thanks for letting me know public proxies for posting don't exist, satyr85. I already have 10 semi-private proxies for submissions (which I guess is posting), which is why I don't understand why it's so slow. Apparently it's the number of proxies I have and no AA list.

    Good thinking, Olve1954. I was wondering why I was getting 4 LPM and then it dropped down to 0 for a while.
  • Public proxies for posting exist, but it's very, very hard to find them.
  • ron SERLists.com
    edited May 2014
    @silverdot - I'm not sure where to begin because there are so many things I would change:
    • You need at least 30 semi-private proxies and completely abandon public proxies. If you don't, then your results will always suck.
    • Your threads are way out of line. I can run 1000 threads on 30 proxies and get 300-400 LPM: [screenshot]
    • I do have verification turned off here, so the numbers are artificially low in the above screenshot.
    • I use a list. Obviously that changes the entire ballgame. But you never showed keywords. When I had SER scrape, I used a file of 100,000 generic broad keywords. How many are you using?
    • You checked off Web2.0. Why? You can't make any links on that platform unless you use SerEngines, which is a paid service. If you don't, you will not only make zero links with that platform, but your link productivity will be cut by a massive percentage, something like -80%.
    • If you are running SEO Indexer at the same time on the same PC, it will literally suck the oxygen out of your internet connection. It will greatly decrease SER performance. If you must run it, run it separately while SER is not running.
    • Check "Try to always place an anchor text..." You are losing way too many links by not having this checked.
    • The filters are killing you. Absolutely killing you. Why bother? Just get rid of them.
    I want you to do something to prove something to yourself:
    • Create 1 dummy project where you make Bing.com or some search engine (not Google) the URL
    • Get 30 semi private proxies
    • Turn off all your real projects
    • Turn on all decent engines like forum, blog comments, image comments, guestbook, microblog, trackback, article, social network and wiki
    • Stick in a set of 100,000 generic keywords
    • Turn off all filters
    • Change the threads to 500
    • Uncheck link limits so it makes unlimited links
    • Make sure SEO Indexer is turned off in all projects and not running on your desktop
    • After this project is tuned up 100% and perfect, duplicate it 9 times so you have 10 beastly projects
    • Then run the bastard.
    What you will see is how fast you can really run with your equipment, and with scraping (as opposed to a list). You should be able to easily get 100 LPM. If you can't, it may be your PC/VPS, or your internet connection. But you have to first learn how to turn off the throttle on SER so you understand how it can perform in a pure unrestricted state.

    Lastly, you are running contextual projects. Don't expect to knock it dead when running contextuals. They can slow you down a lot. They have to create an account, wait for approval, blah blah blah. Use your head. They can't possibly run as fast as other platforms.

    Ron
  • Ron this is exactly the type of advice I was looking for. You should have charged me for that post. Thank you so much! I've been working so hard to try and figure this stuff out. I'm going to make these changes and then post back with the update on my LPM.

    Restoring hope to the hopeless my friend.
  • edited May 2014
    Sorry to interfere, guys...

    • The filters are killing you. Absolutely killing you. Why bother? Just get rid of them.

    But @ron, aren't you interested in whether GSA makes a link on a target that has the word "fu**" in the domain?

    Does this make a difference at different tiers or not?
  • ron SERLists.com
    @justice - It depends on your philosophy. I was/am able to rank for a lot of stuff with that filter turned off.

    I think the important thing is for the OP to understand how the system can run without restrictions, sort of in a naked state, and then back into whatever filter(s) he chooses. I have nothing against the bad words filter at all, but I do have a big issue with the OBL and PR filters, as they will cripple the number of links you can make.
  • edited May 2014
    @ron how many LPM do you get when you have re-verify every day set to on, or are the stats above of 300-400 with it on?
  • @ron Ok, my bad... You were referring to the OBL and PR filters and not the bad words filter...

    :)
  • ron SERLists.com
    edited May 2014
    @PeterParker - I am running v8.38 with verification off, and hitting speeds of 250 - 400 LPM with 30 buyproxies. @sven has obviously improved GSA-SER in a manner that I have never seen before. Frankly, it is amazing.

    I have been so mesmerized by the speed that I have been running posting without verification just because I am getting a woodie watching the LPM. Granted it is childish, but I love fast posting. I then take a break at around the halfway point and run verification, which is lightning fast. I know...it is a dumb way to do it. But I was suffering with previous versions, so I am enjoying all of this like a fine bottle of wine...I'm just playing with SER atm.

    So don't pay attention to me. I am just having fun. But at the end of the day, I am running verification, just not the normal way that I typically would.
  • @ron useful info,

    Do you have any idea of what speeds would be like with verification turned on?
  • @ron that's what I do :) It's far faster and running Active V once a day for 30-45 minutes is more than enough time. More control, more LPM, more verifieds, more rankings, more monies, more smilies :):):)
  • @Ron @judderman you guys both run multiple servers, don't you? How do you run a Verified once a day? Is it a case of simply logging into each server individually and setting Active to Active (V)?
  • ron SERLists.com
    edited May 2014
    @PeterParker and @davbel - Yes, exactly the same as @JudderMan says. I let it run for 12 - 18 hours on Active, then just flip the projects to Active(V) for about 45 minutes. When it goes down to 1 or zero threads, I hit stop, wait for everything to stop, and then hit start again to keep the Active(V) going. For whatever reason it starts hammering out a lot more verifieds out of the mix. Then after another 45 minutes I hit stop, change to active, and let it run.

    All my projects are disabled for verification. Despite that, after 200,000 submissions, I still have 50,000 verified while having verification disabled. So once I turn it on to Active(V), I get like another 100,000 verified or something like that.
  • @ron I'm confused by your statement above about 'getting 100k more verified'. We are talking about RE-verifying, aren't we, not the initial verify?

    SER will always show what verifieds it has made so far to the right of each project, won't it? I didn't know there was a way to turn off verification completely. I presumed we were just talking about re-verification here.
  • @Ron I've noticed something strange about SER verifying when it shouldn't.

    I normally only verify projects that have other projects pointing to them, so I usually set each one to verify once a day, but the verification never happens *only* once per day.

    It seems that once it starts verifying the project, it is constantly dipping in and out of verification even though it's only set to verify once every 1440 mins.

  • @PeterParker Project Options -> When To Verify -> Never / Disabled
  • @davbel cheers. I'd seen it but never messed with it.
  • edited May 2014
    Do you guys know how to turn off re-verify on all projects using 'edit only one option'? I was just looking in the drop-down list and it has re-verify options, but none that would turn off re-verification. I tried each one, and the only related one was how many minutes, but that presupposes that you DO want to re-verify, i.e. that it is checked. When I put 0 in there it just defaulted to 1 minute, and when I checked the projects, all of them were re-verifying :(
  • I change this option very often:
    Select all projects -> right click on them -> Modify project -> Edit single option for all -> select "verify links" -> and then type "0" (0 is off, 1 is for automatic, 2 is for custom time)
  • Yes, but what about RE-verify? That's what I want to turn off at the moment.
  • ron SERLists.com
    @PeterParker - I would NEVER use the edit only one option - that is how all my projects got fuc**d up. I would use Notepad++ to do it. 3 minutes. I have had a terrible history using that edit feature, as well-intentioned as it is supposed to be. 
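    For anyone who'd rather script Ron's Notepad++ approach, it's essentially a bulk find-and-replace across SER's project files. A minimal sketch of the same idea is below; the projects folder path and the "reverify time" key name are assumptions, so open one .prj file in a text editor first to confirm what your version actually calls the setting, back up the folder, and make sure SER is closed:

```python
# Bulk-edit SER project files to disable re-verification - the
# scripted version of a Notepad++ find-and-replace. Sketch only:
# the folder path, the key name, and the "0 = off" value are all
# assumptions. Verify them against one of your own .prj files,
# back up the folder, and close SER before running.
import os
import re

# Hypothetical default location of SER's project files.
PROJECTS_DIR = os.path.expandvars(
    r"%APPDATA%\GSA Search Engine Ranker\projects")

# Hypothetical INI-style "key=value" line controlling re-verification.
PATTERN = re.compile(r"^(reverify time)=.*$", re.MULTILINE)

for name in os.listdir(PROJECTS_DIR):
    if not name.endswith(".prj"):
        continue
    path = os.path.join(PROJECTS_DIR, name)
    with open(path, encoding="utf-8", errors="ignore") as f:
        text = f.read()
    new_text, hits = PATTERN.subn(r"\1=0", text)  # 0 = off (assumed)
    if hits:
        with open(path, "w", encoding="utf-8") as f:
            f.write(new_text)
        print(f"updated {name}")
```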
  • ron SERLists.com
    @davbel - Yes, I agree. It does seem to slip into verification multiple times per day, and it seems to slow down link building a lot. That is why I am doing a manual verification. I just am getting a lot more links that way. 
  • edited May 2014
    Ah, I see sven has added VPM and DPM to the LPM counter. Nice!

    Well, this sucks... I just turned off verification on all projects and saw no noticeable change in LPM. Well, it doesn't suck; it's good in that I can keep it turned on in that case :D
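    For anyone new to those counters, they're just running counts divided by elapsed runtime; here's a trivial sketch of the arithmetic (what exactly SER counts for each of LPM, VPM, and DPM is its own business):

```python
# Trivial sketch of a per-minute rate counter like SER's LpM/VpM:
# a running count divided by elapsed runtime in minutes.
import time

class RateCounter:
    def __init__(self) -> None:
        self.start = time.time()
        self.count = 0

    def hit(self, n: int = 1) -> None:
        self.count += n

    def per_minute(self) -> float:
        minutes = (time.time() - self.start) / 60
        return self.count / minutes if minutes > 0 else 0.0

submitted = RateCounter()  # would drive an LPM-style readout
verified = RateCounter()   # would drive a VPM-style readout
# e.g. 3000 submissions over 30 minutes -> 3000 / 30 = 100 LPM
```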
  • @ron I don't see any issues with edit only one option, but I got fucked up royally with the edit only engines/options.
  • @ron "Then run the bastard"  3:-O That my friends is 'proper advice' right there. If you read one thing on this forum, make it that post.

    I have been manually verifying for over a year - first thing in the morning so I get a good vibe (stonker) for the day. I have found it to be more effective as SER concentrates on one job at a time. Don't forget to ramp up the threads when in verify mode!
  • @ron, @brumnick and others - do you use proxies for verification or go without them?
  • ron SERLists.com
    @RayBan - Use proxies for everything.
  • edited May 2014
    Brumnick said "...Don't forget to ramp up the threads...". If I am using 30 semi-dedicated proxies, do you think 300 threads is good while doing manual verification? :-?
  • @ron, would you really use proxies for verification and identification? Currently I'd never use proxies for these, but I'm curious to know your reason.
  • As for me, I don't use proxies for identification, but for verification - yes. Imagine verifying hundreds (or thousands) of URLs from the same domain using your real IP; your IP may be flagged and reported for spam/abuse...
  • ron SERLists.com
    @fakenickhal - It really wasn't so much that I had a reason, it was more I didn't see a reason not to use them.
  • I see @ron, my reason for not using proxies for verification is that I don't want my proxies to slow me down even a fraction of an inch whenever possible. I appreciate the answer; you just had me curious, as I thought everyone was doing the same as me.
  • @davbel yeah dude I have all of my RDPs pinned to my taskbar and just highlight the projects and Active (V). Again, as mentioned above, you can run f**k loads of threads when you're just verifying, and 100+ projects at the same time. Some of my dedis I've limited to 50 projects per server, so they verify really quickly, then it's back to submissions after it's done (usually after I've been to the gym LOL).
  • @ron where is the link to where you explain how to edit projects with Notepad++? I thought it was this thread but can't see it now; it must have been another one and I forgot which. How do you do it en masse?
  • ron SERLists.com
    @PeterParker - It's on SERLists.com as a subscriber. I think you have joined and unsubscribed, etc. If you did, just join back in there, and a list of all the tutorials will be given to you with links.
  • edited May 2014
    @ron I started taking your advice and scraped the forums for AA lists. I got about 20 million URLs and added them to my GSA verified list only. Then I removed duplicate URLs and duplicate domains, and ran the other option to clean the list. I've been uploading lists for about 3 days straight now.
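    (For reference, the URL/domain dedup step is simple enough to do outside SER too. A minimal sketch, assuming one URL per line in a text file; the file names are made up, and it keeps the first URL seen per domain:)

```python
# Dedupe a scraped URL list by exact URL, then by domain (keeping
# the first URL seen for each domain). Assumes one URL per line;
# file names are placeholders. Holding 20M entries in sets wants
# a machine with RAM to spare.
from urllib.parse import urlparse

seen_urls, seen_domains, kept = set(), set(), []

with open("scraped_list.txt", encoding="utf-8", errors="ignore") as f:
    for line in f:
        url = line.strip()
        if not url or url in seen_urls:
            continue
        seen_urls.add(url)
        domain = urlparse(url).netloc.lower()
        if not domain or domain in seen_domains:
            continue
        seen_domains.add(domain)
        kept.append(url)

with open("deduped.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(kept))

print(f"kept {len(kept)} of {len(seen_urls)} unique URLs")
```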

    Found an 870K keyword list and use %spintax% in the keywords field to randomly choose keywords from the file.
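    (In case anyone's unfamiliar with spin syntax: each {a|b|c} group resolves to one randomly chosen alternative. A minimal resolver sketch below, just to illustrate the format - how SER itself expands its macros may differ:)

```python
# Minimal spintax resolver: each {a|b|c} group becomes one randomly
# chosen alternative. Nested groups are handled by always resolving
# the innermost group first. Illustration of the format only.
import random
import re

INNERMOST = re.compile(r"\{([^{}]*)\}")

def spin(text: str) -> str:
    while True:
        match = INNERMOST.search(text)
        if match is None:
            return text
        choice = random.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

print(spin("{best|top|cheap} {dog|cat} {food|toys|beds}"))
# e.g. "top cat food"
```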

    Turned verification off

    Turned off Web2.0

    Turned off indexing

    Checked "try to always place an anchor text"

    Changed filters to PR 0 and OBL 100

    I kept the bad words filters though.

    I got a charity handout of 30 Yahoo emails, which was a huge help!

    My question is: they were all in .txt format. I added from file and chose "identify platform and sort in". Will this add the URLs to my verified list? How does GSA know which list to add them to?

    To verify, I guess I just set the project to "Active (V)", but then where do I get the links to add to SEO Indexer?

    I ran a campaign and my LPM went from 4 to 10. The only other variable is the proxies. I need to buy more.

    What else am I missing?
  • lol I'm kind of in the same position as you now; I've been trying things people have said and can't get anything steady above 40 LPM, whereas I used to sit pretty at 200 :(

    Looks like I'm back in the land of the mortals.

    Have you filtered for only high-performance platforms?
  • I've only had GSA about a month, so I haven't made enough posts to actually do that yet. I plan on it once I run enough numbers to get some solid statistics.
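    When you do get the numbers, the idea is just to rank engines by their verified-to-submitted ratio and keep the top performers. A sketch of that arithmetic, assuming you've exported per-engine counts to a CSV yourself (the file and column names here are made up):

```python
# Rank engines by verified/submitted ratio to spot the high
# performers. Assumes a self-exported CSV with made-up column
# names: engine,submitted,verified.
import csv

rows = []
with open("engine_stats.csv", newline="") as f:
    for row in csv.DictReader(f):
        submitted = int(row["submitted"])
        verified = int(row["verified"])
        if submitted >= 100:  # want a decent sample before judging
            rows.append((verified / submitted, row["engine"]))

for ratio, engine in sorted(rows, reverse=True):
    print(f"{engine:30s} {ratio:6.1%}")
```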
  • ron SERLists.com
    edited May 2014

    @silverdot - I would just import the target URLs into a project (like 30 projects that are all clones) with all platforms turned on, and definitely leave verification turned on for automatic 1440.

    This scraping and processing you are doing is best done with dedis: one for scraping and one for processing the targets through SER projects to create the verified file. You need about 10 emails per project.

    It sounds like you picked up some old lists. Never import them into verified. Huge mistake. Most targets are probably dead. The only way to find the gems is to process them through projects as I described above. I would create those 30 cloned projects with fake URLs to post to. In other words, I wouldn't even waste my time actually trying to post them to my real projects. What comes out as verified from all that processing should be used for your real projects.

    In other words, you are mixing a very inefficient process with your real projects. Processing massive lists to find a relatively few legitimate targets is a painful and slow process. It really should be separated from running your own real projects. Yes, it is expensive to do it the right way, but if you want to take that route, then a separate server with a second copy of SER + CB is the right way to do it.

    You basically have 3 options: 1) Have SER scrape for targets and directly feed your real projects (which is what most people do, and how I did it for a year and a half - and did just fine); or 2) set up a separate scraping server to find targets and feed a separate processing server to run those targets through SER - and then take the product and use on your 'real projects' server; or 3) buy lists.

    I feel that you are mixing approaches, and that is holding you back from speed and success.

  • edited May 2014
    SUCCESS!!!

    I went from about 4 LPM to 14 LPM

    http://prntscr.com/3jdi36

    I think with 10 proxies this is as fast as I'm going to get. 

    Here's what I did differently. 

    I loaded an AA list directly into the project. It was an older one, but it's all I've got right now: about 500K URLs from April that I got from a friend.

    I changed my proxy settings to have everything on private except search engines. I kept those on public with a 60-sec timeout. With an AA list I don't think this setting matters, because I'm giving it the target URLs.


    I kept "verified link must be exact URL" on, but disabled verification. I also kept the indexer on because I don't want to have to copy all of those URLs and put them in the indexer myself. (I don't know how; show URLs > Verified > export?)

    Q: I'm assuming that the links that do get posted and verified will be added to my verified list?

    Q: On the Submission section what does the "Skip for identification" tick mark do?

    Q: What do the green and yellow highlights mean on the right-hand side? Submitted / Verified?


    (Sidenote: I'm totally making this thread into an ebook and giving Ron all the profits LOL!) 
  • FYI, I have 100 LPM with 10 proxies and had 200 no problem, so it's not the proxies to blame.
  • What do you think my problem is then?
  • @ron @judderman what do you do about re-verifications, or do you not bother?

    Also, I've noticed that when running SER like this, after a few hours it tends to slow down for no apparent reason. Do either of you get that?

    I've just been stopping and restarting, which seems to fix it.
  • Once a week for re-verifications for me. Just to check. Hardly any die-off in my experience. 

    A few hours? I guess it's running through more attempts or parsing emails then? I just run it for 30-45 minutes and run all projects at the same time. It runs at 2 GB RAM, and when I see more than 90% of the "all links verified once" error message, I switch to Active again.
  • Sorry @judderman, I meant when it's posting links it slows down.

    It'll run for a couple of hours hovering at its setting of 1250 threads and post tens of thousands of submissions, but then it'll slow right down to 200-500 threads, and the number of subs slows too, sometimes to just a few hundred.

    It still has lots of targets to post to, and as soon as I stop and restart it speeds back up, but it just seems to get stuck after a few hours.

    I'm guessing it doesn't happen to you or the others and it's something unique to my setup.
  • ron SERLists.com

    @davbel - I've got into a routine now where I have verification disabled on all projects, and then switch to setting Active (V) later in the day for a couple of hours to clear it all out. And then flip it back to Active.

    I also clear cache and reimport the targets once per day. That helps it to run fast and with plenty of targets.

  • @ron are you reimporting the targets into projects or into options -> Import Site Lists?

    It appears I'm not the only one with the slowdown - there's another thread on the forums with users getting the same slowdown issue while SER is posting and then remedying it in the same way with a stop/start.

    They've asked @sven for a feature for SER to restart automatically every 2 hours.
  • ron SERLists.com
    edited May 2014
    @davbel - I am importing target URLs>sitelists. Under older versions of SER, we ran into issues where importing straight into projects was causing a RAM/CPU bottleneck. But since v8.38, importing directly has been awesome.

    I still set projects to sitelist 'on' with identify (which is where I store my latest list). So that is a safety net, if you will, if targets run out. I typically purge cache and re-import the same list daily. And with fantastic results.

    By the way, I am still using v8.38, fyi.
  • davbel UK
    edited May 2014
    Hmmmm

    @ron how many projects are you running when submitting? Are you using the scheduler?

    And you are importing the list into every project, aren't you?
  • ron SERLists.com
    I'm running 150 projects simultaneously, no scheduler. When I import, I do them as a group. By group, I mean that all projects have exactly the same engines checked (otherwise the import gets screwed up and various engines will not import). So I do all contextual-only projects together as a group (all highlighted together), and all junk tier projects together, etc.
  • @ron that's something I was thinking about in reference to importing lists. I imported the AA list directly into the project but only selected dofollow contextual platforms. Most of the SER diagrams I've seen show the tiers being built with secondary nofollow links.

    Point being: should I import the same AA list to the secondary links as well?

    Or

    Am I missing out on possible targets for platforms because I only select dofollow contextual engines?

    [screenshot]
  • davbel UK
    edited May 2014
    Double Hmmmm in that case :((

    Normally I've been running about 60-ish projects, no scheduler, at 1250 threads, and SER has been at about 30-50% CPU and 1.3-1.7 GB memory.

    However, when I import the site list into all the projects, SER runs out of memory within a few minutes and struggles to run more than 10-15 projects without 100% CPU.

    This is how I had been doing it:
    1. Selecting all projects
    2. right clicking
    3. Import Target Urls -> From Site List
    4. Selecting the correct sitelist
    5. Then clicking "Yes" from the Automatically choose URLs from engines blah blah pop up
    But based on what you say @ron, what I should be doing is:
    1. Selecting alike projects, i.e. only contextuals or only junk
    2. right click
    3. Import Target Urls -> From Site List
    4. Selecting the correct sitelist
    5. Then clicking "No" from the Automatically choose blah blah popup
    6. Select only the relevant engine types from the Choose Files window

    Correct?

  • ron SERLists.com
    @davbel - That is exactly how I do it. Especially the "No" part on #5. I don't like what it does at all if I were to click "Yes". I'm glad you wrote that out. I hope people are paying attention.
  • I'm paying attention very closely. I was reading your reply the second you typed it out. Literally. This thread is becoming epic right now.
  • ron SERLists.com
    @silverdot - Follow the exact 6 steps as @davbel mentioned. Yes, you are using the same sitelist to supply targets to all projects at all levels of your tiers.