  • @m1xf1 yep. Do Options --> Advanced --> Tools --> Export site lists --> Verified and it'll create a nice handy *.sl file for you to import the same way you exported. It merges with any existing lists you have too.
  • Thanks to this discussion I changed my proxies from public to private and removed a lot of engines with a small percentage of verified links, and now I'm at 50% verified!

    Incredible!
    Thanks to Sven, LeeG, Ron and the others!
  • Wow guys, went from like 1-2 LpM for the past month or so to over 100 LpM almost overnight!

    Spent about an hour or so implementing all the changes on this thread, so many thanks to all the GSA gurus. Just a few changes made all the difference, here are some of the things I did:

    - Reduced search engines to Google only, 5 selected
    - Deselected all the engines that had less than 10 verified links built, and applied that to all my campaigns
    - Using 10 semi-private proxies from buyproxies
    - In project options, I have 'use keywords to find sites' checked, and also 'use global site list' with only the 'submitted' option ticked.

    I'm running GSA on Berman Hosting (the middle package) plus CB.

    I think culling the engines that didn't perform took me from 50 LpM to over 100 LpM, and it wasn't that extensive either.
  • congrats @weeza :)

    but did I understand you correctly that you have 'always use keywords to find sites' checked? That is not recommended, as it will lower your LpM in the long run because the SEs won't find as many sites as they do when you leave that option unchecked.
  • @Ozz Thanks, I will give that a go. But yes, at the moment I have that option checked. How do the SEs find sites to post on then, if not by using the keywords?

    I don't fully understand how GSA works, just ploughing my way through it.
  • edited February 2013
    Did something change in the last couple of updates? My LpM has been in the range of 30-40 for the last few weeks. The last 24 hours it's been 1-2. I've tried disabling proxies - same thing. My verified links have dried up as well.

    Nothing has been changed in the 20 projects I'm running.
  • Ozz
    edited February 2013
    @weeza, I don't want to bore you with technical things, but in general it works like this:

    - if an engine like "General Blog" searches for new targets, it uses the footprints + your keywords from time to time regardless of that "Always Use Keywords" option. This works for those platforms because there are so many different, niche-related websites for the SEs to find. To make it clear: the script adds your keyword to the footprint automatically, but not all the time.

    - if you search target URLs for "Directories" and do this with a footprint + keywords, you won't get many results at all.

    Example: 

    As you can see, you won't find many article directories with the "footprint + keyword" search term. Sure, if the keyword is just a single general word like "dog", or a broad two-word keyword like "dog training", you will find more targets, but how many of your keywords are (specific) multi-part keywords and how many are just single words or broader terms?
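    To make that concrete, here is a minimal sketch (not SER's actual code; the footprints and keywords are made-up examples) of how a footprint + keyword search query is typically composed, and why appending a specific multi-word keyword to a directory footprint narrows the results so much:

    ```python
    # Minimal sketch of footprint + keyword query building (illustrative only;
    # the footprints and keywords below are made-up examples, not SER's data).

    footprints = {
        "General Blogs": '"powered by wordpress" "leave a comment"',
        "Directories":   '"submit your site" "browse categories"',
    }

    keywords = ["dog", "dog training", "hydroponic tomato growing systems"]

    def build_queries(platform, use_keywords):
        """Combine a platform footprint with keywords (or none) into search queries."""
        base = footprints[platform]
        if not use_keywords:
            return [base]
        # A broad keyword like "dog" still returns plenty of targets, but a very
        # specific multi-word keyword leaves almost nothing for directory footprints.
        return [f'{base} "{kw}"' for kw in keywords]

    for platform in footprints:
        for query in build_queries(platform, use_keywords=True):
            print(platform, "->", query)
    ```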

  • Thank you NocT!  I had a copy running for some time so I have some good site lists in there!  Appreciate that.
  • @Ozz, thanks, sort of see what you mean. I will give it a try when I have time.
  • @noct THANK YOU for this huge list! It helps me A LOT! But somehow some engines perform better for me with my setup than they did for you, like PHP-Nuke or phpBB.
  • Guys, not sure if anyone else has the same rig setup as me. I'm using a MacBook Pro running VirtualBox (a Windows XP VM).

    When I did a speedtest.net run on my Windows VM I noticed my upload speed was VERY low. This is an issue with VirtualBox; I fixed it according to this thread: http://techblog.geeksbrain.net/2012/11/how-i-solved-slow-windows-upload-speed-when-using-virtualbox-nat-or-bridged/

    Got my LpM a kick higher.

    Cheers
  • Thanks for that!

    I'm also using VirtualBox and this will help for sure :)
  • I'm running MB pro, Parallels, Win7 and my speeds are identical. 18 down, 7.4 up
  • Yeah, I think it's an XP issue; it doesn't affect Win7. Cheers.
  • Is it normal for GSA to be running at 2-5 threads with only 1-2 projects running?
    I've tried creating a new project as well as deleting the history on an existing one. Site lists are turned on with thousands of verified links. Projects have been run with and without proxies. Google searches are working fine as well - no banned proxies.

    I've attached a screenshot showing GSA running. Max threads are 50 - but only 2-5 are running. When verifying all 50 threads are used. If I turn on enough projects, all 50 threads will be used.

  • Brandon Reputation Management Pro
    @claus10 this is very normal, more projects will use more threads, not ideal, but normal. I posted about it here: https://forum.gsa-online.de/discussion/2027/not-properly-utilizing-threads#Item_12
  • Brandon - something is definitely wrong. It has created 2 links in 4 hours. It just hangs there, doing nothing. Is there a place where I can download 5.08 or 5.09? It ran much better for me on those.
  • LeeG Eating your first bourne

    rodol shared the 5.08 exe on here yesterday

    https://forum.gsa-online.de/discussion/comment/15219#Comment_15219

  • Thank you LeeG.
  • I am stable at 130 LpM :D
  • edited February 2013
    @LeeG

    I know you own very few sites, as you told me before. How do you avoid Google's penalties while building these huge amounts of links every single day?!

    I know you build links to all your site's internal URLs. But is that safe enough? I'm promoting only one two-year-old site with about 6,000 URLs and I was thinking of trying your strategy, but I want to be as safe as possible, which is why I'm asking. It would be really nice of you to teach us how you hide from Google's eyes, after you've taught us how to build huge amounts of links! :)
  • LeeG Eating your first bourne

    The trick is tier building

    I know in the past it was said on the net that T1 is about 100 links per day

    So if you apply that rule: T1 100 links, T2 1,000 links, T3 10,000, T4 100,000

    Also take into consideration that you're looking at a low confirmed level, 1 in 10 or worse
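    As a quick arithmetic illustration of those numbers (my own sketch, assuming roughly a 10x multiplier per tier and the 1-in-10 confirmed level mentioned above):

    ```python
    # Worked example of the tier sizes described above (roughly 10x per tier)
    # and what a 1-in-10 confirmed level implies for submissions. Illustration only.

    t1_links = 100
    tiers = {f"T{i}": t1_links * 10 ** (i - 1) for i in range(1, 5)}
    # {'T1': 100, 'T2': 1000, 'T3': 10000, 'T4': 100000}

    confirmed_rate = 0.10  # "1 in 10 or worse"
    submissions_needed = {tier: int(links / confirmed_rate) for tier, links in tiers.items()}
    # e.g. roughly 1,000,000 submissions to end up with 100,000 confirmed T4 links

    print(tiers)
    print(submissions_needed)
    ```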

  • ron SERLists.com
    @mamadou - I do 20X on Tier2 and no limits on T3+. The key is to not go crazy on Tier1. I may submit 5/day on a new site, and a few months later do 20/day. Those are very safe levels.
  • For me it's approx 20x on secondary links to Tier 1. Lots of the secondary links die after a few weeks (BComments/IComments/GBooks/..)

    5/day is VERY safe.. what engines do you use there?
  • ron SERLists.com
    Anything that creates an article.
  • @ron @LeeG

    I have so many URLs on my site and I'm targeting totally different keywords with each URL. I tried doing the 20/100-a-day strategy and that yields very, very poor results. Right now I'm doing around 100-200 links a day for each URL on my site. I'm getting good results, but not perfect.

    I was asking you guys because I want to raise it to something like 1,000/day for each URL on my site!!
  • edited February 2013
    Why does my LpM shoot to the sky when the day starts and then begin dropping after a few hours?
    My day started at 130, now it's at 75.
  • It is averaged out from 12am, so it will be more difficult to keep up that speed all day, especially the closer you get to midnight. It's even harder if you only verify once a day and do that in the latter part of the day, as you may spend an hour verifying, and that will skew your overall results for the day.

    Guess you also need to take into account times of day when you may have slower bandwidth, such as after school when all the kids come home and start surfing the internet/playing Xbox Live etc.
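    Since LpM is just links divided by minutes elapsed since midnight, a fast burst early in the day inflates the figure and any pause (like an hour of verifying) drags it back down. A tiny sketch with made-up numbers:

    ```python
    # Sketch of why an "LpM since midnight" average drifts over the day.
    # Numbers are made up for illustration.
    from datetime import timedelta

    def lpm(links_so_far, time_since_midnight):
        minutes = time_since_midnight.total_seconds() / 60
        return links_so_far / minutes

    print(lpm(15_000, timedelta(hours=2)))   # 125.0 LpM after a fast 2-hour burst
    print(lpm(15_000, timedelta(hours=3)))   # 83.3 LpM after an hour spent verifying
    print(lpm(33_000, timedelta(hours=12)))  # 45.8 LpM by noon at a steadier pace
    ```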
  • LeeG Eating your first bourne

    Mine briefly jumped to 413 LpM at midnight here :D

    Looks like clearing the target url cache cured a problem I had

    Back to running a steady 230 > 240 LpM early evening

    See if that cured my problem with 5.16 

    Failing that, I might invest $8.50 in some software to monitor SER crashes

    http://sertools.com/ser-crash-catcher/

  • Lee, thanks for this resource.

  • Does the LpM counter reset when you restart? I find when I'm debugging I often restart SER, and the LpM seems to reset and jump up when I restart SER and rerun the projects.
  • edited February 2013
    So with all your shiny xxx LpM, how many verifieds are you guys getting per day?
    Me personally, I'm looking at around 30 LpM from 50 threads and 6,000 verifieds a day from 40,000 submits.

    Do you retry failed ones ("continuously try to post to a site even if failed before")?

    Do you keep posting to the same domains (avoid posting to same domains)?

  • Which one are you guys currently using? I was on the latest 5.17 and went back to 5.10, which worked well for me, but somehow it does not work as well as before (maybe I need to restart).. thinking of going back to 5.08

    Funny, because I added a lot of new projects, and even though I activate only 4-5 at once, it seems just the newly created projects bring the performance down
  • I don't know what to think of this "version hopping". As we all know there are good days and there are bad days, so I don't think those short-term tests of less than 72h have any value.
    Sure, if you notice that your LpM is abnormally low the whole day, then there might be an issue. This is especially true if you are not using a super high number of threads (250+) and SER is working flawlessly without maxing out the performance all the time. The fewer threads, the longer you have to observe the testing, as there are many factors that could influence your submission rate.

    1) VPS - if you are on a shared VPS, how should you know what your sharing partners are doing on their machines? Could they steal some of your resources, or do you steal from them when they are not doing much on their machines? I'm not exactly sure about this, but that is my theory.

    2) Proxies - what's the speed to the VPS, and how fast do the proxies connect to the host of the target URLs they try to reach? If my VPS is in the EU, my proxies are in the US and I want to build a backlink to a site in China, then this will take longer than if all three were hosted in Europe. (A quick way to time this yourself is sketched after this list.)

    3) Finding new target URLs - this depends on the keywords, footprints and SEs that SER chooses randomly. Some days you will find more target URLs than others, so the probability is higher that you can post to them (another factor we can't influence).

    4) What kind of target URLs - it's obvious that building links to guestbooks is way faster than creating blogs on social networks, for example. And that depends on which engines we have selected and what results we get by scraping the SEs.

    And there may be many more (unknown) factors which can influence the submission rate day in, day out. Just think: you had a great submission rate 5 days ago, and now SER is doing the final verifications before dumping those URLs. Because of that, SER has less time to build other links, but another 5 days later it will build more links again, because you couldn't build that many links on the "high verification day".

    I myself am running SER with less than 50 threads because I haven't needed more so far. I had a relatively bad day two days ago, the best day ever yesterday, and today it's on track to top yesterday.
    All this happened with version 5.17, if I'm not wrong...
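    If you want to sanity-check the proxy latency mentioned in point 2, here is a minimal sketch (hypothetical proxy address and target; uses the third-party Python requests library purely as an illustration) that times one request through a proxy:

    ```python
    # Time a single request through a proxy to gauge proxy -> target latency.
    # The proxy address below is a placeholder; swap in one of your own.
    import time
    import requests

    proxies = {
        "http":  "http://user:pass@198.51.100.10:8080",
        "https": "http://user:pass@198.51.100.10:8080",
    }

    start = time.monotonic()
    response = requests.get("http://example.com/", proxies=proxies, timeout=30)
    print(f"{response.status_code} in {time.monotonic() - start:.2f}s via proxy")
    ```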




  • Completely agree with @Ozz: stop switching versions, always use the latest.
  • edited February 2013
    Ozz, you're right as always.. many factors play a role and we may not control all of them.. but OMG!!

    5.18
    - new: no thread-finish-waits before starting to verify
    - new: speed increased as submissions can now be started inbetween verifications

    I see no more lag between submission and verification.. this thing is flying now.. the SEO polizei are gonna crack down on my server lol
  • Ozz
    edited February 2013
    yeah, somehow I haven't been affected that much by bugs until now, so I can talk cheaply ;)

    However, when I read something like "v5.11 worked well, then v5.17 didn't anymore, and when I switched back to v5.11 it wasn't working like it did 10 days ago", then that's because of all the factors we can't influence that much. Sometimes patience is simply the key :)

    But I'm glad Sven figured something out for you guys. Maybe I'm the next one with major issues ;)
  • LeeG Eating your first bourne

    Those people that version swap should be slapped with a wet kipper

    Nasty people, not that anyone ever cottoned onto it from my screen shots :D

    Latest release, previous version and trusty versions are always pinned to my bottom bar

    Looks like one of the killers that's been hitting me might have been resolved

    new: no thread-finish-waits before starting to verify

    That's been causing lock-ups daily when it's been waiting for threads to finish

    Plus the option Ozz requested should ease up on the search engine queries

    new: option to put keywords in quotes or not

     

  • +1 for the new feature "LPM counter to only show after 4 weeks of GSA SER usage.."
  • ron SERLists.com

    I would like to point out that @LeeG started the version hop. When you see a guy with his pants on fire running down an alley, don't always follow him :)

    I always use the latest version.

  • Brandon Reputation Management Pro
    @ron but if you're going to follow someone with his pants on fire (high LPM) make sure that guy is @LeeG!
  • LeeG Eating your first bourne

    That's it, I'm sulking, everyone grassing me up for admitting to bad habits with SER :D

    I always look at my 12hr averages, just in case there have been any alterations to how the LpM is working

    100,000 in 12hrs and I'm happy

    If not, I blast away with a trusty version to do that

    Today's changes can affect LpM again with the new option on searches

    Plus a high verified rate is far better than a high submission rate

  • edited February 2013
    +1 for the new feature "LPM counter to only show after 4 weeks of GSA SER usage.."


    ^^^ 8 weeks
    :D
  • AlexR Cape Town
    @LeeG - talking about HTML timeout you said:
    "Mine is set to 130. HTML time out is used in two places, submitting and verifying. Most people who suffer low verified, its down to them using the standard setting of 60 from memory. Plenty of time to submit, but not long enough to verify"

    My understanding from Sven is that it's used to get the site to load. So as soon as it sees the page as loaded within the timeframe, it's good to go, whether it's to verify or to submit. So if you set it to 130, that means you're happy to wait over 2 minutes for a page to load (not to submit or do a verify check, but just to load)... surely that can't be worth it, as most sites/pages should load within 30s and it would make sense to ignore these super slow loading sites?

    @sven - can you confirm this?
  • Sven www.GSA-Online.de
    It's like this: a timeout occurs if, in the set time, not a single bit was transferred from the remote server. As soon as one bit is received, the timer is reset. That's all this is for.
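    To illustrate the behaviour Sven describes - an inactivity timeout that resets whenever data arrives, rather than a hard deadline for the whole page - here is a small sketch using Python's standard socket module (not SER's actual implementation):

    ```python
    # Inactivity-style timeout: the clock only runs while nothing is arriving,
    # and effectively resets whenever some bytes are received.
    import socket

    def fetch(host, path="/", inactivity_timeout=30):
        body = b""
        with socket.create_connection((host, 80), timeout=inactivity_timeout) as s:
            # settimeout() applies to each recv() call separately, so a slow but
            # steadily responding server never trips the timeout.
            s.settimeout(inactivity_timeout)
            s.sendall(f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode())
            while True:
                try:
                    chunk = s.recv(4096)
                except socket.timeout:
                    raise TimeoutError(f"no data received for {inactivity_timeout}s")
                if not chunk:      # server closed the connection normally
                    break
                body += chunk      # data arrived, so the per-read timer starts over
        return body

    # A whole-page deadline, by contrast, would abort even a steadily responding
    # (but slow) site once the overall limit was reached.
    ```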
  • AlexR Cape Town
    Thanks... if this is so, then surely it makes sense to be able to set the HTML timeout to 10s? This way I can avoid posting to or loading any sites that take longer than 10s to send the first data bit (not for the whole site to load). It would stop me waiting for all the broken sites, or the really, really slow sites, to respond.

    @sven - is it possible for you to allow us to put a lower value than 30 in here so we can test?
  • Sven www.GSA-Online.de
    I don't want people playing with such low values. It will get me more "hey it's not working" messages.
  • @ron, when you say you build 5 links a day to a new site, is that 5 overall? What if you are trying to SEO inner pages as well?

    I am working on an ecommerce site at the moment with about 20 projects running for this site. So should I be submitting 5 a day for each project? That would be 100 links to the overall site..
  • AlexR Cape Town
    edited February 2013
    @sven - I'm just curious, what is the average time it takes for a website to respond with the first bit? I would have thought it's closer to 10s than 30s. You must have a reason for setting the minimum at 30s, so I'm curious, or maybe I still haven't quite understood this feature... (possible :-) )
  • Thanks to everyone for posting results and methods. Thanks to this forum I've got a crazy amount of LpM going today.

    After just over 3 hours of running today:
    LpM: 321.05


    The low verified count is due to only verifying once per day on the money site, every 3 days on Tier 1, and not at all on Tier 2.

    2 weeks ago I was at about a 20 LpM average.

    All hosted on a mid-range VPS


  • Brandon Reputation Management Pro
    I didn't realize the timer reset when a bit is received... I'm definitely going to set mine lower than 180.
  • AlexR Cape Town
    @brandon - YES! A setting of 180 says to me "wait 3 minutes before this website even responds"... seriously, I wouldn't wait that long if I was manually browsing and the site didn't load on a 100mbit line. If it hadn't started loading within 10s on a 100mb/s line I'd assume the site's broken and would leave... that's why I'm trying to find out why Sven sets the limit at 30s... but I could be misunderstanding something!
  • AlexR Cape Town
    @LeeG - I know you're getting some excellent results, and many VPSes have limited bandwidth plans. How much bandwidth are you using per month on a VPS?
  • LeeG Eating your first bourne

    Only about 2TB a month

    From memory, I'm paying about 30 cents a TB

    And I have an 8TB a month limit

    I know I published the results on here once of a month's run and the end-of-month TB usage

  • AlexR Cape Town
    Thanks...30c a TB! I just got offered a special of $10 per TB instead of their usual $30! Ha!
  • LeeG Eating your first bourne

    I lied about the TB cost

    Just went to my provider's site and checked

    Mine was 30 cents with the 40% off deal

    Just had a play with the slider and each TB put 50 cents on the price

  • ron SERLists.com

    @micb - You want a heavier concentration of links on the home page. I always use 40% as a guide. So in your case I may do 2-3 for starters on inner pages. Obviously as you proceed forward, the number of links gradually increases on T1. You have much more latitude and safety to turn on the faucet with T2+.

    The point I try to make to people is not to go hog wild with T1 links. If I had a dollar for every dummy that blasted links to their money site and then went on a forum telling everybody how they lost their rankings - I'd be retired. You always have time to ramp it up.
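    As a rough worked example of that 40% guideline (my own arithmetic with made-up numbers, not a SER setting):

    ```python
    # Rough illustration of putting ~40% of T1 links on the home page.
    # All numbers are made up for the example.

    daily_t1_links = 20      # total T1 links per day once the site has aged a bit
    home_share = 0.40

    home_links = round(daily_t1_links * home_share)    # ~8/day to the home page
    inner_links = daily_t1_links - home_links          # ~12/day spread over inner pages
    inner_pages = 20
    per_inner_page = inner_links / inner_pages         # ~0.6/day, i.e. a few per week each

    print(home_links, inner_links, per_inner_page)
    ```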

  • @ron thanks for that. How many tiers do you do? My GSA is currently running 20 T1's, so adding up to T3 will end up being 80 projects on one installation. From reading some of your comments I am guessing that you only do T1 & T2?

    By the way, your KM article theory does work, as you know. T1's don't have to be of great quality and they do stick.
  • LeeG Eating your first bourne

    micb11 one of the worst members on here for admitting to all the bad habits Ozz normally tells people to avoid doing, runs 27 T1's and they are all T3 to T4

    Just take your time setting things up and you can achieve these kinds of submission rates with a bit of time and effort


     
