
Low thread count despite setting it high


Comments

  • ron SERLists.com

    I just keep the same pace. You don't want negative link velocity. You just need to maintain your link profile at that point.

    To this day I am still experimenting. I have had the opposite results from people like Lee and a few others. I roll with about 110 SEs; Lee uses 5. Every time I see Lee's galactic speed and results, I get pissed and change my SEs to 5. Then my LPM goes down. It's like a dark British comedy, and I'm the one constantly getting hosed. LOL. So you experiment and see what works for you.

  • @ron Haha yes, I guess this is just a trial and error game. What works for you might not work for me, and vice versa.

    Appreciate your help as always.

    Thank you.
  • spunko2010 Isle of Man
    edited June 2013
    @pratik I get about 10 LPM, but then I am only running high-quality campaigns for T1 atm, with loads of options ticked.

    If I take those off, for T3 etc. I get 80 LPM, or 200+ LPM with lists.
  • CMIIW but...

    If you do a search on an SE, you should know what's going on. The further down the pages you go, the less accurate the results are. So chances are your target cache may not contain URLs that you can post to, but they still have to be processed anyway.

    If you clear that, SER starts afresh and gets new data from the SEs. Because those results come from the first pages of the SERPs, they are more relevant, and SER posts to them successfully for a while, until it picks up more "junk" URLs in the cache.
  • @spunko Damn, that's really great. My VPS, though, is not a very high-end one: 1 GB RAM, with download and upload speeds (as tested on speedtest.net) of 5 Mbps and 0.2 Mbps respectively.

    For tier 1 I've set it to post only above PR2, PR1 for T2, and no PR filter for T3.

    I kinda get jealous seeing you all having nice LPM haha.
  • @audioguy nice post.

    I'd also like @Sven to add an option in a future version to delete the target URL cache every X minutes (it would pause the project to do so and start it again automatically after deleting).
  • @Pratik: you are able to delete the history via the command line switch "-delhistory". When you enable this, your history will be deleted every time you restart SER (after updates or a manual restart). Here you can find some info:

    I know this is not exactly what you've been asking for, but I think it should be useful for your purposes.
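
    If you want to start SER with that switch from a script rather than by hand, here is a minimal sketch; the install path and executable name below are assumptions, so point them at your own installation.

    # Minimal sketch: launch SER with the -delhistory switch.
    # The path and executable name are assumptions - adjust to your own install.
    import subprocess

    SER_EXE = r"C:\Program Files (x86)\GSA Search Engine Ranker\Search_Engine_Ranker.exe"  # assumed path
    subprocess.Popen([SER_EXE, "-delhistory"])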
  • Pratik, use Scrapebox or GScraper for splitting a larger file into smaller files.

    ron
    %spinfolder-C:\Users\omi\Desktop\Dropbox\GSA\Keyword list%
    So that one is working for you, right? Or, I mean, how do we confirm that it is working?
    Generally we click on the Test button and check whether everything is fine or not.

  • @baba Although it's kind of weird to see 1000 keywords in the test screen, I guess the best way to find out is to try to run it. Have the log written to a text file and see if SER uses one keyword at a time to query the SEs. If that's the case, then it's working fine.
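
    As a rough way to do that check (nothing SER-specific, just plain text matching; the file names are placeholders), something like this works once the log has been written out:

    # Rough check: how many keywords from one chunk file show up anywhere in the exported log?
    # "ser_log.txt" and "keywords_001.txt" are placeholder names; plain substring matching only.
    from pathlib import Path

    log_text = Path("ser_log.txt").read_text(encoding="utf-8", errors="ignore").lower()
    keywords = [
        line.strip().lower()
        for line in Path("keywords_001.txt").read_text(encoding="utf-8").splitlines()
        if line.strip()
    ]

    used = [kw for kw in keywords if kw in log_text]
    print(f"{len(used)} of {len(keywords)} keywords from this file appear in the log")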
  • edited June 2013
    @baba

    I do have Scrapebox and I use it quite a lot for scraping. How would I go about splitting, though? I've never used SB extensively for anything other than scraping, haha.

    Btw, currently getting 52 LPM (note: no changes have been made, not even new keywords added, except for setting the links-per-day rule to unlimited for T2 and T3), but I feel it'll drop in a while (although it has been running at the same pace for the past hour, nice!). I'll keep you guys updated.
  • edited June 2013
    Back again. At 44 LPM currently, running quite nicely, surprisingly.

    But @ron, something made me worry: the submission rate. I've read mixed opinions on submissions per day.

    My project has been running for roughly 11-12 days, and today I removed the links-per-day limit for T2 and T3; it was blasting at 44 LPM with the small adjustments mentioned, without any new keywords.

    But I'm kinda afraid of overdoing it. I've read that you should limit T2 and T3 links (remember, I'm not using kitchen-sink tiers, just normal T1, T2 and T3).

    So am I overdoing it? I'm also afraid that if I'm not able to keep feeding links at this rate, I might have to face a penalty too, lol.

    edit: Also, I have verification set to 1440 minutes, so I think it's not possible for SER to count how many verified links it has made today until that check runs? So I think it'll go on building links until they pass through verification? This is very important when you're building a low number of verified links, like 20-30/day.

    Thanks.
  • LeeG Eating your first bourne

    Oi, leave my name out of this, ron. I'm still trying to hit the 1/2 million daily

    110 search engines selected? :-O

    I still only use five, but I spent some time working on the method Ozz shared for spinning engines the other day and get half decent speed

    The five I use are randomly pulled from a list of twenty or so googles

    Like anything, spend your time and don't rush setting up the engines

    Two to three hours getting things right is better than rushing the settings and spending months getting things wrong

  • Pratik use "duperemove" addon of scrapebox.



    LeeG, you mean you're getting better speed than before using Ozz's spinning method?
    And you mean you select only the 20 best Google country engines, right?
  • edited June 2013
    @LeeG thanks. May I ask how many links you build for lower tiers like T2 and T3? Do you keep any limit? I am kinda afraid of blasting too many links on the lower tiers.

    So would it be kinda okay to let it keep blasting all day long, every day?

    Asking this question to you too, @ron, in the post above LeeG's.

    Thanks.

    @baba won't it just remove the duplicate keywords instead of splitting them? For now, I'm separating the 100K list manually into files of 1K each (a quick sketch of that split is below).
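
    For reference, here is a minimal Python sketch of that split; the file and folder names are just placeholders, and it also drops duplicate keywords along the way.

    # split_keywords.py - minimal sketch: split one big keyword list into 1,000-keyword files.
    # "keywords_100k.txt" and the "keyword_chunks" folder are placeholder names.
    from pathlib import Path

    CHUNK_SIZE = 1000
    source = Path("keywords_100k.txt")
    out_dir = Path("keyword_chunks")
    out_dir.mkdir(exist_ok=True)

    # Read, drop blank lines, and remove duplicate keywords while keeping their order.
    keywords = list(dict.fromkeys(
        line.strip() for line in source.read_text(encoding="utf-8").splitlines() if line.strip()
    ))

    for i in range(0, len(keywords), CHUNK_SIZE):
        chunk = keywords[i:i + CHUNK_SIZE]
        out_file = out_dir / f"keywords_{i // CHUNK_SIZE + 1:03d}.txt"
        out_file.write_text("\n".join(chunk) + "\n", encoding="utf-8")

    print(f"Wrote {len(keywords)} keywords into {-(-len(keywords) // CHUNK_SIZE)} files")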
  • edited June 2013
    audioguy, I did a test and found that it's not working.
    It takes 4K keywords each time when SER goes to pick keywords.


    Sven, can you please help me set up a macro in SER to pull keywords from a folder containing 100 files with 1K keywords in each?
  • LeeG Eating your first bourne

    Like anything, test and check your stats

    I use the tier system Ron posted. So you have your kitchen sink links pointed at your articles.

    I'm limiting the links on the lower tiers to 10 per URL, the same way I do on all T1 links, to help spread the link building, even on the bottom tiers.

    Since implementing the system Ozz shared, I get very few IP blocked warnings.

    Again, it's down to your own choice of engines. Only you can decide which are best for targeting results and which might bring in limited results due to internet restrictions and censorship in some countries. Now that's a hint some will catch onto, and for others it will leave total bewilderment and confusion

    But as with anything, the proof is in the screenshot

    [screenshot]

     

    Just monitor your stats and hit the engines that give high amounts of submissions.

    From what I have seen, the submitted vs verified stats can cause confusion.

    A 100% verified rate can sound good, but not if you only get ten results, compared to an engine with a 30% verified rate which can give thousands of results

  • @LeeG everything seems kinda confusing. So you build 10 links per URL, meaning, for example, for tier 2: if you have 100 URLs on tier 1, then you build (100 x 10) = 1,000 links/day for that tier, correct?

    From the screenshot, it looks (as you've not set it to submit an unlimited amount of links per day) like you might have quite a few projects running, hence submitting a lot of links.

    Your LPM is beast too, haha.

    @baba I think what I said will work; could you test it? Right-click on the keywords textbox, select the choose random option, and then choose your folder of keywords. Please let me know if it works, as I'm also interested in trying this out once I separate out the keywords.

    Thanks.
  • Hmm, and I also noticed that when a project is paused on reaching its verifications/day limit, the process slows down and hence the LPM drops, whereas when all the projects are active, the LPM is maintained.

    Any reason for this? The status is Active (P) and it shows the activity as verification.
  • LeeG Eating your first bourne

    Everything is set to 10 links per url.

    As your tiers grow, you might have 10,000 links on T2

    So 10 x 10,000 = 100,000

    Use that over several projects and even I can't reach those numbers, no matter how hard I try :D

    But, those numbers are achieved by a lot of time and effort spent testing ideas, engines and stat checking
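
    To put that per-URL arithmetic in one place, here is a minimal sketch; the 10-per-URL figure is from above, but the tier sizes are only example numbers.

    # Minimal sketch of the per-URL ceiling: a lower tier can build
    # (verified URLs in the tier above) x (links per URL) links.
    LINKS_PER_URL = 10
    tier_urls = {"T1": 100, "T2": 10_000}  # example counts only, as the tiers grow

    for tier, urls in tier_urls.items():
        print(f"{tier}: {urls} URLs x {LINKS_PER_URL} = {urls * LINKS_PER_URL:,} links for the tier below it")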

     

  • edited June 2013
    @LeeG haha yes, nice point. When the URLs grow, that'd be sort of a dream number to achieve, and I think it will increase the LPM too. Right now there are only a few hundred URLs on each tier, so the limit is reached easily, and as a result, when a project is paused (I've only got one running), the LPM decreases.

    Curious as to how many verifications/day you've set on a "per URL" basis for T1 (not for junk T1)?

    Also, under the verification options I've checked "Also reverify existing backlinks" - does this make much of a difference? Should I check or uncheck it?

    Thanks.
  • LeeG Eating your first bourne

    Why set verifications per day?

    Do you read these forums with your eyes shut?

    If you use that method, you could submit 1000000000000 links to your T1 and not have any found until the next day or two, then not build any more links for months

    Use submissions per day, not verified per day

  • edited June 2013
    @LeeG interesting. Actually, I've read mixed opinions on this, so I thought it'd be best to just build links based on verified URLs.

    Earlier I had verification running every 3 hours or so, so verifications per day could have worked, but now I agree with you that I should switch over to submissions per day.

    Thanks for the tip!

    edit: Right, and I was wondering why I get shit LPM, haha. Well, you learn something every day, thanks again.
  • @Pratik You can use the Scrapebox addon DupRemove to split files. If you don't have Scrapebox, this addon is free: http://www.scrapebox.com/free-dupe-remove
  • Pratik, I think there are only two options:

    1. Right-click on the keyword text box and select "use contents of a random file", or
    2. Right-click on the keyword text box, go to insert macro, and select %spinfolder%.

    And neither of the above options is working.
  • @baba I think 1st option should work?
  • Also I've been rocking at 44 LPM for a few hours now, success baby, success. :P
  • @baba Define "not working". Do you see SER using all the keywords at once to query the search engine?
  • audioguy

    Yup, that's what I am seeing.
  • ron SERLists.com

    I just woke up and Pratik is having success since my last post. Go figure.

    What you do on the underneath tiers (non-T1) will not hurt you. The pace of linkbuilding directly to your moneysite matters the most. You either want steady, or slightly increasing links over time, but not decreasing links over time.

    It is easier to manage the verifieds per day by using the "submissions per day" setting on each project. Typically some percentage of your submitted links, like 20%, will become verified. So it becomes easier and more predictable if you use that setting on all projects. It may seem counter-intuitive, but it is easier to get SER to do what you want if you always work from the submissions angle. So if you want 10 verifieds per day on your T1, you set the submissions per day to 50 - assuming the ratio is 20%. Look at your numbers every day to get a good feel. Maybe keep a little spreadsheet (just for a week) and track the rate of growth of the verifieds - then you will know what is happening.

    SER will never be able to build all the links you need on every project, particularly on the lower tiers. So just set some "number per URL" for the tier above it, and call it a day. Remember, your T1's and any other projects that link directly to your moneysite are always your number 1 priority. Always make sure those projects are on track in their link building...everything else is secondary.
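
    If it helps to see that back-calculation written down, here is a minimal sketch; the 20% is just the example ratio from above, and the function name is purely illustrative.

    # Minimal sketch of working backwards from verified links to the submissions-per-day setting.
    import math

    def submissions_needed(target_verified_per_day: int, verified_ratio: float = 0.20) -> int:
        """How many submissions/day to set so roughly target_verified_per_day links verify."""
        if not 0 < verified_ratio <= 1:
            raise ValueError("verified_ratio must be between 0 and 1")
        return math.ceil(target_verified_per_day / verified_ratio)

    # The example from above: 10 verified per day at a 20% ratio -> set 50 submissions per day.
    print(submissions_needed(10, 0.20))  # 50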

  • edited June 2013
    @ron @audioguy

    I just finished, a few minutes ago, making about 100 separate lists of 1K keywords each. I imported them into SER, but in the logs I could see it searching for things like "Add your bookmark" and many long-tail keywords which aren't even in my lists, as mine are just one-word keywords. Also, keyword scraping from sites is disabled in the options too.

    So do you think it's working?

    In the test window, I could see it pulling 1000 keywords at once from that single file.

    Also @baba, I tested; both are the same. The choose-random option gives the same macro as the one Ron mentioned.

    Ah Ron, just saw your post. At 46 LPM currently, haha.