
I'm flipping out right now :(

Hi guys,

can somebody please help me? I have the same problem again and again. I just bought $30 semi-dedicated proxies and my LPM is 0.17, WTF!
Earlier I thought I had a similar problem because of the public proxies, but now with private proxies I'm so mad. If someone knows what the problem is, my settings are these:
Pic 1 (screenshot)
Pic 2 (screenshot)
Pic 3 (screenshot)

Thanks!

Comments

  • Uncheck the skip settings, uncheck "always use keywords", run a test on a domain, and report your findings
  • Thanks for your reply... but nothing special happened, LPM 0.35

    I just don't know what you mean by "uncheck skip settings"?
  • s4nt0ss4nt0s Houston, Texas
    You have it set to skip anything below PR1 + max 50 OBL + always use keywords to find target sites. He was saying uncheck those things if you're going for LPM.
  • This is my T1 site, so I'm a little unsure about unchecking those?
    For the record, I'm not crazy about a high LPM; I mentioned it just as a measure.
    I want a reasonable submitted/verified ratio, but that is not happening.
    So what do you think about unchecking the skip settings for T1? And could the sites for a niche get depleted? I'm just wondering about that :-/
  • edited September 2013
    Depleted for English SEs and domestic SEs
  • I think it's because you don't have a big enough target list. It happened to me when I started using GSA SER. You need time to build up your site list, or you can purchase somebody's list.
  • edited September 2013
    I think that SER searches for lists by itself, and a user can also upload their own... I don't have time for building lists with SER; I'm using SB for that, and I uploaded a forum list and an .edu list, and as you see this is happening.
    btw
    I'm not so new / not so old at using SER - 1 year.

    btw, thank you for your reply! :)
  • SvenSven www.GSA-Online.de

    @kajzer the easiest way to get more target URLs is the new option:

    Use URLs linking on same verified URL (supported by some engines only)

    that you should turn on. With that you will surely improve your LpM. I think no one has realized until now how good that option is. I run a project with only that turned on (no search engines) and it works great.

  • goonergooner SERLists.com
    @sven - Can you explain exactly what "Use URLs linking on same verified URL (supported by some engines only)" does please?
  • Hi @gooner, I believe it looks at the pages you are posting links on, scans them for links to other sites, and then checks whether you can post to those too. - Seems to work well, but it uses a lot of CPU.

    https://forum.gsa-online.de/discussion/5952/link-pattern-technique-for-target-search#latest
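    The idea described above can be sketched roughly like this (my own reconstruction in Python of what the option seems to do, not SER's actual code; `candidate_targets` and the example URLs are made-up names):

    ```python
    # Sketch: given the HTML of a page where a link was verified, pull out every
    # outbound link and keep the ones pointing at other sites as new candidate
    # targets to test for posting.
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def candidate_targets(html, verified_url):
        """Return external absolute URLs linked from a verified page."""
        parser = LinkExtractor()
        parser.feed(html)
        own_host = urlparse(verified_url).netloc
        targets = []
        for href in parser.links:
            absolute = urljoin(verified_url, href)
            host = urlparse(absolute).netloc
            if host and host != own_host and absolute not in targets:
                targets.append(absolute)  # a different site we might also post to
        return targets
    ```

    In other words: internal links on the verified page are dropped, and every external link becomes a fresh target URL, which would explain why it finds so many targets without touching the search engines.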
  • goonergooner SERLists.com
    @2take2 - Thank you my friend. Looks like a great addition.
  • @Sven Thanks, I will try it, dropping a review later...
  • SvenSven www.GSA-Online.de
    It shouldn't use a lot of CPU at all ;-/ but yes, that's the way it works.
  • goonergooner SERLists.com
    Got an increase of around +35 LpM after enabling that option on all projects, not bad at all!
  • I think the higher CPU usage, both for this option and for importing and sorting URLs, comes from the fact that the well-spammed pages you pick up from downloaded site lists, or from following links in other people's spam, can often be multiple MBs of data each. If you're then hitting 100 of them at a time across multiple threads, it's understandable that your CPU takes a bit of a hit. Not sure what can be done about that really!
  • SvenSven www.GSA-Online.de
    OK, I'll try to optimize this a bit, but I guess there is not much room left here.
  • No, I think there are obviously limits to what can be improved there, as you can't anticipate the size of a page until you download it. But I did wonder whether the "Maximum size of a website to download" setting is used when filtering URLs to platforms, or just when submitting (since there never seem to be CPU issues on submit).

    Also, it may be useful to allow decimals in the maximum size box - e.g. I reckon most low-OBL pages come in at under 500 KB, but we can't filter lower than 1 MB at the moment.

    One other option I thought could help with imports would be to let SER assume all pages on a domain are the same platform. So if we identify one massive page of trackbacks, we can assume all the pages from that domain in an imported site list are also trackbacks, avoiding the need to download each one.
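    The size-cap idea in the first paragraph can be illustrated like this (a rough sketch, assuming a streamed download; `read_capped` and the 500 KB default are my own, the latter being the poster's guess for low-OBL pages, not an actual SER setting or API):

    ```python
    # Sketch: stop reading a page once it exceeds a byte limit, so one
    # multi-MB spammed page can't eat bandwidth and CPU before parsing.

    def read_capped(stream, max_bytes=500 * 1024, chunk_size=16 * 1024):
        """Read from a file-like object, giving up once max_bytes is exceeded.

        Returns the body as bytes, or None if the page was over the cap."""
        body = bytearray()
        while True:
            chunk = stream.read(chunk_size)
            if not chunk:
                return bytes(body)
            body.extend(chunk)
            if len(body) > max_bytes:
                return None  # too big: skip this page instead of parsing it
    ```

    With a real download you would pass it the response object, e.g. `read_capped(urllib.request.urlopen(url))`, and simply skip any URL that comes back as None.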
  • I bow down and thank @Sven. Yes, I just updated my SER, saw that option (only after I saw your above post) and checked it. Initially (around 30 minutes more or less) I didn't see any difference and I just closed the VPS window. Right now when I logged in to check, to my surprise, my LpM is around 45! That might not be much for the pros, but for a guy who struggles at an LpM of 15, it's 3 times more!

    Sven, from what I understand, it takes all the URLs on the verified link page and then tries to analyse and post to them? I may be wrong, but that's how I understand it. I remember hearing about some guy here who used much the same technique in Scrapebox, scraped the internal/external URLs on those sites, and built quite a good list in no time. This is IMHO a nice technique. Thanks for adding it. It's a superb addition to SER, I believe.
  • SvenSven www.GSA-Online.de
    @Pratik it works the way you expected.
  • kajzer

    your thread number appears far too high
    sometimes less is more
    keep in mind that for each successful connection you need time for captchas and time to upload text (articles)

    too many connections may kill / block your proxies

    my personal recommendation:
    try 30-50 threads first, with a timeout of 100-120 seconds
    the longer the timeout, the more your threads are blocked by slow sites

    then, if you get better results on a good list or while scraping,
    you can still increase threads step by step
    unless you have a huge number of fast proxies, you will most likely be more successful with fewer threads

    living/working on the other side of the world, I use a timeout of 90-100, rarely 120
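    The "too many connections may kill your proxies" point can be put into a back-of-the-envelope calculation (my own rule of thumb, not anything built into SER; the function name and the per-request time are made up for illustration):

    ```python
    # Sketch: estimate how hard each proxy gets hit at a given thread count.
    # If each request occupies a thread for avg_request_s seconds, the pool of
    # threads produces threads * 60 / avg_request_s requests per minute, spread
    # across the proxies.

    def rough_hits_per_proxy_per_min(threads, proxies, avg_request_s):
        """Approximate how often each proxy is used per minute."""
        requests_per_min = threads * 60.0 / avg_request_s
        return requests_per_min / proxies
    ```

    For example, 180 threads on 10 semi-dedicated proxies at ~10 s per request works out to about 108 hits per proxy per minute - easy to get blocked - while 40 threads on the same proxies is about 24 hits per minute, which is far gentler.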
  • edited September 2013
    hans51, I'll try lowering threads again. When I started using SER I began with the default 60 threads and a 90 sec timeout, and I experimented with lowering and raising threads and timeout; I found that 180 threads / 170 timeout suited me best with public proxies.
    Now I have some ROI and I can afford to buy private proxies and a VPS. I couldn't wait a moment to buy the proxies to see some better results, and when I saw those shitty results from my first post, it was such a huge slap that I still can't snap myself back to reality from the shock @-)
    I see everybody saying their LPM increased to 35-45; mine this morning was 0.60 with 2 projects turned on and "Use URLs linking on same verified URL" checked... ?? 8-}
  • @kajzer It also depends greatly on how built up your lists are. We've been running SER for months, so we've built up quite a sizable list during that time. You'll catch up soon too.

    It's a perfectly natural reaction for someone just starting out. I was like you too, mad about LPM. But then I learned and realized that it is just a damn number, after I got 2 of my niche sites slapped. Slow and steady wins the race, you know. That doesn't mean you should always be slow, but you'll learn a lot from other people's experience. Guys like @ron, LeeG, and many more have shared their experiences here. There are lots of tutorials and tips from guys like @Ozz and many more. Not to mention @ron again: every post he makes I take as a tutorial, LOL.

    So just don't worry. Keep trying things until you find a sweet spot.
  • Pratik, I've been working on this T1/T2/T3 site in SER for nearly a year, but your sentence gave me another idea. Thanks for your effort; I'll try some modifications, and if it's successful I'll drop a review. Cheers guys!!!  ^:)^
  • edited September 2013
    @pratik it's very funny, I am/was a power user too and tweaked my 2 copies of SER to 300-400 LpM.

    But then I realized that I had worked so hard to increase that goddamn number (and keep it) that I totally "forgot" to make money. I didn't make any new projects for almost a month and was only fu**ing around with SER. It's cool to blast out more than half a million links in a day, but the question is: is that really useful?!

    Now I've totally changed the way I work with SER and I'm at a good 70 LpM which requires absolutely no maintenance. And you know what... my sites rank as well as before and I make way more money ;)

    Obviously it doesn't really matter if your T3/T2 has 200,000 verifieds or 20,000...

    Long story short: don't worry about LpM. Try to get it above 50 and focus on your sites ;)

    Regards
  • @Startrip Precisely, haha. :)
  • Exactly right. :) I've been running between 40-90 LpM for a looong time and I get great results.
  • ronron SERLists.com
    ^^ Exactly. I would also add that what you do directly to your site, with site structure, on-page optimization and link building, has a far bigger effect than whether you pound 2,000 or 20,000 links into the tiers underneath.
  • edited September 2013
    btw, when I tick that new option I get a high LpM and a lot of new targets fast, but it simply stops doing anything after 2-3 minutes. Anyone else experiencing this?

    I can't even close the program.
  • goonergooner SERLists.com
    @startrip - It uses more memory, so you have to compensate by lowering the number of threads and/or having fewer active projects.

    I lowered threads from 400 to 300 and active projects on scheduler from 20 to 10 and my LPM increased from 80 to 120 on average.
  • edited September 2013
    for me it loads a ton of new targets and then simply stops, always after it has loaded around 2000 URLs from a spammed-to-death blog. Even at 50 threads it dies :/