really poor verified results

Is anyone else having really poor verified numbers?

All my settings are spot on because I have had a very experienced GSA user take a look.

I have 30 private proxies, I am running 90 threads, and my settings are fine.

I have an auto-update list from Trevor Bandura.

I have very low verified numbers for all my projects and that includes kitchen sinks.

Is anyone else suffering these same problems, or is something amiss at my end?

Comments

  • shaunshaun https://www.youtube.com/ShaunMarrs
    Mine seems fine, who is this experienced GSA user?

    I don't know the system you are running on, but to me, having you at 90 threads with 30 private proxies (are they really private or semi-dedicated?) is too low unless it is for hardware reasons.

    I would guess it is something at your end.
  • fine on my end.
  • shaunshaun https://www.youtube.com/ShaunMarrs
    cozziola  looks like SER is fine in general, so it is probably something specific to you.

    Post screenshots of your options tab in your project settings, along with a screenshot of the platform totals at the very bottom of your identified list when you view its contents in the SER options tab.

    Are you using verifying projects to self-verify the lists? If so, how are you importing targets from your identified folder?
  • The experienced user who had a look around is Trevor Bandura himself :)

    I just shut down my GSA after 24 hours and checked the VPM and LPM over that time. My LPM always runs at a steady 45-50; VPM for the last 24 hours was 2.5.

    The proxies are private and I buy them from buyproxies.org. Do you guys think I need more proxies? The first project I created was on the 16th of December. I built 2 tiers and then a kitchen sink for each tier. The verifieds for that so far are:

    t1 = 754 (this has 11 target urls to backlink)
    t2 = 526
    t1a = 296
    t2a = 245

    This is with 30 proxies and Trevor's list correctly configured. Does this sound way off to you guys?
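As a back-of-the-envelope check on the figures above (this is just illustrative arithmetic, not a SER feature): the ratio of VPM to LPM gives a rough verification rate, which shows how far apart the submitted and verified numbers really are.

```python
# Hypothetical sanity check on the reported numbers.
# LPM = links submitted per minute, VPM = links verified per minute.
lpm = 47.5   # midpoint of the reported steady 45-50 LPM
vpm = 2.5    # reported VPM over the last 24 hours

verification_rate = vpm / lpm      # fraction of submissions that verify
verified_per_day = vpm * 60 * 24   # links verified over a 24-hour run

print(f"verification rate: {verification_rate:.1%}")  # ~5.3%
print(f"verified per day: {verified_per_day:.0f}")    # 3600
```

At roughly 5%, the verification rate is what stands out here, not the raw LPM.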




  • edited January 2016
    This is tier 1 settings for the above project

    image
  • edited January 2016
    "screen shot of the platform totals at the very bottom of your identified list when you view its contents in the SER options tab"

    What do you mean by this, sorry?
  • Trevor_BanduraTrevor_Bandura 267,647 NEW GSA SER Verified List
    Are you running the latest version of SER?
  • shaunshaun https://www.youtube.com/ShaunMarrs
    "Correctly Configured" is one of those terms open to interpretation.

    You don't need any more proxies.

    How many target URLs do you have on T1? Also what is it pointing at platform wise? Is it a YouTube video or something or a web 2 or your own site?

    Reverifying T1 links every 7200 minutes is far too long to wait mate, bring that back down to 1440.

    How are you pulling target URLs into the project? That screen shot is just the top pane, the "how to get target urls" bit is also needed.
  • 11 target urls on tier 1.

    They are all web 2.0 built via FCS.

    I will change the reverify now and include that screenshot.
  • image

    I know this may look crazy but it's configured perfectly.

    All the target URLs are located in Dropbox and this pulls them out. At the start of each project I also import all target URLs via site lists.

    My VPM for the last 24 hours was 1.4. Not good!

    Does anyone have any idea why my numbers may be so poor?


  • Trevor_BanduraTrevor_Bandura 267,647 NEW GSA SER Verified List
    Are you running the latest version of SER?
  • I would think it was either settings, proxies, or list. Do you have 'allow posting on same site again' enabled? I couldn't tell from your screenshot. Are you sure your proxies are working? More than once, my private proxies failed and I didn't realize it until I checked them on another platform. Did you try using a list other than your current vendor's? SBv2.0 can scrape one up in a few minutes' time and you can test it out to see if it makes a difference.
  • I am using version 10.35.

    Viking, all my proxies are working great and at fast speeds. They were replaced only yesterday.
  • edited January 2016
    Allow posting is checked for all projects, yes. Per URL is not ticked.
  • Trevor_BanduraTrevor_Bandura 267,647 NEW GSA SER Verified List
    Which list are you using and are you importing directly into your projects?

    If you're using the identified list, maybe try the unique URLs verified list. That one will work better than the identified one.
  • shaunshaun https://www.youtube.com/ShaunMarrs
    Running the latest version of SER has nothing to do with this.

    cozziola  Your settings are far from spot on.....

    So far I can see 5 obvious things that anyone competent with SER would have picked up on at first glance when trying to increase LPM.

    The things I picked up on...

    1 - You are running too few threads for your proxy count (unless this is for hardware reasons, but you failed to respond the first time I pointed this out).
    2 - Reverifying T1 backlinks every 7200 minutes.
    3 - Having "Get PR for verified URLs (else PR? is shown)" ticked.
    4 - Ask all services/user is selected for captcha solving.
    5 - You are verifying your links automatically.

    Were your settings like this when you asked Trevor to check over your settings and did you ask him what you could change to increase your LPM?

    There are also a few other things that I can see that aren't helping you out as well, and for the third time, how are you loading your target URL list?
  • edited January 2016
    Thanks for helping, Shaun. I don't think I am understanding the question about how I am loading my site list, so I am answering it wrong. Can you tell me what you mean please?

    1. I will up the thread limit now. How many would you bump it up to?
    2. I changed the reverifying to 1440 earlier today.
    3. ?
    4. What should it be set at?
    5. What should it be set at?

    Thanks for this :)
  • I have been importing the site list wrongly. Now that I have corrected it my numbers have improved.

    Once I have made all 5 changes that Shaun suggested I should see a bigger increase.

    I am currently running 120 threads.
  • shaunshaun https://www.youtube.com/ShaunMarrs
    edited January 2016
    Were your settings like this when Trevor checked out your SER?

    1 - What are you running it on, a home PC, a VPS or a server?
    2 - good
    3 - the second screenshot you posted about halfway down. To my knowledge it gets SER to use a thread to get the PR of every verified link, wasting a thread that could be used for link building. I have done a few tests and having it unticked definitely seems to speed SER up.
    4 - "Ask first service to fill captchas": set GSA CB as the first captcha service in SER. This is probably a bit more advanced than what you need right now, but in the future if you add a ReCaptcha OCR service, add it into GSA CB and manually tell CB what platforms to send to it. Then save your second captcha service slot in SER for a human service such as Death by Captcha.
    5 - It depends on a few things. If you just want to increase your LPM then have it verify your links every 24 hours, so 1440 minutes. Just remember it will take up to 24 hours for your T2 to kick in properly and up to 48 hours for your T3, but after that it can be fast.


    Regarding your target URLs: in your verified folder you have URLs, and you have to put them into your projects so your projects know what to post to. There are 4 tick boxes in project options; ticking verified is the most common option.

    As Trevor said are you using his identified or verified list?
  • shaunshaun https://www.youtube.com/ShaunMarrs
    How are you importing the site lists now?
  • edited January 2016
    1 - I am running this on a low spec dedi, mate. But from the threads I read on here it's more than enough for SER, I think? Would you like the specs?
    2 - yes.
    3 - Mate, I didn't even notice this option at all. Unchecked now.
    4 - I am using both CB and CaptchaTronix. Am I right in thinking that my current settings are OK then? I have CT as first with 0 retries, also set to "only for hard captchas". CB is second in line with 3 retries.
    5 - Thanks ill do this now.

    It turns out I had configured my site lists slightly wrong, so when I was importing my site list I was importing the wrong one. I was importing the "verified" folder but should have been importing the "submitted" one.

    Thanks for this. This has been slowly burning my head out.
  • Can you only answer the "how many threads" question once you know my VPS spec?
  • shaunshaun https://www.youtube.com/ShaunMarrs
    cozziola 

    1 - Nah, I don't need the specs. If it is a dedi then put your threads to 300; that is a safe number of threads. When you get to know how to read the SER log you can start to slowly up it even further, but for your current needs 300 is perfect.
    4 - Mate, get rid of CaptchaTronix, you are wasting your money. CaptchaTronix is essentially an OCR version of GSA CB and does not have a very good success rate with hard captchas. If you want an OCR then I use Reverseproxies OCR; it has its flaws but it is currently the best one I have found.

    Who told you to pull it from the submitted folder?

    This is how I build my lists.....

    - Multiple Scrapebox instances get me target URLs through a number of methods and drop them in my "scrapes" folder. Not a SER folder, just a folder on my server.
    - GSA PI monitors this folder and picks up all new URLs into it and sorts them for me putting the ones I want into my identified folder in SER.
    - I have four projects running 24/7 whose sole purpose is to process the links in my identified folder; they literally just churn that folder to bits. Each project runs for a specific type/group of links. I will load my identified folder into them three or four times while I scrape my next one.
    - The SER projects dump all verified URLs into my verified folder and it builds up bit by bit every day, but as my ranking projects are also adding to this list it gets clogged with duplicates and seems to slow my LPM down.
    - Each day in the morning I stop SER and do my daily servicing on it. Part of my servicing is removing the duplicate URLs and duplicate domains using the inbuilt SER functions to do this. Once done I extract my verified folder as a site list and then import that folder into my failed folder in SER.
    - My live ranking projects then get their target URLs added to them from the failed folder. They are all verified URLs that I have verified myself, the list doesn't have any duplicates, and it is blisteringly quick.
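The dedupe step in that daily routine is done with SER's own built-in "remove duplicate URLs" and "remove duplicate domains" functions. Purely as an illustration of what that step does (the function and file names here are made up, not part of SER), a standalone sketch might look like:

```python
from urllib.parse import urlparse

def dedupe_site_list(urls, by_domain=False):
    """Drop duplicate URLs (or whole duplicate domains), keeping the first occurrence."""
    seen = set()
    kept = []
    for url in urls:
        # Key on the full URL, or just the domain when deduping by domain.
        key = urlparse(url).netloc.lower() if by_domain else url.strip().lower()
        if key and key not in seen:
            seen.add(key)
            kept.append(url)
    return kept

urls = [
    "http://example.com/page1",
    "http://example.com/page1",      # exact duplicate URL
    "http://example.com/page2",      # same domain, different URL
    "http://another.org/post",
]
print(dedupe_site_list(urls))                  # drops only the exact duplicate
print(dedupe_site_list(urls, by_domain=True))  # keeps one URL per domain
```

Deduping by domain is the more aggressive of the two, which is why SER exposes them as separate operations.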
  • Yeah, that's a fairly strong process. There are other good ways to do it, that take less time. 
  • edited January 2016
    Trevor told me to use that site list, but it's because I had the folders configured wrong. If I had done it right I would import the "verified" list.

    I will dump CT then and replace with the one you are using.

    About scraping my own list: I own both Scrapebox and GScraper, but as you can see with me having trouble with SER, I am not the best with these tools haha. That sounds like a lot of stuff to learn to scrape your own lists. It does sound awesome!

    The crazy part is finding the strings? Is that the right word, strings? haha
  • shaunshaun https://www.youtube.com/ShaunMarrs
    I think you mean threads lol.

    Your dedi can take a right beating; I hammer mine sometimes, having it at 80% CPU usage, and it is fine.
  • Scraping really isn't that hard.