Upper tier issue

VMart Natural SEO
edited July 2016 in GSA Search Engine Ranker
I'm running an upper tier above PR6 with 10 dedicated private proxies, but GSA SER is extracting repeated links. Kindly advise me what my mistake is.



  • Sven
    I guess it is because of your scheduled submission settings within the project?
  • shaun
    PR6 is crazy for SER mate, you ain't going to get many like that.
  • VMart Natural SEO
    shaun, I have set PR3 but the same issue keeps coming.
    Sven, you mentioned a schedule, but I can't understand what schedule you mean.
    Please give me instructions to solve the issue.

    I have not set any schedule.

  • Sven
    I mean the project option where you set how many accounts and posts per site.
  • shaun

    Post a screenshot of the stuff below what you posted, from your options tab.
  • VMart Natural SEO
  • Tim89
    edited July 2016
    You're using search engines to find targets, but you have unchecked the "Identified" list as an option; you are only using the "Verified" list to build links. So what is happening is that SER is finding new potential targets and saving them to the Identified list, but your campaign is not pulling its targets from the Identified list; it is pulling them from the Verified list.

    Fix it by checking the Identified list only (if you're using search engines to find targets).

    Do this until you have built a big enough list to be able to use your verified list efficiently.

    You may also want to change the timing settings for multiple posts. Two hours is quite a long time to wait before another registration/post; you're practically waiting a total of 4 hours to obtain 2 posts from 1 target.
  • VMart Natural SEO
    Sven, I'm using 30 Rediffmail accounts; I use these same 30 accounts for all the projects.

    After unticking "Options/Scheduled Posting/Allow posting on same site again" the repeated-site posting is solved, but after that not a single site is submitted. Is there any problem on my side?

    Tim89, in Advanced settings I have unticked the "Identified" option, but can I tick "Identified" in the "Options/Use URLs from global site lists if enabled" settings?
  • Clear the target history and try again, maybe.
  • Sven
    Re-verify your existing verified URLs, delete those that are no longer valid, and delete unused accounts.
    Then turn that option on again.
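    The re-verify-and-prune step described here amounts to filtering the verified list with a liveness check. A minimal sketch in Python, with a hypothetical `check_alive` callback standing in for SER's real re-verification (none of these names are GSA SER internals):

    ```python
    # Illustrative sketch only: check_alive is a hypothetical stand-in for
    # SER's "re-verify existing verified URLs" step, not a real SER API.

    def prune_verified(verified_urls, check_alive):
        """Keep only the URLs that still pass re-verification."""
        return [url for url in verified_urls if check_alive(url)]

    # Example with a fake liveness table instead of real HTTP checks.
    status = {"http://live.example": True, "http://dead.example": False}
    kept = prune_verified(list(status), lambda url: status[url])
    print(kept)  # only the live URL survives
    ```

    The point of the pruning is simply that stale entries waste submission attempts, so dropping them before re-enabling "post to same site again" keeps the project from looping on dead targets.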
  • Tim89
    edited July 2016
    I don't see what the problem is here: you are using SEARCH ENGINES to find your targets, but you aren't using the IDENTIFIED target list; you're pulling your targets from your VERIFIED lists.

    It makes no sense to me. Why don't people just listen?

    Why have you unmarked the identified list in the advanced options? Where are the identified URLs being saved? Mate, you've got your settings all messed up; that's the reason you're not building any links. First, tick the option to save your identified list in the advanced options, so that when SER is searching for target URLs it will at least have somewhere to store them, ROFL. Then go into your campaign, uncheck Verified, check IDENTIFIED, and see what happens.

    SER is repeating the same links because it has no other targets to hit. Do you know why that is? Do you understand?
  • VMart Natural SEO
    edited July 2016
    kijix84, I have selected the all option and also cleared "all project history", but
    the problem is coming again... :(
    Sven, you mentioned the option "re-verify existing verified urls";
    normally for all projects I have set "Re-verify existing backlinks
    every 1440". Is that correct or not?

    "delete those who are no longer valid and delete unused accounts"

    You mentioned this portion only... I have done it like this...
    At last I clicked "Delete unused accounts" and the following message came up:

    In the advanced options I have selected the Identified option, but no
    submission is happening.

    My Settings 

    20 Projects (Tier 1) only
    10 private dedicated proxies
    Every Project has niche related keywords above 700
    30 Rediff mails (tested)
    Use URL variations with 20 (Ticked)
    Use the root/main URL with 30 (Ticked)
    Above 700 Niche related keywords
    Collect keywords from target sites (ticked)
    Use collected keywords to find new target sites (ticked)
    Put keywords in quotes when used in search queries  (ticked)
    Try searching with similar looking keywords (ticked)
    Tier 1
    Engine Article/Document Sharing/Social Network/Web2.0/Wiki
    Filter URLs
    Skip sites with more than 25
    Skip sites with a PR below 3
    Skip also unknown PR
    GSA Captcha breaker
    3 days but only 8 submissions are done. Why is it like this?
    Is there any problem on my side? Please tell me.

  • Sven
    What about the search engine selection? Can you make a screenshot of that plus the options below?
  • Tim89
    Well woohoo, finally, you have changed the target list to Identified instead of Verified.

    Well now, you just need to find targets to post to... Add as many keywords related to your site as you can so SER can scrape potential targets. Because you have set your PR and filter settings to anal amounts, you're not going to get an abundance of links like that. So, as I said, get yourself more than 5 proxies (I see in your screenshot you only have 5), add more keywords, and perhaps think about lowering your link requirements if you want more links.
  • VMart Natural SEO
    My search engine selection: I have selected language "Check by language/English".

  • Sven
    You can disable the box "Use URLs linking on same verified URL" because you do not use such engines, I guess.
  • Tim89
    edited July 2016
    Hey @deniro72, why would you have SER scrape and save identified targets to your verified target list? Where do your verified targets get saved to?

    At the end of the day, everyone should want to be building a massive verified list. With this list you'll be able to create links whenever you want, no faffing around.

    An identified link, or a search-engine-scraped target, is not a 100% target: it has been picked up by footprints, which doesn't mean SER will definitely be able to make a backlink from it. A verified link, on the other hand, is a target where you know 100% you can get another link over and over again, until the domain goes down or they change their platform or tweak it to prevent spam.

    If you're not saving your verified links then you're wasting a lot of effort in finding new targets.

    If you want a project to find new targets and only post to those new potential targets, then you should have "Identified" selected alongside your search engines and leave the "Verified" list unchecked. Then you'll want to go to the SER advanced options and reconfigure the site list structure to how it was originally: Identified saved to Identified, Verified to Verified. Once this is done, SER finds a target > saves it into the identified list > makes a backlink > saves the backlink into the verified list. But I thought this was common knowledge =))
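    The found-target > identified > backlink > verified flow described here can be sketched as a small loop. This is a hedged illustration of the idea, not SER's actual internals; `try_post` is a made-up stand-in for registration, captcha solving and posting:

    ```python
    # Illustrative sketch of the site-list flow discussed above; all names
    # are hypothetical and not part of GSA SER.

    def run_cycle(scraped_urls, identified, verified, try_post):
        # 1. Targets found via search-engine footprints land in "identified".
        identified.update(scraped_urls)
        # 2. The project pulls targets from the identified list and posts.
        for url in sorted(identified):
            if try_post(url):
                # 3. A confirmed backlink promotes the target to "verified",
                #    so later projects can reuse it reliably.
                verified.add(url)

    identified, verified = set(), set()
    # Stand-in poster: pretend only .com targets accept the post.
    run_cycle(["http://a.com", "http://b.org"], identified, verified,
              lambda url: url.endswith(".com"))
    print(sorted(verified))
    ```

    The thread's configuration mistake, in these terms, was pointing the project at `verified` while it was still empty, so the loop had nothing to pull from.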
  • Tim89
    That's what I said it does. Practically, however, the link first gets stored in your identified list, which is then used by your projects to pull targets from.

    When SER identifies a link while searching the search engines with keywords, it stores the identified targets in the identified site lists, lol. I thought that was self-explanatory, to be honest :))
  • VMart Natural SEO
    edited July 2016
    :( Still I'm not getting any verified links, please check my settings.
    Please help me..

  • Under proxies, check Submission and Private, and also Verification and Private.
  • Your filters are really high: no more than 25 OBL and PR 3+. Why don't you see if it posts when you remove those filters?
  • VMart Natural SEO
    I did both changes but my issue is not solved :(
  • shaun
    @webm200 Sorry for the delay, I have been crazy busy and getting back to you has just been getting pushed to the bottom of my to-do list.

    Could you just confirm the fault you are experiencing again mate, as there is loads of stuff posted now. Is it still only getting one domain verified in your projects?

    Also, your SER is set up pretty badly. Although SER is a single tool, you should realise that it has the built-in capability to perform a wide number of jobs. In my opinion it's best to have individual copies of SER set up to do each of them, rather than one setup doing them all like this one, especially when you can only have it running on 100 threads, as they are just getting wasted.

    Anyway, a few things I noticed from your screenshots...

    • Untick Web 2 and doc sharing, you are wasting crap tons of resources on them. Web 2 is totally useless, and doc sharing sites are few and far between and essentially pointless in SER without a built-in ability to spin the content in the PDFs, as it's just building duplicate content.
    • You have 128 search engines ticked; that's crazy high, so you are wasting crap tons of threads on that too. Try ticking just Bing and nothing else.
    • You are pulling from Identified and Verified on building projects; in my opinion it's a waste of time. Additionally, you have the "continue to post to failed sites" box ticked, so essentially SER is identifying a target URL, putting it in the identified folder, then trying to post to it, failing, and retrying over and over again, wasting resources.
    • You have "analyse and post to competitors backlinks" ticked; again, no point on a building project. It is essentially link extraction. I have never used the built-in SER version, but the Scrapebox version rapes resources, so again you are wasting resources on a pointless task.
    • Skip sites with more than 25 outgoing links on a page: I know you upped it to 50, but just get rid of it mate, it's pointless for contextuals. That type of filtering is for blog comments and stuff, but even then they shouldn't be used on T1 anymore, so it's pointless too. Untick it. I had a little look at my contextuals and they have anywhere from 10-75 links on the page; most are internal links that are unavoidable. You are missing out on a lot of verifieds.
    • Skip sites below PR3: I know you are using the Yandex TIC conversion, but just untick it mate. SER links are for blackhat automation; if you want links with good metrics then scrape and analyse with Scrapebox, then manually post to the gold. So again you are losing out on loads of contextuals.
    • Skip sites with unknown PR: I have not used the Yandex TIC feature so I'm not 100% on this, but I'm guessing it could be costing you targets again.
    • Try to skip creating no-follow links: untick it mate. I have no idea how this feature works, but it had a low success rate when I tested it and also prevented do-follow links, so it's costing you even more targets.
    • Avoid posting to sub-domain sites: I would untick it, it's costing you more targets.
    • You have shit tons of emails in the project; try 3. Put them in the project and then click the test button and make sure they actually work. You are wasting loads of your threads here again by having so many emails.
    • Untick "decrease CPU and memory usage". If you are on a VPS it will manage the hardware usage for you; it might cause you problems connecting with your client, but it will let you in after a few tries.
    • 100 threads/10 proxies: up your thread count. The 10 threads per proxy thing is a load of crap; it came from when people used SER to scrape for targets on Google. Swap over to Bing, and your thread count is used to complete loads of tasks, so out of your 100 threads maybe 10-25% or something are being used actively on the search engines. I have no idea about your hardware, but I would try 300 threads and go from there.
    • Captcha Breaker retry 3 times: set this to 1. If CB doesn't get it on its second try then it's not getting it. I set it to 0 on my building projects, but you are verifying too.
    • You are posting to dead engines like EasyBlog. Again wasting resources; SER can't actually post to them now, so you are wasting resources finding them and trying to verify, but it will never work.
    • Posting to shit engines like PHPWeb: most are on Asian sites that I would miss totally. Just untick it.
    • I saw you had proxies selected for submission and verification; just untick them. Your rig will be far too slow to suffer any penalty for multi-posting to the same domain on the same IP, and using a proxy will slow it down further.
    • I saw you are also using public proxies. Never use public proxies for anything other than maybe scraping, and even then it has to be out of desperation.
    The best way to use SER, in my opinion, is to have SER/HReffer/Scrapebox scraping on one setup, 1 copy of SER verifying those links, then push all those links to your builder rigs. Each step is separate so it can dedicate 100% of its resources to its task.

  • VMart Natural SEO
    I'm happy about your good reply and suggestions.

    1) "Is it still only getting one domain verified in your projects?"
    Reply: not a single domain is getting verified :(
    2) Yes, I'm going to run a single project to check.
    3) I unchecked the Web 2.0 and document sharing engines.
    4) In this section I have a small confusion: I'm targeting English-speaking countries, so I selected the language English, and that selected 128 search engines. As you told me, I manually added the search engine in the advanced options, and only after that was it added. So do I need to disable all the other 128 search engines or not?
    5) I have ticked Options/Identified/Verified; do I need to tick Failed or not? I'm not able to understand this section.
    6) "Analyse and post to competitors backlinks": I unchecked this option.
    7) "Skip sites with more than": I have unticked it in my T1.
    8) I don't have Scrapebox; I just installed Scrapebox but I have no idea how to use it. So can I tick "Yandex TIC"?
    9) If I don't tick this option it posts to unknown PR, so I need to tick this option.
    10) "Try to skip creating no follow links": I unticked this option.
    11) "Avoid posting to sub domain sites": I unticked this option.
    12) I have tested with 3 Rediff email addresses.
    13) Unticked decrease CPU and memory usage.
    14) I have changed to 300 threads.
    15) Captcha Breaker retry: I have changed it to 1.
    16) I have seen "easyblog" in the article section. How can I find dead engines like "easyblog"?
    17) I have seen "PHPWeb" in the article section. How can I find the Asian sites?
    18) I have unticked Settings/Submission/submission and verification proxies.
    19) In "Search engine" I have unticked the "public" proxy option.

  • shaun
    4 - Yeah, disable the rest.
    5 - I haven't used SER the way you are in years, so I'm not sure.
    8 - I personally don't use PR or TIC.
    9 - Why?
    16 - You just have to test them. Off the top of my head, the contextuals EasyBlog, Catalyst and SupeSite are dead.
    17 - Right click the project where it was verified > Show URLs > Verified, then check the text in the fields or right click the actual entry and open the URL. I don't use PHPWeb at all right now, but I have some plans for it when Sven adds the location feature to the verified pane.

  • VMart Natural SEO
    edited July 2016
    Please help me.
    I have tried more methods and verified a lot of articles, but I didn't get a single submission.
    I think there is a major setting mistake on my side.
  • shaun
    1 - Are you online right now and for the next 30 or so minutes?
    2 - Do you have TeamViewer?
    3 - Please type out the problem you have as best you can in this thread.