
2 feature requests, must be idiot proof

LeeG Eating your first bourne
edited January 2013 in Feature Requests

##### Request 1 ##### On SER start-up or shut-down, an option to clear duplicate URLs and domains

Start-up and shut-down being the times when resource use is at its lowest.

If you're pedantic, it would be start-up.

Two tick boxes in the window, with an extra child-proof warning about removing duplicate domains.

Reason: it helps reduce memory usage and adds more speed to pulling in URLs from site lists.

 

##### Request 2 ##### Random search engine choice

Option to use a random search engine, either from all the major engines that every country has, or a random engine from the global ones, i.e. a random Google, random Bing, random Yahoo.

I personally only use random Googles.

This will help reduce the bleats about people's proxies being blocked by the major search engines. Plus it constantly rotates the engines used, so you're not constantly hitting the same one trying to pull search results.

Not sure if any child-proofing needs adding to this idea, but if it's implemented, I will try my hardest to break it :D
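
The idea behind the request can be sketched like this. It's a minimal Python illustration with made-up engine pools; `ENGINE_POOLS` and `pick_random_engine` are hypothetical names, and SER's real engine lists and selection logic are not shown here:

```python
import random

# Hypothetical engine pools -- the entries are illustrative, not SER's internal lists.
ENGINE_POOLS = {
    "google": ["google.com", "google.co.uk", "google.de", "google.fr"],
    "bing":   ["bing.com (en-us)", "bing.com (en-gb)"],
    "yahoo":  ["search.yahoo.com", "uk.search.yahoo.com"],
}

def pick_random_engine(pool=None):
    """Pick a random engine from one named pool (e.g. 'google'),
    or from all pools combined when no pool is given."""
    if pool is not None:
        return random.choice(ENGINE_POOLS[pool])
    all_engines = [e for engines in ENGINE_POOLS.values() for e in engines]
    return random.choice(all_engines)

# "I personally only use random googles":
print(pick_random_engine("google"))
# Random engine across all the majors:
print(pick_random_engine())
```

Rotating the pick on every query is what spreads requests across engines, so no single engine sees all of one proxy's traffic.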

Comments

  • This sounds like a great idea! I'm always clearing out duplicate URLs and domains from my site lists.
  • ron SERLists.com
    edited January 2013
    @LeeG - Please explain your comment about child-proofing removing duplicate domains. I thought that was ok to do to help with efficiency. Now you have me worrying if I screwed something up.
  • LeeG Eating your first bourne

    Child-proofing, or as it can at times be called with me, Lee-proofing, is my sheer ability at finding ways of breaking things, in ways Sven has never been able to think of :D

    If you have "avoid posting to the same domain twice" unticked, which I do on everything below links built to money sites, you end up deleting a lot of links from the site lists.

    If you are using social bookmarks, these links soon build up and help you build more links from the accounts GSA sets up. With the likes of Pligg, once SER sets up an account, it will post multiple links from those accounts.

    On the first pass from the global site list it will find one URL to use; a bit later two links, then four links. It's a gradual build-up and helps boost links to the tier above. Blogs, again, it uses different blog pages to link to different pages on the tier above. Forums, you can only set up one account per email account on most, so you're not going to be building multiple accounts with those.

    If you think you have screwed up, that can easily be rectified with a couple of clicks:

    Options > Advanced > Tools > Import lists from verified, and you will be back on track.

     

  • ron SERLists.com
    edited January 2013

    I was on a completely different page. I thought you were referring to Options => Advanced => Tools => Remove Duplicate Domains (and URLs). Ozz said he hits those two areas every once in a while to take out the garbage.

    I always check "avoid posting to the same domain twice" as multiple links from the same domain have no seo value. It's like voting for the same candidate twice in an election. Google doesn't count them and they have said as much. Not that it's some big no-no or anything. I just don't think it helps.

  • LeeG Eating your first bourne

    You're missing the point, ron.

    Removing duplicate URLs is good. I do it on a daily basis, or when I remember.

    That can kill hundreds of thousands of duplicate links in one go.

    Deleting duplicate domains kills site.com/page1, site.com/page9, etc.

    If you're building links to the tier above, you get links from different pages on that domain, rather than all from one page. And if you have an outbound link limit, one page might fail and another pass.

    You have already said you use site lists. If you are against posting links from the same domain, disable that feature. It's no different.

  • LeeG Eating your first bourne

    I was going to sneak this in as an edit, for delayed ron paranoia.

     

    Let's say you have a project running: PR3, OBL 50.

    Your site list has the same domain listed five times:

    web.com/p1 > pr3

    web.com/p6 > pr8

    web.com/p9 > pr0

    web.com/p11 > pr7

    web.com/p4 > pr2

    Which one are you left with if you delete duplicate domains?
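
    To make the point concrete, a duplicate-domain purge can be sketched like this in Python. The `dedupe_by_domain` helper is hypothetical: it keeps the first URL seen per domain, which is one plausible rule; the thread doesn't document which of the five SER actually keeps, only that four of them are gone regardless of their PR:

```python
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Keep only the first URL seen for each domain; discard the rest.
    (Illustrative rule -- SER's actual selection isn't documented here.)"""
    seen = set()
    kept = []
    for url in urls:
        domain = urlparse(url).netloc
        if domain not in seen:
            seen.add(domain)
            kept.append(url)
    return kept

site_list = [
    "http://web.com/p1",   # PR3
    "http://web.com/p6",   # PR8
    "http://web.com/p9",   # PR0
    "http://web.com/p11",  # PR7
    "http://web.com/p4",   # PR2
]

print(dedupe_by_domain(site_list))  # one survivor; the PR8 and PR7 pages may be the ones lost
```

    Whatever the rule, four of the five target pages are gone, which is the objection: removing duplicate URLs only drops exact repeats, while removing duplicate domains drops distinct pages.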

  • marmadou
    edited January 2013
    @ron

    >>> I always check "avoid posting to the same domain twice" as multiple
    links from the same domain have no seo value. It's like voting for the
    same candidate twice in an election.

    Multiple links on a single domain can be very powerful if they point to different URLs on your website. But you need to use new content with each submission to get the boost you want. Also, don't place all your links on the same page of that domain. So if you are using highly spun content in your projects and you have multiple URLs you are promoting, then I would recommend unchecking this option to get better results.
  • ron SERLists.com

    @LeeG - I don't use site lists. I just let SER find targets. I do have the box checked to "Avoid posting URL on same domain twice".

    But your point on duplicate domains makes sense. I'm sure I got rid of a lot of legitimate targets by eliminating duplicate domains. Hundreds of thousands as I recall. So when I eventually get around to using site lists, I will have diminished the number of viable posting targets that I could have used. Crap! I get it now. You didn't trigger paranoia, just regret. Thanks for that.

    @marmadou - I understand what you are saying, but that's not what that feature does. It doesn't do you any good to have multiple links from different pages on the same website pointing to the same page on your website. When you check that box, you are saying "I do not want more than one link (page) from any domain pointing to my one webpage".

    Having said that, if you run a project that, say, has 15 inner pages of your website bundled together in one project, then I might agree with you. Then you would have multiple pages of a target domain posting to multiple pages on your website. Then...what you said makes sense.

     

  • LeeG Eating your first bourne

    ron, I said putting the links back into the site lists is easy.

    It's not like you have totally deleted them; hence this as the way of rectifying it:

    Options > Advanced > Tools > Import lists from verified, and you will be back on track.

    That will add all the verified links you have built up into the site lists.

    Then you kill the duplicate URLs afterwards.

  • ron SERLists.com
    Got it. I was busy reveling in regret. Thanks!
  • AlexR Cape Town
    @ron - "I do not want more than one link (page) from any domain pointing to my one webpage".

    Are you sure? I thought this checks on a per-project basis only?

    @sven - can you confirm for us here?
  • Sven www.GSA-Online.de

    As long as a URL/domain is verified or about to get verified, it is not used for submission again.

    @Lee I could add request 1 as a command-line parameter that you add to your shortcut when starting the program. Request 2 is already done: it is already using a random search engine + keyword + engine footprint.

  • AlexR Cape Town
    So this applies on a global basis, rather than on a per-project basis?
    "avoid posting to the same domain twice"
  • Sven www.GSA-Online.de
    No, that only applies if the option is enabled in the project.
  • AlexR Cape Town
    @sven - what I mean is that if I have 2 projects and have enabled it on both projects:

    Does this mean that if Project 1 gets a verified link on a website domain, Project 2 will not post there?

    Or does it mean that ONLY within Project 1 it will place just one link on a domain, even if you have multiple URLs in Project 1?

  • Sven www.GSA-Online.de
    All options are for that project only... they do not communicate with each other (only tier projects loading URLs).
  • AlexR Cape Town
    @Sven - thanks. 