
"GSA Cloud" for sites to post to

Hi,

Could GSA Search Engine Ranker have a global database aggregating all of the sites one can submit to?

For example: if I run a campaign and end up with verified links on 200 sites, it would be nice to upload those sites to a global DB so others won't need to search for them.


That way we wouldn't waste time (and proxies) running the same searches and finding the same sites. If all of them were in a DB that kept growing over time, everybody would benefit.



Does that make sense?

Comments

  • And yes, I understand that the guys selling monthly/weekly site lists would be out of business, but all of us would benefit from it.
  • Or, for example, take a search for a specific script signature like "Powered by WordPress": let's do one big search, find all of those sites, and keep a DB aggregating them! (Then continue with a monthly search to pick up new ones.)


    Each one of those sites will have many pages.
    And people can still say: don't submit to a site if it has more than XXX external links.
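A minimal sketch of the two ideas above, purely for illustration (this is not how SER itself works; every function name here is hypothetical): build search queries from one script footprint, and apply an external-link cap to a candidate page.

```python
# Hypothetical sketch: footprint-based search queries plus an
# external-link cap. Names and the cap value are illustrative only.
from urllib.parse import urlparse


def build_footprint_queries(footprint, keywords):
    """Combine one script footprint with niche keywords into search queries."""
    return [f'"{footprint}" {kw}' for kw in keywords]


def under_external_link_cap(page_url, outbound_links, max_external=50):
    """Keep a page only if it links out to at most max_external other domains."""
    page_domain = urlparse(page_url).netloc
    external = {urlparse(u).netloc for u in outbound_links} - {page_domain}
    return len(external) <= max_external


queries = build_footprint_queries('Powered by WordPress', ['guitar', 'fishing'])
print(queries)  # one quoted-footprint query per keyword
```

A shared DB would then only need to store the URLs such queries discover, keyed by engine, rather than every user re-running the same searches.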
  • cherub SERnuke.com
  • Cherub, I take it you are being sarcastic?


    I admit that I am a newb with GSA - please explain.
  • edited June 2014
    If I upload my verified list to the cloud, I am in a way degrading the value of the sites I am posting to. That is like f*cking yourself, literally. After the Payday update, your 100 custom-scraped verified URLs that are yet to be spammed by other SER users are a million times better than those 100K lists shared or sold here.
  • I understand you.

    What about this idea - Global Blacklist!


    There are many sites that are no good, for example blogs with a closed comment section. There is no point scraping them only to find out that comments are closed.

    Maybe we can have our own common SER global blacklist listing the sites that are not worth scraping, so SER can eliminate them at an early stage.



    What do you think?
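To make the blacklist idea concrete, here is a minimal sketch under stated assumptions (the file format of one domain per line with `#` comments is my invention, not an SER feature): load a shared blacklist and drop candidate URLs whose domain is on it before any scraping or posting happens.

```python
# Hypothetical shared-blacklist filter. The one-domain-per-line format
# with '#' comments is an assumption for illustration only.
from urllib.parse import urlparse


def load_blacklist(lines):
    """Parse a shared blacklist: one domain per line, '#' starts a comment."""
    return {ln.strip().lower() for ln in lines
            if ln.strip() and not ln.strip().startswith('#')}


def filter_candidates(urls, blacklist):
    """Keep only URLs whose domain is not blacklisted (e.g. closed comments)."""
    return [u for u in urls
            if urlparse(u).netloc.lower() not in blacklist]


bl = load_blacklist(['# comments closed', 'deadblog.example', 'spamtrap.example'])
keep = filter_candidates(
    ['http://deadblog.example/post1', 'http://goodsite.example/page'], bl)
print(keep)  # only the goodsite.example URL survives
```

Because a blacklist only records dead ends rather than working targets, sharing it would not dilute anyone's verified list the way a shared posting DB would.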
  • davbel UK
    edited June 2014
    As @sven likes to say, GSA = the company and SER = the software.

    The reasons you think it would be a good idea are the very reasons why it would be a bad idea.

    It would just be spammed by newbs and BH kiddies/bros who don't have a clue and can't be bothered to learn how to use it.

    SER would quickly become an ineffective tool. It would be the same as all the built-in site lists that come with MS, SENuke, UD, etc., but even more pointless.

    Rather than looking for a quick fix, take the time to learn how to use it.
  • I accept your logic, @davbel, and I admit that my original idea is flawed.


    But there are lists of sites that make sense to keep, like:
    Whois sites - on these sites, every SER user gets a whole new page to himself, so no one is hurt by a global list.

    Another site list is PHP Exploit - no one is hurt by a global list there either, right?

    Which means that everyone benefits from these lists, and I am sure it is worth having several others - not for all engines, agreed!, but some of the engines would HUGELY benefit from collaborative lists.



  • Literally shaking my head

    Of course it will make a difference if a site gets spammed. Google and others will either ignore or devalue any links from that site, or worse still de-index it.

    I think before learning SER you need to read up on SEO.
  • About the SEO, you are wrong on this point:

    Google goes per page; the whois engine, for example, will add lots of pages to the whois site, each page with one external link.

    AFAIK there is nothing spammy about that in Google's eyes.
  • Brandon Reputation Management Pro
    @chaiavi There is nothing in any of your posts in this thread that has merit. I would accept @davbel's oft-proven point that if a site is heavily spammed it will be devalued by the search engines.

    Global (all-SER-users) lists of any kind are a bad idea for everyone involved except Google.