
INDEXING: RSS Feed Mashup of Verified Links with submission to RSS Aggregators

edited November 2012 in Feature Requests
There was another SEO tool I used before that seemed to really help with indexing.

It would take verified URLs of a project and then mash them up into separate RSS feeds (10 links on a page) on different hosted services (that also carry some good PR).

After that -- the software would then submit all those mashed-up RSS feeds to the RSS aggregators, which gets the spiders coming.

It's my belief that, to spiders, RSS usually equals news, and that XML FEEDS get a higher priority for content crawling and indexing.

Plus -- this gives a search engine spider a more natural way to discover the link, via an RSS feed that links to the page you want indexed, rather than force-feeding the spider directly to that page.

This would make a great Tier 2, 3, 4 and beyond to really get the spiders coming.
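
As a rough illustration, the mashup step boils down to something like this minimal Python sketch -- it assumes nothing more than a plain verified_urls.txt file, and the channel title, link and description are placeholders:

    import random
    from xml.sax.saxutils import escape

    def build_feed(urls, title="Link roundup"):
        # Render a minimal RSS 2.0 document: one <item> per verified URL.
        items = "\n".join(
            "  <item><title>{0}</title><link>{0}</link></item>".format(escape(u))
            for u in urls)
        return ('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<rss version="2.0"><channel>\n'
                '  <title>{0}</title>\n'
                '  <link>http://example.com/</link>\n'
                '  <description>Verified link mashup</description>\n'
                '{1}\n'
                '</channel></rss>\n').format(escape(title), items)

    with open("verified_urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]
    random.shuffle(urls)  # mash the order up

    # Ten links per feed, as described above.
    for n in range(0, len(urls), 10):
        with open("feed_%03d.xml" % (n // 10), "w") as out:
            out.write(build_feed(urls[n:n + 10]))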

By chance would there be any type of module like this in the works?  :D

Comments

  • I used to do that "manually" with ScrapeBox (when I used AMR to post articles). I think the main issue there is really where to upload the RSS feed files; maybe an option to input FTP credentials could work.

    But I'm curious how this module compares to the SEO Indexer we can already get.
  • edited November 2012
    Yeah -- I need this done AUTOMATICALLY from start to finish -- and I believe that with Sven's skills, these automation steps are very doable -- with the RSS Submission Engine optional if need be -- but the actual RSS Feed part would require you to:

    1)  Use the Verified URL list as the source of backlink URLs (there's probably already a code base for this since it's used in other engines, making this part somewhat easy with some rule-based editing). And I gather that Sven created this software in a modular fashion, which allows him to just "snap on" these engines like you would snap together Legos.

    2)  Build an XML-validated RSS Feed page randomly picking _____ to _____ Verified URLs (the user fills in how many verified links to use for each page -- with each page getting a DIFFERENT number of links to throw off any sort of footprint).

    3)  GSA SER then creates those validated RSS Feeds and, depending on the RSS feed hosting service, uploads them either via an API or directly via sign-in to whichever RSS feed hosting service GSA SER has access to. Each page GSA SER creates has a different number of links and a different order, plus a field in the project's Options Center for some preliminary text right before the first link to give the page some CONTEXT.

    4)  While uploading/creating all the mashed-up RSS Feeds, GSA SER records each RSS Feed URL it created at any of the many RSS feed hosting services available. GSA SER then takes the raw URL of each mashed-up RSS Feed and submits all those URLs, individually, to each RSS Aggregator Service -- and here's a good list to include (not sure how up to date that is):


    Alternatively -- having only the RSS Mashup Creation part would be enough as I could just use my registered copy of RSS Submit in the URL above.

    5)   OPTIONS

    I would definitely want the following options added to reduce footprint (a rough sketch follows this list):

    a)  Generating the RSS Feed -- by default, create the RSS Feeds by picking URLs randomly, even if a URL ends up duplicated across feeds. Or, vice versa, have an option box next to this for the user to generate RSS feeds by randomly picking from the Verified URL list without any duplicate URLs in them.

    b)  A box to allow the user to submit EACH RSS Feed to a random set of the available RSS Aggregators, so as to alleviate any footprints.

    c)  Another box to allow the user to submit EACH RSS feed after a varying delay, from ______ minutes to _________ minutes, or ______________ minutes after a campaign is done (instead of right away).

    d)  Rotate Proxies (DEFAULT):  Submit a single RSS feed and, once done, rotate to another proxy for the next feed submission. Another anti-footprint measure.
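
    A rough sketch of how options a) through c) might fit together -- the aggregator list and the actual submission call are placeholders (every service has its own API or form), and proxy rotation (option d) would wrap whatever HTTP client does the submitting:

        import random
        import time

        MIN_LINKS, MAX_LINKS = 5, 15  # the user-filled blanks from step 2
        AGGREGATORS = ["http://aggregator-one.example/",
                       "http://aggregator-two.example/",
                       "http://aggregator-three.example/"]

        def make_feeds(urls):
            # Option a), no-duplicates mode: shuffle once, then slice off
            # a DIFFERENT random number of links for each feed.
            pool = list(urls)
            random.shuffle(pool)
            feeds = []
            while pool:
                size = random.randint(MIN_LINKS, MAX_LINKS)
                feeds.append(pool[:size])
                pool = pool[size:]
            return feeds

        def submit(feed_url):
            # Option b): a random subset of the aggregators per feed.
            targets = random.sample(AGGREGATORS,
                                    random.randint(1, len(AGGREGATORS)))
            for t in targets:
                print("would submit", feed_url, "to", t)  # placeholder call
                # Option c): wait a random 5-45 minutes between submissions.
                time.sleep(random.uniform(5 * 60, 45 * 60))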

    I think that's it -- but I typed this late at night and just realized I never submitted it -- so I may add to it.

    However -- I truly think this could actually be an excellent addition to the SEO Indexer -- which I do have.

    I just think that RSS/XML feeds are EXCELLENT spider food to this day.
  • OzzOzz
    edited November 2012
    My approach to RSS is slightly different. 

    I think it works best if SER picks up the RSS file that is created by some sites and sends it to the RSS directories. Let's take goarticles.com as an example. After you have posted an article, you get an RSS file:

    This RSS feed could be collected once SER has verified the article successfully, and then posted to the RSS directories afterwards. There are many platforms (not only articles!) that create such an RSS feed automatically for you.
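
    For what it's worth, grabbing that feed URL is usually just a matter of reading the page's autodiscovery tag. A naive Python sketch -- it assumes the type attribute comes before href, which isn't guaranteed:

        import re
        import urllib.request

        def discover_feed(article_url):
            # Fetch the verified article page and pull the feed URL from the
            # standard <link rel="alternate" type="application/rss+xml"> tag.
            html = (urllib.request.urlopen(article_url, timeout=15)
                    .read().decode("utf-8", "replace"))
            m = re.search(r'<link[^>]+type=["\']application/(?:rss|atom)\+xml'
                          r'["\'][^>]*href=["\']([^"\']+)', html, re.I)
            return m.group(1) if m else None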
  • That's perfectly fine if SER did that, too.

    But the idea is to give the search engine spiders some food with MULTIPLE URLs on the page that are gathered from different sources.
  • fullspeed, excellent idea. I would love to have this feature. 
    RSS is great for indexing. 
    I do this process manually sometimes, and I am thinking of building a ZennoPoster template for it, as it is time-consuming.

    http://www.feedage.com/ and http://www.feedagg.com/ are easy to do it with, among some others. Some of them can even rank. And with their high PR, Google seems to love them.
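
    Many of these directories and ping services accept the old weblogUpdates XML-RPC ping, so the manual submission step is scriptable -- a minimal sketch, with Ping-o-Matic shown only as one well-known endpoint (feedage/feedagg have their own submission forms that would need separate handling):

        import xmlrpc.client

        def ping(site_title, site_url):
            # Standard weblogUpdates ping; some endpoints also support
            # weblogUpdates.extendedPing, which takes the feed URL directly.
            server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
            return server.weblogUpdates.ping(site_title, site_url)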
  • Yeah -- if I can do this WITHOUT thinking about it -- cuz GSA has already spoiled me -- MORE power to us.

    And if you can do zenoposter -- then let us know if you're willing to share a template.

    There's also that programming bot -- can't recall the name -- UBot -- where you can create a bot to do certain things.

    However -- I think Sven will INCREASE the attractiveness of SEO Indexer if he can include a module like this.

    It'll make the software 10x more effective and let folks not even worry about those indexing services -- which I think are wearing thin now, and I'm pretty sure Google is onto them. I think those services are among the next things Google will ignore and stop paying attention to when they make pings, RSS feeds and such.

    This is why I think it's important to use the third party services like you mentioned above.

    And that's EXACTLY it -- these RSS feed hosting services have high PR and most of them are free.

    I think Google gets a hard-on for services charging money for anything geared toward manipulating search results -- and anything involved with those so-called "Backlinking Schemes" they talk about.    =))   Honestly -- I have NO CLUE WHAT they are talking about  >:)
  • Just for the record -- I would NEVER, EVER, EVER try to manipulate a search engine result at all.  It goes against everything in the motto G started out with: "Don't Be Evil."

    Yeah, right!

    (I would think they no longer embrace or even acknowledge that motto.)
  • I use BacklinkingWizard (from another German coder).

    It reads my RSS feeds from my blogs or from Linkclaw, mashes them up, generates a new one, puts it on web space via FTP and drip-feeds it.

    So I only have to submit the feed URL to RSS sites, Twitter, etc. myself.

    I can also manually add backlinks directly into BacklinkingWizard.

    IMO this tool has two issues:
    - Every time it drip-feeds my backlinks, it does not check whether the links still work
    - It has no API for sending it the backlinks automatically

    I'm not sure if this information helps in this thread, but IMHO RSS feeds are important.
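
    For the record, both issues are small to script around. A hedged sketch -- the FTP host, credentials and file names are all placeholders:

        import ftplib
        import urllib.request

        def is_alive(url):
            # A HEAD request is enough to weed out dead backlinks before
            # they go into the mashed feed; any error counts as dead.
            try:
                req = urllib.request.Request(url, method="HEAD")
                return urllib.request.urlopen(req, timeout=10).status < 400
            except Exception:
                return False

        with open("backlinks.txt") as f:
            live = [u.strip() for u in f if u.strip() and is_alive(u.strip())]
        # 'live' would then be rendered into the feed XML before upload.

        # Push the (re)built feed file to web space over FTP.
        with ftplib.FTP("ftp.example.com", "user", "password") as ftp:
            with open("mashed_feed.xml", "rb") as fh:
                ftp.storbinary("STOR mashed_feed.xml", fh)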
  • It helps, at least as far as helping Sven understand what other tools are doing -- but this is just a feature request, and there hasn't been any word from Sven on whether it may be possible or not.

    I'd be more interested in a tool that uploads to existing RSS feed hosting services like the two mentioned above -- because of the PageRank power of the actual domain itself: the higher the PR of a site, the more OFTEN it gets spidered. PR5 sites get spidered multiple times daily.
  • I just looked at my INDEX rate on a list of 382 verified links that were submitted through SEO Indexer -- and I think there are 8 pages in Google's index as of right now (campaign was run like 2.5 weeks ago).

    So I've figured out a plan -- and which software I'll use for it, since there's probably no chance this gets implemented in SER within the next 2-4 months -- so I'll just go ahead and use an RSS Auto-Generator like the one you are using, Marc.

    Then I'm going to take that and actually HOST it on a PR7 site.

    I'll then take that feed and do two things with it:

    1)  Add it to a PR9 site that also generates an RSS feed which I will submit to RSS aggregators.

    2)  Add it to another PR9 site that will also have an RSS feed that I will submit to RSS aggregators as well.

    Between all the spiders raiding the mashed-up RSS feeds in 3 locations with an average PR of 8.5 -- there has got to be a better % of indexation.

    AND -- this isn't an INDEXING service I'm using, so it's free of charge other than maybe a few pennies of bandwidth that I may have to cough up.

    Either way -- if this process can be integrated into GSA, that would be great, because there is an API that would handle at least the upload of the RSS feeds. In any event, the GSA dev in charge can contact me personally if they are interested in the process I've devised. Unfortunately, I am not going to share it in a PUBLICLY viewable forum where some chump can go hire a programmer for 400 to knock off my idea.  :D
  • So even with a lower index rate from SEO Indexer, is it still worth the purchase price? My budget is somewhat limited, and I could use that money instead to purchase a month or 2 of semi-private proxies.

    However, do you think that SEO Indexer is a must?

    There is a big chance that all the higher tiers will be crawled, since each one is backlinked by the tier below it. Right?

    The only iffy part would be the bottom tier -- let's say Tier 4 in a 4-tier campaign.

    How about if I create a Tier 5 using "Indexer" platforms and such?
  • @sootedninjas -- that was my experience with it -- and perhaps I didn't use it right -- so I can't tell YOU whether it is worth it.  You should TRY OUT the DEMO version first -- see if it does anything for you -- and decide based on that.

    It's just my personal belief that using XML/RSS feeds to get spiders to crawl and index URLs is more effective, and I think that's evidenced by the number of Indexing Services using MASHED-UP RSS feeds.

    If there's ENOUGH search engine bait -- there shouldn't be a problem achieving a 40 - 65% index rate.
  • Well, there was a video by santos showing that verified links are saved in a text file, if I remember it correctly.

    Batch them up and use bulkping.com's RSS Feed Generator Creator and RSS Submit to 25 RSS Directories.

  • I'm gonna do one better than that and put it on steroids.

    I found a script I totally forgot that I bought last year.

    It allows me to dump in ALL my URLs and:

    1)  Create a valid RSS/XML feed on my own site

    2)  Drip-feed a certain number of URLs on a scheduled basis (roughly sketched below).

    3)  Cross-post to Twitter, Facebook and Onlywire.com's stash of places.

    4)  Then I've got TWO RSS feeds to submit rather than one (Twitter and my own domain, plus any others I get from the microblogs Onlywire.com posts to).

    That should get things indexed quite well.
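
    The drip-feed step in 2) is the only part that needs scheduling -- a minimal sketch, assuming a cron job or scheduled task runs it once per interval, with queue.txt and published.txt as hypothetical file names:

        import random

        DRIP_PER_RUN = 3  # how many URLs go live each run

        with open("queue.txt") as f:
            queue = [l.strip() for l in f if l.strip()]

        batch = random.sample(queue, min(DRIP_PER_RUN, len(queue)))

        # Remove the published batch from the queue...
        with open("queue.txt", "w") as f:
            f.write("\n".join(u for u in queue if u not in batch))

        # ...and append it to the list the live feed is rendered from.
        with open("published.txt", "a") as f:
            f.write("\n".join(batch) + "\n")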
  • @fullspeed Are you talking about BacklinkingWizard? If not, could you tell me the name of the tool?

    Thanks Marc
  • @marc -- nope!

    RSSonator.com
  • @fullspeed -- Thanks, I will take a look.
  • @fullspeed How is this solution working for you?
  • my antivirus blocked rssonator.com ...
  • Why not use the free http://rssholder.com/ or Bulkping?