
[Feature Request] Actual SE Results Imported & Posted To

AlexR Cape Town
edited October 2012 in Feature Requests
I have been playing around with the footprints and the various platforms to use for SER, and I have been thinking about a feature that could be very useful.

Basically, my logic is as follows. Let's say I take keyword 1 and run it through an SE. Each SE returns a different set of results based on what it judges to be the best matches for that keyword. So, to get ranked for that keyword, it would be best to have a link from the sites the SE already deems worthy of listing for it. We also know these sites are indexed, since they appear in the SE's results. Obvious.

But what it seems we are doing with GSA is searching for "keyword" + "footprint". These searches may produce results similar to "keyword" alone, but in most instances they will be quite different and will surface a lot of unrelated sites. If we query a number of different SEs with just "keyword1", we should get a good list of all the sites that actually rank for keyword 1; "keyword 1" + "footprint" is a different search term altogether! Instead, what we're doing is selecting a number of platforms and then trying to get results from that. Different keywords/niches have a different prevalence of platform types.
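
To illustrate with a made-up query (my own example, not anything GSA generates): searching for seo tips "Powered by WordPress" finds pages that happen to contain that footprint text somewhere on them, while searching for seo tips on its own finds the pages that actually rank for the keyword. The two result sets are very different animals.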

What I'd like to see is as follows (a rough sketch of the idea follows the steps):
An option to have GSA scrape the actual target URLs from the SEs WITHOUT adding the "footprint" to the search term. So literally just searching for "keyword1" and returning results.
Step 1: It gets a list of relevant target URLs like this (without the platform footprint) for all of the SEs selected under options.
Step 2: It takes this list of target URLs and sorts them into platforms.
Step 3: It gives you a statistic: xxx/yyy target URLs platform identified, AND how many of each platform type make up these target URLs.
Step 4: It then allows you to post to the URLs where it identified the platform.
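
To make the flow concrete, here is a minimal sketch in Python. Everything in it is assumed for illustration: search(), fetch() and post_to() are hypothetical stand-ins supplied by the caller, and PLATFORM_FOOTPRINTS holds a few made-up sample markers; none of this reflects GSA SER's actual internals.

    from collections import Counter

    # Assumed sample footprints; a real footprint list would replace these.
    PLATFORM_FOOTPRINTS = {
        "wordpress": ["powered by wordpress", "/wp-content/"],
        "vbulletin": ["powered by vbulletin"],
        "phpbb":     ["powered by phpbb"],
    }

    def identify_platform(page_html):
        """Return the first platform whose footprint appears in the page, else None."""
        html = page_html.lower()
        for platform, marks in PLATFORM_FOOTPRINTS.items():
            if any(mark in html for mark in marks):
                return platform
        return None

    def run(keywords, engines, search, fetch, post_to):
        # Step 1: scrape plain-keyword SERPs - no footprint appended to the query.
        targets = {url for kw in keywords
                       for se in engines
                       for url in search(se, kw)}
        # Step 2: sort the scraped URLs into platforms it knows.
        identified = {}
        for url in targets:
            platform = identify_platform(fetch(url))
            if platform:
                identified[url] = platform
        # Step 3: the xxx/yyy statistic plus a per-platform breakdown.
        print(f"{len(identified)}/{len(targets)} target URLs platform identified")
        for platform, count in Counter(identified.values()).most_common():
            print(f"  {platform}: {count}")
        # Step 4: post only where the platform was recognised.
        for url, platform in identified.items():
            post_to(url, platform)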

The advantages of this would be:
1) You're posting to URLs that already rank for your keyword and are already indexed. Adding in footprints gets you lots of extra URLs that may not be related; you might not actually need those links to get ranked, and scraping them spreads resources thin, only for many of them to be removed by a filter.
2) Your link-type spread for THAT keyword might be more natural.
3) It won't add, and then have to parse, so many URLs that are unrelated to your keyword (adding in the "footprint" generates all kinds of extra URLs). These could be added later by selecting platforms and letting GSA do what it normally does.
4) You'd get a better feel for what link spread is needed to rank.

I'd really appreciate other users' thoughts on this!





Comments

  • Sven www.GSA-Online.de
    That is a very special case in my eyes. Also, you would not get many target sites to post to, since hardly any of the ranking sites run a fixed platform you can post to. It's a nice idea, but rather special.
  • In all honesty it's a nice idea, but I don't think the logic is quite valid here. The fact is that a site ranks on the 1st page for over 200 different reasons, only one of which is the quality of incoming links. Simply getting a link from a site that is in the top 10 is unlikely to be much of a factor for your site (unless it's from the NY Times!).

    I think it's a case of the bigger picture: getting high-quality links of our own and good on-page SEO together with compelling content (plus 197 other factors, most of which we know little about :) ).

    Also, as Sven says, the number of platforms which hold 1st-page positions for any given keyword is going to be low - I know this because I have tried it numerous times using SB and got only 10 results for thousands of keywords on another project I did.

    All that said, you could always modify the footprints and just search on your keywords. You could also scrape using SB and import the list to test the results for yourself ;) (a quick way to check the platform spread is sketched below).
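
    If you want to put numbers on that yourself, something like the following could chew through a Scrapebox export. It's only a sketch under assumptions: urls.txt is a hypothetical one-URL-per-line export, and the two footprints are made-up samples standing in for a real footprint list.

        import urllib.request
        from collections import Counter

        # Assumed sample markers, not a real footprint list.
        FOOTPRINTS = {"wordpress": "powered by wordpress",
                      "phpbb": "powered by phpbb"}

        def platform_spread(path="urls.txt"):
            counts = Counter()
            for line in open(path):
                url = line.strip()
                if not url:
                    continue
                try:
                    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
                except Exception:
                    continue  # dead or slow targets just get skipped
                hit = next((p for p, mark in FOOTPRINTS.items() if mark in html.lower()), "unknown")
                counts[hit] += 1
            return counts

        print(platform_spread())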
  • AlexR Cape Town
    Remember that it's not just the top 10. 

    Assume you have 1,000 related keywords (often more); multiply this by the number of SEs you select and then by the search-engine depth (sometimes 1,000 URLs deep) = MANY TARGET URLs.
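
    To put illustrative numbers on that (all assumed): 1,000 keywords × 5 SEs × 1,000 results deep = 5,000,000 candidate URLs before de-duplication - plenty of raw material for the platform sort.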

    THEN it could just sort these into platforms it knows. 

    As for the 200 reasons - yes, there are many, but inbound links have to have the lion's share of the influence.

    e.g. type "pdf" into the SEs. Here's a site with little on-page SEO that is PR 9 and no. 1: "http://get.adobe.com/uk/reader/" (I know it's only one example), but I do feel links and link type make up a larger proportion of the algorithm.

    The feature I am requesting would be very nice if possible. :-) The gist of what I'm asking for is an option to use keywords rather than footprints for the search, and then to sort the results.
  • lol, yeah, the PDF example is always a good one, but do consider the sheer volume of links as well as the overall authority of Adobe :). You can still do what you want using GSA anyway, can't you? Open up the scraper and whack in your keywords without any prefixes, or use SB.
  • AlexR Cape Town
    @takeachance - out of interest, if you had to split the ratio (i.e. divide the effect of the 200 factors into 3 categories) between:
    1) Inbound Links
    2) On Page SEO
    3) Other Factors

    Roughly how would you divide it? Just curious :-)
  • Well, it's a good question - the %'s change all the bloody time, particularly in recent months... In answer to your question, my steer on this is:

    1) Inbound Links - 45%
    2) On Page SEO - 25%
    3) Other Factors - 30%

    I also think that pre-September '12 the on-page percentage was much higher; I really believe they have tweaked this downwards post-September. The second biggest percentage, of course, is the stuff we don't know :)
