
Adding random links for T1 contextual

edited March 2013 in Need Help
I want to add some spin articles as T1 to my money site.  But I don't want to only drop my link.  I want to drop some other links from other related sites in the article, say 3-4 more... that way it looks like a resource, and no one can ever point a finger at my money site since there are more links in there.

What is a good way of doing this?  Which features of GSA help, and do any of the tools like KM, WAC, or ACW help with this?

Thanks a lot!

Comments

  • hmm... I'm interested in the answer as well. That's a nice little trick to cover the link footprint too... it just might be worth doing on all tiers....
  • Just put in some spintax, perhaps?
    So it rotates the URLs/anchors?
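    For example, a single spun link line could look something like this (the URLs and anchors here are just placeholders, not a recommendation):

    <a href="{http://related-site-one.example/|http://related-site-two.example/|http://your-money-site.example/}">{keyword one|keyword two|brand name}</a>

    SER would pick one alternative each time the article is posted, so the URL/anchor combination changes from submission to submission.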


  • @zeusy, yes, I can put in spintax; I'm hoping for an automated/random way of doing it.

    I'd love for one of the tools to crawl the web (say Google/Yahoo) and just put in random links from there.

    This way, you could get away with rougher spins as long as there is a large list of "resources".  Manual reviewers - even users - would be drawn to those and ignore the text.  The search engines' red flag would also not go up as fast, since it doesn't look like self-promotion: the links point to sites totally unrelated to you, but contextually related to the article.

    I'd hate to write a PHP script to do this (and create the spins as you suggest).  I hope one of the tools - or even GSA - will do it.

    Sven, tool devs, anyone?
  • On second thought, I think this needs to be done by GSA, @sven.

    So here goes, a new macro:

    random_link_from_web['keyword']   (syntax bad, just idea...)

    The above would return one random link every time it is posted.  If you repeat it, you get multiple random links, and GSA would make sure there are no repeats within the same submit.

    Also it would accept spintax, so you can randomize the keywords to get lots of different links.

    I think it's easy to do, and a lot of people would love it.  It would bust through Google's filters, I think.
  • I think that is an AWESOME idea and I agree it would completely set GSA apart from the pack! Not to mention it would certainly give the big "G" some trouble determining any patterns...
  • @hackersmovie thanks for supporting the idea.

    A bit more:

    If one has:

    random_link_from_web['keyword'] 
    random_link_from_web['keyword'] 
    our link
    random_link_from_web['keyword'] 

    Then GSA would randomly rotate them, so our link is not always in position 3. he he.
  • I think I might be better off writing my own PHP script that just inserts a big chunk of scraped text and links into SER.  The good thing is that the amount of spun text needed is drastically reduced, which makes it easier to create a decent "resource".  Once/if I write that script, I will be happy to share it.
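    Just to sketch the idea (the file names, the 3-link count and the money URL below are all my own assumptions, nothing SER-specific), something like this PHP could build a "resource" style block from scraped URLs and text snippets, with our link shuffled into a random position:

    <?php
    // Rough sketch: build a resource-style HTML block from scraped data.
    // urls.txt     - one scraped, topically related URL per line (assumed, non-empty)
    // snippets.txt - one scraped sentence per line (assumed, non-empty)
    $urls     = array_filter(array_map('trim', file('urls.txt')));
    $snippets = array_filter(array_map('trim', file('snippets.txt')));
    $money    = '<a href="http://your-money-site.example/">YOUR ANCHOR TEXT</a>'; // placeholder

    shuffle($urls);
    $links = array();
    foreach (array_slice($urls, 0, 3) as $u) {              // 3 random "resource" links
        $links[] = '<a href="' . $u . '">' . parse_url($u, PHP_URL_HOST) . '</a>';
    }
    $links[] = $money;   // add our link...
    shuffle($links);     // ...and shuffle so it is not always in the same position

    shuffle($snippets);
    $out = '';
    foreach ($links as $i => $link) {
        $out .= $snippets[$i % count($snippets)] . ' ' . $link . "\n\n";  // one snippet per link
    }
    file_put_contents('resource_block.txt', $out);

    The output file could then be pasted into the spun article (or pulled in from an external file), and re-running the script per project gives a different link set and link order each time.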
  • Are you going to go out and find 3-4 other sites that are related to your own? Linking to big names is going to let anyone with half a brain figure out who posted the article if it's ever manually reviewed.



  • @greenwillow, very good point.  Like linking to Wikipedia, like everyone does, he he.

    So what is the solution?  One could do the reverse :) check the PR of the root domain and only take links from the lower-PR ones.  That way you shine!

    What do you think?
  • I don't think it really matters where the other links go. Even when I'm manually link building I still link out to high-authority sites as well as my own, and they nearly never get removed. I think 3-4 additional random links out of any article, profile, etc. would be an amazing asset to GSA SER, not to mention how much more natural it would make things look. I hope @sven is reading this.... :)
  • Yes, me too :)

    I used to do this on my blog posts.  I had bought a tool from somewhere (I don't recall where) that would pull related blog posts with some text snippets, and I could quickly insert them into my post.  This way, it reduced the amount of text I had to write.

    If Sven or someone else does this, it would be very cool.  You could put less focus on the quality of the article pages.
  • edited April 2013
    AlexR, I read your post and this is the same, with the exception that we are saying GSA should just randomly find the links AND pick which words in the text to use as the anchors. That way there is no need to upload any URL lists or additional anchor text files. It really doesn't matter where the link goes or which anchor is being used. The whole idea is to just get 2-4 more external links into the article. This would nearly eliminate any footprints.
  • I have used software that does this for blog networks and it definitely helps with a more natural linking pattern. This IS what people do naturally :) So +1 for this idea
  • Couldn't this be done with the %spinfile macro? I haven't tried it, but I think you could put a line something like this in your article...

    <a href="%spinfile-urllist.txt%">%spinfile-anchortext.txt%</a>   (I think the syntax is correct.)

    where...
    urllist.txt - contains a list of urls you want to link to (one per line)
    anchortext.txt - contains a list of anchor texts to use for the hyperlinks (one per line)
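    Just to make that concrete (these entries are placeholders), urllist.txt could contain something like:

    http://related-site-one.example/some-article
    http://related-site-two.example/another-post
    http://related-site-three.example/resources

    and anchortext.txt something like:

    useful guide
    more on this topic
    related resource

    As I understand it, SER picks a random line from each file every time the macro is resolved, so each posted article gets a different URL/anchor pair.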
  • edited April 2013
    Excellent idea @DavidA2, I didn't think of that. I'll test it. That's why many minds are much better than one :)
  • @DavidA2 even if that would work, you would still be posting with the same URLs and anchor text. It would be far more powerful if GSA SER could scrape the URLs and select random text to use for the anchor. This way you have different, fresh links and anchors in every article...
  • I do it like this:

    {|{|%file-filelocation%}} and in that file I put a normal HTML spin of all the URLs and keywords, like this: <a href="{url1|url2..}">{anchor1|anchor2}</a>

    And that is all you need. It works like a charm. I put in the empty spin option because sometimes I want that link and sometimes I don't, so everything is more random.

  • @hackersmovie - I doubt that this is something @Sven is going to add to SER. It just adds more overhead. It would take you five minutes to run a URL scrape and a keyword scrape in Scrapebox and get several hundred (or thousand) URLs and keywords related to any niche you wanted. This would be more than enough to handle many projects in any given niche.
  • edited April 2013
    @David I'm not so sure. I think it might be quite easy to implement even using an existing URL list like failed URLs.

    I'm going to test @Dexter's method and see how it goes.
  • @DavidA2 I understand that; however, you are still limited to only the URLs/anchors in your file. I run a LOT of campaigns in GSA, SENukeXcr and NHSEO. I don't have the time or resources to keep updating profiles/.txt files/etc. I'm simply stating that IF it were implemented as part of the process in GSA, it would really give it a "leg up" on the competition.

    I'm one of the few, it seems, who couldn't care less about LPM and how fast I'm building links; I simply want to build links that stay where they are put, and to build a LOT of links over time. Even using these automated tools I still focus on quality vs. quantity. As an example, my articles use 100 spun master articles and I change them every 30 days. I also have a master anchor text file of over 10,000 characters that includes all the generic anchor text, URL variations and keywords I need. I have also hand-written my own blog comments, etc.

    I put the effort into the quality of the content and have been building these "lists" for over 5 years. They prove to be very effective, but the only thing I haven't found a solution to is what this post is all about. Add that to GSA SER and now you have a hands-down winner, in my opinion.

    Just my 2 pence...
  • +1 what hackers says
  • Just remember this will greatly reduce the linkjuice you receive from that article. The linkjuice is spread out over the number of outbound links on a page, so if you add 5 random links you will only receive 1/6 of the linkjuice you would normally get when yours is the only link on that page. That's why blog comments (with thousands of OBL) are so worthless.
  • @pietpatat, that is true, but you trade the PR loss against Google not thinking of the page as self-serving, and you also get the chance to drop more content, and more varied content, without worrying about reviews, etc.

    @hackersmovie, for production your point is true.  Otherwise one has to create the spin files.  I'd rather have set-and-forget than worry about spin files...


  • @AlexR - very similar idea.  If you know PHP/etc. it is not hard to create some massive spins, even put them into external files, and have them pulled in.  So you just run the spin per project with, say, scraped keywords and let it rip.  But I still like the idea of having it all automated :)

    @DavidA2 - the Scrapebox idea is good, I just don't see how we can get the text snippets and related URLs and pull it all into spin files.  That surely takes some programming?
  • AlexR (Cape Town)
    @DavidA2 - I use the spinfile and have a decent file of external URLs, but the issue is that the link is always in more or less the same place. That's why I wanted to randomise it.

    Also - using spinfile within articles starts creating nested macros... the article is pulled with a macro, then within the article many more macros get pulled. I would prefer a top-level approach.

    @sobiman - I have massive spins in external files but, as mentioned to David in this comment, the links are generally in the same place, and nested macros across 300+ projects = trouble!


  • @AlexR I think the only way to completely eliminate footprints is to write an external program (e.g. PHP code) to take the spin files and randomly drop in the stuff you need.  In another thread I was asking Sven to add special randomization, but it gets pretty hard.  So I am thinking of massaging the spin files with random stuff (e.g. multiple random resource links, etc.). Pretty easy if you are handy with PHP.
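    A rough sketch of that kind of "massaging" script, assuming an already-spun article in article.txt and a scraped URL list in urls.txt (both file names and the 0-3 link count are just placeholders I picked):

    <?php
    // Rough sketch: inject 0-3 random resource links into an already-spun article
    // at random paragraph boundaries, so count and position differ on every run.
    $article = file_get_contents('article.txt');                   // the spun article (assumed)
    $urls    = array_filter(array_map('trim', file('urls.txt')));  // scraped URL list (assumed)

    $paragraphs = preg_split("/\n\s*\n/", $article);               // split on blank lines
    shuffle($urls);

    $howMany = rand(0, 3);                                         // sometimes no extra links at all
    for ($i = 0; $i < $howMany && $i < count($urls); $i++) {
        $link = '<a href="' . $urls[$i] . '">' . parse_url($urls[$i], PHP_URL_HOST) . '</a>';
        $pos  = rand(0, count($paragraphs) - 1);                   // a random paragraph gets the link
        $paragraphs[$pos] .= ' ' . $link;
    }

    file_put_contents('article_with_links.txt', implode("\n\n", $paragraphs));

    Run it per project (or per batch of articles) before feeding the files to SER, and the extra links end up in a different count and position every time.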
  • Seems to me that there is plenty of flexibility built into SER to do what everyone (or at least most people) is wanting here. For my projects, I use the %spinfolder macro to get the articles from a folder. The folder has 10-20 different spun articles. I can put my links (and videos, images, etc.) in different places in each article. I can even put a couple of links, videos, or images (in different places). If I spin these too (the links/videos/images) and include a "blank" (no link) option in the spin text, there will be so many different combinations that the articles really won't look similar enough to tell (in my opinion).

    The articles will be completely different (because they are spun)...some will have one link, some two (and some even none)...some may have a video/image, some not.

    I think sometimes we try to complicate things too much. Sure...maybe @Sven could make it where it was truly "random", but does it REALLY need to be? I don't think so.
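    To illustrate the "blank" option (placeholder URLs/anchors only), a spot in one of the spun articles could look like:

    {<a href="http://your-site.example/page-1">anchor one</a>|<a href="http://your-site.example/page-2">anchor two</a>|}

    The empty last alternative means some of the generated articles carry no link at that spot at all, which is where a lot of the variation comes from.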
  • @DavidA2 - yes, using %spinfolder is the way to go.  That way, the content can be manipulated with automated or programmed tools before being fed to SER.
  • @DavidA2 & @sobiman let me play devils advocate for a moment....

    I purchased GSA SER because it AUTOMATES the process of building links. The same reason I have NHSEO and SENukeXcr. If I had the time to generate/create/update all these spin files, macros, etc. I'd just manually build links. (That's a stretch but, bear with me....)

    That being said, I do understand that there is SOME involvement/interaction needed on my part. I need that to be as little as possible, though. I probably use this software much differently than most do. I'm not promoting fly-by-night websites/Adsense/ClickBank/Amazon stuff. I am ranking client sites. Real, human-owned digital properties (and a LOT of them). Between client meetings, website updates and research to better my service, I don't have time for spin files, macros, etc. I need these tools to increase in value by adding features that further AUTOMATE the process and reduce the time I have to be involved.