Adding random links for T1 contextual

edited March 2013 in Need Help
I want to add some spun articles as T1 to my money site.  But I don't want to only drop my link.  I want to drop some other links from other related sites in the article, say 3-4 more... that way it looks like a resource, and no one can ever point a finger at my money site since there are more links in there.

What is a good way of doing this?  What features of GSA help with this, and do any of the tools like KM, WAC, or ACW help?

Thanks a lot!

Comments

  • hmm... I'm interested in the answer as well. That's a nice little trick to cover the link footprint too.... That just might be worth doing on all tiers....
  • Just put in a spintax perhaps??
    So it rotates the urls/anchors?


  • @zeusy, yes, one can put spintax; I'm hoping for an automated/random way of doing it.

    I'd love for one of the tools to crawl the web (say Google/Yahoo) and just put in random links from there.

    This way, you can get away with rougher spins by padding them out with a large list of "resources".  Manual reviewers, and even users, would be drawn to those and ignore the text.  Also, the search engines' red flag would not go up as fast, since it does not look like self-promotion: the links point to sites that are totally unrelated to you but contextually related to the topic.

    I'd hate to write a PHP script to do this (and create the spins as you suggest).  I hope one of the tools, or even GSA, would do it.

    Sven, tool devs, anyone?
  • On 2nd thought, I think this needs to be done by GSA @sven.

    So here it goes, a new macro

    random_link_from_web['keyword']   (syntax bad, just idea...)

    The above would return one random link every time it is posted.  If you repeat the above, you get multiple random links, and GSA would make sure they are not repeats on the same submit.

    Also it would accept spintax, so you can randomize the keywords to get lots of different links.

    I think it's easy to do, and I think a lot of people would love it.  It busts through Google's filters, I think.
  • I think that is an AWESOME idea and I agree it would completely set GSA apart from the pack! Not to mention it would certainly give the big "G" some trouble in detecting any patterns...
  • @hackersmovie thanks for supporting the idea.

    A bit more:

    If one has:

    random_link_from_web['keyword'] 
    random_link_from_web['keyword'] 
    our link
    random_link_from_web['keyword'] 

    Then GSA would randomly rotate them, so our link is not always in position 3. he he.
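
    A minimal PHP sketch of that rotation (the function name and all URLs here are made up for illustration); this is roughly what the proposed macro could do internally, or what you could run yourself today to pre-build the block:

        <?php
        // Build a link block with our money link shuffled in among the
        // resource links, so it is not always in the same position.
        function buildLinkBlock($moneyLink, $resourceLinks) {
            $links = $resourceLinks;
            $links[] = $moneyLink;   // add our link to the pool
            shuffle($links);         // randomize the order on every run
            return implode("\n", $links);
        }

        $money = '<a href="http://www.my-money-site.example">my anchor</a>';
        $resources = array(
            '<a href="http://related-site-1.example">resource one</a>',
            '<a href="http://related-site-2.example">resource two</a>',
            '<a href="http://related-site-3.example">resource three</a>',
        );
        echo buildLinkBlock($money, $resources);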
  • I think I might be better off writing my own PHP script that just inserts a huge chunk of scraped text and links into SER.  The good thing is that the amount of spun text needed is drastically reduced, making it easier to create a decent "resource".  Once/if I write that script, I will be happy to share.
  • Are you going to go out and find 3-4 other sites that are related to your own? Linking to big names is going to let anyone with half a brain figure out who posted the article if it's ever manually reviewed.



  • @greenwillow, very good point.  Like linking to Wikipedia, like everyone does, he he.

    So what is the solution?  One could do the reverse :):):) check the PR of the root domain and only take links from the lower PRs.  That way you shine!!!

    What do you think?
  • I don't think it really matters where the other links go. Even when I'm manually link building I still link out to high-authority sites as well as my own, and they nearly never get removed. I think 3-4 additional random links in any article, profile, etc. would be an amazing asset to GSA SER, not to mention how much more natural it would make things look. I hope @sven is reading this.... :)
  • Yes, me too :)

    I used to do this on my blog posts.  I had bought a tool from somewhere (don't recall) that would pull related blog posts with some text snippets, and I could quickly insert them into my post.  This way, it reduced the amount of text I had to write.

    If Sven or someone else does this, it would be very cool.  You can put less focus on the quality of the article pages.
  • edited April 2013
    AlexR, I read your post, and this is the same idea, except we are saying that GSA should just randomly find the links AND pick which words in the text to use as the anchors. That way there is no need to upload any URL lists or additional anchor text files. It really doesn't matter where the link goes or the anchor that is being used. The whole idea is to just get 2-4 more external links in the article. This would nearly eliminate any footprints.
  • I have used software that does this for blog networks and it definitely helps with a more natural linking pattern. This IS what people do naturally :) So +1 for this idea
  • Couldn't this be done with the %spinfile macro? I haven't tried it, but I think you could put a line something like this in your article...

    <a href="%spinfile-urllist.txt%">%spinfile-anchortext.txt%</a>   (I think the syntax is correct.)

    where...
    urllist.txt - contains a list of urls you want to link to (one per line)
    anchortext.txt - contains a list of anchor texts to use for the hyperlinks (one per line)
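
    For example, the two files could look like this (entries are hypothetical; as far as I know %spinfile% pulls a random line on each use, so the URL and the anchor are picked independently of each other):

        urllist.txt:
        http://en.wikipedia.org/wiki/Widget
        http://www.widget-guide.example/basics

        anchortext.txt:
        widget basics
        this widget guide
        further reading on widgets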
  • edited April 2013
    Excellent idea @DavidA2 I didn't think of that. I'll test it. That's why many minds are much better than 1 :)
  • @DavidA2 even if that would work, you would still be posting with the same URLs and anchor text. It would be far more powerful if GSA SER could scrape the URLs and select random text to use for the anchor. This way you have different, fresh links and anchors in every article...
  • I do that like this

    {|{|%file-filelocation%}} and in that file I put a normal HTML spin of all URLs and keywords, like this: <a href="{url1|url2..}">{anchor1|anchor2}</a>

    And that is all you need. Works like a charm. I have put in the empty spin because sometimes I want that link and sometimes I don't. So everything is more random.
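
    For example, the file at %file-filelocation% could hold one spun link line like this (URLs and anchors are placeholders):

        <a href="{http://site-a.example/post|http://site-b.example/article|http://site-c.example/guide}">{useful guide|further reading|this overview}</a>

    The empty option in the outer {|...} wrapper is what makes the whole link appear only some of the time.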

  • @hackersmovie - I doubt that this is something @Sven is going to add to SER. It just adds more overhead. It would take you five minutes to run a scrape & a keyword scrape in Scrapebox and get several hundred (or thousand) urls & keywords related to any niche you wanted. This would be more than enough to handle many projects in any given niche.
  • edited April 2013
    @David I'm not so sure. I think it might be quite easy to implement even using an existing URL list like failed URLs.

    I'm going to test @Dexter's method and see how it goes.
  • @DavidA2 I understand that; however, you are still limited to only the URLs/anchors in your file. I run a LOT of campaigns in GSA, SENukeXcr and NHSEO. I don't have the time or resources to be updating profiles/.txt files/etc. I'm simply stating that IF it were implemented as part of the process in GSA, it would really give it a "leg up" on the competition.

    I'm one of the few, it seems, who couldn't care less about LPM and how fast I'm building links; I simply want to build links that stay where they are put, and to build a LOT of links over time. Even using these automated tools I still focus on quality vs. quantity. As an example, my articles use 100 spun master articles and I change them every 30 days. I also have a master anchor text file of over 10,000 characters that includes all the generic anchor text, URL variations and keywords I need. I have also hand-written my own blog comments, etc.

    I put the effort into the quality of the content and have been building these "lists" for over 5 years. They prove to be very effective, but the only thing I haven't found a solution to is what this post is all about. Add that to GSA SER and now you have a hands-down winner, in my opinion.

    Just my 2 pence...
  • +1 what hackers says
  • Just remember this will greatly reduce the linkjuice you will receive from that article. The amount of linkjuice is spread out over the number of outbound links on a page, so if you add 5 random links you will only receive 1/6 of the linkjuice you would normally get when you are the only link on that page. That's why blog comments (with thousands of OBL) are so worthless.
  • @pietpatat, that is true, but you trade off the PR loss against Google not thinking of the page as self-serving, and also against having the chance to drop more, and more varied, content without worrying about reviews, etc.

    @hackersmovie, for production your point is true. Otherwise one has to create the spin files.  I'd rather have set-and-forget than worry about spinfiles...


  • @AlexR - very similar idea.  If you know PHP etc., it is not hard to create some massive spins, even put them into external files, and have them pulled in.  So you just run the spin per project with, say, scraped keywords and let it rip.  But I still like the idea of having it all automated :)

    @DavidA2 - the Scrapebox idea is good, I just don't see how we can get some text snippets and related URLs and pull it all into spinfiles.  That surely takes some programming?
  • AlexR Cape Town
    @DavidA2 - I use the spinfile and have a decent file of external URLs, but the issue is that the link ends up in more or less the same place every time. That's why I wanted to randomise it. 

    Also - using spinfile within articles starts creating nested macros... the article is pulled with a macro, then within the article many macros are getting pulled. I would prefer a top-level approach. 

    @sobiman - I have massive spins in external files but, as I mentioned to David above, the link is generally in the same place, and nested macros across 300+ projects = trouble! 


  • @AlexR I think the only way to completely eliminate footprints is to write an external program (e.g. PHP code) to take spin files and randomly drop in the stuff that you need.  In another thread I was asking Sven to add special randomization, but it gets pretty hard.  So I am thinking of massaging spinfiles with random stuff (e.g. multiple random resource links, etc.), along the lines of the sketch below.  Pretty easy if you are handy with PHP.
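
    A rough sketch of what I mean (the file names, the "url|anchor" one-per-line format of links.txt, and the 2-4 range are all just assumptions):

        <?php
        // Read a pool of scraped url|anchor pairs and a spun article,
        // insert 2-4 random resource links after random paragraphs,
        // and write the result back for SER to pick up.
        $pool       = file('links.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
        $article    = file_get_contents('article.spin.txt');
        $paragraphs = explode("\n\n", $article);

        $howMany = rand(2, 4);
        for ($i = 0; $i < $howMany && count($pool) > 0; $i++) {
            $key = array_rand($pool);
            list($url, $anchor) = explode('|', $pool[$key]);
            unset($pool[$key]);                // no repeated links in one article
            $link = '<a href="' . $url . '">' . $anchor . '</a>';
            $pos  = rand(0, max(0, count($paragraphs) - 2));
            $paragraphs[$pos] .= "\n" . $link;  // append after a random paragraph
        }

        file_put_contents('article.spin.txt', implode("\n\n", $paragraphs));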
  • Seems to me that there is plenty of flexibility built into SER to do what everyone (or at least most people) is wanting here. For my projects, I use the %spinfolder macro to get the articles from a folder. The folder has 10-20 different spun articles. I can put my links (and videos, images, etc.) in different places in each article. I can even put in a couple of links, videos or images (in different places). If I spin these too (the links/videos/images) and include a "blank" (no link) option in the spin text (see the snippet below), there will be so many different combinations that the articles really won't look similar enough to tell (in my opinion).

    The articles will be completely different (because they are spun)...some will have one link, some two (and some even none)...some may have a video/image, some not.

    I think sometimes we try to complicate things too much. Sure...maybe @Sven could make it where it was truly "random", but does it REALLY need to be? I don't think so.
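
    Here is the kind of spun link snippet I mean (URLs and anchors are just placeholders); note the empty option at the end, which means sometimes no link is placed at all:

        {<a href="http://resource-1.example">a helpful guide</a>|<a href="http://resource-2.example">this overview</a>|}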
  • @DavidA2 - yes using %spinfolder is the way to go.  That way, the content can be manipulated using automated or programmed tools before being fed to SER.
  • @DavidA2 & @sobiman let me play devil's advocate for a moment....

    I purchased GSA SER because it AUTOMATES the process of building links. The same reason I have NHSEO and SENukeXcr. If I had the time to generate/create/update all these spin files, macros, etc., I'd just manually build links. (That's a stretch but, bear with me....)

    That being said, I do understand that there is SOME involvement/interaction that needs to be done on my part. I need that to be as little as possible though. I probably use this software much differently than most do. I'm not promoting fly-by-night websites/Adsense/Click Bank/Amazon stuff. I am ranking client sites. Real, human-owned digital properties (and a LOT of them). Between client meetings, website updates and research to better my service, I don't have time for spin files, macros, etc. I need these tools to increase in value by adding tasks that will further AUTOMATE the process and reduce the time I have to be involved.
  • @hackersmovie - I did not mean to convey that I did not understand where this feature could be beneficial. Surely, it could be. And, granted, I can see where yours is an extreme case where it would definitely help. But I was just offering alternatives that may work for most people.

    It sure doesn't hurt to ask. And @Sven has added more and more great features. So, maybe he can stick this in as well some time. If he does, I would use it.
  • @sobiman "yes using %spinfolder is the way to go. That way, the content can be manipulated using automated or programmed tools before being fed to SER."

    What are these automated tools that can manipulate the files before being fed to SER?
  • @hackersmovie I don't know about any automated programs, but I write PHP, and it is a piece of cake to take text files and manipulate them.  The sky is the limit.  I could even write a crawler to take relevant URLs from search engines and insert them into those text files.
  • @sobiman - care to write a few php programs for me?
  • AlexR Cape Town
    @DavidA2 - I was suggesting something simple. Like the same way SER checks the anchors, finds a random instance in the article and then grabs a random URL from the URL field. 

    What I am suggesting is 3 extra fields:
    1) External URLs (like the normal URL field, but for external URLs)
    2) External anchors
    3) No. of uses

    Then it does the same as it's currently doing, but with the added external URLs. Placing a random image link would be great too, but not urgent. 
  • @hackersmovie, sure, hit me up
  • Such a feature would be great. Really.
    AlexR's suggestion for 3 more fields sounds good.

    Sven, would it be possible to add such a feature?
  • Sven www.GSA-Online.de

    I don't think it makes sense to add this to the GUI. It would confuse people even more.

    You can do all of that by defining yourself some macros like %spinfile-c:\ext_urls.txt% and %spinfile-c:\ext_anchors.txt%.
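
    For instance, a line like this in the article template (the file paths and wording are just examples) would pull a random external URL and a random anchor on each submission:

        For more background, see <a href="%spinfile-c:\ext_urls.txt%">%spinfile-c:\ext_anchors.txt%</a>.

    One caveat: since the two files are read independently, the anchor will not necessarily match the URL it ends up pointing to.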

  • AlexR Cape Town
    Feedback:

    I am using a workaround for this. 

    Per project I have:
    1) External URLs file
    2) Images file

    I then call them in using the macros like Sven suggested. BUT since they are filepaths, they often break or have an extra space etc., so I have spent a lot of time fixing bugs. Also, some macro interactions have caused trouble for me in the past.

    That's why a %image% or %external_url% would be much easier to use than:
    %spinfile-c:\Administrator/Dropbox/Websites/Projects/MoneySite/Images/......../%

    Also - when using macros, it fixes the place in the article where the link or image appears. It would be much nicer to have it randomised, like SER does with the anchor text option. 
  • @Sven I disagree. Adding a simple check box to the GUI like "Add additional external links" would be PRICELESS. While many users may have the time and inclination to set up massive spin files for anchor text and URLs, I'm sure just as many have purchased this software for its AUTOMATION, like myself. Plus, judging from the response to this thread, it would seem to be a much welcomed addition...
  • Sven www.GSA-Online.de
    And where do you think it should insert these links? And what links on what anchors?
  • AlexR Cape Town
    @sven - it would work like the feature you already have in place with normal URLs & anchors.

    So in the external URL file you would have a .txt file with entries like:
    ExternalURL#{anchor1|anchor2|anchor3} 

    If it found any of these anchors, it would just hyperlink them with the external URL instead of the project URL. 

    For image URLs there would be a field in the GUI with a link to a .txt file with image URLs in format:

    SER would just place an image into articles after a random paragraph (after the first paragraph and before the last paragraph) if this option was enabled. 

    Not urgent, as it can be done with macros, but that is a bit complicated (since filepaths break) and it keeps the link placement always the same and predictable. The above would solve this. I think it adds value to submitted articles if they have a nice image and a value-adding external URL. :-)
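
    A hypothetical entry in that .txt file could look like:

        http://en.wikipedia.org/wiki/Search_engine_optimization#{SEO|search engine optimization|this SEO overview}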
  • @Sven, that's the beauty of it. It doesn't really matter. It can insert the links anywhere and on any anchor. The whole idea is to make the article seem more 'real' by linking out to something other than what we want it to (our money site). Nearly every 'real' article, blog post, etc. I've ever read did NOT just link to one source. It can even randomly select the number of links (I'd suggest between 1-3 additional links, but no more). The more random it is, the better!
  • I think this is a VERY IMPORTANT addition, as @hackersmovie said.  I also talked about a similar concept in another thread.  Self-serving links are out, resource links are in.  

    @AlexR why don't we make it a lot simpler.  In our text we can put a spin list of something like
    {external:key1|external:key2|... }

    So the idea is:
    external:key

    When GSA sees "external", it randomly picks one post and its title from, say, Yahoo's first 100 results for that keyword and inserts it.  Simple.  We just make sure we are not competing for those keywords.  
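
    A rough PHP sketch of that lookup (the result-page URL and the regex are loose assumptions; any real engine's markup changes often, so a proper search API would be more robust):

        <?php
        // Fetch a search-result page for a keyword and return one random
        // result as a ready-made <a> tag (very naive extraction).
        function randomLinkForKeyword($keyword) {
            $html = file_get_contents('http://search.yahoo.com/search?p=' . urlencode($keyword));
            if ($html === false) return null;
            preg_match_all('/<a[^>]+href="(http[^"]+)"[^>]*>([^<]{10,80})<\/a>/i', $html, $m);
            if (empty($m[1])) return null;
            $i = array_rand($m[1]);
            return '<a href="' . $m[1][$i] . '">' . $m[2][$i] . '</a>';
        }

        echo randomLinkForKeyword('blue widgets');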


  • Sven www.GSA-Online.de
    @hackersmovie so you want the program to search for some words and add like 3 URLs totally unrelated to the anchor? Isn't that even more suspicious?
  • @Sven, if the anchor can be related, that is obviously better. If it has to be random and not related, just make it a generic anchor text, that way it doesn't matter...
  • Sven www.GSA-Online.de
    but a generic anchor text would assume it has to insert them... that's not possible without making it unreadable.
  • @Sven, the "relevance" only needs to be between the anchor and the link it's pointing to, not the article. 99% of the articles I read online may be about, say, an iPhone, but may link out to software, apps, Android, or even an African safari that was filmed exclusively on a mobile device. African safaris aren't directly related to an iPhone but, in that context, they are. Search engines use the "6 Degrees of Separation" or "Butterfly Effect" theories all the time. Seemingly "irrelevant" terms/items may actually be related, so it really doesn't matter...
  • AlexR Cape Town
    @Sven - it should work like the URL field and anchor, except be an external URL. This way we can use a macro or populate the field with relevant URLs that add value to the user.

    So for example, on certain projects I have a .txt file with pre-checked authority URLs that will add value to the user (like Wikipedia entries, etc.).

    So in my article it might have text:
    For detailed explanation on %anchor_text% check out EXTERNALURL.

    If we can use a macro and the URL format:
    ExternalURL#{external_anchor1|external_anchor2|external_anchor3} 

    We can ensure they remain relevant. 

    Regarding images, see my comment above. 
  • @AlexR While that seems to be a great way to do it, I don't have the time to research every single topic for every single one of my clients. I need to be able to check a box or something and have the software take care of it. It's great that GSA SER has the macro functions but, for many I would guess, it's not feasible to use.
  • AlexR Cape Town
  • @hackersmovie - just type a list of keywords into SB or any program, or hire a VA, and take the list of harvested URLs and insert it into the external URL field. Easy. For Sven to have to go and program something that goes out and scrapes external URLs is a much bigger job than just letting us get external links placed in articles. 
  • @AlexR - I can easily scrape a keyword list using Scrapebox and write a very simple program in PHP to grab some random URLs from, say, Yahoo and spin it all together.  This is the easy part.  But then it is a separate process, and I think it is good if we can avoid doing another process and let GSA handle it all?  
  • @AlexR what @sobiman said is what I was getting at too....
    ^:)^
    :-h \:D/
  • Sven www.GSA-Online.de
    edited April 2013

    OK I came up with a solution here: I can make a list of general URLs that would still make sense to get linked with anchors... please add some more if you know what I mean...

    http://www.wikipedia.org/wiki/%anchor%
    http://www.google.de/search?q=%anchor%
    http://dict.leo.org/?search=%anchor%

    An option would then randomly choose like 3 random words and put a link on them, with the word in the URL.
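
    A quick PHP sketch of how that insertion could work (the word-picking logic is my own illustration around the template URLs above):

        <?php
        // Pick a few random words from the article and turn each into a
        // link using one of the generic URL templates, substituting the
        // word for %anchor%.
        $templates = array(
            'http://www.wikipedia.org/wiki/%anchor%',
            'http://www.google.de/search?q=%anchor%',
            'http://dict.leo.org/?search=%anchor%',
        );

        function insertGenericLinks($article, $templates, $howMany = 3) {
            $words = array_unique(str_word_count($article, 1));
            shuffle($words);
            foreach (array_slice($words, 0, $howMany) as $word) {
                $url = str_replace('%anchor%', urlencode($word),
                                   $templates[array_rand($templates)]);
                // Link only the first occurrence of the word.
                $article = preg_replace('/\b' . preg_quote($word, '/') . '\b/',
                                        '<a href="' . $url . '">' . $word . '</a>',
                                        $article, 1);
            }
            return $article;
        }

        echo insertGenericLinks('This is a sample of some random anchor text.', $templates);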

  • @Sven that's great! Would it also be possible to add a variable to the search like .edu or .gov, so it will find relevant targets to post links to on .edu and .gov sites? Those will carry more authority...
  • Sven www.GSA-Online.de

    No, you don't understand. I will not do a query and use the found items. I will insert these URLs. Everything else would slow things down.

    like this: This is a sample of some <a href="http://www.wikipedia.org/wiki/random">random</a> anchor/URL that gets inserted.


  • ahhh!! I see. O.k. that will work just fine...
  • @sven that is a good solution, although it gets limited to search-engine content additions only.  I think if you are not adding a crawling solution to find anchors/URLs from regular websites, then we should think it through some more.
  • Sven www.GSA-Online.de
    Too late, just finished this :) ...testing now and if all is good tomorrow morning it gets released.
  • Sven www.GSA-Online.de
    But now it would be nice if you guys come up with some more generic URLs that can be used...