
How do you use GSA in 2014 to rank sites?

Well, just wanted to know how you guys are using GSA and what strategy is working for you right now.

Comments

  • Churn and burn, just smash as many backlinks as you can on the domains and make your money fast :)
  • I was doing this method, but I think it stopped working. Ever since last week my GSA or VPS has been acting up, can't tell which. My 20K verified only show as about 800 links on Ahrefs. Can someone share their settings with me, both GSA settings and campaign?
  • @PaulieP I doubt that works that well now. I used to rank in a similar way some time back, but haven't used GSA in a while after I got slapped.
  • edited February 2014
    Of course you get slapped, that's the game you play... well, not all the time, but sure, you will lose 50% of your domains within 3 weeks of ranking, so you have to be smart in selecting your niches.

    My niches are Pay per Call offers and I hit the local searches with big sites, spun content and massive amounts of links, and like I said, about 50% of your domains will die within weeks, but the other 50% can stay up for months.

    The domains you lose you 301 to your new domain, clone the content and hit it again and again and again until there is no more money in that niche or Google finds a way to get you every time.
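
    A minimal sketch of the 301 step described in this comment, assuming a bare Python stdlib server stands in for whatever actually serves the burned domain; in practice this would usually be a one-line .htaccess or nginx rule, and NEW_DOMAIN is a placeholder, not a real site.

    ```python
    # Catch-all 301 from a burned domain to its clone (illustrative only).
    # NEW_DOMAIN is a placeholder; a real setup would more likely use an
    # .htaccess RedirectPermanent or an nginx "return 301" rule instead.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    NEW_DOMAIN = "https://new-clone-domain.example"

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Keep the request path so every old city page maps onto its clone.
            self.send_response(301)
            self.send_header("Location", NEW_DOMAIN + self.path)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), RedirectHandler).serve_forever()
    ```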
  • Churn and burn web 2.0s. You might not be able to 301 it when it tanks sometimes, but you'll get a great boost because of the huge domain authority that some web 2.0 platforms have. Has been working great for me so far, you just need to be able to produce links fast enough.
  • @PaulieP "I hit the local searches with big sites" - could you elaborate what that means? Do you mean you build huge authority sites to rank for these local searches?
  • Apart from the above, if you know what you're doing, you can still use SER to rank pages using normal tiered link building tactics just fine as well.
  • edited February 2014
    Sure @johnmiller, I build spam sites with spun content for an area with cities, villages and small towns.

    Let's say I was targeting plumbers (which I am not, because every Tom, Dick and Harry is targeting plumbers :) ).
    But if I was, then I would create a page for every city, town, village, neighborhood + plumber, plumber services, cheap plumbers, you get what I mean. (Of course I am not building these pages by hand.)

    And when the site ranks and gets some traffic, I redirect (using WP Cloaker) the traffic to a Pay per Call offer for plumbers, and when they call I get paid.
    When the domain gets banned I use the free Duplicator plugin to make a copy, throw it on a new domain, 301 the old one to the new and start over again.
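
    A rough sketch of the city + service page generation this comment describes, assuming plain Python with made-up city and keyword lists; the plugin mentioned further down presumably does something similar inside WordPress, so every name here is illustrative only.

    ```python
    # Sketch: one page title + slug per (location, service variation) pair,
    # the way the comment describes building a page for every city/town/etc.
    # The city and keyword lists are made-up placeholders.
    from itertools import product

    cities = ["Dallas", "Austin", "Plano", "Frisco"]
    variations = ["plumber", "plumber services", "cheap plumbers"]

    def slugify(text: str) -> str:
        return text.lower().replace(" ", "-")

    pages = [{"title": f"{kw.title()} in {city}",
              "slug": slugify(f"{kw} {city}")}
             for city, kw in product(cities, variations)]

    for page in pages:
        print(page["slug"], "->", page["title"])
    ```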
  • Thanks for clarifying @PaulieP, congrats. Sounds like a well built-out silo you have there :) I assume you mass-create these pages with some plugin? Also, how do you avoid duplicate content? I mean if you have like "plumber dallas" and "plumber austin", the text on both pages would probably be very, very similar, if you know what I mean.
  • @johnmiller

    It's not really a silo, as I also use the tags to stuff more keywords in them. I do use an old plugin that still works but isn't for sale or supported anymore; it's called Local Pages Ninja (if you tried hard you could find a copy somewhere). It was one of the rare WSOs that was worth the money :)

    About the content, I really don't believe there is a dup content penalty when it comes to local searches. "Plumber Dallas" and "plumber Austin" could have the same content but target a different region of the US; someone in Dallas will never see local content targeting Austin unless they add Austin to their search query.

    But even if it were a problem, I could not care less about it. These sites are created to die eventually; it's a game I play with Google, sometimes I win and sometimes Google bends me over and I definitely don't win :)
  • edited February 2014
    Right, sounds similar to an autoblog plugin. I was just wondering how the content would look for a page created with this plugin for "plumber dallas", for example. If Google sees dupe content on your site, only one of those pages would rank - theoretically. So if you had like 500 cities, only 1 city would rank and the others would be in the supplemental index - at least that's my limited understanding of dupe content and I'm always happy to be corrected.

    Anyway, the theory doesn't matter if it works for you...and great to hear that it does.

    One last thing: how do you manage the backlinking of these sites? I mean they must have several hundred if not thousands of pages. I doubt you create a new campaign for every page in SER... or are you just blasting the homepage and biggest cities so the others get some link juice via sidebar links or similar?
  • The content is about 1500 words of manually spun content. I spin it myself and it takes me about 3-4 hours to get it right; when you read it you will not know it is spun.

    The difference between two pieces of content is about 60%, and no, you will not end up in any kind of sandbox because you target different cities. Even if you changed only the city names you would still be ranking for most of them.

    And that brings me to your last question: I backlink the big cities and the homepage. For every domain I have two projects, one for the root domain and one for 10-15 city pages.

    They are all two-tiered because I don't believe more tiers will give you better results, and then I hit them with everything I have, which is Scrapebox for blog commenting & SER for the rest.

    Sometimes my mate lets me play with his Xrumer toys, but I find it really tedious to work with; I'd rather use SER for it.
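
    A toy sketch of what spun city-page variants could look like, assuming simple {a|b|c} spintax and using difflib to eyeball how similar two outputs are (the comment aims for roughly 60% difference between pages); the spintax sample is invented, not the poster's actual spinner or content.

    ```python
    # Expand a made-up spintax template for two cities and print a rough
    # similarity ratio between the two outputs.
    import difflib
    import random
    import re

    SPINTAX = ("{Finding|Choosing|Picking} a {reliable|trusted|local} plumber in "
               "{CITY} {doesn't have to be|shouldn't be} {hard|stressful|expensive}.")

    def spin(template: str, city: str) -> str:
        def pick(match):
            return random.choice(match.group(1).split("|"))
        return re.sub(r"\{([^{}]+)\}", pick, template).replace("CITY", city)

    a = spin(SPINTAX, "Dallas")
    b = spin(SPINTAX, "Austin")
    print(a)
    print(b)
    print("similarity:", round(difflib.SequenceMatcher(None, a, b).ratio(), 2))
    ```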
  • As a standalone tool without any supplementary blog posts or other links, how many of you are ranking with SER alone? It works effectively for me, just not as much as before.
  • @PaulieP +1 for the actionable info.
  • @PaulieP knows what's up.
  • @JamPackedSpam I would not know, as I use multiple tools to backlink; I use SB for the blog comments so that SER can point its power at the other engines.

  • @PaulieP +1. Nice to get inside your brain on Pay Per Call.
  • @PaulieP nice info, I have that plugin :) Not sure if it still works, will try it out. Completely forgot about it. 
  • edited February 2014
    @judermand It doesn't; if you install it now you will get a lot of warnings and it will not give you the options you need.

    You will need to change some code, as their license server is down and your license isn't checked anymore. I asked if they still supported it but never heard back from them; I found my way around the warnings ;)

    But the system itself still works to this day: GSA, LPN and cloaking = $$$$
  • It works! :) Hmm got a plan now.
  • I used to do this all the time. I wrote my own "spinner" and cloned HTML pages several years ago. This is a money maker. In fact, I have been planning on doing this again, but haven't gotten around to it. Protected keywords = City, ST --> spin em, index em, rank em and roll.
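
    A rough guess at what a home-made spinner with protected "City, ST" keywords might look like, in Python; the synonym map and the sample sentence are invented, so this only illustrates the idea rather than reproducing the poster's code.

    ```python
    # Swap mapped words for synonyms while passing protected "City, ST"
    # chunks through untouched. Synonym map and sample text are invented.
    import random
    import re

    SYNONYMS = {
        "fast": ["quick", "speedy", "prompt"],
        "cheap": ["affordable", "low-cost", "budget"],
        "repair": ["fix", "service"],
    }
    # "City, ST" style keywords that must come through unchanged.
    PROTECTED = re.compile(r"\b[A-Z][a-z]+, [A-Z]{2}\b")

    def spin(text: str) -> str:
        parts = PROTECTED.split(text)      # plain-text chunks to spin
        keep = PROTECTED.findall(text)     # protected chunks to restore
        out = []
        for i, part in enumerate(parts):
            for word, options in SYNONYMS.items():
                part = part.replace(word, random.choice(options))
            out.append(part)
            if i < len(keep):
                out.append(keep[i])
        return "".join(out)

    print(spin("Need a fast and cheap repair in Dallas, TX? Call today."))
    ```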
  • @PaulieP thanks so much for this actionable info. Question: is there any software that auto-generates the local keyword information, such as the states/cities within a state, etc.?
  • @hadoken check your PM
  • @PaulieP - Good stuff, very interesting what you're doing. Thanks for taking the time to share this.

    I'm really interested in the 301 redirect with sites that get tanked. Do you find that doing a 301 redirect with your penalized sites allows you to rank the new domain faster than if you simply took down the old site and moved the content over to a fresh domain?
  • @PaulieP awesome stuff you are sharing.
    Can you PM the software that generates local keywords?
    Thanks
  • Don't use Webmaster Tools! That's how they are slapping sites! They are tracking you! They know if you build 500 links in a week!
  • @cefege
    There is no keyword tool to use when it comes to this; you should use your brain and think of the keywords you would use to seek out a local service.
  • They know about your links whether you use GWT or not. It is just you who doesn't know something if you don't use it, imho.

    Analytics gives them some extra info, but not GWT.
  • Yeah, the "GWT is watching you" myth is bullshit. I have sites on there and off there and it makes no difference, but keep wearing your tinfoil hat waiting for the aliens to invade if you want :P
  • I use GWT for all my sites, WH & BH; that is the easiest way to find out why you're banned lol. You only see info that Google already has on you whether you make use of GWT or not.

    So just use it, because it is a very handy tool.