
I built 100k verified URLs :D

So guys, I have been learning a lot about GSA SER over the past 2-3 months, and I'm happy to say that I think I have pretty much wrapped my head around everything in this game - SER projects, proxies, scraping, footprints, site lists, etc. etc.

So far, running only test projects, I have built 100k verified URLs, 5% of which are contextual links.

[screenshot of verified link stats]

This is my own list that I scraped and verified myself. Would you say this is a good achievement for a newbie? :)

Time to use this list, rank at least a small site, and start making the monies! :D

Comments

  • Remove the duplicate domains first, then you'll know if you have a good list. Otherwise, good job mate :)
  • goonergooner SERLists.com
    Well done, but it's all about the rankings. If you can rank your site now then you've done an excellent job!
  • Thanks. Ranking a small money site for a 1k searches per month keyword is the next order of business.
  • Very good suggestions here. You need to remove the duplicates and start with a small site. Exciting times ahead!
  • edited September 2014
    Removed duplicate domains and the list was effectively cut in half. Sheesh!

    Anyway, the list is still worth something I suppose. Time to rank that site!

    It's a 1k searches per month keyword. Right now I'm using a careful backlinking strategy of 30 contextual links per day on tier 1 (PR1+ domains), 10 contextual links and 50 secondary links per URL on tier 2 (no filters). I haven't used a third tier.

    I guess the above strategy SHOULD work. Let's see what happens! :)

    One question: if I set the limit to 30 verified links per day, is there a risk that SER will end up submitting a lot more links before 30 actually get verified, so that the actual verified count lands somewhere around 100?
  • goonergooner SERLists.com
    Yes, there is a real risk of that if you choose "verified per day".

    Instead choose "submitted per day" and set the number to maybe 50.

    Once you run it for a little while you'll see the submitted-to-verified ratio and can set the above number more accurately.
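
    To put rough numbers on that, here's a minimal sketch (Python; the sample figures are assumptions, not anyone's real stats) of turning an observed submitted-to-verified ratio into a daily submitted limit:

    ```python
    # Back-of-the-envelope for picking a "submitted per day" limit.
    # All numbers below are assumed for illustration only.

    submitted = 500                 # links submitted over the observation period (assumed)
    verified = 150                  # links that actually verified in that period (assumed)
    target_verified_per_day = 30    # what the project is really aiming for

    ratio = submitted / verified                      # ~3.3 submissions per verified link
    suggested_limit = round(target_verified_per_day * ratio)

    print(f"Submitted-to-verified ratio: {ratio:.1f}")
    print(f"Set 'submitted per day' to roughly {suggested_limit}")
    ```

    The suggested starting point of 50 corresponds to a ratio of about 1.7 against a 30-verified target, so adjust up or down once you have seen your own ratio.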
  • @gooner Thanks for the advice.

    I am targeting two keywords, and one of them is already ranking in position 41 with only a handful of backlinks that I built two months ago.

    So far the campaign has built 35 links on tier 1 today. I checked some of these backlinks manually and they look really good and natural! Hopefully within a week there will be some noticeable improvements (without an indexing service).

    I will also be creating some parasite pages and trying to rank them, while running more test projects to build my verified list, since SER is hardly being used to its full potential at the moment.

    @gooner Do you think I should fire tier 3 links as well? Is it really worth it for such a small keyword? (1k searches per month)

    As @cwvps said, truly exciting times ahead. :D
  • goonergooner SERLists.com
    I wouldn't bother with a tier 3. Like you said, it's probably not worth it for such a small keyword, and one that is already showing good movement.

    I don't use an indexer any more either. Check how many of your contextuals are indexed naturally after you have built them. It's usually around 60% for me, so I don't bother indexing. But others disagree.
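
    If you want to put a number on that check, here's a minimal sketch (Python). It assumes you have exported your contextual URLs and run them through whatever index checker you use, ending up with a simple CSV of url,indexed — the file name and format are assumptions, not a SER export:

    ```python
    import csv

    # Assumed input format, one row per contextual link:
    #   http://example.com/profile/123,1
    #   http://example.org/blog/post,0
    # where the second column is 1 if your index checker found the URL indexed.
    INPUT_FILE = "contextual_index_check.csv"  # hypothetical file name

    total = 0
    indexed = 0
    with open(INPUT_FILE, newline="") as f:
        for url, flag in csv.reader(f):
            total += 1
            indexed += int(flag)

    if total:
        print(f"{indexed}/{total} contextuals indexed ({indexed / total:.0%})")
    ```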
  • @gooner OK thanks. One more question: should I delete duplicate domains from my global verified list, or just duplicate URLs? Is it possible that I will miss out on multiple link building opportunities on the same domain if I delete duplicate domains? And another: to run a project off the global verified site list, should I right-click and import URLs from site lists, or should I use the checkbox "use URLs from global site lists if enabled"? Which is the best way? I am thinking that the second option is better because it chooses URLs from the verified list randomly.
  • goonergooner SERLists.com
    You should delete dup domains and URLs. Otherwise it will slow down your link building as SER processes the dups.

    If you are looking for speed, the best way is to import URLs directly to projects because then SER will process each URL in sequence with no dups, so it's more efficient.

    But "use URLs from global site lists if enabled" still works well and is more hands off so you can just let it run with that setting. But it picks URLs randomly so there is a chance it will pick URLs that have already been processed for each project.

    I usually import directly, but also set projects to post from global as a backup if projects run out of targets before I can import more.

    Hope it helps.
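
    For anyone cleaning a list outside of SER, here's a minimal sketch (Python; the file name is assumed) showing what the two dedup passes actually do, and why removing duplicate domains cut the list above in half:

    ```python
    from urllib.parse import urlparse

    # "verified_list.txt" is an assumed export, one URL per line.
    with open("verified_list.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    # Remove duplicate URLs: keeps every unique page.
    unique_urls = list(dict.fromkeys(urls))

    # Remove duplicate domains: keeps only the first URL seen per domain.
    seen = set()
    unique_domains = []
    for url in unique_urls:
        domain = urlparse(url).netloc.lower()
        if domain not in seen:
            seen.add(domain)
            unique_domains.append(url)

    print(f"{len(urls)} raw, {len(unique_urls)} unique URLs, {len(unique_domains)} unique domains")
    ```

    If an engine only gives you one link per domain anyway, the domain-level count is the more honest measure of how many real targets the list contains.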
  • @gooner Thanks for all the great advice. 

    Status: 60 tier 1 links built in two days. Ranking improved by 4 positions for one keyword. Other keyword still not in the top 100.