
GSA SER - project set up (CONCEPT)

12348766754 Utrecht
edited December 2016 in Need Help

Hello guys,

It’s been some time since I last used GSA SER.
That’s why I rewatched all (at least 95%) of the videos in this thread.
I read the unofficial FAQ and many other threads as well.

So, before I start using GSA SER again, I’ll put a concept plan up here.
It would be great if you could point me in the right direction / give me some advice.

So first, before I talk about building links, these are the services/programs I am able to (and will) use:

EXTERNAL PROGRAMS/SERVICES


Additional programs:
- Captcha Breaker (1st)
- SEO Indexer
- Wicked Article Creator (for spun content)

Additional Services:
- 2captcha (2nd, for Web 2.0s)
- SERengines for the Web 2.0’s
- Private Proxies
- Mail Accounts
+/- VPS (I have a PC here with 32GB RAM and other solid specs, so I’ll try GSA SER without a VPS first).

Regarding the private proxies and the mail accounts:
- I saw Millionproxy has good rates; does anyone have experience with them?
- Regarding the mail accounts, I saw mail.ru is a good one; has this changed lately?
 
LINKBUILDING 
 
I’ll try to keep this as easy and as short as possible.
The campaign will have four tiers. One tier will be done 100% manually;
the other three will be done by GSA SER.

So:

MANUALLY: Tier 1 -> articles linking to my website, minimum 1,000 words and unique.
--
GSA SER: Tier 1 -> Article, Social Networks, Web 2.0 and Wiki.
Link ratio with tier above: 3

GSA SER: Tier 2 -> Article, Social Networks, Web 2.0, Wiki
Link ratio with tier above: 3

GSA SER: Tier 3 -> Blog Comment, Guestbooks, Trackbacks, Social Networks
Link ratio with tier above: 6

SETTINGS
Private proxies:
Same private proxies for all tiers; does that work? (Otherwise I'll split them up.)
Ratio: 1 thread per 1 private proxy

Mail accounts:
Around 15-20 mail accounts per tier.

I will add more information regarding the settings, anchor text ratio, etc. later on.
It would be great to know whether the foundation of the campaign is sound.


Comments

  • shaun https://www.youtube.com/ShaunMarrs
    I would use a catchall for emails. One is enough and they are crazy cheap. I use SEOSpartians.

    Also, I would drop articles and wikis from your T3; they ain't going to index without support.

    SEO Indexer just takes system resources and does little else.

    I would drop Social Bookmarks from T3 too. There's just not enough targets SER can post to.

    You can test this by making a burner project for Social Bookmarks and then letting it run and check your verifiers.
  • Thank you for your reply @Shaun.
    Do you maybe have an affiliate link to SEOSpartians?
    Since you gave the tip to use it, I'd think it's fair to use it :-)

    I have edited the post and removed what you mentioned above.
    You stated:
    "SEO Indexer just takes system resources and does little else."

    Since I am only running a campaign for a single website, I'd probably have resources left.
    Would you still advise using it? Or just skip it?

    -> I am working on a second part of the concept plan, covering more of the general settings of GSA SER itself.
  • shaun https://www.youtube.com/ShaunMarrs
    Nah I don't do affiliate stuff for IM. With that set up you are using your T3 for indexing. Ideally the T3 will be 100% do follow.
  • 12348766754 Utrecht
    edited December 2016
    So, to get into the settings of the GSA SER program itself:

    Data settings: 
    https://gyazo.com/c136583213544c31c2a2add213114113
    I did not fill in the keywords/anchor text for now; I will use Keyword Map Pro to get a list of good keywords to scrape from.

    Submit/verify
    Tier 1: 25 URLS
    Tier 2: 75 URLS
    Tier 3: 450 URLS
    per 1440 minutes / 24 hrs

    Screenshot of the options tab:
    part 1; https://gyazo.com/3173afc0dae2d1776a7ffd7864c3c744
    * Note: I'll increase the number of search engines a little and use only English ones.
    I was thinking about MSN, Google and Yahoo.

    part 2; https://gyazo.com/a4784301b476f0528ca4575d20f5dea2

    I think that's all set, excluding the actual posts, but I am sure I can cover that.
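As a side note on the submit/verify numbers above: the daily caps (25 / 75 / 450) line up with the link ratios from the plan (3x from T1 to T2, 6x from T2 to T3). A minimal sketch of that arithmetic, purely my own illustration and not a SER feature:

```python
def daily_caps(tier1_per_day, ratios):
    """Derive per-tier daily submission caps from a tier-1 base.

    ratios[i] is the links-per-parent multiplier for the next tier down.
    """
    caps = [tier1_per_day]
    for r in ratios:
        caps.append(caps[-1] * r)
    return caps

# 25 T1 URLs/day, ratio 3 into T2, ratio 6 into T3
print(daily_caps(25, [3, 6]))  # [25, 75, 450]
```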
  • shaun https://www.youtube.com/ShaunMarrs
    Your anchor text ratios don't add up to 100%, mind.

    The ratio of URLs per day per URL per tier changes depending on the keyword.

    You don't have a verified list? I doubt this will work without that, or without a separate rig to build a list, mate.

    The +/- 5 is pointless; link loss will randomise it for you anyway.

    Don't verify or reverify your T3.

    I don't use character spinning or any of that either.

    Tick Article-Wiki or the whole thing is pointless. Untick contextual profile or profile contextual if you have it ticked; I don't remember the exact name.

    Untick "skip sites with over 50 OBL".



  • 12348766754 Utrecht
    edited December 2016
    Thank you @Shaun for your valuable and fast reply again.

    What do you mean, the anchor text ratios don't add up to 100%?
    Does it have to total exactly 100%? (At the moment it doesn't.)

    Regarding the verified list: it's been a while, so I'll just start from zero again. The first week, probably longer, I'll be building the verified list. Once that's done, I'll move on to a test campaign.

    I have unticked and ticked the fields you stated.
    Though, for tier 1 and 2, should I untick everything stating "contextual"?
    (comment contextual, comment contextual/anchor text, directory contextual, etc.)

    Is the option "analyse and post to competitor backlinks" interesting?

  • shaun https://www.youtube.com/ShaunMarrs
    For T1 and T2, Article - Contextual and Article - Wiki is what I use.

    I think SER might have something in it to pop up if the anchor text isn't 100%, or something; can't check cos I'm on my phone.

    Analyse competitor backlinks is basically link extraction, if I remember right. Very resource heavy, but it can grow a non-contextual list exponentially if used right.
  • 1234876675412348766754 Utrecht
    edited December 2016
    Hm okay, I'll check the 100% thing out in a moment; maybe I can find something :-).

    Regarding your T1 and T2, do you use Web 2.0s?
    I didn't see you stating you do in any of your recent replies.

    If not, are wikis and articles the only platforms you use for T1 and T2?
  • Regarding the 100%:

    "If you mean project options with all the different anchor text
    variations, then sum that up and it has to be below or exactly 100%. If
    there is a gap to 100%, then the field for the main anchor text is used
    for the rest."

    Said @Sven
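Sven's rule above can be sketched as a quick check. This is illustrative only; the field names are made up and this is not SER's actual code:

```python
def main_anchor_share(variations):
    """Sum the anchor-text variation percentages.

    The gap to 100% falls back to the main anchor text field;
    anything over 100% is invalid.
    """
    total = sum(variations.values())
    if total > 100:
        raise ValueError("anchor text ratios exceed 100%")
    return 100 - total

# hypothetical project ratios
print(main_anchor_share({"partial match": 30, "generic": 25, "naked url": 20}))  # 25
```

So with those made-up numbers, 25% of links would get the main anchor text.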
  • shaun https://www.youtube.com/ShaunMarrs
    I knew there was something about anchors not being 100%.

    Social networks too on T1/2. I use a few different methods: one uses nothing but links built in SER, one uses SER and RX (but RX could be swapped for SERE), and one uses a new concept I came up with that I'm keeping to myself.
  • Thank you @Shaun.
    So T1 and T2 are wikis, articles and social networks for you.
    No Web 2.0s.

    I'll do a test run with a burner campaign later today to get a small link list.
  • 12348766754 Utrecht
    edited December 2016
    @Shaun, or anyone else reading this thread, do you maybe have a tip for building a valuable link list as fast as possible?

    I do have Scrapebox and Platform Identifier.
    I saw a video tutorial on how to add GSA SER footprints to Scrapebox.
    Would that be good? Or should I just let GSA SER run?

    (Should I make use of SERengines for a burner campaign to gain a link list?)

    I'm considering buying 50 or 250 private proxies from Proxymillion as well for scraping
    (or any number in between).
    Then I'll use around 10-20 other private proxies for submission.

    -- Sorry for the many questions.
    I was not able to find this information in any of the videos.
  • beastmode NYC
    edited December 2016
    Tier 1

    If you are really looking to rank with SER, and safely, you need quality links for tier 1, not thousands (quality, NOT quantity).

    • PDF (must have a relevant image / draw more traffic with an image - READERS)
    • Web 2.0 (put a link to a relevant authority site and a video on each post - videos help rank the site)
    • Wikipedia links
    • Social Bookmarks (keyword in title and tags)

    Make an account on Wikipedia, otherwise your IP will be exposed.




     
  • shaun https://www.youtube.com/ShaunMarrs
    12348766754 sorry mate, only just seen this.

    It depends what you want to do: if you want to develop your blog/image comments, then link extraction with Scrapebox is the best way to go. If you want to develop your contextual articles, then footprint scraping.

    1. Scrape/link extract
    2. Run through PI
    3. Run through a filter project in SER like I explain here
    4. Use links on live projects

    Essentially the only difference between building your own list and buying a premium one, for me, is that steps one and two are replaced by using the premium list for targets.
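One common cleanup between the scrape and the SER filter project is deduping the raw list down to one URL per root domain before importing it as targets. This is a sketch of my own, not shaun's exact filter setup, and the URLs are placeholders:

```python
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Keep the first URL seen for each root domain (www. stripped)."""
    seen, kept = set(), []
    for url in urls:
        domain = urlparse(url).netloc.lower().removeprefix("www.")
        if domain and domain not in seen:
            seen.add(domain)
            kept.append(url)
    return kept

scraped = [
    "http://example.com/wiki/index.php",
    "http://www.example.com/wiki/other",   # same root domain, dropped
    "http://blog.example.org/post?id=1",
]
print(dedupe_by_domain(scraped))  # keeps 2 of the 3 URLs
```

(`str.removeprefix` needs Python 3.9+.)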
  • 12348766754 Utrecht
    edited December 2016
    @Beastmode thank you for your reply, I value your input :-)
    I am considering using RankerX to build a small number of high-quality links for tier 1.

    After that I'll use SER to build tier 1-2-3 structures with more links. I will go first with:

    • Web 2.0 (put a link to a relevant authority site and a video on each post - videos help rank the site)
    • Wikipedia links (thanks for this awesome share!!)
    • Social Bookmarks (keyword in title and tags)

    I think that's easier to start with; later on I'll try to expand my knowledge and build PDFs as well.
    Thank you a lot :-)
    Thank you a lot :-)


    @Shaun I see the difference between buying a list and scraping one.
    Most link lists are priced around $50-80.

    So, either I buy 1-2 link lists, or I buy 250 private proxies ($79) for a month.
    I think I'd be able to scrape more potential URLs, using keywords combined with the GSA footprints and that number of proxies, than I'd get from buying 2 lists.
    Do you share this opinion?
  • Oh.. or should I start with 50 private proxies for scraping?
    ($29)
  • beastmode NYC
    edited December 2016
    You can use FCS to build PDFs if you want; they have all the top PDF sites, or RankerX has them in its wizard. I use the wizard to build everything: very easy and fast, plus it works great and there are no separate accounts to create.

    Did you see Asian Virtual Solutions' lists? $15 a month, millions of links a month across 2 lists. I was wondering if it's good, or if any of his services are good. I can't find any reviews.

    The list is updated through dropbox
  • 12348766754 Utrecht
    edited December 2016
    @Beastmode thank you for your reply.
    To be honest, I do not have any experience with Asian Virtual Solutions, though through private messages I didn't hear anything promising.

    I am considering buying 50 private proxies right now, and then combining footprints with keywords for Scrapebox.
    I'll order RankerX as well, for tier 1 (awesome that it has a wizard for PDFs :-)).
  • Did you ever use automated proxies that just keep refreshing in SER and never die? If they get banned or die, you just get a new one off their server.

    Why don't you use footprints?
  • @beastmode I did not use them, to be honest.
    As far as my experience goes, I used buyproxies.org ones.
    I am considering moving to Proxymillion for the scraping part (way cheaper).

    Regarding footprints -> I bet that was a misunderstanding.
    I will combine a list of keywords with the footprints, so Scrapebox will find one kind of website (wiki/article etc.).
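The keyword + footprint combination described above can be sketched like this. The footprints and keywords here are placeholders of mine, not a tested scrape setup:

```python
from itertools import product

def build_queries(footprints, keywords):
    """Pair every platform footprint with every keyword so the scraper
    targets one platform type at a time."""
    return [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

footprints = ['"powered by mediawiki"', 'inurl:wiki']
keywords = ["gardening tips", "home brewing"]

for q in build_queries(footprints, keywords):
    print(q)  # 2 footprints x 2 keywords = 4 queries
```

The resulting query strings are what you would paste into Scrapebox's keyword list.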
  • shaun https://www.youtube.com/ShaunMarrs
    Don't use private proxies to scrape, mate; they will just burn out. Private proxies are for submissions.

    Either invest in GSA Proxy Scraper and let it feed you public proxies 24/7, or get Scrapebox and use their cloud proxies.

    TBH right now I use lists. I have built my own in the past, but when you understand how these lists are built, it makes sense to just buy one unless you are using paid captcha solving.
  • Awesome @Shaun.
    Tomorrow my first run will start, including RankerX. I'll keep you updated :)
  • Contextual links............ Article/Profile
    I am referring to Article, Wiki, 2.0s, social networks.

    Yes or no? I use platforms that can create both; does anyone have input?
  • shaun https://www.youtube.com/ShaunMarrs
    antonearn I don't use profiles, or understand why SER classes some of them as contextual, as they only have a short bio on them. Until recently there was a decent video engine that let you post an article, but it seems to have gone the way of OSClass, where all you get is a crappy bio but SER still classes it as a contextual.

    12348766754 Yeah, defo post how it goes! Got some crazy good stuff happening with my RX sites right now. I was helping a mate out the other day with a project on his RX, and it seems they have the same problem as SER: they offer too much, and 90-95% of it is total crap and a waste of your time and effort, but the 5-10% left is solid.

    The method still has its problems though, as it is indexing worse than generic SER crap right now, but I'm putting like 75% of my time into that method now.


  • antonearn Earth
    edited December 2016
    So you don't use platforms that can create both Article/Profile? (Some engines can do both.) Only ones that can create Article?
  • shaun https://www.youtube.com/ShaunMarrs
    edited December 2016
    Yeah, I use them; I just turn profiles off so I only get articles.

    Edit - Some engines are broken and will only ever give you a profile, like OSClass, so I never tick it now on live projects.
  • @shaun I'll take some time tomorrow to give a detailed update. Though, there was a little hold-up: I'm waiting on a list seller to reply, since they only accept PayPal with debit cards.

    The RankerX part will be done, though.

    More in 24 hrs :-D
  • 12348766754 Utrecht
    edited December 2016
    @Shaun quick question, do you use WAC or any other similar program for your T1 articles?

    edit: first run (RankerX tier 1) will be over in 3 minutes :-)
  • First run had some problems, lol.

    First mistake: check that "use proxies" is turned on in RankerX.
    Second mistake: check that your (auto) set price at 2captcha isn't lower than the current price.

    Went for:
    Web 2.0 profiles + social network

    Big mistake: RankerX was adding articles to profiles. I am looking into how I can solve this.
    (Though I added it as "article", I assumed RankerX would know how to handle them.)

    Going to do a second run later on today. 
  • shaun https://www.youtube.com/ShaunMarrs
    I just use Kontent Machine.