
GSA SER - project set up (CONCEPT)


Comments

  • @shaun - the domains are between 6-9 months old and had SER links built to them in the past. I'll pick up a couple of new ones and test those as well.

    @12348766754 - if I understand you correctly, I think it's just a matter of checking the option to allow multiple posts on the same domain. It's the section above all the language / bad word filters.
  • Hmm, @redrays does that work for blog and guestbooks as well? Thought only articles :-x
  • @12348766754 - looks like you're right based on an answer Sven gave in another thread.
  • @redrays no problem, still thank you :-)
    I'd bet @Shaun can answer my question. 
  • shaun https://www.youtube.com/ShaunMarrs
    Keyword density is an on-page factor. In my opinion it's way too high mate, and I think you might be getting confused with anchor text ratios too, as I think that is what you are thinking of when you are building backlinks. I don't have any tools to check what you have, so no idea what you are set at.

    Multi posting to the same domain is in the scheduled options section of the project options tab.
  • 12348766754 Utrecht
    edited December 2016
    @shaun I changed all the information on that product page :-) What % would you advise?
    I've heard stories that 2% should work, but also 5% and even 12%. 5 and 12 seem sky high to me though.

    BTW, regarding the question of how to let GSA SER post multiple times on the same page:

    Can I change the settings so SER will post multiple times on the same page, though every time with another link? I'd bet that's the "per URL" box in the submission options. Right?

    If that's right, will SER wait an X period of time before it creates a new account for a different URL on the same page? Or will it only wait an X period of time before creating a new account when it builds a link to an already existing backlink (T2) on the page?
  • shaun https://www.youtube.com/ShaunMarrs
    Mate, I could rant about this for a while but I will keep it short.

    Basically a keyword and a key phrase are different, and you work out the density in different ways: a key phrase is still a single term you are wanting to track within a fixed number of total words, it's just made up of multiple words.

    Keyword = Acne
    Key Phrase = Best acne treatment.

    If you read this it explains it better than I could. Although the key phrase calculation it says to use ends up pretty close to what you get if you treat it as a keyword, I had better results using the first one in my own tests, and I plan to retest it and publicly release the results.

    The thing that annoys me is that tools out there like Yoast fail to take this into account, so you have people with keyword densities they have never tested who are not in the top 300 due to over-usage. Then people presume it is either their tools or that SEO is a scam and go ranting on a forum.
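The keyword-vs-key-phrase distinction above comes down to how you count a multi-word phrase. A minimal sketch of the two usual candidates follows; this is only an illustration of the difference, not the exact formula from the article Shaun links:

```python
# Illustrative only: two common ways of computing density for a
# multi-word key phrase. The exact formula referred to above is in
# the linked article; these are just the two usual candidates.

def term_density(text, term):
    """Count the phrase as a single term: occurrences / total words."""
    words = text.lower().split()
    occurrences = text.lower().count(term.lower())
    return occurrences / len(words) * 100

def word_weighted_density(text, term):
    """Weight each occurrence by the phrase's word count instead."""
    words = text.lower().split()
    occurrences = text.lower().count(term.lower())
    return occurrences * len(term.split()) / len(words) * 100
```

For a single-word keyword both give the same number; they only diverge for key phrases, which is the distinction being made above.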

    Sorry mate, I just got back home and I'm pretty drunk and can't make sense of this....

    "Can i change the settings, so SER will post multiply times on the same page, though every time with another link? I'd bet that's the box "per URL" at the submission options. Right? "

    Try rephrasing it and I will check it out when I wake up.


  • @Shaun Thank you, I've checked out that page and did the calculation myself.
    I used the keyword 10 times on a 901-word page, which is around 1.1%.
    I guess that fits the needs and is a good improvement, thank you.

    It's true what you say about Yoast and other tools. That's also the reason my keyword density was that high (I had to increase it before getting a green dot XD).

    -> Regarding the part you did not understand:

    - Am I able to let SER place multiple links from my Tier 2 on the same Tier 3 site without posting any Tier 2 link twice? If so, is that the option under "Scheduled posting" -> "allow posting on same site again" and "per URL"? I thought that option was for articles only.
  • shaun https://www.youtube.com/ShaunMarrs
    Ah right, when I'm using a T3 of blog comments and such I just let it post as much as possible to the T2 domains and don't care how many times they are hit per domain; all I want is spiders crawling.

    With T2 I have noticed it seems to work better with the logic that more URLs are better, even if they are from the same domain.
  • 12348766754 Utrecht
    edited December 2016
    Okay that's great @Shaun
    I assume only do-follow though.

    But.. what settings do you need for that?
    For now, I have set "maximum of 10 accounts per website (per URL)".
    Still, SER is not posting all Tier 2 URLs on the same page.

  • 12348766754 Utrecht
    edited December 2016
    "Hello,

    indeed this is just working for articles. SER will not submit the same
    URL to engines like blog comments or engines where everyone can place a
    link on the same page.

    With Best Regards

    Sven Bansemer "

    Seems impossible. Too bad :-(
  • shaun https://www.youtube.com/ShaunMarrs
    edited December 2016
    What's that reply from Sven in regards to?
  • @shaun

    I used other words, but:

    "Am I able to let SER place multiple links from my Tier 2 on the same Tier 3 site without posting any Tier 2 link twice? If so, is that the option under "Scheduled posting" -> "allow posting on same site again" and "per URL"? I thought that option was for articles only"
  • Since GSA SER will not post again on the same site (blog comments, guestbooks etc.), would you advise me to buy the GSA proxy program and start scraping with keywords + the GSA SER footprints in Scrapebox? Or will that not return much value, @Shaun?

    I tried it yesterday with public proxies, filtered out from the 1linklist. Though, Scrapebox returned so many errors and almost no URLs (like 80, lol).
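For what it's worth, the scrape being described (SER footprints combined with niche keywords) is just a cross product of two lists. A minimal sketch, using made-up placeholder footprints rather than SER's real ones:

```python
# Build search queries by pairing engine footprints with keywords.
# The footprints here are hypothetical placeholders, not the actual
# footprints shipped with GSA SER.
footprints = ['"powered by example-blog"', 'inurl:guestbook']
keywords = ["acne treatment", "skin care tips"]

# one query per footprint/keyword pair, ready to paste into a scraper
queries = [f"{fp} {kw}" for fp in footprints for kw in keywords]
```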
  • shaun https://www.youtube.com/ShaunMarrs
    Nah don't do anything about it yet.
  • Well, how else do I get more targets? :P
    Already completed the 1linklist blog comments list.
  • Btw, totally off-topic, @Shaun how are you?
    I saw you were drunk a few days ago.
    I suppose you're not sick anymore?
    (Would be fkd up at Christmas)
  • shaun https://www.youtube.com/ShaunMarrs
    Yea I'm much better mate thanks.

    Basically I wanted to hold off on replying to your question until I had spoken with Sven, to make sure this was not hard-coded for legal reasons before putting it out there, but he has green-lit it. I haven't run my projects like this for about 6 months now, but I was reading my old notes for a blog post and a case study and it brought it all back to me.

    So, you are going to have two sets of projects: your masters and your lives, for every project. Essentially your masters hold everything, reverify and display link counts. Your lives will post for you.

    So you set your masters up and then move the URLs from your lives into your masters and hold them there. You then rebuild the live projects from a template each day, or however quickly you are hard-locking your list, and then move the targets from your masters into your lives and let it blast again.

    That's how I used to do it.

    I know that's a basic explanation but I will type up a full blog post when I get the chance.
  • @Shaun so, if I understand:

    You make 1 project with ALL URLs.
    Then you make sub-projects with an X part of the URLs.
    -> Let GSA SER blast to those URLs (a few projects at the same time).

    And then.. after some time.. switch the URLs with new ones.

    --> Thank you for your reply though :-) 
    And I am glad you're much better. Merry Christmas
  • shaun https://www.youtube.com/ShaunMarrs
    Master Projects
    T1
    T2

    Live Projects
    T1
    T2
    T3

    For example, say you let the lives run for 24 hours each cycle. You let your live projects run, then at the end of the day you go into the verified URLs and copy and paste them into their master project equivalent. The T3 is not kept, as you don't care about its links.

    The next day you rebuild or duplicate the live projects so they can rehit all URLs, let them run, then repeat the process.
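Modelling the cycle above as plain URL sets (the names and structure here are illustrative, not anything inside SER itself), the end-of-day move and the daily rebuild look roughly like:

```python
# Masters hold and reverify everything; lives do the posting and are
# rebuilt each cycle. Rough sketch of the described workflow only.

def end_of_cycle(live, master):
    """Copy the cycle's verified URLs from a live project into its master."""
    master["verified"] |= live["verified"]

def rebuild_live(master):
    """Fresh live project, re-fed with every URL the master holds."""
    return {"targets": set(master["verified"]), "verified": set()}
```

A T3 live simply never gets an `end_of_cycle` call, matching "the T3 is not kept".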
  • I think I already have something like that, just didn't use it for Tier 3 yet.
    I'll check it out, thanks :-) 
  • @Shaun do you know if it's possible to set a timer on the allowed number of threads?
    For example: from 00:00 to 08:00, 500 threads, and otherwise go down to "x amount".
  • shaunshaun https://www.youtube.com/ShaunMarrs
    Nah, you can't do it. Maybe you could use the scheduler to limit the number of projects you have active at any given time?
  • @Shaun in the next update we can :-)
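The schedule asked for a few comments up (500 threads from 00:00 to 08:00, a lower cap otherwise) is easy to state as a function. A sketch; the off-peak value is an assumed placeholder, since the original "x amount" was never specified:

```python
def thread_cap(hour, peak=500, off_peak=150):
    """Return the thread limit for a given hour of the day (0-23).

    peak threads between 00:00 and 08:00; off_peak (a placeholder
    value standing in for "x amount") the rest of the day.
    """
    return peak if 0 <= hour < 8 else off_peak
```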

  • Regarding your new article, some questions:

    Did you use elitelinkindexer as well? If so, what percentage does it give?
    You said building tiers is not needed anymore when using a premium indexer; does this mean you only have Tier 1s?
  • oh, and what if you do a premium indexer + tier building? :-) 
  • shaun https://www.youtube.com/ShaunMarrs
    I have an elitelinkindexer batch too, but that is part of a 30-day case study comparing a number of different indexing methods, and it is not performing anywhere near as well as it used to.

    I never said building tiers is not needed anymore. Tiers are used for a number of things: tier two is to provide link juice to your tier one links, and tier three is to help index. If you can't afford a premium indexing service then the tier three method is still getting 40% indexed.

    I have a 30-day case study using a premium indexer plus a tier three, but the results are still two weeks out.
  • Ah okay, I just did the last one: 3 tiers + elite indexer.
    Though, I might switch to premium once I've submitted 10k links.
  • echanney usa
    edited January 2017

    Shaun, could you PM me on how to set up my PBN in GSA SER? Thanks. Also, what indexing methods are you using now that elitelinkindexer is not doing so well?

  • 12348766754 Utrecht
    edited January 2017
    @echanney read his blog;

    Information on which indexer he uses (premium link indexer - express indexer) is all stated there :-)