
How To Run GSA SER In The Beast Mode

13 Comments

  • I currently use 15 reposts. Not all sites get reposted; many unique domains though, and the reposts mostly land on blogs, because they have different pages, and if the blog has a high PageRank it's a gold mine of links. I got a lot of PR 6+ sites and .gov and .edu links using these settings.

    Before I used these settings I was hardly getting 1,000 URLs in a day; now I get about 15,000 in 5 hours, and 70% are unique domains.
  • np, and about the captchas: the next update will include more than 80 (new and improved) captcha types
  • shaun https://www.youtube.com/ShaunMarrs
    edited November 2015
    BigGulpsHuhWelp I had really hoped you would have grown a little over the past month or two, but it seems you have not :( and you still leave pointless, shortsighted comments on threads.

    mashafeeq You actually seem like a nice guy who genuinely wants to help. Is there any chance you could answer a few quick questions? Depending on your answers, I have an old ebook that may help you out with SER.

    1 - What platforms are you using in this beast mode? As in article, blog, image comment and so on, or are you just using everything?

    2 - How long have you owned and used SER?

    3 - Have you ever ranked a web page on the first page of Google?

  • Well, I use all of them (only engines that don't delete my links),
    PageRank from 0-9,
    and I filter out the Chinese and Japanese languages.

    I have owned GSA SER, Xrumer, SEnuke, SEO Robot, and Ultimate D for a long time, but I was studying, so you could say I never had time for them. And of course I ranked a site about two years ago, but as I said, I let it go for my studies, and now I am in medical college.

    These days I have returned to the business and am starting to rank my new site, but I have faced a lot of problems: Google won't let me dance with it properly, and there are the new algorithms and such, so I need help from guys like you who have experience with the new Google indexing methods.

    I'm an old-school SEOer =))
  • shaun https://www.youtube.com/ShaunMarrs
    edited November 2015
    mashafeeq I have the ebook in my Dropbox; I can email it to you if you want it. Just PM me your email address.

    It is around six months old or so now, but it has two pretty solid methods that still work, plus some other useful SER stuff.


  • Thanks, I really appreciate it.

    ridofacne.com@gmail.com
  • shaun https://www.youtube.com/ShaunMarrs
    mashafeeq Just sent it; it might take about 5 minutes to come through because of the size of the file.

    It covers the basics of the platforms and a few other things, but be aware that my own personal testing over the past two months or so has shown that some of the high-value link platforms and junk platforms have now also fallen into the worthless links category.
  • How many emails do you suggest if I'm using Yahoo, not catchall? I think 50? Because you have set max accounts per website to 50 +/- 10.
  • the more = the better
  • More emails = fewer links and a lower LPM. A lot of people stick to 5-10 emails per project.
  • Trevor_Bandura 267,647 NEW GSA SER Verified List
    @squidol I agree. 50 different email accounts for SER to check will really slow things down. If anyone needs to make that many different accounts on the same website, a catchall email would be the best option I think.

    Why anyone would need to make that many different accounts on the same site in the same project I have no idea. If this needs to be done, simply duplicate the project 10 times, have 5-10 emails per project, and allow SER to create multiple accounts per site. Or as stated above, use a catchall email.
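    The catchall approach described above can be sketched in a few lines. With a catchall MX, mail sent to any local part at the domain is delivered to one inbox, so addresses can be generated on the fly without creating real mailboxes. This is a minimal illustration only (the domain is a placeholder, and SER itself simply takes a pasted list of addresses):

    ```python
    import random
    import string

    def make_catchall_emails(domain, count, length=10):
        """Generate random, unique mailbox names on a catchall domain.

        Because the domain is catchall, every generated address can be
        used to register a separate account, and all confirmation mail
        still arrives in the single catchall inbox.
        """
        emails = set()
        while len(emails) < count:
            local = "".join(
                random.choices(string.ascii_lowercase + string.digits, k=length)
            )
            emails.add(f"{local}@{domain}")
        return sorted(emails)

    # Example: 50 throwaway addresses on a hypothetical catchall domain
    addresses = make_catchall_emails("example.com", 50)
    print(addresses[:3])
    ```

    Generating and pasting a batch like this is usually faster than registering 50 separate Yahoo accounts, which is why the catchall route is recommended above.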
  • shaun https://www.youtube.com/ShaunMarrs
    squidol exactly.

    Trevor_Bandura I agree completely. I have been testing projects with catchalls and some with only 5 emails, and the LPM change is minimal tbh, but there is no need at all to ever go above 10 email accounts per project.
  • Tim89 www.expressindexer.solutions
    Lol, thanks @shaun for the mention, I appreciate your support and honesty.

    I've only just seen this thread. It made me giggle a little when the guy says he gets his links indexed within a few hours but then says "sorry, I didn't have time to check if they were actually indexed or not."

    Indeed, many newcomers will think that indexing URLs in Google is easy peasy; in actual fact it's more difficult now than ever. I wish people who make these bold claims would think before they type, as there are many misled sheep around here who will simply waste time and resources following suggestions/guides that don't work at all, see no results, and mostly quit... SEO is a very rewarding industry, and I encourage many people into it because I want to see them succeed.

    Express Indexer, among other indexers (that work), is an entirely outsourced service for indexing backlinks in the masses, yet some people do not understand this. At the end of the day, our back-end servers alone are quite substantial in price, and our highest plan is only $48.99 for 1 million credits. Think about it this way: if you were to actively index your own links with any method you've found, you would need a dedicated server running 24/7, and there would still be manual work involved processing them yourself.

    In this industry, time is money, and outsourcing is very beneficial for everyone. A fee of $48.99 per month to automatically process your backlinks for indexing on the fly is, in my eyes, a massive contribution to your SEO duties, but that's just my humble opinion.
  • shaun https://www.youtube.com/ShaunMarrs
    edited November 2015
    Tim89 Exactly, that's why I made such a big point of it; the last thing we need on this forum is the blind leading the blind and it turning into Warrior Forum 2.0. I know how hard it is to index these days, especially SER links, so I don't want people thinking they can do it with GSA Indexer and GSA redirector. To date I think yours is the only indexing service that has worked, and still works, for me, even since the last patch that knocked some of the others out.

    mashafeeq has a good attitude and was trying to help. He is learning and growing as an IMer, and once he has more experience I can see him becoming a real asset to the forum. Unlike BigGulpsHuhWelp, who doesn't seem to have grown at all over the past two months since his failed bragging post that was nothing but embarrassing for everybody, bless him.

  • Sorry guys, I was having internet issues and was off the net for 2 days.

    Well, I never want to mislead anyone, but I share what I have.

    Well, thanks @shaun, and the book really helps a lot.
  • shaun https://www.youtube.com/ShaunMarrs
    mashafeeq Glad it's helping, dude :)

    Had a few people inbox me asking for it since. It's open to anyone; just make sure you put an email address in the PM so I know where to send it, guys.
  • Tim89 Thanks for your humble opinion. As I said, it was my old strategy, and I left SEO for about two years, so I didn't know how hard indexing has become. I used to index fast, but now it really has gotten harder and only a few of my links get indexed.

    I didn't know that Google had changed everything until shaun, through several notes, drew my attention to checking whether my links really got indexed, and I was shocked: only 4 out of 30 links got indexed in 1 day.
  • Tim89 www.expressindexer.solutions
    No problem @mashafeeq. Google is constantly updating their algo, and staying on top of it is important; 2 years out of SEO is like being away from a normal industry for 10 years :).

    You will get on top of it all soon enough. 
  • i hope so.
  • @Tim89 What is the most reliable tool for mass index checking? Scrapebox and GScraper are not accurate at all.
  • shaun https://www.youtube.com/ShaunMarrs
    How did you come to the conclusion that Scrapebox is not accurate?

    I use it all the time and I have never received a false positive. I recheck some URL batches a few times per day too.
  • @Anonymous: What do you mean by inaccurate? What behavior are you seeing, I mean?
  • @Diji1 Insert 100 links, mass check them, and then manually check some of the results, and you will see what I am talking about.
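  • For anyone wanting to reproduce that kind of manual spot check, the usual approach is one Google `site:` query per URL. A minimal sketch of the idea, in Python, is below; it only builds the query URL and inspects already-fetched result HTML. A real bulk checker also needs proxy rotation and CAPTCHA handling, which is exactly where tools can produce the false negatives discussed in this thread:

    ```python
    from urllib.parse import quote_plus

    def site_query_url(url):
        """Build the Google `site:` query used to test whether a single
        URL is in the index (an empty results page suggests not indexed)."""
        return "https://www.google.com/search?q=" + quote_plus(f"site:{url}")

    def looks_indexed(result_html, url):
        """Very rough heuristic on a fetched results page: the target URL
        appearing in the result markup suggests the page is indexed.
        A banned proxy or CAPTCHA page will contain no result markup at
        all, which a naive checker misreads as "not indexed"."""
        return url in result_html

    print(site_query_url("http://example.com/page"))
    ```

    This also illustrates the public-proxy point raised below: a dead or blocked proxy returns a page with no results on it, so the heuristic reports "not indexed" even when the link is fine.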
  • Tim89 www.expressindexer.solutions
    Links get indexed and deindexed many times during the indexing process before holding a position in the index; you could be seeing this in action when you check them over and over again.
  • Once a link gets indexed it never gets deindexed. That means you are using public proxies and they die while checking the index ratio.

    Use private proxies for real results.
  • Tim89 www.expressindexer.solutions
    I believe you're incorrect there, @mashafeeq.
  • shaun https://www.youtube.com/ShaunMarrs
    If only that were true, mashafeeq. The truth is that, unfortunately, links get deindexed all the time for a number of reasons.
  • Not until you get penalized.

    If not, then Google won't remove it from their index.
  • Tim89 www.expressindexer.solutions
    ..... @mashafeeq please.