
How many verified submissions per day is safe?


Comments

  • Domain is about 1 month old, GSA only, around 2,000,000 or more unique pages on the website, but Google has only indexed about 100,000 so far...
  • shaun https://www.youtube.com/ShaunMarrs
    ronniej556 

    Where are you getting all of that content?
    Are you pointing all those links directly to the home page or are they spread out between the inner pages?
    Are you using a proper indexing service for the backlinks you build?
  • ronniej556 Trinidad
    edited November 2015
    Scraping different related websites and using an API.
    All spread out, including the homepage, but the homepage is only 1 link out of around 100,000 anchors. I think so. Is GSA Indexer good?
  • shaun https://www.youtube.com/ShaunMarrs
    No, GSA Indexer is a waste of time in my experience.

    So is it duplicate content, or have you run it through a spinner or something first?
  • ronniej556 Trinidad
    edited November 2015
    Each page has an article relevant to what it is about. I've programmed something to make about 5000 articles with unnoticeable generated spintax, sorted by the latest first and made to be imported into GSA. It took almost 2 hours to import the articles folder into GSA, though; otherwise I would have done a lot more.
    With the spintax, those 5000 come out to around 100,000 unique articles, I'm guessing.
    I've only used GSA Indexer, but I see backlinks went from like 30 to 300 today on Ahrefs, so the indexer must be working?
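Spintax expansion as described above works by picking one option from each `{a|b|c}` group; groups multiply, which is how a few thousand templates can plausibly yield ~100,000 unique articles. A minimal sketch of an expander (illustrative only, not ronniej556's actual generator):

```python
import random
import re

def spin(text, rng=random):
    """Expand one spintax string, e.g. "{Hi|Hello} world" -> "Hi world".
    Resolves the innermost {...|...} group repeatedly until none remain,
    so nested spintax is handled too."""
    group = re.compile(r"\{([^{}]*)\}")
    while True:
        m = group.search(text)
        if m is None:
            return text
        choice = rng.choice(m.group(1).split("|"))
        text = text[:m.start()] + choice + text[m.end():]

template = "{Quick|Short} {guide|intro} to {indexing|ranking} {new|fresh} pages."
print(spin(template))  # one random variant; this template has 2*2*2*2 = 16 possible outputs
```

Each independent group with n options multiplies the variant count by n, so even a short template with a dozen three-way groups has over half a million possible outputs.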
  • shaun https://www.youtube.com/ShaunMarrs
    ronniej556 What platforms in GSA are you using, or are you using them all?

    Ahrefs has NOTHING to do with the Google index at all; please don't fall into that trap.

    Are you trying to rank all 100,000 unique articles for their specific keyword then?
  • ronniej556 Trinidad
    edited November 2015
    Yes, for their specific keyword.
    I'm only using Articles, Wiki and Web 2.0.
  • shaun https://www.youtube.com/ShaunMarrs
    I think you should scrap that site and start over. If you could afford the processing power to run enough copies of SER to rank over 100,000 keywords quickly, you wouldn't need to do SEO to make money; your bank's interest would keep you fine :P.

    Maybe change your strategy slightly: rather than trying to rank 100,000 pages, try to rank one page for one specific keyword. Then you can put 100% effort into that one keyword.
  • edited November 2015
    It really depends on your personal risk tolerance. I don't believe any SEO effort is worth the risk of causing irreparable damage to a serious money site. I use aged PBNs for tier 1 and GSA for tiers 2 & 3. I'll hit each T1 URL with around 3 T2s a day, and hit each T2 with around 10-15 T3s a day. I'll drip these projects over 3-4 weeks.

    SEO is a game of patience.
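Those drip numbers compound, since each day's new T2 links join the pool that the daily T3 links are built against. A back-of-the-envelope sketch (my own arithmetic, assuming T3s hit every T2 built so far):

```python
def drip_totals(t1_urls, t2_per_t1_per_day, t3_per_t2_per_day, days):
    """Cumulative tier-2 and tier-3 link counts over a drip campaign."""
    t2_pool = 0   # tier-2 URLs built so far
    t3_total = 0  # tier-3 links built so far
    for _ in range(days):
        t2_pool += t1_urls * t2_per_t1_per_day   # new T2s for every T1 today
        t3_total += t2_pool * t3_per_t2_per_day  # T3s hit every existing T2
    return t2_pool, t3_total

# e.g. 10 T1 URLs, 3 T2s per T1 per day, 10 T3s per T2 per day, 3 weeks
print(drip_totals(10, 3, 10, 21))  # → (630, 69300)
```

Even at these modest daily rates, a three-week drip ends up in the tens of thousands of tier-3 links, which is part of why the projects need to run that long.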
  • @shaun What list do you use in your campaigns? Do you have your own list, buy them, or let GSA scrape for you?
  • I have my own, but it's weird, I can see other people using GSA to post to some of them.
  • shaun https://www.youtube.com/ShaunMarrs
    DIAMONDHOTRIMZ1 My list is highly refined, made up of my own scraping and two premium lists that I get for free.

    ronniej556 Think of all the premium list sellers, and how many servers they have scraping 24/7. There is always going to be a lot of overlap in the lists. As I said above, I have free subscriptions to two different list services; a while back I compared the two and there was around 60% duplication between them.

    Also, someone using the link extraction method to get target URLs will be able to find a fair amount of your list from a single hit by working backwards.
  • @shaun

    Probably SERocket and SERlists

    Both of which are like 70% Joomla k2
  • shaunshaun https://www.youtube.com/ShaunMarrs
    BigGulpsHuhWelp 

    SERocket and Loopline's AutoApprove.

    I have tried a fair few list providers now and they are all packed out with crap platforms... that's why I run them through my own process every few weeks, take the good ones, and just scrape what I want and process my own stuff.

  • Self-scraping is really the only route to go now. I think SERocket adds like 3 URLs to their lists per day; I always get "already parsed" when I import their lists every day.
    Tim89 www.expressindexer.solutions
    The older the domain, the more resilience you'll have to play with.

    Another factor is how you are indexing these links. If you're not actively indexing them, then building those numbers in that time frame should be fine.

    Keyword variations: how many keyword anchors are you using? Sometimes people think it is 100% to do with link velocity, but most of the time it is actually your anchor ratios being too high with such output.

    Are you tiering your links?
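The anchor-ratio point above can be made concrete: the usual red flag is one exact-match anchor owning too large a share of the total profile, not the raw daily link count. A quick illustrative check (hypothetical numbers, not from the thread):

```python
from collections import Counter

def anchor_ratios(anchors):
    """Percent of the link profile held by each anchor text."""
    counts = Counter(anchors)
    total = sum(counts.values())
    return {a: round(100 * c / total, 1) for a, c in counts.items()}

# a profile dominated by one exact-match anchor
profile = ["buy widgets"] * 70 + ["example.com"] * 20 + ["click here"] * 10
print(anchor_ratios(profile))
# → {'buy widgets': 70.0, 'example.com': 20.0, 'click here': 10.0}
```

Run against an export of your backlink anchors, a table like this makes it obvious when one money keyword is creeping toward a dangerous share of the profile.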