
What are you using as T1?


Comments

  • cherub SERnuke.com
    sickseo said:
    I don't use paid indexers - too expensive and ineffective.

    I use SEO Indexer and URL Redirect Pro links as T1 only.

    I don't run automated tiers anymore with the software. Too many broken tiers are created, so I don't advise doing that. For example, many gnuboard links are verified as live, but when checking them I'm prompted for a login and password.

    I use standalone desktop software to store and check all my tiered links from all my different link sources. These days I use GSA SER to run single tiers instead - either as T1 to money site URLs, or as a last tier pointing at links that have been checked to be live, e.g. a tiered campaign from RankerX, or T1 GSA SER links.

    The frequency of verifying links really isn't going to make a difference - if you are automating tiers you will always be building links to some dead links at some point - it's inevitable when using public link sources to create tiers. A link that's live today could die at any point in the future.

    Ideally you should set the reverify links option to every day if you want to automate tiers.
    @Sven I know there is a function in SER that generally checks for the 'need to be logged in' case before adding links to the verified lists, but maybe there is something weird about gnuboard that is slipping through like he said? Just so we're not building tiers to false positives?
    We probably need to add a step to the engine script to make sure SER is logged out of an account before the URL is initially verified. I'll take a look in the next few days.
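
    A minimal sketch of such a logged-out verification check. The marker patterns (including the gnuboard `mb_password` field name) are illustrative guesses, not SER's actual engine-script logic:

```python
import re

# Illustrative markers for a login wall. These are assumptions, not
# the real SER verification rules: a password input field, plus the
# field name gnuboard commonly uses for its login form.
LOGIN_MARKERS = [
    r'<input[^>]+type=["\']password["\']',
    r'name=["\']mb_password["\']',
]

def looks_like_login_wall(html: str) -> bool:
    """Return True if the fetched page resembles a login prompt."""
    return any(re.search(pattern, html) for pattern in LOGIN_MARKERS)

def link_is_verified(html: str, backlink_url: str) -> bool:
    """Only count a link as live if the page contains it AND the page
    is not a login wall shown to logged-out visitors."""
    return backlink_url in html and not looks_like_login_wall(html)
```

    The point is simply that "page loads and contains my URL" is not enough; a page that serves a login form to anonymous visitors should be excluded before tiers are built to it.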
  • sickseo said:
    They certainly help towards the overall strategy. I've scraped well over 10k unique domains from the new engines. Still scraping more. They are good links for T1. Not available anywhere else. Nice mix of profile and contextual links, dofollow and nofollow.



    I've not done any isolated tests with just these sites. I'm pointing them at the homepage of all my clients' sites and my own projects to boost the referring domains and authority. But once juiced up with tiers, there is no reason why they won't help with rankings.
    Hey, just getting back into the SEO game. Are you sending all these links to well-established sites with a lot of links already, just to bulk them up?

    I need to get up to date on the Google algos, but does this amount of links not nuke sites?

  • sickseo London, UK
    So far I've just used them as T1 to the homepage. Then running a T2 campaign to boost them further.

    I pay zero attention to link velocity. It's not a Google ranking factor. Of course there are many SEOs who would disagree with this, or who are simply too scared to consider this approach. I've scared off many clients with this approach who just don't get it! lol

    Within 24 hours, I've used all my link sources and tools on T1 to the homepage of brand-new sites. That's around 30k unique domains lol. That's a normal strategy for me. I've never experienced any rank drops or signs of being penalised. To boost domain authority you need to be using unique domains, as the boost only counts once.

    Considering how slow google is at discovering new links, I really don't see it being a problem. It could be weeks or even months before any kind of natural crawling/indexing takes place.

    These days to compete in Google, the game is all about domain authority. Once it's high enough, even inner pages from the site will rank without any backlinks - the link juice flowing through internal links will be enough to rank them. Hence why I focus on boosting the homepage as quickly as possible.
    Thanked by OldGregg
  • Tim89 www.expressindexer.solutions
    edited March 7
    sickseo said:
    So far I've just used them as T1 to the homepage. Then running a T2 campaign to boost them further.

    I pay zero attention to link velocity. It's not a Google ranking factor. Of course there are many SEOs who would disagree with this, or who are simply too scared to consider this approach. I've scared off many clients with this approach who just don't get it! lol

    Within 24 hours, I've used all my link sources and tools on T1 to the homepage of brand-new sites. That's around 30k unique domains lol. That's a normal strategy for me. I've never experienced any rank drops or signs of being penalised. To boost domain authority you need to be using unique domains, as the boost only counts once.

    Considering how slow google is at discovering new links, I really don't see it being a problem. It could be weeks or even months before any kind of natural crawling/indexing takes place.

    These days to compete in Google, the game is all about domain authority. Once it's high enough, even inner pages from the site will rank without any backlinks - the link juice flowing through internal links will be enough to rank them. Hence why I focus on boosting the homepage as quickly as possible.
    But Tops, if you're building your tier 2s/3s out en masse for indexation, this will of course make more paths for Googlebot to find your tier 1s... especially when bulk-building tier 2s and 3s in layers. I have in the past experienced a hell of a lot of dancing when going gung-ho like this, even when I've processed a sizable number of tier 1s through the indexer... perhaps the dance may be short-lived and rankings might return etc., but I wouldn't personally disregard link velocity entirely.

    I've been pretty much building my tiers out like you, but I solely use PBNs as tier 1s. I'm not as brave as you - building 30k tier 1 links directly at a money site that I care for, or one belonging to a client of mine lol. You crazy crazy man :D
  • sickseo London, UK
    Lol Seriously, link velocity is not a ranking factor. 


    Here's the entire post: https://www.searchenginejournal.com/google-link-velocity/331637/

    I've seen that same info from google in multiple places over the years. According to Google, link velocity is just a made up term with no place in their ranking algorithm.

    The quality of the link and whether it was acquired naturally or not are what they care about. But how quickly the links appear is irrelevant.

    Of course there is nothing natural about automated link building lol
  • Tim89 www.expressindexer.solutions
    edited March 8
    sickseo said:
    Lol Seriously, link velocity is not a ranking factor. 


    Here's the entire post: https://www.searchenginejournal.com/google-link-velocity/331637/

    I've seen that same info from google in multiple places over the years. According to Google, link velocity is just a made up term with no place in their ranking algorithm.

    The quality of the link and whether it was acquired naturally or not are what they care about. But how quickly the links appear is irrelevant.

    Of course there is nothing natural about automated link building lol
    Hmm, ok, I see. 

    Alright, even if it's not treated as an actual ranking factor, though, could they treat it as a signal to detect viral content? For example, if it sees a load of links all of a sudden, it could deem the site or page a temporary/viral post, then drastically diminish those rankings/links thereafter...?

    There’s got to be some sort of protection in the algorithm to deter mass spam of decent quality links which in this day and age, would be totally possible what with AI and the automation that is available to us…?

    There's just a part of me that tells me that it would be wrong lol :D

  • sickseo London, UK
    From what I see, no there isn't any type of protection - other than the quality of the link itself being processed by their algorithm.

    If you publish low quality content in volume, you won't get any results, as the algorithm will pick up on its low quality and give the links a low quality score - they will have little impact on rankings.

    If you do the opposite and publish high quality content and you get it indexed within 24 hours, then I would expect there to be a significant rank boost. No penalisation for building links too quickly - that's irrelevant according to Google.

    We did something very similar back in the day with those tumblr blogs in the payday loans niche hooked up with your indexing service. We were ranking within days until tumblr took the blogs down.

    Nothing's changed, other than it being a lot more difficult to index links in volume in a short time frame.


  • Tim89 www.expressindexer.solutions
    sickseo said:
    From what I see, no there isn't any type of protection - other than the quality of the link itself being processed by their algorithm.

    If you publish low quality content in volume, you won't get any results, as the algorithm will pick up on its low quality and give the links a low quality score - they will have little impact on rankings.

    If you do the opposite and publish high quality content and you get it indexed within 24 hours, then I would expect there to be a significant rank boost. No penalisation for building links too quickly - that's irrelevant according to Google.

    We did something very similar back in the day with those tumblr blogs in the payday loans niche hooked up with your indexing service. We were ranking within days until tumblr took the blogs down.

    Nothing's changed, other than it being a lot more difficult to index links in volume in a short time frame.



    Oh snap, yes, I remember that. We literally ranked for payday loans in two days or something. Those were fun times.

    I'll be setting up a few tests then and going gung-ho on them :D
  • sickseo said:
    Using Scrapebox for scraping Google. I've used 200 dedicated proxies in the past for this, but could only scrape at 1 thread to avoid IP bans from Google. It can literally scrape continuously for days with minimal blocks. Now I'm only using rotating data proxies to scrape Google, which cost $0.50 per GB. Way faster, and I can run Scrapebox at 500 threads. Definitely more expensive this way.

    I'm also using Hrefer to scrape other search engines. It seems to work better, with fewer IP blocks. I'm using IPv4 rotating mobile proxies, datacenter proxies and even Storm proxies that have rotating IPs. They all work quite nicely. It can scrape Bing, Yahoo, SO.com, Seznam.cz and DuckDuckGo.
    Another question on this as I try to recover all my old licenses!!

    Which is best to scrape with: Hrefer or Scrapebox? I always used to use Hrefer for the most part and wondered if that is still best (or, as you say, best for those search engines mentioned)?

    Also, in regards to VPS levels, are they resource intensive? What level of VPS would you (or anyone else) recommend for Scrapebox and Hrefer (although I'm aware that these may be on the same server as XEvil, which requires a lot of CPU power)?
  • Also, trying to work out my budgets on all this. How much does it cost to scrape using the proxies at $0.50 per GB, as you are using them?
  • sickseo London, UK
    OldGregg said:
    sickseo said:
    Using Scrapebox for scraping Google. I've used 200 dedicated proxies in the past for this, but could only scrape at 1 thread to avoid IP bans from Google. It can literally scrape continuously for days with minimal blocks. Now I'm only using rotating data proxies to scrape Google, which cost $0.50 per GB. Way faster, and I can run Scrapebox at 500 threads. Definitely more expensive this way.

    I'm also using Hrefer to scrape other search engines. It seems to work better, with fewer IP blocks. I'm using IPv4 rotating mobile proxies, datacenter proxies and even Storm proxies that have rotating IPs. They all work quite nicely. It can scrape Bing, Yahoo, SO.com, Seznam.cz and DuckDuckGo.
    Another question on this as I try to recover all my old licenses!!

    Which is best to scrape with: Hrefer or Scrapebox? I always used to use Hrefer for the most part and wondered if that is still best (or, as you say, best for those search engines mentioned)?

    Also, in regards to VPS levels, are they resource intensive? What level of VPS would you (or anyone else) recommend for Scrapebox and Hrefer (although I'm aware that these may be on the same server as XEvil, which requires a lot of CPU power)?

    Due to a recent change with Google adding JavaScript to their SERPs, Scrapebox no longer functions as it used to. They released an updated version, but it runs really slowly and stops by itself. I've had to build my own bot with ZennoPoster, which works really well.



    For the other engines, Hrefer is still my tool of choice. Good for scraping Seznam.cz, SO.com, Yahoo, Bing and DuckDuckGo.



    The other option is to use GSA SER. It has a huge number of search engines built in, but it will take some fine-tuning to find the engines that work with your proxies. Something I've not played with yet.

    A VPS for running these tools - they both use a small amount of CPU/memory, so you don't need anything special to run them. I put my installs on the same VPS as XEvil. I get an AMD 4 CPU/8 GB RAM VPS for 5.99 euros/month and that's plenty to run GSA SER with Hrefer and Scrapebox.
    Thanked by OldGregg
  • sickseo London, UK
    OldGregg said:
    Also, trying to work out my budgets on all this. How much does it cost to scrape using the proxies at $0.50 per GB, as you are using them?
    Costs will depend on how much you scrape. I'm running 20 threads continuously at the moment, which is about $1/hour.



    Not something I could sustain every day, as that's just too expensive. But for short scrapes targeting specific engines that I know will work in GSA, it's well worth it.
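
    For budgeting, those numbers can be turned into a rough estimator. The GB-per-thread figure below is a back-of-the-envelope assumption backed out of the quoted 20 threads at roughly $1/hour and $0.50/GB; real consumption varies a lot with page size and query depth:

```python
# Rough cost estimator for pay-per-GB rotating proxies.
PRICE_PER_GB = 0.50        # USD, the rate quoted above
GB_PER_THREAD_HOUR = 0.1   # assumption: 20 threads ~ 2 GB/hour total

def scrape_cost(threads: int, hours: float) -> float:
    """Estimated USD cost of a scraping run at the rates above."""
    return threads * hours * GB_PER_THREAD_HOUR * PRICE_PER_GB
```

    At these assumed rates, the 20-thread setup works out to about $1/hour, and a two-hour burst at Scrapebox's 500 threads to about $50 - which is why short, targeted scrapes are the only affordable way to use this kind of proxy.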

    The other alternative I use to scrape Google is static residential proxies from privateproxy.me. Unlimited bandwidth, but you'll need a lot of them running at low threads to avoid the IPs being banned. I get mine at 50% off as well. They work very well in ZennoPoster, but only because it clears cookies and uses a unique profile before each search, which also helps avoid IP bans.
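
    The cookie-clearing trick isn't ZennoPoster-specific. A minimal standard-library sketch of the same idea - a fresh, cookie-less connection per query, routed through a hypothetical rotating-gateway endpoint - might look like:

```python
import urllib.parse
import urllib.request

# Placeholder gateway address: rotating-proxy providers typically give
# one host:port that hands out a new exit IP per connection.
GATEWAY = "http://user:pass@gateway.example:8000"

def serp_url(query: str, page: int = 0) -> str:
    """Build a Google SERP URL; page N starts at result N*10."""
    params = urllib.parse.urlencode({"q": query, "start": page * 10})
    return f"https://www.google.com/search?{params}"

def fetch_serp(query: str, page: int = 0) -> str:
    # A fresh opener per request means no cookies carry over between
    # queries, mimicking the cleared-cookies/unique-profile behaviour.
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": GATEWAY, "https": GATEWAY})
    )
    opener.addheaders = [("User-Agent", "Mozilla/5.0 (sketch)")]
    with opener.open(serp_url(query, page), timeout=30) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

    This is only a sketch of the session-hygiene idea; a real scraper would also need to parse the JavaScript-rendered SERPs mentioned earlier.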
    Thanked by OldGregg
  • dnshost Thailand
    DataImpulse is expensive but works well. I use it with RankerX + XEvil. After changing the Webshare.io proxy to dataimpulse.com, XEvil works very smoothly. No proxy bans like when using Webshare.io.

    But I wouldn't use it for URL scraping - it's expensive.

    Thanked by OldGregg
  • sickseo said:
    OldGregg said:
    Also, trying to work out my budgets on all this. How much does it cost to scrape using the proxies at $0.50 per GB, as you are using them?
    Costs will depend on how much you scrape. I'm running 20 threads continuously at the moment, which is about $1/hour.



    Not something I could sustain every day, as that's just too expensive. But for short scrapes targeting specific engines that I know will work in GSA, it's well worth it.

    The other alternative I use to scrape Google is static residential proxies from privateproxy.me. Unlimited bandwidth, but you'll need a lot of them running at low threads to avoid the IPs being banned. I get mine at 50% off as well. They work very well in ZennoPoster, but only because it clears cookies and uses a unique profile before each search, which also helps avoid IP bans.
    Cheers for your answers, much appreciated!!

    How easy/difficult is it to code ZennoPoster to scrape Google? I've never used it, but I might need to spend some time learning it next week, as it seems like it can be a real help and time saver!!

    Do you mind sharing your VPS supplier? At that price I might need to try a couple of them.
  • sickseo London, UK
    edited March 22
    It's very easy to use. I've set up many bots for my business: for content generation with ChatGPT and Spin Rewriter, making catchall emails for GSA and RankerX through cPanel, setting up new RankerX campaigns, posting to PBNs, even building my PBNs for me. The bot for scraping Google was probably the simplest one to make. I'm planning to make more for other search engines too. There really is no limit here.

    Once you understand the basics, you'll be able to build a bot for just about anything.

    It's got a recording function which builds the bot for you as you perform each web action. That will be the best place to start with it. 

    For the VPS I mentioned - it's Layer7: https://www.layer7.net/ Sign up for an account and you'll see their cloud VPS, which has options for CPU cores and GB of RAM. Go for the AMD cloud VPS, as it performs better than the Intel VPS. I go for 4 CPUs and 8 GB RAM.


    It comes with Windows Server 2022 installed. It's great performance for running just GSA SER, but you can add more CPUs and RAM if you want to run more tools on there.
    Thanked by OldGregg
  • Amazing, thanks as always!! I'm quite process-driven, so once I'm happy with any of my processes I will be looking for automation wherever I can.

    Will check out that VPS, as I have all my tool licences sorted, ready to get started.