
What are you using as T1?

As the title says: what are you using as T1? Which types of links?


Comments

  • My T1 projects use these engines:


  • I just checked the website of SERnuke (which I didn't know before) and I can see you're featured in a testimonial.
    Are you getting good results (in Google rankings) with these selected T1s?
    Thank you very much for your help.
  • sickseo (London, UK)
    edited November 2024
    They certainly help towards the overall strategy. I've scraped well over 10k unique domains from the new engines. Still scraping more. They are good links for T1. Not available anywhere else. Nice mix of profile and contextuals, do follow and no follow.

    I've not done any isolated tests with just these sites. I'm pointing them at the homepage of all my clients' sites and my own projects to boost the referring domains and authority. But once juiced up with tiers, there is no reason why they won't help with rankings.
  • Inside GSA SER, I use only articles, social forums, and wikis. However, I don't use any bad-word filters or anything else to filter sites out. I'm using a Tier 1 link list, with two verified lists loaded at a time. Same with RankerX (articles only).
  • Allan (seoservices.com.br)
    sickseo said:
    They certainly help towards the overall strategy. I've scraped well over 10k unique domains from the new engines. Still scraping more. They are good links for T1. Not available anywhere else. Nice mix of profile and contextuals, do follow and no follow.

    I've not done any isolated tests with just these sites. I'm pointing them at the homepage of all my clients' sites and my own projects to boost the referring domains and authority. But once juiced up with tiers, there is no reason why they won't help with rankings.
    Do you use non-contextual profile links? Do you think it's a good strategy? I've seen a lot of dirty profile links from Senuke.

    Are there any results with non-contextual links?
  • What do you mean by "dirty"?

    Non-contextual links will still have the keyword in the URL, and the page where you create the link will also have some keyword-related content if you set up your campaign to do that. So yes, there is still plenty of value in using non-contextual links. If it's do follow, it still passes link juice; it makes no difference that it's non-contextual.

    If it's no follow, it will still help with rankings in search engines that treat no follow and do follow links the same. If you're only interested in Google, then it will still help with crawling/indexing the links it's pointed at.

    I don't discriminate in any way with my T1 links when deciding between contextuals and non-contextuals - not with these engines, as they all have low OBLs (outbound links), so they're very good for pushing link juice through. Besides, I'm more interested in boosting my referring domains count, so I'll use all the sites available.

    If you just chase contextuals all the time, you won't have many sites to play with.
  • Allan (seoservices.com.br)
    I said dirty links because I found a lot of spammy links. Job site engines are amazing.

    But your approach is interesting. I'll do some tests.

    How many links do you usually make per day?

    I've seen you doing 10-20 per URL, and I get some results. I have some sites to burn with tests.
  • lol Link velocity is not a Google ranking factor. So when I run my T1 campaigns, I have no delays set. The campaign will comfortably make 10k+ links inside 24 hours. I do the same with RankerX and can send 10k new T1 links to any landing page within a few hours.

    Bear in mind these links don't get crawled and indexed in a few hours. It could take weeks for Google to find them naturally, in some cases months.

    But it's definitely best to test what's possible.
  • Allan (seoservices.com.br)
    sickseo said:
    lol Link velocity is not a Google ranking factor. So when I run my T1 campaigns, I have no delays set. The campaign will comfortably make 10k+ links inside 24 hours. I do the same with RankerX and can send 10k new T1 links to any landing page within a few hours.

    Bear in mind these links don't get crawled and indexed in a few hours. It could take weeks for Google to find them naturally, in some cases months.

    But it's definitely best to test what's possible.
    Good thing you save on indexing lol

    But I really prefer the "link drip" strategy.

    But it's worth every test.
  • nqhung (Hong Kong)
    edited November 2024
    sickseo said:
    They certainly help towards the overall strategy. I've scraped well over 10k unique domains from the new engines. Still scraping more. They are good links for T1. Not available anywhere else. Nice mix of profile and contextuals, do follow and no follow.

    I've not done any isolated tests with just these sites. I'm pointing them at the homepage of all my clients' sites and my own projects to boost the referring domains and authority. But once juiced up with tiers, there is no reason why they won't help with rankings.
    @sickseo Do you scrape URLs for the SERnuke engines with GSA or Scrapebox? And does it require a lot of IPv4 proxies?
  • Using Scrapebox for scraping Google. I've used 200 dedi proxies in the past for this, but can only scrape at 1 thread to avoid IP bans from Google. It can literally scrape continuously for days with minimal blocks. Now I'm only using rotating data proxies to scrape Google, which cost $0.50 per GB. Way faster, and I can run Scrapebox at 500 threads. Definitely more expensive this way.

    I'm also using Hrefer to scrape other search engines. It seems to work better, with fewer IP blocks. Using IPv4 - rotating mobile proxies, datacenter proxies and even Storm Proxies that have rotating IPs. They all work quite nicely. It can scrape Bing, Yahoo, SO.com, Seznam.cz and DuckDuckGo.
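    For anyone curious about the mechanics, here is a rough Python sketch of scraping a footprint query through a rotating proxy gateway. The gateway address, credentials and footprint below are placeholders, not the exact setup described above.

```python
# Rough sketch only: fetch Google results for a footprint query through a
# rotating proxy gateway. The gateway host, credentials and footprint are
# placeholders, not a real setup.
import requests

PROXY = "http://user:pass@rotating-gateway.example.com:8000"  # placeholder
PROXIES = {"http": PROXY, "https": PROXY}

def scrape_google(query, pages=1):
    found = []
    for page in range(pages):
        resp = requests.get(
            "https://www.google.com/search",
            params={"q": query, "start": page * 10},
            proxies=PROXIES,
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=30,
        )
        resp.raise_for_status()
        # Very naive URL extraction; a real scraper parses the HTML properly.
        for chunk in resp.text.split('href="')[1:]:
            url = chunk.split('"', 1)[0]
            if url.startswith("http") and "google." not in url:
                found.append(url)
    return found

print(scrape_google('"powered by gnuboard"'))
```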
  • nqhung (Hong Kong)
    sickseo said:
    Using Scrapebox for scraping Google. I've used 200 dedi proxies in the past for this, but can only scrape at 1 thread to avoid IP bans from Google. It can literally scrape continuously for days with minimal blocks. Now I'm only using rotating data proxies to scrape Google, which cost $0.50 per GB. Way faster, and I can run Scrapebox at 500 threads. Definitely more expensive this way.

    I'm also using Hrefer to scrape other search engines. It seems to work better, with fewer IP blocks. Using IPv4 - rotating mobile proxies, datacenter proxies and even Storm Proxies that have rotating IPs. They all work quite nicely. It can scrape Bing, Yahoo, SO.com, Seznam.cz and DuckDuckGo.
    Which proxy providers are you using? I use Storm Proxies, but it seems that many of their IPs are blocked.
  • For Google, I'm using DataImpulse.

    Storm Proxies are no good for scraping Google. All their IPs are already blocked. They work OK in Hrefer though.


  • iamzahidali (United States)
     sickseo said:
    For Google, I'm using DataImpulse.

    Storm Proxies are no good for scraping Google. All their IPs are already blocked. They work OK in Hrefer though.

    Have you tried using a scraping API for Google instead of proxies? I guess they cost around $0.50 per 1k requests, but they make sure you never get banned. ScrapeOWL, etc.? I think there might be a way to integrate one with GSA SER or Scrapebox.
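    Roughly, calling one of these services looks something like the sketch below. The endpoint and parameter names are made up for illustration only, not ScrapeOWL's (or any provider's) real API, so check the provider's docs before wiring it into anything.

```python
# Hypothetical example only: the endpoint and parameter names below are
# placeholders, not ScrapeOWL's (or any provider's) real API. Most scraping
# APIs follow this general shape: send an API key plus the target URL, get HTML back.
import requests

API_KEY = "your-api-key"  # placeholder
SERP_URL = "https://www.google.com/search?q=%22powered+by+gnuboard%22"

resp = requests.get(
    "https://api.scraper.example.com/v1/scrape",  # placeholder endpoint
    params={"api_key": API_KEY, "url": SERP_URL},
    timeout=60,
)
resp.raise_for_status()
html = resp.text  # raw SERP HTML to parse for result URLs and feed into SER/Scrapebox
```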
  • nqhung (Hong Kong)
     sickseo said:
    For Google, I'm using DataImpulse.

    Storm Proxies are no good for scraping Google. All their IPs are already blocked. They work OK in Hrefer though.

    Have you tried using a scraping API for Google instead of proxies? I guess they cost around $0.50 per 1k requests, but they make sure you never get banned. ScrapeOWL, etc.? I think there might be a way to integrate one with GSA SER or Scrapebox.
    Do you mean https://www.scrapingbee.com/documentation/google/? Or any other provider?
  • royalmice (WEBSITE: https://asiavirtualsolutions.com | SKYPE: asiavirtualsolutions)
    This is what you should be using when building Tier 1 in GSA SER:

    GSA Search Engine Ranker Projects | How To Create Contextual
  • googlealchemist (Anywhere I want)
    sickseo said:
    Using Scrapebox for scraping Google. I've used 200 dedi proxies in the past for this, but can only scrape at 1 thread to avoid IP bans from Google. It can literally scrape continuously for days with minimal blocks. Now I'm only using rotating data proxies to scrape Google, which cost $0.50 per GB. Way faster, and I can run Scrapebox at 500 threads. Definitely more expensive this way.

    I'm also using Hrefer to scrape other search engines. It seems to work better, with fewer IP blocks. Using IPv4 - rotating mobile proxies, datacenter proxies and even Storm Proxies that have rotating IPs. They all work quite nicely. It can scrape Bing, Yahoo, SO.com, Seznam.cz and DuckDuckGo.
    Curious how long ago you were running 1 thread with 200 proxies to avoid bans? I noticed recently I can get away with a 90-second delay per IP... haven't tried pushing it more than that yet.
  • sickseo (London, UK)
    Been a few months since I've run it like that. But it ran flawlessly for days with minimal errors. Currently using rotating datacenter proxies for scraping Google.
  • 9tRider (Newark)
    For tier 1 I enable only articles, social networks, forums and wikis, nothing else. I use fresh content with no spinning, all generated with GPT-4 and Claude in a custom format via SEO Content Machine or SEO Content Generator. I use a tier 1 link list for the first tier and try to build links without dripping.

    Works like a charm for me.
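    For anyone generating content outside those tools, a minimal sketch of producing fresh (unspun) article text with the OpenAI API might look like this. The model name and prompt are just examples, not the exact setup described above.

```python
# For illustration only: a minimal sketch of generating fresh (unspun) article
# content with the OpenAI API. Model name and prompt are just examples, not the
# exact setup used in SEO Content Machine / SEO Content Generator.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def generate_article(keyword, words=600):
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": f"Write a {words}-word informational article about {keyword}.",
        }],
    )
    return resp.choices[0].message.content

print(generate_article("emergency plumbing in London"))
```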
  • @sickseo

    so.com works OK? I think with the new Google updates it's even harder.


  • sickseo (London, UK)


    so.com works really well. This JavaScript change from Google is a bit of a pain. Waiting on Scrapebox to come up with a solution. But Hrefer is still powering ahead with no issues.
  • 9tRider (Newark)
    @sickseo

    so.com works OK? I think with the new Google updates it's even harder.


    Scrapebox has fixed it with a new JavaScript update, but it's slower and mostly a one-thread operation.

    It's best to buy a few link lists, extract the links and expand from them instead of scraping. It will save both time and money for sure.

  • sickseo (London, UK)
    Thanks for the update. Will test it now.

    I've been down the road of buying link lists. The list of sites is tiny compared to what I scrape myself. Plus, you share that site list with hundreds of other users, so every site in that list is guaranteed to get spammed to death.

    If you do your own scraping, there is no limit to the size of your list. Plus, if you use custom footprints and keywords, you will find sites that most list sellers will never find.

    For T2 usage, it's not that important, but for T1 usage, it's very important, especially now that we've got the new engines from SER Nuke. There are literally thousands of new sites out there, if you can find them.
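    For anyone new to this, the basic idea of combining custom footprints with keywords into scrape queries looks something like the following sketch; the footprints and keywords are only examples, not a recommended list.

```python
# For illustration only: a minimal sketch of combining custom footprints with
# keywords into scrape queries. The footprints and keywords here are just examples.
import itertools

footprints = ['"powered by gnuboard"', 'inurl:register "new member"']
keywords = ["plumber london", "dentist miami", "seo services"]

queries = [f"{fp} {kw}" for fp, kw in itertools.product(footprints, keywords)]

with open("scrape_queries.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(queries))

print(f"Generated {len(queries)} queries")
```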
  • Do you think blog comments are enough for indexing, or should I use pings or something else in my lower tiers?

    Also, is there any other option than Scrapebox to test whether the links are indexed or not?
  • sickseo (London, UK)
    Blog comments, guestbooks and image comments will be good for indexing, as long as you've scraped those sites from Google. This strategy is pointing Google-indexed URLs at your non-indexed links.

    I also use wikis and articles in my last tier, as they can also create paths for the Googlebot to crawl, but you need very good content to rely on natural indexing of these content links.

    There are redirects and indexers, but I don't expect them to help much on their own. Some redirects are blocked from indexing; others are URLs that don't exist anywhere but on paper, so they themselves need another layer to get them crawled. Generally speaking, they are harder to index since Google's spam update last year.

    I'm only aware of Scrapebox for index checking.
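    For reference, the check an index checker performs boils down to a site: query; here's a rough Python sketch. The "no results" phrase is a naive heuristic, and real runs need proxies and delays to avoid blocks.

```python
# For illustration only: the underlying check an index checker performs is a
# site: query against Google. The "no results" string below is a naive
# heuristic, and real runs need proxies/delays to avoid getting blocked.
import time
import requests

def is_indexed(url):
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": f"site:{url}"},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=30,
    )
    resp.raise_for_status()
    return "did not match any documents" not in resp.text

for link in ["https://example.com/some-backlink-page/"]:
    print(link, "indexed" if is_indexed(link) else "not indexed")
    time.sleep(10)  # be gentle from a single IP; real runs rotate proxies
```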
  • sickseo said:
    Blog comments, guestbooks and image comments will be good for indexing, as long as you've scraped those sites from Google. This strategy is pointing Google-indexed URLs at your non-indexed links.

    I also use wikis and articles in my last tier, as they can also create paths for the Googlebot to crawl, but you need very good content to rely on natural indexing of these content links.

    There are redirects and indexers, but I don't expect them to help much on their own. Some redirects are blocked from indexing; others are URLs that don't exist anywhere but on paper, so they themselves need another layer to get them crawled. Generally speaking, they are harder to index since Google's spam update last year.

    I'm only aware of Scrapebox for index checking.
    Thank you, I am going to test this approach on a few campaigns. Are you also using any paid indexer? How about GSA SEO Indexer and the GSA URL redirect software?
    Also, since you seem very well versed in GSA setup, can you help me with the proper settings for re-verifying links in T1 and T2? Should I stop verifying them after, let's say, 10 or 30 days, or should I keep verifying them every day?
  • sickseo (London, UK)
    I don't use a paid indexer - too expensive and ineffective.

    SEO Indexer and URL Redirect PRO I use as T1 only.

    I don't run automated tiers anymore with the software. Too many broken tiers get created, so I don't advise doing that. For example, many gnuboard links are verified to be live, but when checking them I'm prompted for a login and password.

    I use standalone desktop software to store and check all my tiered links from all my different link sources. These days I use GSA SER to run single tiers instead - either as T1 to money site URLs, or as a last tier pointing at links that have been checked to be live, e.g. a tiered campaign from RankerX, or T1 GSA SER links.

    The frequency of verifying links really isn't going to make a difference - if you are automating tiers, you will always be building links to some dead links at some point; it's inevitable when using public link sources to create tiers. A link that's live today could die at any point in the future.

    Ideally you should set the re-verify links option to every day if you want to automate tiers.
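    For what it's worth, re-verifying is essentially re-fetching each verified backlink page and checking that it still contains a link to the target. A rough sketch, where the file name and target URL are placeholders:

```python
# For illustration only: what "re-verify" boils down to. Re-fetch each verified
# backlink page and check it still contains a link to the target. The file name
# and target URL are placeholders.
import requests

TARGET = "https://example.com/"  # money site or lower-tier URL

def still_live(backlink_page, target=TARGET):
    try:
        resp = requests.get(backlink_page, timeout=30,
                            headers={"User-Agent": "Mozilla/5.0"})
        return resp.ok and target in resp.text
    except requests.RequestException:
        return False

with open("verified_links.txt", encoding="utf-8") as f:
    links = [line.strip() for line in f if line.strip()]

live = [u for u in links if still_live(u)]
print(f"{len(live)}/{len(links)} links still live")
```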
  • sickseo said:
    I don't use a paid indexer - too expensive and ineffective.

    SEO Indexer and URL Redirect PRO I use as T1 only.

    I don't run automated tiers anymore with the software. Too many broken tiers get created, so I don't advise doing that. For example, many gnuboard links are verified to be live, but when checking them I'm prompted for a login and password.

    I use standalone desktop software to store and check all my tiered links from all my different link sources. These days I use GSA SER to run single tiers instead - either as T1 to money site URLs, or as a last tier pointing at links that have been checked to be live, e.g. a tiered campaign from RankerX, or T1 GSA SER links.

    The frequency of verifying links really isn't going to make a difference - if you are automating tiers, you will always be building links to some dead links at some point; it's inevitable when using public link sources to create tiers. A link that's live today could die at any point in the future.

    Ideally you should set the re-verify links option to every day if you want to automate tiers.
    Thank you very much. Do you prefer to use RankerX for T1, or are you just using it to diversify backlinks?
  • sickseo (London, UK)
    There are a few ways you could use a tool like RankerX. I've got my set of custom templates: 1-tier, 2-tier and 3-tier. I've got it set up with ZennoPoster, so campaign creation is completely automated. These get pointed at money site URLs as well as my PBNs. You could point them at anything you want powered up.