What are you using as T1?
What are you using as T1? Which type of links?
Are you getting good results (in Google rankings) with these selected T1s?
Thank you very much for your help.
I've not done any isolated tests with just these sites. I'm pointing them at the homepage of all my clients' sites and my own projects to boost the referring domains and authority. But once juiced up with tiers, there's no reason why they won't help with rankings.
Non-contextual links will still have the keyword in the URL, and the page where you create the link will also have some keyword-related content if you set up your campaign to do that. So yes, there is still plenty of value in using non-contextual links. If it's dofollow it still passes link juice; it makes no difference that it's non-contextual.
If it's nofollow, it will still help with rankings in search engines that treat nofollow and dofollow links the same. If you're only interested in Google, then it will still help with crawling/indexing of the links it's pointed at.
I don't discriminate in any way with my T1 links when deciding between contextuals and non-contextuals - not with these engines, as they all have low OBLs (outbound links), so they're very good for pushing link juice through. Besides, I'm more interested in boosting my referring domains count, so I'll use all sites available.
If you just chase contextuals all the time, you won't have many sites to play with.
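If you want to sanity-check a prospective T1 page yourself, here's a rough Python sketch that counts outbound links and the dofollow/nofollow split - purely illustrative, assuming requests and BeautifulSoup, and the URL is just a placeholder:

```python
# Rough sketch: count outbound links on a candidate page and split dofollow vs nofollow.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

def obl_report(page_url, timeout=15):
    html = requests.get(page_url, timeout=timeout).text
    soup = BeautifulSoup(html, "html.parser")
    own_host = urlparse(page_url).netloc
    outbound = nofollow = 0
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if host and host != own_host:          # external links only
            outbound += 1
            if "nofollow" in (a.get("rel") or []):
                nofollow += 1
    return outbound, nofollow

if __name__ == "__main__":
    total, nf = obl_report("https://example.com/guestbook")   # placeholder URL
    print(f"OBL: {total}  nofollow: {nf}  dofollow: {total - nf}")
```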
Bear in mind these links don't get crawled and indexed in a few hours. It could take weeks for Google to find them naturally, in some cases months.
But it's definitely best to test what's possible.
I'm also using Hrefer to scrape other search engines. It seems to work better, with fewer IP blocks. I'm using IPv4 rotating mobile proxies, datacenter proxies and even Storm Proxies with rotating IPs. They all work quite nicely. It can scrape Bing, Yahoo, SO.com, Seznam.cz and DuckDuckGo.
Storm Proxies are no good for scraping Google - all the IPs are already blocked. They work OK in Hrefer though.
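For anyone curious what the scraping side boils down to, here's a minimal sketch of a single Bing query through a rotating proxy gateway - the proxy address and footprint are placeholders, and the crude regex extraction is only to illustrate; the real tools parse results properly:

```python
# Minimal sketch of what a scraper like Hrefer automates: send one footprint query
# to Bing through a rotating proxy gateway and pull out result URLs.
import re
import requests

PROXY = "http://user:pass@rotating-gateway.example:8000"    # placeholder proxy
HEADERS = {"User-Agent": "Mozilla/5.0"}

def bing_scrape(query):
    resp = requests.get(
        "https://www.bing.com/search",
        params={"q": query},
        headers=HEADERS,
        proxies={"http": PROXY, "https": PROXY},
        timeout=20,
    )
    # crude link extraction - a real scraper parses the result HTML properly
    return re.findall(r'href="(https?://[^"]+)"', resp.text)

if __name__ == "__main__":
    for url in bing_scrape('"powered by gnuboard" fishing'):   # example query
        print(url)
```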
GSA Search Engine Ranker Projects | How To Create Contextual
Works like a charm for me.
Does so.com work OK? I think with the new Google updates it's even harder.
so.com works really well. This JavaScript change from Google is a bit of a pain. I'm waiting on Scrapebox to come up with a solution, but Hrefer is still powering ahead with no issues.
It's best to buy a few link lists and extract and expand the links instead of scraping. It will save both time and money for sure.
I've been down the road of buying link lists. The list of sites is tiny compared to what I scrape myself. Plus, you share that site list with hundreds of other users, so every site in that list is guaranteed to get spammed to death.
If you do your own scraping, there is no limit to the size of your list. Plus, if you use custom footprints and keywords, you will find sites that most list sellers will never find.
For T2 usage, it's not that important, but for T1 usage, it's very important, especially now that we've got the new engines from SER Nuke. There are literally thousands of new sites out there, if you can find them.
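The "custom footprints and keywords" part is really just query merging; something like this sketch (example footprints and keywords, not a real list) is all it takes to feed a scraper:

```python
# Sketch: expand custom footprints against a keyword list into scrape queries,
# the way Scrapebox/Hrefer keyword merging works. Footprints and keywords are examples.
from itertools import product

footprints = [
    '"powered by gnuboard"',
    'inurl:wiki "edit this page"',
    '"add a comment" "powered by dotclear"',
]
keywords = ["fishing", "dog training", "crypto wallet"]

def build_queries(footprints, keywords):
    # one query per footprint/keyword pair, deduplicated, order preserved
    seen, queries = set(), []
    for fp, kw in product(footprints, keywords):
        q = f"{fp} {kw}"
        if q not in seen:
            seen.add(q)
            queries.append(q)
    return queries

if __name__ == "__main__":
    for q in build_queries(footprints, keywords):
        print(q)
```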
Also, is there any other option than Scrapebox to test whether the links are indexed or not?
I also use wikis and articles in my last tier, as they can also create paths for Googlebot to crawl, but you need very good content to rely on natural indexing of these content links.
There are redirects and indexers, but I don't expect them to help much on their own. Some redirects are blocked from indexing; others are URLs that don't exist anywhere but on paper, so they themselves need another layer to get them crawled. Generally speaking, they have been harder to index since the Google spam update last year.
I'm only aware of Scrapebox for index checking.
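If you want to roll your own index check, the basic idea is just a site: query per link - but be warned Google blocks automated queries very quickly, so this is only a sketch with a placeholder proxy and a placeholder link list, not something that will run unattended; managing that is exactly what Scrapebox does for you:

```python
# Basic idea behind an index checker: run a site: query for each link and see if
# Google returns it. In practice this needs good proxies and slow pacing.
import time
import requests

HEADERS = {"User-Agent": "Mozilla/5.0"}
PROXIES = {"http": "http://127.0.0.1:8888", "https": "http://127.0.0.1:8888"}  # placeholder

def is_indexed(url, pause=10.0):
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": f"site:{url}"},
        headers=HEADERS,
        proxies=PROXIES,
        timeout=20,
    )
    time.sleep(pause)        # go slow or you get blocked almost immediately
    return url.split("://", 1)[-1] in resp.text

if __name__ == "__main__":
    for link in ["https://example.com/profile/123"]:   # placeholder link list
        print(link, "indexed" if is_indexed(link) else "not indexed")
```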
Also, since you seem to be very well versed with GSA setup, can you help me with the proper settings for reverifying links in T1 and T2? Should I stop verifying them after, let's say, 10 days or 30? Or should I keep reverifying links at all times, every day?
SEO Indexer and URL Redirect Pro I use as T1 only.
I don't run automated tiers with the software anymore. Too many broken tiers get created, so I don't advise doing that. For example, many gnuboard links are verified as live, but when checking them I'm prompted for a login and password.
I use standalone desktop software to store and check all my tiered links from all my different link sources. These days I use GSA SER to run single tiers instead - either as T1 to money site URLs, or as a last tier pointing at links that have been checked to be live, e.g. a tiered campaign from RankerX, or T1 GSA SER links.
The frequency of verifying links really isn't going to make a difference - if you're automating tiers you will always be building links to some dead links at some point; it's inevitable when using public link sources to create tiers. A link that's live today could die at any point in the future.
Ideally you should set the reverify links option to every day if you want to automate tiers.
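For what it's worth, re-verification really just boils down to this kind of check - fetch the verified page and confirm your target URL is still on it. A minimal sketch with placeholder URLs, which also catches the gnuboard login-wall case I mentioned above:

```python
# Sketch of what re-verification boils down to: fetch each verified backlink page
# and confirm it still returns 200 and still contains the target URL in the HTML.
import requests

def still_live(backlink_url, target_url, timeout=20):
    try:
        resp = requests.get(backlink_url, timeout=timeout)
    except requests.RequestException:
        return False
    return resp.status_code == 200 and target_url in resp.text

if __name__ == "__main__":
    verified = {   # placeholder backlink -> target pairs
        "https://someboard.example/bbs/view.php?id=42": "https://my-money-site.example",
    }
    for backlink, target in verified.items():
        print(backlink, "still live" if still_live(backlink, target) else "dead or blocked")
```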