I just ran a test on a Tier 1 project. As with all my projects, I feed the links automatically to Lindexed.
This particular project had just one Tier 2 (kitchen sink) pointing at it.
I was at 22% indexed 4 days ago.
I took all the unindexed links and ran them through GSA SEO Indexer (Full mode); it took 2 days to index about 1k links. I checked indexing today and I'm at 50% verified.
One of my tricks with public proxies is I take my spare copy of SER, turn it on, and have it write the public proxies to a file on my desktop. I then use that file of public proxies in Scrapebox to do my indexing check. It works very well.
Dumb question on this that I couldn't find with a quick search. Do you remove the indexed links from SER once verified or leave them accumulating more backlinks?
Ah yes, another fellow Brit. I haven't used BM in a while; maybe I should fire it back up. I seem to have loads of tools but only use a few every day. Anyway, I mention Tops because I learned a great deal from his in-depth threads and videos. In fact, he was the most helpful guy in the SE forum. A bit like @ron here, LOL.
Old but true. Stare at "F-" for a few minutes and you know why you need your kitchen sinks/index tiers, or whatever you want to call them. You can replace "unimportant" with "not enough".
"2) I'm only running the Indexer on T1A/T2A/T3A junk links that aren't already indexed. I know it will get some of them indexed, but I am curious to see whether it is worthwhile overall. I never measured the results on this tier separated from everything else - however, this time I am measuring the effect. I still do it last, and when I have the extra capacity to let that many links run through the Indexer."
I was about to add indexing to my overall strategy and thought I'd start with T123, but seems like you have a different approach.
When you said
"1) T1, T2, T3 - try to do everything you can to get these properties indexed"
Do you feed links from T123 to indexer services directly, or to lower tiers so that the higher tiers are found in turn?
For me, every verified link, regardless of tier, first goes to lindexed. In effect, that helps all contextual tiers.
Anything I do with GSA Indexer is a separate deal, performed every so often (when I have extra time), to give an extra boost. The contextual tiers are my primary concern. Those are all brand new pages.
Most of the links on the kitchen sink tiers are indexed quickly, as most of those links are some type of comment on already existing/indexed websites - as I have documented.
I tried GSA Indexer for the first time earlier today. I didn't know what to expect from it. I've been using Indexification, but now I'm considering dropping that subscription and just using GSA Indexer. For me it's more than enough to get my work done, and it saves me monthly subscription cash as well. I push local business sites in my area only, so you can imagine I don't even use half of my subscription's 50,000-links-per-day indexing limit.
I added a new client's domain to my host just this morning after getting his design ready. His domain is a few years old, yet when I checked his links, Google was only seeing 1 - literally 1. After I launched his new design online, I used GSA Indexer to start indexing right away. I checked a little while ago and his links on Google are now at about 171 or so: ThunarElectric.com
I'm new here. I don't know if this has already come up (if so, I apologize in advance), but could the private proxies we use in GSA Search Engine Ranker also be used by GSA SEO Indexer while running both simultaneously?
A quick question: so far I have sent only do-follow links to GSA SEO Indexer (using it as my only indexer), but I try to keep do-follow and no-follow links at 50/50. Should I send the no-follows as well, or am I doing the right thing?
I'm eager to click "Index Check" under the verified links list, but I want to be sure first: does it use proxies? Should it use proxies in the first place? I'm just wondering how safe it is, or whether it's going to run the index check on all the links from my VPS IP and leave footprints.
I never use SER to check indexing because it would ban my private proxies. If you do it, use public proxies only. I use Scrapebox with public proxies myself because I track indexing outside of SER.
@ron - I was starting to do the same to check whether my imported targets were indexed, but then I started to verify that, when SB said they were not indexed, that was actually the case. Manually checking the targets in Google showed that many were actually indexed: I tried 20 and only 1 was actually not indexed. Any ideas why? Do you do a custom check, or is it literally SB's index check?
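One way to quantify that disagreement, sketched in Python; the `is_indexed` callable is a stand-in for whatever second-opinion lookup you trust (e.g. a manual `site:` query), and the function name is mine:

```python
def second_opinion(not_indexed_urls, is_indexed):
    """Re-test URLs a bulk checker flagged as NOT indexed.

    is_indexed(url) is a second, trusted check; anything it says
    is indexed counts as a false negative from the bulk tool.
    """
    truly_not_indexed, false_negatives = [], []
    for url in not_indexed_urls:
        if is_indexed(url):
            false_negatives.append(url)
        else:
            truly_not_indexed.append(url)
    return truly_not_indexed, false_negatives
```

With the numbers above (20 flagged URLs, 19 of them really indexed), the bulk tool's false-negative rate would be 95%, which is worth knowing before discarding those targets.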
I also use mine to send non-indexed links to Indexification.com using the API.
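Submitting non-indexed links over an API usually means batching them first. Here is a minimal Python sketch; the endpoint, parameter names, and batch size are placeholders, not Indexification's documented API, so check their own docs for the real campaign-key format.

```python
import urllib.parse
import urllib.request

# Placeholder endpoint - NOT the documented Indexification API.
API_ENDPOINT = "http://example.com/api/submit"

def make_batches(urls, batch_size=100):
    """Split the non-indexed URL list into API-sized chunks."""
    return [urls[i:i + batch_size] for i in range(0, len(urls), batch_size)]

def submit_all(api_key, urls):
    """POST each batch; the parameter names here are assumptions."""
    for batch in make_batches(urls):
        data = urllib.parse.urlencode(
            {"apikey": api_key, "urls": "\n".join(batch)}).encode()
        req = urllib.request.Request(API_ENDPOINT, data=data)
        with urllib.request.urlopen(req, timeout=30) as resp:
            yield resp.status
```

Batching keeps any one request small and lets you retry a single failed chunk instead of resubmitting the whole verified-links list.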
Great tool.
What is/are the Tops threads?
Head in GSA and ScrapeBox and brain in the fridge!
I actually got put on to Backlink Monitor by Matthew Woodward, he did a great review/video on his site.
Sorry, I just re-read that. What I meant was, as @jjumpm2 mentioned: do you get a lot of sites reported as not indexed when they actually are?