Do indexing tiers work for indexing/crawling?
Still a GSA newbie, but I wanted to ask whether my current strategy is reasonably sound or whether I'm just building useless tiers.
I have been using SpeedyIndex for tier 1 contextual links (my current setup for most projects is below), and for indexing I have been using blog comments (auto-approved sites taken from 2 link lists), filtered by OBL depending on the tier: max 20 for tier 1, max 50 for tier 2, and max 100 for tier 3.
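For anyone who wants to reproduce that OBL filtering outside of GSA SER, here is a minimal sketch. It assumes a plain-text list of target URLs; the thresholds mirror my tiers, and the counting logic (every `<a href>` pointing to another domain) is my assumption of how OBL is usually measured:

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Counts outbound links (OBL): <a href> tags pointing to other domains."""
    def __init__(self, page_host):
        super().__init__()
        self.page_host = page_host
        self.obl = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        if host and host != self.page_host:
            self.obl += 1

def outbound_links(url):
    """Fetch a page and count its links to external domains."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    counter = LinkCounter(urlparse(url).netloc)
    counter.feed(html)
    return counter.obl

# Thresholds mirroring the tier setup described in this post.
MAX_OBL = {"tier1": 20, "tier2": 50, "tier3": 100}

def filter_targets(urls, tier):
    """Keep only targets whose OBL is within the tier's limit."""
    limit = MAX_OBL[tier]
    kept = []
    for url in urls:
        try:
            if outbound_links(url) <= limit:
                kept.append(url)
        except Exception:
            pass  # unreachable or broken targets are simply skipped
    return kept
```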
If I just leave the blog comments for indexing/crawling purposes, in your experience, can I still get the tiers indexed even without paid indexers? (I mostly use SpeedyIndex, or Sinbyte, and I get most of my contextual links indexed. I use SEOCM AI articles for content plus AI images for better indexing, and I place 1,000-word articles in spintax format as blog comments, lol.)
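For anyone unfamiliar with spintax: it's the `{option A|option B}` syntax that link-building tools expand into a unique variant per submission. A minimal expander as an illustration (GSA SER and similar tools do this internally):

```python
import random
import re

# Matches an innermost {a|b|c} group: no nested braces inside it.
SPIN = re.compile(r"\{([^{}]*)\}")

def spin(text, rng=random):
    """Expand spintax by repeatedly resolving the innermost {a|b|c} group."""
    while True:
        match = SPIN.search(text)
        if not match:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

print(spin("{Great|Nice|Useful} post, {thanks|cheers} for sharing!"))
# e.g. "Nice post, cheers for sharing!"
```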
Also, do crawling services like Colinkri work, or are they just a waste of money?
I also read about putting the links into an owned site's sitemap and requesting crawling via GSC, but I still have to figure out whether I can put external URLs there...
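For what it's worth, the sitemaps protocol says a sitemap should only list URLs on the host it is served from, and Google generally ignores cross-domain entries unless you have verified ownership of those domains in Search Console, so listing tier URLs you don't own is unlikely to get them crawled. If you do want to generate a sitemap for pages you control, here's a minimal sketch (the URLs are placeholders):

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal sitemap.xml for URLs on a site you own."""
    today = date.today().isoformat()
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n    <lastmod>{today}</lastmod>\n  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

# Placeholder URLs -- they must live on the same host as the sitemap itself.
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(build_sitemap(["https://example.com/page-1", "https://example.com/page-2"]))
```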
Current setup:
Tier 1: Contextuals (articles, forums, social networks, bookmarking sites, web 2.0s, wikis)
+ sometimes blog comments, max OBL 20
Tier 2: Contextuals
+ always blog comments (max OBL 50), guestbooks, image comments
Tier 3: Multiple projects using only blog comments, max OBL 100 (mainly for indexing purposes).
Thanks in advance for reading!
Comments
I'll adjust the blog comment projects and keep running them for indexing then; thanks for answering. I hope the current June Google spam update won't tank these indexing methods.
Yeah, I can probably set a higher OBL and get more blog comments built... I set up OBL 20 blog comments with the serverifiedlists targets (unless GSA ignores my setting) and the serpgrow site lists.
At the moment I have been using blog comments only for indexing, but maybe I'll try sending low-OBL ones as tier 1 for actual ranking purposes, as a test.
SpeedyIndexer works great. I mainly work with parasites, and SpeedyIndexer + addmyurl almost always index parasite pages; for harder ones I add Linkdexing as well (from what I've seen it uses Google News sites for indexing links). GSA is one of the few options for boosting rankings, as I obviously can't use "normal" guest posts/niche edits unless I want to waste money... and the SERnuke engines seem like a fine thing to add beyond GSA SER and CTR manipulation for parasite pages :) I have many things to learn first though, as I'm still on my first GSA SER campaigns...
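Most of these indexing services expose some form of bulk URL submission over HTTP. The endpoint, parameters, and auth below are purely hypothetical placeholders (check each service's actual API docs); this just sketches the batching pattern:

```python
import json
import urllib.request

# Hypothetical endpoint and key -- replace with the real values from your
# indexing service's documentation (SpeedyIndex, Linkdexing, etc.).
API_URL = "https://api.example-indexer.com/v1/submit"
API_KEY = "YOUR_API_KEY"

def submit_batch(urls, batch_size=100):
    """POST URLs to an indexing service in batches (hypothetical JSON API)."""
    for i in range(0, len(urls), batch_size):
        payload = json.dumps({"urls": urls[i:i + batch_size]}).encode()
        req = urllib.request.Request(
            API_URL,
            data=payload,
            headers={"Content-Type": "application/json",
                     "Authorization": f"Bearer {API_KEY}"},
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            print(f"batch {i // batch_size}: HTTP {resp.status}")

submit_batch(["https://example.com/tier1-link-1", "https://example.com/tier1-link-2"])
```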