Hello, it depends a lot on the content you use. Google seems to have become much better at understanding text since the last update, not to mention that there has been a bug on their side affecting indexing for some time.
Link indexing is the biggest problem with link pyramids these days. From my own experience, no service actually adds a link to the index; it only draws the Google bot's attention to the page so that it gets crawled, and Google itself then decides whether to index it or not. From my observations, Google has started ignoring all the free link-posting sites, perhaps because of some site markers, or perhaps simply because of the huge amount of spam content. The only solution I have found for myself is raising the trust of the links: to drive a page into the index, it needs 10-20 indexed links pointing at it.
Therefore, you have to place links from pages that are already in the index. Of all the link types, only blog comments are relevant for this. They used to be considered spam and of no interest, but now they are the most valuable, because only they can drive pages into the index; those pages then drive the pages of the previous tier into the index, and so on. In other words, indexing starts at the bottom of the pyramid and works like an avalanche. The problem is that you need a lot of links on the last tier, and you can't produce many blog comments: the dropout rate is too high and you have to wait a long time.
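To make the avalanche idea concrete, here is a minimal sketch (Python, not a real crawler; the 10-link threshold is just my rough estimate from above) of how indexing propagates from the bottom tier upward:

THRESHOLD = 10  # rough estimate: a page needs 10-20 indexed links

def simulate(num_tiers, links_per_page):
    # Tier 0 is closest to the money site; the last tier is the bottom.
    # links_per_page[t] = indexed links each page of tier t receives
    # from tier t+1 below it.
    indexed = [False] * num_tiers
    # The bottom tier is blog comments placed on already-indexed pages,
    # so we assume they get crawled and indexed on their own.
    indexed[-1] = True
    # Each higher tier becomes indexed once the tier below it is indexed
    # and supplies enough links per page.
    for t in range(num_tiers - 2, -1, -1):
        indexed[t] = indexed[t + 1] and links_per_page[t] >= THRESHOLD
    return indexed

# 3 tiers, 15 links per page from the tier below: everything gets indexed.
print(simulate(3, [15, 15]))  # [True, True, True]
# Too few comments on the last tier and the avalanche never starts.
print(simulate(3, [15, 5]))   # [False, False, True]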
I have one idea that I have not tested yet, though I have seen similar schemes on other people's projects.
When we build a pyramid, we also create a parallel project with about 20 profiles. Into these profiles we insert all the links that need indexing, for example other profiles, article links, and web 2.0s. Then we pump blog comments at these 20 profiles: 20 x 20 = 400 blog comments. That is not much, but in theory the comments should push these profiles into the index, and the indexed profiles should then push the rest of the links into the index.
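Here is the arithmetic behind "not much", assuming (my reading of the scheme, untested) that every link needing indexing is listed on each of the 20 profiles, so each one ends up with 20 indexed links pointing at it:

PROFILES = 20
COMMENTS_PER_PROFILE = 20
TARGETS = 400  # example count of links awaiting indexing; pick your own

# Cost of the scheme: comments only go at the 20 profiles.
scheme_cost = PROFILES * COMMENTS_PER_PROFILE      # 400 comments total
# Cost of commenting each target directly at ~15 comments per page.
direct_cost = TARGETS * 15                         # 6000 comments
# Links each target receives once the profiles are indexed.
links_per_target = PROFILES                        # 20, within the 10-20 range
print(scheme_cost, direct_cost, links_per_target)  # 400 6000 20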
As practice has shown, the quality of the texts does not affect indexing. I build web 2.0 blogs with both manually written copy and machine-generated copy, and either way no web 2.0 gets indexed even in 2-3 months if it is not referenced by a sufficient number of indexed links.
You build a few tiers. Then you create 10-20 web 2.0 blogs with text and put the list of links from the last tier (only profiles and articles) into these 20 web 2.0 blogs. Then you create an additional tier consisting only of blog comments pointing at these 20 web 2.0 blogs; you need about 10-20 dofollow blog comments on each one. These web 2.0 blogs then get indexed and index the previous tier, that tier indexes the tier before it, and so on.
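As a sanity check, the numbers in this scheme clear the 10-20 indexed-links threshold at every step. A toy calculation (the counts are the rough figures above; the threshold itself is an assumption):

HELPER_BLOGS = 20       # web 2.0 blogs created alongside the pyramid
COMMENTS_PER_BLOG = 15  # 10-20 dofollow blog comments on each one
NEEDED = 10             # assumed indexed links a page needs to get indexed

# The comments get the helper blogs indexed...
helpers_indexed = COMMENTS_PER_BLOG >= NEEDED
# ...and every last-tier URL (profiles and articles) is listed on all the
# helper blogs, so once they are indexed each URL has HELPER_BLOGS links.
last_tier_indexed = helpers_indexed and HELPER_BLOGS >= NEEDED
print(helpers_indexed, last_tier_indexed)  # True True: the avalanche starts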
Comments
I need a tool or service to force-index URLs, whether it is a blackhat or whitehat tool.
This is the latest result I checked today. Submitted 10/17.
..