I have everything going into the indexer only because they came down in pricing and now have large daily limits. I used to spend insane amounts, something like $130 to index 10,000 links, so in the old days I only used it for T1 because it was so expensive.
Now everything goes into the indexer API because it is cheap to do so.
There are no shortcuts to understanding your indexing issues. You have to make the time to analyze these things. Take a fresh set of links on any level, weed out the dead links, run it through an indexing check, and use the non-indexed links as your test set. Stick those in an indexer, and then check indexing each day for about 3 days. Then you will understand your issues.
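If you wanted to script that test loop, here is a minimal sketch in Python. The `is_alive`, `is_indexed`, and `submit_to_indexer` helpers are placeholders, since every dead-link checker, indexing checker, and indexer service has its own API; treat this as the shape of the workflow, not a drop-in tool.

```python
import time

def is_alive(url: str) -> bool:
    """Placeholder: True if the link still resolves. Swap in your own dead-link checker."""
    raise NotImplementedError

def is_indexed(url: str) -> bool:
    """Placeholder: True if a search engine reports the URL as indexed.
    Swap in whatever indexing checker you already use."""
    raise NotImplementedError

def submit_to_indexer(urls: list[str]) -> None:
    """Placeholder: push the batch to your indexer service's API."""
    raise NotImplementedError

def run_indexing_test(links: list[str], days: int = 3) -> None:
    # 1. Weed out the dead links.
    live = [u for u in links if is_alive(u)]

    # 2. Keep only links that are NOT already indexed; they form the test set.
    test_set = [u for u in live if not is_indexed(u)]
    if not test_set:
        print("Everything is already indexed; nothing to test.")
        return
    print(f"{len(test_set)} non-indexed links going into the indexer")

    # 3. Stick them in the indexer.
    submit_to_indexer(test_set)

    # 4. Re-check once a day for ~3 days and watch the rate climb (or not).
    for day in range(1, days + 1):
        time.sleep(24 * 60 * 60)  # wait a day between checks
        indexed = sum(is_indexed(u) for u in test_set)
        print(f"day {day}: {indexed}/{len(test_set)} indexed "
              f"({indexed / len(test_set):.0%})")
```

The daily re-check is the part that actually tells you something: if the indexed count barely moves over 3 days, the problem is the properties or the indexer, not your patience.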
@PeterParker - We all need to verify things for ourselves. It's a pretty important mindset not to take everything you hear in forums as the truth. I see people running around with false information they picked up from another noob. It happens a lot.
I can assure you that indexing is one of the 'critical' functions of SEO. It is important to understand that some properties index well while others do not. It is also important to understand that it is getting more difficult to index crap properties. So you need to do every trick in the book to make it happen. And if you have issues on the indexing end, try to 'systematically' tear things apart. I wasn't that thorough in my approach, but it helped me gain insight into what the hell I was doing. I really didn't have a great plan until I did that.
@AlexR - You are back! I have found that using SER works great for that. They can take a pounding, and they react well to lesser quality links.
However, I know plenty of people that will build a buffer around their buffer. I guess you can take that logic to infinity. I just don't view it as a threat, precisely because if something was not working out, you could disconnect the links pointing at those buffers.
Since the goal isn't to rank the web2's (at least it is not my goal), I only care about the conveyance of link juice from lower tiers through that buffer. It has worked well for me, and through all of the penalties of the last two years, some web2's rank in the first 5 pages of G - with nothing but SER links.
@Enzo - Yes, only use the submitted-per-day limit. The reasoning is very simple.
You are using a multithreaded application that is not verifying each link as it creates it; verification runs later in the day. So if you cap on verified per day, SER has no current verified count to work from, and you will consistently overshoot your target limit by a mile.
That's why you need to stick with submitted per day. SER can count its own submissions, and no verification check is needed for that limit. So just estimate (roughly) your verification rate per 10 submitted, decide how many verified links you actually want to build, and work out the math. It is that simple. And it works!
@Enzo - You just get a feel for a few projects and then you can say "I submitted 100 today, and my verified increased by 20. So I know verified/submitted = 20%. So in order to get 50 on this project, I need to set submitted per day to 250." And then do that for every project.
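For what it's worth, that back-of-the-envelope math written out in code, using nothing but the numbers from the example above (the function name is just illustrative, nothing SER-specific):

```python
import math

def submitted_per_day(target_verified: int, submitted: int, verified: int) -> int:
    """Work backwards from an observed verified/submitted ratio to the daily
    submitted cap that should yield the verified links you actually want."""
    rate = verified / submitted          # e.g. 20 / 100 = 20%
    return math.ceil(target_verified / rate)

# 100 submitted produced 20 verified, and we want ~50 verified per day:
print(submitted_per_day(target_verified=50, submitted=100, verified=20))  # 250
```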
@AlexR - I would probably fire just contextuals, and skip all the other junk. Put a filter of PR1+ and let it fly. Other than that, you may have to go outside SER to hit those things with other types of links.