How many resources does the indexer consume? I gather it consumes negligible RAM and CPU (for me). I can handle many connections. Bandwidth? Captchas?
Also, can the demo work with GSA SER?
@psikogeek - The issue isn't so much the CPU and RAM (though that depends on your particular system configuration).
I wrote a lot about it elsewhere, but the main issue I see is that it will challenge your internet connection if you have SER running at the same time. I believe Full uses 300 connections.
So if you run Indexer on Full while using SER, SER productivity will drop significantly - especially if you import a large list into Indexer.
Since I love how well Indexer gets things indexed, I set aside time each week when I bring SER down to work on maintenance, tweaks, and new projects. I then import a list into Indexer and let it run by itself. You get the best of both worlds that way.
Also remember you can use Deep Indexing, which hits fewer sites, and you can adjust the connections. These are all things you should play with.
Yes, I have installed the demo. At first I could not tell if it was working; SER was idle on jobs that would index. Then I figured out that it was working, because it forces you to click after every 50 indexing submits.
I had misinterpreted the "50" prompt as "50 index sites" at first, then as "50 URLs to index per day," until it kept popping up.
Might buy it. (I actually fear indexers, so the low indexing rate appeals to me.)
ron said: Since I love how well Indexer gets things indexed, I set aside time each week when I bring SER down to work on maintenance, tweaks, and new projects. I then import a list into Indexer and let it run by itself. You get the best of both worlds that way.
Where does one find the list of URLs to submit to SEO Indexer? I'm thinking of running SER on my Berman VPS and running Indexer on a spare laptop (although it will probably eat up all my broadband bandwidth).
@filescape - You export your verified list from your projects. The key is to first re-verify your verified links so you weed out the dead links. By the way, you can select all projects and re-verify all together, or you can grab groups and do it that way, or individually. I usually do the whole thing at once.
The other key is to then take that list and run it through scrapebox with a bunch of public proxies to determine which ones are already indexed. You will find that at least 25% are already indexed, so it doesn't make sense to put those in GSA Indexer. It will save you a lot of processing time with Indexer if you do that.
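The filtering step ron describes (dropping the links Scrapebox reports as already indexed before importing the rest into Indexer) can be sketched in a few lines of Python. This is just a sketch of the list logic, not part of any GSA or Scrapebox tool; the function name and the example URLs are my own:

```python
def filter_unindexed(verified_urls, indexed_urls):
    """Keep only verified URLs that are NOT in the already-indexed list,
    preserving the original order. Comparison is case-insensitive and
    ignores stray whitespace from exported text files."""
    indexed = {u.strip().lower() for u in indexed_urls}
    return [u for u in verified_urls if u.strip().lower() not in indexed]

if __name__ == "__main__":
    # Re-verified list exported from SER (hypothetical example URLs)
    verified = [
        "http://example.com/page1",
        "http://example.com/page2",
        "http://example.org/post",
    ]
    # URLs Scrapebox reported as already indexed
    already_indexed = ["http://example.com/page2"]

    # Only the remaining URLs would be imported into GSA SEO Indexer
    for url in filter_unindexed(verified, already_indexed):
        print(url)
```

In practice you would read both lists from the exported text files rather than hard-coding them, but the dedup logic is the same.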