Separate whois/ping services
Kaine
thebestindexer.com
Hello @Sven
I've been trying for several days to take advantage of the indexer.
I've been testing it and comparing it with other software specialized in this field.
Would it be possible to offer the option of separating the whois and ping services?
Let me explain.
If I want to index a single site, I prefer to use whois.
If I need to process a big list (last tier), I prefer to use the ping services.
Along the same lines, would it be possible to choose or change the ping services?
If I try to index 100,000 URLs at the same time, it's simply not possible (within a reasonable time).
My point of view and what I want to do:
Be able to ping my last tier (large quantity) with Google ping (all blogsearch) and the Yahoo API only.
In the absolute best world (^ ^), it would be a strong SEO Link Robot Pro without its drawbacks.
Strength: good thread management; it pings all URLs on every ping service simultaneously = efficiency. The ping services can be edited in a .txt file.
Weak points: because of that strength, it bugs if the list is too large. You can't choose the number of threads. No proxy support.
If 100 threads is chosen, it would ping 100 of the user's URLs on all ping services at the same time, then move on to the next batch.
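Roughly speaking, the batching requested here could look like this (a hypothetical sketch, not SER's actual code; the service endpoints are placeholders): with N worker threads, keep N pings in flight across all services, refilling as each one finishes.

```python
# Toy sketch of batched pinging - NOT real SER code; endpoints are
# illustrative only.
from concurrent.futures import ThreadPoolExecutor
import urllib.request

PING_SERVICES = [
    "http://blogsearch.google.com/ping",  # placeholder endpoints
    "http://api.search.yahoo.com/ping",
]

def make_jobs(urls, services):
    """Every URL must be pinged on every service -> one job per pair."""
    return [(service, url) for url in urls for service in services]

def ping(service, url, timeout=10):
    """Fire one ping request; failures are swallowed so one slow
    service cannot stall the whole batch."""
    try:
        urllib.request.urlopen(f"{service}?url={url}", timeout=timeout)
        return True
    except OSError:
        return False

def ping_all(urls, threads=100):
    # The pool keeps `threads` pings running at once and starts the next
    # job as soon as a worker frees up - the behaviour requested above.
    with ThreadPoolExecutor(max_workers=threads) as pool:
        results = pool.map(lambda job: ping(*job), make_jobs(urls, PING_SERVICES))
    return sum(results)
```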
I know I'm asking a lot, but this is the place where the best software is made.
Thanks for reading.
Comments
In the current version it receives the list of links to send to the ping services and prepares, for each ping service, the list with the data to send. It then mixes those lists and starts at a random entry, not in sequence as in older versions. The result is a more random sequence of ping services and their URLs.
However, it still uses the max number of threads you set in the options - the same threads used by projects.
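The randomised queue described above can be pictured roughly like this (a toy illustration under my own assumptions, not the program's real code): build every (service, URL) pair up front, shuffle once, and let the shared thread pool drain the list in random order.

```python
import random

def build_random_ping_queue(urls, services, seed=None):
    """Combine every ping service with every URL, then shuffle so the
    work is processed in a random order rather than service-by-service."""
    queue = [(service, url) for service in services for url in urls]
    random.Random(seed).shuffle(queue)  # random mix of services *and* URLs
    return queue
```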
Since pings are very fast (to load), the program sometimes doesn't reach a high thread count. I would like to be able to set 1000 threads per instance and have it actually use them (or something very close, permanently).
That's already a 30-40% improvement, but I think we can get much more (in reality no limit, depending on the user's configuration - it could be more than 1500 threads...).
I think that once fully optimized, it will be possible to index 100,000 URLs in a few hours. : )
Keeping each thread busy longer avoids the rate dropping.
I was speaking of the ping module in SER. I can't really improve speed here. The threads are shared between all projects/tasks, so it can only be as fast as free threads are available.
And since it already uses random items from the to-ping URLs, that's all it can do.
You can simply ping every URL sent to GSA SEO Indexer if you add only one URL into the program:
http://pingomatic.com/ping/?title=GSA+FORUM&blogurl=http://forum.gsa-online.de&rssurl=http://&chk_weblogscom=on&chk_blogs=on&chk_feedburner=on&chk_newsgator=on&chk_myyahoo=on&chk_pubsubcom=on&chk_blogdigger=on&chk_weblogalot=on&chk_newsisfree=on&chk_topicexchange=on&chk_google=on&chk_tailrank=on&chk_skygrid=on&chk_collecta=on&chk_superfeedr=on
Just change the title variable and the blogurl variable to the link being processed, and Pingomatic will ping it for you!
I think this is a fairly simple addition that would ping every single URL sent to GSA SEO Indexer, and it would work with deep links as well, not only the root domain. Please try to add it to the indexer. Thanks.
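For what it's worth, the substitution described above could be scripted like this (a minimal sketch: the parameter names come straight from the pasted Pingomatic URL; the function name and everything else is illustrative):

```python
# Build a Pingomatic ping URL for one link - the chk_* flags below are
# copied from the URL pasted in the thread.
from urllib.parse import urlencode

PINGOMATIC_BASE = "http://pingomatic.com/ping/"
SERVICES = [
    "chk_weblogscom", "chk_blogs", "chk_feedburner", "chk_newsgator",
    "chk_myyahoo", "chk_pubsubcom", "chk_blogdigger", "chk_weblogalot",
    "chk_newsisfree", "chk_topicexchange", "chk_google", "chk_tailrank",
    "chk_skygrid", "chk_collecta", "chk_superfeedr",
]

def pingomatic_url(title, blogurl):
    """Substitute the title and blog URL into the Pingomatic query string."""
    params = [("title", title), ("blogurl", blogurl), ("rssurl", "http://")]
    params += [(flag, "on") for flag in SERVICES]
    return PINGOMATIC_BASE + "?" + urlencode(params)
```

Calling `pingomatic_url("GSA FORUM", "http://forum.gsa-online.de")` reproduces the example URL above with the values properly escaped.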
Does anyone else experience this too, even though I have a high-end desktop with plenty of RAM?