tcsmith007 Send me an email with the NAME you signed up with + the email. greeny1232@yahoo.com
feyt333
Hi, ordered and email sent.
mamadou
@greeny1232
Subscribed with my name, mamadou. I also sent you an email to greeny1232[at]yahoo.com. Looking forward to the activation.
mamadou
@Sven, are the links in this API sent automatically once they are verified? @greeny1232, please add "links received in the last 30 days"; we really need this as well.
Sven www.GSA-Online.de
No, they are collected and sent once 100 URLs are in the queue or 5 minutes have passed.
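The queue behavior Sven describes (flush when 100 URLs are collected, or when 5 minutes have passed) can be sketched roughly like this. This is only an illustration; `UrlBatcher` and `send_fn` are hypothetical names, not GSA SER's actual code:

```python
import time

BATCH_SIZE = 100      # flush when this many URLs are queued
MAX_WAIT_SECS = 300   # ...or when 5 minutes have passed since the last flush

class UrlBatcher:
    """Collects URLs and submits them in batches (illustrative sketch)."""

    def __init__(self, send_fn):
        self.send_fn = send_fn            # callback that submits a list of URLs
        self.queue = []
        self.last_flush = time.monotonic()

    def add(self, url):
        self.queue.append(url)
        elapsed = time.monotonic() - self.last_flush
        if len(self.queue) >= BATCH_SIZE or elapsed >= MAX_WAIT_SECS:
            self.flush()

    def flush(self):
        if self.queue:
            self.send_fn(self.queue)      # submit the whole batch at once
            self.queue = []
        self.last_flush = time.monotonic()
```

Batching like this trades a little latency for far fewer requests, which is presumably why links don't show up in the indexer instantly.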
tsaimllc
When I try to copy-paste a large list of links (more than 15K), it crashes and they don't get loaded in.
There needs to be a way to import a text file of links, or a fix for this, because it's a big problem: I cannot index a lot of my links.
greeny1232
mamadou We have to clean the database roughly every 5-7 days because we process millions a day and it slows down the indexer/database.
tsaimllc Try to import 10k at a time and see if that works; if it doesn't, then do 5k at a time. We aren't going to be changing the code again for importing text files. The code is made to optimize the indexing, so doing something like that would slow down the entire database.
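Splitting a big link list into 10k chunks before pasting can be scripted rather than done by hand. A minimal sketch, assuming the links live in a plain text file, one per line (the `chunked` helper is hypothetical, not part of the indexer):

```python
def chunked(items, size=10_000):
    """Yield successive chunks of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def load_links(path):
    """Read one link per line, skipping blanks."""
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]

# Usage sketch: write each chunk to its own file, then paste them one at a time.
# links = load_links("links.txt")
# for i, chunk in enumerate(chunked(links)):
#     with open(f"links_part{i:02d}.txt", "w", encoding="utf-8") as out:
#         out.write("\n".join(chunk))
```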
Tim89 www.expressindexer.solutions
Try increasing the PHP script's memory limit.
mamadou
@greeny1232 You can make it 5-7 days ("links received in the last 7 days"). We need to monitor whether SER is sending the links regularly and successfully or not.
justAIMe
edited March 2014
@greeny1232 I signed up 2 days ago and haven't heard anything about my order. I still cannot log in?
greeny1232
Check the email you signed up with, the spam folder too. It will be from "pinger tool"
mamadou We aren't changing the days it keeps the links in the database.
tsaimllc
What @Tim89 said. I'm pretty sure that should work and nothing would need to be changed (and it may actually speed things up).
supermanden
Is there any way to import URLs from a sitemap into this indexer?
Milan
Any discount coupon?
cloudattack
@supermanden You can try using Scrapebox's sitemap scraper and import the URLs that way. We don't plan on integrating that at this point.
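If you'd rather not use Scrapebox, pulling the URLs out of a standard sitemap.xml is also easy to script. A hedged sketch using only the standard library; it assumes the sitemap follows the usual sitemaps.org `<urlset>`/`<loc>` layout (nested sitemap-index files would need an extra pass):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text):
    """Extract every <loc> URL from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/page1</loc></url>
  <url><loc>http://example.com/page2</loc></url>
</urlset>"""
```

The extracted list can then be pasted into the indexer like any other set of links.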
Milan
I got it and started using it today. What is the indexing rate, guaranteed or expected?
cloudattack
@Milan It depends on the links processed, but typically you'll get rates around 40-80%.
LeadK
more like 80% not indexed, at least for me
meph
I checked some links from 2 different projects for Tier 1 and now I have about 36% for each. They are only contextual links.
What % do you have?
greeny1232
It depends on what links you are sending; some are more friendly than others. And no, I don't have a list of which ones index better than others. I just send everything from GSA.
Pratik
How many concurrent machines will this support? Just asking, as in the future I may want to expand to multiple servers.
Thanks.
XXXX
Hi @cloudattack ,
For the second day, your Incredible Indexer hasn't been accepting links. It's really incredible... What is the matter?
Thanks
cloudattack
@Pratik You may use up to 3 machines and as usual up to your daily limit (100k or 200k).
@XXXX We just did a quick purge yesterday, there was slight downtime but only for a brief moment.
perseous
Hello, it seems my GSA doesn't send links to be processed like it used to. Please check for me; I sent you an email as well. perseous9@...
greeny1232
The indexer is working fine as it has been, except for a few days of downtime when there was a problem. We do weekly purging; that may have happened on a day when you were submitting your links, but it was only down for at most 2 hours during that time.
@greeny1232 - Maybe you could inform all users in advance of all planned purges (let's say every Sunday at 09:00 GMT is purge time), so they know not to submit links during that 2-hour window.
That way you spare yourself all the complaints, and your customers don't have to waste their time sending links during a purge and then resubmitting again after the purge.
Then again, this is a GSA forum, so most links are submitted automatically by GSA and the clients don't have much control over this automated process.
Another option to consider is doing daily purges instead of weekly: if the weekly purge takes about 120 minutes, then a daily purge should only take about 17 minutes, as there is roughly a seventh of the data to purge.
Just my 2 cents worth.
greeny1232
We have tried doing daily purges and it slows down the server, so we are trying to do it once a week. I have been posting when I am doing it on the FB group (kinda why I made it in the first place).
I don't have time to check Facebook pages every day in anticipation that there might be a purge. Why not just fix a day and time, so everyone knows when it is going to happen? You can't expect people to go check Facebook every time they want to post links, just to know whether there is a purge in progress or not.
perseous
Even when I added links manually, the statistics don't show them. I don't know what the issue is. Could you check for me?
tsaimllc
edited April 2014
In the end, I decided to cancel Incredible Indexer due to the absolutely horrible backend, which is basically unusable. It's very basic, no changes or feature requests have been made (or there's no desire to make them), you cannot upload a file, and putting in anything more than 10K links gives you a server error. Probably the most annoying problem: after you submit your links it tells you 'ok submitted', but it's just a blank page with that text, no back button, home button, etc. If you hit the back button you are greeted with a "Cannot Complete Request. This document is no longer available." error, so you hit refresh, and then you have to deal with an 'are you sure?' pop-up. I went with Link Processor and man, I forgot what a real backend looks like. Functional. Legit. They are cheaper (currently) as well. 2/5 stars.