No targets to post to - please help
Hi,
I'm a noob, and within my projects I always get these errors:
"No targets to post to (maybe blocked by search engines, no URL extraction chosen)"
What does this mean? I have added a list of target URLs, and when I click on "Show stats about left target URLs" it shows me
180,000 left target URLs. So GSA should not have to query search engines at all and can just use the remaining URLs.
Do I need to change something?
Comments
I randomly get that "no targets to post to" error too, but as long as there are URLs under "Show URLs -> Left target URLs" I don't worry, since I can see SER keeps posting just fine.
Sadly, I closed GSA after posting here, so the errors are gone now. But from what I remember, GSA had stopped submitting after that error occurred, about three hours before I came home. It's a 2nd-tier project though, so maybe there were no tier-1 URLs left; I'm not sure.
But maybe I have to disable the search engines in the options tab to make GSA work only with the target URL list? I think I read something about that somewhere.
From what I remember, using search engines has the lowest priority in SER, i.e. if you have imported lists and checked search engines, SER will attempt to post to all imported links first and once they are finished, it will start using search engines, so that can't be the cause either.
See if this problem occurs again when you run that project. If it does, you can either post your settings here, or ask Sven to look into it and send him a project backup.
Yes, it happened over the last few days as well. When I restart the next day, it starts submitting links again until the error occurs once more.
By "search engines" I meant the ones in the project options where GSA scrapes new target URLs. Should I uncheck all of the search engines there to force GSA to submit only to the imported target URL list?