[5.38] Project cannot use global list after manual target import
Hi. I have a project where I disabled the search engines and also the global lists. I imported 130k URLs with the "import URLs from txt file" function, but that project only ran about 1,300 of the URLs. All 130k are verified in other projects.
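As a sanity check on my side, this is roughly how one could count the unique URLs and unique root domains in the import file before importing, in case duplicates explain why only ~1,300 targets got used (a minimal Python sketch; the file name urls.txt is just a placeholder):

```python
# Count unique URLs and unique root domains in a target list file
# before importing it, to rule out duplicates as the reason only a
# fraction of the 130k URLs get used.
from urllib.parse import urlparse

urls = set()
domains = set()

with open("urls.txt", encoding="utf-8", errors="ignore") as f:  # placeholder file name
    for line in f:
        url = line.strip()
        if not url:
            continue
        urls.add(url)
        host = urlparse(url).netloc.lower()
        # Strip a leading "www." so www.example.com and example.com count as one domain
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)

print(f"unique URLs:    {len(urls)}")
print(f"unique domains: {len(domains)}")
```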
Then I cleared the target URL history and the cache (including accounts), and enabled the global lists, which contain millions of URLs. The project won't even get to max threads because it runs out of URLs.
I think the issue is caused either by having no link sources selected when importing, or by using the "continuously link even if failed before" option.
I also duplicated the project, cleared all settings, and had the global lists selected before making it active, and that had the same problem.
Emails are fine, captcha solving is fine, and my global lists work for other projects.
Thanks
Comments
How can I force my Tier 2 project to submit articles/bookmarks/comments and so on again and again? I found out that most accounts are only used one time! (The "avoid posting to main domain twice" option is unchecked ;-))
I would use my Tier 2 accounts for a minimum of 10 submissions each, since I think that doesn't do much harm and it would speed up the link building process.
So at the moment the only option for me is to duplicate the Tier 2 project 10 times and be done with it ;-) But I think it would be much more natural if the social network and article accounts had more articles posted to them (like some people suggest for the web 2.0s too); perhaps it would increase the indexing rate, and if we could randomly post a few articles without links to those accounts (like the web 2.0s in SEnuke, or as @ron wished for in the new premium web 2.0 submission service), it would look even more natural ;-)
Any thoughts or experiences?
Also, if I use only the verified links from the global lists and select no search engines, sometimes the software waits and gives me a message that there are no new targets... It also submits to the failed list and all the rest, not just the verified one.
Are you saying that clearing the cache means the project won't use the global lists once it's started?
@Sven, see here: on a different project I've run out of URLs (only one project running) when I've only submitted 20k URLs ("avoid posting twice" is switched off). I have something like 20 million unique URLs, and most projects will post 120,000+ times even with "avoid posting twice" turned on.
... yet I've run out of URLs:
It really does look like importing URLs interferes with the global lists somehow.
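For what it's worth, a rough way to double-check how many distinct URLs the site lists really hold, assuming they are plain .txt files collected in one folder (the path below is a placeholder, not the real location):

```python
# Rough count of unique URLs across a folder of site list .txt files,
# to verify how many distinct targets the lists actually contain.
# Note: holding ~20 million URLs in a set can use a lot of RAM.
import glob
import os

SITE_LIST_DIR = r"C:\path\to\site_lists"  # placeholder path, adjust as needed

unique_urls = set()
for path in glob.glob(os.path.join(SITE_LIST_DIR, "*.txt")):
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if url:
                unique_urls.add(url)

print(f"{len(unique_urls)} unique URLs across the site list files")
```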
Too many different subjects and issues here. Anyway, let's try...
Global site list has nothing to do with imported URLs.
@team74 that screenshot shows that after the attention line, it loaded 5 URLs from the site lists...and should start to submit to them shortly.
If you import new target URLs, it should use them almost immediately.