Because I want to make sure SER posts the article without me touching it. I have my own sites loaded and I want to post the article to all of them. I tried a lower retry value but that didn't work, so I figured increasing the retries would do the trick.
Some time ago in this thread I suggested that an option to adjust the time between retries would be useful. You then implemented a random time interval between retries, and it worked fine for a while. Now the problem is back again.
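Conceptually, a random interval between retries just means adding some jitter to the wait before each new attempt so the target site isn't hit at a fixed rhythm. Here is only a rough sketch of that idea (SER's real code isn't public, and the names `submit`, `base_delay` and `jitter` are made up for illustration):

```python
import random
import time

def submit_with_retries(submit, max_retries=255, base_delay=5.0, jitter=10.0):
    """Retry a submission with a randomized delay between attempts (sketch only)."""
    for attempt in range(1, max_retries + 1):
        if submit():
            return True  # submission went through
        # base delay plus a random extra, so retries are spread out in time
        time.sleep(base_delay + random.uniform(0, jitter))
    return False  # gave up after max_retries attempts
```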
I'm still struggling with this. I have to delete the URL history at least 3 times to get 1 article submitted to 13 sites.
It's true that I don't have the best proxies, but the sites aren't on super cheap hosting, so this should work without a problem, especially after 255 retries.
3. If you see it's no longer submitting to the site, send me the entries where that site is included... so we get a clue why the submission fails at all.