Resubmissions of Submitted URLs Process
We all have those days when we leave SER running for quite some time, only to find out our verification rates are extremely low (high submissions/low verifications). The usual suspects are proxy problems, high thread counts, captcha services not working or low on credits, an overloaded CPU/memory, etc. What we're left with is a massive amount of submitted/scraped URLs that would have, under ideal circumstances, been successfully verified. So my question is, what process do you go about resubmitting these URLs, once you've corrected all the "kinks in your chain"?
My Resubmission Process is:
1. Project > Right Click > Show URLs > Submitted > Export
2. Project > Right Click > Show URLs > Verified > Export
3. Trim to Root both the Submitted and Verified lists, then Subtract the Verified List from the Submitted List (both steps can be done within SB)
4. Project > Right Click > Modify Project > Duplicate
5. Right click on the Duplicated Project and click Import Target URLs
6. Click Start!
If anyone else has a more ideal method.....please let me know. :-)
Comments
Actually.....what you're requesting on that thread is not the same. You're trying to import and export site lists according to the Verified/Submitted/Identified folders. What I am explaining above is the ability to run through site lists on a per-project basis.
Note: I just edited step #3. This, so far, is the only way I know of to resubmit URLs on a per-project basis and have them resubmitted almost immediately. If you're importing lists via Tools > Advanced, those lists are spread out through all projects and URLs are chosen at random, at random intervals.