How to bulk import unique chunks of URLs into multiple projects (& awesome feature suggestion)
After I complete a scrape with Hrefer, I usually split the file into 20-50 chunks of 30k-300k unique URLs, depending on how big the original scraped text file is. All files contain unique URLs. It looks like this:
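If it helps anyone, the dedupe-and-split step I describe above can be scripted. Here's a minimal sketch in Python; the file names and chunk size are just placeholders for my own setup:

```python
# split_urls.py - dedupe a scraped URL list and split it into fixed-size chunks.
# "hrefer_scrape.txt" and the "Split N.txt" naming are placeholders.

CHUNK_SIZE = 100_000  # URLs per split file (anywhere from 30k to 300k)

def split_url_file(source="hrefer_scrape.txt", prefix="Split"):
    # Read all URLs, dropping blanks and duplicates while keeping order.
    seen = set()
    urls = []
    with open(source, encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if url and url not in seen:
                seen.add(url)
                urls.append(url)

    # Write chunks of CHUNK_SIZE unique URLs: "Split 1.txt", "Split 2.txt", ...
    for i in range(0, len(urls), CHUNK_SIZE):
        part = i // CHUNK_SIZE + 1
        with open(f"{prefix} {part}.txt", "w", encoding="utf-8") as out:
            out.write("\n".join(urls[i:i + CHUNK_SIZE]))

if __name__ == "__main__":
    split_url_file()
```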
After splitting the files, I go into SER, where I have 20-50 processing projects numbered like this: ...and I import the corresponding split file into each project, e.g. Split 1 gets imported into Processing Project 1, Split 2 into Processing Project 2, and so on. I have to do this manually for each project and it takes forever. Does anyone have suggestions on how to automate this a bit, or a better way to do it? @sven?
If not, I have a feature suggestion. A lot of us process large lists like this in a similar fashion rather than importing 1 huge file into 1 project. What if we could simply select our group of "Processing Projects", right-click, select import target URLs from folder, and have SER automatically take 1 text file from that folder per project and import each one individually into each project?
Alternatively, we could select our group of "Processing Projects", right-click, select import target URLs from file, and have SER take that 1 HUGE text file, automatically split it into proportionate chunks based on the number of processing projects selected, and import one chunk into each individual project. This would be AWESOME.
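Until something like that exists, the proportionate-split part can at least be scripted outside SER. A rough sketch, assuming 20 selected projects (the project count and file names are just placeholders):

```python
# Split one huge URL list into N roughly equal parts, one per processing project.
# num_projects and the file names are assumptions; match them to your own setup.

def split_for_projects(source="huge_list.txt", num_projects=20, prefix="Split"):
    with open(source, encoding="utf-8", errors="ignore") as f:
        urls = [line.strip() for line in f if line.strip()]

    # Ceiling division so every URL lands in some chunk.
    per_project = -(-len(urls) // num_projects)

    for n in range(num_projects):
        chunk = urls[n * per_project:(n + 1) * per_project]
        if not chunk:
            break
        with open(f"{prefix} {n + 1}.txt", "w", encoding="utf-8") as out:
            out.write("\n".join(chunk))

split_for_projects()
```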
Thanks!
Comments
@Justin, I've noticed that sometimes SER detects one engine as another if they are both ticked. I don't have samples on hand, unfortunately.
I wanted to know if it is viable to keep, for example, 20 instead of 10 Sorting projects, but with different engines, i.e.
1st project with Articles, Wikis, Web 2.0
2nd project with all other engines
3rd project again with Articles, Wikis, Web 2.0
etc.
This way, for example, Articles won't get misidentified as Blog Comments.