
Bug when monitoring a folder from another project?

Hello! I don't know if this is a bug or just something not running right on my computer.
First of all, let me tell you how I'm using it:
1. Scrapebox is scraping URLs into folder 1.
2. GSA PI project 1 removes duplicates from folder 1 into folder 2.
3. GSA PI project 2 identifies from folder 2.

The thing is that the monitoring is not working well. When I start project 2 it finds all the files already in the folder, but it won't find the new ones that project 1 keeps creating. I need to pause project 2 and hit Run again to make it pick up the new files. Is this a bug? Have you noticed it?
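
To illustrate what I mean, here is a rough Python sketch of how I would expect folder monitoring to behave (just an illustration of the expected behavior, not PI's actual code; watch_folder is a name I made up):

    import os
    import time

    def watch_folder(folder, interval=5.0):
        # Files present when monitoring starts; a one-shot scan stops here.
        seen = set(os.listdir(folder))
        for name in sorted(seen):
            yield os.path.join(folder, name)
        # Keep polling so files another process (project 1 here) adds later
        # are found too, without pausing and restarting.
        while True:
            time.sleep(interval)
            current = set(os.listdir(folder))
            for name in sorted(current - seen):  # only the new arrivals
                yield os.path.join(folder, name)
            seen = current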

Thanks

Comments

  • s4nt0s Houston, Texas
    @adrianicus - Hmm, does the same thing happen if you monitor the folder Scrapebox is scraping into (folder 1), or only when you monitor the dedup output folder you made (folder 2)?

    I'll test this on my end and get back to you.
  • It seems to happen only when I monitor the folders from another project. It might be the way project 1 adds the files to the folder ... I don't know. Today what I had to do was stop the project, delete the files it had already monitored, and start the project again, so as you can see it's a real pain in the ass ^^
  • I seem to be having the same problem.  

    I think it's because when you choose to monitor a folder in PI, Scrapebox automatically makes another folder inside that one and puts the readable document there. Platform Identifier won't look inside the inner folder to see the readable file, though; it just looks for a readable file in the first-level folder rather than opening up the second one (see the sketch below).

    I'm trying to find a way around it or see if there's a way to change the output path in Scrapebox.

    So far I can't find anything.
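
    If the nested folder really is the cause, the difference comes down to a flat listing vs. a recursive walk. A rough Python sketch of the two behaviors (just an illustration, not PI's actual code):

        import os

        def flat_scan(folder):
            # Only sees files sitting directly in the folder; a file that
            # Scrapebox drops into a subfolder it created is missed.
            return [entry.path for entry in os.scandir(folder) if entry.is_file()]

        def recursive_scan(folder):
            # Walks into subfolders as well, so a file one level down is found.
            return [os.path.join(root, name)
                    for root, _dirs, names in os.walk(folder)
                    for name in names]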
  • s4nt0s Houston, Texas
    @TFU - You would need to use the automator plugin. When Scrapebox is writing to a file, it locks the file, and that's why PI can't read it until Scrapebox is completely done writing to it. That's why the automator plugin is needed: it saves and splits files every X URLs.

    In SB v1, PI could monitor the harvester sessions folder without issues, but v2 seems to lock the file while writing to it; unfortunately, that's out of PI's control. It will work fine with the automator plugin, though (rough sketch of the general idea below).
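
    For what it's worth, the general pattern a writer uses to avoid this (I don't know what the automator plugin does internally, so this is only a sketch of the usual technique, and publish_file is a made-up name) is to stage the data in a temp file and rename it into the watched folder only once it's complete:

        import os
        import tempfile

        def publish_file(urls, watched_folder, final_name):
            # Stage the data in a temp file inside the same folder so the
            # final rename stays on one filesystem.
            fd, tmp_path = tempfile.mkstemp(dir=watched_folder, suffix=".part")
            with os.fdopen(fd, "w", encoding="utf-8") as f:
                f.write("\n".join(urls) + "\n")
            # os.replace() swaps the file into place in a single step, so a
            # monitor never opens a half-written file.
            os.replace(tmp_path, os.path.join(watched_folder, final_name))

    (A monitor would also need to skip the .part files, or the temp file could be staged in a sibling folder on the same drive.)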
  • That is not my case, Santos. Did you find any solution in the end?
    Thanks!

  • s4nt0s Houston, Texas
    edited April 2015
    @adrianicus - Yes, @TFU seems to be talking about something different; that one is due to SB locking the file while it's writing to it. We thought your problem had to do with HTTPS URLs and changed that in a previous update. Are you still having the issue? If so, we'll look into it again.
  • Thanks @s4nt0s for replying and for the resolution, and apologies to @adrianicus. I thought it was the same problem; I didn't mean to hijack your thread. :)