From the paper Sven cites:
"Our proposed method achieves a success rate of 97.4%"
Wow. That is… solved!
I'm not sure I understand the mechanism used by their agent, though.
" Our approach simply applies …
Q1) If the file is not present, it will read/edit the file from the default location. So make sure you copy the file to the folder before clicking EDIT.
Q2) Yes correct.
The file exists in the selected folder
I have verifi…
Edit project -> "Folder with data files to use" -> point it to a custom one per project/language/niche -> copy the files you want to modify into it -> modify them
Question 1: I did as you said.
When I edit now I …
A 6 GB file imported directly into the project? Did that project have any content before? Would it be possible to send me that file?
The project was completely reset: right click -> reset data -> select all options except articles.
Try duplicating your project, e.g. 20 times, and split the URLs across the projects.
I have split the URLs into 10 pieces
and now it is working.
Each piece is about 600 MB,
so there is certainly a limit the application can hand…
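The splitting step described above can be sketched in a few lines. This is a generic one-pass round-robin splitter, not anything GSA SER itself does; the output file naming and the chunk count are assumptions for illustration.

```python
# Split a large URL list into n_parts smaller files, one per project.
# Streams the input once, so it works for files far larger than RAM.

def split_url_file(path, n_parts=10):
    """Write line i of `path` into `path.part{i % n_parts}`."""
    outs = [open(f"{path}.part{i}", "w", encoding="utf-8") for i in range(n_parts)]
    try:
        with open(path, encoding="utf-8") as src:
            for i, line in enumerate(src):
                outs[i % n_parts].write(line)
    finally:
        for f in outs:
            f.close()
```

Round-robin keeps the pieces near-equal in line count even if the input is sorted, which matters if you want each duplicated project to get a comparable workload.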
OK, but if you really aim to get as many proxies as you can, you should have a look at GSA Proxy Scraper, as this tool gives you the best public proxies available: it constantly tests them and gives you a value for each prox…
Don't get me wrong, but this all sounds like you are going to "recode" things. Else, what's the meaning behind all these detailed questions?
I have zero intention to recode or release software like yours.
Certainly I can code my…
There is no pattern matching… it's my own code that checks each string for being an IP or a domain followed by an integer as the port.
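A check like the one Sven describes (host followed by a port, no regex) could look roughly like the sketch below. This is an illustrative reconstruction, not his actual code, and the validation rules are assumptions.

```python
def parse_proxy_line(line):
    """Return (host, port) if the line looks like 'host:port', else None.
    Accepts an IPv4 address or a dotted domain name as the host.
    Illustrative only -- not GSA's real parser."""
    line = line.strip()
    host, sep, port = line.rpartition(":")
    if not sep or not host or not port.isdigit():
        return None
    port = int(port)
    if not (0 < port < 65536):
        return None
    parts = host.split(".")
    # IPv4: exactly four dot-separated numbers in 0..255
    if len(parts) == 4 and all(p.isdigit() and 0 <= int(p) <= 255 for p in parts):
        return host, port
    # Otherwise treat it as a domain: two or more alphanumeric labels
    if len(parts) >= 2 and all(p and p.replace("-", "").isalnum() for p in parts):
        return host, port
    return None
```

Scanning with `rpartition` and a couple of character checks is cheap per line, which matters when filtering hundreds of thousands of candidate proxy strings.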
Thanks for the answer @Sven, but the part I am asking about is
Also it would be very very…
well I do it like this:
1. read file one line after the other
2. take url, make a hash (Murmur2)
3. find hash in memory
4. add to new file + list if not found
5. read next line
As you see, it's almost the same as your …
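The five steps above can be sketched as a single streaming pass. Sven mentions Murmur2; to keep the sketch stdlib-only, a truncated MD5 digest stands in for it (the structure, not the hash choice, is the point).

```python
import hashlib

def dedupe_file(src_path, dst_path):
    """Stream src_path line by line, writing each URL only the first
    time it is seen. Truncated MD5 is a stand-in for Murmur2."""
    seen = set()                         # step 3's in-memory hash store
    with open(src_path, encoding="utf-8") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:                 # step 1: read one line at a time
            url = line.strip()
            if not url:
                continue
            h = hashlib.md5(url.encode()).digest()[:8]   # step 2: hash the URL
            if h not in seen:            # step 3: look the hash up
                seen.add(h)              # step 4: remember it...
                dst.write(url + "\n")    # ...and append to the new file
            # step 5: the loop continues with the next line
```

Keeping only 8-byte hashes in memory is what lets this handle inputs much larger than RAM, at a small risk of a collision dropping a genuinely new URL.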
what hash algo do you use?
I didn't use any hash algo because it would make it slower, although it would use less RAM: first hash the link with e.g. SHA-256 (making it a constant size, around 64 characters), then che…
By the way, I have loaded the entire target URL list into RAM (a HashSet in C#) and it is deduped in less than a minute, while GSA SER takes more than an hour.
A 2 GB text file takes around 2 GB of RAM.
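The RAM tradeoff being discussed can be made concrete: a set of raw strings costs memory proportional to the total URL length, while a set of fixed-size digests costs a constant amount per entry. A minimal sketch (the URLs are made up for the demonstration):

```python
import hashlib

# Hypothetical URLs, just for the demonstration
urls = [f"http://example.com/some/fairly/long/path/{i}" for i in range(100_000)]

# Option A: keep the full strings in a set (what a HashSet<string> does) --
# memory grows with total string length
full = set(urls)

# Option B: keep only a fixed-size SHA-256 digest per URL -- memory per
# entry is a constant 32 bytes regardless of URL length, at the
# (negligible) risk of a collision silently merging two distinct URLs
digests = {hashlib.sha256(u.encode()).digest() for u in urls}

assert all(len(d) == 32 for d in digests)
```

Option B trades a hash computation per URL for the smaller footprint, which is exactly the "slower but less RAM" point made above.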
see this line...
00:00:53: [ ] No E-Mails to read on this account. Make sure you do not check/delete emails using your E-Mail-Client/Browser.
It shows that the emails are getting deleted correctly. Why your browser shows the ema…
Please turn the project status to "active (email verify only)" and run it.
Then post the log.
Stop and restart it, and post the second log.
First run logs:
23:45:51: [ ] updating blacklist from http://www.securemecca.c…
Do you see anything in the log that indicates something is wrong, or where do you think the emails are not getting deleted?
When I set the project to email check only, I did not notice any errors; all seems fine.
When I check the inbox of the mail via brow…
@Ozz I don't know whether any of my proxies are private or not,
but I am able to get over 2,000 Google-passed proxies per run out of 580k total proxies.
None of them have a password.
I don't use any proxy so far with GSA - since I am not using…
@Zeusy yes, but that is not the spinfile feature.
For example, I have to generate a different file for each spinfile.
If you use something like the below in your article, you will always get the same lines.
@AlexR for reading n random lines from the file, what project file do you need, and for what reason?
Either it is a supported feature or it is not.
Also, copy-paste the above into your project and try it yourself to see.
And the way I generated the links-field URLs is cor…
@Sven that's the answer I expected.
Seems like I have to compose a different file for each one.
Obviously your random selection algorithm is not working correctly -
probably using the same seed, or not generating a new random number each time, etc.
If I could see the co…
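For the "n random lines from a file" behavior being debated, one correct single-pass approach is reservoir sampling with a freshly seeded generator, so repeated draws differ between runs. This is a generic sketch, not GSA SER's actual spinfile code.

```python
import random

def sample_lines(path, n, rng=None):
    """Pick n lines uniformly at random from a file in one pass
    (reservoir sampling). By default a fresh Random() is seeded from
    the OS, so two separate calls normally return different lines."""
    rng = rng or random.Random()
    reservoir = []
    with open(path, encoding="utf-8") as f:
        for i, line in enumerate(f):
            if i < n:
                reservoir.append(line.rstrip("\n"))
            else:
                # Line i survives with probability n / (i + 1)
                j = rng.randrange(i + 1)
                if j < n:
                    reservoir[j] = line.rstrip("\n")
    return reservoir
```

If the generator were seeded with the same fixed value on every call, every draw would return the same lines, which is exactly the "same seed" failure mode suggested above.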