Another thing I tried today: I completely uninstalled SER using Revo Uninstaller to remove all leftover files, then installed it again. That didn't help either!!
It looks very strange to me, as you have already put a lot of effort into finding the issue. You can try to investigate that 000/000 message in your log manually: right-click the log -> Open URL. If you get results for that search term in the browser, then test the same URL against some of your proxies. Install a plugin to quickly switch proxies for that.
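That manual check can also be scripted. A minimal sketch using only Python's standard library: it fetches a search URL copied from the SER log, optionally through an HTTP proxy in `host:port` form, and guesses whether the response is a ban/captcha page rather than a genuinely empty result set. The status codes and ban markers here are my own assumptions, not anything SER defines:

```python
import urllib.request

# Phrases that typically appear on ban/captcha interstitials (an assumption).
BLOCK_MARKERS = ("unusual traffic", "captcha", "not a robot")

def fetch(url, proxy=None, timeout=15):
    """Fetch url, optionally through an HTTP proxy given as 'host:port'."""
    handlers = []
    if proxy:
        handlers.append(urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
    opener = urllib.request.build_opener(*handlers)
    opener.addheaders = [("User-Agent", "Mozilla/5.0")]  # look like a normal browser
    with opener.open(url, timeout=timeout) as resp:
        return resp.status, resp.read().decode("utf-8", errors="replace")

def looks_blocked(status, body):
    """Heuristic: is this a ban/captcha page rather than a real (empty) result page?"""
    if status in (403, 429, 503):
        return True
    lower = body.lower()
    return any(marker in lower for marker in BLOCK_MARKERS)
```

Fetch the logged URL once with `proxy=None` and once per proxy; if the direct fetch returns results but the proxied fetch `looks_blocked`, the proxy (not the query) is the likely culprit.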
I just quickly re-scanned the thread, so I'm not sure if this has already been answered: do you have any AV/firewall tool installed that could be blocking URLs somehow?
Comments
Thanks for asking.
Edit: this is my current LPM 0.00!!! :(
Call me simple, but here's what I would try first: disable global site lists and see if SER works by searching for targets and posting.
If it does, then either your mapping is wrong on the global site lists, or something is corrupted with those site lists.
Also, I thought you were supposed to turn off search engines if you use global site lists - that is, if you just want to use site lists for targets.
Come on Ron, catch up with the game.
I use both search engines and global site lists, with decent submission and verified results.
If a target can't be found with search results, it pulls targets from the global site lists.
That way you constantly have a belly full of targets in SER.
If you're getting a lot of search results like those shown in that image, it's possible your proxies are banned, or being over-abused by others if they're shared.
Run them through a test in Scrapebox and see what they show.
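A Scrapebox-style bulk proxy test can be approximated in a few lines of Python, if you want a second opinion outside both tools. This is a sketch under my own assumptions: proxies in a plain `host:port` list, and any URL that should normally return HTTP 200 as the probe (the file name and test URL below are placeholders):

```python
import concurrent.futures
import urllib.request

TEST_URL = "https://www.google.com/search?q=test"  # placeholder probe URL

def load_proxies(text):
    """Parse one proxy per line ('host:port'); skip blanks, comments, duplicates."""
    out = []
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and line not in out:  # fine for ~30 proxies
            out.append(line)
    return out

def check_proxy(proxy, url=TEST_URL, timeout=10):
    """Return (proxy, ok) where ok means the probe fetched with HTTP 200."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
    opener.addheaders = [("User-Agent", "Mozilla/5.0")]
    try:
        with opener.open(url, timeout=timeout) as resp:
            return proxy, resp.status == 200
    except OSError:
        return proxy, False

def check_all(proxies, workers=10):
    """Probe all proxies concurrently; returns {proxy: ok}."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(check_proxy, proxies))
```

Note that a proxy can pass a plain 200 check and still be banned on Google specifically, so a "95% passed" result doesn't rule proxies out for search queries.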
Currently, I'm not using site lists at all (disabled in project options & in General options); I even disabled all indexing/filter options in General options.
I also checked all my 30 proxies in GSA & Scrapebox, and 95% of them show successful & fast results all the time.
My current LPM is 0.43!!
I'm really desperate and I don't know where to look further!
@ron @LeeG
I sent you guys a copy of the current log file. Any help from you guys is really appreciated.
@Lee the point I was making was if you are going to isolate a problem, don't mix the two things. Run a pure sitelist campaign and see how it goes, and run a pure search and post routine without site lists. It could help to isolate the issue.
I looked at the log and the only thing I can think of is maybe something is screwed up in the URL field.
The other thing I would suggest is to create a brand new project from scratch. No cheating by duplicating projects. Create it from scratch, send the links to yahoo or whatever. Turn off all other projects, and run it with and without site lists. I bet that the new project will work. If it doesn't, then global settings would be the main suspect.
Going by the number of 0/0 results, I would look at changing your keywords for searches.
A lot of your searches are returning zero results.
That would be my own personal starting point: scrape a big chunk of keywords.
Then add the words from file and not clipboard
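Scraped keyword dumps usually contain duplicates and blank lines, which waste search queries. A small sketch for cleaning one before importing it via "add from file" (the file names are placeholders, not anything SER prescribes):

```python
def clean_keywords(lines):
    """Strip whitespace, drop blanks, dedupe case-insensitively (keep first spelling)."""
    seen = set()
    out = []
    for kw in lines:
        kw = kw.strip()
        key = kw.lower()
        if kw and key not in seen:
            seen.add(key)
            out.append(kw)
    return out

if __name__ == "__main__":
    with open("keywords_raw.txt", encoding="utf-8") as f:
        cleaned = clean_keywords(f)
    with open("keywords_clean.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(cleaned))
```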
"Then add the words from file and not clipboard"
I already tried this. I started a brand new project with only 1 URL and a few unique keywords, stopped all other projects, and left it running for hours... same problem.
@LeeG
I used a huge top-keywords list (100K) and it didn't help. Also, I'd like to mention that the Articles, Social Bookmarks, Social Networks & Wikis engines don't rely on keywords (as mentioned by Sven in this thread).
I need your help. I tried everything and I can't find the problem.
This is my proxy list tested in Scrapebox:
I did what you suggested and left it for about one hour. SER has almost stopped since I disabled proxies.
It just sits there and doesn't do anything!! I have no idea why this is happening.
Also, in the logs I see a lot of: 000/000 [Page END] results on google PK for Oxwall with query "Top Rated" "Most
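Those zero-result lines can be tallied per search engine, which helps tell a single banned engine (e.g. one Google TLD) apart from a global problem. A sketch: the line format below is inferred from the snippet above, so the regex may need adjusting to the real SER log:

```python
import re
from collections import Counter

# Assumed shape: "000/000 [Page END] results on <engine> for <engine-type> with query ..."
ZERO_RE = re.compile(r"000/000 \[Page END\] results on (.+?) for ")

def zero_result_counts(log_lines):
    """Count zero-result searches per search engine."""
    counts = Counter()
    for line in log_lines:
        m = ZERO_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts
```

If one engine dominates the counts, try removing it from the project's search-engine selection and see whether LPM recovers.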
@AlexR
Nope, it's not the captchas... CS is solving whatever SER sends to it (if it can), as usual. I kept monitoring it and couldn't find anything abnormal with CS.
There are only three things that affect SER:
Proxies
Keywords
Search engine choice
Other than that, it will be content not being approved because it's poor quality.
Keywords are not the issue, since I'm using engines that don't need keywords, and I have a huge 100K keyword list anyway.
Search engines... I have 10 random Google engines selected.
My content... is high-quality, hand-written articles from the leading article directories. (I was using them 3 months ago without any issues.)
So, what on earth could be affecting my SER submissions?!