Keyword Filter
AlexR
Cape Town
I have been reviewing why some of my projects are not hitting targets and I think it's because some of the keywords are too restrictive.
One of the most useful tools in Gscraper is the check index function, which lets you remove keywords that fall below a certain level of indexed targets in the SE. That way you don't spend ages scraping keywords that only return poor results.
Within SER this is far more critical than in Gscraper.
FEATURE REQUEST:
Option to remove keywords with below X pages in the SE index before SER runs all the various searches and parses the SE results.
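To make the request concrete, here is a minimal sketch (in Python) of the kind of pre-filter I mean. The indexed_count helper is hypothetical — it stands in for however SER would read the result count for a keyword from the SE — and the threshold is just an example number:

```python
# Sketch of the proposed keyword pre-filter: check each keyword's
# indexed page count ONCE, and drop it before any footprint/platform
# searches are built from it.

MIN_INDEXED_PAGES = 1000  # example threshold; should be user-configurable


def indexed_count(keyword: str) -> int:
    """Hypothetical helper: return the number of pages the SE reports
    as indexed for this keyword (e.g. the 'About N results' figure).
    How SER obtains this number is up to the implementation."""
    raise NotImplementedError


def prefilter_keywords(keywords: list[str],
                       threshold: int = MIN_INDEXED_PAGES) -> list[str]:
    """Keep only keywords with at least `threshold` indexed pages.
    Cost: one query per keyword up front, instead of
    SEs x footprints x platforms wasted queries per dead keyword."""
    return [kw for kw in keywords if indexed_count(kw) >= threshold]
```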
Let's look at some numbers:
- Let's say you have 5 Google SE's selected.
- Let's say each platform has 5 footprints.
- Let's say you have 20 platforms selected.
- Let's say you have 10 keywords that are too specific or don't have many pages in the SE index (some people have a list of 500+ keywords, so there's a good chance at least 10 of them will not have many indexed pages).
10 keywords x 5 SEs (each one tests every keyword) x 5 footprints x 20 platforms = 5 000 WASTED SEARCHES!
With 1 project that's 5 000 wasted searches.
With 10 projects that's 50 000 wasted searches.
With 100 projects that's 500 000 wasted searches.
Or put another way, if you have 100 keywords with low indexed page counts:
With 1 project that's 50 000 wasted searches.
With 10 projects that's 500 000 wasted searches.
With 100 projects that's 5 000 000 wasted searches.
The numbers multiply up fast!
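There is nothing clever in the arithmetic — it's a straight multiplication across the dimensions, which is exactly why it blows up:

```python
def wasted_searches(dead_keywords: int, ses: int, footprints: int,
                    platforms: int, projects: int = 1) -> int:
    """Each dead keyword is queried once per SE, per footprint,
    per platform, in every project."""
    return dead_keywords * ses * footprints * platforms * projects


print(wasted_searches(10, 5, 5, 20))        # 5 000 in one project
print(wasted_searches(10, 5, 5, 20, 100))   # 500 000 across 100 projects
print(wasted_searches(100, 5, 5, 20, 100))  # 5 000 000 with 100 dead keywords
```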
The same logic applies to editing engine files to remove low-performing footprints.
------------------------------------------------------------------------------------------------
Part 2 of this request would be an option to remove footprints with below X indexed SE pages after an update. That way we wouldn't need to re-edit our footprints and strip out the low-performing ones after every update. It would also let us choose the threshold for indexed pages.
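A sketch of that, reusing the hypothetical indexed_count helper and threshold from the keyword filter above — the idea is to run it once after each update instead of hand-editing the engine files:

```python
def prefilter_footprints(footprints: list[str],
                         threshold: int = MIN_INDEXED_PAGES) -> list[str]:
    """Keep only footprints that still return at least `threshold`
    indexed pages on their own; anything below that is dropped before
    footprint x keyword queries are generated."""
    return [fp for fp in footprints if indexed_count(fp) >= threshold]
```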
Would love to hear some ideas, as this is an area where a huge improvement can be made.