Great idea. However, the only way I can think of to accomplish this efficiently would be some sort of user feedback system (i.e. users of the search page report sites that abuse pop-up ads, and those sites get placed in a database that is cross-referenced so they are filtered out of the search results).

As I understand it, most search engines (web crawlers like Google) work by searching for the keyword tags in pages. To determine whether a page matching the search terms has any pop-ups, the engine would need to download the entire document and scan it for the JavaScript code that opens the pop-up (which would take forever). Once you get into whether it's more than 2 pop-ups, it gets even more difficult, since most of those uncontrollable series of pop-ups you mention are a chain reaction starting from the original pop-up (using window open and close events on each subsequent window to launch yet another ad).
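Just to sketch what that crawler-side check might look like, here's a rough Python version. The patterns, function names, and the "more than 2" threshold are my own invention, and a real crawler would have to deal with obfuscated scripts, external .js files, and so on, so treat this as a sketch rather than a workable detector:

```python
import re

# Rough signals that a page spawns pop-ups: direct window.open calls,
# plus load/unload handlers, which is how the chained pop-ups hook in.
POPUP_PATTERNS = [
    re.compile(r"window\.open\s*\(", re.IGNORECASE),
    re.compile(r"on(un)?load\s*=", re.IGNORECASE),
]

def count_popup_signals(html: str) -> int:
    """Count rough pop-up indicators in a downloaded page's source."""
    return sum(len(p.findall(html)) for p in POPUP_PATTERNS)

def looks_like_abuser(html: str, threshold: int = 2) -> bool:
    """Flag pages whose pop-up signal count exceeds the threshold."""
    return count_popup_signals(html) > threshold
```

Even this naive version shows the cost problem: it only works if you have already downloaded the full page body, which is exactly the expense I was worried about.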
I am a codeslinger, and would love to write such a search engine. The user feedback idea, however, would not be effective at avoiding those sites until a substantial number of users actually report a substantial number of abusers to the search site (which could take some time, and requires decisions on rules: i.e. do we consider a site an abuser based on one user report or many?). Not to mention that new sites are created every day, and the abusers among them would still show up in search results until someone reports them.
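The report database itself would be simple enough; the hard part is the policy. A minimal sketch, with the class name and the report threshold being purely my own assumptions:

```python
from collections import Counter

class AbuseReportDB:
    """Tally user reports of pop-up abusers and filter them from results.

    min_reports is the policy knob from the question above: how many
    independent reports before we call a site an abuser.
    """

    def __init__(self, min_reports: int = 3):
        self.min_reports = min_reports
        self.reports = Counter()

    def report(self, domain: str) -> None:
        """Record one user report against a domain."""
        self.reports[domain] += 1

    def is_blocked(self, domain: str) -> bool:
        """A domain is blocked once it hits the report threshold."""
        return self.reports[domain] >= self.min_reports

    def filter_results(self, results: list[str]) -> list[str]:
        """Cross-reference search results against the abuser database."""
        return [r for r in results if not self.is_blocked(r)]
```

Note that whatever you pick for min_reports, a brand-new abuser site sits at zero reports and sails straight through the filter, which is the gap I mentioned.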
Any other ideas, anyone?
Beedle