The main tank, brimful with ideas. Enjoy them, discuss them, take them. - Of course, this is also the #1 place for new submissions!
By griphonman
#1773
Don't you hate sites with pop-up ads every three seconds, where even when your pop-up killer doesn't get them all you race to see how many you can close, only to have more pop up?! Instead of relying on pop-up killer programs, how about a search engine like Google adding an extra option to its advanced search? The bots could tell which sites have more than one or two pop-ups and put them in the idiot-webmasters category. Some people don't get the idea that annoying advertising is bad advertising. You would have the option to search for sites with "no pop-ups," "few pop-ups" or "no pop-up filter."
By Captain_Clean
#2706
I just think there are too many pop-up windows out there. I agree this is the worst thing programmers can do.
It would be great if Google had a no-pop-up search option. Simply awesome!
By Beedle
#2712
Great idea. However, the only way I can think of to accomplish this efficiently would be to use some sort of user feedback system (i.e. users of the search page report back on sites that abuse pop-up ads, placing them in a database that is cross-referenced so those sites are not returned in search results, based on the criteria provided). As I understand it, the way most search engines (web crawlers like Google) work is to search for the keyword tags in pages. To determine whether a page matching the search words has any pop-ups, the engine would need to download the entire document and look for the JavaScript code that opens the pop-up (which would take forever). Once you get into whether it's more than 2 pop-ups, it gets even more difficult, as most of those uncontrollable series of pop-ups you mention are a chain reaction starting from the original pop-up (using window open and window close events on each subsequent pop-up to open yet another ad).
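To make the detection step concrete, here is a minimal sketch of what the crawler side could look like. It assumes the engine has already downloaded the page source, and it uses a deliberately naive regex for `window.open(` calls; the threshold values and category names are assumptions taken from the original suggestion, and a real crawler would need to actually parse the JavaScript (including the chain-reaction pop-ups mentioned above, which a static scan like this cannot follow).

```python
import re

# Naive pattern for the classic pop-up opener. This misses obfuscated
# or dynamically generated calls; it is only an illustration.
POPUP_PATTERN = re.compile(r"window\.open\s*\(", re.IGNORECASE)

def count_popup_calls(html: str) -> int:
    """Count occurrences of window.open( in the page source."""
    return len(POPUP_PATTERN.findall(html))

def classify(html: str) -> str:
    """Bucket a page the way the original idea suggests (thresholds assumed)."""
    n = count_popup_calls(html)
    if n == 0:
        return "no pop-ups"
    if n <= 2:
        return "few pop-ups"
    return "idiot webmasters"

sample = '<script>window.open("ad.html"); window.open("ad2.html");</script>'
print(classify(sample))  # few pop-ups
```

Even this crude scan illustrates Beedle's cost objection: the engine must fetch and scan the full document body, not just the keyword tags, for every indexed page.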

I am a codeslinger, and would love to write such a search engine. The user-feedback idea, however, would not be effective at avoiding those sites until a substantial number of users actually report a substantial number of abusers to the search site (which could take some time, and requires decisions on rules, i.e. do we consider a site an abuser based on one user report or many?). Not to mention that new sites are created every day, and those that are abusers would still show up in search results until they are reported.
Any other ideas, anyone? :)

Beedle
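Beedle's feedback idea can be sketched in a few lines. This is a hypothetical illustration, not anyone's actual implementation: the `REPORT_THRESHOLD` value and all names are assumptions, and it answers the "one report or many?" question with a configurable minimum number of distinct reporters.

```python
from collections import defaultdict

REPORT_THRESHOLD = 3  # assumption: distinct reports needed before a site is filtered

class AbuseRegistry:
    """Cross-referenced database of user reports, as described in the post."""

    def __init__(self, threshold: int = REPORT_THRESHOLD):
        self.threshold = threshold
        self.reports = defaultdict(set)  # site -> set of reporting users

    def report(self, site: str, user: str) -> None:
        # A set means duplicate reports from the same user don't count twice.
        self.reports[site].add(user)

    def is_abuser(self, site: str) -> bool:
        return len(self.reports[site]) >= self.threshold

    def filter_results(self, results: list) -> list:
        """Drop flagged sites from a result list before returning it."""
        return [s for s in results if not self.is_abuser(s)]

reg = AbuseRegistry()
for user in ("alice", "bob", "carol"):
    reg.report("popup-hell.example", user)
print(reg.filter_results(["popup-hell.example", "clean.example"]))  # ['clean.example']
```

Note that this sketch still has exactly the weakness Beedle points out: a brand-new abuser site passes the filter until enough users have reported it.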
By lamine
#2904
The ISPs could solve the problem by allowing only web pages that have actually been requested to be fed back to users. But that would mean developing their proxy servers further and wasting valuable processing power... I guess it just hasn't become enough of a problem yet?
By Jack Nobbz
#8349
Just use Firefox. Problem solved.