Decimal Computers, or Non-Binary Computers
In today's world, computers use the binary system to interpret data and instructions. Compilers and programs translate these into sequences of 0s and 1s according to agreed codes, and in this form the computer handles them: a 1 is a pulse of electricity and a 0 is the absence of one (in practice, a pulse at a much lower voltage). With the technology available during the early development of computers, I can understand that they could only tell 0s from 1s back then, but nowadays electronic devices are finer and more accurate, and I'm sure they can discern and create flows of electricity at several different voltages.
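To make the "agreed codes" concrete, here is a small Python sketch (illustrative only) showing how text characters end up as sequences of 0s and 1s, using the standard character codes:

```python
# Each character has a numeric code (e.g. 'H' is 72),
# which the computer stores as a pattern of bits.
for ch in "Hi":
    print(ch, format(ord(ch), "08b"))
# prints:
# H 01001000
# i 01101001
```

Each of those 0s and 1s is what the hardware realizes as a low or high voltage pulse.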
My idea would be a computer that, instead of the binary system, uses the decimal system, manipulating the data directly in base 10. Maybe this is a bit too difficult to achieve, since there must be some reason why no one sells one of these already (I guess I'm not the only one to think of it). So another possibility could be using some base other than 2. Probably it's a matter of reliability, because electronic components tend to age, or maybe it's a matter of programming convenience (you would have to change your whole mindset to program one of these). But I'm sure that any higher base (3, 4, 5, 6, 7...) would make for faster computers.
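The intuition behind the speed claim can at least be quantified: a number needs roughly log base b of N digits in base b, so higher bases pack more information into each digit and need fewer of them. A minimal Python sketch (counting by repeated division, not a claim about actual hardware speed):

```python
def digits_needed(n, base):
    """Count how many digits n requires when written in the given base."""
    digits = 0
    while n > 0:
        n //= base
        digits += 1
    return max(digits, 1)  # zero still takes one digit

# One million written in different bases:
for base in (2, 3, 10):
    print(base, digits_needed(1_000_000, base))
# prints:
# 2 20
# 3 13
# 10 7
```

Fewer digits per number is real, but whether that translates into faster machines also depends on how cheaply each multi-level digit can be stored and switched, which is exactly the reliability question raised above.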