The digital language is made up of just two symbols: 0 and 1. A computer, being a digital device, understands only 0s and 1s. But how does it understand a 0 as '0' and a 1 as '1'? At the hardware level there is a bunch of elements called transistors (modern chips have billions of them, though we may eventually reach the limits of how small they can be made). These transistors are basically switching devices: they turn ON and OFF based on the voltage supplied to their input terminal. Translate the presence of a voltage at the transistor's input as 1 and its absence as 0 (you could do it the other way round too). There! You have the digital language. Can you now imagine billions of those transistors flipping 0s and 1s in sync, in under a nanosecond, just so you could read this answer? :)
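To make the idea concrete, here's a toy sketch (in Python, not how real hardware is built) of the voltage-to-bit translation described above: a threshold decides whether a "voltage" reads as 1 or 0, and eight such bits assemble into a byte that encodes a letter. The threshold value and the sample voltages are made up for illustration.

```python
def to_bit(voltage, threshold=0.5):
    """Presence of voltage above the threshold reads as 1, absence as 0."""
    return 1 if voltage >= threshold else 0

# Eight simulated voltage readings -- one byte's worth of transistor states
voltages = [0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0]
bits = [to_bit(v) for v in voltages]

# Interpret the bit pattern as a binary number, then as a character
byte = int("".join(str(b) for b in bits), 2)
print(bits)        # [0, 1, 0, 0, 0, 0, 0, 1]
print(byte)        # 65
print(chr(byte))   # A
```

The bit pattern 01000001 is 65 in decimal, which is the ASCII code for the letter 'A': from flipped switches all the way up to text on your screen.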