How does a computer understand 0s and 1s?
As we all know, programs are stored as binary digits. My question is: how exactly does the computer understand 0s and 1s?