Hmm, this is sort of a nebulous question. If you're asking whether we've moved away from binary encoding to, say, ternary, then no, that hasn't changed. (Quantum computing would change that, but classical computers won't be going anywhere.) But several technologies behind the cloud sit very close to the kernel and the processor.
The cloud was made possible in part by advances in hardware virtualization. It was probably 5–10 years ago that chip manufacturers started introducing hypervisor support right on the chip (Intel VT-x and AMD-V), which made the performance of virtualized machines comparable to non-virtualized machines.
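On Linux you can check whether a CPU advertises these extensions by looking at the flag names in `/proc/cpuinfo` (`vmx` for Intel VT-x, `svm` for AMD-V). A minimal, Linux-only sketch:

```python
# Linux-only sketch: look for hardware virtualization flags in
# /proc/cpuinfo. "vmx" = Intel VT-x, "svm" = AMD-V. If neither is
# present (common inside a VM without nested virtualization), the
# host must fall back to slower software-only virtualization.
with open("/proc/cpuinfo") as f:
    cpuinfo = f.read()

if "vmx" in cpuinfo or "svm" in cpuinfo:
    print("hardware virtualization supported")
else:
    print("no hardware virtualization flags found")
```

Tools like `libvirt` do a more thorough version of this check before deciding whether a guest can use hardware-assisted virtualization.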
Software-defined networking has also enabled the advancement of cloud computing. To be efficient, it has to be implemented at a fairly low level, in the kernel's network stack or in the switch hardware itself.
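The core idea behind SDN is that a central controller programs match-action rules into switches, which then forward packets by table lookup instead of each running its own distributed routing logic. Here's a toy illustration of that split (the class names and rule format are made up for this sketch; real OpenFlow-style tables match on many header fields with wildcards and priorities):

```python
# Toy illustration of SDN's match-action model: a controller installs
# forwarding rules, and the "switch" just does table lookups.

class ToySwitch:
    def __init__(self):
        # (src, dst) -> action. Real flow tables are far richer.
        self.flow_table = {}

    def install_rule(self, src, dst, action):
        """Called by the (hypothetical) controller to program the switch."""
        self.flow_table[(src, dst)] = action

    def handle_packet(self, src, dst):
        # Packets with no matching rule are normally punted to the
        # controller, which decides and installs a new rule.
        return self.flow_table.get((src, dst), "send-to-controller")


sw = ToySwitch()
sw.install_rule("10.0.0.1", "10.0.0.2", "forward:port2")
print(sw.handle_packet("10.0.0.1", "10.0.0.2"))  # matches installed rule
print(sw.handle_packet("10.0.0.9", "10.0.0.2"))  # unknown flow
```

The point of the separation is that the forwarding path stays dumb and fast (and can live in kernel code or switch ASICs), while the policy lives in software that's easy to change.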
Containerization has also become very prevalent in the last few years. I'm not an expert in this, but as I understand it, containerization is made possible, at least for Docker, by kernel features that change how processes are isolated: namespaces give each container its own view of the system (process IDs, network, filesystem mounts), and cgroups limit its resource usage. Containers are quasi-virtualized but don't use a hypervisor. Most of it is black magic to me.
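One visible piece of this: on Linux, every process already runs inside a set of namespaces, and container runtimes essentially give a new process fresh copies of them. You can inspect your own process's namespaces under `/proc` (Linux-only sketch):

```python
import os

# Every Linux process belongs to a set of namespaces (pid, net, mnt,
# uts, ...). Containers work by giving a process its own fresh
# namespaces plus cgroup limits -- no hypervisor involved.
ns_dir = "/proc/self/ns"
for name in sorted(os.listdir(ns_dir)):
    # Each entry is a symlink whose target identifies the namespace,
    # e.g. "pid:[4026531836]". Two processes in the same namespace
    # see the same target.
    print(name, "->", os.readlink(os.path.join(ns_dir, name)))
```

Running this inside and outside a container shows different namespace IDs for the containerized process, which is the whole trick.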
IoT is another cloud-adjacent technology that will be ever more present in the coming years. It has already had major implications for security: IoT devices have been compromised and used in DDoS attacks (see Silicon Valley's depiction of this in the most recent season). Device manufacturers are realizing that they cannot ignore security in embedded software.