Md Ameenuddin Siddiqui · aisemi.hashnode.dev · Nov 20, 2023

Brain float

There is a trend in DL toward using FP16 instead of FP32, because lower-precision arithmetic does not appear to be critical for neural networks. The extra precision of FP32 buys nothing in accuracy, while it is slower to compute, takes more memory, and reduces the speed of communication. ...

VLSI circuit design
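As a minimal sketch of the idea behind the "brain float" (bfloat16) format the title refers to: BF16 keeps FP32's full 8-bit exponent and simply drops the low 16 mantissa bits, so a conversion can be as simple as truncating the FP32 bit pattern (real hardware typically rounds to nearest even rather than truncating; the function names below are illustrative, not from the post).

```python
import struct

def float_to_bf16_bits(x: float) -> int:
    # Reinterpret the FP32 bit pattern as an unsigned 32-bit int,
    # then keep the top 16 bits: 1 sign, 8 exponent, 7 mantissa bits.
    bits32 = struct.unpack("<I", struct.pack("<f", x))[0]
    return bits32 >> 16

def bf16_bits_to_float(bits16: int) -> float:
    # Widen back to FP32 by zero-filling the discarded low 16 bits.
    return struct.unpack("<f", struct.pack("<I", bits16 << 16))[0]

# Pi survives with ~3 decimal digits of precision, which is the
# point: BF16 trades mantissa bits for FP32's full dynamic range.
pi_bf16 = bf16_bits_to_float(float_to_bf16_bits(3.14159265))
print(hex(float_to_bf16_bits(3.14159265)), pi_bf16)  # 0x4049 3.140625
```

Because the exponent width matches FP32, BF16 avoids the overflow/underflow issues FP16 has in training, which is why it halves memory and bandwidth at little cost.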