MobileNetV3 Paper Walkthrough: The Tiny Giant Getting Even Smarter

https://towardsdatascience.com/mobilenetv3-paper-walkthrough-the-tiny-giant-getting-even-smarter/
MobileNetV3 improves upon its predecessors by incorporating Squeeze-and-Excitation (SE) modules and new hard activation functions into its building blocks. The architecture, which comes in Large and Small variants, was partially designed using Neural Architecture Search (NAS) to jointly optimize for accuracy and latency. The core building block is a modified MobileNetV2 bottleneck that adds an optional SE module and uses either ReLU6 or hard-swish as its activation. The new activation functions, hard-swish and hard-sigmoid, are piecewise linear approximations of their standard counterparts, designed to be cheaper to compute on low-power mobile hardware than the exponentials in the originals.
0 points by chrisf 9 hours ago
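The two ideas in the summary are easy to see in code. Below is a minimal NumPy sketch of the hard activations as defined in the MobileNetV3 paper (h-sigmoid(x) = ReLU6(x + 3) / 6, h-swish(x) = x * h-sigmoid(x)) plus a toy Squeeze-and-Excitation step on a single feature map. The function names, weight shapes, and reduction ratio are illustrative, not taken from the article.

```python
import numpy as np

def relu6(x):
    # Clamp to [0, 6]; the building block of both hard activations.
    return np.minimum(np.maximum(x, 0.0), 6.0)

def hard_sigmoid(x):
    # Piecewise-linear approximation of sigmoid: ReLU6(x + 3) / 6.
    return relu6(x + 3.0) / 6.0

def hard_swish(x):
    # Piecewise-linear approximation of swish: x * h-sigmoid(x).
    return x * hard_sigmoid(x)

def squeeze_excite(feature_map, w1, w2):
    """Toy SE block on a (C, H, W) feature map.

    Squeeze: global average pool to one scalar per channel.
    Excite: two small fully connected layers; the gate uses
    hard-sigmoid, as MobileNetV3 does inside its SE modules.
    """
    squeezed = feature_map.mean(axis=(1, 2))   # (C,)
    hidden = np.maximum(w1 @ squeezed, 0.0)    # ReLU, (C // r,)
    gate = hard_sigmoid(w2 @ hidden)           # (C,), values in [0, 1]
    return feature_map * gate[:, None, None]   # reweight each channel

# Illustrative usage with random weights (reduction ratio r = 4):
rng = np.random.default_rng(0)
fm = rng.standard_normal((8, 4, 4))            # C=8 channels, 4x4 spatial
w1 = rng.standard_normal((2, 8)) * 0.1         # squeeze 8 -> 2
w2 = rng.standard_normal((8, 2)) * 0.1         # expand 2 -> 8
out = squeeze_excite(fm, w1, w2)               # same shape as fm
```

Because the gate lies in [0, 1], the SE block can only scale channels down, never amplify them; and since hard-sigmoid is exact at x = 0 (value 0.5) and saturates outside [-3, 3], hard-swish matches swish closely where it matters while needing only a clamp, an add, and a multiply.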
