Reviews: Efficient Neural Architecture Transformation Search in Channel-Level for Object Detection
– Neural Information Processing Systems
The paper reads very well and manages to present both the challenges of NAS and the proposed idea in a very understandable form (although English grammar and spelling could be improved). The paper's main idea is to constrain the search space of NAS to the dilation factor of convolutions, so that the effective receptive field of units in the network can be varied while keeping the network weights fixed (or at least allowing the weights to be re-used and smoothly varied during the optimization). This idea is very attractive from a computational point of view, since it allows the notoriously expensive NAS process to make faster progress by avoiding the need for ImageNet pre-training after every architecture change. On the flip side, the proposed NATS method explores only part of the potential search space of neural architecture variations, so its longer-term impact will depend on how restrictive this choice of search space proves to be.
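To make the weight-reuse property concrete: a dilated convolution applies the same kernel weights at spaced-out input positions, so changing the dilation rate changes the receptive field without touching the weights. The following is a minimal illustrative sketch (not code from the reviewed paper) of a 1-D dilated convolution showing that the same kernel covers a receptive field of `(k-1)*d + 1` inputs:

```python
def dilated_conv1d(x, w, dilation):
    """Hypothetical 1-D dilated convolution (illustration only).

    The same kernel w is reused for every dilation rate; only the
    spacing between tapped inputs changes, so the receptive field
    grows to (len(w) - 1) * dilation + 1 without new weights.
    """
    k = len(w)
    span = (k - 1) * dilation + 1  # effective receptive field
    return [
        sum(w[j] * x[i + j * dilation] for j in range(k))
        for i in range(len(x) - span + 1)
    ]

x = list(range(10))      # toy input signal
w = [1.0, 1.0, 1.0]      # one shared 3-tap kernel

y1 = dilated_conv1d(x, w, dilation=1)  # receptive field 3
y2 = dilated_conv1d(x, w, dilation=2)  # receptive field 5, same weights
```

A NAS procedure restricted to this axis only has to pick a dilation rate per layer; because the weights are shared across candidates, no re-training from scratch (e.g. ImageNet pre-training) is needed when the architecture changes.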
Feb-11-2025, 21:31:22 GMT