# SMU激活函数-Pytorch

**Repository Path**: luokai-dandan/smu-activation-function

## Basic Information

- **Project Name**: SMU激活函数-Pytorch
- **Description**: SMU: SMOOTH ACTIVATION FUNCTION FOR DEEP NETWORKS USING SMOOTHING MAXIMUM TECHNIQUE. Original author's repository: https://github.com/iFe1er/SMU_pytorch. This project adds plotting to the original author's SMU.py so that SMU is easier to understand, and the plots match the original paper. I hope it helps everyone learn deep learning "happily" together!
- **Primary Language**: Python
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: https://gitee.com/luokai1997
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2021-11-18
- **Last Updated**: 2022-07-09

## Categories & Tags

**Categories**: Uncategorized

**Tags**: Python, PyTorch, Deep Learning, Neural Networks, Activation Functions

## README

# SMU_pytorch

A PyTorch implementation of SMU: SMOOTH ACTIVATION FUNCTION FOR DEEP NETWORKS USING SMOOTHING MAXIMUM TECHNIQUE

## The original author's link

https://github.com/iFe1er/SMU_pytorch

## arXiv

https://arxiv.org/abs/2111.04682

## Requirements

- PyTorch 1.7
- Matplotlib
- NumPy

## Tensorflow version of SMU activation

Please check https://github.com/iFe1er/SMU for the TensorFlow 2.x implementation.

## Reference

>@ARTICLE{2021arXiv211104682B,
>       author = {{Biswas}, Koushik and {Kumar}, Sandeep and {Banerjee}, Shilpak and {Pandey}, Ashish Kumar},
>        title = "{SMU: smooth activation function for deep networks using smoothing maximum technique}",
>      journal = {arXiv e-prints},
>     keywords = {Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Computer Vision and Pattern Recognition, Computer Science - Neural and Evolutionary Computing},
>         year = 2021,
>        month = nov,
>          eid = {arXiv:2111.04682},
>        pages = {arXiv:2111.04682},
>archivePrefix = {arXiv},
>       eprint = {2111.04682},
> primaryClass = {cs.LG},
>       adsurl = {https://ui.adsabs.harvard.edu/abs/2021arXiv211104682B},
>      adsnote = {Provided by the SAO/NASA Astrophysics Data System}
>}
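
## SMU in PyTorch (illustrative sketch)

The sketch below is not the repository's SMU.py verbatim; it is a minimal illustration of the paper's erf-based smooth-maximum formula, SMU(x) = ((1+α)x + (1-α)x·erf(μ(1-α)x)) / 2, with α fixed and μ trainable, plus a small Matplotlib plot in the spirit of the plotting this repo adds. The `mu_init` value used for plotting is an illustrative choice, not necessarily the repository's or paper's training setting.

```python
import torch
from torch import nn
import matplotlib.pyplot as plt


class SMU(nn.Module):
    """Smooth Maximum Unit: a smooth approximation of max(x, alpha*x) via erf."""

    def __init__(self, alpha: float = 0.25, mu_init: float = 1e6):
        super().__init__()
        self.alpha = alpha                              # fixed slope for the negative branch
        self.mu = nn.Parameter(torch.tensor(mu_init))   # trainable smoothing parameter

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # SMU(x) = ((1+alpha)*x + (1-alpha)*x * erf(mu*(1-alpha)*x)) / 2
        return ((1 + self.alpha) * x
                + (1 - self.alpha) * x * torch.erf(self.mu * (1 - self.alpha) * x)) / 2


if __name__ == "__main__":
    # Plot SMU over [-5, 5]; a small mu is used here so the smoothing is visible.
    x = torch.linspace(-5, 5, 500)
    smu = SMU(alpha=0.25, mu_init=2.5)
    with torch.no_grad():
        y = smu(x)
    plt.plot(x.numpy(), y.numpy(), label="SMU (alpha=0.25, mu=2.5)")
    plt.axhline(0, color="gray", linewidth=0.5)
    plt.axvline(0, color="gray", linewidth=0.5)
    plt.title("SMU activation")
    plt.legend()
    plt.show()
```

For the exact implementation used in this project, see SMU.py in the repository and the original author's version linked above.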