Comparison Study and Analysis of Implementing Activation Function of Machine Learning in MATLAB and FPGA
Mallika Roy; Jishnu Nath Paul; Josita Sengupta; Swagata Bhattacharya
Publication Date:
2024/06/27
Abstract:
This study presents the implementation and
comparative analysis of the sigmoid, approximate sigmoid,
and hard sigmoid activation functions on FPGA using
Verilog HDL and the Xilinx ISE simulator, and investigates
key performance parameters including device utilization,
clock load, and timing characteristics. The findings
suggest that the sigmoid function provides greater accuracy
at the expense of higher hardware resource usage. The
approximate sigmoid strikes a balance between accuracy and
efficiency, whereas the hard sigmoid is more efficient but
less precise. Comparison with MATLAB results showed the
effect of fixed-point computation and limited word length,
where finer quantization resulted in improved
accuracy. This study highlights the trade-offs involved in
FPGA-based neural network implementations, with an emphasis
on fixed-point representation. It also suggests future research on
reducing representation width and developing more efficient
activation-function algorithms.
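
To make the hardware trade-off concrete, the sketch below shows one hardware-friendly form of the hard sigmoid, y = clip(0.25*x + 0.5, 0, 1), written in Verilog HDL. The Q4.12 fixed-point format, the 0.25 slope, and the module and port names are illustrative assumptions and are not taken from the paper's implementation.

// Illustrative sketch (assumptions, not the paper's design): a combinational
// hard-sigmoid unit, y = clip(0.25*x + 0.5, 0, 1), with signed Q4.12 input
// and unsigned Q1.12 output.
module hard_sigmoid #(
    parameter integer WIDTH = 16,   // total input bits (Q4.12 assumed)
    parameter integer FRAC  = 12    // fractional bits
) (
    input  wire signed [WIDTH-1:0] x,
    output reg         [FRAC:0]    y    // Q1.12, range 0.0 to 1.0
);
    localparam signed [WIDTH+1:0] ONE  = 1 <<< FRAC;       // 1.0 in fixed point
    localparam signed [WIDTH+1:0] HALF = 1 <<< (FRAC - 1); // 0.5 in fixed point

    // 0.25*x as an arithmetic right shift by 2, then add the 0.5 offset
    wire signed [WIDTH+1:0] lin = (x >>> 2) + HALF;

    always @* begin
        if (lin <= 0)
            y = 0;                        // saturate low at 0.0
        else if (lin >= ONE)
            y = {1'b1, {FRAC{1'b0}}};     // saturate high at 1.0
        else
            y = lin[FRAC:0];              // linear region
    end
endmodule

A full sigmoid, by contrast, would typically need a lookup table or a polynomial/piecewise-linear approximation, which is where the device-utilization versus accuracy trade-off described above arises.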
Keywords:
Activation Function; FPGA; Verilog HDL; Xilinx; MATLAB; Machine Learning.
DOI:
https://doi.org/10.38124/ijisrt/IJISRT24JUN1109
PDF:
https://ijirst.demo4.arinfotech.co/assets/upload/files/IJISRT24JUN1109.pdf