A new network with super-approximation power is introduced. This network is built with either Floor (⌊x⌋) or ReLU (max{0, x}) as the activation function in each neuron; hence, we call such networks Floor-ReLU networks. For any hyperparameters N ∈ ℕ⁺ and L ∈ ℕ⁺, we show that Floor-ReLU networks with width max{d, 5N + 13} and depth 64dL + 3 can uniformly approximate a Hölder function f on [0, 1]^d with an approximation error 3λ d^(α/2) N^(−αL), where α ∈ (0, 1] and λ are the Hölder order and constant, respectively. More generally for
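As a rough illustration of the stated bound (not part of the paper), the sketch below evaluates the error 3λ d^(α/2) N^(−αL) for a few depth parameters L, and implements a single hidden layer whose neurons may use either the Floor or the ReLU activation; the function and variable names here are hypothetical.

```python
import numpy as np

def error_bound(N, L, d, lam=1.0, alpha=1.0):
    """Approximation error 3*lam*d^(alpha/2)*N^(-alpha*L) from the theorem.

    N, L are the width/depth hyperparameters, d the input dimension,
    lam and alpha the Hoelder constant and order.
    """
    return 3.0 * lam * d ** (alpha / 2) * N ** (-alpha * L)

def floor_relu_layer(x, W, b, use_floor):
    """One layer where each neuron applies Floor or ReLU, per the mask use_floor."""
    z = W @ x + b
    return np.where(use_floor, np.floor(z), np.maximum(z, 0.0))

# For fixed N, the bound decays exponentially in L ("super-approximation").
for L in (1, 2, 3, 4):
    print(f"L={L}: error bound = {error_bound(N=2, L=L, d=1):.6f}")
```

Note that for fixed width parameter N ≥ 2 the bound N^(−αL) already decays exponentially in the depth parameter L, which is the sense in which these networks have super-approximation power compared with the polynomial-in-(N, L) rates typical of pure-ReLU networks.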