Is a Multilayer Perceptron a recursive function?
I read somewhere that a Multilayer Perceptron is a recursive function in its forward propagation phase. I am not sure what the recursive part is. To me, an MLP looks like a chained (composed) function, so it would be nice if someone could relate an MLP to a recursive function.
neural-networks concepts perceptron
Probably wrong... Although an RNN may be perceived as recursive by some... And a NN may be designed as a recursive function, but I am not sure anyone does that.
– DuttaA
Aug 15 at 16:05
A multilayer perceptron isn't strictly a function. It is a conceptual design for a circuit simulation with a mathematical convention of how to train it that often works. The use of the term implies layers of activation functions attenuated with a matrix of parameters at the input of each layer. It is tuned by propagating a correction signal derived from an error function's gradient backward through the simulation. It can be implemented by algorithms that are recursive or iterate using loops instead, and the implementation can be called as a function.
– Douglas Daseeco
Aug 15 at 20:33
So the multilayer perceptron concept (sometimes called an ANN) isn't technically a recursive function, but its implementation may involve recursion or may perform repetitious tasks using iteration in a loop instead. And the entire implementation is usually called either as a function or as a method of an object-oriented class instance.
– Douglas Daseeco
Aug 15 at 20:35
Are you sure it was only talking about the forward pass? The full training process could be modeled as a recursive process (although it usually isn't), but the forward pass by itself, not so much.
– Ray
Aug 15 at 21:08
Every iterative computing task can be realized as a recursive function (preferably tail recursive, so stack frames can be reused automatically) or as a loop with an appropriate iteration technique. That is why functional programming languages and collections libraries have iterators. It was found that the idea of tail recursion was only comprehensible to a small subset of those who program, so LISPers use recursion more frequently and with greater efficiency than Java programmers, who tend to use the Iterator interface or its sub-interfaces.
– FauChristian
Aug 15 at 21:24
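As an illustrative aside (not part of the comment), the loop/recursion equivalence mentioned here can be sketched in Python. Both functions below compute the same sum; a language with tail-call elimination can run the recursive form in constant stack space (CPython, for the record, does not eliminate tail calls).

```python
def total_loop(xs):
    # Iterative form: an explicit accumulator updated in a loop.
    acc = 0
    for x in xs:
        acc += x
    return acc

def total_tail(xs, acc=0):
    # Tail-recursive form: the accumulator is threaded through the
    # recursive call, which is the last thing the function does.
    if not xs:
        return acc
    return total_tail(xs[1:], acc + xs[0])

print(total_loop([1, 2, 3, 4]))  # 10
print(total_tail([1, 2, 3, 4]))  # 10
```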
edited Aug 15 at 20:29 by DukeZhou♦
asked Aug 15 at 15:52 by user3352632
2 Answers
Inherently, no. The MLP is just a data structure. It represents a function, but a standard MLP is just representing an input-output mapping, and there's no recursive structure to it.
On the other hand, possibly your source is referring to the common algorithms that operate over MLPs, specifically forward propagation for prediction and back propagation for training. Both of these algorithms are easy to think about recursively, with each node performing a sort of recursive call with its children or parents as the target, and some useful information about activations or errors attached. I actually encourage my students to implement it recursively for this reason, even though it's probably not the most efficient solution.
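To make the recursive view concrete, here is a minimal sketch (not from the answer itself) of a forward pass written as a recursive function in plain Python: the output of an $n$-layer network is the last layer applied to the recursively computed output of the first $n-1$ layers. The tanh activation and the list-of-lists weight layout are illustrative assumptions.

```python
import math

def dot(w_row, v):
    return sum(wi * vi for wi, vi in zip(w_row, v))

def layer(W, b, v):
    # One dense layer with a tanh activation (an illustrative choice).
    return [math.tanh(dot(row, v) + bi) for row, bi in zip(W, b)]

def forward(x, Ws, bs):
    # Base case: with no layers left, the "output" is just the input.
    if not Ws:
        return x
    # Recursive case: apply the last layer to the recursively
    # computed output of all earlier layers.
    return layer(Ws[-1], bs[-1], forward(x, Ws[:-1], bs[:-1]))

# Toy weights: a 2->2 layer followed by a 2->1 layer (made-up numbers).
Ws = [[[0.5, -0.2], [0.1, 0.3]],
      [[1.0, 1.0]]]
bs = [[0.0, 0.0], [0.1]]
out = forward([1.0, 2.0], Ws, bs)
print(out)
```

A production implementation would of course vectorize this with matrix operations and a loop, but the recursion makes the "output of layer $n$ depends on the output of layer $n-1$" structure explicit.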
answered Aug 15 at 17:27 by John Doucette
Sure, you can define plenty of things that we don't generally regard as recursive in this way. An MLP is just a series of functions applied to its input. This can be loosely formulated as
$$o_n = f(o_{n-1})$$
where $o_n$ is the output of layer $n$.
But this clearly doesn't reveal much, does it?
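The recurrence can be written directly in Python, e.g. with `functools.reduce`; the toy per-layer functions below are illustrative stand-ins for real layers, not from the answer.

```python
from functools import reduce

# Toy "layers": each is a function taking the previous output o_{n-1}
# and producing o_n. Real layers would be affine maps plus activations.
layers = [lambda v: 2 * v, lambda v: v + 1, lambda v: v ** 2]

def apply_mlp(x, layers):
    # o_n = f_n(o_{n-1}), starting from o_0 = x.
    return reduce(lambda o, f: f(o), layers, x)

print(apply_mlp(3, layers))  # ((3*2)+1)**2 = 49
```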
answered Aug 15 at 19:28 by Daniel