Is a Multilayer Perceptron a recursive function?












I read somewhere that a Multilayer Perceptron is a recursive function in its forward propagation phase. I am not sure about this: what is the recursive part? I would see an MLP as a chained (composed) function. So it would be nice if anyone could relate an MLP to a recursive function.





























  • Probably wrong... although an RNN may be perceived as recursive by some, and an NN may be designed as a recursive function, I am not sure anyone does that.
    – DuttaA
    Aug 15 at 16:05











  • A multilayer perceptron isn't strictly a function. It is a conceptual design for a circuit simulation with a mathematical convention of how to train it that often works. The use of the term implies layers of activation functions attenuated with a matrix of parameters at the input of each layer. It is tuned by propagating a correction signal derived from an error function's gradient backward through the simulation. It can be implemented by algorithms that are recursive or iterate using loops instead, and the implementation can be called as a function.
    – Douglas Daseeco
    Aug 15 at 20:33










  • So the multilayer perceptron concept (sometimes called an ANN) isn't technically a recursive function, but its implementation may involve recursion, or it may perform repetitious tasks using iteration in a loop instead. And the entire implementation is usually called either as a function or as a method of an object-oriented class instance.
    – Douglas Daseeco
    Aug 15 at 20:35










  • Are you sure it was only talking about the forward pass? The full training process could be modeled as a recursive process (although it usually isn't), but the forward pass by itself, not so much.
    – Ray
    Aug 15 at 21:08










  • Every iterative computing task can be realized as a recursive function (preferably tail recursive, to avoid unbounded stack allocation) or using a loop with an appropriate iteration technique. That is why functional programming languages and collections libraries have iterators. It was found that the idea of tail recursion was only comprehensible to a small subset of those who program, so LISPers use recursion more frequently and with greater efficiency than Java programmers, who tend to use the Iterator interface or its subinterfaces.
    – FauChristian
    Aug 15 at 21:24














asked Aug 15 at 15:52 by user3352632; edited Aug 15 at 20:29 by DukeZhou♦




2 Answers






























Inherently, no. The MLP is just a data structure. It represents a function, but a standard MLP is just representing an input-output mapping, and there's no recursive structure to it.



On the other hand, possibly your source is referring to the common algorithms that operate over MLPs, specifically forward propagation for prediction and back propagation for training. Both of these algorithms are easy to think about recursively, with each node performing a sort of recursive call with its children or parents as the target, and some useful information about activations or errors attached. I actually encourage my students to implement it recursively for this reason, even though it's probably not the most efficient solution.
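To make the recursive view concrete, here is a minimal sketch of a recursive forward pass (not the answerer's actual code; the `tanh` activation, weight shapes, and random values are illustrative assumptions):

```python
import numpy as np

def forward(layers, x):
    """Recursive forward pass: the network's output is obtained by
    applying the first layer, then recursing on the remaining layers."""
    if not layers:              # base case: no layers left, return input as-is
        return x
    W, b = layers[0]
    h = np.tanh(W @ x + b)      # one layer: affine map plus activation
    return forward(layers[1:], h)

# a tiny 2-3-1 network with arbitrary weights, purely for illustration
rng = np.random.default_rng(0)
layers = [(rng.standard_normal((3, 2)), np.zeros(3)),
          (rng.standard_normal((1, 3)), np.zeros(1))]
y = forward(layers, np.array([0.5, -0.2]))
```

The recursion mirrors the layer structure directly, which is why it can be pedagogically appealing even though an iterative loop over layers is the more common (and usually more efficient) implementation.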






answered Aug 15 at 17:27 by John Doucette
    Sure, you can define plenty of things we don't generally regard as recursive in a recursive form. An MLP is just a series of functions applied to its input. This can be loosely formulated as



    $$ o_n = f(o_{n-1}) $$



    where $o_n$ is the output of layer $n$.



    But this clearly doesn't reveal much, does it?
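    The recurrence above is just repeated function composition. A minimal sketch (the `layer` function and its 0.5 scaling are illustrative stand-ins for a real layer's transformation $f$):

```python
def layer(o):
    # stand-in for one layer's transformation f
    return [v * 0.5 for v in o]

def mlp(o, n):
    # o_n = f(o_{n-1}), which unwinds to f(f(...f(o_0)...))
    return o if n == 0 else layer(mlp(o, n - 1))

out = mlp([1.0, 2.0], 3)   # three applications of layer
```

    Unrolling the recursion reproduces the chained-function view from the question: the two formulations describe the same computation.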






answered Aug 15 at 19:28 by Daniel
