Why is Perl so afraid of “deep recursion”?


I recently stumbled across the book Higher-Order Perl, which basically suggests ways to do things in Perl in a functional way. The author explains that Perl has 6 o…

4 Answers
  • 2021-01-07 17:03

Because Haskell has laziness, guarded recursion, and tail call optimization, and Perl has none of these.

This matters because every function call allocates a chunk of memory, a stack frame, that is not reclaimed until the function returns. When you write deeply recursive code, these nested frames pile up and can eventually overflow the stack. Tail call optimization (TCO) allows frames that are no longer needed to be discarded or reused.

Without TCO it's not really a good idea to rely on recursion. For example, if you wrote a recursive version of map, it would crash on any decently sized list. In Haskell, guarded recursion means that on large lists the recursive solution can even be faster than a loop-like function.
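To make that concrete, here is a sketch of such a naive recursive map in Perl (the name recursive_map and the tiny input are mine, purely for illustration); every pending call holds on to its stack frame until the whole list has been consumed, so the recursion depth equals the list length:

```perl
use strict;
use warnings;

# A naive recursive map: one stack frame per list element,
# so the recursion depth grows linearly with the input.
sub recursive_map {
    my ($f, @list) = @_;
    return () unless @list;
    my $head = shift @list;
    return ($f->($head), recursive_map($f, @list));
}

my @squares = recursive_map(sub { $_[0] ** 2 }, 1 .. 5);
print "@squares\n";    # 1 4 9 16 25
```

Run it on a list of 100 or more elements with warnings enabled and Perl emits its "Deep recursion" warning; a sufficiently large list would eventually exhaust memory.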

TL;DR: Haskell's implementations are designed to handle a functional/recursive style, Perl's implementation is not, and back in the good old days 100 levels of function calls was a sane limit.

  • 2021-01-07 17:06

The default limit is too low, but it was appropriate for the smaller machines Perl originally ran on. Now 100 is laughable if you are doing serious recursive work, but as you say the warning can be tuned. I assume Haskell has some other way of catching infinite recursion?

  • 2021-01-07 17:09

    The “deep recursion” warning is optional, and an indicator that something may have gone wrong: most of the time, a function calling itself over and over again isn't intended (Perl is a multi-paradigm language, and many people don't use functional idioms). And even when consciously employing recursion, it is far too easy to forget the base case.

    It's easy to switch the “deep recursion” warning off:

    use warnings;
    no warnings 'recursion';
    
    sub recurse {
      my $n = shift;
      if ($n) {
        recurse($n - 1);
      }
      else {
        print "look, no warnings\n";
      }
    }
    
    recurse(200);
    

    Output:

    look, no warnings
    

It is true that Perl does not perform tail call optimization, because that would mess up the output of caller (which is vital for some things, like pragmas or Carp). If you want to perform a tail call manually, then

    return foo(@args);
    

    becomes

    @_ = @args; # or something like `*_ = \@args`
    goto &foo;
    

    although bad things can happen if you foolishly localized @_.
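Put together, a complete runnable sketch (countdown is a hypothetical example subroutine of my own) looks like this; the goto &NAME form replaces the current frame instead of stacking a new one, so the depth never grows and no "Deep recursion" warning fires:

```perl
use strict;
use warnings;

# Hypothetical countdown written with Perl's goto-&NAME tail call:
# @_ is reassigned and the current frame is replaced, so the stack
# depth stays constant no matter how large $n is.
sub countdown {
    my ($n) = @_;
    return 'done' if $n <= 0;
    @_ = ($n - 1);      # set up the arguments for the tail call
    goto &countdown;    # replace this frame rather than nesting a new one
}

print countdown(100_000), "\n";    # prints "done", far past the 100-call threshold
```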

  • 2021-01-07 17:11

Runaway recursion will crash either a Perl program or a Haskell program once unbounded allocation eventually exhausts available memory. The two languages deal with this potential pitfall in different ways.


    Perl

Perl is a multi-paradigm language but procedural at heart, and deep stack depth in that paradigm can indicate runaway recursion. The perldiag documentation on this diagnostic says as much.

    • Deep recursion on anonymous subroutine
    • Deep recursion on subroutine "%s"
      (W recursion) This subroutine has called itself (directly or indirectly) 100 times more than it has returned. This probably indicates an infinite recursion, unless you're writing strange benchmark programs, in which case it indicates something else.
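The diagnostic is easy to trigger and observe; in the sketch below (the subroutine name burrow and the $SIG{__WARN__} hook are mine, used only to capture the message), a 150-level recursion crosses the 100-frame threshold:

```perl
use strict;
use warnings;

# Collect warnings instead of printing them, so we can inspect the text.
my @warnings;
$SIG{__WARN__} = sub { push @warnings, $_[0] };

# Recurses 150 levels deep, past the 100-frame warning threshold.
sub burrow {
    my ($n) = @_;
    return 0 if $n <= 0;
    return 1 + burrow($n - 1);
}

burrow(150);
print $warnings[0];    # Deep recursion on subroutine "main::burrow" at ...
```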

    It is merely a runtime warning, so if you believe it to be spurious, disable it with a pragma.

    sub known_deep_recursion {
      no warnings 'recursion';
    
      ...;
    }
    

A more drastic, and rarely necessary, way to change the threshold itself is described in the same section of the Perl documentation.

    This threshold can be changed from 100, by recompiling the perl binary, setting the C pre-processor macro PERL_SUB_DEPTH_WARN to the desired value.


    Haskell

    Haskell is a purely functional language, where recursion is essential. Writing Haskell tail-recursively, as in

    A recursive function is tail recursive if the final result of the recursive call is the final result of the function itself. If the result of the recursive call must be further processed (say, by adding 1 to it, or consing another element onto the beginning of it), it is not tail recursive.

takes advantage of compiler support for lazy evaluation and guarded recursion, which means well-written Haskell code can be both concise and fast.

    But Haskell is not immune to runaway allocations related to recursion. Poorly written Haskell code will have stack problems also, as noted on the Stack overflow page on HaskellWiki.

    There is no call stack in Haskell. Instead we find a pattern matching stack whose entries are essentially case expressions waiting for their scrutinee to be evaluated enough that they can match a constructor (WHNF).

    When GHC is evaluating a thunked [that is, unevaluated] expression it uses an internal stack. This inner stack for thunk evaluation is the one that can overflow in practice.

    Haskell does not have a call stack like procedural languages, so these crashes are referred to as space leaks.

    Haskell programs will sometimes consume a lot more memory than necessary, and this is often due to too much, or too little, laziness.

    For example, the simple Haskell program below

    mysum :: [Integer] -> Integer
    mysum = foldr (+) 0
    
    main = print (mysum [1..10000000000])
    

when run on my machine (if it does not fail on yours, increase the upper bound until it does) results in

    mysum: Out of memory

    Changing the program to

    import Data.List (foldl')
    
    mysum :: [Integer] -> Integer
    mysum = foldl' (+) 0
     
    main = print (mysum [1..10000000000])
    

    takes a while to run but eventually terminates with the correct result. Reasoning about a Haskell program’s space usage can be tricky for beginners. Rooting out space leaks using the profiler is an advanced topic.


    Summary

    Both are excellent languages, especially in their respective niches. We programmers sometimes make mistakes. Resource limitations and the halting problem will always be out there conspiring against us. Programming languages deal with these limitations in their own way.
