Simple example for Erlang memoization

Happy的楠姐 2021-02-14 07:31

Suppose you have a simple function, which can get quite expensive for large values:

fact(0) -> 1;
fact(N) -> N * fact(N - 1).

Where can I store the previously computed values so that repeated calls can reuse them?

2 Answers
  •  生来不讨喜
    2021-02-14 08:02

    The idea is that every time you ask for your heavy calculation, you immediately check the cache to see whether you have already evaluated it. If so, you simply return the stored value. If not, you evaluate the new value and store it before returning it to the caller.

    A dict, rather than a dets table, could also work.
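    For illustration, here is a sketch of that in-memory idea using a map threaded through the recursion (the map and the module/function names are my own; the original answer only mentions `dict`, and this is untested):

    ```erlang
    %% Memoized factorial with a map as the cache.
    -module(fact_map).
    -export([fact/1]).

    fact(N) ->
        {Result, _Cache} = fact(N, #{0 => 1}),
        Result.

    %% fact/2 returns {Value, UpdatedCache}, so every partial
    %% result computed on the way down is kept for reuse.
    fact(N, Cache) ->
        case maps:find(N, Cache) of
            {ok, V} ->
                {V, Cache};
            error ->
                {Sub, Cache1} = fact(N - 1, Cache),
                V = N * Sub,
                {V, Cache1#{N => V}}
        end.
    ```

    Unlike a dets table, this cache only lives for the duration of one top-level call unless the caller keeps the returned map around.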

    (The following solution is untested)

    -module(cache_fact).
    
    -export([init/0, fact/1]).
    
    init() ->
        {ok, _} = dets:open_file(values, []).
    
    %% Check the cache first; compute and store the result on a miss.
    fact(N) ->
        case dets:lookup(values, N) of
          [] ->
            Result = do_fact(N),
            dets:insert_new(values, {N, Result}),
            Result;
          [{N, Cached}] ->
            Cached
        end.
    
    do_fact(0) ->
        1;
    do_fact(N) ->
        N * do_fact(N-1).
    

    You might want to encapsulate the whole thing in an Erlang generic server. In the init/1 callback you create the DETS table, the fact/1 function is your API, and the lookup/compute logic goes in handle_call/3.
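    A minimal sketch of that gen_server encapsulation (untested; the module name, table name, and helper are illustrative assumptions, not from the original answer):

    ```erlang
    -module(fact_server).
    -behaviour(gen_server).

    -export([start_link/0, fact/1]).
    -export([init/1, handle_call/3, handle_cast/2]).

    start_link() ->
        gen_server:start_link({local, ?MODULE}, ?MODULE, [], []).

    %% API: clients call this; the server process owns the DETS table.
    fact(N) ->
        gen_server:call(?MODULE, {fact, N}).

    init([]) ->
        {ok, Table} = dets:open_file(fact_cache, []),
        {ok, Table}.

    handle_call({fact, N}, _From, Table) ->
        Result =
            case dets:lookup(Table, N) of
                [{N, Cached}] ->
                    Cached;
                [] ->
                    R = do_fact(N),
                    dets:insert_new(Table, {N, R}),
                    R
            end,
        {reply, Result, Table}.

    handle_cast(_Msg, Table) ->
        {noreply, Table}.

    do_fact(0) -> 1;
    do_fact(N) -> N * do_fact(N - 1).
    ```

    A side benefit of this design: all calls are serialized through the server process, so concurrent callers never race on the table.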

    A nicer real-world example would be a URL-shortener service with caching.

    As suggested by @Zed, it would make sense to store the partial results as well to avoid further re-calculations. If this is the case:

    -module(cache_fact).
    
    -export([init/0, fact/1]).
    
    init() ->
        {ok, _} = dets:open_file(values, []).
    
    fact(0) ->
        1;
    fact(N) ->
        case dets:lookup(values, N) of
          [] ->
            Result = N * fact(N-1),
            dets:insert_new(values, {N, Result}),
            Result;
          [{N, Cached}] ->
            Cached
        end.
    

    Even if this helps for large numbers, keep in mind the additional cost of inserting an entry into the lookup table at every step. Given the reason caching was introduced in the first place (we assume the calculation itself is very heavy), the insertion time should be insignificant, so this is perfectly fine.
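    If the per-step insertion cost ever does matter, an in-memory ets table is a common alternative to dets, since it avoids disk I/O on every insert. A hedged sketch under that assumption (not from the answer above):

    ```erlang
    -module(cache_fact_ets).
    -export([init/0, fact/1]).

    %% ets keeps the cache in RAM: no disk I/O per insert,
    %% but the cache is lost when the owning process dies.
    init() ->
        ets:new(fact_values, [named_table, public]).

    fact(0) ->
        1;
    fact(N) ->
        case ets:lookup(fact_values, N) of
            [{N, Cached}] ->
                Cached;
            [] ->
                Result = N * fact(N - 1),
                ets:insert(fact_values, {N, Result}),
                Result
        end.
    ```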
