Optimizing runtime in recursive function

Memoization! Another fancy technical term I encountered while climbing my way up the famous Mount Algorithm. It helps improve the runtime and time complexity of recursive functions. Once you get the hang of it, your recursions' runtime will shrink tremendously; you can thank me later. Before we dive into the implementation of memoization, let's touch base on some other terms just to make sure we're on the same page.

Recursion

The process in which a function calls itself, directly or indirectly, is called recursion, and the corresponding function is called a recursive function. Certain problems can be solved quite easily using a recursive algorithm.

A function like this will continue to place itself on the call stack until it hits the mechanism that returns a value without making a further recursive call: the base case. Take this function for example,

NOTE: all the code snippets are written in JavaScript. I know, don't yell at me.
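The original snippet didn't make it into this page, so here's a minimal sketch of what such a function could look like, assuming a simple countdown (the name countDown is mine):

```javascript
// Counts down from n, placing one call on the stack per step,
// until the base case (n === 0) returns without recursing further.
function countDown(n) {
  if (n === 0) return 0; // base case: no further recursive call
  console.log(n);
  return countDown(n - 1); // recursive call with a smaller input
}

countDown(3); // logs 3, 2, 1
```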

It's a rather useless function, but the point is that it will keep calling itself recursively until the base case is met, which is when the input integer equals 0. The call stack grows in a linear fashion according to its input.

Now that we've got that out of the way, let's look at the classic solution for finding the number in the Fibonacci sequence at a given position. The Fibonacci sequence starts with 0 and 1, and each subsequent number is the sum of the two preceding ones.

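The snippet is missing here, but the classic solution reads almost exactly like the definition; a sketch (function name assumed):

```javascript
// Naive recursive Fibonacci: fib(n) = fib(n - 1) + fib(n - 2).
function fib(n) {
  if (n < 2) return n; // base cases: fib(0) = 0, fib(1) = 1
  return fib(n - 1) + fib(n - 2);
}

fib(10); // 55
```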
The solution is rather simple: the function recursively calls itself with the input minus one and minus two, returning the sum of the two calls to its parent. However, when we need to figure out a number later in the sequence, such as the 50th, we have a problem. Because every parent calls the same function twice as the numbers decrement down to the base case, the call stack grows exponentially. The first level generates two child calls, those two children generate four more, then eight more, sixteen more, and so on. The Big O of this solution is a solid O(2^n), which is not ideal. So how could we make this faster and less expensive? Memoization to the rescue!

Memoization

In computing, memoization or memoisation is an optimization technique used primarily to speed up computer programs by storing the results of expensive function calls and returning the cached result when the same inputs occur again.

Some of you might notice this sounds very similar to another concept: caching. Both rely on keeping the result of an expensive process in a quick-access store for repeated requests. When you visit your favorite website a second time and it loads significantly faster than the first time you were there, it's not because your internet speed got a free upgrade; your browser has cached most of the major elements of the page locally rather than downloading everything from the web again.

So how do we implement it in recursive functions? Let's first take a look at the call stack of the same Fibonacci solution as a tree graph.

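The original graph image is gone, but the call tree for f(6) looks roughly like this sketch (lower levels trimmed):

```
                 f(6)
              /        \
          f(5)          f(4)
         /    \        /    \
      f(4)    f(3)   f(3)   f(2)
      /  \    /  \    ...
   f(3) f(2) f(2) f(1)
    ...
```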
Each node represents a function call, and the leaves are where the function hits a base case and returns a value. If you look at the graph carefully, it's not hard to notice multiple instances where the function is called with the same input, such as f(3) and f(4). Let's cache those results in something we can quickly access for repeated calls.

Speaking of quick access, nothing beats a key-value pair lookup. In JavaScript we have the Object data type (Dictionary in Python, Hash in Ruby), where we can store the input as a key that points to the result for that input as its value. Let's take a look at the revised version of the Fibonacci solution.

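The revised snippet is also missing; a sketch of what it plausibly looked like, with the memo argument named as in the text below and the rest assumed:

```javascript
// Memoized Fibonacci: results are cached in the memo object, keyed by input.
function fib(n, memo = {}) {
  if (n in memo) return memo[n]; // early return for a repeated input
  if (n < 2) return n;           // base cases
  memo[n] = fib(n - 1, memo) + fib(n - 2, memo); // recursive calls, cached
  return memo[n];
}

fib(50); // 12586269025, returns almost instantly
```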
Here we injected another argument named memo into the function and defaulted its value to an empty object; the child function calls inherit the memo object and populate it with their returned results. On the first line we get an early return if we find the key n in the memo, without running the rest of the code; otherwise we make the recursive calls on line 4, keep the result in the memo object, and return that same result at the end. This way we're able to reconstruct the call stack tree into something like this,

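Again the image is missing, but roughly, the tree collapses into a single spine, with repeated inputs served straight from the memo:

```
f(6)
├─ f(5)
│  ├─ f(4)
│  │  ├─ f(3)
│  │  │  ├─ f(2) ... (base cases)
│  │  │  └─ f(1)
│  │  └─ f(2)  → memo
│  └─ f(3)     → memo
└─ f(4)        → memo
```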
Now a repeated input doesn't invoke more recursive calls for its result but accesses the memo object instead, which turns the time complexity from O(2^n) to O(n).

Conclusion

Memoization is a technique of caching function results within the function's scope so the function remembers the return value for a given input. We can memoize any pure function, but it's most beneficial when applied to functions that do heavy computation through recursive calls.

Pan Li
Software Engineer, React.js || Javascript || RoR