So I am trying to implement the longest common subsequence (LCS) in Python, and was trying this alternative to my previous solution: using a dictionary instead of a 2-D matrix to memoize the results.
```python
def lcs(s1, s2):
    cache = {}
    if len(s1) == 0 or len(s2) == 0:
        return 0
    if (s1, s2) in cache:
        return cache[s1, s2]
    else:
        if s1[-1] == s2[-1]:
            cache[s1, s2] = 1 + lcs(s1[:-1], s2[:-1])
        else:
            cache[s1, s2] = max(lcs(s1[:-1], s2), lcs(s1, s2[:-1]))
    print(cache)
```
It’s returning

```
TypeError: unsupported operand type(s) for +: 'int' and 'NoneType'
```
which I understand is because I am not returning anything. So how can I do something like this?
```python
return cache[s1, s2] = 1 + lcs(s1[:-1], s2[:-1])
```
And I am trying to implement it without using any decorators.
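(A quick note on why that one-liner can't work: in Python, assignment is a statement, not an expression, so it cannot appear after `return`. The walrus operator `:=` does not help here either, because its target must be a plain name, not a subscript like `cache[s1, s2]`. The usual pattern is to assign first and return the stored value; a minimal sketch with a hypothetical helper:)

```python
def memo_store(cache, key, value):
    """Hypothetical helper: store value under key, then return it.

    This is the two-step equivalent of the invalid
    `return cache[key] = value` one-liner.
    """
    cache[key] = value
    return cache[key]
```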
Answer
Try this:
```python
cache = {}

def lcs(s1, s2):
    global cache
    if len(s1) == 0 or len(s2) == 0:
        return 0
    if (s1, s2) in cache:
        return cache[(s1, s2)]
    else:
        if s1[-1] == s2[-1]:
            cache[(s1, s2)] = 1 + lcs(s1[:-1], s2[:-1])
        else:
            cache[(s1, s2)] = max(lcs(s1[:-1], s2), lcs(s1, s2[:-1]))
    return cache[(s1, s2)]
```
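Note that this also moves `cache` out of the function body: in the question's version a fresh dictionary was created on every recursive call, so the memo could never hit. If the module-level `cache` feels intrusive, one alternative (a sketch, not the code above) is to thread the dictionary through as a parameter, which keeps the memo per-call and still avoids decorators:

```python
def lcs(s1, s2, cache=None):
    # Create a fresh cache on the outermost call; recursive calls share it.
    if cache is None:
        cache = {}
    if len(s1) == 0 or len(s2) == 0:
        return 0
    if (s1, s2) not in cache:
        if s1[-1] == s2[-1]:
            cache[s1, s2] = 1 + lcs(s1[:-1], s2[:-1], cache)
        else:
            cache[s1, s2] = max(lcs(s1[:-1], s2, cache),
                                lcs(s1, s2[:-1], cache))
    # Assign-then-return: the pattern the question was after.
    return cache[s1, s2]
```

For a sanity check, the classic pair `"ABCBDAB"` / `"BDCABA"` has an LCS of length 4 (e.g. `"BCBA"`).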