I have two sorted arrays and I am trying to find the median of the two sorted arrays. For example, if the input is nums1 = [1,3], nums2 = [2], then the output will be median=2.00000, and if the input is p = [1,2], t = [3,4], then the output will be median=2.50000. I have merged both arrays together, sorted the result, and then used its length to try to calculate the correct value. Below is my code:
```python
class Solution(object):
    def findMedianSortedArrays(self, nums1, nums2):
        nums1.extend(nums2)
        nums1.sort()

        if len(nums1) % 2 == 0:
            a = len(nums1)/2
            return float(nums1[a]+nums1[a-1])/float(2)
        else:
            a = len(nums1) / 2
            return float(nums1[a])

if __name__ == "__main__":
    p = [1,3]
    t = [2]
    print(Solution().findMedianSortedArrays(p,t))
```
Below is the error in the logs.
```
    return float(nums1[a])
TypeError: list indices must be integers or slices, not float
```
Answer
Division of integers with `/` yields a float in Python 3, but you could trivially check this by printing out `a` just before using it as an index. And the error is extremely clear that `a` is a float at that point.
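For instance, a minimal check along those lines (the list contents here are made up for illustration):

```python
nums1 = [1, 2, 3]
a = len(nums1) / 2
print(a, type(a))  # 1.5 <class 'float'>: true division returns a float in Python 3
```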
Either coerce it to int, as

```python
nums1[int(a)]
```
or use floor division, with

```python
a = len(nums1) // 2
```
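Applying the floor-division fix to the function from the question, a corrected sketch might look like this (the merge-and-sort approach is otherwise unchanged; only the float-index bug is fixed):

```python
class Solution(object):
    def findMedianSortedArrays(self, nums1, nums2):
        # Merge and sort; a is the index of the middle element.
        nums1.extend(nums2)
        nums1.sort()
        a = len(nums1) // 2  # floor division keeps a an int

        if len(nums1) % 2 == 0:
            # Even length: average the two middle elements.
            return (nums1[a] + nums1[a - 1]) / 2.0
        else:
            # Odd length: the middle element is the median.
            return float(nums1[a])

if __name__ == "__main__":
    print(Solution().findMedianSortedArrays([1, 3], [2]))     # 2.0
    print(Solution().findMedianSortedArrays([1, 2], [3, 4]))  # 2.5
```

This keeps the O((m+n) log(m+n)) concatenate-and-sort strategy from the question rather than the O(log(m+n)) partition approach the problem is usually posed with.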