First, here is my code:
Python

class matrix:
    def __init__(self, m, n):
        self._m, self._n = m, n

        L = []  # We first define our pre-matrix as an empty list
        for i in range(m):
            L.append(0)

        # Replace each placeholder with a row of n zeros
        for j in range(m):
            L[j] = [0] * n

        self._matrix = L

    def __setitem__(self, c, value):
        # c is a 1-based (row, column) pair
        self._matrix[c[0] - 1][c[1] - 1] = value

    def __str__(self):
        # One row per line, entries separated by spaces
        l = ""
        for i in range(self._m):
            l = l + " "
            for j in range(self._n):
                l = l + str(self._matrix[i][j]) + " "
            l = l + " \n"
        return l

    def __add__(self, other):
        result = [
            [self._matrix[i][j] + other._matrix[i][j] for j in range(len(self._matrix[0]))]
            for i in range(len(self._matrix))
        ]
        return result
When adding two (non-zero) matrices, I cannot get the result to print nicely the way my __str__ method would. Instead of having
a b c
d e f
I get the usual,
[[a, b, c], [d, e, f]]
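For example, a quick reproduction using the class above (the 2×3 matrices A and B are just an illustration):

Python

A = matrix(2, 3)
B = matrix(2, 3)
A[1, 1] = 1          # __setitem__ takes 1-based (row, column) indices
B[1, 2] = 2
print(A)             # prints the formatted grid via __str__
print(A + B)         # prints the nested list [[1, 2, 0], [0, 0, 0]]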
Does anyone have an idea on how to fix the issue?
Answer
The return type of the __add__ method is a list. It should be a matrix: a plain list is printed with the built-in list representation, so your __str__ is never used. Build a matrix object and store the summed rows in its _matrix instead.
Python

def __add__(self, other):
    result = matrix(self._m, self._n)
    result._matrix = [
        [self._matrix[i][j] + other._matrix[i][j] for j in range(len(self._matrix[0]))]
        for i in range(len(self._matrix))
    ]
    return result
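With that change the sum is a matrix again, so print goes through __str__. A quick check (the sample values below are my own):

Python

A = matrix(2, 3)
B = matrix(2, 3)
A[1, 1] = 1
B[1, 2] = 2
print(A + B)
# Output:
#  1 2 0
#  0 0 0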