I’m trying to set up a PDE in Python. I’m new to this and wondering where I have gone wrong; I would appreciate some help. I understand that I have a Python object that numpy is trying to cast to a float64, but is there any way around this?
Here is my error:
---------------------------------------------------------------------------
UFuncTypeError                            Traceback (most recent call last)
<ipython-input-10-0b4b2c2546dc> in <module>()
     33 I1 = np.trace(C)
     34
---> 35 J = np.linalg.det(F)
     36
     37 D = np.linalg.inv(C)

<__array_function__ internals> in det(*args, **kwargs)

/usr/local/lib/python3.7/dist-packages/numpy/linalg/linalg.py in det(a)
   2156     t, result_t = _commonType(a)
   2157     signature = 'D->D' if isComplexType(t) else 'd->d'
-> 2158     r = _umath_linalg.det(a, signature=signature)
   2159     r = r.astype(result_t, copy=False)
   2160     return r

UFuncTypeError: Cannot cast ufunc 'det' input from dtype('O') to dtype('float64') with casting rule 'same_kind'
Here is my code:
import numpy as np
from sympy import Symbol, Function, Number

# coordinates
x, y = Symbol('x'), Symbol('y')
normal_x, normal_y = Symbol('normal_x'), Symbol('normal_y')

# time
t = Symbol('t')

# make input variables
input_variables = {'x': x, 'y': y}

# A 1D array
u = np.array([10, 20, 30])
v = np.array([10, 20, 30])

u = Function('u')(*input_variables)
v = Function('v')(*input_variables)

Exx = u.diff(x)
Eyy = v.diff(y)
Exy = 0.5 * (u.diff(x) + v.diff(y))

I = np.identity(2)
grad_u = np.array([[Exx, Exy], [Exy, Eyy]])

F = np.add(grad_u, I)
F_t = np.transpose(F)
C = np.matmul(F_t, F)
I1 = np.trace(C)

J = np.linalg.det(F)

D = np.linalg.inv(C)

con1 = (J**(-2/3))*mu
con2 = (K/2)*(J**2 - 1)

S = con1*(I - np.matmul((I1/3), D)) + con2*D
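For reference, here is a minimal sketch of what I think triggers the error (just a single sympy symbol inside a numpy array, not my real matrices):

import numpy as np
from sympy import Symbol

x = Symbol('x')

# numpy stores sympy expressions as generic Python objects, hence dtype('O')
A = np.array([[x, 0], [0, x]])
print(A.dtype)  # object

# det only supports numeric dtypes, so this raises the same UFuncTypeError
np.linalg.det(A)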
Answer
A symbolic calculation like this should be done with sympy rather than numpy. Wrapping sympy expressions in numpy arrays produces arrays of dtype('O') (generic Python objects), which numpy's linear-algebra routines such as det and inv cannot handle. There is no good reason to use numpy for any part of what you are doing here, so it is best to avoid it altogether until you understand what the different libraries are for:
from sympy import Symbol, Function, Number, S, eye, Matrix, factor

# coordinates
x, y = Symbol('x'), Symbol('y')
normal_x, normal_y = Symbol('normal_x'), Symbol('normal_y')

# time
t = Symbol('t')

# make input variables
input_variables = {'x': x, 'y': y}

# A 1D array
u = [10, 20, 30]
v = [10, 20, 30]

u = Function('u')(*input_variables)
v = Function('v')(*input_variables)

Exx = u.diff(x)
Eyy = v.diff(y)
Exy = (u.diff(x) + v.diff(y)) / 2

I = eye(2)
grad_u = Matrix([[Exx, Exy], [Exy, Eyy]])

F = grad_u + I
F_t = F.T
C = F_t @ F
I1 = C.trace()

J = F.det()

# invert C via the adjugate and determinant; factor keeps the entries compact
D = C.adjugate() / C.det()
D = D.applyfunc(factor)

# I guess you need to define some more symbols here:
#
#con1 = (J**(-S(2)/3))*mu
#con2 = (K/2)*(J**2 - 1)
#
#S = con1*(I - np.matmul((I1/3), D)) + con2*D
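The adjugate/determinant route for D is mathematically the same as C.inv(), and it tends to keep the symbolic entries more compact when the matrix contains unevaluated derivatives. If you eventually need numbers out of this (which is what np.linalg.det was being asked to produce), you can substitute concrete expressions for u and v and compile the result into a numpy-callable function with lambdify. A minimal sketch, where the displacement fields below are made-up examples rather than anything from your problem:

from sympy import lambdify, sin, cos

# hypothetical displacement fields, purely for illustration
u_expr = sin(x) * y / 10
v_expr = cos(y) * x / 10

# substitute, evaluate the resulting derivatives, then compile for numpy
J_num = lambdify((x, y), J.subs({u: u_expr, v: v_expr}).doit(), 'numpy')
print(J_num(0.5, 0.5))  # a plain float; numpy arrays also work as inputs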