pyerrors.roots
View Source
````python
import numpy as np
import scipy.optimize
from autograd import jacobian
from .obs import derived_observable


def find_root(d, func, guess=1.0, **kwargs):
    r'''Finds the root of the function func(x, d) where d is an `Obs`.

    Parameters
    ----------
    d : Obs
        Obs passed to the function.
    func : object
        Function whose root is to be found. Any numpy functions have to use the
        autograd.numpy wrapper.
        Example:
        ```python
        import autograd.numpy as anp
        def root_func(x, d):
            return anp.exp(-x ** 2) - d
        ```
    guess : float
        Initial guess for the root search.

    Returns
    -------
    Obs
        `Obs` valued root of the function.
    '''
    root = scipy.optimize.fsolve(func, guess, d.value)

    # Error propagation as detailed in arXiv:1809.01289
    dx = jacobian(func)(root[0], d.value)
    try:
        da = jacobian(lambda u, v: func(v, u))(d.value, root[0])
    except TypeError:
        raise Exception("It is required to use autograd.numpy instead of numpy within root functions, see the documentation for details.") from None
    deriv = - da / dx

    res = derived_observable(lambda x, **kwargs: (x[0] + np.finfo(np.float64).eps) / (d.value + np.finfo(np.float64).eps) * root[0], [d], man_grad=[deriv])
    return res
````
def find_root(d, func, guess=1.0, **kwargs)

Finds the root of the function func(x, d) where d is an `Obs`.

Parameters
- d (Obs): Obs passed to the function.
- func (object): Function whose root is to be found. Any numpy functions have to use the autograd.numpy wrapper.
  Example:
  ```python
  import autograd.numpy as anp

  def root_func(x, d):
      return anp.exp(-x ** 2) - d
  ```
- guess (float): Initial guess for the root search.

Returns
- Obs: `Obs` valued root of the function.
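The error propagation inside `find_root` follows the implicit function theorem: for f(x(d), d) = 0, the derivative of the root with respect to the input is dx/dd = -(∂f/∂d)/(∂f/∂x), which the implementation evaluates with autograd Jacobians. A minimal self-contained sketch of the same calculation using plain NumPy/SciPy with analytic derivatives instead of autograd (the function, central value, and uncertainty below are hypothetical, not taken from the library):

```python
import numpy as np
import scipy.optimize

# Hypothetical input: central value 0.5 with uncertainty 0.005
d_val, d_err = 0.5, 0.005

def func(x, d):
    return np.exp(-x ** 2) - d

# Root of exp(-x**2) - d = 0, i.e. x = sqrt(-log(d))
root = scipy.optimize.fsolve(func, 1.0, args=(d_val,))[0]

# Implicit function theorem: with f(x(d), d) = 0,
#   dx/dd = -(df/dd) / (df/dx)
dfdx = -2 * root * np.exp(-root ** 2)  # analytic df/dx
dfdd = -1.0                            # analytic df/dd
deriv = -dfdd / dfdx

# Linear error propagation of the input uncertainty to the root
root_err = abs(deriv) * d_err
```

This reproduces the derivative `deriv = -da / dx` from the source above; `find_root` additionally wraps the result in a `derived_observable` with `man_grad=[deriv]` so the full pyerrors covariance machinery applies, rather than the simple linear propagation shown here.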