pyerrors.roots
````python
import numpy as np
import scipy.optimize
from autograd import jacobian
from .obs import derived_observable


def find_root(d, func, guess=1.0, **kwargs):
    r'''Finds the root of the function func(x, d) where d is an `Obs`.

    Parameters
    ----------
    d : Obs
        Obs passed to the function.
    func : object
        Function whose root is to be found. Any numpy functions have to use
        the autograd.numpy wrapper.
        Example:
        ```python
        import autograd.numpy as anp
        def root_func(x, d):
            return anp.exp(-x ** 2) - d
        ```
    guess : float
        Initial guess for the root search.

    Returns
    -------
    Obs
        `Obs` valued root of the function.
    '''
    root = scipy.optimize.fsolve(func, guess, d.value)

    # Error propagation as detailed in arXiv:1809.01289
    dx = jacobian(func)(root[0], d.value)
    try:
        da = jacobian(lambda u, v: func(v, u))(d.value, root[0])
    except TypeError:
        raise Exception("It is required to use autograd.numpy instead of numpy within root functions, see the documentation for details.") from None
    deriv = - da / dx

    # The lambda evaluates to root[0] at the mean (x[0] == d.value); the eps terms
    # guard against division by zero, and man_grad supplies the actual derivative.
    res = derived_observable(lambda x, **kwargs: (x[0] + np.finfo(np.float64).eps) / (d.value + np.finfo(np.float64).eps) * root[0], [d], man_grad=[deriv])
    return res
````
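The error propagation step is an instance of the implicit function theorem: the root $x(d)$ satisfies $f(x(d), d) = 0$ for all $d$, so differentiating with respect to $d$ gives

$$
\frac{\mathrm{d}x}{\mathrm{d}d} = -\,\frac{\partial f / \partial d}{\partial f / \partial x}\,,
$$

which is exactly the `deriv = - da / dx` line above. Both partial derivatives are evaluated at the root via `autograd.jacobian`, and the result is passed to `derived_observable` as a manual gradient (`man_grad`).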
`def find_root(d, func, guess=1.0, **kwargs):`
Finds the root of the function `func(x, d)` where `d` is an `Obs`.
Parameters
- d (Obs): Obs passed to the function.
- func (object): Function whose root is to be found. Any numpy functions have to use the autograd.numpy wrapper. Example:

  ```python
  import autograd.numpy as anp

  def root_func(x, d):
      return anp.exp(-x ** 2) - d
  ```

- guess (float): Initial guess for the root search.
Returns
- Obs: `Obs`-valued root of the function.
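A minimal usage sketch (the synthetic samples and the ensemble label `'ensemble1'` are illustrative, not part of the library):

```python
import numpy as np
import pyerrors as pe
import autograd.numpy as anp  # numpy calls inside the root function must go through autograd

from pyerrors.roots import find_root

def root_func(x, d):
    # f(x, d) = exp(-x^2) - d, so the root solves exp(-x^2) = d
    return anp.exp(-x ** 2) - d

# Illustrative Obs built from synthetic samples; in a real analysis
# d would come from measured Monte Carlo data.
d = pe.Obs([np.random.normal(0.5, 0.05, 1000)], ['ensemble1'])

x = find_root(d, root_func, guess=1.0)
x.gamma_method()  # estimate the propagated uncertainty
print(x)
```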