bayespecon.logdet.logdet_eigenvalue¶
- bayespecon.logdet.logdet_eigenvalue(rho, eigs)[source]¶
Eigenvalue-based log|I - rho*W|.
Pre-compute `eigs = np.linalg.eigvals(W).real` once; each evaluation then costs O(n) and is exactly differentiable by PyTensor autodiff.

Notes
Stability requires `|rho| < 1 / max(|eigs|)`; for row-standardised W this is `|rho| < 1`. When `1 - rho * eig_i` is numerically zero (rho exactly at the stability boundary), the unguarded `log` would return `-inf`. The argument is therefore clamped at a small floor (`1e-300`) before taking `log`; this keeps NUTS gradients finite and yields a very large negative penalty rather than a hard NaN. Callers should still constrain `rho` away from `1 / eig_max` via the prior bounds.
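A minimal NumPy sketch of the computation described above, assuming the identity `log|I - rho*W| = sum_i log(1 - rho * eig_i)` and the clamping behaviour from the Notes (the actual library function operates on PyTensor tensors inside a model graph; the signature and floor value here mirror the documentation, but the standalone implementation is illustrative):

```python
import numpy as np

def logdet_eigenvalue(rho, eigs, floor=1e-300):
    """Eigenvalue-based log|I - rho*W| from precomputed eigenvalues of W.

    Clamps each (1 - rho * eig_i) at `floor` so that rho at the stability
    boundary gives a large negative value instead of -inf / NaN.
    """
    arg = np.maximum(1.0 - rho * eigs, floor)
    return np.sum(np.log(arg))

# Pre-compute once, outside the sampler loop:
W = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])          # row-standardised weights (toy example)
eigs = np.linalg.eigvals(W).real         # O(n^3) once; each call below is O(n)

# Each evaluation is now a cheap sum of logs:
ld = logdet_eigenvalue(0.4, eigs)
```

For a sanity check, the result can be compared against `np.linalg.slogdet(np.eye(n) - rho * W)`, which computes the same quantity directly at O(n^3) per evaluation.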