Possible API change: Separate fit results from Model/add history to Features Table #256

Open
drvdputt opened this issue Feb 20, 2023 · 1 comment

In our current design, the model details, the fit settings, and the fit results are all bundled together in the Model class. This brings some complexity with it:

  • We always have to think about what happens when a model is "refit", especially when a different redshift is specified and features that were deactivated because they fell outside the wavelength range have to be reactivated.
  • Storing the uncertainties is an issue. They are only meaningful for a fit result, not a fit setup.
  • The bounds are stored alongside the result, but they are only meaningful for a fit setup, not for the result (although they are still informative to the user, e.g. to check whether a bound has been hit).

Alternative design

  1. Model: stores the features table provided at construction, but does not change afterwards. Every call to fit returns a new Result object. Model objects cannot be saved.
  2. Result: contains the fitted features table, the uncertainties, and maybe even a copy of the user-supplied spectrum. This would be the new home of the "plot" function and of various extraction/analysis tools. Results can be saved and loaded.
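
A minimal sketch of what this split could look like (the class and method names here are illustrative assumptions, not the actual PAHFIT API):

```python
# Illustrative sketch only: these class and method names are assumptions,
# not the actual PAHFIT API.
from dataclasses import dataclass, field
from typing import Any, Optional


@dataclass(frozen=True)
class Result:
    """Outcome of one fit: the fitted features table, its uncertainties,
    and optionally the user-supplied spectrum."""
    features: Any
    uncertainties: dict = field(default_factory=dict)
    spectrum: Optional[Any] = None

    def plot(self):
        """Plotting and extraction/analysis tools would live here."""
        raise NotImplementedError

    def save(self, filename: str):
        """Results can be saved and loaded; Model objects would not be."""
        raise NotImplementedError


class Model:
    """Holds only the fit setup; it is never modified by a fit."""

    def __init__(self, features: Any):
        self.features = features  # features table provided at construction

    def guess(self, spectrum: Any) -> None:
        # Open question: should this update self.features in place,
        # or return something Result-like?
        raise NotImplementedError

    def fit(self, spectrum: Any, redshift: float = 0.0) -> Result:
        # Every call works on a copy of the setup and returns a fresh Result,
        # so re-fitting (e.g. at a different redshift) never mutates the Model.
        fitted_features, uncertainties = self._run_fitter(spectrum, redshift)
        return Result(fitted_features, uncertainties, spectrum)

    def _run_fitter(self, spectrum: Any, redshift: float):
        # Placeholder for the actual fitting machinery.
        raise NotImplementedError
```

With this split, the uncertainties/bounds ambiguity goes away: bounds live with the setup (Model), uncertainties with the outcome (Result).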

Possible issues with this:

  • What does "guess" do then? Change Model or return a Result?
  • Don't we want to be able to plot unfitted models / initial guesses too?
  • How do we use a result as initial conditions for another fit? (Answer: pass it to the guess function.)
  • Maybe Model doesn't even need to be a class, but just a fit function that takes a yaml pack plus a spectrum, or a Result plus a spectrum (for re-fitting); see the sketch after this list.
  • The user has to learn two classes instead of one.
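
Continuing the sketch above, a hypothetical function-based alternative could look like this (the name and signature are assumptions for discussion, not an existing PAHFIT function):

```python
# Continuing the sketch above: a hypothetical function-based alternative to a
# Model class. The name and signature are assumptions for discussion, not an
# existing PAHFIT function.
def fit(features_or_result, spectrum, redshift=0.0):
    """Fit `spectrum`, starting either from a features table read from a yaml
    pack (fresh fit) or from a previous Result (re-fit, or the next spectrum
    over in a cube)."""
    if isinstance(features_or_result, Result):
        # Re-fit: start from the previously fitted feature values.
        start = features_or_result.features
    else:
        # Fresh fit: start from the science-pack features table.
        start = features_or_result
    # `redshift` would decide which features fall outside the observed
    # wavelength range and get (re)activated accordingly.
    # ... run the underlying fitter here ...
    return Result(features=start, uncertainties={}, spectrum=spectrum)
```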

We're not going to change this now, since we really need to get PAHFIT running. But I really wanted to write this down, and we should keep this type of redesign in mind if we run into more symptoms of the same underlying conceptual problem.

jdtsmith commented Mar 3, 2023

Yeah, guess is pretty much required to change the model, since it is a warm-up for fit. And fit can be a prelude to another fit, e.g. on the next spectrum over in the cube. We can leave this open for a while, but I think a better approach is to tweak Features so that it gives an indication of:

  • the provenance and history of the Features, which can go in features.meta['history'] or similar. Updated when: read from yaml (which one), guessed with a spectrum (mention which guess model, and which spectrum/instrument(s)), fitted to a spectrum, etc. (A rough sketch of this follows the list.)
  • any auto-disabled features (e.g. fell off range given redshift). These can be re-enabled if you fit using a different redshift.
  • feature values (e.g. FWHM) which have been altered by the instrument pack. This is "opt-in" as I understand it; the user must explicitly request using the Feature's FWHM line values.
  • uncertainties on fitted output cells.
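
A rough sketch of how such history records could be attached to the table's metadata (the 'history' key, the record format, and the example values are assumptions, not an agreed-upon scheme):

```python
# Sketch only: the 'history' key, the record format, and the example values
# below are assumptions, not an agreed-upon PAHFIT scheme.
from datetime import datetime, timezone

from astropy.table import Table

# A toy stand-in for the real Features table.
features = Table(rows=[("PAH_6.2", "dust_feature", 6.22)],
                 names=("name", "kind", "wavelength"))
features.meta.setdefault("history", [])


def record(table, action, **details):
    """Append a provenance record (read from yaml, guessed, fitted, ...)."""
    table.meta["history"].append(
        {"action": action,
         "timestamp": datetime.now(timezone.utc).isoformat(),
         **details})


record(features, "read", pack="classic.yaml")
record(features, "guess", spectrum="example_spectrum",
       instruments=["spitzer.irs.sl.1"])
record(features, "fit", spectrum="example_spectrum")
```

Loading a saved table back would then make it clear whether it came straight from a yaml pack, was only guessed, or was actually fitted, and to which spectrum.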

On May 20, 2024, jdtsmith changed the title from "Possible API change: Separate fit results from Model" to "Possible API change: Separate fit results from Model/add history to Features Table".