I am finding some inconsistencies in making NonlinearModelFit work with the ParametricNDSolve output. Here is an example that works (starting with a new Kernel):

```
eqs = {a'[t] == -k1 a[t] - k2 a[t]^2,
   b'[t] == k1 a[t] + k2 a[t]^2,
   a[0] == a0, b[0] == b0};
fixedparams = {k1 -> 1.2, b0 -> 0};
fns = {a, b};
params = {k2, a0};
solution = ParametricNDSolve[eqs /. fixedparams, fns, {t, 0, 5}, params]
fitfn = a /. solution;
paramsForDataSet = {k2 -> 1.263, a0 -> 0.0321};
dataset = {#, ((fitfn @@ params) /. paramsForDataSet)[#] +
     RandomVariate[NormalDistribution[0, 0.0002]]} & /@ Range[0, 5, 0.01];
ListPlot[dataset, PlotRange -> Full]
```

```
initialGuess = {k2 -> 2.0, a0 -> 0.3};
tmp = Values@initialGuess;
Dynamic@Column[{Show[ListPlot[dataset, PlotRange -> Full],
    Plot[(fitfn @@ tmp)[t], {t, 0, 5},
     PlotRange -> Full, PlotStyle -> Red],
    PlotRange -> Full, ImageSize -> Large],
   ListPlot[{#1, #2 - (fitfn @@ tmp)[#1]} & @@@ dataset,
    PlotRange -> Full, AspectRatio -> 0.2,
    ImageSize -> Large]}]
```

This gives me a dynamically updating plot of the fit and the residuals as the fit converges. Here is the fitting procedure:

```
result = NonlinearModelFit[dataset, (fitfn @@ params)[t],
  Evaluate[List @@@ initialGuess], t,
  StepMonitor :> (tmp = params)]
tmp = Values@result["BestFitParameters"]
```

This works great! But when I complicate the model a bit, the kernel crashes. Again starting from a fresh kernel:

```
eqs = {a'[t] == -k1 a[t] - k2 a[t]^2, b'[t] == k1 a[t] + k2 a[t]^2,
   c[t] == q a[t] + r b[t], c[0] == q a0 + r b0, a[0] == a0,
   b[0] == b0};
fixedparams = {k1 -> 1.2, b0 -> 0};
fns = {a, b, c};
params = {k2, a0, q, r};
solution = ParametricNDSolve[eqs /. fixedparams, fns, {t, 0, 5}, params]
fitfn = c /. solution;
paramsForDataSet = {k2 -> 1.263, a0 -> 0.0321, q -> 0.341,
   r -> 0.8431};
dataset = {#, ((fitfn @@ params) /. paramsForDataSet)[#] +
     RandomVariate[NormalDistribution[0, 0.0002]]} & /@ Range[0, 5, 0.01];
ListPlot[dataset, PlotRange -> Full]
```

```
initialGuess = {k2 -> 2.0, a0 -> 0.3, q -> 0.32, r -> 0.88};
tmp = Values@initialGuess;
Dynamic@Column[{Show[ListPlot[dataset, PlotRange -> Full],
    Plot[(fitfn @@ tmp)[t], {t, 0, 5}, PlotRange -> Full,
     PlotStyle -> Red],
    PlotRange -> Full, ImageSize -> Large],
   ListPlot[{#1, #2 - (fitfn @@ tmp)[#1]} & @@@ dataset,
    PlotRange -> Full, AspectRatio -> 0.2,
    ImageSize -> Large]}]
result = NonlinearModelFit[dataset, (fitfn @@ params)[t],
  Evaluate[List @@@ initialGuess], t,
  StepMonitor :> (tmp = params)]
tmp = Values@result["BestFitParameters"]
```

The only differences are:

- adding c[t] and c[0] to the equations
- adding c to fns
- adding q and r to params
- adding values for q and r to paramsForDataSet and initialGuess
- changing fitfn from a to c

Everything else is identical, yet this time the kernel crashes. Any suggestions would be welcome.

(In case this is a bug in Mathematica, I have filed a report with Wolfram. However, I don't want to rule out that I'm doing something wrong, so I'm also asking here.)