python – How to persist non-trainable variables in tf.Estimator checkpoint?

I am trying to include a non-trainable Dense layer, initialized as an identity matrix, as part of my TensorFlow Estimator. The intuition is that this dense layer will pass its inputs through unchanged during standard training and then be adjusted in a later fine-tuning step. The problem is that I do not want these weights to be updated at all during the initial round, only during fine-tuning.
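
Here is a simplified sketch of what I mean (TRAIN mode only; fine_tune and n_classes are placeholder params, not my exact code):

    import tensorflow as tf

    def model_fn(features, labels, mode, params):
        x = features['x']
        # Identity init requires the output width to equal the input width.
        # The layer gets the default scope name 'dense', which is the name
        # that later shows up in the error.
        net = tf.layers.dense(
            x,
            units=int(x.shape[-1]),
            kernel_initializer=tf.initializers.identity(),
            use_bias=False,
            trainable=params.get('fine_tune', False))  # frozen on the first run
        logits = tf.layers.dense(net, units=params['n_classes'], name='logits')
        loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
        optimizer = tf.train.MomentumOptimizer(learning_rate=0.01, momentum=0.9)
        train_op = optimizer.minimize(loss, global_step=tf.train.get_global_step())
        return tf.estimator.EstimatorSpec(mode, loss=loss, train_op=train_op)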

I can do several things to make these weights non-trainable, including using the trainable argument in the Dense constructor, or filtering out any variable whose name contains "dense" before passing the list to MomentumOptimizer.compute_gradients().
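
The filtering variant looks roughly like this (assuming the loss from the sketch above):

    # Keep the layer nominally trainable but exclude its variables from
    # the optimizer. 'dense' is the default tf.layers.dense scope name.
    optimizer = tf.train.MomentumOptimizer(learning_rate=0.01, momentum=0.9)
    var_list = [v for v in tf.trainable_variables() if 'dense' not in v.name]
    grads_and_vars = optimizer.compute_gradients(loss, var_list=var_list)
    train_op = optimizer.apply_gradients(
        grads_and_vars, global_step=tf.train.get_global_step())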

But in either case (making the layer non-trainable or simply not passing its variables to the optimizer), TensorFlow throws an error when restoring the checkpoint, saying it cannot find a key related to the dense layer.

I understand that this is because, in the first run where the dense layer is not trainable, the optimizer never creates a Momentum slot for its kernel, so that slot variable is never saved to the checkpoint file. Likewise, if the variables are filtered out so they never reach compute_gradients, the same problem occurs.
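
For what it's worth, the saved contents can be checked with tf.train.list_variables (the path below is a placeholder for the model_dir):

    import tensorflow as tf

    # List what the first run actually saved to the checkpoint.
    for name, shape in tf.train.list_variables('/path/to/model_dir'):
        print(name, shape)
    # 'dense/kernel' is listed, but 'dense/kernel/Momentum' is not, since
    # the optimizer never created a slot for the frozen/filtered variable.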

Is there a way to simply persist the non-trainable variables (even with just their initial values) across runs?

NotFoundError (see above for traceback): Key dense/kernel/Momentum not found in checkpoint