Save Nested Layers For Rubik #377
base: master
Conversation
…agemaker-debugger into save_nested_layers_tmp
Codecov Report

@@            Coverage Diff             @@
##           master     #377      +/-   ##
==========================================
- Coverage   85.49%   82.76%    -2.73%
==========================================
  Files          86       86
  Lines        6514     6539      +25
==========================================
- Hits         5569     5412     -157
- Misses        945     1127     +182

Continue to review full report at Codecov.
@@ -794,9 +832,16 @@ def _save_layer_values(self, logs):
        # Layer Inputs are flattened and passed as a list into
        # the next layer. Unpacking it speeds up the _make_numpy fn.
        layer_input = layer_input[0]
    layer_input_tensor_name = get_export_name_for_keras(str(layer_name), "input")
    layer_name = str(layer_name)
    idx = layer_name_dict.get(layer_name, 0)
Will this always be 0, since this is the first time the dict has been accessed after L820?
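For reference, a minimal sketch of the counting pattern being asked about (the dict name comes from the diff; the increment step and helper name are assumptions about how the surrounding code maintains the index, not code from the PR):

# Hypothetical illustration: layer_name_dict maps a layer name to how many
# times it has been seen so far, so repeated layers get distinct indices.
layer_name_dict = {}

def next_layer_index(layer_name):
    idx = layer_name_dict.get(layer_name, 0)  # 0 only on the first lookup
    layer_name_dict[layer_name] = idx + 1     # later lookups return 1, 2, ...
    return idx

assert next_layer_index("dense") == 0
assert next_layer_index("dense") == 1  # a reused layer name gets a new index
assert next_layer_index("conv2d") == 0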
@@ -849,6 +894,9 @@ def _on_any_batch_end(self, batch, mode, logs=None):
        self._export_model()
        self._exported_model[self.mode] = True

        if is_tf_version_2x():
any restriction for eager/non-eager/tf.function?
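For context on this question, a small sketch of the eager vs. tf.function distinction the hook has to deal with; it mirrors the hasattr(t, "numpy") check used later in the diff, but is an illustration rather than code from the PR:

import tensorflow as tf

def can_read_value(t):
    # Only eager tensors expose .numpy(); tensors seen while tracing a
    # tf.function are symbolic, so their values cannot be saved at that point.
    return hasattr(t, "numpy")

print(can_read_value(tf.constant(1.0)))  # True under eager execution

@tf.function
def traced(x):
    print("readable while tracing:", can_read_value(x))  # False: symbolic tensor
    return x + 1

traced(tf.constant(1.0))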
if tensor_type in ["input", "output", "weight"]:
    if isinstance(layer, str):
        # Tensor.name is meaningless when eager execution is enabled.
        return f"{layer}/{tensor_type}s"
return f"{layer}_{layer_idx}/{tensor_type}_{tensor_idx}"
does this change how layer/tensor names have looked so far?
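To make the question concrete, here is an illustrative stand-in showing the two name forms the branches produce (this is a simplified helper for demonstration, not the repository's actual get_export_name_for_keras signature):

def export_name(layer_name, tensor_type, layer_idx=None, tensor_idx=None):
    # Simplified stand-in mirroring the two branches in the diff above.
    if layer_idx is None or tensor_idx is None:
        # Collective form, used when only the layer name string is available.
        return f"{layer_name}/{tensor_type}s"
    # Indexed form, used when the same layer or tensor occurs more than once.
    return f"{layer_name}_{layer_idx}/{tensor_type}_{tensor_idx}"

print(export_name("dense", "input"))        # dense/inputs    (existing style)
print(export_name("dense", "input", 1, 0))  # dense_1/input_0 (new indexed style)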
layer_inputs = self.saved_layers[layer_name].layer_input
for layer_idx, tensor in enumerate(layer_inputs):
    if isinstance(tensor, list):
        tensor_list = tensor
    else:
        tensor_list = [tensor]
    if hasattr(tensor_list[0], "numpy") is False:
        self.logger.warning(
            "cannot save layer values during forward pass with tf.function"
        )
        continue
    else:
        for t_idx, t in enumerate(tensor_list):
            export_name = get_export_name_for_keras(
                layer_name,
                tensor_type="input",
                tensor=tensor,
                layer_idx=layer_idx,
                tensor_idx=t_idx,
            )
            self._save_tensor_to_file(export_name, t, layer_collection)
this looks similar to what's done for outputs below. possible to make it common?
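One shape the suggested refactor could take, as a rough sketch only (the helper name is hypothetical; get_export_name_for_keras and _save_tensor_to_file are the calls already used in the diff above):

def _save_layer_tensors(self, layer_name, tensors, tensor_type, layer_collection):
    # Shared path for layer inputs and outputs: normalize each entry to a
    # list, skip symbolic tensors (no .numpy() under tf.function), then export.
    for layer_idx, tensor in enumerate(tensors):
        tensor_list = tensor if isinstance(tensor, list) else [tensor]
        if not hasattr(tensor_list[0], "numpy"):
            self.logger.warning(
                "cannot save layer values during forward pass with tf.function"
            )
            continue
        for t_idx, t in enumerate(tensor_list):
            export_name = get_export_name_for_keras(
                layer_name,
                tensor_type=tensor_type,
                tensor=tensor,
                layer_idx=layer_idx,
                tensor_idx=t_idx,
            )
            self._save_tensor_to_file(export_name, t, layer_collection)

The input and output loops would then collapse to two calls with tensor_type="input" and tensor_type="output".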
Description of changes:

- Saves nested layers via the _flatten_layers API. See tests/tensorflow2/test_nested_layers.py. A sketch of this traversal follows the list below.
- Saves layer inputs and outputs through the InputOutputSaver object, including layers that are reused. See tests/tensorflow2/test_model_that_reuses_layers.py and tests/tensorflow2/test_concat_layer.py.
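As a rough illustration of the nested-layer traversal the first bullet refers to: the sketch below walks a model recursively through the public .layers property instead of Keras's private _flatten_layers helper, so treat it as an approximation of the approach rather than the PR's actual code.

import tensorflow as tf

def iter_nested_layers(layer):
    # Recursively yield a layer and every sub-layer it contains, so layers
    # nested inside an inner model are registered for saving as well.
    yield layer
    for sub in getattr(layer, "layers", []):
        yield from iter_nested_layers(sub)

inner = tf.keras.Sequential(
    [tf.keras.layers.Dense(4, name="dense_a"), tf.keras.layers.Dense(2, name="dense_b")],
    name="inner_block",
)
model = tf.keras.Sequential([inner, tf.keras.layers.Dense(1, name="head")], name="outer")

for layer in iter_nested_layers(model):
    print(layer.name)  # outer, inner_block, dense_a, dense_b, head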
Style and formatting:

I have run pre-commit install to ensure that auto-formatting happens with every commit.

Issue number, if available:
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.