
Replace max tensor with max cut #1295

Merged
merged 15 commits into main on Dec 25, 2024

Conversation

@elad-c elad-c (Collaborator) commented Dec 18, 2024

Pull Request Description:

Replace max tensor with max cut in resource utilization.
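To illustrate the change the PR title refers to, here is a hedged sketch of the difference between a "max tensor" estimate (largest single activation tensor) and a "max cut" estimate (largest total of simultaneously-alive tensors). The tensor names and sizes are illustrative only, not taken from the project code:

```python
# Tensors alive at each scheduling step of a toy model; each dict is one "cut".
cuts = [
    {"in": 4, "conv1_out": 10},          # step 1: 14 total alive
    {"conv1_out": 10, "conv2_out": 6},   # step 2: 16 total alive
    {"conv2_out": 6, "out": 2},          # step 3: 8 total alive
]

# "Max tensor": size of the single largest activation tensor anywhere.
max_tensor = max(size for cut in cuts for size in cut.values())  # 10

# "Max cut": largest sum of tensors that must coexist in memory.
max_cut = max(sum(cut.values()) for cut in cuts)                 # 16
```

In this toy case the max-tensor estimate (10) underestimates the true activation-memory peak (16), which is the motivation for switching the resource-utilization computation to max cut.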

Checklist before requesting a review:

  • I set the appropriate labels on the pull request.
  • I have added/updated the release note draft (if necessary).
  • I have updated the documentation to reflect my changes (if necessary).
  • All functions and files are well documented.
  • All functions and classes have type hints.
  • There is a license in every file.
  • The function and variable names are informative.
  • I have checked for code duplications.
  • I have added new unittest (if necessary).

@ofirgo ofirgo requested a review from irenaby December 18, 2024 09:18
@ofirgo ofirgo (Collaborator) left a comment


Mostly comments regarding TODOs that can be handled in the next PR.

Two other things:

  • Most of the TODOs about non-covered code lines need test coverage.
  • We should remember to add an "experimental" notice (possibly as part of the RU target API) for using mixed precision with the max-cut feature (also in the next PR).

@@ -154,6 +154,9 @@ def solve(self, estimate_factor: float, iter_limit: int = 500) -> Tuple[List[Bas
cut_route = routes[next_cut]

if next_cut == self.target_cut:
# TODO maxcut: Why do we filter the cuts (cut_route) but not the max cut size (cut_cost)?
# This is a mismatch between max_cut and max(cuts).
# Also, unfiltered cut_route seems perfect, including input and output tensor sizes of current op.

Regarding the first question - this seems like a bug; the max cut size should match the total size of the remaining memory elements in the cut.
Regarding the filtering - it is possible that there is an issue here, but we are not removing input tensor sizes as such; we remove the memory elements of operations that have no remaining dependent operations left to compute. So we need to look further into it before simply removing the filtering.
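A minimal sketch of the dependency-based filtering described above, and of computing the cut cost from the same filtered set so the two stay consistent. All names here (`MemoryElement`, `filter_cut`, `computed_ops`, the tensor names) are hypothetical and do not come from the project code:

```python
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class MemoryElement:
    name: str
    size: int
    consumers: Set[str] = field(default_factory=set)  # ops that still read this tensor

def filter_cut(elements: List[MemoryElement], computed_ops: Set[str]) -> List[MemoryElement]:
    # Keep only elements with at least one consumer that has not been computed yet;
    # an element whose consumers were all computed is no longer needed in memory.
    return [e for e in elements if e.consumers - computed_ops]

cut = [
    MemoryElement("t1", 100, consumers={"conv1", "conv2"}),
    MemoryElement("t2", 50, consumers={"conv1"}),
]
remaining = filter_cut(cut, computed_ops={"conv1"})  # drops t2, keeps t1

# Per the comment above, the cut cost should be derived from the same
# filtered elements, avoiding a mismatch between max_cut and max(cuts):
cut_cost = sum(e.size for e in remaining)
```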

@@ -71,6 +82,7 @@ def __init__(self, model_graph: Graph):
inputs_tensors_memory = [sum([t.total_size for t in self.operation_node_children(n)])
for n in nodes if n in model_graph.get_inputs()]

# TODO maxcut: why both inputs and outputs of each node, while the A* solves for node outputs only???

We need to dig deep into the algorithm to understand this.
If it is not giving you trouble, then leave it for now.
Alternatively, create a unittest whose expected result you know for certain, and check whether changing this code the way you think it should be changes the result.
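The unittest suggested above could look something like the following sketch: a graph small enough that the expected max cut is known by hand. The solver here is a naive stand-in for a linear chain of ops (each step keeps only the current op's input and output tensors alive), not the project's A* solver; all names are hypothetical:

```python
import unittest

def max_cut_linear_chain(tensor_sizes):
    # For a linear chain op1 -> op2 -> ..., step i keeps tensor i (input)
    # and tensor i+1 (output) alive simultaneously; the max cut is the
    # largest such pair sum.
    return max(a + b for a, b in zip(tensor_sizes, tensor_sizes[1:]))

class TestMaxCut(unittest.TestCase):
    def test_known_result(self):
        # Chain with tensor sizes 4 -> 10 -> 2: peak is 4 + 10 = 14.
        self.assertEqual(max_cut_linear_chain([4, 10, 2]), 14)
```

Running the real solver on an equally trivial graph and asserting the hand-computed value would show whether changing the inputs/outputs handling changes the result.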

@elad-c elad-c merged commit ceaf820 into main Dec 25, 2024
42 checks passed
@elad-c elad-c deleted the replace_max_tensor_with_max_cut branch December 25, 2024 15:51