Require an autoscheduler for build_gradient_module()/GRADIENT_DESCENT #8525

Open · wants to merge 1 commit into main
doc/HalideCMakePackage.md (3 additions & 1 deletion)

@@ -474,7 +474,9 @@ verbatim.
 If `GRADIENT_DESCENT` is set, then the module will be built suitably for
 gradient descent calculation in TensorFlow or PyTorch. See
 `Generator::build_gradient_module()` for more documentation. This corresponds to
-passing `-d 1` at the generator command line.
+passing `-d 1` at the generator command line. Note that you *must* also specify
+an autoscheduler (via the `AUTOSCHEDULER` argument) when specifying this
+argument.
Comment on lines +477 to +479

Member:

We should check this in HalideGeneratorHelpers.cmake

Contributor (Author):

First we should get buy-in from everyone as to whether this change is a good one.
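
For reference, a hypothetical sketch of the check suggested above for HalideGeneratorHelpers.cmake. The `ARG_` variable prefix assumes `add_halide_library` parses its keyword arguments with `cmake_parse_arguments`; this is an assumption for illustration, not code from this PR:

```cmake
# Hypothetical sketch, not part of this PR. Assumes add_halide_library
# parses its keyword arguments with cmake_parse_arguments under the
# prefix ARG, so GRADIENT_DESCENT and AUTOSCHEDULER land in these variables.
if (ARG_GRADIENT_DESCENT AND NOT ARG_AUTOSCHEDULER)
    message(FATAL_ERROR "add_halide_library: GRADIENT_DESCENT requires an AUTOSCHEDULER to be specified")
endif ()
```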


 If the `C_BACKEND` option is set, this command will invoke the configured C++
 compiler on a generated source. Note that a `<target>.runtime` target is _not_
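
To illustrate the documented requirement, a minimal sketch of a CMake invocation combining the two arguments. The target name `my_filter`, the generator target `my_generators`, and the choice of `Halide::Mullapudi2016` are placeholder assumptions for this example, not part of the PR:

```cmake
# Illustrative sketch: my_filter, my_generators, and the Mullapudi2016
# autoscheduler are placeholder choices for this example.
add_halide_library(my_filter FROM my_generators
                   GRADIENT_DESCENT                     # equivalent to passing -d 1 to the generator
                   AUTOSCHEDULER Halide::Mullapudi2016) # now required alongside GRADIENT_DESCENT
```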
src/AbstractGenerator.cpp (1 addition & 2 deletions)

@@ -218,8 +218,7 @@ Module AbstractGenerator::build_gradient_module(const std::string &function_name
     if (!asp.name.empty()) {
         auto_schedule_results = grad_pipeline.apply_autoscheduler(context.target(), asp);
     } else {
-        user_warning << "Autoscheduling is not enabled in build_gradient_module(), so the resulting "
-                        "gradient module will be unscheduled; this is very unlikely to be what you want.\n";
+        user_error << "An autoscheduler must be specified when producing a gradient-descent module().\n";
     }

     Module result = grad_pipeline.compile_to_module(gradient_inputs, function_name, context.target(), linkage_type);