# Implement support for Vizier in AutoTuner #2428

Open · wants to merge 6 commits into `master`
91 changes: 70 additions & 21 deletions docs/user/InstructionsForAutoTuner.md
@@ -5,20 +5,29 @@

AutoTuner provides a generic interface where users can define parameter configuration for experiments.
This enables AutoTuner to easily support various tools and flows. AutoTuner also utilizes [METRICS2.1](https://github.com/ieee-ceda-datc/datc-rdf-Metrics4ML) to capture PPA
of individual search trials. With the abundant features of METRICS2.1, users can explore various reward functions that steer the flow autotuning to different PPA goals.

AutoTuner provides three main functionalities as follows.
* [Ray] Automatic hyperparameter tuning framework for OpenROAD-flow-scripts (ORFS)
* [Ray] Parametric sweeping experiments for ORFS
* [Vizier] Multi-objective optimization of ORFS parameters


AutoTuner contains top-level Python scripts for ORFS, each of which implements a different search algorithm. The currently supported search algorithms are as follows.
* Ray (Single-objective optimization)
    * Random/Grid Search
    * Population Based Training ([PBT](https://www.deepmind.com/blog/population-based-training-of-neural-networks))
    * Tree Parzen Estimator ([HyperOpt](https://hyperopt.github.io/hyperopt))
    * Bayesian + Multi-Armed Bandit ([AxSearch](https://ax.dev/docs/bayesopt.html))
    * Tree Parzen Estimator + Covariance Matrix Adaptation Evolution Strategy ([Optuna](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.TPESampler.html))
    * Evolutionary Algorithm ([Nevergrad](https://github.com/facebookresearch/nevergrad))
* Vizier (Multi-objective optimization)
    * Random/Grid/Shuffled Search
    * Quasi-Random Search ([quasi-random](https://developers.google.com/machine-learning/guides/deep-learning-tuning-playbook/quasi-random-search))
    * Gaussian Process Bandit ([GP-Bandit](https://acsweb.ucsd.edu/~shshekha/GPBandits.html))
    * Non-dominated Sorting Genetic Algorithm II ([NSGA-II](https://ieeexplore.ieee.org/document/996017))

For Ray algorithms, the optimized function can be adjusted with user-defined coefficient values (`coeff_perform`, `coeff_power`, `coeff_area`) for the three objectives, which set the direction of tuning. They are defined in the [distributed.py script](../../tools/AutoTuner/src/autotuner/distributed.py), in the `get_ppa` method of the `PPAImprov` class. The effort spent optimizing each objective is proportional to its coefficient.
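
As an illustrative sketch (not the exact ORFS code), the combined score has roughly the following shape; the coefficient values and the normalized per-objective improvements below are hypothetical:

```python
# Hypothetical sketch of a weighted PPA score in the spirit of
# PPAImprov.get_ppa; the coefficient values are illustrative, not
# AutoTuner's defaults.
coeff_perform, coeff_power, coeff_area = 1.0, 1.0, 1.0


def weighted_ppa(improv_perform, improv_power, improv_area):
    """Combine per-objective improvements (e.g., percent gains over a
    reference run); tuning effort is proportional to each coefficient."""
    total = coeff_perform + coeff_power + coeff_area
    return (
        coeff_perform * improv_perform
        + coeff_power * improv_power
        + coeff_area * improv_area
    ) / total


print(weighted_ppa(12.0, 5.0, 2.0))  # single scalar reward for the tuner
```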

With Vizier algorithms, users can choose which metrics should be optimized via the `--use-metrics` argument, as sketched below.
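
A sketch of such an invocation; the metric names come from the Vizier argument list later in this document, and the space-separated list syntax is an assumption:

```shell
python3 -m autotuner.vizier --design gcd --platform sky130hd \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    --use-metrics worst_slack total_power
```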


## Setting up AutoTuner
@@ -28,8 +37,10 @@

We provide convenience scripts, `./installer.sh` and `./setup.sh`,
that work in Python3.8 for installation and configuration of AutoTuner,
as shown below:

```shell
# Install prerequisites for both Ray Tune and Vizier
./tools/AutoTuner/installer.sh
# Or install prerequisites for a single backend, either 'ray' or 'vizier'
./tools/AutoTuner/installer.sh vizier

# Start virtual environment
./tools/AutoTuner/setup.sh
```

@@ -50,7 +61,8 @@

Alternatively, here is a minimal example to get started:

```json
            1.0,
            3.7439
        ],
        "step": 0,
        "scale": "log"
    },
    "CORE_MARGIN": {
        "type": "int",
```

@@ -67,6 +79,7 @@

* `"type"`: Parameter type ("float" or "int") for sweeping/tuning
* `"minmax"`: Min-to-max range for sweeping/tuning. The unit follows the default value of each technology std cell library.
* `"step"`: Parameter step within the minmax range. Step 0 for type "float" means continuous step for sweeping/tuning. Step 0 for type "int" means the constant parameter.
* `"scale"`: Vizier-specific parameter setting [scaling type](https://oss-vizier.readthedocs.io/en/latest/guides/user/search_spaces.html#scaling), allowed values: `linear`, `log` and `rlog`.

## Tunable / sweepable parameters

@@ -118,13 +131,21 @@

```{note}
The order of the parameters matters. Arguments `--design`, `--platform` and
`--config` are always required and should precede *mode*.
```

The `autotuner.vizier` module integrates the OpenROAD flow with the Vizier optimizer.
It is used for multi-objective optimization and adds features that improve the chance of finding valid parameters.
Moreover, various algorithms are available for tuning the parameters.

Each mode relies on a user-specified search space defined by a `.json` file.
All modes use the same syntax and format,
though some features may not be available for sweeping.

```{note}
The following commands should be run from `./tools/AutoTuner`.
```

#### Tune only

* Ray-based AutoTuner: `python3 -m autotuner.distributed tune -h`

Example:
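
A minimal tune invocation might look like this (a sketch; the config path mirrors the Vizier example below):

```shell
python3 -m autotuner.distributed --design gcd --platform sky130hd \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    tune
```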

@@ -145,19 +166,39 @@

#### Sweep only

* Ray-based AutoTuner: `python3 -m autotuner.distributed sweep -h`

Example:

```shell
python3 -m autotuner.distributed --design gcd --platform sky130hd \
    sweep
```

#### Multi-objective optimization

* Vizier-based AutoTuner: `python3 -m autotuner.vizier -h`

Example:

```shell
python3 -m autotuner.vizier --design gcd --platform sky130hd \
--config ../../flow/designs/sky130hd/gcd/autotuner.json
```

### Google Cloud Platform (GCP) distribution with Ray

GCP Setup Tutorial coming soon.


### List of common input arguments
| Argument | Description | Default |
|-------------------------------|-------------------------------------------------------------------------------------------------------|---------|
| `--design` | Name of the design for Autotuning. ||
| `--platform` | Name of the platform for Autotuning. ||
| `--config` | Configuration file that sets which knobs to use for Autotuning. ||
| `--experiment` | Experiment name. This parameter is used to prefix the FLOW_VARIANT and to set the Ray log destination.| test |
| `--samples` | Number of samples for tuning. | 10 |
| `--jobs` | Max number of concurrent jobs. | # of CPUs / 2 |
| `--openroad_threads` | Max number of threads usable. | 16 |
| `--timeout` | Time limit (in hours) for each trial run. | No limit |
| `-v` or `--verbose` | Verbosity level. 0: only Ray status; 1: also print stderr; 2: also print stdout. | 0 |
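
For instance, a sketch combining several of these common arguments with the Ray tuner (the values are illustrative):

```shell
python3 -m autotuner.distributed --design gcd --platform sky130hd \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    --experiment my_run --samples 20 --jobs 4 \
    tune
```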

### Input arguments specific to Ray
| Argument | Description | Default |
|-------------------------------|-------------------------------------------------------------------------------------------------------|---------|
| `--git_clean` | Clean binaries and build files. **WARNING**: may lose previous data. ||
| `--git_clone` | Force new git clone. **WARNING**: may lose previous data. ||
| `--git_clone_args` | Additional git clone arguments. ||
@@ -166,16 +207,11 @@
| `--git_orfs_branch` | OpenROAD-flow-scripts branch to use. ||
| `--git_url` | OpenROAD-flow-scripts repo URL to use. | [ORFS GitHub repo](https://github.com/The-OpenROAD-Project/OpenROAD-flow-scripts) |
| `--build_args` | Additional arguments given to ./build_openroad.sh ||
| `--server` | The address of Ray server to connect. ||
| `--port` | The port of Ray server to connect. | 10001 |

#### Input arguments specific to Ray tune mode
The following input arguments are applicable for tune mode only.

| Argument | Description | Default |
@@ -190,7 +226,19 @@
| `--resume` | Resume previous run. ||

### Input arguments specific to Vizier
| Argument | Description | Default |
|-------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------|---------|
| `--orfs` | Path to the OpenROAD-flow-scripts repository ||
| `--results` | Path where JSON file with results will be saved ||
| `-a` or `--algorithm` | Algorithm for the optimization engine, one of GAUSSIAN_PROCESS_BANDIT, RANDOM_SEARCH, QUASI_RANDOM_SEARCH, GRID_SEARCH, SHUFFLED_GRID_SEARCH, NSGA2 | NSGA2 |
| `-m` or `--use-metrics` | Metrics to optimize, list of worst_slack, clk_period-worst_slack, total_power, core_util, final_util, design_area, core_area, die_area, last_successful_stage | all available metrics |
| `-i` or `--iterations` | Max iteration count for the optimization engine | 2 |
| `-s` or `--suggestions` | Suggestion count per iteration of the optimization engine | 5 |
| `--use-existing-server` | Address of the running Vizier server ||
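
For example, a sketch that selects the optimization algorithm and budget explicitly (the values are illustrative):

```shell
python3 -m autotuner.vizier --design gcd --platform sky130hd \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    -a GAUSSIAN_PROCESS_BANDIT -i 4 -s 8
```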

### GUI for optimizations with Ray Tune

Progress is displayed in the terminal where AutoTuner runs, and the results are shown once all runs have finished.
You can find the "Best config found" entry on the screen.
@@ -216,6 +264,7 @@

Assuming the virtual environment is set up at `./tools/AutoTuner/autotuner_env`:

```shell
./tools/AutoTuner/setup.sh
python3 ./tools/AutoTuner/test/smoke_test_sweep.py
python3 ./tools/AutoTuner/test/smoke_test_tune.py
python3 ./tools/AutoTuner/test/smoke_test_vizier.py
python3 ./tools/AutoTuner/test/smoke_test_sample_iteration.py
```

2 changes: 2 additions & 0 deletions flow/scripts/detail_route.tcl
@@ -68,6 +68,8 @@

if { [env_var_exists_and_non_empty POST_DETAIL_ROUTE_TCL] } {

check_antennas -report_file $env(REPORTS_DIR)/drt_antennas.log

report_metrics 5 "detailed route"

if {![design_is_routed]} {
error "Design has unrouted nets."
}
3 changes: 3 additions & 0 deletions flow/test/test_autotuner.sh
@@ -22,6 +22,9 @@

python3 -m unittest tools.AutoTuner.test.smoke_test_sweep.${PLATFORM}SweepSmokeTest.test_sweep
echo "Running Autotuner smoke tests for --sample and --iteration."
python3 -m unittest tools.AutoTuner.test.smoke_test_sample_iteration.${PLATFORM}SampleIterationSmokeTest.test_sample_iteration

echo "Running Autotuner smoke Vizier test"
python3 -m unittest tools.AutoTuner.test.smoke_test_vizier.${PLATFORM}VizierSmokeTest.test_vizier

if [ "$PLATFORM" == "asap7" ] && [ "$DESIGN_NAME" == "gcd" ]; then
echo "Running Autotuner ref file test (only once)"
python3 -m unittest tools.AutoTuner.test.ref_file_check.RefFileCheck.test_files
13 changes: 12 additions & 1 deletion tools/AutoTuner/installer.sh
@@ -3,9 +3,20 @@
# Get the directory where the script is located
script_dir="$(dirname "${BASH_SOURCE[0]}")"

dependencies=""
if [[ "$#" -eq 0 ]]; then
echo "Installing dependencies for Ray Tune and Vizier"
dependencies="ray,vizier"
elif [[ "$#" -ne 1 ]] || ([[ "$1" != "ray" ]] && [[ "$1" != "vizier" ]]); then
echo "Please specify whether 'ray' or 'vizier' dependencies should be installed" >&2
exit 1
else
dependencies="$1"
fi

# Define the virtual environment name
venv_name="autotuner_env"
python3 -m venv "$script_dir/$venv_name"
source "$script_dir/$venv_name/bin/activate"
pip3 install -e "$script_dir"
pip3 install -e "$script_dir[$dependencies]"
deactivate
2 changes: 2 additions & 0 deletions tools/AutoTuner/pyproject.toml
@@ -12,6 +12,8 @@

dynamic = ["dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements.txt"] }
optional-dependencies.ray = { file = ["requirements-ray.txt"] }
optional-dependencies.vizier = { file = ["requirements-vizier.txt"] }
optional-dependencies.dev = { file = ["requirements-dev.txt"] }

[build-system]
11 changes: 11 additions & 0 deletions tools/AutoTuner/requirements-ray.txt
@@ -0,0 +1,11 @@
ray[tune]==2.9.3
ax-platform>=0.3.3,<=0.3.7
hyperopt==0.2.7
optuna==3.6.0
pandas>=2.0,<=2.2.1
bayesian-optimization==1.4.0
colorama==0.4.6
tensorboard>=2.14.0,<=2.16.2
protobuf==3.20.3
SQLAlchemy==1.4.17
urllib3<=1.26.15
3 changes: 3 additions & 0 deletions tools/AutoTuner/requirements-vizier.txt
@@ -0,0 +1,3 @@
jax<=0.4.33
google-vizier[jax]

> **Contributor** (review comment on `google-vizier[jax]`): Could you lock these requirements?


tqdm
12 changes: 1 addition & 11 deletions tools/AutoTuner/requirements.txt
@@ -1,11 +1 @@
ray[default]==2.9.3