
WIP: Create a backwards compatible flow for xo/xclbin generation with Calyx-AXI-wrapper #2267

Draft: wants to merge 81 commits into base: main

Conversation

nathanielnrn (Contributor) commented Aug 20, 2024

At the highest level this PR should allow for correct xclbins to be produced from Calyx programs.

Broadly speaking, this PR does 2 things (sorry for combining them, but small fixes are sprinkled throughout, making them hard to separate into their own changes):

  1. This PR introduces an axi_controller_generator.py, which is meant to create a subordinate adhering to Xilinx's control spec. There might be some issues with this, as execution on hardware seems to hang at the moment, and the only difference in the generated Verilog is the addition of the controller subordinate AFAIK.
  2. Adds some fud2 functions to go from Verilog to a .xo. This is different from the previous .futil-to-.xo flow (which should eventually be deprecated), and is the "correct" way to create a .xo with the new Calyx-AXI-wrappers.

I'll do my best to comment the PR heavily to explain which changes affect what.

Furthermore, I'm not sure whether the controller itself needs a deep dive w.r.t. code review. There are probably problems with it that require waveform debugging to uncover.

EDIT: Backwards compatibility is more work than I thought. I've commented where changes are needed.

In general there are a few outstanding TODOs even after this PR is merged. I'll make issues about these:

  • Update the static wrapper generator to correctly add a Xilinx control subordinate to the wrapper.
  • Change axi-generator.py to axi_generator.py
  • Test that xclbins get produced correctly with our static wrapper (currently this has mainly been tested on dynamic versions).
  • Figure out why the current dynamic wrapper seems to hang when running on actual hardware (likely an issue with the control subordinate)
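The hang in the last bullet revolves around the Xilinx ap_ctrl handshake that the control subordinate must implement. As a debugging reference, here is a minimal software model of the 0x00 control register (bit 0 ap_start, bit 1 ap_done, bit 2 ap_idle, following the standard ap_ctrl_hs layout); the class and its method names are illustrative only and are not part of this PR.

```python
# Simplified model of the ap_ctrl_hs control register at offset 0x00.
# Bit positions follow the standard Xilinx layout.
AP_START, AP_DONE, AP_IDLE = 1 << 0, 1 << 1, 1 << 2


class ControlModel:
    """Software model of the kernel's 0x00 control register."""

    def __init__(self):
        self.running = False
        self.done_latched = False

    def write_ctrl(self, value: int) -> None:
        # Host sets ap_start; the kernel launches and hardware
        # clears the bit, so subsequent reads show it as 0.
        if value & AP_START and not self.running:
            self.running = True

    def kernel_finished(self) -> None:
        # Kernel pulses ap_done; it must stay latched until read.
        self.running = False
        self.done_latched = True

    def read_ctrl(self) -> int:
        bits = 0
        if self.done_latched:
            bits |= AP_DONE
            self.done_latched = False  # ap_done is clear-on-read
        if not self.running:
            bits |= AP_IDLE
        return bits
```

A subordinate that fails to latch ap_done until the host's read, or never reports ap_idle, will look exactly like a hang from XRT's point of view, which makes this handshake a plausible first suspect.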

nathanielnrn and others added 30 commits June 19, 2024 17:18
TODO: Hook up the slices in the highest level module for ap_done,
ap_start. Also think about go/done signals and how these connect to the rest of
the wrapper
* Prepare fifo for case idiom

* Attempt with . Failing due to empty control

* Delete calyx-py/test/correctness/queues/fifo.futil
TODO: Hook up ap_start from controller in wrapper to the main_compute
module
This works around #2198 causing combinational loops
Comment on lines 135 to 146
    def output_with_attributes(
        self,
        name: str,
        size: int,
        attribute_literals: List[Union[str, Tuple[str, int]]],
    ) -> ExprBuilder:
        """Declare an output port on the component with attributes.

        Returns an expression builder for the port.
        """
        return self._port_with_attributes(name, size, False, attribute_literals)
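For context on the attribute_literals parameter: Calyx port attributes are either bare names (e.g. @data) or name/value pairs (e.g. @done(1)). A hypothetical renderer showing the intended shape of that list (the real builder stores attributes structurally rather than as strings; this helper is illustrative only):

```python
from typing import List, Tuple, Union


def render_attributes(attrs: List[Union[str, Tuple[str, int]]]) -> str:
    """Render attribute literals in Calyx surface syntax.

    A bare string becomes @name; a (name, value) pair becomes @name(value).
    """
    parts = []
    for a in attrs:
        if isinstance(a, tuple):
            name, value = a
            parts.append(f"@{name}({value})")
        else:
            parts.append(f"@{a}")
    return " ".join(parts)
```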

nathanielnrn (Contributor, Author):
This is some unrelated cleanup; the output method already exists and does the same thing.

@@ -43,4 +43,4 @@ ipx::save_core [ipx::current_core]
close_project -delete

# Package the project as an .xo file.
-package_xo -xo_path ${xoname} -kernel_name Toplevel -ip_directory ${path_to_packaged} -kernel_xml ./kernel.xml
+package_xo -xo_path ${xoname} -kernel_name wrapper -ip_directory ${path_to_packaged} -kernel_xml ./kernel.xml
nathanielnrn (Contributor, Author):

TODO: Another breaking change from the Verilog wrapper; probably just change the name of wrapper to Toplevel.

nathanielnrn (Contributor, Author):

This whole file is largely just cleanup: getting rid of dead code and simplifying some sed commands.

nathanielnrn (Contributor, Author):

These are the fud2 changes

Comment on lines 91 to 100
timeout = 5000
# Base addresses for memories
await tb.control_manager.write(0x0010, encode([0x0], 4))
await tb.control_manager.write(0x0014, encode([0x0], 4))
await tb.control_manager.write(0x0018, encode([0x0], 4))
await tb.control_manager.write(0x001C, encode([0x0], 4))
await tb.control_manager.write(0x0020, encode([0x0], 4))
await tb.control_manager.write(0x0024, encode([0x0], 4))
# Assert ap_start by writing 1 to 0x0000
await tb.control_manager.write(0x0000, encode([0x1], 1))
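For reference, a plausible stand-in for the testbench's encode() helper and for the register offsets used above: pack each integer into little-endian bytes of the given width, with one 4-byte base-address register per memory starting at 0x10. Both the implementation and the offset derivation are assumptions about the cocotb harness, not its actual code.

```python
# Hypothetical sketch of the testbench's encode() helper: little-endian
# byte packing, as expected on an AXI-Lite write data channel.
def encode(values, width_bytes):
    """Pack each integer into `width_bytes` little-endian bytes."""
    return b"".join(v.to_bytes(width_bytes, "little") for v in values)


# One 4-byte base-address register per kernel memory, starting at 0x10,
# matching the hardcoded offsets 0x10..0x24 written above.
def base_addr_offsets(n_mems, start=0x10, stride=4):
    return [start + i * stride for i in range(n_mems)]
```

Making the test compute these offsets from the .yxi memory list (rather than hardcoding six writes) is exactly the dynamism the TODO below asks for.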
nathanielnrn (Contributor, Author):

TODO: Change this from being hardcoded to being dynamic, and also dependent on whether we are targeting Xilinx.

@@ -23,7 +23,7 @@ def gen_xml(yxi):
     root,
     "kernel",
     {
-        "name": yxi["toplevel"],
+        "name": "wrapper",
nathanielnrn (Contributor, Author):

TODO: Might need to change this from wrapper to Toplevel, depending on the backwards-compatibility fix mentioned above.
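The underlying constraint in both of these TODOs is that the kernel name written into kernel.xml must match the -kernel_name handed to package_xo, whichever spelling ("wrapper" or "Toplevel") wins. A minimal sketch of that invariant using ElementTree (the real gen_xml emits many more attributes and child elements):

```python
import xml.etree.ElementTree as ET


def kernel_xml(kernel_name: str) -> str:
    """Emit a skeletal kernel.xml whose kernel name is parameterized.

    Passing the same name here and to package_xo's -kernel_name flag
    keeps the two in sync; the attribute set is illustrative only.
    """
    root = ET.Element("root", {"versionMajor": "1", "versionMinor": "6"})
    ET.SubElement(root, "kernel", {"name": kernel_name, "language": "ip_c"})
    return ET.tostring(root, encoding="unicode")
```

Threading one name through both the Tcl script and gen_xml would remove this class of mismatch entirely.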

@nathanielnrn nathanielnrn marked this pull request as draft August 20, 2024 06:00
Base automatically changed from limited-ref-ports to main August 28, 2024 16:00
rachitnigam (Contributor):

@sampsyo @nathanielnrn what should we do with this PR? I assume @nathanielnrn has massively less time to directly work on this, so we should make a decision about how we're going to merge things or close them out.
