Commit f92232e

docs(faq): Remove outdated KFPv1 and Rok references
- Remove MiniKF/Rok integration mention (no longer maintained)
- Remove PersistentVolumeClaim section (KFPv2 uses artifacts, not volumes)
- Update compiler errors section for KFPv2 compatibility
- Simplify ModuleNotFoundError solution with current approaches

Signed-off-by: Stefano Fioravanzo <stefano.fioravanzo@gmail.com>
1 parent: 7e6edf3

1 file changed

Lines changed: 12 additions & 38 deletions

File tree

FAQ.md

@@ -18,25 +18,9 @@ pipeline, Kale sets the notebook server's image as the steps' base image (or a
 custom user-defined image), so all those incremental changes (e.g. new
 installations) will be lost.
 
-You will notice this is not happening in our CodeLab because, when running in
-MiniKF, Kale integrates with Rok, a data management platform that takes care of
-snapshotting the mounted volumes and making them available to the pipeline step.
-Thus preserving the exact development environment found in the notebook.
-
-### Pod has unbound immediate PersistentVolumeClaim
-
-In order to pass data, Kale mounts a data volume on each pipeline step. Since steps
-can run concurrently, your storage class needs to support `RWX`
-(`ReadWriteMany`) volumes. If that is not the case, the pod will be left
-unschedulable, as it won't find this kind of resource.
-
-What you can do in this case is either install a storage class that enables
-`RWX` volumes or:
-
-1. Retrieve the `.py` file generated by Kale (it should be next to the `.ipynb`)
-2. Search for the `marshal_vop` definition (`marshal_vop = dsl.VolumeOp...`)
-3. Change `modes=dsl.VOLUME_MODE_RWM` to `modes=dsl.VOLUME_MODE_RWO`
-4. Run the `.py` file
+To solve this, you can either:
+1. Build a custom Docker image with all your dependencies pre-installed
+2. List additional packages in the cell tags that Kale will include in `packages_to_install`
 
 ### Data passing and pickle errors
 
@@ -61,27 +45,17 @@ implemented.
 
 ### Compiler errors
 
-When compiling your notebook you may encounter the following error:
-```
-Internal compiler error: Compiler has produced Argo-incompatible workflow.
-Please create a new issue at https://github.com/kubeflow/pipelines/issues attaching the pipeline code and the pipeline package.
-```
-followed by some explanation. For example:
-```
-Error: time="2020-10-12T17:57:45-07:00" level=fatal msg="/dev/stdin failed to parse: error unmarshaling JSON: while decoding JSON: json: unknown field \"volumes\""
-```
-
-This is an error raised by the KFP compiler. Kale's compile process involves
-converting the notebook to KFP DSL and then compiling it, so it triggers the KFP compiler.
+If you encounter compiler errors, ensure you're using a compatible version of
+KFP (v2.4.0+). The KFP v2 compiler produces IR YAML that is submitted to the
+Kubeflow Pipelines backend.
 
-The KFP compiler runs `argo lint` on the generated workflow, if it finds the
-`argo` executable in your environment's `PATH`.
+Common issues:
+- **Missing dependencies**: Ensure all required packages are listed in your imports cell
+- **Invalid Python syntax**: Check that your notebook cells contain valid Python 3.12+ code
+- **Type mismatches**: KFP v2 uses typed artifacts; ensure inputs/outputs match expected types
 
-To overcome this issue, you could either remove `argo` from your `PATH` or
-replace it with a version that is supported by KFP. At the time of writing this
-section, the recommended version was 2.4.3. Follow [this
-link](https://github.com/argoproj/argo/releases/tag/v2.4.3) to get the proper
-binary.
+If issues persist, check the generated `.kale.py` file in the `.kale/` directory
+and file an issue at https://github.com/kubeflow-kale/kale/issues.
 
 ## Limitations
 