Over-squashing is a project that addresses information bottlenecks in graph neural networks (GNNs) by implementing effective mechanisms to mitigate over-squashing. It is written in Python and builds on deep learning frameworks including PyTorch and PyTorch Geometric. The goal is to improve the ability of GNNs to handle long-range dependencies without suffering information loss or compression, thereby improving performance on tasks that require deep relational information.
To set up the project environment and install all necessary dependencies, follow these steps:
- Clone the repository:

  ```bash
  git clone https://github.com/yonatansverdlov/Over-squashing.git
  ```

- Navigate into the project directory:

  ```bash
  cd Over-squashing
  ```

- Create a new Conda environment and activate it:

  ```bash
  conda create --name oversquash -c conda-forge python=3.11
  conda activate oversquash
  ```

- Install the necessary dependencies from the `requirements.txt` file:

  ```bash
  pip install -r requirements.txt
  ```
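After installation, a quick sanity check like the one below can confirm that PyTorch and PyTorch Geometric import correctly. This is only an illustration, not part of the project's scripts; the exact versions printed depend on what `requirements.txt` pins.

```bash
# Optional sanity check: confirm PyTorch and PyTorch Geometric are importable
# in the new environment (versions depend on requirements.txt).
python -c "import torch, torch_geometric; print(torch.__version__, torch_geometric.__version__)"
```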
We present three types of experiments: over-squashing experiments, transductive learning, and MolHIV/LRGB.
For the over-squashing experiments, first run:

```bash
cd bottleneck/script
```

Choose a data_type from one of the four options: Ring, Tree, CrossRing, CliqueRing. Then choose a radius: between 2 and 8 for Tree, and between 2 and 15 for the others.

If all radii are needed, run:

```bash
python train.py --dataset_name data_type --all True
```

Otherwise, run:

```bash
python train.py --dataset_name data_type --radius radius
```
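As a usage illustration (not an official script from this repository), the four synthetic dataset types can be swept in a single shell loop using only the flags documented above:

```bash
# Hypothetical sweep: train on every synthetic dataset type across all radii,
# using the --dataset_name and --all flags documented above.
for data_type in Ring Tree CrossRing CliqueRing; do
    python train.py --dataset_name "$data_type" --all True
done
```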
For the transductive learning experiments, first run:

```bash
cd bottleneck/script
```

Select a data_type from the following nine options: Cora, Cite, Pubm, Cham, Squi, Actor, Corn, Texas, Wisc. Next, choose the number of different seeds (between 1 and 10), indicated by repeat, and run:

```bash
python train.py --dataset_name data_type --repeat repeat
```
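For example, a concrete invocation (purely illustrative, built from the flags above) that runs Cora with 10 seeds, or loops over all nine datasets, could look like:

```bash
# Single dataset, 10 different seeds:
python train.py --dataset_name Cora --repeat 10

# Hypothetical loop over all nine transductive datasets, 5 seeds each:
for data_type in Cora Cite Pubm Cham Squi Actor Corn Texas Wisc; do
    python train.py --dataset_name "$data_type" --repeat 5
done
```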
For the MolHIV and LRGB experiments, set up a separate environment:

- Create a new Conda environment and activate it:

  ```bash
  cd Over-squashing
  conda create --name lrgb -c conda-forge python=3.10
  conda activate lrgb
  ```

- Install the necessary dependencies from the `lrgb_requirements.txt` file:

  ```bash
  pip install -r lrgb_requirements.txt
  ```
This project is licensed under the MIT License.
For any questions or feedback, reach out to me at [email protected].