A general framework for building and training constructive feed-forward neural networks. It provides an implementation of the sibling-descendant CCNN (Cascade-Correlation Neural Network) [1, 2] with extensible wrappers for tensorflow, keras, scipy, and scikit-learn, and also supports custom network topologies, training algorithms, and loss functions [3, 4].
Installation | Features | Examples | References
## Installation

```
pip install pyccnn
```
## Features

- Regression and Bayesian Regression
- Classification
- Unsupervised Learning
## Examples

A simple regression problem:
```python
import tensorflow.compat.v1 as tf

from pyccnn.core import activations, losses
from pyccnn.core.model import CCNN
from pyccnn.core.monitor import EarlyStoppingMonitor
from pyccnn.core.units.perceptron import TensorflowPerceptron, ScipyPerceptron

# read the data and split into train and test sets
X_train, X_test, y_train, y_test = ...

# build the cascade-correlation network: a linear output unit trained with MSE,
# and tanh candidate units trained on the S1 objective with early stopping
output_unit = ScipyPerceptron(activations=[activations.linear], loss_function=losses.mse)
candidate_unit = TensorflowPerceptron([tf.nn.tanh], losses.S1, EarlyStoppingMonitor(1e-3, 500, 10000))
ccnn = CCNN(1, 1, output_unit, candidate_unit, losses.fvu)

# train the network, monitoring performance on the held-out test set
ccnn.train(X_train, y_train, EarlyStoppingMonitor(1e-10, 10, 10), X_test, y_test)
```
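The data split placeholder above can come from any source of paired inputs and targets. As a minimal, self-contained sketch (using numpy and scikit-learn's `train_test_split`, which are illustrative choices and not part of pyccnn), one could generate a synthetic 1-D regression dataset like this:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# synthetic 1-D regression task: y = sin(x) + Gaussian noise
rng = np.random.RandomState(0)
X = rng.uniform(-3.0, 3.0, size=(500, 1))
y = np.sin(X) + 0.1 * rng.randn(500, 1)

# hold out 20% of the samples for monitoring generalization during training
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
```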
Other examples can be found here.
## References

1. Fahlman, Scott E., and Christian Lebiere. "The Cascade-Correlation Learning Architecture." Advances in Neural Information Processing Systems (NIPS), 1989.
2. Baluja, Shumeet, and Scott E. Fahlman. "Reducing Network Depth in the Cascade-Correlation Learning Architecture." Technical report, Carnegie Mellon University, School of Computer Science, 1994.
3. Kwok, Tin-Yau, and Dit-Yan Yeung. "Bayesian Regularization in Constructive Neural Networks." International Conference on Artificial Neural Networks, Springer, 1996.
4. Kwok, Tin-Yau, and Dit-Yan Yeung. "Objective Functions for Training New Hidden Units in Constructive Neural Networks." IEEE Transactions on Neural Networks 8.5 (1997): 1131-1148.
5. Cascade-Correlation tutorial (Shultz): https://www.psych.mcgill.ca/perpg/fac/shultz/personal/Recent_Publications_files/cc_tutorial_files/v3_document.htm