cascade-correlation-neural-networks

A general framework for building and training constructive feed-forward neural networks. It provides an implementation of the sibling-descendant cascade-correlation architecture (CCNN) [1, 2] with extensible wrappers for tensorflow, keras, scipy, and scikit-learn, and it supports custom topologies, training algorithms, and loss functions [3, 4].


Installation | Features | Examples | References

Installation

pip install pyccnn
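
To check the install, the core import used in the examples below should succeed:

python -c "from pyccnn.core.model import CCNN"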

Features

Regression and Bayesian Regression

Classification

Unsupervised Learning

Examples

Simple regression problem:

import tensorflow.compat.v1 as tf

from pyccnn.core import activations, losses
from pyccnn.core.model import CCNN
from pyccnn.core.monitor import EarlyStoppingMonitor
from pyccnn.core.units.perceptron import TensorflowPerceptron, ScipyPerceptron

# read the data and split into train and test sets
X_train, X_test, y_train, y_test = ...

# build the cascade-correlation network: a linear output unit trained by
# minimizing mean squared error, and tanh candidate units trained on the
# S1 objective [4] with early stopping
output_unit = ScipyPerceptron(activations=[activations.linear], loss_function=losses.mse)
candidate_unit = TensorflowPerceptron([tf.nn.tanh], losses.S1, EarlyStoppingMonitor(1e-3, 500, 10000))
ccnn = CCNN(1, 1, output_unit, candidate_unit, losses.fvu)  # 1 input, 1 output; fvu = fraction of variance unexplained

# train the network with early stopping, evaluating on the held-out test set
ccnn.train(X_train, y_train, EarlyStoppingMonitor(1e-10, 10, 10), X_test, y_test)
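
The data-loading line above is left open. As one possibility (a sketch only; the synthetic sine data below is not part of the library), a 1-input, 1-output regression set matching CCNN(1, 1, ...) could be built with numpy and scikit-learn:

import numpy as np
from sklearn.model_selection import train_test_split

# synthetic 1-D regression task: y = sin(x) plus Gaussian noise
X = np.random.uniform(-3.0, 3.0, size=(500, 1))
y = np.sin(X) + 0.1 * np.random.randn(500, 1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)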

Other examples can be found in the repository's examples directory.
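
For classification (listed under Features above), a minimal sketch follows. The CCNN and monitor calls mirror the regression example, but the sigmoid output configuration and the name losses.cross_entropy are assumptions rather than confirmed parts of the pyccnn API; substitute whichever classification loss pyccnn.core.losses actually exports.

import tensorflow.compat.v1 as tf
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

from pyccnn.core import losses
from pyccnn.core.model import CCNN
from pyccnn.core.monitor import EarlyStoppingMonitor
from pyccnn.core.units.perceptron import TensorflowPerceptron

# two-moons toy data: 2 input features, 1 binary target column
X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y.reshape(-1, 1), test_size=0.2, random_state=0)

# sigmoid output unit; losses.cross_entropy is a guessed name (see note above)
output_unit = TensorflowPerceptron([tf.nn.sigmoid], losses.cross_entropy, EarlyStoppingMonitor(1e-3, 500, 10000))
candidate_unit = TensorflowPerceptron([tf.nn.tanh], losses.S1, EarlyStoppingMonitor(1e-3, 500, 10000))

# 2 inputs, 1 output, as in the two-moons data above
ccnn = CCNN(2, 1, output_unit, candidate_unit, losses.cross_entropy)
ccnn.train(X_train, y_train, EarlyStoppingMonitor(1e-10, 10, 10), X_test, y_test)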

References

  1. Fahlman, Scott E., and Christian Lebiere. "The Cascade-Correlation Learning Architecture." Advances in Neural Information Processing Systems (NIPS), 1989.
  2. Baluja, Shumeet, and Scott E. Fahlman. "Reducing Network Depth in the Cascade-Correlation Learning Architecture." Technical report, Carnegie Mellon University, School of Computer Science, 1994.
  3. Kwok, Tin-Yau, and Dit-Yan Yeung. "Bayesian Regularization in Constructive Neural Networks." International Conference on Artificial Neural Networks. Springer, 1996.
  4. Kwok, Tin-Yau, and Dit-Yan Yeung. "Objective Functions for Training New Hidden Units in Constructive Neural Networks." IEEE Transactions on Neural Networks 8.5 (1997): 1131-1148.
  5. Shultz, Thomas R. Cascade-correlation tutorial: https://www.psych.mcgill.ca/perpg/fac/shultz/personal/Recent_Publications_files/cc_tutorial_files/v3_document.htm