Implement genetic algorithm for optimizing continuous functions #12378
base: master
Conversation
🔗 Relevant Links
Repository:
Python:
Automated review generated by algorithms-keeper. If there's any problem regarding this review, please open an issue about it.
algorithms-keeper commands and options
algorithms-keeper actions can be triggered by commenting on this PR:
@algorithms-keeper review to trigger the checks for only added pull request files
@algorithms-keeper review-all to trigger the checks for all the pull request files, including the modified files. As we cannot post review comments on lines not part of the diff, this command will post all the messages in one comment.
NOTE: Commands are in beta and so this feature is restricted only to a member or owner of the organization.
# Initialize population
self.population = self.initialize_population()

def initialize_population(self) -> list[np.ndarray]:
As there is no test file in this pull request nor any test function or class in the file genetic_algorithm/genetic_algorithm_optimization.py, please provide doctest for the function initialize_population.
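For reference, a doctest here could assert the population size, the per-individual shape, and that every gene stays inside the search bounds. The snippet below is only a self-contained stand-in (a free function with assumed bounds / population_size / seed parameters rather than the PR's actual method), meant to show the shape such a doctest might take:

import numpy as np

def initialize_population(
    bounds: list[tuple[float, float]], population_size: int, seed: int = 0
) -> list[np.ndarray]:
    """
    Create `population_size` random individuals inside `bounds`.

    >>> pop = initialize_population([(-10.0, 10.0), (-10.0, 10.0)], 5, seed=42)
    >>> len(pop)
    5
    >>> all(ind.shape == (2,) for ind in pop)
    True
    >>> all(((-10.0 <= ind) & (ind <= 10.0)).all() for ind in pop)
    True
    """
    rng = np.random.default_rng(seed)  # seeded so the doctest stays deterministic
    low = [b[0] for b in bounds]
    high = [b[1] for b in bounds]
    return [rng.uniform(low, high) for _ in range(population_size)]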
for _ in range(self.population_size)
]

def fitness(self, individual: np.ndarray) -> float:
As there is no test file in this pull request nor any test function or class in the file genetic_algorithm/genetic_algorithm_optimization.py, please provide doctest for the function fitness.
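A doctest for fitness can pin down the sign convention used in the diff (the objective value is negated when minimizing). In the sketch below the objective function and the maximize flag are passed as explicit parameters only so the example runs on its own; in the PR they are instance attributes:

import numpy as np
from collections.abc import Callable

def fitness(
    individual: np.ndarray, function: Callable[..., float], maximize: bool = False
) -> float:
    """
    Score an individual; the value is negated when minimizing so that
    "larger fitness is better" holds in both modes.

    >>> def f(x: float, y: float) -> float:
    ...     return x**2 + y**2
    >>> fitness(np.array([1.0, 2.0]), f, maximize=True)
    5.0
    >>> fitness(np.array([1.0, 2.0]), f, maximize=False)
    -5.0
    """
    value = float(function(*individual))  # ensure a plain Python float
    return value if maximize else -value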
value = float(self.function(*individual))  # Ensure fitness is a float
return value if self.maximize else -value  # If minimizing, invert the fitness

def select_parents(
As there is no test file in this pull request nor any test function or class in the file genetic_algorithm/genetic_algorithm_optimization.py, please provide doctest for the function select_parents.
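The selection strategy is not visible in this fragment, so the sketch below uses simple truncation selection over precomputed fitness scores purely to illustrate a deterministic doctest for a selection step; the real doctest should call the PR's method with its actual signature:

import numpy as np

def select_parents(
    population: list[np.ndarray], fitness_scores: list[float], num_parents: int
) -> list[np.ndarray]:
    """
    Keep the `num_parents` individuals with the highest fitness
    (truncation selection; the PR may use a different scheme).

    >>> population = [np.array([0.0, 0.0]), np.array([1.0, 1.0]), np.array([2.0, 2.0])]
    >>> fitness_scores = [-0.0, -2.0, -8.0]  # minimizing, so fitness = -(x**2 + y**2)
    >>> [p.tolist() for p in select_parents(population, fitness_scores, 2)]
    [[0.0, 0.0], [1.0, 1.0]]
    """
    ranked = sorted(
        zip(fitness_scores, population), key=lambda pair: pair[0], reverse=True
    )
    return [individual for _, individual in ranked[:num_parents]]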
)
)

def evolve(self, verbose=True) -> np.ndarray:
As there is no test file in this pull request nor any test function or class in the file genetic_algorithm/genetic_algorithm_optimization.py, please provide doctest for the function evolve.
Please provide type hint for the parameter: verbose
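On both points: the parameter can be annotated as verbose: bool = True, and since the evolved result is stochastic, a doctest can assert seed-independent properties (output shape, staying inside the bounds) instead of exact values. Below is a minimal self-contained sketch with an assumed stand-alone signature and a simplified mutation-only loop, not the PR's implementation:

import numpy as np
from collections.abc import Callable

def evolve(
    function: Callable[..., float],
    bounds: list[tuple[float, float]],
    population_size: int = 20,
    generations: int = 50,
    mutation_prob: float = 0.1,
    verbose: bool = True,  # type-hinted as requested above
    seed: int = 0,
) -> np.ndarray:
    """
    Run a tiny elitist, mutation-only loop and return the best individual found.

    >>> best = evolve(lambda x, y: x**2 + y**2, [(-10.0, 10.0), (-10.0, 10.0)],
    ...               generations=10, verbose=False, seed=1)
    >>> best.shape
    (2,)
    >>> bool(((-10.0 <= best) & (best <= 10.0)).all())
    True
    """
    rng = np.random.default_rng(seed)
    low = np.array([b[0] for b in bounds], dtype=float)
    high = np.array([b[1] for b in bounds], dtype=float)
    population = [rng.uniform(low, high) for _ in range(population_size)]
    for generation in range(generations):
        # keep the better half, refill with mutated copies of the survivors
        population.sort(key=lambda ind: function(*ind))
        parents = population[: population_size // 2]
        children = []
        for parent in parents:
            mask = rng.random(parent.shape) < mutation_prob
            noise = rng.normal(0.0, 0.5, size=parent.shape)
            children.append(np.clip(parent + mask * noise, low, high))
        population = parents + children
        if verbose:
            print(f"generation {generation}: best value {function(*population[0]):.5f}")
    return min(population, key=lambda ind: function(*ind))

Running python -m doctest on the module would then exercise these examples automatically.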
for more information, see https://pre-commit.ci
Describe your change:
Added a flexible genetic algorithm that allows users to define their own target functions for optimization.
Included features for population initialization, fitness evaluation, selection, crossover, and mutation (a stand-alone sketch of the crossover and mutation steps follows this list).
Example function provided for minimizing f(x, y) = x^2 + y^2.
Configurable parameters for population size, mutation probability, and generations.
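As noted in the feature list above, crossover and mutation are also part of the change; the stand-alone sketch below uses arithmetic crossover and Gaussian mutation, which may differ from the operators actually implemented, just to show that both steps can be covered by doctests as well:

import numpy as np

def crossover(parent1: np.ndarray, parent2: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """
    Arithmetic (blend) crossover of two parents.

    >>> crossover(np.array([0.0, 2.0]), np.array([2.0, 0.0])).tolist()
    [1.0, 1.0]
    """
    return alpha * parent1 + (1.0 - alpha) * parent2

def mutate(
    individual: np.ndarray,
    mutation_prob: float,
    bounds: list[tuple[float, float]],
    seed: int = 0,
) -> np.ndarray:
    """
    Perturb each gene with probability `mutation_prob`, then clip to the bounds.

    >>> mutate(np.array([0.0, 0.0]), 0.0, [(-1.0, 1.0), (-1.0, 1.0)]).tolist()
    [0.0, 0.0]
    >>> mutated = mutate(np.array([0.0, 0.0]), 1.0, [(-1.0, 1.0), (-1.0, 1.0)], seed=3)
    >>> bool(((-1.0 <= mutated) & (mutated <= 1.0)).all())
    True
    """
    rng = np.random.default_rng(seed)
    mask = rng.random(individual.shape) < mutation_prob
    noise = rng.normal(0.0, 0.1, size=individual.shape)
    low = np.array([b[0] for b in bounds], dtype=float)
    high = np.array([b[1] for b in bounds], dtype=float)
    return np.clip(individual + mask * noise, low, high)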
Add an algorithm? ✅ Yes
Fix a bug or typo in an existing algorithm? ❌ No
Add or change doctests? ❌ No
Documentation change? ❌ No
Checklist: