October 22, 2024

Evolutionary Feature Selection: A Novel Wrapper Feature Selection Architecture Based on Evolutionary Strategies

EFS treats feature sets as chromosomes, using crossover and mutation to optimize models, inspired by nature's evolution.

Aaryan Dubey

CW's Deep Learning Engineer


The idea behind EFS 

EFS was inspired by the evolution of creatures in their environment: the way the simplest single-celled organisms can become complex multicellular life by passing through evolutionary cycles, surviving and adapting to the harshest conditions imposed on them. We took inspiration from this and built an algorithm that grows from scratch. Here we treat feature sets as chromosomes and evolve these feature sets to find the optimal solution for the given problem. These sets pass through successive generations and must adapt to survive each generation in order to move into future ones. Some creatures (feature sets) mutate, while others crossbreed with different sets to adapt and evolve against the challenge they face. Together, these steps let the algorithm explore and exploit the search space more efficiently.

What is Feature Selection?

Before diving deeper into EFS, let's understand what feature selection is all about. Feature selection is a crucial process in machine learning that aims to identify and select the most relevant features from a larger pool of potential features. 

"Feature selection is an approach to selecting the best set of features from a feature pool. Its goal is to increase the performance of the machine learning model by providing sufficient information while avoiding redundant or irrelevant features."

By choosing the right features, we can improve model performance, reduce overfitting, and speed up training times.

How EFS utilizes feature selection

  1. Chromosomes as Feature Sets: In EFS, each potential solution (a subset of features) is represented as a chromosome. These chromosomes are bit strings where each bit corresponds to a feature: 1 indicates the feature is selected, and 0 indicates it is not.
  2. Evolutionary Growth: EFS starts with simple solutions (chromosomes with few active features) and gradually increases complexity over generations, mimicking how life evolved from simple to complex organisms.
  3. Generational Adaptation: In each generation, chromosomes (feature sets) must "adapt" to survive. Adaptation is measured by how well the selected features perform in a machine learning model, using a metric such as the F1 score or ROC AUC.
  4. Chaos-Driven Operations: EFS applies chaos theory, specifically a logistic map, to introduce controlled randomness when generating new chromosomes and guiding mutation operations (see the sketch after this list).
  5. Crossover and Mutation:
    • Crossover: EFS combines features from different successful chromosomes to create new feature sets.
    • Mutation: Some chromosomes undergo random changes in their feature selection.
  6. Natural Selection: Only the best-performing feature sets (those that lead to the highest model performance) survive to the next generation.
  7. Exploration vs. Exploitation: The algorithm balances exploring new feature combinations with exploiting known good solutions, allowing it to search the feature space thoroughly.
  8. Evolution Strategies: The core strategy of the proposed architecture is to increase the number of active genes in the chromosomes at each generation, allowing the algorithm to find the minimal set of features that yields the best performance.
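
To make the representation concrete, here is a minimal Python sketch (assuming NumPy) of a bit-string chromosome whose active genes are picked by a logistic-map sequence rather than a plain uniform random generator. The function names and parameter values are illustrative assumptions, not taken from the released code.

```python
import numpy as np

def logistic_map(x, r=4.0):
    # One step of the logistic map, the chaotic sequence used for controlled randomness.
    return r * x * (1.0 - x)

def chaotic_chromosome(n_features, n_active, x0=0.7, r=4.0):
    # Bit-string chromosome of length n_features with n_active genes set to 1,
    # their positions drawn from a logistic-map sequence.
    chromosome = np.zeros(n_features, dtype=int)
    x, active = x0, set()
    while len(active) < n_active:
        x = logistic_map(x, r)
        active.add(int(x * n_features) % n_features)
    chromosome[list(active)] = 1
    return chromosome

print(chaotic_chromosome(n_features=10, n_active=3))  # [0 0 0 0 0 1 0 0 1 1]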

This approach allows EFS to efficiently navigate the vast space of possible feature combinations, gradually honing in on an optimal subset of features for the given problem.

The EFS Algorithm

EFS works by evolving feature sets to find the optimal solution for a given problem. Here's how it operates:

  1. Initialization: Start with simple feature sets (chromosomes).
  2. Generational Growth: Increase complexity over generations.
  3. Adaptation: Feature sets must adapt to survive each generation.
  4. Mutation and Crossbreeding: Some feature sets mutate randomly, while others combine through crossbreeding.
  5. Selection: Only the best-performing sets survive to the next generation.

As the creators describe: "These sets go through different generations and need to adapt to survive in that generation to move into future generations. Some creatures (feature sets) mutate, some perform crossbreeding among different sets to adapt and evolve to the ordeal provided."

This evolutionary approach allows EFS to efficiently explore and exploit the search space, mimicking natural selection to find optimal feature combinations.

Evolutionary Feature Selection (EFS): Step-by-Step Setup

Step 1: Initialization
- Create a set of chromosomes and add them to a "Local List"
- Chromosomes are bit strings of size SF (number of features)
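
As a sketch of Step 1 (assuming NumPy; names such as init_local_list are illustrative, not from the paper or released code):

```python
import numpy as np

def init_local_list(n_chromosomes, n_features, n_active=1, rng=None):
    # Build the initial "Local List": simple chromosomes (bit strings of size SF)
    # with only a few randomly chosen active genes each.
    rng = np.random.default_rng() if rng is None else rng
    local_list = []
    for _ in range(n_chromosomes):
        chromosome = np.zeros(n_features, dtype=int)
        chromosome[rng.choice(n_features, size=n_active, replace=False)] = 1
        local_list.append(chromosome)
    return local_list

local_list = init_local_list(n_chromosomes=5, n_features=8)
```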

Step 2: Mask Creation
- For each chromosome, create a mask, a θ value, and a solution
- Initialize masks randomly and encode using Gray code
- Use masks to determine active genes in crossover and mutation
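
One possible reading of Step 2, sketched below: each chromosome gets a random binary mask that is then Gray-encoded. The θ value and per-chromosome solution bookkeeping from the paper are omitted here, and the helper names are assumptions.

```python
import numpy as np

def binary_to_gray(bits):
    # Binary-reflected Gray code: keep the first bit, XOR each remaining bit
    # with its left neighbour.
    bits = np.asarray(bits, dtype=int)
    gray = bits.copy()
    gray[1:] = bits[1:] ^ bits[:-1]
    return gray

def make_mask(n_features, rng=None):
    # Random binary mask of size SF, Gray-encoded. The mask later decides which
    # genes take part in crossover and mutation.
    rng = np.random.default_rng() if rng is None else rng
    return binary_to_gray(rng.integers(0, 2, size=n_features))

mask = make_mask(n_features=8)
```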

Step 3: Generational Growth Setup

- Set maximum generations (mg) < Number of features (SF)
- Configure number of active genes to increase with each generation

Step 4: Crossover Setup
- Implement uniform crossover: use the mask of the first parent
- Implement simple crossover: Combine first half of one parent with second half of another
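
A minimal sketch of the two crossover operators in Step 4. The exact mask convention (a mask bit of 1 meaning "take the gene from the first parent") is an assumption.

```python
import numpy as np

def uniform_crossover(parent_a, parent_b, mask_a):
    # Uniform crossover driven by the first parent's mask: where the mask bit is 1,
    # take the gene from parent_a, otherwise from parent_b (assumed convention).
    return np.where(mask_a == 1, parent_a, parent_b)

def simple_crossover(parent_a, parent_b):
    # Simple crossover: first half of one parent joined to the second half of the other.
    cut = len(parent_a) // 2
    return np.concatenate([parent_a[:cut], parent_b[cut:]])
```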

Step 5: Mutation Setup
- Implement mutation with mask: OR operation between chromosome and its mask
- Implement simple mutation: Randomly flip bits in selected genes
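
A sketch of the two mutation operators from Step 5, assuming the same NumPy bit arrays as above:

```python
import numpy as np

def mask_mutation(chromosome, mask):
    # Mutation with mask: bitwise OR between the chromosome and its mask,
    # switching on every gene the mask marks as active.
    return chromosome | mask

def simple_mutation(chromosome, n_flips=1, rng=None):
    # Simple mutation: flip a few randomly selected genes (0 -> 1, 1 -> 0).
    rng = np.random.default_rng() if rng is None else rng
    mutated = chromosome.copy()
    idx = rng.choice(len(chromosome), size=n_flips, replace=False)
    mutated[idx] ^= 1
    return mutated
```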

Step 6: Evolution Strategy Setup
- Configure system to increase number of active genes in each generation
- Set up mechanism to combine winners of previous generation with current generation
- Prepare crossover between good and bad performers to maintain diversity
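
One way to read Step 6, sketched below: winners of the previous generation are carried forward unchanged, and good performers are crossed with weaker ones to keep diversity. The ranking scheme and the half/half split are assumptions.

```python
import numpy as np

def evolve_population(population, fitnesses, crossover_fn):
    # Rank chromosomes by fitness, keep the better half as winners, and cross each
    # winner with a weaker chromosome so that diversity is not lost.
    order = np.argsort(fitnesses)[::-1]            # best first
    half = len(population) // 2
    good, bad = order[:half], order[half:]
    offspring = [crossover_fn(population[g], population[b]) for g, b in zip(good, bad)]
    winners = [population[i] for i in good]        # winners move on unchanged
    return winners + offspring
```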

Step 7: Evaluation Setup
- Use K-Nearest Neighbor (KNN) classifier as fitness function
- Set K = 3 with mean absolute distance
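
A hedged sketch of the fitness evaluation in Step 7, using scikit-learn's KNeighborsClassifier. "Mean absolute distance" is read here as the Manhattan (L1) metric, and the cross-validation setup is an assumption.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(chromosome, X, y):
    # Fitness of a chromosome: mean accuracy of a 3-NN classifier trained only on
    # the selected features. Manhattan distance stands in for "mean absolute distance".
    selected = np.flatnonzero(chromosome)
    if selected.size == 0:
        return 0.0                      # an empty feature set cannot survive
    knn = KNeighborsClassifier(n_neighbors=3, metric="manhattan")
    return cross_val_score(knn, X[:, selected], y, cv=5).mean()
```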

Step 8: Termination Criteria
- Set number of generations to (Number of features - 1)
- Prepare to output chromosome with highest fitness value from the global list

Step 9: Main Loop
- For each generation:
  a. Generate new chromosomes
  b. Evaluate chromosomes
  c. Perform selection
  d. Apply crossover and mutation
  e. Update the global list
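
Putting the pieces together, the sketch below wires the helpers from the earlier steps (init_local_list, simple_crossover, simple_mutation, evolve_population, fitness) into the main loop. The growth schedule and population sizes are illustrative assumptions; the released code may differ in detail.

```python
import numpy as np

def efs(X, y, population_size=20, rng=None):
    # Main EFS loop, following Steps 8-10: run (number of features - 1) generations,
    # track every evaluated chromosome in a global list, and return the best one.
    rng = np.random.default_rng() if rng is None else rng
    n_features = X.shape[1]
    max_generations = n_features - 1
    population = init_local_list(population_size, n_features, n_active=1, rng=rng)
    global_list = []                                    # (chromosome, fitness) pairs

    for gen in range(max_generations):
        scores = [fitness(c, X, y) for c in population]             # b. evaluate
        global_list.extend(zip(population, scores))                 # e. update global list
        population = evolve_population(population, scores,          # c./d. select + crossover
                                       simple_crossover)
        population = [simple_mutation(c, rng=rng) for c in population]  # d. mutation
        # a. fresh chromosomes for the next generation carry one more active gene,
        #    reflecting the generational-growth strategy of Steps 3 and 6.
        population += init_local_list(2, n_features,
                                      n_active=min(gen + 2, n_features), rng=rng)

    best, best_score = max(global_list, key=lambda t: t[1])         # Step 10
    return best, best_score
```

Calling efs(X, y) on a NumPy feature matrix and label vector then returns the best bit string found and its cross-validated score.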

Step 10: Final Output
- Return the best-performing chromosome (feature subset) and its performance metrics

Check the open-source code made available by CW's engineers here.

Check the entire paper by Aaryan Dubey, Alexandre Hoppe Inoue, Pedro Terra Fernandes Birmann, Sammuel Ramos da Silva here.