Add adaptive training features for improved GAN stability #499

Open
Rakshitha-Ireddi wants to merge 1 commit into sdv-dev:main from Rakshitha-Ireddi:feature/adaptive-training-stability

Conversation


@Rakshitha-Ireddi Rakshitha-Ireddi commented Feb 21, 2026

Authors

  • Ireddi Rakshitha
  • Yaswanth Devavarapu

This PR introduces research-level adaptive training features to CTGAN that improve training stability and convergence through dynamic parameter adjustment mechanisms.

Features Added

1. Adaptive Discriminator-Generator Step Balancing

  • Dynamically adjusts `discriminator_steps` based on loss ratio between generator and discriminator
  • Automatically balances training when one network becomes too strong
  • Prevents mode collapse and improves convergence
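One way to picture the balancing logic is as a loss-ratio heuristic; the sketch below is illustrative only (the function name, thresholds, and step bounds are assumptions, not the PR's actual implementation):

```python
def adjust_discriminator_steps(current_steps, gen_loss, disc_loss,
                               ratio_high=2.0, ratio_low=0.5,
                               min_steps=1, max_steps=5):
    """Illustrative heuristic: give the weaker network relatively more updates.

    A high generator/discriminator loss ratio suggests the discriminator is
    winning, so it gets fewer steps per generator step, and vice versa.
    """
    ratio = abs(gen_loss) / (abs(disc_loss) + 1e-8)
    if ratio > ratio_high:
        # Generator is falling behind: train the discriminator less often.
        return max(min_steps, current_steps - 1)
    if ratio < ratio_low:
        # Discriminator is falling behind: train it more often.
        return min(max_steps, current_steps + 1)
    return current_steps
```

Calling this once per epoch with the latest epoch losses would let `discriminator_steps` drift toward whichever setting keeps the two losses comparable.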

2. Gradient Clipping and Monitoring

  • Optional gradient clipping to prevent gradient explosion
  • Tracks gradient norms for both generator and discriminator
  • Improves training stability, especially for difficult datasets
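In PyTorch this is typically done with `torch.nn.utils.clip_grad_norm_`, which both rescales gradients and returns the pre-clipping norm for monitoring. The pure-Python sketch below shows the underlying global-norm rescaling idea (the function name and epsilon constant are illustrative):

```python
import math

def clip_by_global_norm(grads, max_norm, eps=1e-8):
    """Rescale a flat list of gradient values so their L2 norm is at most max_norm.

    Returns the (possibly rescaled) gradients together with the pre-clipping
    norm, which can be logged to track training stability over epochs.
    """
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / (total_norm + eps)
        grads = [g * scale for g in grads]
    return grads, total_norm
```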

3. Adaptive Learning Rate Scheduling

  • Automatically reduces learning rate when loss plateaus
  • Configurable patience period and reduction factor
  • Applied independently to both generator and discriminator optimizers
  • Helps escape local minima and improve final convergence
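This mirrors PyTorch's `torch.optim.lr_scheduler.ReduceLROnPlateau`. A minimal stand-alone sketch of the plateau logic (class name and defaults are assumptions for illustration):

```python
class PlateauLRScheduler:
    """Reduce the learning rate by `factor` after `patience` consecutive
    epochs without a meaningful improvement in the tracked loss."""

    def __init__(self, lr, patience=5, factor=0.5, min_delta=1e-4):
        self.lr = lr
        self.patience = patience
        self.factor = factor
        self.min_delta = min_delta
        self.best = float('inf')
        self.bad_epochs = 0

    def step(self, loss):
        if loss < self.best - self.min_delta:
            self.best = loss            # Improvement: reset the counter.
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
            if self.bad_epochs > self.patience:
                self.lr *= self.factor  # Plateau detected: decay the LR.
                self.bad_epochs = 0
        return self.lr
```

Applying it independently, as the PR describes, would mean one scheduler instance per optimizer, each tracking its own network's loss.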

4. Early Stopping

  • Stops training automatically when loss stops improving
  • Configurable patience period
  • Prevents overfitting and saves computational resources
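The stopping criterion can be sketched with the same plateau-counting pattern (again a hypothetical illustration, not the PR's code):

```python
class EarlyStopper:
    """Signal a stop once the loss has not improved for `patience` epochs."""

    def __init__(self, patience=10, min_delta=1e-4):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float('inf')
        self.bad_epochs = 0

    def should_stop(self, loss):
        if loss < self.best - self.min_delta:
            self.best = loss     # New best loss: reset the patience counter.
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience
```

In a training loop this reduces to `if stopper.should_stop(epoch_loss): break`.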

5. Generator Eval Mode Fix

  • Switches the generator to eval mode during sampling so layers such as batch norm behave deterministically (addresses issue sdv-dev#309)

Implementation Details

  • All features are backward compatible and opt-in (disabled by default)
  • Comprehensive unit tests added for all new features
  • Follows existing code patterns and Google-style docstrings
  • No breaking changes to existing API

Usage Example

from ctgan import CTGAN

ctgan = CTGAN(
    epochs=100,
    adaptive_training=True,       # Enable adaptive step balancing
    gradient_clipping=1.0,        # Clip gradients to norm 1.0
    early_stopping=True,          # Enable early stopping
    early_stopping_patience=10,   # Wait 10 epochs
    adaptive_lr=True,             # Enable adaptive LR
    lr_patience=5,                # Reduce LR after 5 epochs
    lr_factor=0.5,                # Reduce LR by 50%
)

ctgan.fit(data, discrete_columns=['col1', 'col2'])
synthetic_data = ctgan.sample(1000)

Research Impact

These features address common challenges in GAN training:

  • Training instability: Gradient clipping and monitoring
  • Mode collapse: Adaptive step balancing
  • Slow convergence: Adaptive learning rate scheduling
  • Overfitting: Early stopping mechanism
  • Inconsistent sampling: Proper eval mode handling

Files Changed

  • `ctgan/synthesizers/ctgan.py`: Core implementation of adaptive features
  • `tests/unit/synthesizer/test_ctgan.py`: Comprehensive test coverage

Checklist

  • Code follows existing patterns and style
  • All tests pass
  • Backward compatible (all features opt-in)
  • Comprehensive documentation in docstrings
  • No markdown files committed

- Implement adaptive discriminator-generator step balancing based on loss convergence
- Add gradient clipping and gradient norm monitoring for training stability
- Implement adaptive learning rate scheduling based on loss plateaus
- Add early stopping mechanism based on convergence metrics
- Fix generator eval mode during sampling (addresses issue sdv-dev#309)
- Add comprehensive unit tests for all new features

This PR introduces research-level features that improve CTGAN training stability
and convergence through adaptive mechanisms that dynamically adjust training
parameters based on loss behavior.
@Rakshitha-Ireddi Rakshitha-Ireddi requested a review from a team as a code owner February 21, 2026 05:55
@Rakshitha-Ireddi Rakshitha-Ireddi requested review from sarahmish and removed request for a team February 21, 2026 05:55