# 🎯 EasyHPO Interface

EasyHPO provides a simplified, user-friendly interface for hyperparameter optimization: it requires minimal setup while still offering powerful capabilities, including LLM enhancement.

- **One-Liner Optimization**: just pass your data and get optimized hyperparameters
- **LLM-Enhanced**: intelligent parameter suggestions using Large Language Models
- **⚙️ Auto-Configuration**: automatically configures hyperparameter spaces based on task type

## Quick Start

### Basic Usage: Just Pass Data

The simplest way to use EasyHPO:

```python
from MAT_HPO_LIB import EasyHPO
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Create your dataset (n_informative is raised from its default of 2 so that
# n_classes * n_clusters_per_class <= 2**n_informative holds for 3 classes)
X, y = make_classification(n_samples=1000, n_features=20, n_classes=3,
                           n_informative=5)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2)

# One-liner optimization
optimizer = EasyHPO(task_type="classification", max_trials=30)
results = optimizer.optimize(X_train, y_train, X_val, y_val)

print(f"Best hyperparameters: {results['best_hyperparameters']}")
print(f"Best performance: {results['best_performance']}")
```

### LLM-Enhanced Optimization

Enable LLM-enhanced optimization for intelligent hyperparameter suggestions:

```python
optimizer = EasyHPO(
    task_type="time_series_classification",
    llm_enabled=True,            # Enable LLM enhancement
    llm_model="llama3.2:3b",     # LLM model to use
    llm_strategy="adaptive",     # Smart adaptive strategy
    max_trials=30
)
results = optimizer.optimize(X_train, y_train, X_val, y_val)
```

## LLM Strategies

- **`"fixed_alpha"`**: a fixed mixing ratio between LLM and RL suggestions. Stable and predictable.
- **`"adaptive"`**: dynamically adjusts the ratio based on performance trends. Intelligent and self-tuning.
- **`"performance_based"`**: calls on the LLM when performance stagnates. Efficient resource usage.
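As a rough sketch (illustrative logic only, not MAT_HPO_LIB's actual implementation; the function name and signature are invented), the three strategies can be viewed as different rules for choosing how much weight the LLM's suggestion gets on each trial:

```python
def llm_weight(strategy, history, fixed=0.5, window=5, tol=1e-3):
    """Toy model of the three strategies. `history` holds validation
    scores from past trials (higher is better); the return value is the
    weight (0..1) given to the LLM suggestion versus the RL agent's."""
    if strategy == "fixed_alpha":
        return fixed                       # constant blend
    if strategy == "adaptive":
        if len(history) < 2:
            return fixed
        trend = history[-1] - history[-2]  # lean on the LLM while improving
        return min(1.0, max(0.0, fixed + trend))
    if strategy == "performance_based":
        recent = history[-window:]
        stagnant = len(recent) == window and max(recent) - min(recent) < tol
        return 1.0 if stagnant else 0.0    # consult the LLM only when stuck
    raise ValueError(f"unknown strategy: {strategy}")
```

The point is only the shape of each rule: constant, trend-following, or triggered by stagnation.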

## Task Types and Auto-Configuration

EasyHPO automatically configures hyperparameter spaces based on task type:

### Time Series Classification

```python
optimizer = EasyHPO(task_type="time_series_classification", max_trials=30)
results = optimizer.optimize(X_train, y_train, X_val, y_val)
```

Automatically optimizes:

- `hidden_size`: [32, 64, 128, 256, 512]
- `learning_rate`: [0.0001, 0.01]
- `batch_size`: [16, 32, 64, 128]
- `dropout`: [0.0, 0.5]
- `class_weight_N`: [0.5, 3.0] (for imbalanced datasets)
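To make the space concrete, here is a hand-rolled sketch of sampling one candidate from those ranges (EasyHPO builds and searches the space internally; `SPACE` and `sample` are illustrative names, and real optimizers often sample `learning_rate` log-uniformly rather than uniformly as done here):

```python
import random

# Discrete choices and continuous ranges mirroring the auto-configured space
SPACE = {
    "hidden_size": [32, 64, 128, 256, 512],
    "batch_size": [16, 32, 64, 128],
    "learning_rate": (0.0001, 0.01),
    "dropout": (0.0, 0.5),
    "class_weight_1": (0.5, 3.0),
}

def sample(space, rng=random):
    """Draw one candidate: pick from lists, sample (lo, hi) ranges."""
    return {
        name: rng.choice(spec) if isinstance(spec, list) else rng.uniform(*spec)
        for name, spec in space.items()
    }

candidate = sample(SPACE)
```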

### ❤️ ECG Classification

```python
# Specialized for ECG data
optimizer = EasyHPO(task_type="ecg_classification", max_trials=30)
results = optimizer.optimize(X_train, y_train, X_val, y_val)

# Or use the convenience function
from MAT_HPO_LIB.easy_hpo import quick_ecg_optimize
best_params = quick_ecg_optimize(X_train, y_train, X_val, y_val)
```

## ⚙️ Configuration Options

```python
optimizer = EasyHPO(
    task_type="time_series_classification",  # Task type for auto-config
    llm_enabled=True,                        # Enable LLM enhancement
    llm_model="llama3.2:3b",                 # LLM model to use
    llm_strategy="adaptive",                 # LLM mixing strategy
    max_trials=30,                           # Number of optimization trials
    timeout_minutes=120,                     # Maximum time (None for no limit)
    auto_save=True,                          # Auto-save results
    output_dir="./my_hpo_results",           # Output directory
    verbose=True                             # Print progress
)
```

## Results and Performance

```python
results = optimizer.optimize(X_train, y_train, X_val, y_val)

# Best hyperparameters
best_params = results['best_hyperparameters']

# Performance metrics
performance = results['best_performance']
print(f"F1: {performance['f1']:.4f}")
print(f"AUC: {performance['auc']:.4f}")

# Optimization statistics
stats = results['optimization_stats']
print(f"Trials completed: {stats['trials_completed']}")
print(f"Total time: {stats['optimization_time']:.2f}s")
```
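With `auto_save=True`, results are written to `output_dir`. The equivalent manual step is a plain JSON dump of the results dict; in this sketch the file name and all metric values are made up for illustration, only the dict's keys come from the examples above:

```python
import json
import pathlib

results = {  # shaped like the dict returned by optimize(); values invented
    "best_hyperparameters": {"hidden_size": 128, "learning_rate": 0.003},
    "best_performance": {"f1": 0.91, "auc": 0.95},
    "optimization_stats": {"trials_completed": 30, "optimization_time": 812.4},
}

out_dir = pathlib.Path("./my_hpo_results")
out_dir.mkdir(parents=True, exist_ok=True)
path = out_dir / "best_result.json"  # hypothetical file name
path.write_text(json.dumps(results, indent=2))
```

A JSON snapshot like this makes a run easy to diff, version, or reload later.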

## Complete Example: ECG Classification

```python
from MAT_HPO_LIB import EasyHPO
import numpy as np
from sklearn.model_selection import train_test_split

# Simulate ECG data (in practice, load your real data)
n_samples, sequence_length = 1000, 500
X = np.random.randn(n_samples, sequence_length, 1)  # ECG signals
y = np.random.randint(0, 9, n_samples)              # 9 cardiac conditions

# Split data
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2,
                                                  stratify=y, random_state=42)

# Optimize with EasyHPO
optimizer = EasyHPO(
    task_type="ecg_classification",
    llm_enabled=True,              # Use LLM for intelligent suggestions
    max_trials=50,                 # More trials for better results
    verbose=True
)

# Run optimization
print("Starting ECG hyperparameter optimization...")
results = optimizer.optimize(X_train, y_train, X_val, y_val)

# Display results
print("\nBest Performance:")
print(f"   F1 Score: {results['best_performance']['f1']:.4f}")
print(f"   AUC: {results['best_performance']['auc']:.4f}")
print(f"   G-mean: {results['best_performance']['gmean']:.4f}")

print("\n⚙️ Best Hyperparameters:")
for param, value in results['best_hyperparameters'].items():
    print(f"   {param}: {value}")
```

## Next Steps

- **LLM Strategies Guide**: learn about the different LLM enhancement strategies
- **Working Examples**: see complete code examples and use cases
- **Full API Reference**: detailed documentation for advanced usage