HyperParameter Optimisation (HPO) Tools
- 01. Google Vizier - a Python-based research interface for blackbox and hyperparameter optimization.
- 02. Hyperopt - distributed, asynchronous hyperparameter optimization.
- 03. Optuna - an open-source hyperparameter optimization framework that automates hyperparameter search and works with any machine learning or deep learning framework.
- 04. scikit-optimize - sequential model-based optimization in Python.
- 05. Talos - radically changes the ordinary Keras, TensorFlow (tf.keras), and PyTorch workflow by fully automating hyperparameter tuning and model evaluation.
- 06. Bayesian Optimization - a pure-Python implementation of Bayesian global optimization with Gaussian processes.
- 07. KerasTuner - an easy-to-use, scalable hyperparameter optimization framework that solves the pain points of hyperparameter search.
- 08. NNI - automates feature engineering, neural architecture search, hyperparameter tuning, and model compression for deep learning.
- 09. Ray - a unified framework for scaling AI and Python applications, consisting of a core distributed runtime and a toolkit of libraries (Ray AIR, including Ray Tune for hyperparameter tuning) for simplifying ML compute.
- 10. SHERPA - a Python library for hyperparameter tuning of machine learning models.
- 11. Polyaxon - supports random search and grid search, and provides a simple interface for advanced approaches such as Hyperband and Bayesian optimization.
- 12. mlmachine - a Python library that organizes and accelerates notebook-based machine learning experiments.
- 13. Dragonfly - an open-source Python library for scalable Bayesian optimisation.
- 14. flaml.tune - a module for economical hyperparameter tuning; it frees users from manually tuning many hyperparameters for software such as machine learning training procedures.
- 15. HEBO - Heteroscedastic Evolutionary Bayesian Optimisation.
- 16. Nevergrad - a gradient-free optimization platform.
- 17. SigOpt - a model development platform that makes it easy to track runs, visualize training, and scale hyperparameter optimization for any type of model built with any library on any infrastructure.
- 18. ZOOpt - a zeroth-order optimization toolbox; zeroth-order methods do not rely on the gradient of the objective function but instead learn from samples of the search space.
- 19. GPyOpt - an open-source Python library for Bayesian optimization developed by the Machine Learning group at the University of Sheffield.
- 20. Spearmint - a software package that performs Bayesian optimization, automatically running experiments and adjusting parameters so as to minimize some objective in as few runs as possible.
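All of the libraries above automate some variant of the same loop: sample hyperparameters from a search space, evaluate an objective, and keep the best configuration. A minimal stdlib-only sketch of that loop using plain random search (the objective and parameter names here are illustrative, not the API of any library listed above):

```python
import random


def objective(lr, n_units):
    # Toy stand-in for a validation loss: minimized at lr=0.1, n_units=64.
    return (lr - 0.1) ** 2 + ((n_units - 64) / 64) ** 2


def random_search(n_trials, seed=0):
    """Sample hyperparameters at random and keep the best-scoring trial."""
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {
            "lr": rng.uniform(1e-4, 1.0),              # continuous hyperparameter
            "n_units": rng.choice([16, 32, 64, 128]),  # categorical hyperparameter
        }
        loss = objective(**params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss


best_params, best_loss = random_search(200)
```

Tools such as Hyperopt, Optuna, or scikit-optimize replace the random sampling step with smarter strategies (TPE, Bayesian optimization, Hyperband) and add parallel execution, pruning, and experiment tracking on top of this loop.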
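Several of the entries (e.g. ZOOpt, Nevergrad) are explicitly gradient-free: they optimize using only sampled objective values, never derivatives. A bare-bones (1+1) evolution strategy illustrates the idea (toy objective and names of my choosing, not code from any of the listed packages):

```python
import random


def sphere(x):
    # Toy objective: minimized at x = (0, 0).
    return sum(xi ** 2 for xi in x)


def one_plus_one_es(f, x0, sigma=0.5, n_iters=300, seed=0):
    """Minimal (1+1) evolution strategy: perturb the incumbent with
    Gaussian noise and keep the candidate only if it improves the objective."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(n_iters):
        candidate = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = f(candidate)
        if fc < fx:  # accept only improvements; no gradient information used
            x, fx = candidate, fc
    return x, fx


x_best, f_best = one_plus_one_es(sphere, [2.0, -2.0])
```

Because the loop only ever calls `f`, the same scheme works when the "objective" is a full training run whose gradient with respect to the hyperparameters is unavailable, which is exactly the setting these zeroth-order tools target.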