Optimizing Multi-Objective Problems with Desirability Functions
When working in Data Science, it is not uncommon to encounter problems with competing objectives. Whether designing products, tuning algorithms or optimizing portfolios, we often need to balance several metrics to get the best possible outcome. Sometimes, maximizing one metric comes at the expense of another, making it hard to reach an overall optimized solution.
While several methods exist to solve multi-objective optimization problems, I find desirability functions to be both elegant and easy to explain to a non-technical audience, which makes them an interesting option to consider. Desirability functions combine several metrics into a standardized score, allowing for holistic optimization.
In this article, we’ll explore:
The mathematical foundation of desirability functions
How to implement these functions in Python
How to optimize a multi-objective problem with desirability functions
Visualization for interpretation and explanation of the results
To ground these concepts in a real example, we’ll apply desirability functions to optimize a bread-baking recipe: a toy problem with a few interconnected parameters and competing quality objectives that will allow us to explore several optimization choices.
By the end of this article, you’ll have a powerful new tool in your data science toolkit for tackling multi-objective optimization problems across numerous domains, as well as fully functional code available here on GitHub.
What are Desirability Functions?
Desirability functions were first formalized by Harrington and later extended by Derringer and Suich. The idea is to:
Transform each response into a performance score between 0 and 1
Combine all scores into a single metric to maximize
Let’s explore the types of desirability functions and then how we can combine all the scores.
The different types of desirability functions
There are three types of desirability functions, which together cover most situations.
Smaller-is-better: Used when minimizing a response is desirable
def desirability_smaller_is_better(x: float, x_min: float, x_max: float) -> float:
    """Calculate desirability function value where smaller values are better.

    Args:
        x: Input parameter value
        x_min: Minimum acceptable value
        x_max: Maximum acceptable value

    Returns:
        Desirability score between 0 and 1
    """
    if x <= x_min:
        return 1.0
    elif x >= x_max:
        return 0.0
    else:
        return (x_max - x) / (x_max - x_min)

Larger-is-better: Used when maximizing a response is desirable
def desirability_larger_is_better(x: float, x_min: float, x_max: float) -> float:
    """Calculate desirability function value where larger values are better.

    Args:
        x: Input parameter value
        x_min: Minimum acceptable value
        x_max: Maximum acceptable value

    Returns:
        Desirability score between 0 and 1
    """
    if x <= x_min:
        return 0.0
    elif x >= x_max:
        return 1.0
    else:
        return (x - x_min) / (x_max - x_min)

Target-is-best: Used when a specific target value is optimal
def desirability_target_is_best(x: float, x_min: float, x_target: float, x_max: float) -> float:
    """Calculate two-sided desirability function value with target value.

    Args:
        x: Input parameter value
        x_min: Minimum acceptable value
        x_target: Target (best) value
        x_max: Maximum acceptable value

    Returns:
        Desirability score between 0 and 1
    """
    if x_min <= x <= x_target:
        return (x - x_min) / (x_target - x_min)
    elif x_target < x <= x_max:
        return (x_max - x) / (x_max - x_target)
    else:
        return 0.0
Every input parameter can be parameterized with one of these three desirability functions, before combining them into a single desirability score.
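As a quick sanity check, here is how the three functions behave on a few sample values. They are restated compactly here (with clipping at the bounds) so the snippet runs on its own:

```python
# Compact restatements of the three desirability functions,
# so this snippet is self-contained.
def d_smaller(x, x_min, x_max):
    # Linearly decreasing from 1 at x_min to 0 at x_max
    return max(0.0, min(1.0, (x_max - x) / (x_max - x_min)))

def d_larger(x, x_min, x_max):
    # Linearly increasing from 0 at x_min to 1 at x_max
    return max(0.0, min(1.0, (x - x_min) / (x_max - x_min)))

def d_target(x, x_min, x_target, x_max):
    # Rises to 1 at x_target, falls back to 0 at the outer bounds
    if x_min <= x <= x_target:
        return (x - x_min) / (x_target - x_min)
    if x_target < x <= x_max:
        return (x_max - x) / (x_max - x_target)
    return 0.0

print(d_smaller(150, 100, 200))  # 0.5: halfway between the bounds
print(d_larger(150, 100, 200))   # 0.5
print(d_target(80, 60, 75, 90))  # ~0.67: past the target, on the way down
```

Note that the same input value of 150 maps to the same score here only because it sits exactly halfway between the bounds; in general the two one-sided functions are mirror images of each other.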
Combining Desirability Scores
Once individual metrics are transformed into desirability scores, they need to be combined into an overall desirability. The most common approach is the geometric mean:
D = (d₁^w₁ × d₂^w₂ × … × dₙ^wₙ)^(1/Σwᵢ)
Where dᵢ are individual desirability values and wᵢ are weights reflecting the relative importance of each metric.
The geometric mean has an important property: if any single desirability is 0, the overall desirability is also 0, regardless of other values. This enforces that all requirements must be met to some extent.
import numpy as np

def overall_desirability(desirabilities, weights=None):
    """Compute overall desirability using geometric mean

    Parameters:
    -----------
    desirabilities : list
        Individual desirability scores
    weights : list
        Weights for each desirability

    Returns:
    --------
    float
        Overall desirability score
    """
    if weights is None:
        weights = [1] * len(desirabilities)

    # Convert to numpy arrays
    d = np.array(desirabilities)
    w = np.array(weights)

    # Calculate geometric mean
    return np.prod(d ** w) ** (1 / np.sum(w))
The weights are hyperparameters that give leverage on the final outcome and leave room for customization.
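To see the veto property in action, here is a small standalone computation of the weighted geometric mean, using the same formula as above:

```python
import numpy as np

def geo_mean(d, w):
    # Weighted geometric mean: any zero desirability zeroes the result
    d, w = np.array(d), np.array(w)
    return float(np.prod(d ** w) ** (1 / np.sum(w)))

print(geo_mean([0.9, 0.8, 0.7], [1, 1, 1]))  # ~0.80: all requirements partly met
print(geo_mean([0.9, 0.8, 0.0], [1, 1, 1]))  # 0.0: one failed requirement vetoes everything
```

Compare this with an arithmetic mean, which would still report a comfortable 0.57 for the second case even though one requirement is completely unmet.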
A Practical Optimization Example: Bread Baking
To demonstrate desirability functions in action, let’s apply them to a toy problem: optimizing a bread-baking recipe.
The Parameters and Quality Metrics
Let’s play with the following parameters:
Fermentation Time
Fermentation Temperature
Hydration Level
Kneading Time
Baking Temperature
And let’s try to optimize these metrics:
Texture Quality: The texture of the bread
Flavor Profile: The flavor of the bread
Practicality: The practicality of the whole process
Of course, each of these metrics depends on more than one parameter. So here comes one of the most critical steps: mapping parameters to quality metrics.
For each quality metric, we need to define how parameters influence it:
def compute_flavor_profile(params: list) -> float:
    """Compute flavor profile score based on input parameters.

    Args:
        params: List of parameter values

    Returns:
        Weighted flavor profile score between 0 and 1
    """
    fermentation_time, fermentation_temp, hydration_level, _, _ = params

    # Flavor mainly affected by fermentation parameters
    # (exact bounds and weights below are illustrative; see the full code on GitHub)
    fermentation_d = desirability_larger_is_better(fermentation_time, 30, 180)
    ferment_temp_d = desirability_target_is_best(fermentation_temp, 20, 24, 28)
    hydration_d = desirability_target_is_best(hydration_level, 65, 75, 85)

    # Baking temperature has minimal effect on flavor
    weights = [0.5, 0.3, 0.2]
    return np.average([fermentation_d, ferment_temp_d, hydration_d], weights=weights)

Here for example, the flavor is influenced by the following:
The fermentation time, with a minimum desirability below 30 minutes and a maximum desirability above 180 minutes
The fermentation temperature, with a maximum desirability peaking at 24 degrees Celsius
The hydration, with a maximum desirability peaking at 75% humidity
These computed desirabilities are then combined in a weighted average to return the flavor desirability. Similar computations are made for the texture quality and practicality.
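As an illustration of what such a sibling metric could look like, here is a hypothetical sketch of compute_practicality. The component choices, bounds and averaging are invented for this example; the article’s actual implementation is in the GitHub repository:

```python
def desirability_smaller_is_better(x, x_min, x_max):
    # Restated here so the sketch runs standalone
    if x <= x_min:
        return 1.0
    if x >= x_max:
        return 0.0
    return (x_max - x) / (x_max - x_min)

def compute_practicality(params):
    """Hypothetical practicality score: shorter, cooler processes are easier."""
    fermentation_time, _, _, kneading_time, baking_temp = params

    # All bounds below are invented for illustration
    ferm_time_d = desirability_smaller_is_better(fermentation_time, 60, 300)
    kneading_d = desirability_smaller_is_better(kneading_time, 5, 30)
    baking_d = desirability_smaller_is_better(baking_temp, 180, 260)

    # Simple unweighted average of the three component desirabilities
    return (ferm_time_d + kneading_d + baking_d) / 3

print(compute_practicality([120, 24, 75, 10, 220]))  # ~0.68
```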
The Objective Function
Following the desirability function approach, we’ll use the overall desirability as our objective function. The goal is to maximize this overall score, which means finding parameters that best satisfy all our three requirements simultaneously:
def objective_function(params: list, weights: list) -> float:
    """Compute overall desirability score based on individual quality metrics.

    Args:
        params: List of parameter values
        weights: Weights for texture, flavor and practicality scores

    Returns:
        Negative overall desirability score
    """
    # Compute individual desirability scores
    texture = compute_texture_quality(params)
    flavor = compute_flavor_profile(params)
    practicality = compute_practicality(params)

    # Ensure weights sum up to one
    weights = np.array(weights) / np.sum(weights)

    # Calculate overall desirability using geometric mean
    overall_d = overall_desirability([texture, flavor, practicality], weights)

    # Return negative value since we want to maximize desirability
    # but optimization functions typically minimize
    return -overall_d
After computing the individual desirabilities for texture, flavor and practicality, the overall desirability is simply computed with a weighted geometric mean. The function finally returns the negative overall desirability, so that it can be minimized.
Optimization with SciPy
We finally use SciPy’s minimize function to find optimal parameters. Since we returned the negative overall desirability as the objective function, minimizing it would maximize the overall desirability:
from scipy.optimize import minimize

def optimize(weights: list) -> list:
    # Define parameter bounds
    # (the bound values below are illustrative; see the full code on GitHub)
    bounds = {
        'fermentation_time': (30, 300),
        'fermentation_temp': (18, 30),
        'hydration_level': (60, 85),
        'kneading_time': (0, 25),
        'baking_temp': (180, 260),
    }

    # Initial guess: the middle of each bound range
    x0 = [(low + high) / 2 for low, high in bounds.values()]

    # Run optimization
    result = minimize(
        objective_function,
        x0,
        args=(weights,),
        bounds=list(bounds.values()),
        method='SLSQP'
    )
    return result.x
In this function, after defining the bounds for each parameter, the initial guess is computed as the middle of each bound range, and then given as input to the minimize function of SciPy. The result is finally returned.
The weights are given as input to the optimizer too, and are a good way to customize the output. For example, with a larger weight on practicality, the optimized solution will focus on practicality over flavor and texture.
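To make the effect of the weights concrete, here is a self-contained miniature of the same setup with a single parameter, kneading time, trading texture (larger is better) against practicality (smaller is better). The bounds are invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def texture_d(t):
    # Longer kneading improves texture (illustrative bounds: 0 to 20 min)
    return np.clip(t / 20, 0, 1)

def practicality_d(t):
    # Longer kneading hurts practicality
    return np.clip(1 - t / 20, 0, 1)

def neg_overall(x, weights):
    t = x[0]
    d = np.array([texture_d(t), practicality_d(t)])
    w = np.array(weights) / np.sum(weights)
    # Small epsilon keeps the objective differentiable at the boundary
    return -float(np.prod((d + 1e-9) ** w))

for weights in ([1, 3], [3, 1]):  # (texture, practicality)
    res = minimize(neg_overall, x0=[10.0], args=(weights,),
                   bounds=[(0, 20)], method='SLSQP')
    # Practicality-heavy weights push kneading time down; texture-heavy push it up
    print(f"weights {weights} -> kneading time {res.x[0]:.1f} min")
```

For this toy objective the optimum can be checked analytically: maximizing u^a(1-u)^b over u = t/20 gives u = a/(a+b), i.e. about 5 minutes for weights (1, 3) and about 15 minutes for weights (3, 1), mirroring the behavior described in the results below.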
Let’s now visualize the results for a few sets of weights.
Visualization of Results
Let’s see how the optimizer handles different preference profiles given various input weights, demonstrating the flexibility of desirability functions.
Let’s have a look at the results in case of weights favoring practicality:
Optimized parameters with weights favoring practicality. Image by author.
With weights largely in favor of practicality, the achieved overall desirability is 0.69, with a short kneading time of 5 minutes, since a high value negatively impacts practicality.
Now, if we optimize with an emphasis on texture, we have slightly different results:
Optimized parameters with weights favoring texture. Image by author.
In this case, the achieved overall desirability is 0.85, significantly higher. This time the kneading time is 12 minutes, as a higher value positively impacts the texture and is penalized less on the practicality side.
Conclusion: Practical Applications of Desirability Functions
While we focused on bread baking as our example, the same approach can be applied to various domains, such as product formulation in cosmetics or resource allocation in portfolio optimization.
Desirability functions provide a powerful mathematical framework for tackling multi-objective optimization problems across numerous data science applications. By transforming raw metrics into standardized desirability scores, we can effectively combine and optimize disparate objectives.
The key advantages of this approach include:
Standardized scales that make different metrics comparable and easy to combine into a single target
Flexibility to handle different types of objectives: minimize, maximize, target
Clear communication of preferences through mathematical functions
The code presented here provides a starting point for your own experimentation. Whether you’re optimizing industrial processes, machine learning models, or product formulations, hopefully desirability functions offer a systematic approach to finding the best compromise among competing objectives.
The post Optimizing Multi-Objective Problems with Desirability Functions appeared first on Towards Data Science.