Rauf

AI and ML

Optimization in AI

Optimization in AI means searching among candidate solutions for the one that best satisfies an objective, such as the shortest route or the lowest cost. In this Web Note, I will discuss Local Search, Hill Climbing, Simulated Annealing, Linear Programming, Constraint Satisfaction, and Backtracking Search one by one. Let's start 😊

Local Search

Local search is an optimization technique that starts from a candidate solution and explores neighboring states, keeping whichever is better. For example, imagine Rauf wants to visit Adil's home to borrow a bike. If Rauf compares the routes available from his current location, always keeping the shortest one found so far, he is performing a simple form of local search.

# Example of a simple local search in Python
def local_search(paths):
    # Track the best candidate seen so far
    best_path = None
    shortest_distance = float('inf')
    for path, distance in paths.items():
        # Keep this path if it improves on the current best
        if distance < shortest_distance:
            shortest_distance = distance
            best_path = path
    return best_path

paths = {'Path A': 10, 'Path B': 15, 'Path C': 8}
print("Best path:", local_search(paths))

Hill Climbing

Before looking at hill climbing, we need four terms: global maximum, global minimum, local maximum, and local minimum. Imagine Rauf has many routes to Adil's home but wants the quickest one. The global minimum is the quickest route overall, while a local minimum is the quickest route within a small neighborhood of routes but not the best overall. Likewise, the global maximum would be the slowest route overall, and a local maximum the slowest within a small neighborhood. Hill climbing repeatedly moves from the current solution to the best neighboring solution, climbing toward a maximum (or descending toward a minimum). Its weakness is that it stops at the first peak or valley it reaches, so it may get stuck at a local maximum or minimum instead of the global one.

# Hill Climbing Example in Python
def hill_climbing(function, start, step_size):
    current = start
    while True:
        # Look one step to either side of the current position
        neighbors = [current + step_size, current - step_size]
        next_step = max(neighbors, key=function)
        # Stop when no neighbor improves on the current value
        if function(next_step) <= function(current):
            break
        current = next_step
    return current

# Example function to maximize
def f(x):
    return -(x - 3) ** 2 + 9

print("Best solution:", hill_climbing(f, start=0, step_size=1))
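Hill climbing's weakness is easy to demonstrate. The sketch below uses a made-up function with two unequal peaks (and the same routine as above, repeated so the snippet is self-contained): starting on the wrong side of the valley, the search stalls on the lower peak.

```python
def hill_climbing(function, start, step_size):
    # Same routine as above: move uphill until no neighbor improves
    current = start
    while True:
        neighbors = [current + step_size, current - step_size]
        next_step = max(neighbors, key=function)
        if function(next_step) <= function(current):
            break
        current = next_step
    return current

def g(x):
    # Made-up two-peaked function: a local maximum at x = -3 (value 4)
    # and the global maximum at x = 4 (value 10)
    if x >= 1:
        return -(x - 4) ** 2 + 10
    return -(x + 3) ** 2 + 4

print(hill_climbing(g, start=-5, step_size=1))  # -3: stuck at the local maximum
print(hill_climbing(g, start=2, step_size=1))   # 4: reaches the global maximum
```

Which peak the search finds depends entirely on the starting point, which is exactly the problem simulated annealing addresses next.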

Simulated Annealing

Simulated annealing overcomes this limitation of hill climbing: it sometimes accepts moves to worse solutions, which lets the search escape local optima and reach the global optimum. The name comes from annealing in metallurgy, where a metal is cooled slowly so that it settles into a stable, low-energy state; here, a "temperature" that cools over time controls how often worse moves are accepted.

# Simulated Annealing Example in Python
import math
import random

def simulated_annealing(function, start, temp, cooling_rate):
    current = start
    best = current
    while temp > 1:
        # Pick a random neighbor of the current solution
        neighbor = current + random.uniform(-1, 1)
        delta_e = function(neighbor) - function(current)
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the temperature cools
        if delta_e > 0 or math.exp(delta_e / temp) > random.random():
            current = neighbor
        if function(current) > function(best):
            best = current
        temp *= cooling_rate
    return best

# Example function to maximize
def f(x):
    return -(x - 3) ** 2 + 9

print("Best solution:", simulated_annealing(f, start=0, temp=100, cooling_rate=0.95))
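The same kind of two-peaked trap that stalls hill climbing illustrates the difference: the occasional downhill moves let simulated annealing cross the valley between peaks. This is a sketch with a made-up function, the same routine as above repeated so it is self-contained, and a fixed seed for reproducibility:

```python
import math
import random

random.seed(42)  # fixed seed so the run is reproducible

def simulated_annealing(function, start, temp, cooling_rate):
    # Same routine as above
    current = start
    best = current
    while temp > 1:
        neighbor = current + random.uniform(-1, 1)
        delta_e = function(neighbor) - function(current)
        if delta_e > 0 or math.exp(delta_e / temp) > random.random():
            current = neighbor
        if function(current) > function(best):
            best = current
        temp *= cooling_rate
    return best

def g(x):
    # Made-up two-peaked function: local maximum at x = -3 (value 4),
    # global maximum at x = 4 (value 10)
    if x >= 1:
        return -(x - 4) ** 2 + 10
    return -(x + 3) ** 2 + 4

best = simulated_annealing(g, start=-5, temp=100, cooling_rate=0.95)
print("Best x:", best, "value:", g(best))
```

Because early iterations run at high temperature, the search behaves almost like a random walk and can wander away from the lower peak before the cooling schedule makes it greedy.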

Linear Programming

Linear programming optimizes a linear objective function subject to linear constraints. For example, if Rauf wants to buy ingredients for a recipe at the lowest cost while still meeting the recipe's minimum requirements, linear programming can find the cheapest valid combination.

# Linear Programming Example in Python
from scipy.optimize import linprog

# Minimize cost 3*x1 + 2*x2 subject to x1 + x2 >= 8 and 2*x1 + x2 >= 10.
# linprog only accepts <= constraints, so each >= row is negated.
c = [3, 2]
A = [[-1, -1], [-2, -1]]
b = [-8, -10]
result = linprog(c, A_ub=A, b_ub=b, method='highs')
print("Optimal solution:", result.x)  # x1 = 2.0, x2 = 6.0

Constraint Satisfaction and Backtracking Search

Constraint satisfaction problems involve assigning values to variables without violating a set of constraints; Sudoku is a classic example, since every digit must be unique in its row, column, and 3x3 box. Backtracking search fills in values one at a time and undoes (backtracks on) an assignment as soon as it leads to a dead end.

# Backtracking Example for Sudoku Solver

def is_safe(board, row, col, num):
    # Check the row, the column, and the 3x3 box for a conflict
    for x in range(9):
        if board[row][x] == num or board[x][col] == num:
            return False
        if board[row - row % 3 + x // 3][col - col % 3 + x % 3] == num:
            return False
    return True

def solve_sudoku(board):
    for row in range(9):
        for col in range(9):
            if board[row][col] == 0:  # empty cell
                for num in range(1, 10):
                    if is_safe(board, row, col, num):
                        board[row][col] = num
                        if solve_sudoku(board):
                            return True
                        board[row][col] = 0  # backtrack
                return False
    return True

# Example Sudoku board
board = [
    [5, 3, 0, 0, 7, 0, 0, 0, 0],
    [6, 0, 0, 1, 9, 5, 0, 0, 0],
    [0, 9, 8, 0, 0, 0, 0, 6, 0],
    [8, 0, 0, 0, 6, 0, 0, 0, 3],
    [4, 0, 0, 8, 0, 3, 0, 0, 1],
    [7, 0, 0, 0, 2, 0, 0, 0, 6],
    [0, 6, 0, 0, 0, 0, 2, 8, 0],
    [0, 0, 0, 4, 1, 9, 0, 0, 5],
    [0, 0, 0, 0, 8, 0, 0, 7, 9]
]

solve_sudoku(board)
print("Solved board:", board)
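The same backtracking pattern works for other constraint satisfaction problems. Here is a sketch of map coloring, with made-up regions and adjacencies: assign one of three colors to each region so that no two neighboring regions share a color.

```python
def color_map(regions, neighbors, colors, assignment=None):
    # Backtracking search: color one region at a time, undoing a
    # choice whenever it leaves no valid option later on
    if assignment is None:
        assignment = {}
    if len(assignment) == len(regions):
        return assignment
    region = regions[len(assignment)]
    for color in colors:
        # Constraint: no already-colored neighbor may share this color
        if all(assignment.get(n) != color for n in neighbors[region]):
            assignment[region] = color
            result = color_map(regions, neighbors, colors, assignment)
            if result:
                return result
            del assignment[region]  # backtrack
    return None

# Hypothetical map: C touches everything, D only touches C
regions = ['A', 'B', 'C', 'D']
neighbors = {'A': ['B', 'C'], 'B': ['A', 'C'], 'C': ['A', 'B', 'D'], 'D': ['C']}
colors = ['Red', 'Green', 'Blue']
solution = color_map(regions, neighbors, colors)
print(solution)  # {'A': 'Red', 'B': 'Green', 'C': 'Blue', 'D': 'Red'}
```

Sudoku is the same search with 81 variables, the digits 1 through 9 as the colors, and the row, column, and box rules as the adjacency constraints.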