Adaptive Constraint-Based Agents in Artificial Environments

[SEARCH PARADIGMS]   [Refinement Search]   [Local Search]   [Discussion]

[ Please note: The project has been discontinued as of May 31, 2005 and is superseded by the projects of the ii Labs. There won't be further updates to these pages. ]

Local Search

(Related publication: [PUBLink])

Local search approaches perform a search by iteratively changing an initial state (resp. an initial plan). In each iteration, a successor choice criterion determines the state that becomes the new search state. The potential successor states that can be chosen for a state are referred to as the state's neighborhood. The quality of the neighborhood states can be computed by an objective/cost function.
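This iteration scheme can be sketched as follows. The code below is a minimal, generic greedy local search in Python; the function and parameter names (local_search, neighbors, cost) are illustrative, not taken from the project.

```python
def local_search(initial, neighbors, cost, max_iters=1000):
    """Greedy local search: repeatedly move to the best neighbor
    until no neighbor improves on the current state's cost."""
    state = initial
    for _ in range(max_iters):
        candidates = neighbors(state)
        if not candidates:
            break
        best = min(candidates, key=cost)      # successor choice criterion
        if cost(best) >= cost(state):         # local optimum: no improving neighbor
            break
        state = best
    return state

# Toy usage: minimize f(x) = (x - 7)^2 over the integers,
# with the neighborhood {x - 1, x + 1}.
result = local_search(0, lambda x: [x - 1, x + 1], lambda x: (x - 7) ** 2)
# result is 7, the global minimum
```

The successor choice criterion here is simply "take the cheapest neighbor"; other criteria (first improving neighbor, probabilistic acceptance as in simulated annealing) fit the same skeleton.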

Neighborhood states typically differ from the current state in the value of a single variable or by a simple extension/reduction of the plan; more complex changes are possible as well. For an optimization task, the neighborhood usually consists of feasible states only. For a satisfaction task, the neighborhood consists of partially consistent states.
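For a variable assignment, the single-variable-switch neighborhood can be enumerated directly. The following Python sketch assumes a state represented as a dict from variables to values; the names are illustrative.

```python
def switch_neighbors(assignment, domains):
    """Neighborhood of an assignment: all states that differ from it
    in the value of exactly one variable."""
    result = []
    for var, current in assignment.items():
        for value in domains[var]:
            if value != current:
                neighbor = dict(assignment)   # copy, then switch one value
                neighbor[var] = value
                result.append(neighbor)
    return result

# Two variables with three possible values each:
# each variable contributes 2 alternative values, so 2 * 2 = 4 neighbors.
domains = {"x": [0, 1, 2], "y": [0, 1, 2]}
neighbors = switch_neighbors({"x": 0, "y": 1}, domains)
```

The neighborhood size grows with the number of variables times the domain size, which is why local search evaluates neighbors with a cheap cost function rather than exploring them exhaustively.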

Most local-search methods are incomplete, and they can become trapped in local optima or on plateaus. Many local-search methods therefore apply additional techniques to escape from such local optima and plateaus - such as tabu lists, random walks or restarts. With increasing dynamics of the agent's environment, the importance of these features decreases, as the search space changes quickly and less improvement is possible.
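As an illustration of the random-walk escape, here is a min-conflicts-style search with a noise parameter, in the spirit of min-conflicts and Walksat. This is a hedged sketch, not the project's implementation: the CSP interface (a conflicts(var, val, assignment) counting function) and all names are assumptions for the example.

```python
import random

def min_conflicts_walk(variables, domains, conflicts,
                       max_steps=10000, noise=0.1, seed=0):
    """Min-conflicts local search with a random-walk escape:
    with probability `noise`, assign a random value to a conflicted
    variable instead of the least-conflicting one."""
    rng = rng_state = random.Random(seed)
    assignment = {v: rng.choice(domains[v]) for v in variables}
    for _ in range(max_steps):
        conflicted = [v for v in variables
                      if conflicts(v, assignment[v], assignment) > 0]
        if not conflicted:
            return assignment                      # all constraints satisfied
        var = rng.choice(conflicted)
        if rng.random() < noise:                   # random walk: escape move
            assignment[var] = rng.choice(domains[var])
        else:                                      # greedy min-conflicts move
            assignment[var] = min(domains[var],
                                  key=lambda val: conflicts(var, val, assignment))
    return None                                    # incomplete: may fail

# Toy satisfaction task: three variables that must take pairwise
# different values from {0, 1, 2}.
def conflicts(var, val, assignment):
    return sum(1 for other, v in assignment.items()
               if other != var and v == val)

solution = min_conflicts_walk(["a", "b", "c"],
                              {v: [0, 1, 2] for v in ["a", "b", "c"]},
                              conflicts)
```

Without the noise branch, the search can cycle on a plateau where every single-variable switch leaves the conflict count unchanged; the occasional random move breaks such cycles at the cost of temporarily worse states.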

Local search techniques include evolutionary algorithms [PUBLink], simulated annealing [PUBLink], tabu search [PUBLink], min-conflicts [PUBLink], GSAT [PUBLink] [PUBLink] and Walksat [PUBLink]. Applications to planning problems were proposed by Kautz and Selman [PUBLink] [PUBLink], Muslea [PUBLink], Ambite and Knoblock [PUBLink], Gerevini and Serina [PUBLink], Brafman and Hoos [PUBLink], and Chien et al. [PUBLink], among others.


For questions, comments or suggestions, please contact us.

Last update:
May 19, 2001 by Alexander Nareyek