Fixed set search

Fixed set search (FSS) is a metaheuristic that has been applied to a variety of optimization problems, including the traveling salesman problem (TSP), the power dominating set problem, machine scheduling, and the minimum weighted vertex cover problem. It has also been adapted to bi-objective optimization and to a matheuristic setting. As a population-based metaheuristic that integrates local search, FSS is particularly effective for problems on which local search methods perform well. FSS extends the greedy randomized adaptive search procedure (GRASP) metaheuristic with a learning mechanism; GRASP generates solutions with a randomized greedy construction (RGC) algorithm and refines them with local search.

FSS is inspired by the observation that high-quality solutions often share common elements. The core strategy of FSS is to identify these shared elements, termed the "fixed set", and to incorporate them into newly generated solutions, which focuses the computational effort on completing these partial solutions by "filling in the gaps". The concept draws on earlier metaheuristic strategies such as chunking, vocabulary building, and the consistent chains of tabu search, and is closely related to the matheuristic POPMUSIC paradigm.

== Method outline ==

The FSS algorithm begins by initializing a solution population using GRASP and then employs a learning mechanism to generate fixed sets. These fixed sets are used iteratively, as pre-selected sets of elements, in an extended randomized greedy algorithm to create new solutions. Each new solution is improved by local search. Apart from an implementation of the corresponding GRASP algorithm, several building blocks are needed to implement FSS.
=== Ground set ===
Firstly, any solution <math>S</math> of the problem must be a subset of a ground set of elements <math>G</math>, that is, <math>S \subseteq G</math>. For example, in the case of the TSP defined on a graph, the ground set is the set of all edges of the graph. In the case of the minimum dominating set problem, the ground set is the set of all nodes of the graph.
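
As an illustration (not part of the original formulation of FSS), the following sketch builds the ground set of a small hypothetical TSP instance and represents one tour as a subset of it; the variable names are chosen for this example only.

<syntaxhighlight lang="python">
from itertools import combinations

# Hypothetical 4-city TSP instance; the cities are labelled 0..3.
cities = [0, 1, 2, 3]

# Ground set G: all edges of the complete graph on the cities.
G = set(combinations(cities, 2))

# A solution S is a subset of G, here the edge set of the tour 0-1-2-3-0.
tour = [0, 1, 2, 3]
S = {tuple(sorted((tour[i], tour[(i + 1) % len(tour)]))) for i in range(len(tour))}

assert S <= G  # every feasible solution is a subset of the ground set
</syntaxhighlight>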


=== Generation of fixed sets ===
The second building block is a method for generating multiple fixed sets <math>F</math> while controlling their size <math>|F|</math>. When a fixed set is integrated into the randomized greedy algorithm, it must remain possible to produce a feasible solution of equal or better quality than those of the base greedy algorithm. Let <math>\mathcal{P}_n = \{S_1, S_2, \ldots, S_n\}</math> be the set of the <math>n</math> best solutions, with respect to the objective function <math>f</math>, generated in previous steps. A base solution <math>B</math> is randomly selected from these <math>n</math> best solutions. If a fixed set <math>F</math> satisfies <math>F \subseteq B</math>, it can potentially generate a feasible solution of at least the same quality as <math>B</math>, and <math>F</math> can include any number of elements of <math>B</math>. The goal is for <math>F</math> to contain elements that frequently occur in a group of high-quality solutions. Let us define <math>\mathcal{P}_{kn}</math> as a set of <math>k</math> randomly selected solutions from the <math>n</math> best ones, <math>\mathcal{P}_n</math>.


Using these building blocks it is possible to define a function <math>Fix(B, \mathcal{P}_{kn}, Size)</math> for generating a fixed set <math>F \subseteq B</math> that consists of the <math>Size</math> elements of the base solution <math>B = \{e_1, e_2, \ldots, e_l\}</math> that most frequently occur in <math>\mathcal{P}_{kn}</math>. Let us define the function <math>C(e,S)</math>, for an element <math>e \in G</math> and a solution <math>S \subseteq G</math>, which is equal to <math>1</math> if <math>e \in S</math> and <math>0</math> otherwise. Using <math>C(e,S)</math>, the number of occurrences of an element <math>e</math> in the solutions of the set <math>\mathcal{P}_{kn}</math> can be counted as follows.


<math display="block">O(e, P_{kn}) = \sum_{S \in P_{kn}}C(e,S)</math>
O(vx; Skn) = X S2Skn C(vx; S) (14)


Now, we can define <math>Fix(B, \mathcal{P}_{kn}, Size)</math> as the set of the <math>Size</math> elements <math>e \in B</math> that have the largest value of <math>O(e, \mathcal{P}_{kn})</math>.
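
The counting function <math>O</math> and the selection of the most frequent elements of <math>B</math> translate directly into code. The following sketch is only an illustration (the function name is not from the original description) and assumes that solutions are represented as Python sets of ground-set elements; <code>P_kn</code> plays the role of <math>\mathcal{P}_{kn}</math>.

<syntaxhighlight lang="python">
from collections import Counter

def fix(B, P_kn, size):
    """Return a fixed set F ⊆ B containing the `size` elements of the
    base solution B that occur most often in the solutions of P_kn."""
    # O(e, P_kn): the number of solutions in P_kn that contain element e.
    occurrences = Counter(e for S in P_kn for e in S)
    # Rank the elements of B by their occurrence count and keep the top `size`.
    ranked = sorted(B, key=lambda e: occurrences[e], reverse=True)
    return set(ranked[:size])
</syntaxhighlight>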

=== Learning mechanism ===
The learning mechanism of FSS operates through the fixed sets. To this end, the randomized greedy construction (RGC) algorithm of GRASP is adapted to incorporate preselected elements that must be included in every newly generated solution. We denote by <math>RGF(F)</math> the solution generated by such an algorithm for a preselected set of elements <math>F</math>.
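
A generic form of this adapted construction can be sketched as follows. The callbacks <code>candidate_elements</code>, <code>greedy_value</code> and <code>is_complete</code> are hypothetical problem-specific hooks, and the restricted-candidate-list rule shown is one common GRASP construction scheme, assumed here for illustration rather than prescribed by FSS.

<syntaxhighlight lang="python">
import random

def rgf(F, candidate_elements, greedy_value, is_complete, alpha=0.2):
    """Randomized greedy construction that starts from a fixed set F.

    candidate_elements(S) -> elements of the ground set that may extend S
    greedy_value(S, e)    -> greedy score (larger is better) of adding e to S
    is_complete(S)        -> True once S is a feasible solution
    """
    S = set(F)                          # the preselected elements are always included
    while not is_complete(S):
        candidates = list(candidate_elements(S))
        scores = {e: greedy_value(S, e) for e in candidates}
        best, worst = max(scores.values()), min(scores.values())
        # Restricted candidate list: candidates whose score is within
        # alpha of the best score.
        threshold = best - alpha * (best - worst)
        rcl = [e for e in candidates if scores[e] >= threshold]
        S.add(random.choice(rcl))       # randomized greedy choice
    return S
</syntaxhighlight>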

In FSS, as in GRASP, solutions are iteratively generated and improved by local search. Initially, a population of solutions <math>\mathcal{P}</math> is created by executing <math>N</math> iterations of the GRASP algorithm. This population is then used to randomly generate a fixed set <math>F</math> of size <math>Size</math>, following the method outlined in the preceding section. <math>F</math> is used to produce a new solution <math>S = RGF(F)</math>, to which local search is applied, and the population is extended with the newly generated locally optimal solution. This cycle is repeated; if no new best solution is found for an extended period, the search is considered stagnant and the size of the fixed set is increased. If the maximum allowed size has been reached, the fixed set size is reset to the minimum allowed value. The process continues until a stopping criterion is met.
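
The overall loop can then be sketched as below. This sketch reuses <code>fix</code> from the earlier sketch, assumes <code>rgf</code> already has its problem-specific callbacks bound (so it takes only the fixed set), and treats <code>grasp_iteration</code>, <code>local_search</code> and the objective function <code>f</code> (here minimized) as supplied by the underlying GRASP implementation; all parameter names are illustrative.

<syntaxhighlight lang="python">
import random

def fixed_set_search(N, n, k, sizes, max_stag, max_iter,
                     grasp_iteration, rgf, local_search, f):
    """Skeleton of the FSS main loop (illustrative only)."""
    population = [grasp_iteration() for _ in range(N)]   # initial GRASP population
    best = min(population, key=f)
    level, stagnation = 0, 0
    for _ in range(max_iter):
        best_n = sorted(population, key=f)[:n]           # n best solutions found so far
        B = random.choice(best_n)                        # base solution
        P_kn = random.sample(best_n, k)                  # k random solutions among the n best
        F = fix(B, P_kn, int(sizes[level] * len(B)))     # fixed set at the current size level
        S = local_search(rgf(F))                         # complete F, then improve locally
        population.append(S)
        if f(S) < f(best):                               # new best solution found
            best, stagnation = S, 0
        else:
            stagnation += 1
        if stagnation >= max_stag:                       # stagnation: move to the next size,
            level = (level + 1) % len(sizes)             # wrapping back to the smallest one
            stagnation = 0
    return best
</syntaxhighlight>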

An essential aspect of the algorithm is the array of permissible fixed set sizes, which are specified as proportions of the base solution that are fixed. In the original implementation, this array is defined as

<math display="block">C_i = 1-\frac{1}{2^i}</math>.

The size of the fixed sets used is proportional to the size of the base solution <math>B</math>; specifically, <math>C_i|B|</math> elements are fixed at the <math>i</math>-th level.
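
For example, the first few levels of this schedule fix the following proportions of the base solution:

<math display="block">C_1 = \tfrac{1}{2}, \quad C_2 = \tfrac{3}{4}, \quad C_3 = \tfrac{7}{8}, \quad C_4 = \tfrac{15}{16}, \ldots</math>

so the fixed portion of the solution grows towards, but never reaches, the whole base solution.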
