
% write a summary sentence for each major section
\section*{Abstract}
This report introduces a general class of algorithms referred to as unconventional optimization. It summarizes the properties the algorithms share: they are black-box methods, they are stochastic processes, they require problems to be phrased as a search space, and they learn inductively by trial and error. It then reviews the No Free Lunch theorem and surveys the types of computational problems to which the algorithms are applied.

\begin{description}
	\item[Keywords:] {\small\texttt{Clever, Algorithms, Unconventional, Optimization}}
\end{description}

% summarise the document breakdown with cross references
\section{Introduction}
\label{sec:introduction}
% project

% report

% breakdown

What do we need to know about this general class of algorithms, referred to here as unconventional optimization? This report establishes a common nomenclature for the class. Section~\ref{sec:black_box} characterizes the algorithms as black-box methods, and the sections that follow consider their reliance on randomness, their view of a problem as a state space, and their inductive, trial-and-error mode of learning. Section~\ref{sec:nfl} reviews the No Free Lunch theorem, and Section~\ref{sec:problems} surveys the types of problems to which the algorithms are applied.

%
% Black-Box Methods
%
\section{Black-Box Methods}
\label{sec:black_box}
These algorithms make few, if any, assumptions about the problem domain. They treat the objective function as a black box: candidate solutions can be evaluated, but no gradient or other structural information is required.

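The black-box property can be sketched as a random search whose only interaction with the problem is evaluating candidates; the function, bounds, and names below are illustrative, not prescribed by this report:

```python
import random

def random_search(objective, bounds, max_iter=1000, seed=1):
    """Black-box search: the objective is only ever sampled, never inspected."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(max_iter):
        # Draw a candidate uniformly at random from the bounded space.
        candidate = [rng.uniform(lo, hi) for lo, hi in bounds]
        cost = objective(candidate)  # the only interaction with the problem
        if cost < best_cost:
            best, best_cost = candidate, cost
    return best, best_cost

# Sphere function as a stand-in problem; any callable would do.
best, cost = random_search(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 2)
```

Note that the search would work unchanged on any objective, which is the point: no structure of the problem is assumed.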
%
% Randomness
%
\section{Randomness}
These algorithms are stochastic processes: they make randomized decisions during search, so repeated runs on the same problem may return different solutions. As a class they are often referred to as stochastic global optimization methods.

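One place randomness enters is the decision to accept a move. A simulated-annealing-style acceptance rule, sketched here with assumed names and parameters, sometimes takes a worse candidate so the search can escape local optima:

```python
import math
import random

def accept(current_cost, candidate_cost, temperature, rng):
    """Stochastic acceptance: always take improvements, and sometimes take a
    worse candidate, with probability shrinking in how much worse it is."""
    if candidate_cost <= current_cost:
        return True
    return rng.random() < math.exp((current_cost - candidate_cost) / temperature)

rng = random.Random(7)
# The same worse candidate is accepted on some calls and rejected on others.
decisions = [accept(1.0, 1.5, 1.0, rng) for _ in range(20)]
```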
%
% State Space
%
\section{State Space}
These algorithms typically require the problem to be phrased as a search space that can be traversed and sampled. The concerns are the size of the moves made through the space, the patterns of sampling and re-sampling, and the number of samples managed at a time.

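The size of a move can be made explicit in the neighborhood operator; in this sketch (step size, bounds, and names are assumptions for illustration) a small step gives local traversal and a large step approaches global re-sampling:

```python
import random

def take_step(point, step_size, bounds, rng):
    """Move operator: sample a neighbor within step_size of the current point.
    The step size controls how large a move through the state space is made."""
    neighbor = []
    for value, (lo, hi) in zip(point, bounds):
        value = value + rng.uniform(-step_size, step_size)
        neighbor.append(min(max(value, lo), hi))  # stay inside the space
    return neighbor

rng = random.Random(3)
bounds = [(-5.0, 5.0)] * 2
small = take_step([0.0, 0.0], 0.1, bounds, rng)   # local move
large = take_step([0.0, 0.0], 5.0, bounds, rng)   # near-global jump
```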
%
% Induction
%
\section{Induction}
These algorithms typically learn by doing, that is, by trial and error. The pattern is to generate a candidate, guess at its quality by evaluating it, and revise the working solution in response.

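The generate, guess, revise loop can be sketched as a simple hill climber; all names and parameters below are illustrative:

```python
import random

def hill_climb(objective, bounds, step_size=0.5, max_iter=500, seed=1):
    """Trial and error: generate a neighbor, guess its quality by evaluating
    it, and revise the current working solution only if it improved."""
    rng = random.Random(seed)
    current = [rng.uniform(lo, hi) for lo, hi in bounds]
    current_cost = objective(current)
    for _ in range(max_iter):
        candidate = [min(max(v + rng.uniform(-step_size, step_size), lo), hi)
                     for v, (lo, hi) in zip(current, bounds)]  # generate
        cost = objective(candidate)                            # guess
        if cost < current_cost:                                # revise
            current, current_cost = candidate, cost
    return current, current_cost

best, cost = hill_climb(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 2)
```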
%
% No Free Lunch
%
\section{No Free Lunch}
\label{sec:nfl}
The No Free Lunch theorem states that, averaged over all possible problems, all search algorithms perform the same when no prior information about the problem is used.

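One common formalization, due to Wolpert and Macready, states that for any two algorithms $a_1$ and $a_2$, the probability of observing a given sequence of $m$ cost values $d_m^y$, summed over all objective functions $f$, is identical:

```latex
\[
	\sum_{f} P(d_m^y \mid f, m, a_1) = \sum_{f} P(d_m^y \mid f, m, a_2)
\]
```

so any advantage on one subset of problems is paid for on another.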
%
% Problems
%
\section{Problems}
\label{sec:problems}
There are many hard problems to which these techniques are suited, and summaries of the general properties of such problems exist in the literature. The question addressed here is: what types of computational problems are we solving with these algorithms? For each type, example classes and canonical instances are given, all of which are covered in this book.

\subsection{Function Optimization}
Generate a set of parameters, which may be continuous (for example, a vector of real values) or combinatorial (for example, a permutation).

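The two flavors can be sketched as objective functions; the instances below are invented for illustration (a sphere function over real parameters, and a tour length over permutations of one-dimensional "city" positions), not canonical instances named by this report:

```python
import itertools

# Continuous instance: minimize a function of a real-valued parameter vector.
def sphere(params):
    return sum(v * v for v in params)

# Combinatorial instance: score a permutation as a cyclic tour of positions.
def tour_length(perm, cities):
    return sum(abs(cities[perm[i]] - cities[perm[(i + 1) % len(perm)]])
               for i in range(len(perm)))

cities = [0.0, 3.0, 1.0, 6.0]  # 1-D positions keep the example tiny
# Exhaustive search is feasible here only because the instance is tiny.
best = min(itertools.permutations(range(4)), key=lambda p: tour_length(p, cities))
```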
\subsection{Function Approximation}
Generate a representation that produces outputs in the presence of inputs.

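A minimal sketch, assuming a linear representation and a black-box search over its two coefficients (the data, names, and parameters are invented for illustration): the search keeps the representation whose outputs best match the observed input--output pairs.

```python
import random

def fit_line(xs, ys, trials=2000, seed=2):
    """Function approximation by black-box search: guess slope/intercept
    pairs and keep the pair whose outputs best match the observations."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(trials):
        a, b = rng.uniform(-10, 10), rng.uniform(-10, 10)
        err = sum((a * x + b - y) ** 2 for x, y in zip(xs, ys))
        if err < best_err:
            best, best_err = (a, b), err
    return best, best_err

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # observations generated by y = 2x + 1
(a, b), err = fit_line(xs, ys)
```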
%
% Conclusions: summarise the document message and areas for future consideration
%
\section{Conclusions}
\label{sec:conclusions}
This report characterized unconventional optimization algorithms as stochastic, inductive black-box methods that operate over a state space, reviewed the equalizing claim of the No Free Lunch theorem, and outlined the types of computational problems the algorithms address. Areas for future consideration include treating specific algorithms and canonical problem instances in detail.

% bibliography
\bibliographystyle{plain}