ISBN-13: 9783030027285 / English / Hardcover / 2019 / 104 pp.
This book bridges the widening gap between two crucial constituents of computational intelligence: the rapidly advancing technologies of machine learning in the digital information age, and the relatively slow-moving field of general-purpose search and optimization algorithms. With this in mind, the book offers a data-driven view of optimization through the framework of memetic computation (MC).

The authors summarize the complete timeline of research activities in MC, beginning with the inception of memes as local search heuristics hybridized with evolutionary algorithms, through to their modern interpretation as computationally encoded building blocks of problem-solving knowledge that can be learned from one task and adaptively transmitted to another. In light of recent research advances, the authors emphasize the further development of MC as a simultaneous problem learning and optimization paradigm with the potential to showcase human-like problem-solving prowess; that is, by equipping optimization engines to acquire increasing levels of intelligence over time through embedded memes learned independently or via interactions. In other words, the adaptive utilization of available knowledge memes allows optimization engines to tailor custom search behaviors on the fly, thereby paving the way toward general-purpose problem-solving ability (or artificial general intelligence).

In this regard, the book explores some of the latest concepts from the optimization literature, including the sequential transfer of knowledge across problems, multitasking, and large-scale (high-dimensional) search, systematically discussing the associated algorithmic developments that align with the general theme of memetics.

The presented ideas are intended to be accessible to a wide audience of scientific researchers, engineers, students, and optimization practitioners who are familiar with the common terminology of evolutionary computation. A full appreciation of the mathematical formalizations and algorithmic contributions requires an elementary background in probability, statistics, and the concepts of machine learning. Prior knowledge of surrogate-assisted/Bayesian optimization techniques is useful, but not essential.
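To make the classical notion of a meme described above concrete, the following is a minimal, purely illustrative sketch (in Python) of a memetic algorithm in its original sense: an evolutionary loop hybridized with a local search routine acting as the meme. It is not taken from the book; the toy objective, operators, and parameter values are assumptions chosen only for brevity.

# Minimal illustrative sketch of a memetic algorithm: an evolutionary loop
# hybridized with a local search "meme". Hypothetical example code, not
# reproduced from the book; objective, operators, and parameters are
# assumptions chosen for brevity.
import random

def sphere(x):
    """Toy objective: minimize the sum of squares."""
    return sum(v * v for v in x)

def local_search(x, f, step=0.1, iters=20):
    """The 'meme': a simple hill climber that refines one candidate solution."""
    best, best_f = list(x), f(x)
    for _ in range(iters):
        cand = [v + random.gauss(0.0, step) for v in best]
        cf = f(cand)
        if cf < best_f:
            best, best_f = cand, cf
    return best, best_f

def memetic_algorithm(f, dim=5, pop_size=20, generations=50):
    # Initialize a random population and evaluate it.
    pop = [[random.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(pop_size)]
    fitness = [f(ind) for ind in pop]
    for _ in range(generations):
        # Tournament selection of two parent indices.
        parents = [min(random.sample(range(pop_size), 3), key=lambda i: fitness[i])
                   for _ in range(2)]
        # Uniform crossover followed by Gaussian mutation.
        child = [random.choice((pop[parents[0]][d], pop[parents[1]][d]))
                 + random.gauss(0.0, 0.05) for d in range(dim)]
        # Apply the meme (local search) to the offspring before insertion.
        child, child_f = local_search(child, f)
        # Replace the worst individual if the refined child improves on it.
        worst = max(range(pop_size), key=lambda i: fitness[i])
        if child_f < fitness[worst]:
            pop[worst], fitness[worst] = child, child_f
    best = min(range(pop_size), key=lambda i: fitness[i])
    return pop[best], fitness[best]

if __name__ == "__main__":
    solution, value = memetic_algorithm(sphere)
    print("best value:", value)

In the book's modern interpretation, the hand-coded local search step above would instead be a knowledge meme learned from a related task and adaptively transmitted to the current search.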