
Latest Publications from the Handbook of Approximation Algorithms and Metaheuristics

Principles and Strategies of Tabu Search
Pub Date : 2018-05-09 DOI: 10.1201/9781351236423-21
F. Glover, M. Laguna, R. Martí
Tabu Search is a meta-heuristic that guides a local heuristic search procedure to explore the solution space beyond local optimality. One of the main components of Tabu Search is its use of adaptive memory, which creates a more flexible search behavior. Memory-based strategies are therefore the hallmark of tabu search approaches, founded on a quest for "integrating principles," by which alternative forms of memory are appropriately combined with effective strategies for exploiting them. A novel finding is that such principles are sometimes sufficiently potent to yield effective problem-solving behavior in their own right, with negligible reliance on memory. Over a wide range of problem settings, however, strategic use of memory can make dramatic differences in the ability to solve problems. Pure and hybrid Tabu Search approaches have set new records in finding better solutions to problems in production planning and scheduling, resource allocation, network design, routing, financial analysis, telecommunications, portfolio planning, supply chain management, agent-based modeling, business process design, forecasting, machine learning, data mining, biocomputation, molecular design, forest management and resource planning, among many other areas.
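To illustrate the ideas in this abstract (this is a minimal sketch on a toy objective, not the authors' implementation), the short-term adaptive memory can be realized as a list of recently modified solution attributes that are temporarily forbidden, with an aspiration criterion that overrides the prohibition when a move beats the best solution found so far:

```python
import random

def tabu_search(objective, n_bits, iters=200, tenure=7, seed=0):
    """Minimal tabu search over 0/1 strings: flip one bit per move and
    keep a short-term memory of recently flipped positions."""
    rng = random.Random(seed)
    current = [rng.randint(0, 1) for _ in range(n_bits)]
    best, best_val = current[:], objective(current)
    tabu = {}  # bit index -> iteration until which flipping it is forbidden
    for it in range(iters):
        candidates = []
        for i in range(n_bits):
            neighbor = current[:]
            neighbor[i] ^= 1
            val = objective(neighbor)
            # aspiration criterion: allow a tabu move if it beats the best so far
            if tabu.get(i, -1) <= it or val > best_val:
                candidates.append((val, i, neighbor))
        val, i, current = max(candidates)  # always move, even downhill
        tabu[i] = it + tenure              # the flipped bit becomes tabu
        if val > best_val:
            best, best_val = current[:], val
    return best, best_val

# toy objective: number of ones (optimum = all ones)
sol, val = tabu_search(lambda b: sum(b), n_bits=12)
```

Because the search always moves to the best admissible neighbor, the tabu list is what prevents it from cycling straight back into the local optimum it just left.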
Citations: 9
Practical Algorithms for Two-Dimensional Packing of General Shapes
Pub Date : 2018-05-09 DOI: 10.1201/9781351236423-33
Yannan Hu, H. Hashimoto, S. Imahori, M. Yagiura
Citations: 1
Reactive Search: Machine Learning for Memory-Based Heuristics
Pub Date : 2005-09-01 DOI: 10.1201/9781351236423-19
R. Battiti, M. Brunato
Most state-of-the-art heuristics are characterized by a number of choices and free parameters, whose appropriate setting raises issues of research methodology. In some cases, these parameters are tuned through a feedback loop that includes the user as a crucial learning component: based on preliminary algorithm tests, the user changes some parameter values and tests different options until acceptable results are obtained. As a consequence, the quality of results does not automatically transfer to different instances, and the feedback loop can require a lengthy "trial and error" process every time the algorithm has to be tuned for a new application. Parameter tuning is therefore a crucial issue both in the scientific development and in the practical use of heuristics. In some cases, the user's role as an intelligent (learning) component makes heuristic results difficult to reproduce and, as a consequence, the competitiveness of alternative techniques depends crucially on the user's capabilities. Reactive Search advocates the use of simple sub-symbolic machine learning to automate the parameter-tuning process and make it an integral (and fully documented) part of the algorithm. If learning is performed online, task-dependent and local properties of the configuration space can be used by the algorithm to determine the appropriate balance between diversification (looking for better solutions in other zones of the configuration space) and intensification (exploring a small but promising part of the configuration space more intensively). In this way, a single algorithm maintains the flexibility to deal with related problems through an internal feedback loop that considers the previous history of the search. In the following, we call a reaction the act of modifying some algorithm parameters in response to the search algorithm's behavior during its execution, rather than between runs.
A reactive heuristic is therefore a technique with the ability to tune some important parameters during execution by means of a machine-learning mechanism. It is important to notice that such heuristics are intrinsically history-dependent; thus, the practical success of this approach in some cases raises the need for a sounder theoretical foundation for non-Markovian search techniques.
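One possible reaction rule in this spirit (an illustrative sketch; the function name, constants, and update rule are assumptions, not the authors' algorithm) grows the prohibition period when the trajectory revisits a configuration, a symptom that more diversification is needed, and shrinks it slowly otherwise:

```python
def reactive_tenure(history_seen, solution_key, tenure, it,
                    increase=1.2, decrease=0.9, min_t=1.0, max_t=50.0):
    """One reaction step: adapt the tabu tenure online from search history.

    history_seen maps a hashable solution fingerprint to the iteration at
    which it was first encountered; `it` is the current iteration.
    """
    if solution_key in history_seen:
        # revisit detected -> prohibit moves for longer (diversify)
        tenure = min(max_t, tenure * increase)
    else:
        history_seen[solution_key] = it
        # no repetition -> relax the prohibition (intensify)
        tenure = max(min_t, tenure * decrease)
    return tenure
```

Calling this after every move makes the parameter setting an internal, fully documented part of the algorithm rather than a manual trial-and-error step, which is the point the abstract argues.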
Citations: 6
Approximation Algorithms for the Selection of Robust Tag SNPs
Pub Date : 2004-09-17 DOI: 10.1201/9781420010749.ch77
Kui Zhang, K. Chao, Yao-Ting Huang, Ting Chen
Recent studies have shown that chromosomal recombination takes place only at certain narrow hotspots. Within the chromosomal region between these hotspots (called a haplotype block), little or no recombination occurs, and a small subset of SNPs (called tag SNPs) is sufficient to capture the haplotype pattern of the block. In practice, tag SNPs may be genotyped as missing data, and we may fail to distinguish two distinct haplotypes due to the ambiguity caused by missing data. In this paper, we formulate this problem as finding a set of SNPs (called robust tag SNPs) that is able to tolerate missing data. To find robust tag SNPs, we propose two greedy algorithms and one LP-relaxation algorithm, which give solutions with approximation ratios of $(m+1)\ln\frac{K(K-1)}{2}$, $\ln\left((m+1)\frac{K(K-1)}{2}\right)$, and $O(m \ln K)$, respectively, where m is the number of SNPs allowed to be missing and K is the number of patterns in the block.
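The greedy idea can be sketched as a set-cover-style loop (an illustrative reconstruction from the abstract, not the paper's exact algorithm): every pair of the K haplotype patterns must be distinguished by at least m+1 chosen SNPs, so that the pair remains distinguishable even if any m of them are missing:

```python
from itertools import combinations

def greedy_robust_tag_snps(haplotypes, m):
    """Greedy sketch: haplotypes is a list of equal-length 0/1 tuples
    (one per pattern). Returns indices of chosen SNPs, or what was
    chosen before coverage stalled if no feasible set exists."""
    K, n = len(haplotypes), len(haplotypes[0])
    pairs = list(combinations(range(K), 2))
    need = {p: m + 1 for p in pairs}  # remaining required coverage per pair
    chosen = []
    while any(v > 0 for v in need.values()):
        best_snp, best_gain = None, 0
        for s in range(n):
            if s in chosen:
                continue
            # how many still-undercovered pairs does SNP s distinguish?
            gain = sum(1 for (a, b) in pairs
                       if need[(a, b)] > 0
                       and haplotypes[a][s] != haplotypes[b][s])
            if gain > best_gain:
                best_snp, best_gain = s, gain
        if best_snp is None:
            break  # no remaining SNP helps: infeasible for this m
        chosen.append(best_snp)
        for (a, b) in pairs:
            if need[(a, b)] > 0 and haplotypes[a][best_snp] != haplotypes[b][best_snp]:
                need[(a, b)] -= 1
    return chosen
```

The logarithmic approximation ratios quoted above are exactly what one expects from this kind of greedy covering argument, since the pairs-to-cover play the role of set-cover elements.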
Citations: 10
Very Large-Scale Neighborhood Search
Pub Date : 2000-09-01 DOI: 10.1201/9781420010749.ch20
Özlem Ergun, Abraham P. Punnen, J. Orlin, R. Ahuja
Many optimization problems that model the essential issues of important real-world decision making are computationally intractable. Therefore, a practical approach for solving such problems is to employ heuristic techniques that find nearly optimal solutions within a reasonable amount of computation time. Improvement algorithms generally start with a feasible solution and iteratively try to obtain a better solution. Neighborhood search algorithms, also called local search algorithms, are a wide class of improvement algorithms in which, at each iteration, an improving solution is found by searching a "neighborhood" of the current solution. A critical issue in the design of a neighborhood search algorithm is defining what solutions constitute the neighborhood of a solution. As a rule of thumb, the larger the neighborhood, the better the quality of the locally optimal solutions, including the final solution selected upon termination. Similarly, the larger the neighborhood, the longer it takes to search it. Thus, a larger neighborhood does not necessarily produce a more effective heuristic unless one can search the larger neighborhood efficiently. This article concentrates on neighborhood search algorithms where the size of the neighborhood is "very large" with respect to the size of the input data and the neighborhood can be searched efficiently. We survey three broad classes of very large-scale neighborhood (VLSN) search algorithms: variable-depth methods in which large neighborhoods are searched heuristically, large neighborhoods that are searched by solving a constrained minimum-cost flow problem, and other situations that give rise to efficiently searchable large neighborhoods. Keywords: very large-scale neighborhood search; cyclic-exchange neighborhood; variable-depth neighborhood; multiexchange neighborhood; heuristics
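For contrast with the very large neighborhoods surveyed here, a small neighborhood that can be searched by plain enumeration (an illustrative toy, assuming a two-way number-partitioning objective): the O(n²) "swap one item across the partition" neighborhood. VLSN methods replace this kind of enumeration with an implicit search over exponentially many compound moves, e.g. via shortest-path or min-cost-flow subproblems.

```python
def imbalance(assign, weights):
    """Two-way number partitioning objective: |sum(side 0) - sum(side 1)|."""
    s0 = sum(w for a, w in zip(assign, weights) if a == 0)
    return abs(sum(weights) - 2 * s0)

def best_swap_neighbor(assign, weights):
    """Exhaustively search the swap neighborhood: exchange one item on
    side 0 with one item on side 1, and return the best neighbor found."""
    best, best_val = assign[:], imbalance(assign, weights)
    for i, ai in enumerate(assign):
        for j, aj in enumerate(assign):
            if ai == 0 and aj == 1:
                nb = assign[:]
                nb[i], nb[j] = 1, 0
                v = imbalance(nb, weights)
                if v < best_val:
                    best, best_val = nb, v
    return best, best_val

# weights 8,7 on side 0 and 6,5 on side 1 (imbalance 4);
# swapping 8 and 6 balances the partition exactly.
best, val = best_swap_neighbor([0, 0, 1, 1], [8, 7, 6, 5])
```

The n² candidate moves here are cheap to enumerate; the point of the survey is what to do when the natural neighborhood (cyclic exchanges among many subsets, variable-depth move chains) is far too large to enumerate.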
Citations: 84
Stochastic Local Search
Pub Date : 1900-01-01 DOI: 10.1201/9781420010749.ch19
H. Hoos, T. Stützle
Citations: 128
Restriction Methods
Pub Date : 1900-01-01 DOI: 10.1201/9781420010749.ch3
T. Gonzalez
Citations: 0
LP Rounding and Extensions
Pub Date : 1900-01-01 DOI: 10.1201/9781420010749.ch7
Ramesh Krishnamurti, D. Gaur
Citations: 0
Greedy Methods
Pub Date : 1900-01-01 DOI: 10.1201/9781420010749.ch4
S. Khuller, B. Raghavachari, N. Young
Citations: 7
Neural Networks
Pub Date : 1900-01-01 DOI: 10.1201/9781420010749.ch22
H. Siegelmann, B. Dasgupta, Derong Liu
Artificial neural networks have been proposed as a tool for machine learning (e.g., see [23, 41, 47, 52]), and many results have been obtained regarding their application to practical problems in robotics control, vision, pattern recognition, grammatical inference, and other areas (e.g., see [8, 19, 29, 61]). In these roles, a neural network is trained to recognize complex associations between inputs and outputs that were presented during a supervised training cycle. These associations are incorporated into the weights of the network, which encode …
∗Supported in part by NSF grants CCR-0206795, CCR-0208749 and IIS-0346973.
Citations: 1