A Locally Tuned Neural Network for Ground Truth Incorporation

Jenq-Neng Hwang, D. T. Davis, L. Tsang
DOI: 10.1109/IGARSS.1992.578341
Published in: [Proceedings] IGARSS '92 International Geoscience and Remote Sensing Symposium, 1992-05-26
Citations: 5

Abstract

In most remote sensing applications, the forward problem denotes the calculation of fields and waves from given parameters of the media. The inverse problem is to calculate the target or media parameters from measured fields and waves through relevant remote sensing electromagnetic theory. One of the most important steps in applying artificial neural networks (ANNs) to parameter inversion problems in remote sensing is to first establish a very reliable approximation of the true forward mapping, y = N(x), based on an ANN approximation N̂, trained by data pairs of the media parameters x and the measurements of the fields and waves y, generated through a remote sensing electromagnetic model N′. While the trained ANN N̂ can accurately approximate the electromagnetic model with negligible deviations, the accuracy of the approximation of N′ to the true mapping N can only be verified by some available ground truth, which should be used to fine-tune the trained ANN approximation N̂. In this paper, we apply a minimum disturbance principle in fine-tuning the approximated ANN by incorporating the small amount of available ground truth. More specifically, the ground truth is used to slightly modify the local vicinity of the mapping associated with this pair of training data without disturbing the whole mapping (i.e., without rocking the whole boat). This can be achieved by a locally tuned ANN formed by radial basis functions, instead of the projection-based ANN formed by global logistic sigmoidal functions.

FORWARD MODELS FOR INVERSE MODELS

Remote sensing problems belong to the general class of inverse problems, where we wish to infer the physical parameters that could cause a particular effect. Inverse problems admit two lines of attack: creating a forward model of the process, which must then be manipulated to yield an inverse [3], or creating an explicit inverse of the physical process [7].
An explicit inverse suffers from many-to-one problems when more than one cause could account for a particular effect. Forward models, on the other hand, can accurately model a causal relationship. With a method of inverting the forward model, we can find a possible multiplicity of solutions, from which we can select according to other information or additional constraints we wish to impose [1]. The inversion of a forward model takes the form of a search in the input space of the model for an input which produces the desired output. With gradient information relating the input to some performance criterion, the search of the input space can proceed as a directed search, usually taken in the direction of this gradient.

USE OF DATA DRIVEN MODELS WITH IMPLICIT FUNCTIONS

There are three main types of forward models available: explicit functions, implicit functions, and data driven models. Explicit functions take the input and perform some direct functional mapping from input to output. To iteratively obtain an inverse, it is a simple matter of taking gradients of some performance criterion with respect to the inputs, using the known functional mapping. Implicit functions occur when we perform some operation on initial values but do not have a direct functional relationship, as with initial conditions and their differential equations. Without a direct functional form for the implicit function, it is often infeasible to calculate gradients and perform a directed search in the input space. Data driven models come in two types: parametric and non-parametric. Parametric models assume some functional form for the process, while non-parametric models use flexible basis functions able to represent a wide variety of functions. Data driven models are iteratively trained with representative data, adjusting the parameters of the model by minimizing the squared error between the data and the actual output of the model.
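The directed search described above can be sketched as plain gradient descent on the input of a forward model. This is a minimal illustration, not the paper's method: the function `forward` is a hypothetical toy stand-in for the electromagnetic model, and the step size and iteration count are arbitrary choices.

```python
def forward(x):
    # Toy differentiable forward model y = f(x); a hypothetical stand-in,
    # not a remote sensing electromagnetic model.
    return x ** 3 + 2.0 * x

def invert(y_target, x0=0.0, lr=0.01, steps=2000):
    """Directed search in the input space: descend the gradient of the
    squared error between forward(x) and the desired output."""
    x = x0
    for _ in range(steps):
        err = forward(x) - y_target          # residual at the current input
        grad = err * (3.0 * x ** 2 + 2.0)    # d/dx of 0.5*err**2 via the chain rule
        x -= lr * grad
    return x

# forward(1) = 3, so the search should drive x toward 1.
x_hat = invert(y_target=3.0)
```

Because the search only needs gradients of the model output with respect to its input, the same loop applies unchanged when `forward` is replaced by a trained data-driven copy of an implicit model.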
Data driven models can be very useful for inverse problems, provided that the gradient of the output vector with respect to the input vector can be found. This is true not only of data driven applications; when we have an implicit functional relationship between the input and output, a data driven model can provide us with the means of inverting that implicit functional relationship. By training a data driven model with data produced by our implicit model, we can make a copy of our implicit model that we can invert.

USING DIFFERENT TYPES OF DATA

There are a few issues in connection with training a data driven model. One of these issues is the accuracy of the model. It may happen that the model is not uniformly accurate throughout its input space. We would then desire a method to fine-tune this area without retraining the entire model. Another issue is the type of data we use, and the domain we derive it from. We may use an analytic model to obtain our data, or we may be using measured data. Our measured data may come from a changing environment, or from a changing sample population. It would be advantageous to be able to use this data to update our model, with a minimum of training and a minimum of disturbance. Similarly, when using an analytic model, we may sometimes have a few points of measured data, and it would again be desirable to add the information this data represents to our model without having to train a complete model. These problems can be solved by the use of locally tuned functions about our regions of interest [5]. These functions would serve as perturbations of our trained representation. Moreover, the different types of data and different types of error can be provided for simultaneously with our locally tuned functions. For example, we may receive more data in a particular region of the input space, and our model may have been ill trained in that region. A locally tuned function can be added to our model which alleviates both sources of error.
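The minimum-disturbance idea can be sketched for a single ground-truth pair: a Gaussian radial basis function centred on the ground-truth input makes the corrected model exact at that point, while the correction decays to essentially zero elsewhere. The base model, the class name `LocalCorrection`, and all numerical values here are illustrative assumptions, not the paper's implementation.

```python
import math

def base_model(x):
    # Stand-in for an already-trained forward model (the paper's base
    # network would be a sigmoidal ANN; this toy line is hypothetical).
    return 2.0 * x

class LocalCorrection:
    """Perturb a trained model with one Gaussian RBF so that it matches
    a ground-truth pair (x_truth, y_truth) exactly, while the correction
    vanishes away from x_truth (minimum disturbance)."""
    def __init__(self, model, x_truth, y_truth, width=0.1):
        self.model = model
        self.c = x_truth
        self.w = width
        # RBF weight chosen so the corrected model is exact at x_truth,
        # since the basis function equals 1 at its own centre.
        self.a = y_truth - model(x_truth)

    def __call__(self, x):
        bump = math.exp(-((x - self.c) ** 2) / (2.0 * self.w ** 2))
        return self.model(x) + self.a * bump

# Ground truth says the output at x = 1.0 should be 2.5, not 2.0.
tuned = LocalCorrection(base_model, x_truth=1.0, y_truth=2.5)
```

Evaluating `tuned(1.0)` returns the ground-truth value, while inputs far from the centre (e.g. `tuned(5.0)`) remain indistinguishable from the untouched base model, so the rest of the boat is not rocked.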
We propose using radial basis functions as locally tuned functions to add to trained neural networks composed of sigmoidal activations. This provides a powerful combination of the features of sigmoid networks and radial basis functions, using the best features of each.
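The contrast between the two kinds of unit can be shown directly. These are the standard textbook definitions of a sigmoidal unit and a Gaussian RBF unit, not code from the paper: perturbing an RBF unit's weight changes the network only near the unit's centre, while perturbing a sigmoid unit's weight changes the output over the whole input space.

```python
import math

def sigmoid_unit(x, w=1.0, b=0.0):
    # Projection-based unit: its response is global, saturating toward
    # 0 or 1 but nonzero over the entire input line.
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def rbf_unit(x, center=0.0, width=1.0):
    # Locally tuned unit: its response peaks at the centre and decays
    # to zero away from it.
    return math.exp(-((x - center) ** 2) / (2.0 * width ** 2))
```

Far from the origin the RBF unit's output is negligible while the sigmoid's is not, which is why adjusting an added RBF term leaves the trained sigmoidal mapping undisturbed outside its local region.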