Grammar Normalization to Support Automated Merging
Pub Date: 2020-10-02 | DOI: 10.1109/IETC47856.2020.9249122
Rosetta Roberts, Isaac Griffith
Introduction: Current solutions to multilingual parsing for programming languages are flawed: existing implementations are either limited in scope or difficult to develop and maintain. Developing multilingual parsers requires combining multiple base grammars, which creates a maintenance burden as those grammars evolve. Such a repetitive process should be automated. Objective: Develop an approach to normalize a grammar such that it remains equivalent to the original but is in a form that reduces the effort to merge grammars by reducing ambiguity in automated merge decision making. Methods: The normalization procedure transforms grammars so that each production takes one of two forms. Additionally, normalized grammars maintain a set of further constraints we identified as useful. A pilot study demonstrating this approach was conducted on three existing grammars. Results: The normalization algorithm was shown to correctly normalize all three grammars. Conclusions: This work presents a normalization method that eases the development of automated merging of programming language grammars.
{"title":"Grammar Normalization to Support Automated Merging","authors":"Rosetta Roberts, Isaac Griffith","doi":"10.1109/IETC47856.2020.9249122","DOIUrl":"https://doi.org/10.1109/IETC47856.2020.9249122","url":null,"abstract":"Introduction: Current solutions to multilingual parsing, for programming languages, are flawed. Current implementations are either limited in scope or difficult to develop and maintain. The development of multilingual parsers requires the combination of multiple base grammars, leading to a maintenance headache as these grammars evolve. Such a repetitive process should be automated. Objective: Develop an approach to normalize a grammar in such a way that the grammar is equivalent to the original, but in a state which reduces the effort to merge grammars by reducing ambiguity in automated merge decision making. Methods: This normalization procedure transforms grammars such that each production is one of two forms. Additionally, the normalized grammars maintain a set of additional constraints we identified as useful. A pilot study demonstrating this approach was conducted on three existing grammars. Results: The normalization algorithm was shown to correctly normalize the three grammars. Conclusions: This work presents a normalization method towards easing the development of automatically merging programming language grammars.","PeriodicalId":186446,"journal":{"name":"2020 Intermountain Engineering, Technology and Computing (IETC)","volume":"50 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131753573","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Visualizing Air Voids and Synthetic Fibers from X-Ray Computed Tomographic Images of Concrete
Pub Date: 2020-10-02 | DOI: 10.1109/IETC47856.2020.9249173
Amanda Bordelon, Sungmin Hong, Yohann Béarzi, C. Vachet, G. Gerig
A challenge in quality control for synthetic fiber-reinforced concrete is determining the actual spatial distribution of fibers. This paper presents the first computer algorithm to identify synthetic macrofibers within hardened concrete scanned in an industrial X-ray computed tomographic scanner. The algorithm can also be used to obtain the spatial distribution of other inclusions, such as air voids or steel fibers. Visualization of synthetic fibers was the primary focus of this work. The heterogeneous nature of concrete yields noisy images, which makes contrast-based edge segmentation difficult. To identify only fibers, air voids touching the fibers must be identified separately, because they are similar in grayscale to the synthetic fibers. These air voids are assumed to be spherical in shape and, once identified, can be extracted from the remaining fiber-aggregate-cement system. In this study, it was determined that the algorithm works best for straight macrosynthetic fibers when the pixel resolution is similar to or smaller than the fiber diameter and the fibers remain straight lines in the 3D matrix.
{"title":"Visualizing Air Voids and Synthetic Fibers from X-Ray Computed Tomographic Images of Concrete","authors":"Amanda Bordelon, Sungmin Hong, Yohann Béarzi, C. Vachet, G. Gerig","doi":"10.1109/IETC47856.2020.9249173","DOIUrl":"https://doi.org/10.1109/IETC47856.2020.9249173","url":null,"abstract":"A challenge in quality control for synthetic fiber-reinforced concrete is determining the actual spatial distribution of fibers. This paper presents the first computer algorithm to identify synthetic macrofibers within hardened concrete that has been scanned in an industrial X-ray computed tomographic scanner. The algorithm can also be used to obtain the spatial distribution of other inclusions such as air voids or steel fibers as well. Visualization of synthetic fibers was the primary focus of this work. The heterogeneous nature of concrete results in a noisy image which makes identifying contrast edge segmentation difficult for to the image processing. In order to identify only fibers, the air voids touching the fibers must be identified separately because they are similar in grayscale as the synthetic fibers. These air voids are assumed to be spherical in shape, and once identified can be extracted from the remaining fiber-aggregate-cement system. In this study, it was determined that the algorithm works best for straight macrosynthetic fibers where the pixel resolution is similar or smaller than the diameter of the fibers and if the fibers remain straight lines in the 3D matrix.","PeriodicalId":186446,"journal":{"name":"2020 Intermountain Engineering, Technology and Computing (IETC)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124611522","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Mitigation of Cracking in Concrete Bridge Decks Using Twisted Steel Micro-Rebar
Pub Date: 2020-10-02 | DOI: 10.1109/IETC47856.2020.9249109
Aubrey L. Hebdon, Elizabeth D. S. Smith, W. Guthrie
The objective of this research was to investigate the effects of twisted steel micro-rebar (TSMR) fibers on the early cracking behavior of concrete bridge decks. The methodology involved the evaluation of four newly constructed bridge decks, two constructed using conventional concrete and two constructed using TSMR. At bridge deck ages of 3 months, 1 year, and 2 years, the extent and severity of any deck surface cracking were documented in terms of crack lengths and widths, respectively. The data were used to create crack maps, and crack density was calculated for each bridge deck. Bridge decks containing TSMR exhibited notably reduced cracking compared to the conventional decks and also limited the expansion of existing cracks. The significant decrease in the number of cracks and the reduced crack widths indicate that the TSMR fibers were successful in mitigating bridge deck cracking.
{"title":"Mitigation of Cracking in Concrete Bridge Decks Using Twisted Steel Micro-Rebar","authors":"Aubrey L. Hebdon, Elizabeth D. S. Smith, W. Guthrie","doi":"10.1109/IETC47856.2020.9249109","DOIUrl":"https://doi.org/10.1109/IETC47856.2020.9249109","url":null,"abstract":"The objective of this research was to investigate the effects of twisted steel micro-rebar (TSMR) fibers on the early cracking behavior of concrete bridge decks. The methodology for this research involved the evaluation of four newly constructed bridge decks, two that were constructed using conventional concrete and two that were constructed using TSMR. At bridge deck ages of 3 months, 1 year, and 2 years, the extent and severity of any deck surface cracking was documented in terms of crack lengths and widths, respectively. The data were used to create crack maps, and crack density was calculated for each bridge deck. Bridge decks containing TSMR exhibited notably reduced cracking when compared to the conventional decks. The bridge decks containing TSMR also exhibited the ability to limit the expansion of existing cracks. The significant decrease in the amount of cracks and reduced crack widths indicate that the TSMR fibers were successful in mitigating bridge deck cracking.","PeriodicalId":186446,"journal":{"name":"2020 Intermountain Engineering, Technology and Computing (IETC)","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124464031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Phase Effects on Speech and Its Influence on Warped Speech
Pub Date: 2020-10-02 | DOI: 10.1109/IETC47856.2020.9249193
Al-Waled Al-Dulaimi, T. Moon, J. Gunther
A method of transforming speech from one speaker's voice to another is discussed. It operates by moving speech magnitude information from a source speaker to a target speaker using dynamic warping in both the time domain and the frequency domain. Because this process involves only spectral magnitudes, it has been found to introduce significant deleterious signal processing artifacts. Reconstructing phase information has been found to significantly improve the quality of the transformed speech.
{"title":"Phase Effects on Speech and Its Influence on Warped Speech","authors":"Al-Waled Al-Dulaimi, T. Moon, J. Gunther","doi":"10.1109/IETC47856.2020.9249193","DOIUrl":"https://doi.org/10.1109/IETC47856.2020.9249193","url":null,"abstract":"A method of transforming speech from one speaker's voice to another is discussed, which operates by moving speech magnitude information from a source speaker to a target speaker using a process involving dynamic warping in both the time domain and the frequency domain. This process involves only spectral magnitudes, and has been found to introduce significant deleterious signal processing artifacts. It has been found that by reconstruction of phase information significantly improves the quality of the transformed speech.","PeriodicalId":186446,"journal":{"name":"2020 Intermountain Engineering, Technology and Computing (IETC)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129099751","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Research Paper Classification using Supervised Machine Learning Techniques
Pub Date: 2020-10-02 | DOI: 10.1109/IETC47856.2020.9249211
S. Chowdhury, M. Schoen
In this work, different Machine Learning (ML) techniques are used and evaluated based on their performance in classifying peer-reviewed published content. The ultimate objective is to extract meaningful information from published abstracts. In pursuing this objective, the ML techniques are used to classify publications into three fields: Science, Business, and Social Science. The ML techniques applied in this work are Support Vector Machines, Naïve Bayes, K-Nearest Neighbor, and Decision Tree. In addition to describing the ML algorithms, the methodology and algorithms for text recognition using these techniques are provided. The comparative study, based on four different performance measures, suggests that, with the exception of the Decision Tree algorithm, the proposed ML techniques with the detailed pre-processing algorithms work well for classifying publications into categories based on the text of the abstract.
{"title":"Research Paper Classification using Supervised Machine Learning Techniques","authors":"S. Chowdhury, M. Schoen","doi":"10.1109/IETC47856.2020.9249211","DOIUrl":"https://doi.org/10.1109/IETC47856.2020.9249211","url":null,"abstract":"In this work, different Machine Learning (ML) techniques are used and evaluated based on their performance of classifying peer reviewed published content. The ultimate objective is to extract meaningful information from published abstracts. In pursuing this objective, the ML techniques are utilized to classify different publications into three fields: Science, Business, and Social Science. The ML techniques applied in this work are Support Vector Machines, Naïve Bayes, K-Nearest Neighbor, and Decision Tree. In addition to the description of the utilized ML algorithms, the methodology and algorithms for text recognition using the aforementioned ML techniques are provided. The comparative study based on four different performance measures suggests that – with the exception of Decision Tree algorithm – the proposed ML techniques with the detailed pre-processing algorithms work well for classifying publications into categories based on the text provided in the abstract.","PeriodicalId":186446,"journal":{"name":"2020 Intermountain Engineering, Technology and Computing (IETC)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133416704","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Mechanistic-Empirical Pavement Design for Minor Collector Streets Incorporating Cement-Treated Base Layers
Pub Date: 2020-10-02 | DOI: 10.1109/IETC47856.2020.9249120
Alec Escamilla, C. Staples, P. Andersen, W. Guthrie
The objective of this work was to perform project-level studies of selected pavement segments in Springville City to develop new standard pavement designs based on appropriate traffic levels, material properties, and climatic conditions. Mechanistic-empirical (M-E) analyses were then performed to design new standard pavement structures for minor collector streets in Springville City. Because of the engineering, economic, and environmental benefits of re-using existing materials, full-depth reclamation in conjunction with cement stabilization was specifically considered. The results of the M-E analyses indicate that, as the asphalt and cement-treated base (CTB) layer thicknesses increase, the number of allowable equivalent single axle loads (ESALs) also increases. In addition, as the 7-day unconfined compressive strength (UCS) increases, the number of allowable ESALs increases. Among the options, a 7-day UCS of 500 psi may be preferred in many instances because it allows the shallowest pavement designs with the least asphalt, which would be expected to be the least expensive. The advanced M-E design process demonstrated in this study has broad application for developing standard pavement designs, especially those incorporating CTB layers, for other street classes and in other regions.
{"title":"Mechanistic-Empirical Pavement Design for Minor Collector Streets Incorporating Cement-Treated Base Layers","authors":"Alec Escamilla, C. Staples, P. Andersen, W. Guthrie","doi":"10.1109/IETC47856.2020.9249120","DOIUrl":"https://doi.org/10.1109/IETC47856.2020.9249120","url":null,"abstract":"The objective of this work was to perform project-level studies of selected pavement segments in Springville City to develop new standard pavement designs based on appropriate traffic levels, material properties, and climatic conditions. Mechanistic-empirical (M-E) analyses were then performed to design new standard pavement structures for minor collector streets in Springville City. Because of the engineering, economic, and environmental benefits of re-using existing materials, full-depth reclamation in conjunction with cement stabilization was specifically considered. The results of the M-E analyses indicate that, as the asphalt and CTB layer thicknesses increase, the number of allowable equivalent single axle loads (ESALs) also increases. In addition, as the 7-day unconfined compressive strength (UCS) values increase, the number of allowable ESALs increases. Among the options, a 7-day UCS value of 500 psi may be preferred in many instances to allow the shallowest pavement designs with the least amount of asphalt, which would be expected to be the least expensive. The advanced M-E design process demonstrated in this study has broad application for developing standard pavement designs, especially those incorporating CTB layers, for other street classes and in other regions.","PeriodicalId":186446,"journal":{"name":"2020 Intermountain Engineering, Technology and Computing (IETC)","volume":"157 ","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133558616","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Building Graphs with Maximum Connectivity
Pub Date: 2020-10-02 | DOI: 10.1109/IETC47856.2020.9249130
M. Jafarpour, Mohammad Shekaramiz, A. Javan, A. Moeini
The increasing volume of data and algorithmic complexity in recent years have made topics such as cloud computing extremely important. Improving the communication parameters in cloud computing leads to benefits such as increased efficiency and reduced execution time and energy consumption. Communication can be modeled mathematically with graphs, where network requirements are evaluated through graph parameters. Parameters such as connectivity, toughness, tenacity, and graph diameter are commonly used in cloud computing, social networks, network security, and related fields. For example, connectivity is one of the major graph parameters characterizing a graph's vulnerability; it is evaluated in two modes, vertex connectivity and edge connectivity. A construction of graphs with maximum vertex connectivity was proposed by Harary, and the resulting graphs are referred to as Harary graphs. In this paper, we explore graphs with a different structure from Harary graphs and attempt to improve other parameters, such as graph diameter and toughness, in the proposed graphs. Beyond cloud computing, the theoretical results of the proposed models are applicable in fields such as social networks, network security, electronic circuit design, geography and urban design, and bioinformatics.
{"title":"Building Graphs with Maximum Connectivity","authors":"M. Jafarpour, Mohammad Shekaramiz, A. Javan, A. Moeini","doi":"10.1109/IETC47856.2020.9249130","DOIUrl":"https://doi.org/10.1109/IETC47856.2020.9249130","url":null,"abstract":"The increasing data volume and the algorithms complexity in recent years has made topics such as cloud computing extremely important. Meanwhile, improving the communication parameters in cloud computing has led to improvements such as the increase in efficiency, and the reduction of execution time and the energy consumption. The concept of communication can be treated as mathematical modeling with graphs, where the network requirements are evaluated with the graph parameters. Parameters such as connectivity, toughness, tenacity, and graph diameter are commonly used in cloud computing, social networks, networks security, etc. For example, connectivity is one of the major graph parameters that defines the extent of the graph's vulnerability. This parameter is evaluated in two vertex and edge modes. Creating graphs with a maximum vertex connectivity value was proposed by Harary and is referred to as Harary graphs. In this paper, we explore other graphs that have a different structure from Harary graphs and we make an attempt to improve other parameters such as graph diameter and toughness in the proposed graphs. In addition to cloud computing, the theoretical results of the proposed models are applicable in various fields such as social networks, network security, electronic circuit design, geography and urban design, bioinformatics and more.","PeriodicalId":186446,"journal":{"name":"2020 Intermountain Engineering, Technology and Computing (IETC)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130224195","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Reducing the Relative Error Between the Experimental and Numerical Results of a Pipeline Leak Flowrate Using a Lean Six-Sigma Based Approach
Pub Date: 2020-10-02 | DOI: 10.1109/IETC47856.2020.9249145
W. Chalgham, A. Seibi, Jim Lee
One of the limitations in assessing pipeline performance and structural integrity is the need for continuous inspection for possible leaks caused by corrosion or other failure mechanisms. Efforts to develop new technologies started several decades ago, when different inspection techniques were used to enhance pipeline structural integrity. A research project has been started with the goal of monitoring pipeline health and detecting leaks, along with their location and size, as soon as they occur. One step of the project is an experimental study mimicking a leaking pipeline, used to develop correlations between leaks and their location and size. The goal of the experiment is to validate the simulation model by comparing the measured data with the simulation results. If the difference is negligible and the model is verified, the remaining tests, in which leak location and size are varied, will be conducted only in the simulation software to avoid additional experimental cost and time. However, after running the first experiment, the recorded data did not match the simulation results. This paper aims at reducing the relative error between the experimental and numerical results using a lean six-sigma based approach in order to proceed with the next steps of the research project.
{"title":"Reducing the Relative Error Between the Experimental and Numerical Results of a Pipeline Leak Flowrate Using a Lean Six-Sigma Based Approach","authors":"W. Chalgham, A. Seibi, Jim Lee","doi":"10.1109/IETC47856.2020.9249145","DOIUrl":"https://doi.org/10.1109/IETC47856.2020.9249145","url":null,"abstract":"One of the limitations of pipelines performance and structural integrity assessment is the continuous inspection of possible leaks due to corrosion or other types of failure mechanisms. Efforts to develop new technologies started several decades ago where different inspection techniques were used to enhance pipelines structural integrity. A research project has started with the goal of monitoring the pipeline health and detecting leaks and their location and size once they occur. One of the steps of the research project is to conduct an experimental study mimicking a leaking pipeline and develop correlations between leaks and their location and size. The goal of the experiment is to validate the simulation model by comparing the measured data with the simulation results. If the difference is negligible and the model is verified, the rest of the tests where the leak location and size are varied will be conducted only in the simulation software to avoid additional experimental cost and time. However, after running the first experiment, the recorded data did not match the simulation results. This paper aims at reducing the relative error between the experimental and numerical results using a lean six-sigma based approach in order to proceed with the next steps of the research project.","PeriodicalId":186446,"journal":{"name":"2020 Intermountain Engineering, Technology and Computing (IETC)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129797972","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Do Empirical and Abstract Shannon Entropies Converge in Value? A Case in RNA Molecular Structure
Pub Date: 2020-10-02 | DOI: 10.1109/IETC47856.2020.9249220
Amirhossein Manzourolajdad
The RNA molecule is capable of folding into different shapes, with some being more stable than others. The structural space of an RNA can be described by Stochastic Context-Free Grammars (SCFGs), offering a probabilistic distribution over structural scenarios. In a more accurate folding model, the probability associated with a structural scenario is more informative of its stability. Here, we offer two different ways of calculating the Shannon entropy of the SCFG-modeled RNA structural space: Grammar Space (GS) entropy and SCFG entropy. The former is the Shannon entropy of the infinite set of structures produced by the model, and the latter is the Shannon entropy of a limited subset of structures, all of which belong to the same RNA sequence. After a brief introduction to the two measures, we explore the relationship between them on a given set of RNA folding models and biologically functional RNA sequences. We show that these two measures of entropy are indeed correlated. While more theoretical work is needed to understand the convergence behavior between the two, this observation suggests that GS entropy is a promising characteristic for future model training approaches.
{"title":"Do Empirical and Abstract Shannon Entropies Converge in Value? A Case in RNA Molecular Structure","authors":"Amirhossein Manzourolajdad","doi":"10.1109/IETC47856.2020.9249220","DOIUrl":"https://doi.org/10.1109/IETC47856.2020.9249220","url":null,"abstract":"The RNA molecule is capable of folding into different shapes, with some being more stable than others. The structural space of the RNA can be described by Stochastic Context-free Grammars (SCFG), offering a probabilisitic distribution of structural scenarios. In a more accurate folding model, the probability associated with a structural scenario is more informative of its stability. Here, we offer two different ways of calculating the Shannon Entropy of the SCFG-modeled RNA structural space; Grammar Space (GS) Entropy and SCFG Entropy. The former is the Shannon Entropy of the infinite number of structures produced by the model and the latter is the Shannon Entropy of a limited subset of structures all of which belong to the same RNA sequence. After a brief introduction on the two measures, we explore the relationship between these measures on a given set of RNA folding models and biologically functional RNA sequences. We show that these two measures of entropy are indeed correlated. While more theoretical work is needed in understanding the convergence behavior between the two, this observation suggests that GS Entropy is a promising characteristic in future model training approaches.","PeriodicalId":186446,"journal":{"name":"2020 Intermountain Engineering, Technology and Computing (IETC)","volume":"81 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127173068","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Progress Toward Airborne GPS Spatial Filtering Powered by Recent Advances in FPGA Technology
Pub Date: 2020-10-02 | DOI: 10.1109/IETC47856.2020.9249115
Jakob W. Kunzler, Spencer M. Ammermon, K. Warnick
Next-generation wireless devices use many coherent antennas to enhance wireless performance at the expense of size, weight, power, cost, and complexity. This has made many multiantenna signal processing architectures unfit for deployment on unmanned aircraft. Recent advances in FPGA technology have reduced the footprint of a 16-antenna signal processor to a size feasible for unmanned aircraft. We are exploring the capabilities of this new technology by demonstrating a GPS system that uses an adaptive beamformer for spatial filtering. The multiantenna implementation will increase the quality of GPS services and provide resistance to multipath and adversarial spoofing. This paper documents early progress in developing and verifying the firmware for the future demonstration.
{"title":"Progress Toward Airborne GPS Spatial Filtering Powered by Recent Advances in FPGA Technology","authors":"Jakob W. Kunzler, Spencer M. Ammermon, K. Warnick","doi":"10.1109/IETC47856.2020.9249115","DOIUrl":"https://doi.org/10.1109/IETC47856.2020.9249115","url":null,"abstract":"Next-generation wireless devices use many coherent antennas to enhance wireless performance at the expense of size, weight, power, cost, and complexity. This has caused many multiantenna signal processing architectures to be unfit for deployment on unmanned aircraft. Recent advances in FPGA technology have reduced the footprint of a 16 antenna signal processor to a feasible size for unmanned aircraft. We are exploring the capabilities of this new technology by demonstrating an adaptive beamformer spatial filtered GPS system. The multiantenna implementation will increase the quality of GPS services and provide resistance to multipath and adversarial spoofing. This paper documents early progress in developing and verifying the firmware for the future demonstration.","PeriodicalId":186446,"journal":{"name":"2020 Intermountain Engineering, Technology and Computing (IETC)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130354374","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}