Constructing digital elevation models (DEMs) from dense LiDAR point data is becoming increasingly important. Natural Neighbor Interpolation (NNI) is a popular approach to DEM construction from point datasets but is computationally intensive. In this study, we present a set of General-Purpose computing on Graphics Processing Units (GPGPU) based algorithms that can significantly speed up the process. Evaluation on three real-world LiDAR datasets, each containing 6-7 million points, shows that our CUDA-based implementation on an NVIDIA GTX 480 GPU is several times to nearly two orders of magnitude faster than the current state-of-the-art NNI-based DEM construction using graphics hardware acceleration.
Simin You and Jianting Zhang, "Constructing natural neighbor interpolation based grid DEM using CUDA," International Conference and Exhibition on Computing for Geospatial Research & Application, 2012. DOI: 10.1145/2345316.2345349
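The abstract does not reproduce the authors' CUDA kernels, but the flavor of GPU-friendly NNI can be illustrated with a discrete (rasterized) Sibson interpolation, in which every raster cell scatters its nearest sample's value to the cells it would "steal" from the Voronoi diagram if inserted as a new site. The sketch below is a minimal CPU version in Python/NumPy under that assumption; the function name, the k-d tree lookup, and the naive O(cells^2) scatter loop are illustrative, not the paper's implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def discrete_sibson(points, values, grid_w, grid_h):
    """Rasterize scattered (x, y) samples with values onto a grid_w x grid_h DEM."""
    xs, ys = np.meshgrid(np.arange(grid_w), np.arange(grid_h))
    cells = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)

    # Distance from every grid cell to its nearest sample, and that sample's value.
    tree = cKDTree(points)
    dist, idx = tree.query(cells)
    nearest_val = values[idx]

    acc = np.zeros(len(cells))
    cnt = np.zeros(len(cells))
    # Scatter step: each cell contributes its nearest sample's value to every cell
    # that lies closer to it than its own nearest sample, i.e. the region it would
    # "steal" in the Voronoi diagram. Naive O(cells^2) loop for clarity.
    for c, d, v in zip(cells, dist, nearest_val):
        stolen = np.flatnonzero(np.linalg.norm(cells - c, axis=1) < d)
        acc[stolen] += v
        cnt[stolen] += 1

    dem = np.where(cnt > 0, acc / np.maximum(cnt, 1), nearest_val)
    return dem.reshape(grid_h, grid_w)

# Example: 200 random samples of the plane z = x + y gridded onto 64 x 64 cells.
pts = np.random.rand(200, 2) * 64
z = pts[:, 0] + pts[:, 1]
dem = discrete_sibson(pts, z, 64, 64)
print(dem.shape, dem.min(), dem.max())
```

On a GPU, each scatter (or each query cell in the equivalent gather form) becomes an independent thread, which is what makes this style of NNI attractive for LiDAR datasets with millions of points.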
In today's highly mobile, networked, and interconnected Internet world, the flow and volume of information is overwhelming and continuously increasing. It is therefore believed that the next frontier in technological evolution and development will rely on our ability to develop intelligent systems that can help us process, analyze, and make sense of information autonomously, just as a well-trained and educated human expert would. In computational intelligence, neuromorphic computing promises to allow the development of computing systems able to imitate natural neuro-biological processes that will form the foundation for intelligent system architectures. This is achieved by artificially re-creating the highly parallelized computing architecture of the mammalian brain. As an interdisciplinary technology inspired by biology, artificial neural systems have been successfully utilized in many applications, such as control systems, signal processing, pattern recognition, vision systems, and robotics. In addition, the emerging neuromorphic computing field can also exploit the characteristic behavior of novel material systems with advanced processing techniques to achieve very-large-scale integration with highly parallel neural architectures for the fabrication of physical architectures. This talk will focus on the technological challenges that we are seeking to overcome to enable intelligent parallel neuromorphic computing systems.
R. Pino, "Computational intelligence and neuromorphic computing potential for geospatial research and applications," International Conference and Exhibition on Computing for Geospatial Research & Application, 2012. DOI: 10.1145/2345316.2345325
Scalable Vector Graphics (SVG) is an effective web map technology for client-side vector mapping. We demonstrate this through a web map application developed for the Trans-Border Institute of the University of San Diego. The Trans-Border Institute currently releases reports summarizing data on the number of drug-related homicides in Mexico and produces maps showing drug-related killings in each Mexican state. Together, the reports and maps are used to inform U.S. audiences about the public security situation in Mexico and the effects of the war on drugs. The deadly consequences of the drug war in Mexico become clear in the maps produced by the Trans-Border Institute. The maps, however, are not as effective as they could be because they are released as static images. These static images do not allow for exploration of the data themselves and make it difficult to view changes over time. To facilitate data exploration and thereby assist the Trans-Border Institute in more effectively disseminating information on the drug war in Mexico, an interactive web map was developed using SVG. For client-side vector graphics, SVG provides clear advantages in that it is open, interoperable, extensible, and resolution independent. In addition, behaviors, including animation, can be included in the markup file itself or added through scripting. These advantages make SVG optimal for developing high-quality interactive (and non-interactive) vector maps. Despite these advantages, SVG has not been widely adopted. However, recent technological changes and trends have made competing systems, like Flash, less optimal and have heightened awareness and eased the use of SVG. Combined, these changes have paved the way for SVG to become the most widely used client-side vector standard. Its effectiveness is demonstrated in an SVG-based interactive web map developed for the Trans-Border Institute and shown in this presentation.
Theresa Firestine and F. Hardisty, "Effectiveness of scalable vector graphics (SVG) for client-side vector mapping," International Conference and Exhibition on Computing for Geospatial Research & Application, 2012. DOI: 10.1145/2345316.2345364
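As a concrete illustration of the point that behaviors can live in the markup itself, the hypothetical snippet below (not the Trans-Border Institute application) writes a two-polygon SVG in which fill opacity encodes a count and a declarative SMIL animate element steps the value over time; all geometry and numbers are placeholders invented for illustration.

```python
# A sketch only: shapes carry readable data (<title>), and behavior can live in the
# markup itself via a declarative SMIL <animate> element or be added later by script.
# All coordinates and counts are placeholders, not Trans-Border Institute data.
svg = """<svg xmlns="http://www.w3.org/2000/svg" width="300" height="100">
  <polygon points="10,10 140,10 140,90 10,90" fill="#b30000" fill-opacity="0.3">
    <title>State A: 120 (placeholder count)</title>
    <animate attributeName="fill-opacity" values="0.3;0.7;0.5" dur="6s"
             repeatCount="indefinite"/>
  </polygon>
  <polygon points="150,10 290,10 290,90 150,90" fill="#b30000" fill-opacity="0.8">
    <title>State B: 310 (placeholder count)</title>
  </polygon>
</svg>"""

with open("choropleth_sketch.svg", "w") as f:
    f.write(svg)
```

Opening the resulting file in any modern browser shows the animation without any script, which is the property that makes SVG attractive for lightweight interactive maps.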
Real-time traffic monitoring systems perform spatial and time-dependent analysis of measurement data of different types, such as traditional inductive loop detector data, microwave radar data, and GPS data. The goal of these systems is to provide information such as average speeds, volumes, and densities on a given segment of a roadway. One of the fastest growing data sources for traffic monitoring systems is GPS data collected from mobile devices. To some extent, GPS data is already replacing traditional traffic sensing technologies in industry.

There is a large demand from industry and transportation agencies for access to the high-resolution state of traffic on highways and arterial roads globally. This means that traffic information providers have to provide traffic information at a resolution that goes beyond the widely used TMC-code-based representation of the roadway.

In order to obtain the high-resolution state of traffic, noisy observations need to be fused into a mathematical model that represents the evolution of the system based on either physics or statistics. Common frameworks for fusing the data into physical models include, for example, Kalman filtering and particle filtering.

Prior to the data fusion stage in the real-time system, offline geospatial modeling has already been done. For example, construction and calibration of an accurate physics-based traffic model includes tasks such as building a directed graph of the road network, detecting road geometry at lane level, and detecting speed limits. In all these tasks, GPS data is vital.

Real-time systems that use GPS data include geospatial pre-processing components such as map matching and path inference. The rapidly growing volume of GPS data cannot be handled using traditional methods; instead, parallel computing systems are needed to handle future volumes. New high-resolution algorithms are also being developed to leverage these parallel processing frameworks.

In this talk I will discuss directions taken to respond to the demand for high-resolution information about the state of traffic, both in the context of modeling and in the implementation of a large-scale system.
Olli-Pekka Tossavainen, "Big data computing for traffic information by GPS sensing," International Conference and Exhibition on Computing for Geospatial Research & Application, 2012. DOI: 10.1145/2345316.2345324
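As a minimal illustration of the Kalman-filter fusion mentioned above, the sketch below tracks the mean speed of a single road segment from noisy GPS speed reports using a scalar random-walk process model; the noise parameters and the function name are assumptions chosen for illustration, not values from any deployed system.

```python
import numpy as np

def kalman_segment_speed(gps_speeds, q=4.0, r=25.0, x0=50.0, p0=100.0):
    """Fuse noisy speed observations (km/h) for one segment into a smoothed estimate.

    q: process noise variance (how fast the true segment speed may drift).
    r: measurement noise variance of a single GPS speed report.
    """
    x, p = x0, p0                 # state estimate and its variance
    estimates = []
    for z in gps_speeds:
        p = p + q                 # predict: random-walk process model
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the GPS observation
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Example: noisy probe speeds around a true value of 62 km/h.
obs = 62 + np.random.randn(20) * 5
print(kalman_segment_speed(obs))
```

The same predict/update structure scales to vector states (per-cell densities and speeds of a physical traffic model); the scalar form just keeps the arithmetic visible.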
The explosion of the power and sophistication of computing applications in the past few years has revolutionized the way we live and work. This marked trend is of special significance for geospatial computing, which directly relates to the very foundations of our society and essentially embraces all the diversification of its activities.

Geospatial information, already important in many scientific and engineering disciplines, is increasingly becoming an integral component of consumer-driven technologies. How can we further improve or enhance geospatial information processing, organization, analysis, and visualization? In particular, handling the rising flood of digital data from many different sources poses serious technical and scientific challenges.

With the rapid progress of information processing across multiple disciplines, there are more and more promising computing technologies that could be employed to solve these problems. At present, cloud computing, mobile computing, visual computing/GPU computing, business intelligence, and social computing are playing key roles in geospatial applications. Some of the latest computing advancements, such as big data computing, heterogeneous computing, Internet of Things (IoT)/sensor computing, and bio-computing, hold great potential for the effective realization of information processing in the geospatial environment.

This talk highlights the impacts of these current and prospective computing technologies on the future of the geo-world. Our consideration is intended to bring fresh thoughts and to explore new directions for geospatial research and development. The talk also provides a vision of the ".geo" term and a combined outlook for both the computer and geospatial communities, i.e., how computing technology is changing the landscape of geospatial applications and how diverse geospatial information processing requires changes in various computing technologies.
Lindi Liao, "A promising future of computing for geospatial technologies," International Conference and Exhibition on Computing for Geospatial Research & Application, 2012. DOI: 10.1145/2345316.2345318
This paper introduces a novel offline map matching approach. We develop a routing-based map matching approach for standardizing identified routes in a collected set of GPS trajectories. Our approach first identifies key waypoints in a user's GPS trajectory using a modified Peucker curve reduction algorithm. Subsequently, it sends the key waypoints to a black-box driving-directions service, which returns a route passing through each of the key waypoints. The returned route is a standardized representation of the original GPS trajectory constructed using the minimum necessary set of points. A filter-and-refine approach is used to identify any incorrect portion of the returned route, and the refine step eliminates the waypoints that lead to the incorrect matching. Experimental results show that the proposed approach works well on a dataset collected by 10 volunteers over an average of 34.3 days each.
Terry W. Griffin, Y. Huang, and Shawn Seals, "Routing-based map matching for extracting routes from GPS trajectories," International Conference and Exhibition on Computing for Geospatial Research & Application, 2011. DOI: 10.1145/1999320.1999344
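The abstract does not detail the authors' modification of the Peucker algorithm, so the sketch below shows the standard Douglas-Peucker reduction that the key-waypoint step builds on; the tolerance value and the toy trace are illustrative only.

```python
import math

def _perp_dist(pt, a, b):
    """Perpendicular distance from pt to the line through segment endpoints a and b."""
    (x, y), (x1, y1), (x2, y2) = pt, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, tol):
    """Return a reduced polyline keeping only points farther than tol from the chord."""
    if len(points) < 3:
        return list(points)
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= tol:
        return [points[0], points[-1]]
    left = douglas_peucker(points[: idx + 1], tol)
    right = douglas_peucker(points[idx:], tol)
    return left[:-1] + right      # drop the duplicated split point

# Example: a noisy near-straight trace plus one turn reduces to a few key waypoints.
trace = [(i, 0.1 * (i % 3)) for i in range(50)] + [(50, 10), (60, 10)]
print(douglas_peucker(trace, tol=1.0))
```

The surviving points are the kind of key waypoints that would then be passed to a driving-directions service in the filter-and-refine loop the paper describes.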
Cyclones are regarded as one of the most dangerous meteorological phenomena of the tropical region. The probability of landfall of a tropical cyclone depends on its movement (trajectory). Analysis of the trajectories of tropical cyclones can be useful for identifying potentially predictable characteristics. In this study, tropical cyclone tracks over the North Indian Ocean basin have been analyzed and grouped into clusters based on their spatial characteristics. For the identified clusters we have also examined characteristics such as life span, maximum sustained wind speed, landfall, and seasonality. The resulting clusters form clear groupings on some of these characteristics. The cyclones with the highest maximum wind speeds and longest life spans are grouped into one cluster. Another cluster includes short-duration cyclonic events that are mostly deep depressions and are significant for rainfall over Eastern and Central India. The clustering approach is likely to prove useful for analyzing events of significance with regard to impacts.
M. Paliwal, A. Patwardhan, and N. L. Sarda, "Analyzing tropical cyclone tracks of North Indian Ocean," International Conference and Exhibition on Computing for Geospatial Research & Application, 2011. DOI: 10.1145/1999320.1999338
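The abstract does not name the clustering algorithm used, so the sketch below shows one common baseline for grouping tracks by spatial shape: resample each track to a fixed number of points and run k-means on the flattened coordinate vectors (an assumption for illustration, not the authors' method).

```python
import numpy as np
from sklearn.cluster import KMeans

def resample_track(track, n=20):
    """Linearly resample an (m, 2) lon/lat track to n evenly spaced points."""
    track = np.asarray(track, dtype=float)
    t = np.linspace(0, 1, len(track))
    ti = np.linspace(0, 1, n)
    return np.column_stack([np.interp(ti, t, track[:, 0]),
                            np.interp(ti, t, track[:, 1])])

def cluster_tracks(tracks, k=4, n=20):
    """Cluster variable-length tracks by the shape/position of their resampled form."""
    X = np.vstack([resample_track(tr, n).ravel() for tr in tracks])
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

# Example with two toy tracks (placeholder coordinates, not best-track data).
toy = [[(85, 10), (86, 12), (87, 15)], [(90, 8), (88, 11), (86, 14)]]
print(cluster_tracks(toy, k=2))
```

Per-cluster statistics such as life span, maximum sustained wind speed, and landfall can then be computed from the label assignments to check whether the spatial groupings align with those characteristics.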
Are you interested in writing an application or a game and making it available on Windows Phone 7 devices through Microsoft's Marketplace? At this workshop, we will demonstrate how you can build Windows Phone 7 applications and games, what tools are available, what public-sector-specific applications have been created, how you can publish your applications, and more. We will answer any questions you may have about Windows Phone 7. So come on over and join us in this fun workshop. Bring your computer if you like, or play with our demo phones and pre-installed applications.
Joel Reyes and Zhiming Xue, "Windows Phone 7 workshop," International Conference and Exhibition on Computing for Geospatial Research & Application, 2011. DOI: 10.1145/1999320.1999404
In a multi-vendor environment, development of the Internet of Things (IoT) will be limited without the emergence of open, consensus standards that enable collaboration. Such standards will define an infrastructure that raises the level of services and quality of information for the marketplace, thereby providing more opportunities, particularly for the vendors that collaborate to define the standards. Collaborative development is key to consensus adoption and wide use of information technology standards.

Development of effective open standards is a balancing act. The standards need to be agile and adaptive to the rapidly changing developments in the marketplace. The standards also need to have a sound engineering foundation and respect relevant aspects of the existing technology base. The use of open standards to connect components, applications, and content -- allowing a "white box" view on the components' functionality and interfaces without revealing implementation details -- fulfills the industry requirement for protection of intellectual property and the user requirement for transparency.

The COM.Geo Workshop on "Expanding GeoWeb to an Internet of Things" is an excellent opportunity to discuss how organizations can increase their business based on quality location information in the Internet of Things. Quality information in a multi-vendor environment can only be obtained using standards. An industry-based consortium is needed to establish effective standards for information sharing about location in the Internet of Things. The Open Geospatial Consortium (OGC) has a proven process for industry-wide collaborative development of efficient standards for spatial and location information.

The mission of OGC is to serve as a global forum for the development and promotion of open standards and techniques in the area of geoprocessing and related information technologies. The OGC has 410+ members - geospatial technology software vendors, systems integrators, government agencies and universities - participating in the consensus standards development and maintenance process. Through its Specification Program, Interoperability Program, and Marketing and Communications Program, the OGC develops, releases and promotes open standards for spatial processing. Technology and content providers collaborate in the OGC because they recognize that lack of interoperability is a bottleneck that slows market expansion. They know that interoperability enabled by open standards positions them to both compete more effectively in the marketplace and to seek new market opportunities.

The OGC recommends the following steps for advancing the GeoWeb to an IoT-based marketplace:
• Definition of a standards-based "GeoWeb meets IoT" framework to spur coordinated application development.
• Coordination of standards for location in IoT with other relevant standards development organizations.
• Discussions of the framework in the OGC Specification Working Groups to identify
G. Percivall, "Collaborative development of open standards for expanding GeoWeb to the internet of things," International Conference and Exhibition on Computing for Geospatial Research & Application, 2011. DOI: 10.1145/1999320.1999398
The models for representing, maintaining, and using "navigable" geographic features are evolving from a 2D centerline roadway model, through a highly detailed 3D pedestrian, indoor, and multimodal model, and into an Internet of (Locatable) Things. As this evolution proceeds, the volume of data that can be processed and delivered to end-user applications could reach an untenable torrent, from both a human cognition and a machine resource perspective.

The key, as always, is information, not just data: contextualized interpretation, not just a collection of undifferentiated ground-truth facts. What's needed at the edges of the GeoWeb - particularly for relatively network- and processing-challenged mobile devices - is the notion of "byte-sized" (pun intended) content that's "right-sized" for each individual actor based on highly dynamic personal or organizational usage contexts.

It's clear that edge applications will continue to play a role in providing such a contextual filter. Less obvious is how other GeoWeb participants will also provide contextual value. The application developer interface to a geodata provider is a pathway for information exchange at application development time, product creation time, and run time. This exchange will inform the processes and business rules that a data provider uses to prioritize the gathering, processing, and correlation of observations, the mediation of geodata product quality-level guarantees, and the delivery models for the application-ready features themselves. The effectiveness of this pathway will depend on low processing latencies, not only between observation detection and feature change availability, but also between an end user's context and what features are provided at what levels of detail.

There is ample precedent in the current vehicle navigation ecosystem for leveraging this pathway to make the resulting user experience compelling and economically viable. Moving to an integrated 3D model of the built and natural world as a framework for an Internet of Things will require enriching and formalizing this interface in order to build contextual value into the GeoWeb.
P. Bouzide, "Navigation-to-thing and highly-context-focused 'around me' use cases," International Conference and Exhibition on Computing for Geospatial Research & Application, 2011. DOI: 10.1145/1999320.1999393