An Underwater Wireless Sensor Network (UWSN) consists of sensor nodes equipped with small batteries of limited energy. A key design issue is therefore energy efficiency, which must be addressed in order to increase the lifetime of the network. In this paper, we apply a hexagon tessellation with an ideal cell size to deploy the sensor nodes of a two-dimensional UWSN. On this setting, we propose an enhanced hybrid transmission method that takes the load balancing of data transmission into account. The proposed method applies a threshold annulus, defined by a node's distance from the Base Station (BS), and allocates different frequencies to different annuli. Simulation results show that the proposed method improves energy efficiency compared with existing multi-hop forwarding and hybrid transmission methods.
{"title":"A new energy efficient data transmission method for underwater wireless sensor networks","authors":"Sangbo Seo, Seungmi Song, E. Kim, Sungun Kim","doi":"10.1145/1456223.1456359","DOIUrl":"https://doi.org/10.1145/1456223.1456359","url":null,"abstract":"The Underwater Wireless Sensor Network (UWSN) consists of sensor nodes equipped with a small battery of limited energy resource. A key design issue here is the energy efficiency that needs to be addressed in order to increase the lifetime of the network. In this paper, we apply a hexagon tessellation with an ideal cell size to deploy the underwater sensor nodes for two-dimensional UWSN. Upon this setting, we propose an enhanced hybrid transmission method that considers load balancing of data transmission in two-dimensional UWSN. The proposed method applies the threshold annulus that is defined as the distance between a node and the Base Station (BS) and allocates different frequencies to different annuluses. The simulation results show that the proposed method enhances the energy efficiency compared to the existing multi-hop forwarding methods and hybrid transmission methods.","PeriodicalId":309453,"journal":{"name":"International Conference on Soft Computing as Transdisciplinary Science and Technology","volume":"74 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127338934","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
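The annulus-based frequency allocation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the annulus width and the per-annulus channel list are invented placeholders, and a node's annulus is simply derived from its distance to the base station.

```python
import math

ANNULUS_WIDTH = 100.0          # metres per annulus (assumed value)
CHANNELS = [10.0, 14.0, 18.0]  # carrier frequency (kHz) per annulus (assumed)

def annulus_index(node_xy, bs_xy, width=ANNULUS_WIDTH):
    """Annulus number of a node, counted outward from the base station."""
    d = math.dist(node_xy, bs_xy)
    return int(d // width)

def channel_for(node_xy, bs_xy):
    """Frequency allocated to the node's annulus (wraps for deeper annuli)."""
    return CHANNELS[annulus_index(node_xy, bs_xy) % len(CHANNELS)]
```

Allocating a distinct frequency per annulus in this way is what lets nodes near the BS relay inner-annulus traffic without colliding with outer-annulus transmissions.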
Component-Based Software Engineering (CBSE) is concerned with assembling pre-existing software components into a software system that meets client-specific requirements. Component selection and component-system assembly have become two of the key issues in this process. This work presents an approach to component selection based on choosing the component that provides the maximum number of operations satisfying the current requirements. A further contribution of this paper addresses the problem of automatic assembly of component systems: the CSAC (Component System Automatic Configurations) algorithm is presented. A case study shows that our approach is efficient and generally applicable in practical scenarios.
{"title":"Automatic configuration for the component selection problem","authors":"A. Vescan, Horia F. Pop","doi":"10.1145/1456223.1456321","DOIUrl":"https://doi.org/10.1145/1456223.1456321","url":null,"abstract":"Component Based Software Engineering (CBSE) is concerned with the assembly of pre-existing software components that leads to a software system that responds to client-specific requirements. Component selection and component systems assembly have become two of the key issues involved in this process.\u0000 This work presents an approach for component selection, solution based on maximum number of provided operations that satisfy the current requirements.\u0000 Another contribution of this paper addresses the problem of automatic assembly of component systems. The algorithm CSAC (Component System Automatic Configurations) is presented. With our case study we show that our approach is efficient and generally applicable in practical scenarios.","PeriodicalId":309453,"journal":{"name":"International Conference on Soft Computing as Transdisciplinary Science and Technology","volume":"120 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121039709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
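The selection criterion in the abstract — pick the component providing the most operations that satisfy the current requirements — is essentially a greedy set cover. The sketch below is an assumed reading of that criterion, not the paper's CSAC algorithm; component names and operation sets are illustrative.

```python
def select_components(components, requirements):
    """Greedy cover of `requirements` (a set of operation names) by
    `components` given as {name: set_of_provided_operations}.
    Returns the chosen component names and any uncoverable operations."""
    remaining = set(requirements)
    chosen = []
    while remaining:
        # component covering the largest number of still-unsatisfied operations
        best = max(components, key=lambda c: len(components[c] & remaining))
        gain = components[best] & remaining
        if not gain:          # no component satisfies any further requirement
            break
        chosen.append(best)
        remaining -= gain
    return chosen, remaining
```

Each iteration re-evaluates components against the *remaining* requirements, so a component that looked strong initially may lose out once its operations are already covered.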
This paper presents some results of an ongoing project aimed at modeling the main concepts related to Customer Relationship Management (CRM). More precisely, the paper presents O-CREAM, a CRM ontology based on DOLCE and on two DOLCE-based modules, DnS (exploited for modeling roles and for handling reification) and OIO (exploited for modeling business knowledge by means of information objects). The project relies on the belief that all the actors involved in CRM could benefit from an ontological investigation of this field, aimed at providing a core set of formally described concepts and relations, useful both for describing CRM processes and for specifying the functionality of CRM applications. In particular, a well-formed CRM ontology would support communication and interoperability in both intra-organization and inter-organization CRM processes. The paper discusses in detail the axiomatization of the sale and customer relationship concepts, as well as of the corresponding business knowledge items (i.e., sale and customer records). It concludes by sketching a possible concrete exploitation of O-CREAM.
{"title":"Towards a first ontology for customer relationship management","authors":"Diego Magro, A. Goy","doi":"10.1145/1456223.1456352","DOIUrl":"https://doi.org/10.1145/1456223.1456352","url":null,"abstract":"This paper presents some results of an ongoing project aimed at modeling the main concepts related to Customer Relationship Management (CRM). More precisely, the paper presents O-CREAM, a CRM ontology based on DOLCE and on two DOLCE-based modules, DnS (exploited for modeling roles and for handling reification) and OIO (exploited for modeling business knowledge by means of information objects). The project relies on the belief that all the actors involved in CRM could benefit from an ontological investigation of this field, aimed at providing a core set of formally described concepts and relations, useful both for describing CRM processes and for specifying the functionality of CRM applications. In particular, a well-formed CRM ontology would support communication and interoperability both in intra-organization and in inter-organization CRM processes. The paper discusses in details the axiomatization for the sale and customer relationship concepts, as well as for the corresponding business knowledge items (i.e., sale and customer records). It concludes by sketching a possible concrete exploitation of O-CREAM.","PeriodicalId":309453,"journal":{"name":"International Conference on Soft Computing as Transdisciplinary Science and Technology","volume":"570 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126680129","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Databases have become increasingly large and the data they contain increasingly bulky, so knowledge extraction has become a significant problem requiring multiple techniques for processing the available data. We particularly consider data available on the web. For data exchange on the internet, XML plays an increasingly important role and has become the dominant standard for dealing with huge volumes of electronic documents. We are especially concerned with extracting knowledge from complex tree structures such as XML documents. Because the resources available in such documents are heterogeneous and have complex structures, querying them is difficult, and automatic tools are needed. We consider in particular the problem of constructing a mediator schema whose role is to give the necessary information about the structure of the resources and through which the data can be queried. In this paper, we present a new schema-mining approach, called FFTM, focused on the use of a soft embedding concept in order to extract more relevant knowledge; indeed, crisp methods often discard interesting approximate patterns. To this end, we adopt fuzzy constraints for discovering and validating frequent substructures in a large collection of semi-structured data, where both the patterns and the data are modeled as labeled trees. The FFTM approach has been tested and validated on synthetic and XML document databases. The experimental results show that our approach is highly relevant and overcomes the limitations of the crisp approach.
{"title":"FFTM: optimized frequent tree mining with soft embedding constraints on siblings","authors":"M. Sghaier, S. Yahia, Anne Laurent, M. Teisseire","doi":"10.1145/1456223.1456309","DOIUrl":"https://doi.org/10.1145/1456223.1456309","url":null,"abstract":"Databases have become increasingly large and the data they contain is increasingly bulky. Thus the problem of knowledge extraction has become very significant and requires multiple techniques for processing the data available in order to extract the information contained from it. We particularly consider the data available on the web. Regarding the problem of the data exchange on the internet, XML is playing an increasing important role in this issue and has become a dominating standard proposed to deal with huge volumes of electronic documents. We are especially involved in extracting knowledge from complex tree structures such as XML documents. As they are heterogeneous and with complex structures, the resources available in such documents present the difficulty of querying these data. In order to deal with this problem, automatic tools are of compelling need. We especially consider the problem of constructing a mediator schema whose role is to give the necessary information about the resources structure and through which the data can be queried. In this paper, we present a new approach, called FFTM, dealing with the problem of schema mining through which we particularly focused on the use of soft embedding concept in order to extract more relevant knowledge. Indeed, crisp methods often discard interesting approximate patterns. For this purpose, we have adopted fuzzy constraints for discovering and validating frequent substructures in a large collection of semi-structured data, where both patterns and the data are modeled by labeled trees. The FFTM approach has been tested and validated on synthetic and XML document databases. The experimental results obtained show that our approach is very relevant and palliates the problem of the crisp approach.","PeriodicalId":309453,"journal":{"name":"International Conference on Soft Computing as Transdisciplinary Science and Technology","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122704955","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
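The soft-embedding idea in the FFTM abstract — accept approximate occurrences instead of discarding them, scoring each with a fuzzy degree — can be illustrated with a toy membership function on sibling gaps. This is a hedged sketch of the general fuzzy-constraint idea only: the linear decay, the `max_gap` threshold, and min-combination across gaps are assumptions, not the paper's actual constraints.

```python
def sibling_gap_membership(gap, max_gap=3):
    """Degree in [0, 1] to which a gap between matched siblings is
    acceptable: 1.0 for adjacent siblings, decaying linearly to 0."""
    if gap <= 0:
        return 1.0
    if gap >= max_gap:
        return 0.0
    return 1.0 - gap / max_gap

def fuzzy_support(occurrence_gaps):
    """Fuzzy support of a candidate subtree: each occurrence contributes
    the weakest of its sibling-gap memberships (min-combination)."""
    total = 0.0
    for gaps in occurrence_gaps:
        total += min((sibling_gap_membership(g) for g in gaps), default=1.0)
    return total
```

A crisp miner would count only the gap-0 occurrence; the fuzzy support also credits the approximate one, which is precisely the kind of pattern the abstract says crisp methods lose.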
Il-Gu Jung, Eunjin Ko, Hyun-Chul Kang, Gilhaeng Lee
Recently, many voice quality measurement devices for VoIP (Voice over Internet Protocol) service have been developed to support high-quality VoIP communication. However, existing devices each adopt a single measurement method, using either PESQ (Perceptual Evaluation of Speech Quality) or the R-factor only. Because these methods differ in their usage and assumed ranges, they yield slightly different results in the same test area. VoIP service providers and the related regulatory organizations therefore request that both algorithms be used at the same time when measuring voice quality. However, using both algorithms simultaneously makes development very expensive, owing to the additional resources and subsidiary devices required. VoIP providers and regulators also request that the hardware specification of the measurement device be similar to that of a VoIP user terminal, considering VoIP service over wireless mobile internet such as WiBro (Wireless Broadband). In this paper, we present a voice quality measurement method that runs both algorithms on a single physical measurement device at the same time, for more accurate and reliable VoIP voice quality measurement. In addition, we introduce a method for developing the measurement device at relatively low cost.
{"title":"Voice quality measurement system for telephone service","authors":"Il-Gu Jung, Eunjin Ko, Hyun-Chul Kang, Gilhaeng Lee","doi":"10.1145/1456223.1456233","DOIUrl":"https://doi.org/10.1145/1456223.1456233","url":null,"abstract":"Recently, plenty of voice quality measuring devices of VoIP(Voice over Internet Protocol) service is developed to support high quality VoIP service for communication. However, the existing voice quality measuring devices select the unique measurement method, which uses PESQ(Perceptual Evaluation of Speech Quality) or R-factor only. However, the each voice quality measurement method differs from its usage and assumed range, which leads to the slightly different results at the same testing area for voice services. At the actual VoIP service business and related regulatory organizations, they request to use the above two types of algorithm at the same time for measuring voice quality. However, the development expenses are very high to use the above two types of algorithm at the same time for measuring voice quality due to additional resources and subsidiary devices. VoIP business and regulatory agencies also request the hardware specification of the voice quality measurement device is similar to that of VoIP service user terminal, considering VoIP service supply at the wireless mobile internet such as WiBro(Wireless Broadband). In this paper, we present the voice quality measurement method by using of the above two algorithms at the one physical measurement device at the same time for more accurate and reliable VoIP voice quality measurement. In addition, we introduce the development method of the voice quality measurement device with relatively low cost.","PeriodicalId":309453,"journal":{"name":"International Conference on Soft Computing as Transdisciplinary Science and Technology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125831454","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
M. Uzunova, D. Jolly, E. Nikolov, Kamel Boumediene
The main aim of this paper is to obtain the transfer function of the macroscopic traffic flow model through an exact analytical method of solution of the non-linear partial differential equations, presenting the transportation model as a distributed parameter system. Several analytical solution methods are reviewed, some of which yield an exact solution. Using this exact method, we obtain the plant transfer function for the traffic flow model, viewed as a distributed parameter system, directly. The physical phenomenon is described by partial differential equations that capture the distribution of vehicles on highways. The work focuses on the transfer function result because of its role in control system modeling. We verify the analytical result by simulating the model and analyzing its time and frequency characteristics.
{"title":"The macroscopic LWR model of the transport equation viewed as a distributed parameter system","authors":"M. Uzunova, D. Jolly, E. Nikolov, Kamel Boumediene","doi":"10.1145/1456223.1456339","DOIUrl":"https://doi.org/10.1145/1456223.1456339","url":null,"abstract":"The main aim of this paper is the obtaining of transfer function of the macroscopic traffic flow model using an exact analytical method of solution of the non-linear partial differential equations, presenting the transportation problem model as distributed parameter system. The different analytical methods of solution are shown, some of them giving an exact solution. Using this exact method we obtain directly the plant transfer function for the traffic flow model viewed as a distributed parameter system. The entire physical phenomenon is presenting through the partial differential equations whose show the distribution in vehicles on the high ways. All the research is devoted on the transfer function result because of the control system modeling. In the presenting paper we shown the verification of the analytical result of the model simulation thought the time and frequency characteristics analysis.","PeriodicalId":309453,"journal":{"name":"International Conference on Soft Computing as Transdisciplinary Science and Technology","volume":"148 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127466405","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper, we propose to solve the multi-criteria hybrid flow shop scheduling problem using a hybrid method. On the one hand, a meta-heuristic based on an ant colony optimization algorithm generates feasible solutions; on the other hand, an aggregation multi-criteria method based on fuzzy logic assists the decision-maker in expressing preferences over the objective functions considered. The aggregation method uses the Choquet integral, which allows us to take into account the interactions between the different criteria. Experiments on randomly generated instances were conducted to test the effectiveness of our approach.
{"title":"Hybrid approach using ant colony optimization and fuzzy logic to solve multi-criteria hybrid flow shop scheduling problem","authors":"Safa Khalouli, F. Ghedjati, A. Hamzaoui","doi":"10.1145/1456223.1456236","DOIUrl":"https://doi.org/10.1145/1456223.1456236","url":null,"abstract":"In this paper, we propose to solve the hybrid flow shop scheduling problem with multiple criteria, by using an hybrid method. This latter consists on the one hand, to use a meta-heuristic based on ant colony optimization algorithm to generate feasible solutions and, on the other hand, an aggregation multi-criteria method based on fuzzy logic is used to assist the decision-maker to express his preferences according to the considered objective functions. The aggregation method uses the Choquet integral which allows us to take into account the interactions between the different criteria. Experiments based on randomly generated instances were conducted to test the effectiveness of our approach.","PeriodicalId":309453,"journal":{"name":"International Conference on Soft Computing as Transdisciplinary Science and Technology","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121886048","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
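The Choquet integral mentioned in the abstract aggregates criterion scores with respect to a capacity (a set function over criteria), which is what lets it model interactions between criteria. A minimal sketch of the standard discrete Choquet integral, with an illustrative two-criterion capacity:

```python
def choquet(scores, mu):
    """Discrete Choquet integral of `scores` ({criterion: value}) with
    respect to the capacity `mu` ({frozenset_of_criteria: weight}),
    where mu(empty set) = 0 and mu(all criteria) = 1."""
    total, prev = 0.0, 0.0
    remaining = set(scores)
    # walk the scores in ascending order, weighting each increment by the
    # capacity of the set of criteria still at or above that level
    for crit, val in sorted(scores.items(), key=lambda kv: kv[1]):
        total += (val - prev) * mu[frozenset(remaining)]
        prev = val
        remaining.discard(crit)
    return total
```

With an additive capacity (no interaction), the Choquet integral reduces to a plain weighted mean; assigning mu of a pair less than the sum of its singletons models redundant criteria, and more models synergy.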
Over the past several years, Geographic Information Systems (GIS) have enabled the storage, editing, maintenance, dissemination, display, and access of geospatial data. Ontologies describe the knowledge of an application at a level independent of its internal structure. Existing techniques for the impedance model of each road segment, as used in route-finding algorithms, are inadequate: most impedance models rest on a one-dimensional criterion such as distance or time, which does not give proper results. Hence, this research investigates how the impedance criteria of a road segment can represent the real world in GIS using an ontology-driven architecture. To address this, first, several criteria are taken into account, both quantitative and qualitative, such as traffic and climate. Second, the Analytical Network Process (ANP) method is proposed to weight these criteria. Finally, the impedance model is implemented and verified with real road network data.
{"title":"Ontology driven road network analysis based on analytical network process technique","authors":"A. Sadeghi-Niaraki, Kyehyun Kim, Cholyoung Lee","doi":"10.1145/1456223.1456349","DOIUrl":"https://doi.org/10.1145/1456223.1456349","url":null,"abstract":"During the past several years, Geographic information system (GIS) has allowed storage, editing, maintenance, dissemination, display and access of geospatial data. Ontologies are a level of description of the knowledge of an application that is independent of internal structure. Existing techniques related to the impedance model for each road segment which is utilized in a route finding algorithm is inadequate. The most impedance models are based on one-dimensional criterion such as distance or time which does not give proper results. Hence, this research investigates how impedance criteria of road segment can represent the real world in GIS using ontology driven architecture. To address this, first several criteria including quantitative as well as qualitative criteria such as traffic and climate are taken into account. Second, to weight these criteria, the Analytical Network Process (ANP) method is proposed. Finally, the impedance model is implemented and verified with real road network data.","PeriodicalId":309453,"journal":{"name":"International Conference on Soft Computing as Transdisciplinary Science and Technology","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123292806","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
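The multi-criteria impedance idea above can be sketched as a weighted combination of normalized criterion scores per road segment. This is only an illustration of the aggregation step: the criterion names and weights below are invented placeholders standing in for the weights an ANP analysis would actually supply.

```python
# Placeholder weights; in the paper's approach these would come from ANP.
ANP_WEIGHTS = {"distance": 0.4, "time": 0.3, "traffic": 0.2, "climate": 0.1}

def segment_impedance(criteria, weights=ANP_WEIGHTS):
    """Weighted-sum impedance of one road segment from criterion scores
    normalized to [0, 1], where higher means a less desirable segment."""
    missing = set(weights) - set(criteria)
    if missing:
        raise ValueError(f"missing criteria: {missing}")
    return sum(weights[c] * criteria[c] for c in weights)
```

A route finder would then minimize the sum of `segment_impedance` over a path instead of raw distance or time alone; note that a plain weighted sum ignores the inter-criteria dependencies that ANP is designed to capture.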
Patient monitoring in medical applications such as the diagnosis of sleep disorders commonly relies on invasive monitoring equipment such as pulse oximetry and the polysomnogram (PSG), but attaching these devices to the patient's body disturbs sleep and therefore compromises the results. Furthermore, invasive approaches often fail to monitor continuously, because the subject can unconsciously pull the devices off during sleep. This paper presents an automated, noninvasive video monitoring approach to analyzing (covered) human activity under persistent heavy occlusion. The proposed method is model-based, employing both static shape features and dynamic motion features to suppress false positive detections, identify human activity, and self-improve the covered human pose estimation.
{"title":"Covered body analysis in application to patient monitoring","authors":"Ching-Wei Wang","doi":"10.1145/1456223.1456251","DOIUrl":"https://doi.org/10.1145/1456223.1456251","url":null,"abstract":"Patient monitoring in medical applications such as diagnosis of sleep disorders commonly adopts invasive monitoring equipments such as pulse oximetry and polysomnogram (PSG), but their attachment to the patient's body disturb sleep and therefore compromise results. Furthermore, the invasive approaches often fail to monitor continuously because the devices can be pulled off by the subject during sleep unconsciously. This paper presents an automated noninvasive video monitoring approach to analyze (covered) human activity in conditions with persistent heavy occlusion. The proposed method is a model-based approach, employing both static shape features and dynamic motion features to suppress false positive detection, to identify human activity, and to self-improve the covered human pose estimation.","PeriodicalId":309453,"journal":{"name":"International Conference on Soft Computing as Transdisciplinary Science and Technology","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123403442","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pooya Moradian Zadeh, Maryam Mohi, M. S. Moshkenani
The growth of internet usage and services has given social networks a strong role. Social networks are environments that use the internet to provide relations between people, in other words to exchange data and information between persons; email and instant messengers are popular examples. Since these environments are continuously being developed, revised, and viewed by humans, they are good places for mining. In this paper, we target the topics of the information exchanged between users in such networks. Our method uses a hierarchical dictionary of semantically related topics and words that is mapped to a graph; keywords extracted from the social network context are then compared to the graph nodes, and the dependency between them is computed. This model can be used in many applications such as marketing, advertising, and high-risk group detection.
{"title":"Mining social network for extracting topic of textual conversations","authors":"Pooya Moradian Zadeh, Maryam Mohi, M. S. Moshkenani","doi":"10.1145/1456223.1456273","DOIUrl":"https://doi.org/10.1145/1456223.1456273","url":null,"abstract":"Developing internet usage and services urged play strong role for social network. Social networks are environment which uses internet as interface to provide relations between people, in the other word to interchange data and information between persons. Email and Instant Messengers are popular examples of them. Whereas these environments are continuously and instantly developing, revising and viewing by humans, they are good places for mining. In this paper, the topic of exchanged information between users in this type of networks will be our target. Our method is to use a hierarchical dictionary of semantically related topics and words that is mapped to a graph. Then extracted keywords from context of social network area compared to graph nodes and the dependency between them will be computed. This model can be used in many applications such as marketing, advertising and high-risk group detection.","PeriodicalId":309453,"journal":{"name":"International Conference on Soft Computing as Transdisciplinary Science and Technology","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116619670","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
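The matching step described above — extracted keywords looked up against a topic dictionary to find the dominant topic of a conversation — can be sketched as follows. The dictionary here is a tiny flat stand-in for the paper's hierarchical topic/word structure, and all entries are invented for illustration.

```python
from collections import Counter

# Toy stand-in for the hierarchical dictionary of topics and words.
TOPIC_DICTIONARY = {
    "price": "shopping", "buy": "shopping", "sale": "shopping",
    "goal": "sports", "match": "sports",
}

def dominant_topic(keywords):
    """Most frequent topic among the keywords that appear in the
    dictionary, or None when no keyword matches."""
    hits = Counter(TOPIC_DICTIONARY[w] for w in keywords if w in TOPIC_DICTIONARY)
    return hits.most_common(1)[0][0] if hits else None
```

In the graph-based formulation of the paper, a keyword would instead activate a dictionary node and the dependency between keyword and topic nodes would be scored over graph edges; the frequency count above is the simplest degenerate case of that idea.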