Design and implementation of 4-bit arithmetic logic unit using Quantum Dot Cellular Automata
M. Waje, P. Dakhole
Pub Date: 2013-05-13 | DOI: 10.1109/IADCC.2013.6514367
Quantum Dot Cellular Automata (QCA) is one of the six emerging technologies that may help overcome the limitations of CMOS technology. This paper discusses the design of a 4-bit ALU for AND, OR, XOR, and ADD operations using QCA. The QCA-based ALU is structurally simple and uses significantly fewer elements than the corresponding CMOS design; it also gives better results in terms of speed, area, and power. The QCADesigner tool is used to simulate the components of the 4-bit ALU.
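The four operations named in the abstract can be captured in a short behavioral model. The sketch below assumes a 2-bit opcode encoding (AND=0, OR=1, XOR=2, ADD=3), which is our own illustrative choice, not taken from the paper:

```python
def alu4(a, b, op):
    """Behavioral model of a 4-bit ALU supporting AND, OR, XOR, ADD.

    The opcode encoding (0=AND, 1=OR, 2=XOR, 3=ADD) is an assumption
    for illustration. Returns (result, carry_out); carry_out is only
    meaningful for ADD.
    """
    a &= 0xF  # restrict operands to 4 bits
    b &= 0xF
    if op == 0:
        r = a & b
    elif op == 1:
        r = a | b
    elif op == 2:
        r = a ^ b
    elif op == 3:
        r = a + b  # 5-bit intermediate; bit 4 is the carry
    else:
        raise ValueError("unknown opcode")
    return r & 0xF, (r >> 4) & 1
```

A QCA realization would build the same logic from majority gates and inverters; this model only fixes the functional behavior the hardware must match.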
Customized architecture for implementing configurable FFT on FPGA
V. Chandrakanth, S. Tripathi
Pub Date: 2013-05-13 | DOI: 10.1109/IADCC.2013.6514412
The Fourier transform underpins diverse fields of engineering, including specialized areas such as radar, communications, and image processing. There have therefore been continual efforts to improve the efficiency of FFT implementations in real-time systems and other hardware. To reduce design time and time to market, FPGA vendors provide IP cores that can be used directly in applications. These IP cores, though efficient, are highly abstract and do not allow the designer to modify them to suit particular requirements, which can lead to inefficient design realization. In particular, vendor-supplied IP cores do not expose the FFT kernel matrix, restricting their configurability. In this paper we design a customized FFT architecture with access to the twiddle factors for improved configurability. The architecture is further modified to perform variable-point FFTs targeted at multirate systems. The design is generic and can be implemented on any vendor platform.
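The key point of the abstract is exposing the twiddle factors to the designer rather than burying them in the core. A minimal software model of that idea is a radix-2 FFT whose twiddle table is precomputed separately and passed in, so a custom kernel can inspect or replace entries before the transform runs (a sketch of the concept, not the paper's hardware architecture):

```python
import cmath

def twiddle_table(n):
    """Precompute the n/2 twiddle factors W_n^k = exp(-2*pi*i*k/n).
    Keeping this table external is the configurability hook: it can be
    modified before being handed to fft()."""
    return [cmath.exp(-2j * cmath.pi * k / n) for k in range(n // 2)]

def fft(x, w=None):
    """Iterative radix-2 decimation-in-time FFT; len(x) must be a power of two."""
    n = len(x)
    if w is None:
        w = twiddle_table(n)
    y = list(x)
    # bit-reversal permutation
    j = 0
    for i in range(1, n):
        bit = n >> 1
        while j & bit:
            j ^= bit
            bit >>= 1
        j |= bit
        if i < j:
            y[i], y[j] = y[j], y[i]
    # butterfly stages
    size = 2
    while size <= n:
        half = size // 2
        step = n // size  # stride into the twiddle table
        for start in range(0, n, size):
            for k in range(half):
                t = w[k * step] * y[start + half + k]
                y[start + half + k] = y[start + k] - t
                y[start + k] = y[start + k] + t
        size *= 2
    return y
```

A variable-point FFT, as described in the abstract, would regenerate (or index differently into) the twiddle table for each transform length.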
Task allocation in a massively parallel system using Finite Automata
Zubair Khan, Ravindra Singh, Sumit Sanwal, Arun Gangwar, Shabbir Alam
Pub Date: 2013-05-13 | DOI: 10.1109/IADCC.2013.6514298
In this paper we propose a new approach to task allocation in a massively parallel system using finite automata. Based on the task-flow model of a finite automaton, we compute the turnaround time for a parallel system by representing the finite automaton as a directed acyclic graph (DAG). The second section of the paper discusses finite automata and directed acyclic graphs; we then convert the finite automaton into a DAG for the massively parallel system. All simulations are performed with the Intel C++ parallel compiler, and the results are compared with several well-known scheduling algorithms, showing improved turnaround time.
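Once the task flow is expressed as a DAG, turnaround time can be computed by list-scheduling the DAG onto identical processors. The sketch below shows that general mechanism only; the paper's own allocation policy derived from the finite-automaton model is not reproduced here:

```python
from collections import deque

def turnaround(tasks, edges, procs):
    """List-schedule a task DAG on `procs` identical processors and
    return the turnaround (makespan).

    tasks: dict mapping task name -> run time
    edges: list of (u, v) precedence pairs (u must finish before v starts)
    """
    succ = {t: [] for t in tasks}
    pred = {t: [] for t in tasks}
    for u, v in edges:
        succ[u].append(v)
        pred[v].append(u)
    indeg = {t: len(pred[t]) for t in tasks}
    ready = deque(t for t in tasks if indeg[t] == 0)  # topological frontier
    free_at = [0] * procs  # when each processor next becomes free
    finish = {}
    while ready:
        t = ready.popleft()
        # cannot start before all predecessors have finished
        earliest = max((finish[u] for u in pred[t]), default=0)
        p = min(range(procs), key=lambda i: free_at[i])  # least-loaded processor
        start = max(free_at[p], earliest)
        finish[t] = start + tasks[t]
        free_at[p] = finish[t]
        for v in succ[t]:
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return max(finish.values())
```

For example, two independent tasks feeding a third on two processors finish when the longer predecessor chain plus the join task completes.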
Real time RSSI error reduction in distance estimation using RLS algorithm
R. Mehra, Ashutosh Kumar Singh
Pub Date: 2013-05-13 | DOI: 10.1109/IADCC.2013.6514305
Received signal strength (RSS)-based distance estimation has recently been proposed as a low-complexity, low-cost solution for mobile communication nodes with minimal RSSI error. Investigation of existing localization algorithms shows that the distribution of RSSI values at each sample point fluctuates even at a fixed position because of shadow fading. We therefore present a novel method for reducing RSSI error in distance estimation by applying a recursive least squares (RLS) algorithm on top of the existing deterministic algorithms. The proposed method collects RSSI values from the mobile communication node to build a probability model. Once probability models are estimated for different standard deviations of the path-loss exponent using adaptive filtering in real time, the distance between the mobile node and a fixed node can be determined accurately. Simulation results show that the accuracy of RSSI-based real-time distance estimation for a mobile node is improved in changing environments.
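The two ingredients described above, adaptive smoothing of noisy RSSI samples and a path-loss model mapping RSSI to distance, can be sketched as follows. The path-loss parameters (`p0`, reference RSSI at 1 m, and exponent `n`) are assumed illustrative values, and the scalar RLS form is our simplification, not the paper's exact formulation:

```python
def rls_filter(measurements, lam=0.98, delta=100.0):
    """Scalar recursive least squares estimate of a slowly varying RSSI
    level; lam is the forgetting factor, delta the initial covariance.
    With a constant regressor of 1, RLS reduces to an adaptive mean
    that discounts old samples."""
    w, p = 0.0, delta
    out = []
    for z in measurements:
        k = p / (lam + p)          # gain
        w = w + k * (z - w)        # update estimate toward new sample
        p = (1 - k) * p / lam      # update (scalar) covariance
        out.append(w)
    return out

def distance_from_rssi(rssi_dbm, p0=-40.0, n=2.7):
    """Log-distance path-loss model: rssi = p0 - 10*n*log10(d).
    p0 (RSSI at 1 m) and path-loss exponent n are assumed values."""
    return 10 ** ((p0 - rssi_dbm) / (10 * n))
```

Smoothing first and converting afterwards limits the impact of shadow-fading spikes on the estimated distance.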
Local entropy based brain MR image segmentation
A. Chaudhari, J. Kulkarni
Pub Date: 2013-05-13 | DOI: 10.1109/IADCC.2013.6514403
Magnetic Resonance Imaging (MRI) offers a wealth of information for medical examination. Fast, accurate, and reproducible segmentation of MR images is desirable in many applications, and brain image segmentation is clinically important for tumor detection. Brain images typically contain noise, intensity inhomogeneity, and occasional artifacts, which makes accurate segmentation very difficult. In this paper we present an automatic brain segmentation method for tumor detection. MR images from T1, T2, and FLAIR sequences are used in the study, along with axial, coronal, and sagittal slices. The MR images are segmented using textural features based on the gray-level co-occurrence matrix; the textural feature used is the entropy of the image.
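The idea of entropy as a texture feature can be illustrated with a plain local-entropy map: uniform tissue has entropy near zero, while textured or noisy regions score high, and thresholding the map yields a segmentation mask. This is a generic sketch of the entropy idea, not the paper's exact GLCM formulation:

```python
import math

def local_entropy(img, win=3):
    """Shannon entropy of gray levels in a win x win neighborhood,
    computed per pixel; border pixels use the clipped window.
    img is a list of lists of integer gray levels."""
    h, w = len(img), len(img[0])
    r = win // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            counts = {}
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        g = img[yy][xx]
                        counts[g] = counts.get(g, 0) + 1
            n = sum(counts.values())
            out[y][x] = -sum(c / n * math.log2(c / n) for c in counts.values())
    return out

def segment(img, thresh, win=3):
    """Binary mask: 1 where local entropy exceeds thresh (textured region)."""
    ent = local_entropy(img, win)
    return [[1 if e > thresh else 0 for e in row] for row in ent]
```

A GLCM-based entropy would instead histogram pairs of gray levels at a fixed offset; the thresholding step is the same.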
A novel smart card mutual authentication scheme for session transfer among registered devices
R. S. Pippal, C. Jaidhar, S. Tapaswi
Pub Date: 2013-05-13 | DOI: 10.1109/IADCC.2013.6514456
The goal of this paper is to design a mutual authentication scheme that supports secure data-service migration among multiple registered devices (PC, laptop, smartphone, etc.), so that each user can employ the most suitable device at any time. Single-factor authentication depends on the user's knowledge of a secret such as a password or PIN, which is not secure enough; two-factor authentication offers a stronger alternative. This paper proposes a smart-card-based mutual authentication scheme for session transfer among registered devices. Its security relies on the hardness of the discrete logarithm problem and on a one-way hash function. A random nonce replaces the timestamp, avoiding the cost of clock synchronization between user and server. Security analysis shows that the scheme is immune to the presented attacks and provides the essential security features.
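The nonce-based mutual-authentication flow described above can be sketched with a challenge-response exchange. Note the simplifications: the paper derives its secrets from smart-card values in a discrete-log-hard group, whereas this sketch abstracts all of that into a single pre-shared key and shows only the two-nonce handshake that removes the need for synchronized clocks:

```python
import hashlib
import secrets

def h(*parts):
    """One-way hash over the concatenated parts (SHA-256 here)."""
    return hashlib.sha256(b"|".join(parts)).digest()

class Server:
    def __init__(self, key):
        self.key = key  # stands in for the card-derived secret
    def challenge(self):
        self.ns = secrets.token_bytes(16)  # fresh server nonce, no clocks needed
        return self.ns
    def verify_and_respond(self, nu, proof):
        # user proved knowledge of the key over both nonces
        assert proof == h(self.key, nu, self.ns), "user proof rejected"
        return h(self.key, self.ns, nu)    # server's proof back to the user

class User:
    def __init__(self, key):
        self.key = key
    def respond(self, ns):
        self.nu = secrets.token_bytes(16)  # fresh user nonce
        self.ns = ns
        return self.nu, h(self.key, self.nu, ns)
    def verify_server(self, proof):
        return proof == h(self.key, self.ns, self.nu)
```

Because each side binds its proof to both fresh nonces, replaying an old exchange fails on either end, which is the property the random nonce buys in place of a timestamp.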
RatioRank: Enhancing the impact of inlinks and outlinks
R. Singh, D. Sharma
Pub Date: 2013-05-13 | DOI: 10.1109/IADCC.2013.6514328
The web is the largest collection of information, and owing to its dynamic nature, pages and documents are frequently added and deleted. The web serves as the major source for answering user queries, and for each query a search engine retrieves a number of pages whose quality varies. The search engine therefore applies ranking algorithms to order the retrieved pages so that the most relevant documents appear at the top of the list. This paper presents a new page-ranking algorithm, RatioRank, which combines inlink and outlink weights with the page's visit count, and compares it with existing algorithms on several parameters.
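The general shape of such an algorithm is a PageRank-style iteration in which link mass is weighted rather than split uniformly. The sketch below weights each outlink by the target's visit count; this is purely illustrative of the mechanism, since the actual RatioRank weighting of inlinks and outlinks is defined in the paper and not reproduced here:

```python
def weighted_rank(links, visits, d=0.85, iters=50):
    """PageRank-style iteration where a node's outgoing mass is split
    in proportion to each target's visit count instead of uniformly.

    links:  dict node -> list of outlinked nodes
    visits: dict node -> visit count (missing nodes default to 1)
    d:      damping factor
    """
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1 - d) / n for u in nodes}
        for u, outs in links.items():
            if not outs:
                for v in nodes:  # dangling node: spread its mass evenly
                    new[v] += d * rank[u] / n
                continue
            wsum = sum(visits.get(v, 1) for v in outs)
            for v in outs:
                new[v] += d * rank[u] * visits.get(v, 1) / wsum
        rank = new
    return rank
```

Because the per-node outgoing mass is always fully distributed, the ranks stay a probability distribution, just as in standard PageRank.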
Performance evaluation of edge detectors - morphology based ROI segmentation and nodule detection from DICOM lung images in the noisy environment
V. Vijaya Kishore, R. V. S. Satyanarayana
Pub Date: 2013-05-13 | DOI: 10.1109/IADCC.2013.6514386
Several lung diseases are diagnosed by detecting patterns of lung tissue in medical images obtained from MRI, CT, ultrasound, and DICOM sources. In recent years many image-processing procedures have been applied to medical images to detect lung patterns at early and treatment stages. Several approaches to lung segmentation combine geometric and intensity models to enhance local anatomical structure. When lung images are corrupted by noise, nodule detection faces two main difficulties: nodules adjacent to vessels or the chest wall have very similar intensities, and nodules may appear non-spherical because of noise. In such cases, intensity thresholding or model-based methods may fail to identify the nodules. Edges characterize boundaries and are therefore of fundamental importance in image processing: edge detection significantly reduces the amount of data while filtering and preserving the important structural attributes, so an understanding of edge-detection algorithms is essential. In this paper, morphology-based region-of-interest (ROI) segmentation combined with the watershed transform is performed on DICOM lung images, and a comparative analysis is carried out under Gaussian, salt-and-pepper, Poisson, and speckle noise. The blood vessels and nodules in the ROI lung area are extracted using different edge-detection filters (Average, Gaussian, Laplacian, Sobel, Prewitt, Unsharp, and LoG) in the presence of noise. The results help in studying the influence of noise on DICOM images during ROI extraction and in assessing how effectively each operator detects features despite the different types of noise. The evaluation is based on parameters from which the choice of operator can be made.
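Of the filters compared above, the Sobel operator is representative: it convolves the image with a pair of 3x3 kernels and takes the gradient magnitude. A pure-Python sketch on a list-of-lists grayscale image (borders left at zero):

```python
def sobel(img):
    """Gradient magnitude with the 3x3 Sobel operator.
    img is a list of lists of gray levels; the one-pixel border is
    left at zero because the full kernel does not fit there."""
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

Salt-and-pepper noise produces isolated extreme pixels that this operator responds to strongly, which is exactly why the paper's comparison across noise types matters when choosing a detector.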
A systematic approach for CAD model generation of hole features from point cloud data
S. Kansal, J. Madan, Ashutosh Kumar Singh
Pub Date: 2013-05-13 | DOI: 10.1109/IADCC.2013.6514430
One of the most familiar problems in reverse engineering, when generating a CAD model from the point cloud of a physical part, is the presence of deep and narrow holes. Triangulation is an important step in generating a CAD model, and because incorrect triangles form along a hole's boundary, reconstruction algorithms often fail to recover hole boundaries. This paper presents a systematic approach to CAD model generation for parts with hole features, comprising three modules: a pre-processing algorithm that reduces the size of the point cloud, a surface-reconstruction algorithm based on Delaunay triangulation, and a post-processing algorithm that refines the mesh generated through triangulation. The proposed system is verified on example parts containing hole features. The results are encouraging, and we intend to apply the system to point clouds obtained from physically existing parts.
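A basic building block for recovering hole boundaries from a triangulation is the observation that an interior edge is shared by exactly two triangles, so any edge used by only one triangle lies on a boundary (the outer rim or a hole). A sketch of that test, not the paper's full pipeline:

```python
from collections import Counter

def boundary_edges(triangles):
    """Return the boundary edges of a triangle mesh.

    triangles: list of (i, j, k) vertex-index triples.
    Each edge is normalized to a sorted tuple so (a, b) and (b, a)
    count as the same edge; edges used exactly once are boundary edges.
    """
    count = Counter()
    for a, b, c in triangles:
        for e in ((a, b), (b, c), (c, a)):
            count[tuple(sorted(e))] += 1
    return sorted(e for e, n in count.items() if n == 1)
```

Chaining the returned edges into closed loops then separates the outer rim from interior hole boundaries, which a post-processing pass can refine.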
Integrating WSN with web services for patient's record management using RFID
M. Ananthi, M. Sumalatha
Pub Date: 2013-05-13 | DOI: 10.1109/IADCC.2013.6514296
Web services support interoperability for collecting, storing, manipulating, and retrieving data from heterogeneous environments. Wireless sensor networks consist of resource-constrained devices that are low-cost, low-power, and small, and are used in applications such as industrial control and monitoring, environmental sensing, and health care. The main intent of this work is to design a middleware that hides the complexity of accessing the sensor network and to develop an application for sensor web enablement; this matters because integrating wireless sensor networks into IP-based systems remains a challenging issue. Collecting a patient's details during an emergency is critical, so a web service is created to manage patients' personal data with the help of Radio Frequency Identification (RFID) tags; the service is dedicated to collecting, storing, manipulating, and making clinical information available. Context-aware services are used to search information more accurately and to produce highly reliable output.
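The record-management core of such a service reduces to keying clinical records by the RFID tag UID so a responder can retrieve them from the tag alone. A minimal in-memory sketch, with illustrative field names that are our assumption rather than the paper's schema:

```python
class PatientRegistry:
    """In-memory sketch of RFID-keyed patient records; a real deployment
    would sit behind a web service and a persistent store."""

    def __init__(self):
        self._records = {}

    def register(self, tag_uid, name, blood_group, allergies):
        """Store (or overwrite) the record for an RFID tag UID."""
        self._records[tag_uid] = {
            "name": name,
            "blood_group": blood_group,
            "allergies": list(allergies),
        }

    def lookup(self, tag_uid):
        """Return the record for a tag UID, or None if unregistered."""
        return self._records.get(tag_uid)
```

In the integrated system described above, the WSN middleware would push sensor readings into the same record, and the web service would expose `lookup` over HTTP.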