What Makes a National Football League Team Successful? An Analysis of Play by Play Data
Pub Date: 2019-10-01 | DOI: 10.1109/UEMCON47517.2019.8993072
Josef Ur, Mathew Craner, Rehab El Hajj
The National Football League (NFL) is one of the most popular sports leagues in North America. The league showcases many strong athletes, and winning is important to every team; however, for every winning team there is a losing team, and it is the coaches' responsibility to decide which plays to call to help their teams win. Mining play-by-play data can reveal trends and areas where the top teams in the NFL excel, by answering questions such as how often certain plays are run, how many yards those plays gain, and where on the field touchdowns are scored. Using intelligent data mining tools such as Naive Bayes, decision tree algorithms, and association rules, we worked to isolate the areas where the best teams in the league separate themselves and produce winning franchises. By classifying the teams into two categories, top teams and bottom teams, we were able to compare the two classes for differences that explain what leads to greater success in the league. Although we found that most teams, both top and bottom, use similar plays, there were also factors that distinguished the two groups. This research presents these specific factors, and the overall distinctions between the more and less successful teams, to the NFL community.
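The abstract names Naive Bayes, decision trees, and association rules as the mining tools. A minimal sketch of the first two on a toy play table is shown below; the feature names and rows are invented for illustration and do not come from the paper's dataset.

```python
# Illustrative sketch only -- column names and rows are assumptions, not the
# authors' actual pipeline. Requires pandas and scikit-learn.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Hypothetical play-by-play table: one row per play, labeled by whether the
# offense belongs to a "top" (1) or "bottom" (0) team.
plays = pd.DataFrame({
    "yardline":     [75, 42, 60, 18, 80, 35, 55, 22],
    "down":         [1, 3, 2, 1, 2, 4, 1, 3],
    "is_pass":      [1, 1, 0, 0, 1, 1, 0, 1],
    "yards_gained": [7, 12, 3, -1, 15, 2, 4, 9],
    "top_team":     [1, 1, 0, 0, 1, 0, 0, 1],
})

X = plays.drop(columns="top_team")
y = plays["top_team"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for model in (GaussianNB(), DecisionTreeClassifier(max_depth=3, random_state=0)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, "accuracy:", accuracy_score(y_te, model.predict(X_te)))
```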
UWB Channel Classification Using Convolutional Neural Networks
Pub Date: 2019-10-01 | DOI: 10.1109/UEMCON47517.2019.8993100
Parnian A. ShirinAbadi, A. Abbasi
In this paper, a novel convolutional neural network (CNN) algorithm for ultra-wideband (UWB) line-of-sight (LOS) and non-line-of-sight (NLOS) channel classification is proposed. Unlike existing methods, which are based on classical machine learning algorithms and require suitable features or parameters to be extracted beforehand for the classification procedure, the proposed method uses a deep learning approach in which the model learns the discriminating information for classification automatically during the training phase. The performance of the proposed method is investigated using the IEEE 802.15.4a standard UWB channel models for indoor office LOS and NLOS environments.
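For readers unfamiliar with the setup, a 1-D CNN over raw channel impulse responses (CIRs) is one natural way to realize such a classifier. The sketch below is only illustrative: the input length, layer sizes, and framework choice (PyTorch) are assumptions, not the architecture evaluated in the paper.

```python
# Minimal 1-D CNN sketch for LOS/NLOS classification of UWB channel impulse
# responses (CIRs). Input length, channel counts, and layer sizes are assumptions.
import torch
import torch.nn as nn

class CirCnn(nn.Module):
    def __init__(self, cir_len: int = 1024, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(32 * (cir_len // 16), n_classes)

    def forward(self, x):            # x: (batch, 1, cir_len)
        z = self.features(x)
        return self.classifier(z.flatten(1))

model = CirCnn()
dummy_batch = torch.randn(8, 1, 1024)   # 8 synthetic CIRs
logits = model(dummy_batch)             # (8, 2): LOS vs NLOS scores
print(logits.shape)
```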
Towards a Threat Model for Fog Computing
Pub Date: 2019-10-01 | DOI: 10.1109/UEMCON47517.2019.8993005
Yasser Karim, Ragib Hasan
In recent years, the addition of billions of Internet of Things (IoT) devices has spawned a massive demand for computing services near the edge of the network. Because of latency, limited mobility, and location-awareness requirements, cloud computing alone is not capable of serving these devices. As a result, the focus is shifting toward distributed platform services that put ample computing power near the edge of the network. Thus, paradigms such as Fog and Edge computing are gaining attention from researchers as well as business stakeholders. Fog computing is a new computing paradigm that places computing nodes between the cloud and the end user to reduce latency and increase availability. As an emerging technology, Fog computing also brings new security challenges for stakeholders to solve. Before designing security models for Fog computing, it is important to understand the existing threats, and a thorough threat model can significantly help to identify them. Threat modeling is a sophisticated engineering process by which a computer-based system is analyzed to discover security flaws. In this paper, we applied two popular security threat modeling processes, CIAA and STRIDE, to identify and analyze attackers, their capabilities and motivations, and a list of potential threats in the context of Fog computing. We posit that such a systematic and thorough discussion of a threat model for Fog computing will help security researchers and professionals design secure and reliable Fog computing systems.
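As background on one of the two processes named above, STRIDE enumerates six threat categories per system component. The sketch below shows the general shape of such an enumeration for a hypothetical fog component; the example threats are generic illustrations, not the paper's threat list.

```python
# Illustrative STRIDE enumeration for a fog deployment. The component and the
# example threats are generic placeholders, not the paper's actual findings.
STRIDE = {
    "Spoofing":               "impersonating a fog node or an IoT device",
    "Tampering":              "modifying data cached on a fog node",
    "Repudiation":            "a node denying it forwarded or altered traffic",
    "Information disclosure": "leaking user data processed at the edge",
    "Denial of service":      "exhausting the limited resources of a fog node",
    "Elevation of privilege": "gaining admin rights on edge infrastructure",
}

def enumerate_threats(component: str) -> list[str]:
    """Pair every STRIDE category with a component to seed a threat worksheet."""
    return [f"{component}: {cat} -- e.g., {example}" for cat, example in STRIDE.items()]

for line in enumerate_threats("fog gateway"):
    print(line)
```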
Optimal Deployment of Heterogeneous Wireless Nodes in Integrated LTE/Wi-Fi Networks
Pub Date: 2019-10-01 | DOI: 10.1109/UEMCON47517.2019.8993033
Noha A. Elmosilhy, Ahmed M. Abd El-Haleem, M. M. Elmesalawy
Optimal placement of small cells in an integrated LTE/Wi-Fi heterogeneous network architecture is considered one of the key approaches for increasing system capacity and enhancing coverage to meet the unexpected explosion of mobile data traffic. Cooperation and interworking between different Radio Access Technologies (RATs) introduce LTE/Wi-Fi Aggregation (LWA), which allows traffic aggregation between a Wi-Fi Access Point (WAP) and an LTE small cell at the Radio Access Network (RAN) level. In this paper, the effectiveness of the optimal deployment of heterogeneous wireless small nodes in a hotspot zone is explored. The deployment of the different wireless small nodes is formulated as an optimization problem with the objectives of (i) maximizing the total system throughput while meeting the minimum received Signal-to-Interference-plus-Noise Ratio (SINR) and Signal-to-Noise Ratio (SNR) requirements for LTE/Wi-Fi coverage, (ii) choosing the optimal number of small cells that guarantees coverage of the considered hotspot zone, and (iii) choosing the optimal formation of the WAPs' Basic Service Sets (BSSs). The objective function is formulated as a Mixed Integer Non-Linear Programming (MINLP) problem and solved using a genetic algorithm. The performance of the proposed optimal deployment approach is compared to a uniformly distributed deployment algorithm in terms of system throughput, and a significant improvement is observed when the proposed approach is adopted.
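To give a feel for the solution method, the sketch below runs a much-simplified evolutionary loop (selection plus Gaussian mutation) on a toy node-placement problem. The coverage-count fitness is a stand-in for the paper's throughput/SINR objective and is not their formulation.

```python
# Toy evolutionary sketch for placing small cells/WAPs in a hotspot. The fitness
# (users covered within a fixed radius) is an illustrative stand-in for the
# paper's MINLP throughput/SINR objective, which is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
users = rng.uniform(0, 100, size=(200, 2))   # user locations in a 100m x 100m hotspot
N_NODES, POP, GENS, RADIUS = 4, 30, 60, 20.0

def fitness(layout):                          # layout: (N_NODES, 2) node coordinates
    d = np.linalg.norm(users[:, None, :] - layout[None, :, :], axis=2)
    return np.count_nonzero(d.min(axis=1) <= RADIUS)   # users covered by any node

population = rng.uniform(0, 100, size=(POP, N_NODES, 2))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in population])
    parents = population[np.argsort(scores)[-POP // 2:]]      # keep the fittest half
    kids = parents[rng.integers(0, len(parents), POP // 2)].copy()
    kids += rng.normal(0, 3.0, kids.shape)                    # Gaussian mutation
    population = np.vstack([parents, np.clip(kids, 0, 100)])

best = max(population, key=fitness)
print("covered users:", fitness(best), "of", len(users))
```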
A Wearable Device Network to Track Animal Behavior and Relationships in the Wild
Pub Date: 2019-10-01 | DOI: 10.1109/UEMCON47517.2019.8992986
Luis Camal, Anup Kirtane, Teresa Blanco, Roberto Casas, F. Rossano, Baris Aksanli
Advancements in sensor technology have made it possible to build wearable devices designed specifically for animals. These wearable devices can be used to locate individual animals, monitor their status, and track their trajectories in the wild. Some animal groups (such as chimpanzees) exhibit complex group behavior, and these group dynamics play an important role in the physical and mental health of the animals. Scientists have traditionally monitored group dynamics manually in the wild, which requires extensive field trips that cost considerable time and money. This calls for using recent technological developments, such as smart wearable devices, for this purpose. However, the lack of infrastructure support in the wilderness (limited connectivity, limited power, etc.) makes this a difficult task. In this work-in-progress paper, we present our technological approach and how we address the challenges of the wilderness in studying animal behavior. We demonstrate how we build a network of lightweight wearable devices and how the digital output of these devices can be used to analyze animal relationships. We present an initial, exploratory experiment outlining the communication efficiency of the devices and technologies used and their potential for deployment in the wilderness. Our initial results show that up to 90% of the proximity-based interactions can be captured.
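One common way such proximity interactions are derived from wearable radio logs is by thresholding received signal strength between pairs of tags. The sketch below illustrates that idea only; the log format and RSSI cutoff are assumptions, as the abstract does not specify the detection rule.

```python
# Sketch of proximity-interaction extraction from wearable radio logs.
# The log format and RSSI threshold are assumptions for illustration.
from collections import defaultdict

# (seconds, observer_id, neighbor_id, rssi_dBm) beacon sightings
sightings = [
    (0, "chimp_A", "chimp_B", -58),
    (10, "chimp_A", "chimp_B", -61),
    (20, "chimp_A", "chimp_B", -85),   # too weak: likely not in close proximity
    (30, "chimp_A", "chimp_C", -55),
]

RSSI_NEAR_DBM = -70          # assumed "close contact" cutoff

contacts = defaultdict(int)  # unordered pair -> number of near sightings
for t, a, b, rssi in sightings:
    if rssi >= RSSI_NEAR_DBM:
        contacts[tuple(sorted((a, b)))] += 1

for pair, count in contacts.items():
    print(pair, "near-contact sightings:", count)
```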
Optimized IoT Based Decision Making for Autonomous Vehicles in Intersections
Pub Date: 2019-10-01 | DOI: 10.1109/UEMCON47517.2019.8992978
Amin Sahba, Ramin Sahba, P. Rad, M. Jamshidi
Applications that use communication networks and distributed systems to control traffic suffer from high latency, especially in critical situations. The performance of these applications largely depends on the computational delay of the algorithms that run on local or central processors. Therefore, an optimized solution that keeps this delay within a tolerable range is highly needed. This article studies a method in which autonomous vehicles around an intersection control the intersection traffic efficiently by communicating and interacting with each other and with road-side smart devices. The problem can be cast as a network utility maximization problem, and a gradient descent algorithm with a fixed step size can be used to obtain a near-optimal solution. A balance between latency and accuracy must be struck, yielding velocities close to the optimal ones: the number of iterations of the scheduling algorithm determines the latency of preparing a proper schedule for the autonomous vehicles. In this work, we propose an approach that provides an optimized schedule for autonomous vehicles at intersections while accounting for pedestrian traffic. Autonomous vehicles are able to communicate with each other and with road-side units; however, surveillance cameras are required to observe pedestrians crossing the intersection. Hence, we utilize the cameras, smart sensors, processors, and communication equipment embedded in autonomous vehicles and road-side units to collect the required data, process it, and distribute the computed optimal decision to the autonomous vehicles. The Simulation of Urban Mobility (SUMO) software is used to simulate the traffic behavior resulting from the proposed solution.
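As a toy illustration of the fixed-step gradient method and the latency/accuracy trade-off described above, the sketch below maximizes a simple logarithmic utility over vehicle velocities subject to a shared capacity constraint enforced by a quadratic penalty. The objective, penalty, and constants are illustrative, not the paper's formulation.

```python
# Fixed-step gradient-ascent sketch for a toy network-utility-maximization (NUM)
# problem: choose vehicle velocities v_i to maximize sum(log v_i) subject to a
# shared intersection "capacity" sum(v_i) <= C, via a quadratic penalty.
# The iteration count stands in for the latency/accuracy trade-off.
import numpy as np

C, STEP, PENALTY = 40.0, 0.05, 5.0
v = np.full(8, 1.0)                     # 8 vehicles, initial velocities (m/s)

def grad(v):
    slack = max(np.sum(v) - C, 0.0)     # capacity violation
    return 1.0 / v - 2.0 * PENALTY * slack   # d/dv [sum log v - PENALTY * slack^2]

for it in range(200):                   # more iterations = more latency, better accuracy
    v = np.clip(v + STEP * grad(v), 0.5, 20.0)

print("velocities:", np.round(v, 2), "sum:", round(float(np.sum(v)), 2), "cap:", C)
```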
Network Traffic-Based Hybrid Malware Detection for Smartphone and Traditional Networked Systems
Pub Date: 2019-10-01 | DOI: 10.1109/UEMCON47517.2019.8992934
Safia Rahmat, Quamar Niyaz, A. Mathur, Weiqing Sun, A. Javaid
With the widespread use of the Internet in recent times, security remains one of the major concerns. Malware poses security threats to smartphones, computers, and networks. These threats create an urgent need for an efficient hybrid intrusion detection system that can detect malware on both smartphone and traditional systems and ensure minimal damage to an organization's resources. In this paper, we propose an intelligent, self-learning, network traffic-based hybrid malware detection approach (HMDA) for smartphones and traditional systems, considering features that show a similar trend in the network traffic. The system can be used by an organizational network to detect and mitigate any occurrence of malware-based malicious activity inside the network. The proposed HMDA is implemented using machine learning algorithms: we used ensemble learners to train the model and achieved an accuracy of 95.7% with the XGBoost algorithm. The Android traffic captures collected by running the malware dataset are publicly available upon request to the authors.
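A rough sketch of the classification stage is shown below: an XGBoost model trained on a handful of flow-level features. The features, synthetic data, and hyperparameters are placeholders for illustration, not the authors' feature set or configuration.

```python
# Sketch of an XGBoost-based traffic classifier in the spirit of the HMDA.
# The flow features and synthetic data are placeholders, not the authors' dataset.
# Requires numpy, scikit-learn, and xgboost.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([
    rng.exponential(500, n),    # mean packet size (bytes)
    rng.exponential(20, n),     # flow duration (s)
    rng.integers(1, 50, n),     # distinct destination ports contacted
])
y = (X[:, 2] > 25).astype(int)  # toy label: "malicious" if it fans out to many ports

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = XGBClassifier(n_estimators=100, max_depth=4, eval_metric="logloss")
clf.fit(X_tr, y_tr)
print("toy accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```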
Use of Sensor Node Networks for Car Security
Pub Date: 2019-10-01 | DOI: 10.1109/UEMCON47517.2019.8992952
Rafael Lopez, Christian DeGuzman, Abdelrahman Elleithy
Smart cars are capable of connecting to the Internet, and this connectivity opens up many possibilities for car security. In this paper, we introduce a new protocol, called the Emergency One's Complement (E1C) protocol, to protect an automobile from theft. Simulation results demonstrate that the protocol is capable of detecting theft as well as detecting and mitigating packet spoofing.
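The abstract does not describe how E1C works internally. Purely as background on the primitive its name refers to, the sketch below computes the classic 16-bit one's-complement checksum (the same family used in IP/TCP headers), which a receiver can verify to spot corrupted or naively forged packets; the message format is hypothetical.

```python
# Background sketch only: the generic 16-bit one's-complement checksum.
# This is NOT the E1C protocol itself, whose details are not given in the abstract.
def ones_complement_checksum(data: bytes) -> int:
    if len(data) % 2:
        data += b"\x00"                            # pad to whole 16-bit words
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)   # fold carries back in
    return ~total & 0xFFFF

packet = b"UNLOCK_DOORS:vin=TESTVIN0000000001"     # hypothetical message
checksum = ones_complement_checksum(packet)
# Appending the checksum makes the whole message sum to zero at the receiver.
print(hex(checksum), ones_complement_checksum(packet + checksum.to_bytes(2, "big")) == 0)
```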
A New Approach Towards Fully Homomorphic Encryption Over Geometric Algebra
Pub Date: 2019-10-01 | DOI: 10.1109/UEMCON47517.2019.8992917
D. W. H. A. D. Silva, Carlos Paz de Araujo, C. E. Chow, Bryan Sosa Barillas
The ability to compute on encrypted data in a meaningful way is a subject of increasing interest in both academia and industry. The type of encryption that allows any function to be evaluated on encrypted data is called fully homomorphic encryption (FHE) and is one promising way to achieve secure computation. The problem was first stated by Rivest et al. in 1978 and first realized by Gentry in 2009, but it remains open, since an FHE scheme that is both efficient and secure has yet to be presented. Most prominent FHE schemes follow Gentry's blueprint, which concentrates researchers' efforts on very similar algebraic structures and noise-management techniques; the intrinsic complexity of these schemes results in the similar efficiency shortfalls they share. We introduce the application of Geometric Algebra (GA) to encryption, in conjunction with p-adic arithmetic and a modified version of the Chinese Remainder Theorem, and we demonstrate an efficient, noise-free, symmetric-key FHE scheme. Our security analysis focuses on demonstrating that the scheme is not linearly decryptable. Further, we discuss a practical approach for generalizing different types of algebraic structures in the two-dimensional geometric product space, which allows us to export GA operations to other algebras and vice versa. Our construction supports a variety of applications, from homomorphic obfuscation to general-purpose FHE computations.
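For readers new to GA, the sketch below implements only the geometric product of 2-D multivectors (scalar, e1, e2, e12 components, with e1^2 = e2^2 = 1), the algebraic operation the construction builds on. It shows none of the paper's encryption machinery.

```python
# Background sketch: the geometric product in the 2-D Geometric Algebra Cl(2,0).
# A multivector is (scalar, e1, e2, e12). This is the algebra only, not the scheme.
from typing import NamedTuple

class MV2(NamedTuple):
    s: float    # scalar part
    e1: float   # vector parts
    e2: float
    e12: float  # bivector (pseudoscalar) part

def gp(a: MV2, b: MV2) -> MV2:
    """Geometric product using e1*e1 = e2*e2 = 1 and e12 = e1*e2."""
    return MV2(
        a.s*b.s   + a.e1*b.e1 + a.e2*b.e2  - a.e12*b.e12,
        a.s*b.e1  + a.e1*b.s  - a.e2*b.e12 + a.e12*b.e2,
        a.s*b.e2  + a.e2*b.s  + a.e1*b.e12 - a.e12*b.e1,
        a.s*b.e12 + a.e12*b.s + a.e1*b.e2  - a.e2*b.e1,
    )

e1, e2 = MV2(0, 1, 0, 0), MV2(0, 0, 1, 0)
print(gp(e1, e1))                    # scalar 1:  e1*e1 = 1
print(gp(e1, e2))                    # bivector:  e1*e2 = e12
print(gp(gp(e1, e2), gp(e1, e2)))    # scalar -1: e12*e12 = -1
```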
Data Augmentation with Generative Models for Improved Malware Detection: A Comparative Study
Pub Date: 2019-10-01 | DOI: 10.1109/UEMCON47517.2019.8993085
R. Burks, K. Islam, Yan Lu, Jiang Li
Generative models have proven very useful for generating artificial data. Two of the most popular and promising models are the Generative Adversarial Network (GAN) and the Variational Autoencoder (VAE). Both play critical roles in classification problems by generating synthetic data with which classifiers can be trained more accurately. Malware detection is the process of determining whether software on a host system is malicious and diagnosing what type of attack it is; without an adequate amount of training data, malware detection becomes less effective. In this paper, we compare the two generative models' ability to generate synthetic training data that boosts a Residual Network (ResNet-18) classifier for malware detection. Experimental results show that adding synthetic malware samples generated by the VAE to the training data improved the accuracy of ResNet-18 by 2%, compared to 6% for samples generated by the GAN.
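As a reference point for the VAE side of the comparison, the sketch below is a minimal fully connected VAE with the standard reconstruction-plus-KL loss. Input size, layer widths, and the flattened-image representation are assumptions for illustration, not the configuration used in the paper.

```python
# Minimal VAE sketch for generating synthetic training samples (e.g., flattened
# malware byte-plot images). All sizes and training details are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, in_dim=64 * 64, latent=32):
        super().__init__()
        self.enc = nn.Linear(in_dim, 256)
        self.mu, self.logvar = nn.Linear(256, latent), nn.Linear(256, latent)
        self.dec = nn.Sequential(nn.Linear(latent, 256), nn.ReLU(),
                                 nn.Linear(256, in_dim), nn.Sigmoid())

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization
        return self.dec(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    rec = F.binary_cross_entropy(recon, x, reduction="sum")       # reconstruction term
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())  # KL divergence term
    return rec + kl

vae = VAE()
batch = torch.rand(16, 64 * 64)                  # stand-in for real malware images
recon, mu, logvar = vae(batch)
print("loss on random batch:", vae_loss(recon, batch, mu, logvar).item())
# After training, decode random z ~ N(0, I) to synthesize extra malware samples:
synthetic = vae.dec(torch.randn(8, 32))
```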