Pub Date: 2020-12-11 | DOI: 10.1109/TOCS50858.2020.9339699
Lin Li, Shuxiang Guo, Lingshuai Meng, Haibin Zhai, Z. Hui, Bingnan Ma, Shijun Shen
Power management of embedded systems based on machine learning has drawn increasing attention, and high-level software power management and optimization have gradually become important techniques for controlling computer system power dissipation. In this paper, we present an improved power optimization management technique that employs a Q-learning algorithm driven by temperature, performance, and energy. The improved Q-learning handles the uncertain states of the running system and can effectively select a rational policy under multiple parameter constraints. Because data from the running hardware and applications can be effectively collected and modeled, the power management framework can readily discover a good policy through the value function of the Q-learning algorithm.
Title: An Improved Q-Learning for System Power Optimization with Temperature, Performance and Energy Constraint Modeling (2020 IEEE Conference on Telecommunications, Optics and Computer Science (TOCS))
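The value-function update behind this approach can be sketched as follows. The state encoding, reward weights, and temperature threshold below are illustrative assumptions, not the authors' actual model:

```python
import random

# Hypothetical sketch of constrained Q-learning for power management:
# states are (temperature, load) buckets, actions are frequency levels,
# and the reward trades performance against energy with a thermal penalty.
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2
ACTIONS = [0, 1, 2]              # e.g. low / medium / high frequency
Q = {}                           # Q-value table: (state, action) -> value

def reward(energy, perf, temp, temp_limit=70.0):
    # Penalize energy use and over-temperature; reward throughput.
    penalty = 10.0 if temp > temp_limit else 0.0
    return perf - 0.5 * energy - penalty

def choose_action(state):
    # Epsilon-greedy policy over the Q table.
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q.get((state, a), 0.0))

def update(state, action, r, next_state):
    # Standard Q-learning value-function update.
    best_next = max(Q.get((next_state, a), 0.0) for a in ACTIONS)
    old = Q.get((state, action), 0.0)
    Q[(state, action)] = old + ALPHA * (r + GAMMA * best_next - old)

# One illustrative step: high temperature triggers the penalty term.
update(("hot", "busy"), 2, reward(energy=5.0, perf=8.0, temp=75.0), ("warm", "busy"))
```

Repeating such updates over collected hardware and application traces is what lets the framework converge toward a policy satisfying the temperature, performance, and energy constraints.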
Pub Date: 2020-12-11 | DOI: 10.1109/TOCS50858.2020.9339732
Ningzhi Wang, Cheng-Ying Duan, Junyu Li, Hanyi Shi
In data centers, packet loss causes high retransmission delays, which is very harmful to latency-sensitive network workloads. For this reason, many production data centers have deployed lossless networks. ExpressPass, an advanced proactive congestion control protocol, achieves losslessness through a credit reservation mechanism while making full use of network resources. However, because transport-layer data streams are unstable, its rate adjustment mechanism based on credit sequence numbers cannot guarantee that those sequence numbers remain timely; at the same time, storing large numbers of credit packets occupies buffer memory and degrades server performance. This paper presents accurate credit rate control (ACRC), a new rate adjustment algorithm for ExpressPass based on ECN marking. ACRC solves the timeliness problem of ExpressPass's sequence-number-based rate adjustment, saves the data bits consumed by tagging credit packets, and effectively addresses ExpressPass's timeliness and end-point congestion problems.
Title: ACRC: Accurate Credit Rate Control (TOCS 2020)
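The idea of steering the credit rate from ECN feedback rather than sequence numbers can be illustrated with a minimal sketch. The AIMD-style rule, the weight `w`, and the rate units are assumptions for illustration, not the actual ACRC algorithm:

```python
# Hypothetical sketch of ECN-driven credit rate adjustment: the receiver
# scales its credit-sending rate by the fraction of ECN-marked credit
# packets observed in the last interval.
def adjust_credit_rate(rate, marked, total, w=0.5,
                       min_rate=1.0, max_rate=100.0):
    """Return the new credit rate (credits/ms, illustrative units)."""
    if total == 0:
        return rate
    loss = marked / total            # fraction of ECN-marked credits
    if loss > 0:
        rate *= (1.0 - w * loss)     # multiplicative decrease on congestion
    else:
        rate += 1.0                  # additive increase when the path is clear
    return max(min_rate, min(rate, max_rate))

rate = 50.0
rate = adjust_credit_rate(rate, marked=0, total=10)   # no marks -> increase
rate = adjust_credit_rate(rate, marked=5, total=10)   # half marked -> decrease
```

Because the marking fraction is measured per interval, the feedback is current by construction, which is the timeliness property the sequence-number scheme could not guarantee.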
Pub Date: 2020-12-11 | DOI: 10.1109/TOCS50858.2020.9339758
Jing Jin, Yongqing Zhang
CiteSpace is citation visualization and analysis software developed on the basis of scientometrics and data visualization. To explore research hot spots in the social risks and regulation of artificial intelligence (AI) in medical treatment, this paper retrieves papers from 2000–2019 using the Web of Science Core Collection as the retrieval database, and uses CiteSpace to visualize the structure, patterns, and distribution of the retrieved scientific knowledge. By analyzing the number of publications and citations, the ranking of publishing journals, clustering of co-cited literature, and keyword co-occurrence, this paper makes the following observations: there are few research results on "AI + social risk and regulation + medical"; the EU leads the study of AI ethics and law; and scholars in recent years have focused on: 1. new legal and ethical issues raised by DNA sequencing technology; 2. personal information privacy at the medical data collection and AI application ends, and the safety of medical privacy; 3. decision-making by medical robots; 4. standardized application of AI in the medical field; 5. standardization of AI medical devices; 6. nano-safety. Among these issues, the ethical reshaping of the doctor-patient relationship in AI-aided medical treatment has triggered extensive concern.
Title: The Latest Progress of Research on the Social Risks and Regulation of International Artificial Intelligent in Medical Treatment Based on CiteSpace Method (TOCS 2020)
Pub Date: 2020-12-11 | DOI: 10.1109/TOCS50858.2020.9339712
Li Qing, Xu Guanzhong, Zhou Ziqiang
Three-dimensional laser scanning is a new surveying and mapping technology, also known as "real-scene replication", which can completely and accurately reconstruct scanned objects and quickly obtain raw surveying data. Registration of point cloud data from multiple scan stations is a key step in 3D laser scanning data processing, and the seven-parameter registration model based on common control points is an algorithmic model with high accuracy. Piecewise low-order interpolation is used to smooth the model surface in empty (unscanned) areas; combined with an existing level-of-detail model algorithm, the 3D model files of power equipment are then loaded and rendered in a 3D scene. This paper aims to provide a reference for the subsequent construction of a 3D geographic information platform for electric power.
Title: Research on Application of 3D Laser Point Cloud Technology in 3D Geographic Location Information Modeling of Electric Power (TOCS 2020)
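In its simplest (linear) form, the piecewise low-order interpolation step amounts to filling gaps between the nearest known samples on each side. The function and data below are illustrative, reduced to a 1-D elevation profile:

```python
# Illustrative sketch of piecewise linear gap filling: missing samples in a
# scanned profile are interpolated between the nearest known neighbors.
def fill_gaps(profile):
    """profile: list of floats with None for missing samples."""
    known = [i for i, v in enumerate(profile) if v is not None]
    out = list(profile)
    for i, v in enumerate(profile):
        if v is not None:
            continue
        left = max((k for k in known if k < i), default=None)
        right = min((k for k in known if k > i), default=None)
        if left is None or right is None:
            continue  # cannot interpolate at an open boundary
        t = (i - left) / (right - left)
        out[i] = profile[left] + t * (profile[right] - profile[left])
    return out

print(fill_gaps([1.0, None, None, 4.0]))  # -> [1.0, 2.0, 3.0, 4.0]
```

Low-order (linear or quadratic) pieces avoid the oscillation that a single high-order polynomial would introduce across a large empty area, which is why the piecewise form is preferred for surface smoothing.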
Pub Date: 2020-12-11 | DOI: 10.1109/TOCS50858.2020.9339623
Haijian Wang, Xinyue Liang, Menggao He, Xuefeng Li, S. Fu
With the rapid development of the economy and the continuous progress of science and technology, electric power engineering plays a very important role and is widely used across industries, and electrical engineering automation technology has made gratifying achievements. PLC technology refers to the programmable logic controller, a new type of industrial automation control device offering convenient programming, high reliability, and powerful functions; its application in electrical automation control is an inevitable trend of development. Extensive investigation shows that this technology can be applied well in the automation control of electrical engineering and works effectively. Applying PLC technology in electrical engineering not only achieves better automation control but also represents the future direction of the field. As work demands and requirements grow, many problems have gradually been found in the application of PLC technology. On this basis, this paper analyzes the application of PLC technology in electrical engineering and its automation control.
Title: Analysis of Application of PLC Technology in Automation Control of Electrical Engineering (TOCS 2020)
Pub Date: 2020-12-11 | DOI: 10.1109/TOCS50858.2020.9339752
Yinxin Long, Huajin He
The Q-learning algorithm, a reinforcement learning algorithm based on the Markov decision process, can achieve good path planning for a mobile robot through continuous trial and error. However, Q-learning needs a huge Q-value table, which easily causes the curse of dimensionality in decision-making, and it is difficult to find a good path in complex situations. Combining deep learning with reinforcement learning, and using the perceptual strength of deep learning to support the decision-making of reinforcement learning, remedies this deficiency of Q-learning. The path planning of deep reinforcement learning is simulated in MATLAB; the simulation results show that deep reinforcement learning can effectively realize obstacle avoidance and plan a collision-free optimal path for the robot from the starting point to the end point.
Title: Robot path planning based on deep reinforcement learning (TOCS 2020)
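The underlying setup can be sketched as tabular Q-learning on a small obstacle grid; the deep variant replaces the `Q` table with a neural network that generalizes over states. The grid layout, rewards, and hyperparameters below are illustrative:

```python
import random

# Minimal tabular sketch of grid-world path planning with obstacle avoidance.
GRID = 4                                  # 4x4 grid
OBSTACLES = {(1, 1), (2, 2)}
GOAL = (3, 3)
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]

def step(pos, move):
    nxt = (pos[0] + move[0], pos[1] + move[1])
    if not (0 <= nxt[0] < GRID and 0 <= nxt[1] < GRID) or nxt in OBSTACLES:
        return pos, -1.0                  # blocked: stay put, penalty
    if nxt == GOAL:
        return nxt, 10.0                  # reached the goal
    return nxt, -0.1                      # step cost encourages short paths

Q = {}
random.seed(0)
for _ in range(2000):                     # training episodes
    pos = (0, 0)
    while pos != GOAL:
        # Epsilon-greedy action selection over the Q table.
        a = random.randrange(4) if random.random() < 0.2 else \
            max(range(4), key=lambda i: Q.get((pos, i), 0.0))
        nxt, r = step(pos, MOVES[a])
        best = max(Q.get((nxt, i), 0.0) for i in range(4))
        old = Q.get((pos, a), 0.0)
        Q[(pos, a)] = old + 0.2 * (r + 0.9 * best - old)
        pos = nxt
```

After training, following the greedy policy from `(0, 0)` traces a collision-free path to `GOAL`. The table here has only 16 states; for a robot sensing a continuous environment the table becomes intractable, which is exactly the dimensionality problem the deep network addresses.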
Pub Date: 2020-12-11 | DOI: 10.1109/TOCS50858.2020.9339698
Chunji Hao, Yi Zhou
At present, the traffic arrangement of mobile intelligent terminal applications is only a one-way process from developers to users, and most arrangements rely solely on product designers' experience or market research within a limited scope. Real-time operation, portability, and positioning are the characteristics that distinguish mobile terminals from the traditional Internet: relevant information must reach users quickly so that they can make consumption decisions quickly. How to obtain context information effectively and predict users' consumption behavior from that context has become a key topic in the field of personalized services. This paper analyzes the overall quality of multi-source big data association and presents a method for constructing a user behavior acquisition and simulation prediction framework based on user perception, so as to meet user needs to the greatest extent, reduce user effort, and personalize the user experience. Future optimization should combine signals such as high traffic, high user volume, and high complaint rates in multi-source correlation analysis aimed at improving perceived quality.
Title: Design and Implementation of User Behavior Acquisition and Simulation Prediction Framework for Mobile Intelligent Terminal Based on User Perception (TOCS 2020)
Pub Date: 2020-12-11 | DOI: 10.1109/TOCS50858.2020.9339624
Ji Guanni
This article first introduces the relevant background of weak signal detection, describes the characteristics of weak signals and the state of research at home and abroad, analyzes the application prospects of low-cost weak signal acquisition systems, and proposes a high-performance, low-power programmable embedded platform that, combined with virtual instrument techniques, forms an embedded weak signal processing system. For this platform, a solution centered on an FPGA and LabVIEW is proposed. The design is then refined by selecting suitable development platforms for amplification and filtering, A/D sampling, transmission control, and the system software, with the rationale given for each choice, and the components of the system are analyzed, designed, and developed accordingly. The embedded system performs instruction interpretation, sampling control, and data reading and writing, while a modification of the USB firmware dedicates the transmission port to the acquisition device. The software system interacts with the device, processes the digital signals, and realizes reproduction and storage of the sampled signals on the host computer. Finally, the system was debugged: through debugging of the individual hardware and software modules as well as overall debugging, the weak signal processing function is realized.
Title: Design and Implementation of a High-Performance and Low-Power Programmable Embedded Weak Signal Processing Platform (TOCS 2020)
Pub Date: 2020-12-11 | DOI: 10.1109/TOCS50858.2020.9339706
Yu Qi, Baihui Tang, Sanxing Cao
Gifts play a pivotal role in people's daily lives. Some people want to give gifts but do not know the recipients' preferences, which makes gift-giving a difficult problem. Traditional gift recommendation systems can only recommend gifts when users are known to have certain needs; they do not address gift recommendation in scenarios where the recipient's preferences are unknown. To meet this demand, this paper proposes a gift recommendation system based on public big data from social networks to solve the problem of users' difficulty in selecting gifts.
Title: Gifts Recommendation System Based on the Public Big Data of Social Networks (TOCS 2020)
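One toy way to realize the idea is to mine a recipient's public posts for interest keywords and rank catalog gifts by keyword overlap. The catalog, posts, and scoring scheme below are invented for illustration and are not the system described in the paper:

```python
from collections import Counter

# Toy sketch: score each gift by how often its interest keywords appear
# in the recipient's public posts, then return the top matches.
CATALOG = {
    "running shoes": {"run", "marathon", "fitness"},
    "coffee grinder": {"coffee", "espresso", "brew"},
    "sketchbook": {"draw", "art", "sketch"},
}

def recommend(posts, top_n=1):
    words = Counter(w.lower() for p in posts for w in p.split())
    scored = [(sum(words[k] for k in tags), gift)
              for gift, tags in CATALOG.items()]
    scored.sort(reverse=True)
    return [gift for score, gift in scored[:top_n] if score > 0]

print(recommend(["Great espresso this morning", "new coffee beans to brew"]))
```

A production system would replace the keyword sets with learned interest profiles over large-scale social data, but the shape of the pipeline (public signals in, ranked gifts out) is the same.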
Pub Date: 2020-12-11 | DOI: 10.1109/TOCS50858.2020.9339616
Chunming Xie, Caiming Liu
University-enterprise cooperation mechanisms play an important role in the cultivation of college students, but a scientific method is needed to evaluate the latent relationship between university-enterprise cooperation and the quality of student cultivation. This paper proposes a data correlation analysis algorithm linking university-enterprise cooperation with the training quality of college students, in order to scientifically evaluate the impact of cooperation mechanisms on training quality. A transaction data set and item sets reflecting the relationship between cooperation mechanisms and training quality are established, and association rules for the influence of the cooperation mechanism on cultivation quality are designed. Methods are defined for computing the support of item sets over student training transactions and the confidence of the association rules, and the strongest association rules between cooperation mechanisms and training quality are identified from the support and confidence values. A corresponding correlation analysis algorithm is designed, and its effectiveness is verified by experiments.
Title: Data Correlation Analysis Algorithm of University-enterprise Cooperation and Cultivation Quality (TOCS 2020)
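The support and confidence computations described here follow the standard association-rule definitions, which can be sketched as follows; the transactions and items are illustrative stand-ins for student training records:

```python
# Standard association-rule measures: support is the fraction of
# transactions containing an item set; confidence of A -> B is
# support(A ∪ B) / support(A).
def support(transactions, itemset):
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    denom = support(transactions, antecedent)
    joint = support(transactions, set(antecedent) | set(consequent))
    return joint / denom if denom else 0.0

# Illustrative transactions: attributes recorded for four students.
T = [
    {"internship", "high_gpa"},
    {"internship", "high_gpa", "award"},
    {"internship"},
    {"high_gpa"},
]
# support({internship}) = 3/4; confidence(internship -> high_gpa) = 2/3
print(confidence(T, {"internship"}, {"high_gpa"}))
```

Ranking rules by these two measures against minimum thresholds is how the strongest cooperation-to-quality rules would be selected.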