Multi-Encoder Spatio-Temporal Feature Fusion Network for Electric Vehicle Charging Load Prediction

IF 3.1 · CAS Tier 4 (Computer Science) · Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE · Journal of Intelligent & Robotic Systems · Pub Date: 2024-07-09 · DOI: 10.1007/s10846-024-02125-z
Yufan Chen, Mengqin Wang, Yanling Wei, Xueliang Huang, Shan Gao
{"title":"Multi-Encoder Spatio-Temporal Feature Fusion Network for Electric Vehicle Charging Load Prediction","authors":"Yufan Chen, Mengqin Wang, Yanling Wei, Xueliang Huang, Shan Gao","doi":"10.1007/s10846-024-02125-z","DOIUrl":null,"url":null,"abstract":"<p>Electric vehicles (EVs) have been initiated as a preference for decarbonizing road transport. Accurate charging load prediction is essential for the construction of EV charging facilities systematically and for the coordination of EV energy demand with the requisite peak power supply. It is noted that the charging load of EVs exhibits high complexity and randomness due to temporal and spatial uncertainties. Therefore, this paper proposes a SEDformer-based charging road prediction method to capture the spatio-temporal characteristics of charging load data. As a deep learning model, SEDformer comprises multiple encoders and a single decoder. In particular, the proposed model includes a Temporal Encoder Block based on the self-attention mechanism and a Spatial Encoder Block based on the channel attention mechanism with sequence decomposition, followed by an aggregated decoder for information fusion. It is shown that the proposed method outperforms various baseline models on a real-world dataset from Palo Alto, U.S., demonstrating its superiority in addressing spatio-temporal data-driven load forecasting problems.</p>","PeriodicalId":54794,"journal":{"name":"Journal of Intelligent & Robotic Systems","volume":"13 1","pages":""},"PeriodicalIF":3.1000,"publicationDate":"2024-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Intelligent & Robotic Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s10846-024-02125-z","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Electric vehicles (EVs) have emerged as a preferred option for decarbonizing road transport. Accurate charging load prediction is essential for the systematic construction of EV charging facilities and for coordinating EV energy demand with the requisite peak power supply. The charging load of EVs exhibits high complexity and randomness due to temporal and spatial uncertainties. Therefore, this paper proposes a SEDformer-based charging load prediction method to capture the spatio-temporal characteristics of charging load data. As a deep learning model, SEDformer comprises multiple encoders and a single decoder. In particular, the proposed model includes a Temporal Encoder Block based on the self-attention mechanism and a Spatial Encoder Block based on the channel attention mechanism with sequence decomposition, followed by an aggregated decoder for information fusion. The proposed method outperforms various baseline models on a real-world dataset from Palo Alto, U.S., demonstrating its superiority in spatio-temporal, data-driven load forecasting.
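Since the paper's implementation is not reproduced on this page, the PyTorch sketch below is purely illustrative of the architecture the abstract describes: two parallel encoders (self-attention over the time axis; channel attention over feature channels after series decomposition) fused by a single decoder. All module names, dimensions, the moving-average decomposition, the squeeze-and-excitation-style channel gate, and the concatenation-based fusion are assumptions for illustration, not the authors' SEDformer.

```python
# Illustrative-only sketch of a multi-encoder spatio-temporal fusion model in the
# spirit of the abstract. Hyperparameters and fusion strategy are assumptions.
import torch
import torch.nn as nn


class SeriesDecomposition(nn.Module):
    """Split a series into trend (moving average) and seasonal (residual) parts."""
    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel_size, stride=1, padding=kernel_size // 2,
                                count_include_pad=False)

    def forward(self, x):                       # x: (batch, time, channels)
        trend = self.avg(x.transpose(1, 2)).transpose(1, 2)
        return x - trend, trend                 # seasonal, trend


class TemporalEncoderBlock(nn.Module):
    """Self-attention over the time axis (standard Transformer encoder layer)."""
    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                                dim_feedforward=4 * d_model,
                                                batch_first=True)

    def forward(self, x):                       # (batch, time, d_model)
        return self.layer(x)


class SpatialEncoderBlock(nn.Module):
    """Channel attention (squeeze-and-excitation style) after series decomposition."""
    def __init__(self, d_model: int, reduction: int = 4):
        super().__init__()
        self.decomp = SeriesDecomposition()
        self.gate = nn.Sequential(
            nn.Linear(d_model, d_model // reduction), nn.ReLU(),
            nn.Linear(d_model // reduction, d_model), nn.Sigmoid())

    def forward(self, x):                       # (batch, time, d_model)
        seasonal, trend = self.decomp(x)
        weights = self.gate(seasonal.mean(dim=1))        # (batch, d_model)
        return seasonal * weights.unsqueeze(1) + trend


class SEDformerSketch(nn.Module):
    """Two parallel encoders whose outputs are concatenated and decoded jointly."""
    def __init__(self, n_stations: int, d_model: int = 64, horizon: int = 24):
        super().__init__()
        self.embed = nn.Linear(n_stations, d_model)
        self.temporal = TemporalEncoderBlock(d_model)
        self.spatial = SpatialEncoderBlock(d_model)
        self.decoder = nn.Sequential(nn.Linear(2 * d_model, d_model), nn.GELU(),
                                     nn.Linear(d_model, n_stations))
        self.horizon = horizon

    def forward(self, x):                       # x: (batch, time, n_stations)
        h = self.embed(x)
        fused = torch.cat([self.temporal(h), self.spatial(h)], dim=-1)
        return self.decoder(fused[:, -self.horizon:, :])  # (batch, horizon, n_stations)


if __name__ == "__main__":
    model = SEDformerSketch(n_stations=8)
    load_history = torch.randn(2, 96, 8)        # 2 samples, 96 past steps, 8 stations
    print(model(load_history).shape)            # torch.Size([2, 24, 8])
```

The sketch only conveys the structural idea of multi-encoder fusion (one encoder per view of the data, a shared decoder for the fused representation); consult the paper itself for the actual SEDformer blocks, hyperparameters, and training setup.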

Source journal
Journal of Intelligent & Robotic Systems (Engineering & Technology – Robotics)
CiteScore: 7.00
Self-citation rate: 9.10%
Articles published per year: 219
Review time: 6 months
Journal description: The Journal of Intelligent and Robotic Systems bridges the gap between theory and practice in all areas of intelligent systems and robotics. It publishes original, peer-reviewed contributions from initial concept and theory to prototyping to final product development and commercialization. On the theoretical side, the journal features papers focusing on intelligent systems engineering, distributed intelligence systems, multi-level systems, intelligent control, multi-robot systems, cooperation and coordination of unmanned vehicle systems, etc. On the application side, the journal emphasizes autonomous systems, industrial robotic systems, multi-robot systems, aerial vehicles, mobile robot platforms, underwater robots, sensors, sensor fusion, and sensor-based control. Readers will also find papers on real applications of intelligent and robotic systems (e.g., mechatronics, manufacturing, biomedical, underwater, humanoid, mobile/legged robot and space applications, etc.).
Latest articles from this journal
UAV Routing for Enhancing the Performance of a Classifier-in-the-loop
DFT-VSLAM: A Dynamic Optical Flow Tracking VSLAM Method
Design and Development of a Robust Control Platform for a 3-Finger Robotic Gripper Using EMG-Derived Hand Muscle Signals in NI LabVIEW
Neural Network-based Adaptive Finite-time Control for 2-DOF Helicopter Systems with Prescribed Performance and Input Saturation
Six-Degree-of-Freedom Pose Estimation Method for Multi-Source Feature Points Based on Fully Convolutional Neural Network