Title: Magnetic Field Prediction Method Based on Residual U-Net and Self-Attention Transformer Encoder
Abstract (Chinese): Finite-element magnetic-field analysis of electrical machines and transformers with complex geometries suffers from long simulation times and non-reusable models. This paper therefore proposes a magnetic field prediction method based on a residual U-Net and a self-attention Transformer encoder. First, finite element models of a permanent magnet synchronous motor (PMSM) and an amorphous metal transformer (AMT) are built to generate the datasets needed for deep learning training. Then, a Transformer module is combined with the U-Net model and a short-residual mechanism is introduced to build the ResUnet-Transformer model, which predicts the magnetic field through the pixels of the predicted image. Finally, the model is optimized with the Targeted Dropout algorithm and a dynamic learning-rate adjustment strategy to address overfitting and improve prediction accuracy. Computational examples show that the ResUnet-Transformer model achieves a mean absolute percentage error (MAPE) below 1% on the test sets of both the PMSM and AMT datasets, requiring only 500 samples. The proposed prediction method reduces the time and resource consumption of fine-grained simulation and topology optimization under actual and multiple operating conditions, and is also a key enabling method for virtual sensors and even digital twins.
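The dynamic learning-rate strategy mentioned above is identified in the English abstract as cosine annealing on top of the Adam optimizer. A minimal sketch of the standard cosine-annealing schedule (this is the textbook formula, not the authors' code; `lr_max`, `lr_min`, and the period `T` are assumed hyperparameters):

```python
import math

def cosine_annealing_lr(t, T, lr_max=1e-3, lr_min=1e-5):
    """Standard cosine-annealing schedule: decays from lr_max at step 0
    to lr_min at step T, following half a cosine period."""
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * t / T))

# The schedule starts at lr_max, decays smoothly, and ends at lr_min,
# which slows parameter updates as the objective approaches its optimum.
schedule = [cosine_annealing_lr(t, T=100) for t in range(101)]
```

Because the decay is gradual near both ends of the cycle, the step size shrinks as training converges, which is the oscillation-avoidance behavior the abstract describes.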
Abstract: Accurate simulation of electromagnetic characteristics in electrical equipment relies on the finite element method. However, the increasing complexity of large electrical machines and transformers poses challenges, leading to prolonged simulation time and significant computational resource consumption. At the same time, the finite element method cannot establish an a priori model: when design parameters, structures, or operating conditions change, the model must be rebuilt. Considering the powerful feature extraction ability of deep learning, this paper proposes a magnetic field prediction method based on a residual U-Net and a self-attention Transformer encoder. The finite element method is used to obtain the dataset for deep learning training. The deep learning model can be trained once and used for multiple predictions, addressing the limitations of the finite element method and reducing computational time and resource consumption. Firstly, this paper leverages the inherent advantages of the convolutional neural network (CNN) in image processing, particularly the U-shaped CNN, known as U-Net, based on the encoder-decoder structure. This architecture exhibits a stronger ability to capture fine details and learn from limited samples than the traditional CNN. To mitigate network degradation and address the limitations of convolutional operations, short residual connections and Transformer modules are introduced into the U-Net architecture, creating the ResUnet-Transformer model. The short residual connections accelerate network training, while the self-attention mechanism from the Transformer network facilitates the effective interaction of global features. Secondly, this paper introduces the Targeted Dropout algorithm and an adaptive learning rate to suppress overfitting and enhance the accuracy of magnetic field predictions. The Targeted Dropout algorithm incorporates post-pruning strategies into the training process of neural networks, effectively mitigating overfitting and improving the model's generalization. Additionally, an adaptive learning rate is implemented using the cosine annealing algorithm on top of the Adam optimization algorithm, gradually reducing the learning rate as the objective function converges to the optimal value and avoiding oscillations or non-convergence. Finally, the ResUnet-Transformer model is validated through engineering cases involving permanent magnet synchronous motors (PMSM) and amorphous metal transformers (AMT). On the PMSM dataset, the ResUnet-Transformer model is trained with 250 samples and tested with 100 samples, using the mean square error (MSE) and mean absolute percentage error (MAPE) as performance evaluation metrics. Compared to the CNN, U-Net, and Linknet models, the ResUnet-Transformer model achieves the highest prediction accuracy, with an MSE of 0.07×10⁻³ and a MAPE of 1.4%. The prediction efficiency on the 100 test samples using the ResUnet-Transformer model surpasses the finite element method by 66.1%. With structural and parameter settings held constant, introducing the Targeted Dropout algorithm and the cosine annealing algorithm improves the prediction accuracy by 36.4% and 26.3%, respectively. To evaluate the model's generalization capability, the number of training samples for the PMSM and AMT datasets is varied, and the model is tested using 100 samples. Inadequate training samples result in poor magnetic field prediction performance. When the training dataset size increases to 300, the prediction error does not decrease but shows a slight rise. However, with further increases in the training dataset size, the error decreases significantly, and the MAPE for the PMSM and AMT datasets reaches 0.7% and 0.5%, respectively, with just 500 training samples.
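The abstract evaluates predictions with MSE and MAPE over the test samples. A minimal sketch of the two metrics (plain Python, not the authors' implementation; the `eps` guard against division by zero is an assumption, and the sample values are illustrative only):

```python
def mse(y_true, y_pred):
    """Mean squared error over paired values."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mape(y_true, y_pred, eps=1e-12):
    """Mean absolute percentage error, in percent.
    eps (assumed) avoids division by zero for zero-valued field points."""
    return 100.0 * sum(abs((t - p) / (t + eps))
                       for t, p in zip(y_true, y_pred)) / len(y_true)

# Illustrative values, e.g. per-pixel flux densities from a predicted field map.
truth = [1.0, 2.0, 4.0]
pred  = [1.1, 1.9, 4.0]
print(round(mse(truth, pred), 4))   # → 0.0067
print(round(mape(truth, pred), 2))  # → 5.0
```

MAPE is scale-free, which is why a sub-1% figure is comparable across the PMSM and AMT datasets even though their field magnitudes differ.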
Authors: Jin Liang, Yin Zhenhao, Liu Lu, Song Juheng, Liu Yuankai
Affiliations: State Key Laboratory of Reliability and Intelligence of Electrical Equipment (Hebei University of Technology), Tianjin 300401, China; Key Laboratory of Electromagnetic Field and Reliability of Hebei Province (Hebei University of Technology), Tianjin 300401, China
Journal: Transactions of China Electrotechnical Society (电工技术学报); indexed by ISTIC, EI, PKU
Year, Volume (Issue): 2024, 39(10)
CLC number: TM153
Keywords: Finite element method, electromagnetic field, deep learning, U-Net, Transformer
Machine-assigned classification: TP391.41, TM351, R318
Online publication date: May 31, 2024
Funding: National Natural Science Foundation of China; Central Government Guided Local Science and Technology Development Fund (Free Exploration Project)