PE-GPT: A Physics-Informed Interactive Large Language Model for Power Converter Modulation Design

Fanfan Lin, Junhua Liu, Xinze Li, Shuai Zhao, Bohui Zhao, Xinyuan Liao, Hao Ma, Xin Zhang

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceeding › Research › peer review

Abstract

In the quest to design modulation strategies for power converters, recent studies have increasingly turned to AI-based, data-driven approaches. However, these methods grapple with significant challenges, including the requirement for dual expertise in power electronics and AI, as well as the need for extensive datasets for training. Addressing these constraints, this letter introduces PE-GPT, a custom-tailored large language model uniquely adapted for power converter modulation design, both semantically and physically. By harnessing in-context learning and specialized tiered physics-informed neural networks, PE-GPT guides users through text-based dialogues, recommending actionable modulation parameters. The effectiveness of PE-GPT is validated through a practical design case involving dual active bridge converters, supported by hardware experimentation. This research underscores the transformative potential of large language models in power converter modulation design, offering enhanced accessibility, explainability, and efficiency, thereby setting a new paradigm in the field.
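The "physics-informed" component mentioned in the abstract can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the paper's actual model: a closed-form surrogate stands in for the tiered neural networks, and a toy first-order ODE (dy/dt + y = 0) stands in for the dual active bridge converter physics. The key idea shown is the composite loss, a data-fit term plus a penalty on the governing equation's residual at collocation points.

```python
import numpy as np

def predict(params, t):
    # Tiny analytic surrogate y = a * exp(b * t); an illustrative
    # stand-in for the neural network that maps inputs to waveforms.
    a, b = params
    return a * np.exp(b * t)

def physics_residual(params, t, dt=1e-4):
    # Residual of the toy governing equation dy/dt + y = 0,
    # with dy/dt approximated by a central finite difference.
    y = predict(params, t)
    dydt = (predict(params, t + dt) - predict(params, t - dt)) / (2 * dt)
    return dydt + y

def pinn_loss(params, t_data, y_data, t_colloc, lam=1.0):
    # Physics-informed loss: measured-data misfit plus a weighted
    # penalty for violating the physics at collocation points.
    data_loss = np.mean((predict(params, t_data) - y_data) ** 2)
    phys_loss = np.mean(physics_residual(params, t_colloc) ** 2)
    return data_loss + lam * phys_loss
```

With parameters that satisfy the ODE (a=1, b=-1 gives y = exp(-t), for which dy/dt + y = 0), both loss terms vanish, while physics-violating parameters are penalized even where they fit the data; this is how the physics term reduces the amount of training data needed, one of the motivations stated above.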
Original language: English
Title: 2024 IEEE Energy Conversion Congress and Exposition (ECCE)
Publication date: 10 Feb. 2025
DOI
Status: Published - 10 Feb. 2025

