A functional approach to physics-informed neural networks

Bibliographic details
Year of defense: 2025
Main author: Zeiser, Mateus Henrique
Advisor: Not informed by the institution
Defense committee: Not informed by the institution
Document type: Dissertation
Access type: Open access
Language: English (eng)
Defending institution: Biblioteca Digital de Teses e Dissertações da USP
Graduate program: Not informed by the institution
Department: Not informed by the institution
Country: Not informed by the institution
Keywords in Portuguese: Aprendizado de máquina científico; Equações diferenciais parciais; Funcional-PINNs; Redes neurais informadas pela física
Link de acesso: https://www.teses.usp.br/teses/disponiveis/45/45132/tde-15012026-154642/
Abstract: This dissertation investigates the use of energy functionals derived from the variational formulation of partial differential equations (PDEs) as a basis for training Physics-Informed Neural Networks (PINNs). The work begins by revisiting the role of PDEs in modeling physical and biological phenomena, emphasizing the importance of variational principles as a mathematical foundation for obtaining weak solutions and for the development of numerical methods such as the Finite Element Method (FEM). It also reviews essential concepts from Machine Learning, including statistical learning theory, supervised learning, deep neural networks, and the training process based on empirical risk minimization. In this context, the use of automatic differentiation emerges as a key tool for computing gradients efficiently in high-dimensional models. Building on these theoretical elements, we explore the Functional PINN (Fun-PINN), a model that replaces the traditional residual-based training with the minimization of an energy functional directly associated with the PDE, which reduces the order of derivatives required during training, leading to gains in computational efficiency. We also introduce the Self-Adaptive Functional PINN (SA-Fun-PINN), which dynamically adjusts the relative importance of the energy and boundary terms during training. Both models were evaluated against classical PINNs and Self-Adaptive PINNs (SA-PINNs) in numerical experiments designed to assess convergence, accuracy, and computational efficiency. Two test cases were considered: one with a smooth solution, based on Laplace's equation, and another with a more oscillatory profile, based on the Poisson equation, allowing performance to be analyzed under increasing levels of complexity. The Fun-PINNs achieved stable training dynamics and offered meaningful reductions in runtime, while the self-adaptive version further improved accuracy with lower computational cost compared to SA-PINNs.
In particular, the functional approach demonstrated superior performance in the oscillatory case. However, considering Fun-PINNs as a whole, these experiments do not allow us to claim results superior to those obtained with PINNs; rather, they show that Fun-PINNs are a viable alternative that also yields good results. Overall, this work highlights how classical mathematical concepts such as variational formulations and energy minimization can be effectively integrated with modern machine learning techniques. The proposed functional models combine theoretical consistency with practical efficiency, establishing a promising direction for future research on more complex PDEs, including inverse problems and hybrid approaches that link PINNs with traditional numerical solvers.
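The energy-minimization idea the abstract describes can be illustrated without a neural network. The sketch below is not code from the dissertation: the 1D Poisson problem, the finite-difference discretization, and the step size are all assumptions chosen for illustration. It minimizes the discrete Dirichlet energy E[u] = Σ_i (½((u_{i+1} − u_i)/h)² − f_i u_i) h by gradient descent on nodal values, which here play the role a Fun-PINN's network parameters would play. Only first differences of u enter the energy, whereas a residual loss for −u″ = f would need second derivatives; that is the order reduction the abstract credits for the efficiency gains.

```python
import math

# Grid for -u'' = f on (0, 1) with u(0) = u(1) = 0.
n = 30                                  # number of subintervals
h = 1.0 / n
x = [i * h for i in range(n + 1)]
f = [math.pi**2 * math.sin(math.pi * xi) for xi in x]  # exact u = sin(pi x)

# Gradient descent on the discrete Dirichlet energy
#   E[u] = sum_i ( 0.5 * ((u[i+1] - u[i]) / h)**2 - f[i] * u[i] ) * h,
# whose minimizer is the (finite-difference) weak solution.
u = [0.0] * (n + 1)                     # boundary nodes stay fixed at 0
lr = 0.01                               # step below the h/2 stability bound
for _ in range(5000):
    grad = [(2 * u[i] - u[i - 1] - u[i + 1]) / h - h * f[i]
            for i in range(1, n)]       # dE/du_i at interior nodes
    for i in range(1, n):
        u[i] -= lr * grad[i - 1]

err = max(abs(u[i] - math.sin(math.pi * x[i])) for i in range(n + 1))
print(f"max error vs exact solution: {err:.1e}")
```

In a Fun-PINN the nodal values above are replaced by a network u_θ(x), the sum by Monte Carlo quadrature over sampled points, and the gradient by automatic differentiation through θ; the boundary condition, enforced here by fixing the end nodes, becomes the separate boundary term whose weight the SA-Fun-PINN adapts.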
id USP_3a259b5fb970869c0c19ecbe7e938c92
oai_identifier_str oai:teses.usp.br:tde-15012026-154642
network_acronym_str USP
network_name_str Biblioteca Digital de Teses e Dissertações da USP
repository_id_str
dc.title.none.fl_str_mv A functional approach to physics-informed neural networks
Uma abordagem funcional para redes neurais informadas pela física
dc.contributor.none.fl_str_mv Kuhl, Nelson Mugayar
Marcondes, Diego Ribeiro
dc.contributor.author.fl_str_mv Zeiser, Mateus Henrique
dc.subject.por.fl_str_mv Aprendizado de máquina científico
Equações diferenciais parciais
Funcional-PINNs
Functional PINN
Partial differential equations
Physics-informed neural networks
Redes neurais informadas pela física
Scientific machine learning
publishDate 2025
dc.date.none.fl_str_mv 2025-11-28
dc.type.status.fl_str_mv info:eu-repo/semantics/publishedVersion
dc.type.driver.fl_str_mv info:eu-repo/semantics/masterThesis
dc.identifier.uri.fl_str_mv https://www.teses.usp.br/teses/disponiveis/45/45132/tde-15012026-154642/
dc.language.iso.fl_str_mv eng
dc.rights.driver.fl_str_mv Release the content for public access.
info:eu-repo/semantics/openAccess
dc.format.none.fl_str_mv application/pdf
dc.publisher.none.fl_str_mv Biblioteca Digital de Teses e Dissertações da USP
dc.source.none.fl_str_mv
reponame:Biblioteca Digital de Teses e Dissertações da USP
instname:Universidade de São Paulo (USP)
instacron:USP
repository.name.fl_str_mv Biblioteca Digital de Teses e Dissertações da USP - Universidade de São Paulo (USP)
repository.mail.fl_str_mv virginia@if.usp.br|| atendimento@aguia.usp.br||virginia@if.usp.br
_version_ 1857669979000799232