Research topic: Mathematical decomposition of uncertainty estimation for neural networks
Type of scientific work: Research project (НИР)
Author: Ruslan Rashidovich Nasyrov
Scientific supervisor: Alexey Alekseevich Zaitsev, Cand. Sci. (Phys.-Math.)

Abstract

Uncertainty quantification in deep learning is essential for assessing the reliability of model predictions. Despite their success in various tasks, neural networks often fail to provide meaningful measures of uncertainty, limiting their trustworthiness. We explore model-related uncertainty in neural networks, focusing on calibration techniques that align predicted probabilities with true frequencies, thus improving the interpretability and reliability of predictions.
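A common way to measure the gap between predicted probabilities and true frequencies is the Expected Calibration Error. The sketch below shows the standard equal-width-binning ECE estimator of Guo et al. (2017); the function name, the choice of 15 bins, and the NumPy implementation are illustrative assumptions, not code from this repository.

```python
import numpy as np

def expected_calibration_error(confidences, predictions, labels, n_bins=15):
    """Equal-width-binning ECE estimator (Guo et al., 2017).

    confidences: max softmax probability per sample, shape (N,)
    predictions: predicted class per sample, shape (N,)
    labels:      true class per sample, shape (N,)
    """
    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    correct = (predictions == labels).astype(float)
    ece = 0.0
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)  # right-closed bins
        if in_bin.any():
            acc = correct[in_bin].mean()       # empirical accuracy in the bin
            conf = confidences[in_bin].mean()  # mean predicted confidence
            ece += in_bin.mean() * abs(acc - conf)  # weight by bin mass |B|/N
    return ece
```

In the CIFAR-10 setting described below, the confidences would be the maxima of the softmax outputs over the ten classes, computed on the validation set.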

Experiments were conducted with a ResNet-18 model trained on the CIFAR-10 dataset. Various loss functions and additional terms were evaluated for their impact on model calibration, measured by the expected calibration error (ECE). The results indicate that specific modifications can improve calibration without significantly compromising accuracy, as evidenced by reduced ECE on the validation set.
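The abstract does not specify which additional loss terms were used, so the sketch below shows only one widely used calibration-oriented modification: cross-entropy with a confidence penalty (an entropy bonus, in the spirit of Pereyra et al., 2017). The function name and the weight `beta` are hypothetical; the terms actually studied here may differ.

```python
import torch.nn.functional as F

def ce_with_confidence_penalty(logits, targets, beta=0.1):
    """Cross-entropy minus beta * predictive entropy (confidence penalty).

    Rewarding high predictive entropy discourages over-confident softmax
    outputs, which typically lowers ECE at a small cost in accuracy.
    beta is a hypothetical weight, not a value used in this study.
    """
    ce = F.cross_entropy(logits, targets)
    probs = F.softmax(logits, dim=-1)
    log_probs = F.log_softmax(logits, dim=-1)
    entropy = -(probs * log_probs).sum(dim=-1).mean()
    return ce - beta * entropy  # subtracting entropy rewards uncertainty
```

Post hoc temperature scaling fitted on a validation set is another standard baseline when retraining with a modified loss is not desired.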

This research aims to contribute to the theoretical understanding of uncertainty in deep learning and to propose directions for improving uncertainty quantification, particularly in areas where accurate risk assessment is critical.

Research publications

Presentations at conferences on the topic of research

Software modules developed as part of the study
