
[ENH] Kolmogorov Arnold Block for NBeats network #1741

Open
benHeid opened this issue Dec 26, 2024 · 5 comments
Labels: enhancement, feature request
benHeid (Collaborator) commented Dec 26, 2024

Is your feature request related to a problem? Please describe.
The following paper implements a KAN block in NBEats.

https://arxiv.org/pdf/2412.17853

Describe the solution you'd like
Implement interchangeable blocks for NBeats.

Check whether the same approach can also be used for NHiTS.

benHeid added the enhancement and feature request labels on Dec 26, 2024
Sohaib-Ahmed21 (Contributor) commented:

@benHeid @fkiraly I want to work on this issue. Kindly assign me.

@fkiraly fkiraly changed the title Add Kolmogorov Arnold Block for NBeats network [ENH] Kolmogorov Arnold Block for NBeats network Jan 5, 2025
Sohaib-Ahmed21 (Contributor) commented Jan 9, 2025

> https://arxiv.org/pdf/2412.17853

@benHeid The paper utilizes Domain-Adversarial Training of Neural Networks (DANN) to learn domain-invariant features. Should DANN be incorporated into the implementation, or is it sufficient to simply switch between MLP and KAN blocks? Since DANN seems to operate at a higher level and can be integrated into any model, it might be better suited for the base design.

Currently, my approach involves using a flag (parameter) in NBeats: when the flag is set to True, KAN blocks are used; otherwise, MLP blocks are used. This makes the two block types interchangeable.

Would appreciate your thoughts or suggestions on this approach.
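A minimal sketch of the flag-based block selection described above, using plain Python stand-ins. All names here (`MLPBlock`, `KANBlock`, `make_block`, `use_kan`) are illustrative assumptions, not the actual sktime or N-BEATS API:

```python
# Hypothetical sketch: selecting the block type of an N-BEATS-style stack
# via a single constructor flag. Class and function names are illustrative.

class MLPBlock:
    """Stand-in for the classic fully-connected N-BEATS block."""
    def __init__(self, units: int):
        self.units = units

    def kind(self) -> str:
        return "mlp"


class KANBlock:
    """Stand-in for a Kolmogorov-Arnold block (learnable spline activations)."""
    def __init__(self, units: int):
        self.units = units

    def kind(self) -> str:
        return "kan"


def make_block(units: int, use_kan: bool = False):
    """Choose the block implementation from one flag, so the rest of the
    stack-building code stays unchanged regardless of block type."""
    return KANBlock(units) if use_kan else MLPBlock(units)


# Building a stack of three blocks with the flag set:
stack = [make_block(64, use_kan=True) for _ in range(3)]
print([b.kind() for b in stack])  # ['kan', 'kan', 'kan']
```

The point of the pattern is that only the block factory inspects the flag; the stacking, residual, and forecast/backcast logic would be written once against the shared block interface.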

benHeid (Collaborator, Author) commented Jan 9, 2025

I would focus only on implementing KAN blocks.

The paper seems to have two contributions: combining NBEATS with KAN blocks, and then also the DANN for zero-shot forecasting, if I understand it correctly.

Sohaib-Ahmed21 (Contributor) commented Jan 9, 2025

> I would focus only on implementing KAN blocks.
>
> The paper seems to have two contributions, combining NBEATS with KAN blocks and then also the DANN for zero-shot forecasting, if I understand it correctly.

Thanks for your input.

By the way, what are your thoughts on having DANN in sktime in the future for zero-shot forecasting tasks where training involves a primary domain and one or more secondary domains?

benHeid (Collaborator, Author) commented Jan 10, 2025

To be honest, I am not really familiar with DANN, and currently I am not that convinced by it, especially considering that foundation models achieve something similar without using DANN. Furthermore, I fear that implementing it properly might be more complicated. However, if you would like to implement it and have an idea of how it would look from an architectural perspective, you are welcome to do so. Before starting the implementation, though, I suppose we need to discuss the API design for it.

What are your thoughts with regard to DANN?
