From 5d0fb1502907c7624e28a73076fbc93de43b24b3 Mon Sep 17 00:00:00 2001
From: "qiangliu.7@outlook.com"
Date: Wed, 4 Sep 2024 09:37:45 +0200
Subject: [PATCH] Fix typo in README

---
 README.md     | 2 +-
 docs/index.md | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 86bbd46..935ee42 100644
--- a/README.md
+++ b/README.md
@@ -36,7 +36,7 @@ Then the dot product between $\boldsymbol{g}_{ConFIG}$ and each loss-specific gr
 
 * **Is the ConFIG computationally expensive?**
 
-​ Like many other gradient-based methods, ConFIG needs to calculate each loss's gradient in every optimization iteration, which could be computationally expensive when the number of losses increases. However, we also introduce a **momentum-based method** where we can reduce the computational cost **close to or even lower than a standard optimization procedure** with a slight degeneration in accuracy. This momentum-based method is also applied to another gradient-based method.
+​ Like many other gradient-based methods, ConFIG needs to calculate each loss's gradient in every optimization iteration, which could be computationally expensive when the number of losses increases. However, we also introduce a **momentum-based method** where we can reduce the computational cost **close to or even lower than a standard optimization procedure** with a slight degeneration in accuracy. This momentum-based method is also applicable to other gradient-based methods.
 
 ## Paper Info
diff --git a/docs/index.md b/docs/index.md
index e610f93..b36993a 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -41,7 +41,7 @@ Then the dot product between $\mathbf{g}_{ConFIG}$ and each loss-specific gradie
 
 * **Is the ConFIG computationally expensive?**
 
-​ Like many other gradient-based methods, ConFIG needs to calculate each loss's gradient in every optimization iteration, which could be computationally expensive when the number of losses increases. However, we also introduce a **momentum-based method** where we can reduce the computational cost **close to or even lower than a standard optimization procedure** with a slight degeneration in accuracy. This momentum-based method is also applied to another gradient-based method.
+​ Like many other gradient-based methods, ConFIG needs to calculate each loss's gradient in every optimization iteration, which could be computationally expensive when the number of losses increases. However, we also introduce a **momentum-based method** where we can reduce the computational cost **close to or even lower than a standard optimization procedure** with a slight degeneration in accuracy. This momentum-based method is also applicable to other gradient-based methods.
 
-- 
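(Editor's note, outside the patch proper.) The README text changed above states that gradient-based methods like ConFIG must compute one gradient per loss in every optimization iteration, so the cost grows with the number of losses. The following is a minimal sketch of that cost structure only — the loss functions and the naive averaging step are hypothetical stand-ins, not the ConFIG operator or the library's API.

```python
import numpy as np

# Two toy quadratic losses with analytic gradients; stand-ins for the k
# losses a gradient-based method must differentiate each iteration.
def loss_grads(x):
    a = np.array([1.0, 0.0])   # hypothetical target of loss 1
    b = np.array([-1.0, 0.0])  # hypothetical target of loss 2
    g1 = 2 * (x - a)           # one gradient evaluation for loss 1
    g2 = 2 * (x - b)           # a second, separate evaluation for loss 2
    return [g1, g2]

def step(x, lr=0.1):
    grads = loss_grads(x)            # k gradient evaluations per iteration:
                                     # this is the cost the README refers to
    update = sum(grads) / len(grads) # naive average; ConFIG instead combines
                                     # these gradients into a conflict-free
                                     # direction (not implemented here)
    return x - lr * update

x = np.array([0.0, 2.0])
for _ in range(100):
    x = step(x)
# For these symmetric toy losses the averaged-gradient iteration
# converges to the midpoint (0, 0).
```

The momentum-based variant mentioned in the text reduces this per-iteration cost by evaluating only a subset of the loss gradients each step and reusing momentum estimates for the rest; that bookkeeping is not shown in this sketch.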