Replies: 4 comments
-
Hi, we plan on adding more tutorials on how to use […] (see Line 50 in 238fcb4). As you can see, you need the primal/dual variables and the matrices. Nonetheless, I think the most straightforward workaround would be the following: solve your problem on CPU with the original OSQP implementation and sparse arrays, retrieve the primal and dual variables, and finally take advantage of the fact that the problem is convex. Indeed, once you have the primal/dual variables, you can populate the […]. Note that if your sparse matrices have some appealing structure, you may want to use a matvec […]. Also, a year ago I started writing a wrapper for sparse operators in my own repo: https://github.com/Algue-Rythme/jaxopt/blob/38ea8faa7dd4c02c65bcb538673cb0f0db3f7128/jaxopt/_src/linear_operator.py#L59. Maybe you can take a look at it.
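The convexity point can be made concrete without any particular solver: once the primal/dual variables are known (e.g. from the sparse CPU OSQP), the KKT conditions pin down the solution, and derivatives follow from implicitly differentiating the KKT system rather than unrolling solver iterations. Here is a minimal NumPy sketch for the equality-constrained case (the matrices are made up for illustration; the general box-constrained case additionally needs the active set, which this sketch ignores):

```python
import numpy as np

# Equality-constrained QP: min 1/2 x'Qx + c'x  s.t.  Ax = b.
# KKT system: [Q A'; A 0] [x; y] = [-c; b], with dual variables y.
Q = np.array([[4.0, 1.0], [1.0, 2.0]])
c = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

n, m = Q.shape[0], A.shape[0]
K = np.block([[Q, A.T], [A, np.zeros((m, m))]])
sol = np.linalg.solve(K, np.concatenate([-c, b]))
x, y = sol[:n], sol[n:]          # primal/dual pair, e.g. as returned by OSQP

# Implicit differentiation: differentiating K @ [x; y] = [-c; b] wrt c
# gives K @ d[x; y]/dc = [-I; 0], so dx/dc is a block of -K^{-1}.
J = np.linalg.solve(K, -np.vstack([np.eye(n), np.zeros((m, n))]))
dx_dc = J[:n, :]                 # Jacobian of the primal solution wrt c
```

The same linear system gives sensitivities with respect to `b` (and, with a bit more algebra, `Q` and `A`), which is exactly what an implicit-differentiation wrapper computes behind the scenes.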
-
Thank you! I went ahead and made a notebook to demonstrate. It seems that your warm-start suggestion is the winner on performance. Would you mind taking a look to let me know if I did everything correctly? https://colab.research.google.com/drive/1p-ICCI3MafF7lHZa34LlJtukt4o_ufLn#scrollTo=DHCNLs3mu3_l
-
Hi, you almost got it right! I would replace […] (see Line 245 in 238fcb4). Also, there is no need to do […]. Finally, note that you shouldn't be able to differentiate w.r.t. […].
-
I'm finding that for my particular use case, the original OSQP library is much faster on the CPU, since it uses sparse arrays, while your implementation of the OSQP algorithm in JAX is slower because it uses dense arrays.
I'm struggling to use `custom_root`, though. How can I implicitly differentiate through the OSQP library's QP solver with JAXopt?