
Commit

new
AtlasWang committed Sep 16, 2024
1 parent c654f01 commit 5fcef97
Showing 4 changed files with 17 additions and 9 deletions.
index.html (8 changes: 6 additions, 2 deletions)
@@ -179,9 +179,14 @@ <h2>News</h2>
<p>
</ul>

<b style="color:rgb(68, 68, 68)">[Sep. 2024]</b>
<ul style="margin-bottom:5px">
<li> 1 IEEE Trans. PAMI (symbolic visual RL) accepted</li>
</ul>

<b style="color:rgb(68, 68, 68)">[Aug. 2024]</b>
<ul style="margin-bottom:5px">
-<li> We are grateful to receive the Best Paper Finalist Award from VLDB 2024 <a href="https://llm-pbe.github.io/LLM-PBE.pdf">[Paper]</a></li>
+<li> We are grateful to receive the Best Paper Finalist Award from VLDB 2024 <a href="https://www.vldb.org/pvldb/vol17/p3201-li.pdf">[Paper]</a></li>
<li> Dr. Wang is grateful to receive the <a href="https://h2o.ai/ai-100/winners/">AI 100: The Top AI Thought Leaders</a> award, presented by H2O.ai </li>
<li> 1 JMLR (pruning provably improves generalization) accepted</li>
<li> 1 JMLR (tighter theoretical analysis of sparse activation) accepted</li>
@@ -192,7 +197,6 @@ <h2>News</h2>
<ul style="margin-bottom:5px">
<li> 3 ECCV'24 (Few-shot 3DGS + DreamScene360 + VersatileGaussian) accepted</li>
<li> 1 npj Digital Medicine (longitudinal medical imaging) accepted</li>
-<li> 1 IEEE JSTSP (factorized sparse fine-tuning) accepted</li>
</ul>

<b style="color:rgb(68, 68, 68)">[Jun. 2024]</b>
prospective_students.html (2 changes: 1 addition, 1 deletion)
@@ -175,7 +175,7 @@ <h2>For Prospective Students</h2>
<ul style="margin-bottom:5px">
<li>We have a very flat management structure with minimal communication overhead (“Talk is cheap. Show me the code/math”). Every student works directly and closely with Dr. Wang. The group also benefits a lot from its highly interactive, intimate, and helpful culture.</li>
<li>We provide extraordinarily strong support to students for internship, visiting, collaboration, networking, and scholarship opportunities. VITA Ph.D. students have in total won seven prestigious fellowships (NSF GRFP, IBM, Apple, Adobe, Amazon, Qualcomm, and Snap), among many other honors. You are welcome to check their rich experiences. </li>
-<li>Our students are highly popular among top employers, for both internship and full-time opportunities. During Ph.D. time, almost everyone spends considerable time researching with Google, Facebook, Microsoft, Amazon, Adobe, NVIDIA, and our many other industry partners, for every year. </li>
+<li>Our students are highly popular among top employers for both internship and full-time opportunities. During their Ph.D., almost everyone spends considerable time every year researching with Google, Meta, Amazon, Apple, NVIDIA, Microsoft, and our many other industry partners. </li>
</ul>
</div>
</div>
publication.html (3 changes: 2 additions, 1 deletion)
@@ -159,6 +159,7 @@ <h2>Journal Paper</h2>
<div class="trend-entry d-flex">
<div class="trend-contents">
<ul>
+<li>W. Zheng*, S. Sharan*, Z. Fan*, K. Wang*, Y. Xi*, and Z. Wang<br> <b style="color:rgb(71, 71, 71)">“Symbolic Visual Reinforcement Learning: A Scalable Framework with Object-Level Abstraction and Differentiable Expression Search”</b><br>IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2024. <a href="https://arxiv.org/abs/2212.14849">[Paper]</a> <a href="https://github.com/VITA-Group/DiffSES">[Code]</a></li>
<li> H. Yang*, Y. Liang, X. Guo, L. Wu, and Z. Wang<br> <b style="color:rgb(71, 71, 71)">“Pruning Before Training May Improve Generalization, Provably”</b><br> Journal of Machine Learning Research (JMLR), 2024. <a href="">[Paper]</a> <a href="">[Code]</a></li>
<li> H. Yang*, Z. Jiang*, R. Zhang, Y. Liang, and Z. Wang<br> <b style="color:rgb(71, 71, 71)">“Neural Networks with Sparse Activation Induced by Large Bias: Tighter Analysis with Bias-Generalized NTK”</b><br> Journal of Machine Learning Research (JMLR), 2024. <a href="">[Paper]</a> <a href="">[Code]</a></li>
<li> G. Holste*, M. Lin, R. Zhou, F. Wang, L. Liu, Q. Yan, S. Tassel, K. Kovacs, E. Chew, Z. Lu, Z. Wang, and Y. Peng<br> <b style="color:rgb(71, 71, 71)">“Harnessing the power of longitudinal medical imaging for eye disease prognosis using Transformer-based sequence modeling”</b><br>npj Digital Medicine, 2024. <a href="https://www.nature.com/articles/s41746-024-01207-4">[Paper]</a> <a href="">[Code]</a></li>
@@ -207,7 +208,7 @@ <h2>Conference Paper</h2>
<li>Z. Zhu*, Z. Fan*, Y. Jiang*, and Z. Wang<br> <b style="color:rgb(71, 71, 71)">“FSGS: Real-Time Few-shot View Synthesis using Gaussian Splatting”</b><br>European Conference on Computer Vision (ECCV), 2024. <a href="https://arxiv.org/abs/2312.00451">[Paper]</a> <a href="https://github.com/VITA-Group/FSGS">[Code] </a> </li>
<li>S. Zhou, Z. Fan*, D. Xu*, H. Chang, P. Chari, T. Bharadwaj, S. You, Z. Wang, and A. Kadambi<br> <b style="color:rgb(71, 71, 71)">“DreamScene360: Unconstrained Text-to-3D Scene Generation with Panoramic Gaussian Splatting”</b><br>European Conference on Computer Vision (ECCV), 2024. <a href="https://arxiv.org/abs/2404.06903">[Paper]</a> <a href="https://dreamscene360.github.io/">[Code] </a> </li>
<li>R. Li, Z. Fan*, B. Wang, P. Wang*, Z. Wang, and X. Wu<br> <b style="color:rgb(71, 71, 71)">“VersatileGaussian: Real-time Neural Rendering for Versatile Tasks using Gaussian Splatting”</b><br>European Conference on Computer Vision (ECCV), 2024. <a href="">[Paper]</a> <a href="">[Code] </a> </li>
-<li>Q. Li, J. Hong*, C. Xie, J. Tan, R. Xin, J. Hou, X. Yin, Z. Wang, D. Hendrycks, Z. Wang, B. Li, B. He, and D. Song<br> <b style="color:rgb(71, 71, 71)">“LLM-PBE: Assessing Data Privacy in Large Language Models”</b><br>International Conference on Very Large Data Bases (VLDB), 2024. (Best Paper Finalist) <a href="https://llm-pbe.github.io/LLM-PBE.pdf">[Paper]</a> <a href="https://llm-pbe.github.io/home">[Code] </a> </li>
+<li>Q. Li, J. Hong*, C. Xie, J. Tan, R. Xin, J. Hou, X. Yin, Z. Wang, D. Hendrycks, Z. Wang, B. Li, B. He, and D. Song<br> <b style="color:rgb(71, 71, 71)">“LLM-PBE: Assessing Data Privacy in Large Language Models”</b><br>International Conference on Very Large Data Bases (VLDB), 2024. (Best Paper Finalist) <a href="https://www.vldb.org/pvldb/vol17/p3201-li.pdf">[Paper]</a> <a href="https://llm-pbe.github.io/home">[Code] </a> </li>
<li>L. Sun*, N. Bhatt*, J. Liu*, Z. Fan*, Z. Wang, T. Humphreys, and U. Topcu<br> <b style="color:rgb(71, 71, 71)">“MM3DGS SLAM: Multi-modal 3D Gaussian Splatting for SLAM Using Vision, Depth, and Inertial Measurements”</b><br>IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2024. <a href="https://arxiv.org/abs/2404.00923">[Paper]</a> <a href="https://github.com/VITA-Group/MM3DGS-SLAM">[Code] </a> </li>
<li>R. Cai*, S. Muralidharan, G. Heinrich, H. Yin, Z. Wang, J. Kautz, and P. Molchanov<br> <b style="color:rgb(71, 71, 71)">“Flextron: Many-in-One Flexible Large Language Model”</b><br>International Conference on Machine Learning (ICML), 2024. (Oral) <a href="https://openreview.net/pdf?id=9vKRhnflAs">[Paper]</a> <a href="">[Code] </a> </li>
<li>R. Cai*, Y. Tian, Z. Wang, and B. Chen<br> <b style="color:rgb(71, 71, 71)">“LoCoCo: Dropping In Convolutions for Long Context Compression”</b><br>International Conference on Machine Learning (ICML), 2024. <a href="https://arxiv.org/abs/2406.05317">[Paper]</a> <a href="https://github.com/VITA-Group/LoCoCo">[Code] </a> </li>
