Sungbin Shin


Hi, I am a Ph.D. student in computer science and engineering at POSTECH, advised by Prof. Namhoon Lee.

I am broadly interested in large-scale optimization for foundation models. Examples include collaboratively training billion-scale models over the internet and compressing large models for efficiency.

Contact me at sungbin.shin@postech.ac.kr if you have any questions.

news

Feb 2026 Our paper on asynchronous pipeline parallelism has been accepted at ICML 2026! See you in Seoul 🇰🇷!
Feb 2026 Check out our new paper on asynchronous pipeline parallelism.
Nov 2025 I began serving as a reviewer for TMLR.
May 2025 Our paper on sharpness-aware minimization has been accepted at UAI 2025!
Sep 2024 Our paper on LLM pruning has been accepted at EMNLP 2024!

selected publications

2026

  1. Mitigating Staleness in Asynchronous Pipeline Parallelism via Basis Rotation
    Hyunji Jung*, Sungbin Shin*, Namhoon Lee
    ICML, 2026

2025

  1. Critical Influence of Overparameterization on Sharpness-aware Minimization
    Sungbin Shin*, Dongyeop Lee*, Maksym Andriushchenko, Namhoon Lee
    UAI, 2025; ICML Workshop on High-dimensional Learning Dynamics, 2023
    Best paper award at JKAIA 2023

2024

  1. Rethinking Pruning Large Language Models: Benefits and Pitfalls of Reconstruction Error Minimization
    Sungbin Shin, Wonpyo Park, Jaeho Lee, Namhoon Lee
    EMNLP, 2024

2023

  1. A Closer Look at the Intervention Procedure of Concept Bottleneck Models
    Sungbin Shin, Yohan Jo, Sungsoo Ahn, Namhoon Lee
    ICML, 2023; NeurIPS Workshop on TSRML, 2022