Hello, I am a Ph.D. student in the Machine Learning and Intelligence Lab (MLILAB) at KAIST, advised by Prof. Eunho Yang.

My research focuses on enhancing the efficiency of foundation models, particularly auto-regressive generative models. I aim to improve inference-time efficiency by optimizing memory usage and reducing latency, leveraging techniques such as speculative decoding and knowledge distillation.

Additionally, I am interested in advancing the efficiency and effectiveness of Large Reasoning Models (LRMs) on complex reasoning tasks. Specifically, my goal is to develop methods that significantly accelerate and/or enhance reasoning models at both training and inference time.

Publications

  • LANTERN: Accelerating Visual Autoregressive Models with Relaxed Speculative Decoding [paper]
    Doohyuk Jang*, Sihwan Park*, June Yong Yang, Yeonsung Jung, Jihun Yun, Souvik Kundu, Sungyub Kim, Eunho Yang
    ICLR 2025

  • PromptKD: Distilling Student-Friendly Knowledge for Generative Language Models via Prompt Tuning [paper]
    Gyeongman Kim, Doohyuk Jang, Eunho Yang
    Findings of EMNLP 2024

  • SeamsTalk: Seamless Talking Face Generation via Flow-Guided Inpainting [paper]
    Yeongho Jeong, Gyeongman Kim, Doohyuk Jang, Jaeryong Hwang, Eunho Yang
    IEEE Access 2024

  • Reasoning Model is Stubborn: Diagnosing Instruction Overriding in Reasoning Models [paper] [project page]
    Doohyuk Jang*, Yoonjeon Kim*, Chanjae Park, Hyun Ryu, Eunho Yang
    Preprint

  • Med-PerSAM: One-Shot Visual Prompt Tuning for Personalized Segment Anything Model in Medical Domain [paper]
    Hangyul Yoon, Doohyuk Jang, Jungeun Kim, Eunho Yang
    Preprint

Education

  • Ph.D. in the Graduate School of AI, Korea Advanced Institute of Science and Technology (KAIST), Sep. 2024 - Present

  • M.S. in the Graduate School of AI, Korea Advanced Institute of Science and Technology (KAIST), Mar. 2023 - Aug. 2024

  • B.S. in Electrical Engineering and Computer Science, Korea Advanced Institute of Science and Technology (KAIST), Mar. 2018 - Feb. 2023

Work Experiences

  • Intern, Synopsys Korea, Gyeonggi, South Korea, Mar. 2021 - Aug. 2021

Projects

  • Developing a conversational language model for virtual doctors, AITRICS, Apr. 2024 - May 2024

Academic Services

  • Workshop Reviewer
    • SCOPE@ICLR 2025

Teaching Experience

  • Teaching Assistant, Machine Learning for AI (AI501), KAIST
  • Teaching Assistant, Advanced Machine Learning for AI (AI601), KAIST