About

Lei Zhang is a tenure-track assistant professor in the Department of Informatics at New Jersey Institute of Technology (NJIT), starting Spring 2025. He was most recently a Postdoctoral Scholar in the Computer Science Department at Princeton University, working with Professor Andrés Monroy-Hernández. He completed his Ph.D. at the University of Michigan, where he was advised by Professors Steve Oney and Anhong Guo. His research lies in the area of Human-Computer Interaction (HCI), with a primary focus on designing and building creativity support tools for Augmented Reality (AR) and Virtual Reality (VR). His work leverages emerging technologies such as AR/VR and machine intelligence to create compelling user experiences.

Lei’s work has been published at top-tier HCI venues including CHI, UIST, and CSCW. His first-authored papers have received a Best Paper Award at CSCW 2022 and a Best Short Paper Award at VL/HCC 2019. Lei interned at Snap Research twice (Summer 2021 & 2022), working with Dr. Andrés Monroy-Hernández, Dr. Rajan Vaish, and Dr. Fannie Liu. He holds a BE in Software Engineering from Shanghai Jiao Tong University. Outside of work, Lei is passionate about music production, skateboarding, and 35mm film photography.

📢 Lei is starting a new research lab and recruiting students (PhD, Master's, and undergraduate) to work on exciting research projects! Please email him if you are interested in getting involved in research.

📩 lei.zhang[at]njit.edu

Publications


Empowering Children to Create AI-Enabled Augmented Reality Experiences

Lei Zhang, Shuyao Zhou, Amna Liaqat, Tinney Mak, Brian Berengard, Emily Qian, Andrés Monroy-Hernández.
UIST 2025, to appear
We introduce Capybara, an AR-based and AI-powered visual programming environment that empowers children to create, customize, and program 3D characters overlaid onto the physical world. Capybara enables children to create virtual characters and accessories using text-to-3D generative AI models, and to animate these characters through auto-rigging and body tracking. In addition, our system employs vision-based AI models to recognize physical objects, allowing children to program interactive behaviors between virtual characters and their physical surroundings.

VRCopilot: Authoring 3D Layouts with Generative AI Models in VR

Lei Zhang, Jin Pan, Jacob Gettig, Steve Oney, Anhong Guo.
UIST 2024
We introduce VRCopilot, a mixed-initiative system that integrates pre-trained generative AI models into immersive authoring, to facilitate human-AI co-creation in VR. VRCopilot presents multimodal interactions to support rapid prototyping and iterations with AI, and intermediate representations such as wireframes to augment user controllability over the created content. We evaluated manual, semi-automatic, and fully automatic creation, and found that semi-automatic creation with wireframes enhanced the creation experience and user agency compared to fully automatic approaches.

Jigsaw: Authoring Immersive Storytelling Experiences with Augmented Reality and Internet of Things

Lei Zhang, Daekun Kim, Youjean Cho, Ava Robinson, Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández.
CHI 2024
We introduce Jigsaw, a system that enables novices to both consume and create immersive stories that harness virtual and physical augmentations. Jigsaw achieves this through the novel fusion of mobile AR with off-the-shelf Internet-of-things (IoT) devices. We evaluated the consumption and creation of immersive stories through a qualitative user study with 20 participants, and found that end-users were able to create immersive stories and felt highly engaged in the playback of three stories. However, sensory overload was one of the most notable challenges across all experiences.

VRGit: A Version Control System for Collaborative Content Creation in Virtual Reality

Lei Zhang, Ashutosh Agrawal, Steve Oney, Anhong Guo.
CHI 2023
We introduce VRGit, a new version control system for collaborative content creation in VR. VRGit enables novel visualization and interactions for version control commands such as history navigation, commits, branching, previewing, and re-using. VRGit is also designed to facilitate real-time collaboration by providing awareness of users’ activities and version history through concepts of portals and shared history visualizations.

Auggie: Encouraging Effortful Communication through Handcrafted Digital Experiences

Lei Zhang*, Tianying Chen*, Olivia Seow*, Tim Chong, Sven Kratz, Yu Jiang Tham, Andrés Monroy-Hernández, Rajan Vaish, and Fannie Liu.
CSCW 2022 (🏆 Best Paper Award)
Digital communication is often brisk and automated. From auto-completed messages to “likes,” research has shown that such lightweight interactions can affect perceptions of authenticity and closeness. On the other hand, effort in relationships can forge emotional bonds by conveying a sense of caring and is essential in building and maintaining relationships. To explore effortful communication, we designed and evaluated Auggie, an iOS app that encourages partners to create digitally handcrafted Augmented Reality (AR) experiences for each other.

FlowMatic: An Immersive Authoring Tool for Creating Interactive Scenes in Virtual Reality

Lei Zhang and Steve Oney.
UIST 2020
In this paper, we introduce FlowMatic, an immersive authoring tool that raises the ceiling of expressiveness by allowing novice programmers to specify reactive behaviors in VR.

Studying the Benefits and Challenges of Immersive Dataflow Programming

Lei Zhang and Steve Oney.
VL/HCC 2019 (🏆 Best Short Paper Award)
In this paper, we study the benefits and challenges of immersive dataflow authoring, a paradigm that allows users to build VR applications using dataflow notation while immersed in the VR world.

News

07/2025: Our paper on AI-powered AR programming for children was accepted to UIST '25! 🇰🇷

04/2025: Serving on the PC for UIST '25, VL/HCC '25, and CHI '26.

12/2024: Invited talk at the Neurodiversity Workshop affinity event at NeurIPS 2024.

11/2024: Guest lectures at Princeton University and the University of Rochester.

10/2024: Presented VRCopilot at UIST '24.

08/2024: Relocated to New Jersey. 🏠

06/2024: Officially Dr. Zhang! 🎓

05/2024: Presented Jigsaw at CHI 2024. 🏄🏻‍♂️

04/2024: Joining NJIT as an Assistant Professor in Spring 2025, after a one-term postdoc in Princeton's Computer Science Department.

© Lei Zhang 2025 — Built with ❤️‍🩹