Chaehong Lee is a product-minded AI engineer based in NYC, with a background in computer vision, LLMs, and generative models. She specializes in transforming state-of-the-art research—like function-calling agents, Stable Diffusion pipelines, and edge-native vision systems—into polished, real-time products that feel intuitive and expressive to use.

Currently at Meta Reality Labs and formerly at Microsoft Mixed Reality, she combines technical depth with strong systems thinking, shipping experience, and visual design instincts. Her recent work spans sparse autoencoder interpretability, multimodal prompt interfaces, and agentic tool-calling.

Open to AI-first teams and startups—especially in NYC, SF, or remote—building next-gen tools at the intersection of usability, creativity, and machine intelligence. Also open to solutions-architect consulting engagements where strategic technical insight and user-centered design can bring research to life.

CV
Education
  • Washington University in St. Louis
    B.S. in Computer Science
    ↳ Double Major in Applied Mathematics
    ↳ Minor in Communication Design
  • Washington University in St. Louis
    M.S. in Computer Vision
    ↳ Sparse to Dense Optical Flow with Deep Neural Networks
Experience
  • Meta
    New York, NY
    Software Engineer (2025-)
  • Microsoft
    Seattle, WA
    Research Engineer (2020-2024)
last updated 06.15.25
Instagrammable Photos Anatomy 2025

Instagrammable: visually appealing in a way that is suitable for being photographed for posting on the social media application Instagram.

Shower Thought #3: Algorithmic Gaze and the Self

In losing access to Instagram, I didn’t just stop posting.
I lost a version of myself that had been quietly shaped in tandem with the platform’s algorithmic gaze.

What started as a way to connect across time zones and languages gradually became a canvas for self-definition—one optimized for visibility, not necessarily for truth. Likes, saves, who watches your stories and in what order—these weren’t just numbers. They became signals, feedback loops. A system that knew how to amplify parts of me I hadn’t fully named yet.

It didn’t just reflect who I was becoming—it bent the contours of identity.

As a developer and a user, it’s strange to admit how deep that shaping runs. We design systems to organize attention, but somewhere along the way, the system starts organizing us. To lose access is to realize: your digital self isn’t entirely yours. It’s co-authored—by machines, by UI choices, by feedback loops you didn’t agree to but still played into. And sometimes, it can vanish without warning.


It’s not always the grand intentions of a platform that change us.
Sometimes it’s a tiny UI detail—like who appears first on your Instagram story viewer list.
Not just who watched, but what order they show up in.

That ranked list turns a passive view into a social cue. People scan it. Decode it. Reinterpret it. Maybe even act on it. A feature becomes a signal. A signal becomes meaning. Meaning turns into behavior. An innocuous line of code—shipped by an IC4 engineer chasing an internal KPI bump—becomes a culture-shifting wave.

Offline, social tension lives in gaze and gesture.
Online, it lives in metrics. Numbers, positions, repetition.
And the way we move through it shifts: not entirely consciously, but noticeably.


We think we’re building profiles. But more often, we’re crafting mirages.
We crop, caption, filter, and post—thinking we’re in control of how we’re perceived.
But curation isn’t authorship. It’s compliance. A performance tuned for visibility. A hook, a thirst-trap, a pixel-perfect echo of longing.

Just like we project fantasies onto strangers offline, online we project optimized fragments of ourselves—only to attract attention that’s based on something we can’t fully sustain. And when that attention arrives, a weird dissonance sets in. We start performing not for people, but for patterns. The system’s patterns.


Maybe it started innocently.
A thoughtful post. A good bio. A pinned tweet.

We’re told personal branding is how we survive nonlinear careers. If we’re not climbing a ladder, we’re swinging from node to node on a lattice we built ourselves.

We call it voice. Identity. Ikigai. Give it a name.
But over time, voice hardens into signal.
Presence becomes performance.
And armor—that thing we wore to protect ourselves—starts to rust inward.


Observation causes perturbations.
The algorithmic gaze has shaped us—quietly, persistently—before we ever realized we were being seen. Now, I’m turning that gaze around. In hopes of creating a wave—one that resonates—I’m taking a stab at dissecting the human–AI interplay through the lens of an algorithmic generation, a capitalist baby, and a software engineer building products that bridge human and machine.