Future of design systems, Figma MCP, and contextual engineering
TJ Pitre is a front-end engineer and design systems strategist with over 20 years of experience helping product teams ship faster and more consistently. As the founder and CEO of Southleft, he’s built a consultancy trusted by teams at NASDAQ, Caterpillar, Indeed, and IBM—known for its strong culture of creativity, truth, and craftsmanship.
In this episode of our podcast How I Vibe Design, TJ dives into how AI is reshaping the way we design, prototype, and build. He shares his process for blending atomic design principles with AI-assisted workflows using tools like Figma MCP, Cursor, and his own creations—FigmaLint, Story UI, and Design Systems Assistant MCP. This article highlights our biggest takeaways from that conversation.
When I don’t know much about something and want to workshop the idea while building it, I call that solo activity a vibe coding session. The purpose is to create something quickly—driven by intuition and everyday language.
Contextual engineering, on the other hand, starts with clear requirements and specifications before any code is written. It aims for efficiency, precision, and scalability in large applications. It requires thinking like a product owner—writing user stories and defining technical architecture. In my workflow, I use tools like Claude Code to refine and validate specs before generating code.
Prompts in a vibe coding session start with feeling-based phrases like “I feel like it should do this.” The vocabulary leans on emotional or conceptual cues rather than technical details. As I explore the idea, the AI gains context from the conversation. However, the build can fail if expectations exceed the AI’s understanding.
When I’m building with AI, I shift from “building by vibe” to engineering with intent. I ensure the AI knows why a component exists—not just what it looks like. That involves multiple roles—product owner, architect, and developer—and structured documentation including statements like “As a user, I can…”.
The result? Contextual engineering significantly reduces design–development gaps, saves time and cost, and produces cleaner, more consistent code.
My process begins in Figma, where I build components with detailed properties, tokens, and descriptions explaining their purpose and intent.
I then run FigmaLint to ensure everything scores above 90, and copy the selection link for the component.
Next, I switch to Cursor, paste the link into a prompt template, and instruct the AI to use the Figma MCP to extract the design.
MCP pulls a JSON payload describing the component’s structure, styles, and tokens. The AI then generates TypeScript, CSS modules, and Storybook stories—organized into the right directories.
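To make that output concrete, here is a minimal sketch of the kind of files this step can produce, assuming a React and CSS Modules setup. The Button component, its props, and the file names are hypothetical illustrations, not actual generated code.

```tsx
// Hypothetical layout: src/components/Button/{Button.tsx, Button.module.css, Button.stories.tsx}

// Button.tsx: the component, with typed props mirroring Figma variant properties
import type { ReactNode } from 'react';
import styles from './Button.module.css';

export interface ButtonProps {
  /** Mirrors a Figma "Variant" property (hypothetical) */
  variant?: 'primary' | 'secondary';
  children: ReactNode;
  onClick?: () => void;
}

export function Button({ variant = 'primary', children, onClick }: ButtonProps) {
  return (
    <button className={`${styles.button} ${styles[variant]}`} onClick={onClick}>
      {children}
    </button>
  );
}

// Button.stories.tsx: a Storybook CSF story used for the review pass
import type { Meta, StoryObj } from '@storybook/react';
import { Button } from './Button';

const meta: Meta<typeof Button> = { title: 'Atoms/Button', component: Button };
export default meta;

export const Primary: StoryObj<typeof Button> = {
  args: { variant: 'primary', children: 'Save changes' },
};
```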
As the code is generated, I monitor it live to see how the AI parses typography, tokens, and images from Figma. Once generation finishes, I review the results in Storybook; they're typically 80–85% complete. I inspect spacing, icons, and states, then use prompts or minor manual edits to polish details.
The entire process converts what once took two to three days into a 10-minute workflow, blending automation with human oversight.
Atomic design provides a structural foundation when working with AI tools like Figma MCP and Claude Code. By breaking components into atoms, molecules, and organisms, teams can manage complexity systematically.
For example, a banner component is neither an atom (too complex) nor quite an organism (too small); in atomic design terms, it sits in the middle layer as a molecule, and it illustrates how AI can handle reusable, layered structures. Focusing AI on generating small, well-defined elements—rather than entire products—helps teams prototype faster and maintain clean, reusable code.
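As a rough sketch of that layering, a banner molecule simply composes existing atoms. The Banner, Icon, and Button names below are hypothetical and assume the same React and CSS Modules setup as the earlier example.

```tsx
import type { ReactNode } from 'react';
import styles from './Banner.module.css';
// Atoms assumed to already exist elsewhere in the system
import { Icon } from '../Icon/Icon';
import { Button } from '../Button/Button';

export interface BannerProps {
  /** Communicates why the banner exists: inform, warn, or confirm success */
  tone?: 'info' | 'warning' | 'success';
  message: ReactNode;
  actionLabel?: string;
  onAction?: () => void;
}

/** A molecule: composes the Icon and Button atoms around a short message. */
export function Banner({ tone = 'info', message, actionLabel, onAction }: BannerProps) {
  return (
    <div className={`${styles.banner} ${styles[tone]}`} role="status">
      <Icon name={tone} />
      <p className={styles.message}>{message}</p>
      {actionLabel && (
        <Button variant="secondary" onClick={onAction}>
          {actionLabel}
        </Button>
      )}
    </div>
  );
}
```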
Components make up roughly 60% of most design systems, so this approach boosts efficiency, minimizes debugging, and enables rapid iteration—without sacrificing structure.
Figma components now include rich properties, tokens, and natural-language descriptions that communicate both purpose and intent. This helps Figma MCP understand not only what an element is—but why it exists, enabling smarter code generation.
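One way that metadata can surface in generated code is as documentation and typed props. This is a hypothetical mapping for illustration, not the exact payload or naming that Figma MCP emits.

```tsx
/**
 * Banner
 * Carried over from the Figma component description (hypothetical example):
 * "Use for non-blocking status messages at the top of a page.
 *  Prefer the info tone unless the message requires action."
 */
export interface BannerProps {
  /** Mirrors a Figma variant property "Tone: Info | Warning | Success" */
  tone?: 'info' | 'warning' | 'success';
  /** Mirrors a Figma boolean property "Show action" */
  showAction?: boolean;
}
```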
I run FigmaLint on each component before handoff, aiming for a 90+ score to surface hardcoded values or missing metadata early—reducing downstream friction.
Earlier bridges included tools like Style Dictionary, Generator Lab YAML flow, and Figma Bridge—precursors to today’s MCP workflows. Each reinforced Figma’s evolving role as the upstream source of truth for design and code.
Today, a component that once took 1–3 days manually now reaches 80–85% completeness in about 10 minutes—with Figma serving as the AI-readable blueprint.
The next frontier is refinement. Soon, AI will require far less detailed prompting to produce high-quality results. Where prompts once needed exact instructions about file structures and behaviors, future models will infer those details from context.
Early models required explicit instructions. But newer ones, like Opus 4.1 and beyond, can interpret architecture and design intent automatically—enabling faster, smoother builds.
Ultimately, contextual engineering will evolve into a more autonomous, seamless process where AI deeply understands context, allowing humans to focus on intent, quality, and creative direction.
At Southleft, nothing ships without human review. Even though AI accelerates the process, each generated component is manually verified.
We monitor generation live, observe how MCP builds files, and review the result in Storybook. From there, details like spacing, padding, and icons are refined through prompts or manual edits. The AI may handle 80–85% of the work—but craftsmanship still requires human oversight.
Start with context, not code. Define requirements, user stories, and architecture before prompting. Once you know the what and the why, AI can handle the how.