Samsen Internal Framework
This is Samsen's internal framework content for Transformation. It is read-only reference material.
# Adoption Framework — Organizational Transformation to AI-Assisted Design
## The Challenge
Moving an organization from traditional design-to-development workflows to AI-assisted design is a transformation project, not a tool rollout. It touches process, skills, roles, culture, and tooling simultaneously.
The most common failure mode: buying tools without changing workflows. Teams get access to Claude Code or similar AI tools, try them for a week, find them frustrating because they don't fit existing processes, and abandon them.
## Readiness Assessment
Before adoption, assess the organization across four dimensions:
### 1. Design System Maturity
Prerequisite for AI-assisted design. Without a structured, token-based design system, AI has nothing to work with.
| Level | Description | AI Readiness |
|---|---|---|
| No system | Ad hoc styles, no shared components | Not ready — build the system first |
| Component library | Shared components, but no token architecture | Partially ready — need token alignment |
| Token-based system | Semantic tokens, Figma variables, but no code alignment | Almost ready — bridge Figma to code tokens |
| Aligned system | Figma variables ↔ CSS custom properties, documented | Ready — AI can produce reliable output |
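At the "aligned system" level, each Figma variable has a one-to-one counterpart as a CSS custom property. A minimal sketch of what that alignment looks like in code; the token names below are illustrative assumptions, not Samsen's actual tokens:

```css
/* Illustrative token file: CSS custom properties mirroring Figma variable
   names. Names and values are hypothetical examples. */
:root {
  /* Figma variable: color/surface/primary */
  --color-surface-primary: #1a1a2e;
  /* Figma variable: space/md */
  --space-md: 16px;
}

/* Components reference tokens, never raw values */
.card {
  background: var(--color-surface-primary);
  padding: var(--space-md);
}
```

Because the names match on both sides, AI tooling can translate a Figma design into code by token name rather than by guessing at raw values.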
### 2. Team Skill Composition
| Role | What They Need | Training Required |
|---|---|---|
| Product Designers | Understanding of token architecture, ability to describe implementation in natural language, basic git literacy | Samsen Academy (6 sessions) |
| Design System Team | MCP configuration, AI context preparation, quality guardrail setup | Toolchain workshop |
| Developers | Understanding of AI-generated PRs, review practices, CI integration | Process orientation |
| Design Leadership | Metrics framework, workflow redesign, team structure implications | Strategic briefing |
### 3. Process Compatibility
Current processes that help adoption:

- Design reviews (can extend to include AI-generated output review)
- PR-based code review (AI output fits naturally)
- Component-driven development (aligns with AI's component-level strength)
- Design system governance (provides the rules AI needs)
Current processes that hinder adoption:

- Heavy handoff documentation (AI replaces most of this)
- Pixel-perfect QA against static designs (shift to browser-based review)
- Siloed design and development teams (the workflow blurs the boundary)
- Waterfall delivery (AI-assisted design is inherently iterative)
### 4. Tooling Infrastructure
Required:

- [ ] Figma with Variables enabled
- [ ] Git-based codebase
- [ ] CI/CD pipeline
- [ ] Claude Code access (or equivalent)
Recommended:

- [ ] Storybook for component isolation
- [ ] Chromatic or Percy for visual regression testing
- [ ] MCP server configuration (Figma Console MCP + Design Systems MCP)
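A project-level `.mcp.json` for Claude Code might look like the sketch below. The shape (`mcpServers`, `command`, `args`) follows Claude Code's MCP configuration format, but the angle-bracketed package names are placeholders — the actual install commands depend on how the Figma Console MCP and Design Systems MCP servers are distributed:

```json
{
  "mcpServers": {
    "figma-console": {
      "command": "npx",
      "args": ["-y", "<figma-console-mcp-package>"]
    },
    "design-systems": {
      "command": "npx",
      "args": ["-y", "<design-systems-mcp-package>"]
    }
  }
}
```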
## Adoption Phases
### Phase 1: Foundation (2-4 weeks)
Goal: Prepare the environment and prove the concept.
Activities:

- Audit design system for token alignment
- Set up CLAUDE.md and MCP configurations
- Identify 2-3 low-risk pilot components
- Train one designer through the full workflow end-to-end
- Ship the pilot components to production
Success criteria:

- One designer has shipped a component via Claude Code
- The component passed code review and visual QA
- The process is documented
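A starting CLAUDE.md for Phase 1 might look like the sketch below; the paths, rules, and component names are illustrative assumptions, not Samsen's actual conventions:

```markdown
# CLAUDE.md (sketch — contents are illustrative examples)

## Design system rules
- Use design tokens only; never hardcode colors or spacing.
- Tokens live in `src/styles/tokens.css` and mirror Figma variable names.

## Component conventions
- One component per directory under `src/components/`, each with a
  Storybook story.
- Match the existing Button and Card implementations for structure
  and naming.

## Workflow
- Branch from `main`, open a PR, and wait for visual regression
  results before requesting review.
```

Keeping this file short and concrete matters more than completeness; it evolves in Phase 3 as the team learns what context the AI actually needs.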
### Phase 2: Expansion (4-8 weeks)
Goal: Extend to the full design team.
Activities:

- Run Samsen Academy training (6 sessions) for all product designers
- Establish the PR review workflow for AI-generated components
- Set up visual regression testing
- Create context documentation (CLAUDE.md, component specs, token reference)
- Identify and address process conflicts
Success criteria:

- All designers have shipped at least one component
- A standard workflow is documented and followed
- Visual regression testing catches unintended changes
### Phase 3: Integration (8-16 weeks)
Goal: AI-assisted design becomes the default workflow.
Activities:

- Integrate AI-assisted maintenance into design system governance
- Establish quality guardrails (FigmaLint, automated token validation)
- Shift design reviews to include browser-based component review
- Begin measuring efficiency gains (cycle time, throughput)
- Evolve the CLAUDE.md and context documentation based on team experience
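The automated token validation guardrail can be sketched as a small lint script. This is a hypothetical illustration (not FigmaLint itself): it flags CSS declarations that use hardcoded hex colors or pixel values instead of a `var(--…)` token reference, and reports overall compliance:

```python
import re

# Hypothetical guardrail sketch: raw values that should come from tokens.
HARDCODED = re.compile(r"#[0-9a-fA-F]{3,8}\b|\b\d+px\b")
TOKEN = re.compile(r"var\(--[\w-]+\)")


def token_violations(css: str) -> list[str]:
    """Return declarations containing raw values instead of token references."""
    violations = []
    for line in css.splitlines():
        if HARDCODED.search(line) and not TOKEN.search(line):
            violations.append(line.strip())
    return violations


def token_compliance(css: str) -> float:
    """Share of declarations (lines containing ':') free of hardcoded values."""
    decls = [l for l in css.splitlines() if ":" in l]
    if not decls:
        return 1.0
    clean = sum(
        1 for l in decls if not (HARDCODED.search(l) and not TOKEN.search(l))
    )
    return clean / len(decls)
```

Run in CI against the component stylesheets, a non-empty `token_violations` list fails the build; `token_compliance` feeds the token compliance metric below.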
Success criteria:

- Design-to-implementation cycle time has measurably decreased
- AI-generated components are indistinguishable from human-written ones
- The team defaults to the AI-assisted workflow for new components
## Measuring Impact
### Efficiency Metrics
- Cycle time — Time from design approval to merged PR. Target: 70-80% reduction.
- Throughput — Components shipped per sprint. Target: 2-3x increase.
- Design system update propagation — Time to propagate a token change across all components. Target: hours instead of days.
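As a worked example of the cycle-time target, using assumed numbers rather than Samsen data:

```python
def pct_reduction(before: float, after: float) -> float:
    """Percentage reduction of a metric relative to its baseline."""
    return (before - after) / before * 100


# Assumed figures: cycle time measured at 10 working days before
# adoption and 2 working days after.
print(pct_reduction(10, 2))  # 80.0 — within the 70-80% target band
```

The same function applies to throughput or propagation time; the point is to capture the baseline before adoption so the reduction is computable at all.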
### Quality Metrics
- Visual regression rate — Percentage of PRs with unintended visual changes. Should stay flat or decrease.
- Defect rate — Bugs in AI-generated components vs. human-written. Should be comparable.
- Token compliance — Percentage of components using design system tokens (no hardcoded values). Should increase.
### Team Satisfaction
- Designer time allocation — Percentage of time on creative vs. operational work. Target: shift from 40/60 to 70/30.
- Confidence — Designers' self-reported confidence in shipping quality. Should increase over time.
## Common Adoption Challenges
### "AI-generated code isn't good enough"
Usually means insufficient context, not insufficient AI. Improve CLAUDE.md, add component specs, ensure token alignment.
### "Designers don't want to learn git"
They don't need to learn git deeply; they need to understand four operations: branch, commit, PR, merge. Claude Code handles the mechanics. Frame it as "publishing your work," not "programming."
### "Developers feel threatened"
Reframe: AI handles the UI component layer. Developers focus on application logic, data architecture, performance — the work that was always the harder and more valuable part. Their role shifts from "implementing designs" to "architecting systems."
### "We can't measure the ROI"
Start measuring before adoption: current cycle times, current throughput, current time allocation. Then measure the same things after. The data makes the case.