15 Real Card Sorting Examples from Successful UX Projects
Card sorting is powerful, but understanding how to apply it can be challenging. These real-world examples show exactly how companies use card sorting to improve their products.
Quick Example Overview
| Example | Industry | Study Type | Cards | Result |
|---|---|---|---|---|
| E-commerce Navigation | Retail | Open | 35 products | New menu structure, 40% increase in discoverability |
| Mobile Banking App | Finance | Hybrid | 28 features | Simplified from 6 tabs to 4 |
| SaaS Dashboard | B2B Tech | Open | 42 features | User-friendly feature organization |
| Help Center | Support | Open | 30 articles | 30% reduction in support tickets |
| Content Platform | Media | Closed | 50 articles | Validated new taxonomy |
Example 1: E-commerce Site Navigation
The Challenge
An online clothing retailer had a cluttered navigation menu with 35 product categories. Users struggled to find items, leading to high bounce rates.
Study Setup
Type: Open card sort
Cards: 35 product categories
- Men's T-Shirts
- Women's Dresses
- Kids' Shoes
- Athletic Wear
- Formal Wear
- Casual Shirts
- Winter Jackets
- Summer Dresses
- Business Attire
- ...and 26 more
Participants: 30 shoppers (mixed demographics)
Instructions: "Organize these products into groups that make sense to you when shopping. Name each group."
Results
Original Structure (confusing):
- Men's Clothing (22 subcategories)
- Women's Clothing (25 subcategories)
- Kids' Clothing (15 subcategories)
- Accessories (12 subcategories)
New Structure (from card sort):
Main Navigation:
├─ Shop by Occasion
│ ├─ Casual & Everyday
│ ├─ Work & Professional
│ ├─ Athletic & Outdoor
│ └─ Formal & Special Events
├─ Shop by Person
│ ├─ Women
│ ├─ Men
│ └─ Kids
└─ Sale & New Arrivals
Outcome
- 40% increase in product page views
- 25% reduction in bounce rate
- 18% increase in conversion rate
- Users said navigation "made sense now"
Key Insight
Shoppers wanted to browse by occasion (casual, work, athletic) rather than just gender. This wasn't obvious to the internal team but emerged clearly in the card sort.
Example 2: Mobile Banking App
The Challenge
A banking app had 28 features scattered across 6 tabs. User research showed people couldn't find basic functions like "Pay a Bill" or "View Statements."
Study Setup
Type: Hybrid card sort
Cards: 28 banking features
- Check Balance
- Transfer Money
- Pay Bills
- Deposit Check
- View Statements
- ATM Locator
- Budget Tracker
- Savings Goals
- Investment Portfolio
- Customer Support
- Security Settings
- Card Controls
- ...and 16 more
Suggested Categories (for hybrid sort):
- Accounts
- Payments
- Tools
- Settings
Participants: 25 active banking app users
Instructions: "Organize these features into the provided categories, or create new categories if needed."
Results
User-Created Categories:
├─ My Money (38% created this)
│ ├─ Check Balance
│ ├─ View Statements
│ └─ Account Details
├─ Move Money (42% created this)
│ ├─ Transfer Money
│ ├─ Pay Bills
│ └─ Deposit Check
├─ Plan Ahead (35% created this)
│ ├─ Budget Tracker
│ ├─ Savings Goals
│ └─ Bill Reminders
└─ Settings & Help (45% created this)
   ├─ Security Settings
   ├─ Customer Support
   └─ Card Controls
Final App Structure: Simplified from 6 tabs to 4 main sections based on card sort results.
Outcome
- Task completion rate improved from 62% to 89%
- Average time to complete tasks reduced by 34%
- App Store rating increased from 3.2 to 4.6 stars
- Support calls decreased by 22%
Key Insight
Users think in terms of actions ("Move Money") rather than banking terminology ("Transactions"). The card sort revealed intuitive action-oriented language.
Example 3: SaaS Product Dashboard
The Challenge
A project management SaaS had 42 features hidden in nested menus. New users couldn't find key features, leading to low activation rates.
Study Setup
Type: Open card sort
Cards: 42 product features
- Create Project
- Task Board
- Gantt Chart
- Time Tracking
- Team Chat
- File Sharing
- Calendar View
- Reports Dashboard
- Notifications
- User Permissions
- Integrations
- API Access
- ...and 30 more
Participants: 20 product managers and team leads (target users)
Instructions: "Imagine you're using this tool for the first time. How would you group these features?"
Results
Top User-Created Categories:
- Project Work (85% agreement): Create Project, Task Board, Gantt Chart, Calendar View
- Team Collaboration (78% agreement): Team Chat, File Sharing, Comments, @Mentions
- Tracking & Reporting (72% agreement): Time Tracking, Reports Dashboard, Progress Charts
- Settings & Admin (88% agreement): User Permissions, Integrations, API Access, Billing
Implemented Structure
Sidebar Navigation:
├─ 📋 Projects (main work area)
├─ 👥 Team (collaboration features)
├─ 📊 Insights (reporting & analytics)
├─ 🔧 Apps (integrations)
└─ ⚙️ Settings (admin functions)
Outcome
- New user activation increased from 35% to 68%
- Feature discovery improved significantly
- Reduced onboarding time by 45%
- Customer satisfaction score increased by 31 points
Key Insight
Users wanted a clean, focused workspace with advanced features tucked away but accessible. Card sorting revealed the core vs. secondary features split.
Example 4: Corporate Intranet
The Challenge
A 10,000-employee company had an intranet with 80+ pages. Employees complained they couldn't find important information like benefits or IT support.
Study Setup
Type: Open card sort
Cards: 50 most-visited pages
- Submit Time Off
- Health Benefits
- 401(k) Information
- Company News
- IT Help Desk
- Office Locations
- Employee Directory
- Training Courses
- Expense Reports
- Payroll Information
- Company Policies
- Department Contacts
- ...and 38 more
Participants: 40 employees (various departments and tenure)
Instructions: "Organize these intranet pages into groups that would help you find what you need quickly."
Results
Original Structure (alphabetical, not useful): 80+ pages in an A-Z list
New Structure (from card sort):
Quick Links (Dashboard):
├─ For Me
│ ├─ My Benefits
│ ├─ My Time & Pay
│ └─ My Career
├─ Need Help
│ ├─ IT Support
│ ├─ HR Questions
│ └─ Facilities
├─ Stay Informed
│ ├─ Company News
│ ├─ Events Calendar
│ └─ Announcements
└─ Resources
   ├─ Policies & Forms
   ├─ Training
   └─ Employee Directory
Outcome
- Task success rate increased from 45% to 82%
- Average search time reduced from 4 minutes to 45 seconds
- IT support tickets about "can't find" issues dropped 67%
- Employee satisfaction with intranet rose from 2.8 to 4.2 (out of 5)
Key Insight
Employees wanted task-based organization ("Submit time off") rather than departmental organization ("HR → Time Off → Submit Request").
Example 5: Help Center Redesign
The Challenge
A SaaS company's help center had 150+ articles but customers still contacted support for basic questions. The existing category structure wasn't intuitive.
Study Setup
Type: Open card sort
Cards: 30 most-searched help articles
- How to Reset Password
- Billing & Payments FAQ
- How to Export Data
- Account Security Setup
- Team Member Permissions
- Integration Setup Guide
- Troubleshooting Errors
- Mobile App Guide
- API Documentation
- Feature Tutorials
- ...and 20 more
Participants: 25 customers (mix of new and experienced users)
Instructions: "You need help with our product. Organize these topics into groups that would help you find answers quickly."
Results
User-Created Categories (with agreement %):
- Getting Started (82% put these together): Account Setup, First Steps, Basic Features
- Common Questions (75% agreement): Password Reset, Billing FAQ, Account Settings
- Advanced Features (71% agreement): API Docs, Integrations, Custom Settings
- Troubleshooting (88% agreement): Error Messages, Common Issues, Bug Reports
- Mobile & Apps (66% agreement): Mobile Guide, Desktop App, Browser Extensions
Implemented Structure
The redesign added smart categorization with search tags:
Help Center:
├─ 🚀 Getting Started (for new users)
├─ ❓ Common Questions (FAQ-style)
├─ 🔧 Features & How-To (tutorials)
├─ 🐛 Troubleshooting (problem-solving)
└─ 💻 Developers (API docs)
Outcome
- Support ticket volume decreased by 30%
- Help article views increased by 45%
- Customer satisfaction (CSAT) improved by 12 points
- Average resolution time reduced by 2 days
Key Insight
Users group content by their goal (troubleshooting, learning, reference) rather than by product features. Card sorting revealed the mental model.
More Quick Examples
Example 6: Educational Platform
Context: Online learning platform with 200+ courses
Type: Closed card sort
Cards: 50 course titles
Outcome: Validated that users prefer subject-based categories (Math, Science) over skill level (Beginner, Advanced)
Example 7: Recipe Website
Context: Food blog with 500+ recipes
Type: Open card sort
Cards: 40 popular recipes
Outcome: Users created categories by meal type (Breakfast, Dinner) and dietary needs (Vegetarian, Gluten-Free), not by cuisine
Example 8: Fitness App
Context: Workout app with 60 exercises
Type: Hybrid card sort
Cards: 60 exercises
Outcome: Users preferred grouping by body area (Upper Body, Core) over equipment needed
Example 9: Travel Booking Site
Context: Travel site with 30 booking features
Type: Open card sort
Cards: 30 features
Outcome: Users wanted a trip timeline structure (Before Trip, During Trip, After Trip) instead of service type
Example 10: News Website
Context: Local news site with 25 section categories
Type: Closed card sort
Cards: 100 recent article headlines
Outcome: Validated that some articles fit multiple categories, leading to a tag system
Example 11: Government Portal
Context: City government website with 60 services
Type: Open card sort
Cards: 60 public services
Outcome: Residents organized by life events (Moving, Having a Baby), not by department
Example 12: Design Resource Library
Context: Design agency with 200+ resources
Type: Open card sort
Cards: 40 resource types
Outcome: Designers wanted project phase categories (Research, Ideation, Production) over file type
Example 13: Medical Patient Portal
Context: Hospital patient portal with 35 features
Type: Hybrid card sort
Cards: 35 features
Outcome: Patients preferred plain language ("Talk to My Doctor") over medical terms ("Secure Messaging")
Example 14: Real Estate Website
Context: Property listing site with 40 search filters
Type: Open card sort
Cards: 40 filters
Outcome: Users created priority levels (Must-Have, Nice-to-Have) rather than categories
Example 15: Podcast App
Context: Podcast discovery app with 50 genres
Type: Closed card sort
Cards: 100 podcast shows
Outcome: Users disagreed on many categories, leading to a multi-tagging system
Common Patterns Across Examples
Pattern 1: Task-Oriented Beats Feature-Oriented
What we learned: Users think in terms of goals ("Pay a bill") not features ("Payment Module")
Examples:
- Banking app: "Move Money" > "Transactions"
- Intranet: "Submit Time Off" > "HR Forms"
- Help Center: "Getting Started" > "Features List"
Pattern 2: Plain Language Wins
What we learned: Users prefer everyday language over technical jargon
Examples:
- Medical portal: "Talk to My Doctor" > "Secure Messaging"
- SaaS: "Team" > "Collaboration Suite"
- Government: "Having a Baby" > "Birth Registration Services"
Pattern 3: Fewer Categories = Better
What we learned: Users naturally create 4-7 main categories, not 15-20
Examples:
- Banking app: 6 tabs → 4 sections (improved UX)
- E-commerce: 4 mega-menu columns (reduced from 8)
- Intranet: 4 main sections (down from 12)
Pattern 4: Context Matters
What we learned: Organization depends on user intent and context
Examples:
- Recipes: By meal type (dinner) OR diet (vegan) - both valid
- Courses: By subject (math) OR level (beginner) - context-dependent
- Travel: By trip phase works better than by service type
How to Apply These Examples
Step 1: Identify Your Use Case
Which example matches your situation?
- Complex navigation → E-commerce or SaaS examples
- Feature organization → Banking or Dashboard examples
- Content structure → Help Center or News site examples
Step 2: Adapt the Study Setup
Copy the relevant setup:
- Card creation approach
- Study type (open/closed/hybrid)
- Number of participants
- Instructions format
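A setup like the ones above can be captured as a small, reusable record before you run the study. This is a hypothetical schema (not tied to any particular tool), shown here mirroring the mobile banking setup from Example 2:

```python
from dataclasses import dataclass, field

@dataclass
class CardSortStudy:
    """Minimal description of a card sort study (hypothetical schema)."""
    name: str
    sort_type: str  # "open", "closed", or "hybrid"
    cards: list[str]
    instructions: str
    target_participants: int = 25
    # Only used for closed/hybrid sorts
    suggested_categories: list[str] = field(default_factory=list)

# Mirrors the hybrid setup from Example 2 (abbreviated card list)
study = CardSortStudy(
    name="Mobile banking navigation",
    sort_type="hybrid",
    cards=["Check Balance", "Transfer Money", "Pay Bills", "Deposit Check"],
    instructions=("Organize these features into the provided categories, "
                  "or create new categories if needed."),
    suggested_categories=["Accounts", "Payments", "Tools", "Settings"],
)
print(study.sort_type, len(study.cards))
```

Writing the setup down this way makes it easy to reuse the same cards and instructions across pilot and full runs.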
Step 3: Look for Similar Patterns
Based on examples above, expect:
- 4-7 main categories
- Task-oriented groupings
- Plain language preferences
- Some disagreement (that's valuable data!)
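The agreement percentages quoted throughout these examples come from looking at how often participants place the same cards together. A minimal sketch of that analysis, using a handful of hypothetical participants and banking-style cards:

```python
from itertools import combinations
from collections import Counter

def cooccurrence(sorts):
    """Fraction of participants who placed each pair of cards in the same group.

    `sorts` is one entry per participant; each entry is that participant's
    list of groups, and each group is a set of card names.
    """
    pair_counts = Counter()
    for groups in sorts:
        for group in groups:
            for pair in combinations(sorted(group), 2):
                pair_counts[pair] += 1
    n = len(sorts)
    return {pair: count / n for pair, count in pair_counts.items()}

# Three hypothetical participants sorting four banking features
sorts = [
    [{"Pay Bills", "Transfer Money"}, {"Check Balance", "View Statements"}],
    [{"Pay Bills", "Transfer Money", "Check Balance"}, {"View Statements"}],
    [{"Pay Bills", "Transfer Money"}, {"Check Balance"}, {"View Statements"}],
]
sim = cooccurrence(sorts)
# All three participants paired "Pay Bills" with "Transfer Money" -> 1.0
print(sim[("Pay Bills", "Transfer Money")])
```

Pairs with high co-occurrence (near 1.0) are strong candidates for the same category; pairs with middling scores are exactly the "valuable disagreement" worth discussing with your team.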
Step 4: Test & Iterate
Like successful examples:
- Run card sort
- Analyze results
- Implement changes
- Measure impact
- Iterate based on data
Card Sorting Best Practices from Examples
✅ DO: Use Real Content
All successful examples used actual product names, features, or content—not placeholders. Real content gets real reactions.
✅ DO: Test with Real Users
Every example used target users, not internal team members. Internal teams have biased mental models.
✅ DO: Aim for 20-30 Participants
Most successful studies had 20-40 participants. Patterns became clear around 20 responses.
✅ DO: Keep Cards Between 30-50
Examples that worked best had 30-50 cards. Too few (< 20) don't reveal patterns; too many (> 60) cause fatigue.
✅ DO: Use Open Sorts for Discovery
Examples 1, 3, 4, and 5 used open sorts to discover new structures. When you don't know the answer, let users show you.
❌ DON'T: Use Jargon
Medical portal example shows plain language beats technical terms every time.
❌ DON'T: Ignore Outliers
Example 15 (podcast app) showed when users disagree, you might need tagging instead of categories.
❌ DON'T: Skip Implementation
Card sorting reveals insights, but only if you implement them. All successful examples measured post-launch impact.
Your Turn: Run Your Own Card Sort
Ready to create your own success story?
Quick Start
- Choose your example - Find the closest match above
- Copy the setup - Use similar cards and instructions
- Run the study - Start free on Card Sort
- Implement findings - Like the examples above
- Measure impact - Track your own success metrics
What to Measure
Based on successful examples, track:
- Task completion rate
- Time to find information
- User satisfaction scores
- Support ticket volume
- Conversion or engagement rates
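When reporting these metrics, it helps to state both the absolute change (in points) and the relative change (percent vs. baseline), since "27 points" and "44% improvement" describe the same result. A minimal helper, illustrated with the task completion numbers from Example 2:

```python
def impact(before, after):
    """Absolute change and percent change relative to the baseline metric."""
    absolute = after - before
    relative = absolute / before * 100  # percent change vs. baseline
    return absolute, relative

# Task completion rate from the banking app example: 62% -> 89%
abs_pp, rel_pct = impact(0.62, 0.89)
print(f"{abs_pp:.2f} absolute, {rel_pct:.1f}% relative improvement")
```

Quoting both figures avoids the common confusion between percentage points and percent improvement when you present results to stakeholders.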
Frequently Asked Questions
Q: How do I know which type of card sort to use?
A: Use open sorts for discovery (Examples 1, 3, 4, 5), closed sorts for validation (Examples 6, 10, 15), and hybrid sorts for testing candidate structures (Examples 2, 8, 13).
Q: How many cards should I include?
A: Based on the examples, 30-50 cards is ideal. The banking app (28), Help Center (30), and SaaS dashboard (42) all worked well.
Q: What if my results are messy?
A: That's often valuable! Example 15 (podcast app) had heavy disagreement, which led to a better solution (multi-tagging).
Q: How do I convince my boss this works?
A: Show these examples. The metrics speak for themselves: 30-40% improvements in findability, engagement, and satisfaction.
Q: Can I run card sorting for my specific industry?
A: Yes. These examples span retail, finance, B2B, healthcare, government, and more. The methodology applies to any information organization challenge.
Q: What if I can't get 30 participants?
A: Even 15-20 participants can reveal strong patterns. The banking app (25 users) and SaaS dashboard (20 users) studies both worked with smaller samples.
Related Resources
- Free Card Sorting Tool
- How to Run Your First Card Sort Study
- Card Sorting UX Template
- Open vs Closed Card Sorting
- Card Sorting Sample Size Guide
Start your own success story → Run a free card sort study