Competitive analysis in IA research means systematically studying how competitors organize their navigation, content, and labeling — then using those patterns to inform your own information architecture. Done before a card sort, it gives you a baseline of industry conventions and helps you design a sharper study.
Pick 3-5 direct competitors. Include the market leader (users are trained on their patterns), one or two mid-tier players, and ideally one disruptor or adjacent-market product. Don't include companies with wildly different audiences — their IA choices won't be relevant to your users.
For each competitor, document the top-level navigation labels, the number of primary categories, how content is grouped under each category, and where the structure diverges from the other competitors.
Screenshot everything. Navigation structures change, and you'll want the reference during analysis.
The real value of competitive analysis is distinguishing conventions from quirks. If all four competitors separate "Getting Started" guides from "API Documentation," that's a convention. Users who've interacted with any competitor product will expect that separation. Breaking it without evidence from your own research is a risk.
If only one competitor buries pricing under "Resources" while the other three put it in the main nav, that's a quirk. It might be intentional (hiding price comparison) or accidental (nobody reorganized after a site migration). Either way, it's not a user expectation you need to match.
Record these patterns in a simple spreadsheet: rows are content types or categories, columns are competitors, cells show where that content appears in each competitor's IA. Patterns jump out quickly.
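The same matrix can be sketched in code. This is an illustrative sketch, not a prescribed tool: the content types, competitor names, and placements below are hypothetical examples, and the classification rule (present everywhere with one placement = convention, present in only one nav = quirk) is a simple heuristic matching the distinction described above.

```python
# Hypothetical audit matrix: rows are content types, columns are
# competitors, cells record where that content appears in each nav.
audit = {
    "Getting Started": {"Intercom": "top nav", "Zendesk": "top nav",
                        "HubSpot": "top nav", "Freshdesk": "top nav"},
    "API Docs":        {"Intercom": "top nav", "Zendesk": "top nav",
                        "HubSpot": "top nav", "Freshdesk": "top nav"},
    "Pricing":         {"Intercom": "top nav", "Zendesk": "Resources",
                        "HubSpot": "top nav", "Freshdesk": "top nav"},
    "Community":       {"Zendesk": "top nav"},  # only one competitor has it
}
competitors = ["Intercom", "Zendesk", "HubSpot", "Freshdesk"]

results = {}
for content, placements in audit.items():
    present = sum(1 for c in competitors if c in placements)
    if present == len(competitors):
        # Everyone has it; check whether they also agree on placement.
        verdict = "convention" if len(set(placements.values())) == 1 else "mixed"
    elif present == 1:
        verdict = "quirk"
    else:
        verdict = "partial"
    results[content] = verdict
    print(f"{content:16} {present}/{len(competitors)}  {verdict}")
```

A "mixed" verdict, like Pricing above, flags exactly the case described earlier: everyone offers the content, but one competitor's placement is the outlier worth investigating.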
Before redesigning a SaaS help center, you screenshot the navigation structures of four competitors: Intercom, Zendesk, HubSpot, and Freshdesk. You notice that all four separate "Getting Started" guides from "API Docs," all four give "Integrations" its own section, "Best Practices" appears in some navigations but not others, and only Zendesk includes a "Community" section.
From this, you establish that "Getting Started," "API Docs," and "Integrations" are conventions your users likely expect. "Best Practices" is debatable — worth including as cards in your sort to see where users place them. "Community" is a Zendesk-specific choice, not an industry norm.
This analysis shapes your card sort directly. You know to include cards for API documentation, getting started guides, and integration content. You can test whether "Best Practices" articles belong in their own section or under related feature categories. And you won't waste cards on a "Community" category unless your own product actually has one.
The audit output serves three purposes in your card sort:
Card set design. The audit reveals content types you might overlook. If every competitor has a "Status Page" link and you were going to leave that out of your card set, the audit catches it.
Category hypothesis. For closed or hybrid sorts, competitor navigation gives you candidate categories to test. You're not copying competitors — you're using their structures as a starting hypothesis that your card sort will validate or reject.
Result interpretation. After the sort, when you see 80% of participants grouping API docs separately from getting started guides, you can note that this matches the industry convention. When participants break from a convention, that's a stronger signal than usual — they're overriding trained behavior because their mental model genuinely differs.
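A figure like that 80% is simple to compute if sort results are stored per participant as card-to-group assignments. The data below is hypothetical, and the function is a minimal sketch of the calculation, not a feature of any card-sorting tool:

```python
# Hypothetical card-sort results: participant id -> {card: group label}.
sorts = {
    "p1": {"API reference": "Developers", "Quickstart": "Getting Started"},
    "p2": {"API reference": "Developers", "Quickstart": "Getting Started"},
    "p3": {"API reference": "Docs",       "Quickstart": "Docs"},
    "p4": {"API reference": "API",        "Quickstart": "Setup"},
    "p5": {"API reference": "Developers", "Quickstart": "Getting Started"},
}

def separation_rate(sorts, card_a, card_b):
    """Share of participants who placed the two cards in different groups."""
    split = sum(1 for s in sorts.values() if s[card_a] != s[card_b])
    return split / len(sorts)

rate = separation_rate(sorts, "API reference", "Quickstart")
print(f"{rate:.0%} of participants separated API docs from getting started")
```

Comparing this rate against the convention/quirk findings from the audit is what lets you say whether a grouping matches industry expectations or breaks from them.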
Competitive analysis tells you what exists, not whether it works. A competitor might have terrible IA that frustrates their users — copying it would import their problems. Always pair competitive findings with your own user research.
There's also a conformity trap. If you anchor too heavily on competitor patterns, your card sort becomes a confirmation exercise rather than genuine discovery. Present competitor categories as one input among several, not as the answer. The content audit and stakeholder interviews should carry equal weight in shaping your study.
How many competitors should you analyze before a card sort? Audit 3-5 direct competitors for a useful baseline. Fewer than 3 and you can't distinguish company-specific quirks from genuine industry conventions. More than 5 produces diminishing returns because navigation patterns start repeating. Include at least one market leader and one smaller or newer competitor to capture both established patterns and innovative approaches.
What should you look for in a competitor navigation audit? Document top-level navigation labels, the number of primary categories, how content is grouped under each category, and where competitors diverge from each other. Pay special attention to conventions that appear across all competitors — these reflect user expectations that are risky to violate. Also note any category that only one competitor uses, which may represent a differentiation opportunity or a mistake.
How does competitive analysis improve card sorting results? Competitive analysis gives you a baseline before you run your card sort. It surfaces industry conventions that users already expect, identifies category labels worth testing, and reveals gaps where competitors organize content poorly. This context helps you design a better card set and interpret results more effectively, because you can distinguish user-driven patterns from industry-trained habits.