About This Report

An Independent Scorecard for Digital Experience Platforms

The DXP Scorecard is a free, independent evaluation of the leading content management systems and digital experience platforms on the market today. No sponsored rankings. No pay-to-play tiers. No analyst briefings from vendor marketing teams.

What This Report Is

The DXP Scorecard evaluates platforms across 104 scored criteria spanning content management, platform capabilities, technical architecture, ecosystem health, total cost of ownership, build complexity, maintenance burden, and use-case fit. Each platform receives a structured scorecard built from publicly available technical documentation, real-world implementation experience, and multi-dimensional analysis from both development and marketing perspectives.

This is not a feature checklist. It is an implementation-grounded evaluation designed to reflect what engineering teams, digital marketers, and technology decision-makers actually encounter when they build on these platforms — not what the vendor's sales deck says.

Who It's For

Organizations evaluating a CMS or DXP for the first time. Teams replacing a legacy platform. Architects comparing headless CMS options. Marketing leaders building a business case. CTOs validating vendor shortlists. Anyone who needs an honest, structured framework for platform selection — not a round of guided demos.

Methodology

Each platform is scored using a structured framework with eight weighted categories and more than 100 individual evaluation criteria. Scores range from 0 to 100 and reflect real-world capability, not theoretical maximums. Every score includes a reasoning statement and a confidence level based on the quality of available evidence.
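The weighted-category aggregation described above can be sketched as follows. The category names come from this report; the weights, function names, and example scores are illustrative assumptions, not the actual values or implementation used by the Scorecard:

```python
# Hypothetical sketch of combining weighted category scores into an
# overall 0-100 score. Weights below are illustrative, not the report's.
CATEGORY_WEIGHTS = {
    "Core Content Management": 0.20,
    "Platform Capabilities": 0.15,
    "Technical Architecture": 0.15,
    "Platform Velocity & Health": 0.10,
    "Total Cost of Ownership": 0.15,
    "Build Complexity": 0.10,
    "Maintenance Burden": 0.10,
    "Use-Case Fit": 0.05,
}

def overall_score(category_scores: dict[str, float]) -> float:
    """Combine per-category 0-100 scores into a weighted overall score."""
    # Weights should sum to 1.0 so the result stays on the 0-100 scale.
    assert abs(sum(CATEGORY_WEIGHTS.values()) - 1.0) < 1e-9
    return sum(w * category_scores[c] for c, w in CATEGORY_WEIGHTS.items())

# A hypothetical platform scoring 80 in every category scores 80 overall.
example = {c: 80.0 for c in CATEGORY_WEIGHTS}
print(round(overall_score(example), 1))
```

The design choice here is that per-category scores stay interpretable on their own, while the weights encode how much each category matters to the final ranking.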

Primary evidence sources

Official technical documentation, API references, developer guides, release notes, and architecture documentation. Where available, evidence also includes hands-on implementation experience across production deployments.

Secondary evidence sources

Community resources, developer forums, open-source repositories, independent benchmarks, and practitioner experience. Scores derived from secondary sources are marked with lower confidence levels.

Confidence levels

HIGH: Verified against official documentation or direct implementation experience
MEDIUM: Supported by official docs with limited detail, or strong practitioner knowledge
LOW: Based on community sources, indirect evidence, or general knowledge
INFERRED: No direct evidence found; scored conservatively based on absence of signal

Scoring categories

Core Content Management
Platform Capabilities
Technical Architecture
Platform Velocity & Health
Total Cost of Ownership
Build Complexity
Maintenance Burden
Use-Case Fit

Independence & Bias Policy

This report is produced by HT Blue, a digital experience agency with hands-on implementation experience across enterprise CMS and DXP platforms. The scoring is based entirely on technical merit and real-world performance data.

No platform vendor has paid to be included, excluded, or ranked favorably. No platform vendor has reviewed or approved scores prior to publication. Scores are not influenced by commercial relationships, referral agreements, or partnership status. Where HT Blue has a commercial relationship with a platform vendor, that relationship does not influence scoring — all platforms are evaluated against the same criteria by the same methodology.

This is the opposite of how most analyst firms operate. Traditional analyst reports are funded by vendor inquiry fees, placement in quadrants, and briefing cycles where vendors present curated demos to analysts who may have limited implementation experience. Platforms with larger marketing budgets and more aggressive analyst relations programs tend to score better in those reports — not because they are better platforms, but because they are better at marketing to analysts. This report is different.

What We Actually Evaluate

Platform evaluation from a development and marketing perspective means assessing the full lifecycle — not just the demo. Our evaluation covers:

Developer experience
API design quality, SDK maturity, local development workflow, TypeScript support, documentation depth.
Content author experience
Editor usability, workflow capabilities, preview quality, structured content modeling, localization support.
Architecture fitness
Headless capability, API-first design, scalability, deployment flexibility, security posture.
Ecosystem health
Community size and trajectory, integration marketplace, talent availability, release cadence.
Total cost of ownership
Licensing costs, implementation costs, hosting, operational burden, migration risk.
Use-case alignment
How well the platform fits specific scenarios: marketing sites, commerce, intranets, multi-brand.

Freshness & Updates

Each platform scorecard is timestamped with the date it was last evaluated. The digital experience platform landscape evolves rapidly — platforms that were weak three years ago may have significantly improved, and platforms that were leaders may have stagnated. Scores are updated periodically as platforms release major versions, change pricing, or shift market position meaningfully.

All platform scores include their evaluation date. If a score is more than 12 months old, treat it as directionally useful but verify against the platform's current documentation before making procurement decisions.

Start Evaluating

View the full platform comparison and scorecard.

View the Scorecard →