Why this matters
A brand that just won a Costco rotation calls the analyst on a Tuesday: "How are we doing at Costco this week?" The analyst pulls SPINS. Nothing useful comes back. They pull Circana. Sam's Club and BJ's are there, but Costco isn't. They check the SPINS MULO+ extract. Costco isn't there either.
The unsettling truth: Costco doesn't report to syndicators. Not to SPINS, not to Circana, not to NielsenIQ. It's the single largest distribution gap in syndicated CPG data — a roughly $250B retailer that operates almost entirely outside the analytical infrastructure brand-side teams rely on for every other retailer they sell into.
The rest of the club channel is partially covered: Sam's Club and BJ's report to Circana and are accessible through Circana's MULO universe (and via SPINS' MULO+ extension, which licenses MULO from Circana — see What is MULO). But "club" in brand-side conversations almost always means Costco, and Costco specifically is the gap.
This page is about how brands track Costco performance anyway — what sources exist, what each one can tell you, and the triangulation pattern most Costco brands settle into.
What's in syndicated data — and what isn't
| Club retailer | In Circana MULO? | In SPINS MULO+? | In SPINS Natural? |
|---|---|---|---|
| Costco | No | No | No |
| Sam's Club | Yes | Yes (via Circana licensing) | No (club isn't part of the natural channel) |
| BJ's | Yes | Yes (via Circana licensing) | No |
The Costco gap is structural. Costco's data sharing model is built around its supplier relationships — vendors get access to Costco's own analytics through the vendor portal, but the data doesn't flow to the third-party syndicators that brands use for everywhere else. The result is a permanent dual-track for any Costco-active brand: "how am I doing everywhere else" (syndicated) and "how am I doing at Costco" (Costco-direct only).
Where Costco data actually comes from
Four sources, in roughly decreasing order of accuracy:
1. Costco's vendor portal and buyer relationship
The primary source. Costco shares vendor-specific data through its internal systems — sales by item, region, and warehouse cluster. Cadence is roughly weekly. The brand sees its own performance at Costco at warehouse-cluster granularity: not full warehouse-by-warehouse detail, but close.
What it doesn't show:
- Category competitive context — Costco doesn't share competitor detail with vendors at a category-share level the way buyer category-review materials typically do at other retailers.
- Cross-retailer comparison — Costco data sits in its own surface with its own taxonomy.
2. Numerator panel projections
Numerator's purchase panel includes Costco transactions captured through receipt scanning. Numerator projects the panel to total US and publishes brand-level estimates of Costco performance. This is the best external category-level read on Costco.
Strengths:
- Includes buyer demographics — who at Costco is buying.
- Cross-retailer behavior — what else the Costco buyer purchases elsewhere.
- Category competitive — share-of-category cuts at Costco against named competitors.
Limitations:
- Panel-projected, so brand-level reads are reliable but SKU-week-level cells have wider confidence intervals.
- Costco items are keyed to Costco's own Item Numbers rather than standard retail UPCs, and Costco-specific package configurations sometimes don't map cleanly to retail UPCs, which complicates item matching in the panel.
- Subscription cost is a meaningful budget line.
3. NielsenIQ Homescan panel
NielsenIQ's household panel covers Costco purchases through self-reported scanning. Methodology is similar to Numerator; strengths and limitations broadly parallel. For brands that already use NIQ for source-of-volume or repeat-rate work, the Costco panel projection is included rather than a separate subscription.
4. Internal proxies and triangulation
The cheapest tier — what brands do when they don't have any external Costco surface:
- Shipment volume to Costco. The brand's own shipping numbers tell you what went into Costco's distribution centers. Not sales, but a leading indicator over a 4–6 week window.
- Regional sell-through patterns. A brand selling at Costco in the Pacific Northwest can compare its WA/OR/BC volume to its velocity at Fred Meyer (Kroger's PNW banner) for proxy demand signals.
- Cross-channel category trend extrapolation. If the brand's category is growing 12% in MULO+ and natural channel, and the brand's Costco shipment volume is growing 8%, the brand can infer Costco performance is broadly tracking with the category. Imprecise, but better than nothing.
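The extrapolation logic in the last bullet can be sketched as a small helper. The numbers and the ±5-point "broadly tracking" threshold are hypothetical illustrations, not a standard; the whole read assumes shipments track sell-through over a multi-week window, which is only directionally true.

```python
def infer_costco_trend(category_growth_pct: float,
                       costco_shipment_growth_pct: float) -> dict:
    """Rough proxy: compare the brand's Costco shipment growth to the
    category's syndicated (MULO+ / natural) growth. Hypothetical sketch;
    assumes shipments approximate sell-through over a multi-week window."""
    gap = costco_shipment_growth_pct - category_growth_pct
    # A +/-5-point band is an arbitrary illustration of "broadly tracking".
    read = ("roughly tracking the category" if abs(gap) <= 5
            else "diverging from the category")
    return {
        "category_growth_pct": category_growth_pct,
        "costco_shipment_growth_pct": costco_shipment_growth_pct,
        "gap_pts": round(gap, 1),
        "read": read,
    }

# Numbers from the example above: category +12%, Costco shipments +8%.
result = infer_costco_trend(12.0, 8.0)
```

The output is a directional label, not a number to report; it belongs in an internal triangulation note, never in a buyer-facing deck.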
What SPINS does tell you about the club channel (excluding Costco)
For Sam's Club and BJ's, SPINS' MULO+ extract includes them via the Circana-licensed MULO data. The reads are real and usable:
- Sam's Club is the volume leader of the SPINS-covered clubs. A brand reading "club channel +8%" in SPINS MULO+ is reading mostly Sam's Club performance.
- BJ's has a narrower geographic footprint (concentrated in the Northeast and Mid-Atlantic) and contributes proportionally less to the club number.
- Both are useful retailer-level reads in their own right, but they don't substitute for Costco data.
The single biggest analyst error: reading "club channel" in SPINS or Circana MULO+ as if it included Costco. It doesn't. A "club channel grew +6%" read tells you about Sam's Club and BJ's; it tells you nothing about Costco.
The dual-track reporting pattern
Most Costco-serious brands settle into a permanent dual-track:
| Track | Source | Cadence | What it answers |
|---|---|---|---|
| Costco-direct | Costco vendor portal | Weekly | "How are we doing at Costco?" |
| Everywhere else | SPINS Natural / MULO+ (plus Circana for conventional) | Weekly, with multi-week lag | "How are we doing in the syndicated retail universe?" |
| Cross-context | Numerator or NIQ panel | Monthly or quarterly | "Who is buying us at Costco, and what else are they buying?" |
The dual-track is a permanent state, not a transitional one. Costco data is structurally separate, and the analytical surfaces don't merge. The best a brand can do is keep two reads next to each other and report them with appropriate boundaries.
Worked example — a snack brand's Costco rotation report
A snack brand wins a 12-week Costco rotation on a 24-count multipack of its hero SKU. Q1 report:
| Source | Cut | $ | Comment |
|---|---|---|---|
| Costco vendor portal | Brand's multipack sales across Costco regions | $4.2M for the 12 weeks | The primary read |
| SPINS MULO+ | Same brand's hero SKU at Sam's Club + BJ's | $180K for the 12 weeks | Reference point only |
| SPINS Natural | Same brand at Sprouts, Natural Grocers, naturals | $1.8M | Out-of-club but useful baseline |
| Numerator panel | Costco buyer demographics + cross-retailer behavior | 38% of Costco multipack buyers also buy the brand's smaller pack at Whole Foods, Sprouts, or conventional | Strategic context |
Reading these together:
- The Costco rotation drove $4.2M in 12 weeks — Costco-direct data, the primary number.
- Sam's Club + BJ's added $180K — small relative to Costco because the brand isn't pushing the same multipack format there; the comparison is useful for "is the multipack form-factor resonating in club" as a directional question.
- Cross-retailer behavior suggests Costco isn't pure cannibalization of the brand's other-retailer sales — 38% of Costco buyers also buy elsewhere, implying additive volume rather than just shifted volume.
- The natural-channel baseline of $1.8M is useful because it contextualizes the magnitude — Costco's $4.2M is roughly 2.3× the natural-channel volume for the same period, demonstrating the Costco rotation's incremental impact.
The report draws on all three source families (Costco-direct, SPINS, Numerator), names Costco-direct as the primary, and labels the SPINS reads explicitly as "everywhere else." No one confuses the SPINS club read for Costco performance.
Tracking rotation entry and exit windows
Costco's rotation model is unlike any other major retail relationship — items cycle in for a defined window (typically 8–16 weeks), then either earn a rebuy or rotate out. The data shape during entry and exit windows is structurally different from steady-state Costco distribution, and the standard syndicated-data instincts misread it.
Three things to watch differently during rotation entry:
- Week 1–2 dollars include initial-stock ramp. Costco's warehouse-stocking model produces a Week-1 dollar number that's part real consumer pull, part warehouse loading. Per-unit velocity stabilizes around Week 3–4 once the initial stock has cleared.
- Regional onboarding is rarely simultaneous. A national Costco rotation often phases regionally — West Coast first, then Mountain / Midwest, then East. Week-1 dollars reflect partial geographic coverage; comparing Week-1 dollars to a steady-state national run-rate understates rotation performance.
- The rebuy decision happens mid-window. Costco's buyers typically commit to a rebuy 4–6 weeks before the rotation ends. The data the brand needs in front of the buyer at rebuy decision time is the trend-adjusted velocity, not the cumulative dollars — buyers are forecasting sustained steady-state performance, not reading the launch curve.
Exit windows have the inverse problem — final-weeks Costco dollars typically dip as warehouses sell down inventory rather than reorder. Reading the final 2–3 weeks of a rotation as "performance softening" is usually misreading inventory wind-down for demand softness.
For brands tracking multiple rotations across a year, the right unit of analysis is rotation-window-comparable metrics: peak weekly velocity, sustained Week-4-through-Week-(end-2) velocity, and rebuy attainment rate. Cumulative-dollar comparisons across rotations of different durations are noise.
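The rotation-window-comparable metrics above can be computed mechanically. A minimal sketch, assuming weekly per-warehouse unit velocities for one rotation (the week counts and the trim boundaries follow the guidance above; the sample numbers are invented):

```python
def rotation_metrics(weekly_units: list) -> dict:
    """Rotation-window-comparable metrics for a single Costco rotation.

    weekly_units: per-warehouse weekly unit velocity, Week 1 first.
    Trims the launch ramp (Weeks 1-3) and the inventory wind-down
    (final 2 weeks) before averaging, per the guidance above.
    Hypothetical helper, not a standard formula.
    """
    if len(weekly_units) < 6:
        raise ValueError("need at least 6 weeks to trim ramp and wind-down")
    sustained = weekly_units[3:-2]  # Week 4 through Week (end-2)
    return {
        "peak_weekly_velocity": max(weekly_units),
        "sustained_velocity": sum(sustained) / len(sustained),
        "weeks": len(weekly_units),
    }

# Invented 12-week rotation: ramp, plateau, then sell-down.
units = [40, 65, 80, 90, 92, 88, 91, 89, 90, 87, 60, 35]
metrics = rotation_metrics(units)
```

Because the sustained window is defined relative to each rotation's own start and end, the `sustained_velocity` of an 8-week rotation and a 16-week rotation are comparable in a way their cumulative dollars are not.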
Anti-patterns
- Reading "club channel" in SPINS or Circana as if it included Costco. It doesn't. Always label the syndicated club read as "Sam's Club + BJ's" or "ex-Costco" to avoid confusion downstream.
- Treating Numerator's Costco read as equivalent to Costco vendor portal data. They measure different things — Numerator is a panel projection, the portal is Costco's own sales record. Use the portal for the level, Numerator for the buyer and category context.
- Reporting Costco results in the syndicated retail report. Costco numbers belong in their own section with explicit boundary. Mixing them with SPINS or Circana numbers in the same row makes the rollup unauditable.
- Assuming Costco shipment ≈ Costco sales week-to-week. Costco manages inventory aggressively; shipment lumpiness around season starts, rotation entries, and out-of-stock cycles can shift reported shipment dramatically from actual sell-through. Shipment-to-sales reconciliation needs a 6–8 week rolling read.
- Forgetting that Costco's Item Numbers don't map to retail UPCs cleanly. A brand SKU might have a different package configuration for Costco than for retail (the 24-count multipack vs. the 6-count retail pack). Triangulating against retail UPC-level SPINS data requires keeping the package-config mapping explicit.
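One way to keep that package-config mapping explicit is a small lookup table that converts Costco multipack sell-through into retail-pack equivalents before it sits next to UPC-level SPINS reads. Every identifier below (the Item Number, SKU code, and UPC) is invented for illustration:

```python
# Hypothetical mapping between a Costco Item Number and the retail UPCs
# covering the same product in syndicated data. All identifiers invented.
COSTCO_ITEM_MAP = {
    "1672045": {                          # Costco Item Number (24-ct multipack)
        "brand_sku": "HERO-SNACK",
        "units_per_pack": 24,
        "retail_upcs": ["012345678905"],  # 6-count retail pack UPC
        "retail_units_per_pack": 6,
    },
}

def equivalent_retail_units(item_number: str, costco_packs_sold: int) -> float:
    """Convert Costco multipack sell-through into retail-pack equivalents
    so it can be compared against UPC-level SPINS data without mixing
    pack sizes. Sketch only; real mappings need per-SKU maintenance."""
    cfg = COSTCO_ITEM_MAP[item_number]
    return costco_packs_sold * cfg["units_per_pack"] / cfg["retail_units_per_pack"]
```

For example, 10 Costco multipacks of the hypothetical item above correspond to 40 retail-pack equivalents (10 × 24 ÷ 6). The point is not the arithmetic but that the mapping lives in one named place instead of being re-derived ad hoc in each report.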
Doing this in Scout
For Costco-active brands, Scout supports uploading the Costco vendor portal exports alongside SPINS extracts and (where licensed) the Numerator or NIQ Costco-projection data. The dashboard makes the Costco surface explicitly separate from the SPINS retail surface — so "Costco rotation performance" reports against the Costco-direct data, and "everywhere else" reports against SPINS, with the appropriate ex-Costco labels on the syndicated reads. Direct API feeds to the Costco vendor portal aren't wired today; the integration model is upload-driven.
Summary + further reading
- Costco doesn't report to any major CPG syndicator. SPINS' "club channel" coverage via MULO+ is Sam's Club and BJ's only.
- The dual-track reporting pattern (Costco vendor portal + SPINS for everywhere else + Numerator/NIQ panel for cross-retailer buyer context) is a permanent state, not transitional.
- Always label syndicated club reads as ex-Costco, and report Costco results in their own section with explicit data-source boundaries.
Related: What is SPINS data? · What is MULO — and what SPINS' MULO+ adds · Syndicated vs. panel data