Guide

Reading App Store Connect analytics: what each metric actually means.

App Store Connect surfaces a lot of numbers, and almost every metric has a specific, non-obvious definition. Confusing impressions with page views, or app units with installations, leads teams to make the wrong calls. Here's every major metric with its precise definition, how to interpret moves, and how to diagnose drops.

Two analytics surfaces: App Analytics and Sales and Trends

App Store Connect has two related but distinct analytics areas, and knowing which answers your question saves hours.

  • App Analytics covers the growth and engagement funnel: impressions, page views, app units, installations, sessions, active devices, retention, sources, crashes. This is where you look for "how is the app performing".
  • Sales and Trends covers the financial side: units sold, proceeds, in-app purchase revenue, subscription metrics, refunds. This is where you look for "how much money did we make".

The two can disagree slightly — a unit counted in App Analytics may appear in Sales and Trends a few hours later, and the aggregation windows differ. Don't waste time reconciling small gaps between them.

Impressions

An impression is counted when your app appears on a user's screen — in a search result list, a category browse list, a Today tab card, a Search tab suggestion — but before the user taps through to your product page. Apple counts the impression once your app scrolls into the visible area, even if the user scrolls straight past it.

What moves impressions: search ranking changes, new category placements, featured slots, Today-tab editorial coverage, trending lists, season-specific browse surfaces. A sudden drop in impressions with conversion holding steady is almost always an ASO or algorithm move, not a product problem.

Product Page Views

A product page view is counted when a user actually taps into your product page — either from an impression, a direct URL, or a referrer. This is the denominator for conversion rate.

The ratio of page views to impressions (roughly, a click-through rate) is a read on your icon and subtitle. If 10,000 users saw your app in search but only 500 tapped through, your icon and subtitle aren't doing the job of selling the tap. Compare that rate against competitors in your category to benchmark.

Conversion Rate

Conversion rate is First-Time Downloads / Product Page Views. It is not downloads divided by impressions — a common misconception that leads to misleading comparisons across apps.
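To keep the denominators straight, here's a minimal Swift sketch with made-up numbers that computes both ratios from the same funnel:

```swift
// Hypothetical weekly funnel numbers read off App Analytics.
struct Funnel {
    let impressions: Double         // app appeared in search or browse
    let pageViews: Double           // user tapped through to the product page
    let firstTimeDownloads: Double  // App Units

    // Tap-through rate: page views over impressions.
    var tapThroughRate: Double { pageViews / impressions }

    // Conversion rate as App Store Connect defines it:
    // first-time downloads over page views, not over impressions.
    var conversionRate: Double { firstTimeDownloads / pageViews }
}

let week = Funnel(impressions: 10_000, pageViews: 500, firstTimeDownloads: 150)
print(week.tapThroughRate)   // 0.05 → 5% tapped through
print(week.conversionRate)   // 0.3  → 30% of page viewers downloaded
```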

Typical benchmarks by category vary widely. Utilities and productivity apps often see 25-40% conversion from page views. Games skew lower — often 15-30%. Subscription apps with a hard paywall shown on page often sit below 20%. Compare yourself to your own historical baseline, not industry averages you find online.

What moves conversion: screenshots (especially the first 3), the app preview video if any, subtitle, category ranking signal, rating and review count, promotional text, seasonality. Metadata changes that land on your default page can move conversion within hours.

App Units vs Installations

The most confused pair of metrics in App Store Connect.

  • App Units — first-time downloads per Apple ID per app. A user installing your app for the first time is one App Unit forever. They can delete and reinstall 12 times; still one App Unit.
  • Installations — every install event. Reinstalls after deletion, installs on additional devices under the same Apple ID, installs via Family Sharing — all count.

Use App Units to measure acquisition: is the top of your funnel growing? Use Installations to measure distribution footprint: how broadly, and how repeatedly, the app gets installed across devices. Installations should always be at least as high as App Units; the ratio (roughly 1.1 to 2.0 for most apps) reflects how many devices per user your app lives on and how often users reinstall after deletion.
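A quick arithmetic sketch of that ratio, with hypothetical figures:

```swift
// Hypothetical monthly figures from App Analytics.
let appUnits = 4_200.0       // first-time downloads: one per Apple ID per app
let installations = 6_300.0  // every install event, incl. reinstalls and extra devices

// Install events per acquired user; roughly 1.1 to 2.0 is the typical band above.
let footprintRatio = installations / appUnits
print(String(format: "%.2f installs per first-time download", footprintRatio))  // 1.50
```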

Sessions and Active Devices

A session is a period of app usage, terminated when the app is backgrounded for more than 10 seconds. If a user opens your app, looks at one screen, and backgrounds it, that's one session. If they come back 30 seconds later, that's a new session.

Active Devices counts unique devices that have used your app in the reporting window (day, week, or month, depending on the view). This is your DAU/WAU/MAU number. Note: devices, not users — a user with an iPhone and an iPad who uses both counts as two active devices.

Sessions per active device is a useful engagement ratio. Utility apps sometimes see 1-2 sessions per active device per day; social apps 3-8; games vary wildly by genre.

Crashes

App Store Connect reports total crashes and — more importantly — percentage of sessions with crashes. The percentage is the actionable number.

  • Under 0.5%: healthy for most apps.
  • 0.5% to 1%: investigate which build shipped the regression.
  • Over 1%: actively painful; users are bouncing.
  • Over 2%: consider rolling back the release or shipping an expedited fix.

Apple's crash data comes from users who opt into sharing crash data with app developers (a system-level setting). Your MetricKit-sourced dashboard (Xcode Organizer → Crashes) and any third-party tool (Sentry, Firebase Crashlytics, Bugsnag) will see different subsets. App Store Connect's crash count is directional; your third-party tool is usually the ground truth for specific stack traces.
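If you also want crash diagnostics flowing into your own pipeline, MetricKit delivers diagnostic payloads straight to the app (iOS 14+). A minimal sketch, where the uploader is a hypothetical stand-in for whatever backend you use:

```swift
import MetricKit

// Subscribes to MetricKit diagnostics (the system delivers payloads roughly
// once per day) and forwards any crash diagnostics to your own ingestion endpoint.
final class CrashDiagnosticsSubscriber: NSObject, MXMetricManagerSubscriber {
    func start() {
        MXMetricManager.shared.add(self)
    }

    func didReceive(_ payloads: [MXDiagnosticPayload]) {
        for payload in payloads {
            for crash in payload.crashDiagnostics ?? [] {
                // Call stack tree as JSON (symbolicate server-side), plus termination details.
                upload(crash.callStackTree.jsonRepresentation(),
                       terminationReason: crash.terminationReason)
            }
        }
    }

    private func upload(_ callStack: Data, terminationReason: String?) {
        // POST to your crash-ingestion backend (hypothetical, not shown).
    }
}
```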

Retention: 1-day, 7-day, 28-day

Retention is the percentage of users who open your app again N days after first install. Apple reports 1-day, 7-day, and 28-day cohorts.

Rough benchmarks vary by category, but:

  • 1-day retention under 20% means most users tap install, open once, and never return. Look at onboarding friction.
  • 7-day retention in the 15-30% range is typical for utility apps; higher for habit/social apps.
  • 28-day retention is the best predictor of long-term value. Apps above 15% at 28 days tend to have sustainable organic growth.

Retention data by cohort lets you attribute moves: if the cohort from your May 1 release has worse 7-day retention than the April 1 cohort, something in the May release broke onboarding. Isolate and fix.
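If you export install and open events from your own analytics, the cohort comparison is a few lines. A sketch with assumed event shapes (dates normalized to start-of-day):

```swift
import Foundation

// One record per user: first install date plus every day the app was opened,
// all normalized to start-of-day in the same calendar.
struct UserActivity {
    let installDate: Date
    let openDates: Set<Date>
}

// Day-N retention: the share of a cohort that opened the app N days after install.
func retention(day n: Int, cohort: [UserActivity], calendar: Calendar = .current) -> Double {
    guard !cohort.isEmpty else { return 0 }
    let retained = cohort.filter { user in
        guard let target = calendar.date(byAdding: .day, value: n, to: user.installDate) else { return false }
        return user.openDates.contains(calendar.startOfDay(for: target))
    }
    return Double(retained.count) / Double(cohort.count)
}

// Compare the May 1 release cohort against the April 1 cohort at day 7:
// let delta = retention(day: 7, cohort: mayCohort) - retention(day: 7, cohort: aprilCohort)
```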

Sources: where installs come from

App Analytics breaks down impressions, page views, and app units by Source Type:

  • App Store Search — user searched on the App Store. A high share here means your keyword game is working.
  • App Store Browse — category lists, Today tab, Editor's Choice, What's Hot, rankings. A high share means Apple is surfacing you editorially or algorithmically.
  • Web Referrer — click from outside Apple's ecosystem (your website, press coverage, social media links). Marketing-driven traffic.
  • App Referrer — another app deep-linked into your product page (partner apps, in-app cross-promo, Universal Links).
  • App Clips — if your app has App Clips enabled.
  • Apple Search Ads — if you run ASA campaigns, these appear separately.
  • Event — App Store in-app events you've scheduled.

The breakdown tells you what's moving. A 20% App Units increase driven by App Store Search tells a different story than one driven by Web Referrer (maybe you got press coverage) or by App Store Browse (Apple featured you somewhere).

Data delays

App Analytics data is not real-time. Typical delays:

  • Today's data: partial, often visible within a few hours but not finalized.
  • Yesterday's data: usually finalized by mid-day the following day.
  • Week-long trends: stabilized after 2-3 days.

Apple marks partial days in the UI (sometimes shaded or with a "partial data" indicator). Don't make decisions off partial data — wait a day for it to stabilize. For same-day directional signals, use your in-app analytics (Mixpanel, Amplitude, your own backend) rather than App Analytics.

Sales and Trends has its own schedule. Daily sales reports typically appear within a few hours of end-of-day in the relevant region. Monthly financial reports arrive around the middle of the following month.

Diagnosing a sudden drop

You open App Store Connect and App Units is down 30% week-over-week. Diagnostic workflow:

Step 1: Did you ship anything?

Check your version history. If a new version went live, look at the app itself, the screenshots, the metadata. Did you accidentally change something that tanked conversion?

Step 2: Impressions or conversion?

Compare impressions and conversion rate separately:

  • Impressions down, conversion up or flat → ASO or App Store algorithm shift. You lost ranking on keywords or a category placement. Investigate search share and category rankings.
  • Impressions up or flat, conversion down → something on the product page isn't converting. New screenshots? New subtitle? Recent review complaints lowering the rating?
  • Both down → harder to diagnose; could be multi-factor. Check sources next.
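The same branching, codified as a sketch you could run against week-over-week exports (the 10% threshold is an assumption; tune it to your volume):

```swift
enum DropDiagnosis {
    case visibilityLoss     // impressions down, conversion steady: ASO or algorithm shift
    case pageProblem        // conversion down, impressions steady: product page issue
    case multiFactor        // both down: dig into sources next
    case noSignificantDrop
}

// Inputs are week-over-week ratios: thisWeek / lastWeek (0.7 means down 30%).
func diagnose(impressionsWoW: Double, conversionWoW: Double, threshold: Double = 0.9) -> DropDiagnosis {
    let impressionsDown = impressionsWoW < threshold
    let conversionDown = conversionWoW < threshold
    switch (impressionsDown, conversionDown) {
    case (true, false):  return .visibilityLoss
    case (false, true):  return .pageProblem
    case (true, true):   return .multiFactor
    case (false, false): return .noSignificantDrop
    }
}
```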

Step 3: Which sources dropped?

If App Store Search dropped but Browse held steady, you lost keyword ranking. If Web Referrer dropped, your external traffic source changed (partner removed a link, press coverage aged out). If App Store Browse dropped, Apple changed where you appear.

Step 4: Seasonality check

Compare year-over-year. If the same week last year also saw a 30% drop, the move is seasonal, not a regression. Tax apps drop in May, fitness apps drop in February, shopping apps drop in January after the holiday rush.

Step 5: Competitive and category check

If a competitor just launched a major feature or a well-funded new app entered your category, your share may have redistributed to them. Search your key queries manually on an incognito device to see the current ranking and who's above you.

Metrics to watch weekly

A minimal weekly dashboard for most apps:

  • App Units (acquisition)
  • Conversion Rate (product page effectiveness)
  • Share by Source (where growth comes from)
  • 7-day retention (onboarding health)
  • Sessions per Active Device (engagement)
  • Percentage of Sessions with Crashes (stability)
  • Top 10 search queries (keyword health)

Revenue-focused apps should add: proceeds (from Sales and Trends), ARPPU (average revenue per paying user), and subscriber retention from the Subscriptions panel in Sales and Trends.

Frequently asked questions

Impressions vs page views?

Impression = your app icon appeared on-screen in search or browse. Page view = the user tapped to your product page. Conversion rate uses page views, not impressions.

App Units vs Installations?

App Units = first-time downloads per Apple ID per app. Installations = every install event including reinstalls and additional devices.

How do I read Crashes?

Use the "percentage of sessions with crashes" metric, not absolute count. Under 0.5% is healthy; over 1% is a problem.

How long until data appears?

24-48 hours typical. Today's data is partial; previous-day data is usually finalized by mid-day the following day.

What are the Source categories?

App Store Search, App Store Browse, Web Referrer, App Referrer, App Clips, Apple Search Ads (if enabled), Event.

App Analytics vs Sales and Trends?

App Analytics covers growth (impressions, page views, app units, retention). Sales and Trends covers revenue (proceeds, units sold, subscriptions). They occasionally disagree on small numbers — use them for different questions.

How do I diagnose a drop?

Check in order: did you ship something; is it impressions or conversion; which sources dropped; seasonal comparison; competitive shifts.

App Analytics surfaced alongside metadata and versions.

AppConsul renders per-app analytics panels next to your metadata editor, so you can see conversion and source breakdowns while editing screenshots or subtitles. No tab-switching to correlate a metadata change with an install dip.

See AppConsul →