Key Takeaways
- GA4 measures navigation, not cognition. Pageviews and bounce rates are proxies for engagement. Behavioural analytics measures what visitors actually read, where they paused, and where they got frustrated.
- PostHog exposes raw session data via API — unlike Hotjar and Clarity, which are visual-only tools. This enables programmatic analysis of scroll behaviour across hundreds of sessions, not one-at-a-time replay watching.
- Scroll dwell mapping converts GA4's single "average time on page" into a spatial attention distribution — showing exactly which sections hold attention and which are skipped.
- Scroll velocity reveals two-pass reading patterns. Sections that appear "skipped" on the first pass often accumulate significant dwell time on the return trip. Without velocity data, you'd misdiagnose them as ineffective.
- Rage clicks surface UX failures invisible to A/B testing — elements that aren't broken but are misleading, presenting affordances that don't exist.
What Does Behavioural Analytics Actually Mean?
Traditional analytics tools — GA4, Adobe Analytics — measure navigation. Page A to Page B. Time on site. Bounce rate. These are proxies for engagement, not measurements of it.
Behavioural analytics measures cognition. What did the visitor actually read? Where did they pause? What did they try to interact with? Where did they get frustrated?
PostHog captures this through rrweb session recordings — a frame-by-frame replay of every visitor's DOM interaction. From this raw data, we extract four analytical layers that GA4 structurally cannot provide:
| Layer | What It Measures | GA4 Equivalent |
|---|---|---|
| Scroll Dwell Mapping | Time spent at each vertical position on the page | None |
| Scroll Velocity | Reading speed — scanning vs. reading vs. studying | None |
| Interaction Mapping | Rage clicks, dead clicks, click density by page zone | None |
| Journey Reconstruction | Scroll depth + dwell pauses + navigation = intent narrative | Page path (no depth) |
Why Not Hotjar or Microsoft Clarity?
Session recordings aren't new. Hotjar and Microsoft Clarity have offered them for years. PostHog differs in three structural ways:
| Capability | PostHog | Hotjar | Clarity |
|---|---|---|---|
| Session recordings | ✓ | ✓ | ✓ |
| Product analytics in same platform | ✓ | ✕ | ✕ |
| API access to raw session data | ✓ | ✕ | ✕ |
| HogQL (SQL-like querying) | ✓ | ✕ | ✕ |
| Mobile app recording | ✓ | ✕ | ✕ |
| Feature flags + A/B testing | ✓ | Partial | ✕ |
| Self-hostable (data ownership) | ✓ | ✕ | ✕ |
| Free tier | 5,000 recordings/mo | 35 sessions/day | Unlimited |
| Script size | ~52KB (lazy-loads replay) | ~110KB | ~22KB |
The critical difference is the API. Hotjar and Clarity are visual tools — you watch recordings one at a time in a dashboard. PostHog exposes the raw rrweb snapshot data through its API, which means you can programmatically analyse scroll behaviour, velocity, and interaction patterns across hundreds of sessions at once. That's the difference between watching replays and doing behavioural science.
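As a sketch of what that programmatic access looks like, the snippet below walks PostHog's paginated `session_recordings` endpoint and collects recording IDs for later analysis. The endpoint path, the `results`/`next` response fields, and bearer-token auth reflect PostHog's documented REST API at the time of writing; treat the exact shapes as assumptions to verify against the current API reference.

```python
import json
import urllib.request

def extract_recording_ids(page: dict) -> list[str]:
    """Pull recording ids out of one page of API results."""
    return [r["id"] for r in page.get("results", [])]

def list_recording_ids(host: str, project_id: str, api_key: str) -> list[str]:
    """Walk the paginated session_recordings endpoint and collect ids.
    Endpoint path and pagination fields are assumptions based on
    PostHog's documented REST API."""
    ids: list[str] = []
    url = f"{host}/api/projects/{project_id}/session_recordings/"
    while url:
        req = urllib.request.Request(
            url, headers={"Authorization": f"Bearer {api_key}"}
        )
        with urllib.request.urlopen(req) as resp:
            page = json.load(resp)
        ids.extend(extract_recording_ids(page))
        url = page.get("next")  # None when there are no more pages
    return ids
```

With a list of IDs in hand, each recording's snapshot data can be fetched and analysed in bulk rather than replayed one at a time.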
How Does Scroll Dwell Mapping Work?
We ran PostHog on our own website for a week and analysed 15 non-internal sessions. For each session, we extracted scroll position data from the rrweb recordings and calculated cumulative dwell time — how long visitors spent at each 400-pixel vertical zone of the page.
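A minimal sketch of the dwell calculation, given `(timestamp_ms, scroll_y)` samples already extracted from a recording. The 400-pixel zone size matches the analysis above; the attribution rule (credit each interval to the zone visible at its start) is one reasonable choice, not the only one.

```python
from collections import defaultdict

ZONE_PX = 400  # vertical bucket size used in the analysis

def dwell_by_zone(samples: list[tuple[int, int]]) -> dict[int, float]:
    """samples: (timestamp_ms, scroll_y) pairs sorted by time.
    Returns seconds of dwell per 400px zone, crediting each
    interval between samples to the zone visible at its start."""
    dwell: dict[int, float] = defaultdict(float)
    for (t0, y0), (t1, _y1) in zip(samples, samples[1:]):
        dwell[y0 // ZONE_PX] += (t1 - t0) / 1000.0
    return dict(dwell)
```

Summing the resulting dict across all sessions yields the spatial attention distribution described above.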
Two findings stood out. First, 38% of all attention concentrated in the first two screens — the hero and value proposition. This isn't surprising, but quantifying it changes how you prioritise copy decisions.
Second, the "How it works" section received three times more dwell time than the services list. Visitors want to understand your process before they evaluate your offerings. Methodology outperforms features.
What Does Scroll Velocity Reveal About Reading Behaviour?
Dwell time tells you where visitors stopped. Velocity tells you how they moved. By calculating pixels per second between scroll events, we classified each page section as being read (under 300px/s), browsed (300-500px/s), or scanned (over 500px/s).
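The classification reduces to a pair of small functions; the thresholds are exactly the ones stated above (under 300px/s read, 300-500 browsed, over 500 scanned).

```python
def classify_velocity(px_per_s: float) -> str:
    """Thresholds from the analysis: <300 read, 300-500 browsed, >500 scanned."""
    if px_per_s < 300:
        return "read"
    if px_per_s <= 500:
        return "browsed"
    return "scanned"

def velocities(samples: list[tuple[int, int]]) -> list[float]:
    """Pixels per second between consecutive (timestamp_ms, scroll_y) samples."""
    return [
        abs(y1 - y0) / ((t1 - t0) / 1000.0)
        for (t0, y0), (t1, y1) in zip(samples, samples[1:])
        if t1 > t0
    ]
```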
This revealed a two-pass reading pattern. Visitors scrolled the full page at scanning speed on the first pass; testimonials flew by at 5,012 pixels per second. Then they scrolled back up and re-read specific sections at browsing speed. The testimonials that appeared "skipped" on the first pass actually accumulated 23 seconds of dwell time on the return trip.
Without velocity data, you might conclude that testimonials aren't working. With it, you see they're working — just on the second pass, not the first.
How Do Rage Clicks Expose Hidden UX Failures?
PostHog's rrweb recordings capture every mouse interaction at the DOM element level. We analysed click events for two frustration signals:
Rage clicks — three or more clicks within two seconds inside a 50-pixel radius. This indicates a user repeatedly clicking something that isn't responding as expected.
Dead clicks — clicks on non-interactive elements. These reveal gaps between what a user expects to be functional and what actually is.
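The rage-click rule can be sketched as below, anchoring the two-second window and 50-pixel radius on each cluster's first click. Real detectors vary in how they anchor the window and radius, so treat this as one concrete reading of the rule, not a canonical implementation.

```python
import math

RAGE_COUNT, RAGE_WINDOW_MS, RAGE_RADIUS = 3, 2000, 50

def rage_clusters(clicks: list[tuple[int, int, int]]) -> list[list[tuple[int, int, int]]]:
    """clicks: (timestamp_ms, x, y) sorted by time.
    Returns runs of >= 3 clicks that all fall within 2 seconds and
    50 pixels of the run's first click."""
    clusters, i = [], 0
    while i < len(clicks):
        t0, x0, y0 = clicks[i]
        j = i + 1
        while (j < len(clicks)
               and clicks[j][0] - t0 <= RAGE_WINDOW_MS
               and math.hypot(clicks[j][1] - x0, clicks[j][2] - y0) <= RAGE_RADIUS):
            j += 1
        if j - i >= RAGE_COUNT:
            clusters.append(clicks[i:j])
            i = j
        else:
            i += 1
    return clusters
```

Dead clicks are simpler: filter click events whose DOM target is not a link, button, or other interactive element.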
All 20 rage click clusters concentrated in the top 800 pixels of the page. The highest-density cluster — 10 rapid clicks — was on a visual element in the hero that resembled an interactive diagnostic tool. Users expected to click it and receive something. It was a static image.
This is the kind of finding that no amount of A/B testing would surface. The element wasn't broken — it was misleading. It presented an affordance that didn't exist. The fix is binary: make it interactive or remove it.
What Does Journey Reconstruction Tell You That Page Paths Can't?
The final layer combines scroll depth, dwell pauses, and navigation events into a session-level intent narrative. Consider one reconstructed journey from a visitor who arrived via a search ad.
GA4 would record this as a single pageview with 76 seconds of engagement time. The behavioural reconstruction reveals a visitor who read the entire page, evaluated every section, reached the conversion point, and chose not to act. The problem isn't awareness or interest — it's the form itself.
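The distinction that matters downstream can be sketched as a coarse session classifier; the threshold numbers here are illustrative, not taken from the analysis.

```python
def classify_session(max_depth_pct: float, dwell_s: float, submitted: bool) -> str:
    """Coarse intent label for one session.
    max_depth_pct: deepest scroll position as % of page height.
    Thresholds are illustrative placeholders."""
    if submitted:
        return "converted"
    if max_depth_pct < 25 and dwell_s < 10:
        return "bounced: low interest"
    if max_depth_pct > 90:
        return "read everything, did not convert: inspect the form/CTA"
    return "partial read: content drop-off"
```

The visitor above would land in the third bucket, which points the fix at the form rather than the copy.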
How Do You Implement PostHog Session Recording?
PostHog's session recording requires a single script tag — a 52KB core library that lazy-loads the replay module only when recording is active. Independent benchmarking shows 50-100ms added to mobile page load, with interaction delays of 0.02-0.07 seconds — below the threshold of user perception.
The analytical layer — scroll dwell mapping, velocity classification, rage click detection — is built on PostHog's snapshots API, which returns rrweb event data as JSONL. The calculations are deterministic: scroll position changes over time intervals, classified by velocity thresholds. This means the analysis can be fully automated and run across every session, not sampled.
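A minimal sketch of that parsing step, assuming the JSONL lines are standard rrweb events, where `type` 3 marks an incremental snapshot and `data.source` 3 marks a scroll. Field names follow the rrweb event schema; verify them against the snapshot payloads your PostHog instance actually returns.

```python
import json

def scroll_samples(jsonl: str) -> list[tuple[int, int]]:
    """Extract (timestamp_ms, scroll_y) pairs from rrweb events
    serialised as JSONL, one event per line."""
    out = []
    for line in jsonl.splitlines():
        if not line.strip():
            continue
        ev = json.loads(line)
        if ev.get("type") == 3 and ev.get("data", {}).get("source") == 3:
            out.append((ev["timestamp"], ev["data"]["y"]))
    return sorted(out)
```

The output of this function is exactly the input the dwell and velocity calculations consume, which is what makes the whole pipeline deterministic and automatable.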
The free tier includes 5,000 session recordings per month. For most B2B websites, that covers total traffic with room to spare.
What Changes When You Have Behavioural Data?
Every layer of this analysis produces a decision. The scroll dwell map tells you which sections to invest copywriting effort in. Velocity reveals whether your testimonials work on the first pass or the second. Rage clicks surface UX failures that look intentional from the design side. Journey reconstruction distinguishes between visitors who bounced from lack of interest and visitors who read everything and still didn't convert — two populations that require completely different responses.
The tools exist. The data is already being generated by every visitor. The question is whether you're reading it.
Frequently Asked Questions
What is behavioural analytics and how is it different from traditional web analytics?
Traditional web analytics tools like GA4 measure navigation — page A to page B, time on site, bounce rate. These are proxies for engagement, not measurements of it. Behavioural analytics measures cognition: what visitors actually read, where they paused, what they tried to interact with, and where they got frustrated. It uses session recording data (rrweb snapshots) to extract scroll dwell mapping, scroll velocity, interaction patterns, and journey reconstruction — four analytical layers that GA4 structurally cannot provide.
How does PostHog compare to Google Analytics GA4 for understanding user behaviour?
GA4 tells you aggregate metrics — 34% bounce rate, 84 seconds average time on page — with no indication of where attention was spent or how users moved through the page. PostHog captures frame-by-frame DOM interaction via rrweb session recordings, enabling scroll dwell mapping (time at each vertical position), scroll velocity analysis (reading vs. scanning speed), rage click detection, and full journey reconstruction. The critical difference is PostHog's API access to raw session data, allowing programmatic analysis across hundreds of sessions rather than watching replays one at a time.
What is scroll dwell mapping and how does it improve conversion rates?
Scroll dwell mapping calculates cumulative dwell time — how long visitors spend at each vertical zone of a page. Instead of GA4's single "average time on page" number, you get a spatial attention distribution showing exactly which sections hold attention and which are skipped. This reveals where to invest copywriting effort, which content sections outperform others, and whether visitors are reaching your conversion points. For example, our analysis showed 38% of attention concentrates in the first two screens, and that "How it works" content receives 3x more dwell time than feature lists.
PostHog vs Hotjar vs Microsoft Clarity — which session recording tool is best?
All three offer session recordings, but PostHog differs structurally. PostHog provides product analytics in the same platform, API access to raw session data, HogQL (SQL-like querying), mobile app recording, feature flags with A/B testing, and self-hosting for data ownership. Hotjar and Clarity are visual tools — you watch recordings one at a time. PostHog exposes raw rrweb snapshot data through its API, enabling programmatic analysis across hundreds of sessions at once. PostHog's free tier includes 5,000 recordings/month vs Hotjar's 35 sessions/day. Clarity is unlimited and free but lacks API access and advanced querying.
How do you detect rage clicks and what do they reveal about UX problems?
Rage clicks are detected as three or more clicks within two seconds inside a 50-pixel radius, indicating a user repeatedly clicking something that isn't responding as expected. Dead clicks — clicks on non-interactive elements — reveal gaps between what users expect to be functional and what actually is. In our analysis, all 20 rage click clusters concentrated in the top 800 pixels, with the highest-density cluster on a visual element that resembled an interactive tool but was a static image. These findings surface UX failures that no amount of A/B testing would detect — elements that aren't broken but are misleading.
TL;DR
GA4 tells you what happened (bounce rate, time on page). PostHog tells you why it happened — through scroll dwell mapping, velocity analysis, rage click detection, and journey reconstruction. The critical difference is API access to raw session data: PostHog lets you programmatically analyse behaviour across hundreds of sessions, not watch replays one at a time. The free tier (5,000 recordings/month) covers most B2B sites. Start with scroll dwell mapping — it converts a single "average time on page" number into a spatial attention distribution that directly informs where to invest copywriting and design effort.