Jump analysis (jump_analysis)

Measure vertical jump height using a pixel-to-cm scale (requires user height).

This mode measures vertical jump height from the user’s hip movement in pixels and converts it to centimeters, so it requires a user-provided height (userHeightCm).

Overview

What you get

  • Live jump detection and jump height estimates.

  • A one-time calibration step to lock a baseline and scale.

  • Optional device pitch correction.

Key features

  • Uses only hip keypoints (left_hip, right_hip).

  • Works in real time (/pose_tracker/tracking) and on video (/pose_tracker/upload_tracking).

  • Emits simple PostMessage events: posture, jump_calibration, jump_started, jump_height, jump_summary, error.

Prerequisites

Required query parameters

  • exercise=jump_analysis

  • token=...

  • userHeightCm=... (number, centimeters)

If userHeightCm is missing or invalid, the iframe sends an error event with:

  • error: "jump_analysis_missing_height"
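
On the host page, this case can be caught with a plain message listener. A minimal sketch, assuming the payload carries a type field alongside the error string (the exact envelope shape is not documented here):

```ts
// Assumed payload shape: { type: "error", error: string }
window.addEventListener("message", (event: MessageEvent) => {
  const data = event.data;
  if (data?.type === "error" && data.error === "jump_analysis_missing_height") {
    // Reload the iframe with a valid userHeightCm (number, in centimeters).
    console.warn("jump_analysis needs a valid userHeightCm query parameter.");
  }
});
```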

Optional query parameters

  • devicePitchDeg=... (number, degrees)

    • Used to compensate for a camera pitched up/down.

    • See Measurement algorithm.

Tracking prerequisites (pose)

  • Both hip keypoints must be detected with score > 0.5.

  • The full body should be visible.

  • The camera must stay fixed.

Comparison: jump_analysis vs air_time_jump

Use this when you can provide user height.

| Aspect | jump_analysis | air_time_jump |
| --- | --- | --- |
| User height | Required (userHeightCm) | Not required |
| Calibration | Yes (stable standing → baselineY + cmPerPixel from frame height) | No (dynamic baseline from recent frames) |
| Scale | Pixel-to-cm via cmPerPixel | Time-to-height via physics |
| Formula | jumpHeightCm = (baselineY - minY) * cmPerPixel | h = (1/8) * g * t², then converted to cm |
| Outputs | jumpHeightCm, deltaPixels, cmPerPixel, baselineY, minY | jumpHeightCm, airTimeMs, airTimeSeconds, baselineY, minY |
| Use when | User height is known; you want a scale tied to frame/user | User height unknown; you want a simple, physics-based estimate |

Also see: Air time jump (air_time_jump)

Exercise detection

Internally, this exercise is selected when any of these conditions match:

  • exerciseHandler.v2_movement.type === "jump_analysis"

  • exerciseHandler.type === "jump_analysis"

  • URL param exercise=jump_analysis

In integrations, always pass exercise=jump_analysis.

API endpoints

Live (camera)

Base URL: your pose-tracker host, with the path /pose_tracker/tracking.

Example:
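
For illustration only; the host domain and token are placeholders, and the query parameters are the ones listed under Prerequisites:

```
https://<your-pose-tracker-host>/pose_tracker/tracking?exercise=jump_analysis&token=YOUR_TOKEN&userHeightCm=178&devicePitchDeg=0
```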

Upload Tracking (video)

Base URL: your pose-tracker host, with the path /pose_tracker/upload_tracking.

Example:
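
For illustration only (placeholder host and token; the parameter that carries the video source is not covered in this section and is omitted):

```
https://<your-pose-tracker-host>/pose_tracker/upload_tracking?exercise=jump_analysis&token=YOUR_TOKEN&userHeightCm=178&export=true&skeleton=true
```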

Common optional flags on video:

  • export=true (keeps the last frame visible and enables export UI)

  • skeleton=true|false

  • keypoints=true (plan-gated)

PostMessage events

These events are sent from the iframe/WebView to your host app.

If you’re new to placement messages, read: Exercise Placement.
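
A minimal listener sketch, assuming each message carries a type field naming the event (the exact envelope is not spelled out here):

```ts
window.addEventListener("message", (event: MessageEvent) => {
  const msg = event.data; // assumed: { type: string, ...payload }
  switch (msg?.type) {
    case "posture":          /* placement guidance */ break;
    case "jump_calibration": /* calibration progress / ready */ break;
    case "jump_started":     /* upward movement detected */ break;
    case "jump_height":      /* live or final height */ break;
    case "jump_summary":     /* end-of-video summary */ break;
    case "error":            /* invalid config or runtime failure */ break;
  }
});
```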

posture

Sent continuously to guide placement.

jump_calibration (jump_analysis only)

Sent while the user is standing still for calibration.

Payload varies by version.

You should treat this as:

  • a progress/ready signal

  • plus optional debug fields (baseline, scale)
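
A sketch of how a host might treat it, using the ready flag from the user flow below and the baseline/scale debug fields named on this page (the envelope shape itself is an assumption):

```ts
interface JumpCalibrationMsg {
  type: "jump_calibration";
  ready?: boolean;     // progress / ready signal
  baselineY?: number;  // optional debug: hip baseline in pixels
  cmPerPixel?: number; // optional debug: pixel-to-cm scale
}

function onJumpCalibration(msg: JumpCalibrationMsg): void {
  if (msg.ready) {
    // Calibration locked: the user can jump now.
  } else {
    // Keep asking the user to stand still.
  }
}
```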

jump_started

Sent once when upward movement crosses the detection threshold.

jump_height

Sent during measurement and/or at the end of a jump.

Notes:

  • deltaPixels = baselineY - minY (hip Y decreases when the user goes up).

  • final=true indicates the jump is completed (landing detected).
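
A handling sketch built from the fields above (field names come from this page; the envelope shape is an assumption):

```ts
interface JumpHeightMsg {
  type: "jump_height";
  jumpHeightCm: number;
  deltaPixels: number;  // baselineY - minY
  cmPerPixel: number;
  baselineY: number;
  minY: number;
  final?: boolean;      // true once landing is detected
}

function onJumpHeight(msg: JumpHeightMsg): void {
  if (msg.final) {
    console.log(`Jump complete: ${msg.jumpHeightCm.toFixed(1)} cm`);
  } else {
    // Live update while the user is still in the air.
  }
}
```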

jump_summary (video)

Sent at the end of a video analysis.

Multiple jumps can be detected in a single clip.

Heights are in centimeters.
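
A hedged sketch of consuming the summary; the page only guarantees multiple jumps per clip with heights in centimeters, so the field names here are assumptions:

```ts
// Assumed shape: an array of per-jump results in centimeters.
interface JumpSummaryMsg {
  type: "jump_summary";
  jumps: Array<{ jumpHeightCm: number }>;
}

function bestJumpCm(msg: JumpSummaryMsg): number {
  return Math.max(...msg.jumps.map((j) => j.jumpHeightCm));
}
```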

error

Sent on invalid config or runtime failures.

Placement overlay behavior

  • Uses the standard placement system (posture).

  • You should keep the user centered and fully in frame.

  • Calibration only becomes ready when hip Y is stable for ~1 second.

User flow

  1. Placement: wait for posture.ready=true.

  2. Calibration: wait for jump_calibration.ready=true.

  3. Jump:

    • Receive jump_started.

    • Receive jump_height updates.

    • Consider the jump complete when jump_height.final=true.
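
One way to drive host UI state from this flow (a sketch; anything beyond the documented ready and final flags is an assumption):

```ts
type FlowState = "placement" | "calibrating" | "ready" | "jumping" | "done";
let state: FlowState = "placement";

window.addEventListener("message", (event: MessageEvent) => {
  const msg = event.data;
  switch (msg?.type) {
    case "posture":
      if (state === "placement" && msg.ready) state = "calibrating";
      break;
    case "jump_calibration":
      if (state === "calibrating" && msg.ready) state = "ready";
      break;
    case "jump_started":
      state = "jumping";
      break;
    case "jump_height":
      if (msg.final) state = "done";
      break;
  }
});
```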

Measurement algorithm

1) Hip signal

  • Read left_hip and right_hip keypoints.

  • Require score > 0.5.

  • Use a single hip Y signal (implementation may average hips).

2) Baseline calibration

  • The user must stand still for a short stable window (~1s).

  • Baseline: the stable hip Y observed during this window is stored as baselineY.

3) Pixel-to-cm scale

Scale is derived from user height and the frame height.

Important:

  • Current implementation does not use ankle-to-hip pixel distance.

  • It uses frame height × 0.9 as an approximate body span.
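
Putting those two statements together, the scale works out to something like the following sketch (an approximation of the described behavior, not the exact implementation):

```ts
// Approximate body span: 90% of the frame height, mapped to the user's real height.
function computeCmPerPixel(userHeightCm: number, frameHeightPx: number): number {
  const bodySpanPx = frameHeightPx * 0.9;
  return userHeightCm / bodySpanPx;
}

// Example: a 180 cm user in a 1280 px tall frame gives ~0.156 cm per pixel.
```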

4) Jump detection + landing

  • Jump starts when hip moves up by > 15 px from baseline.

  • Landing when hip returns to within 10 px of baseline.

5) Height calculation

During the jump, track the minimum hip Y (minY).
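
Combining the thresholds from step 4 with the formula from the comparison table, per-frame bookkeeping could look roughly like this (state-handling details are assumptions):

```ts
interface JumpTrackerState {
  baselineY: number;  // hip Y locked at calibration
  cmPerPixel: number; // scale from step 3
  inJump: boolean;
  minY: number;       // lowest hip Y (highest point on screen) during the jump
}

// Returns the final height in cm when landing is detected, otherwise null.
function onHipSample(state: JumpTrackerState, hipY: number): number | null {
  if (!state.inJump && state.baselineY - hipY > 15) {
    state.inJump = true; // hip moved up by more than 15 px
    state.minY = hipY;
  } else if (state.inJump) {
    state.minY = Math.min(state.minY, hipY);
    if (Math.abs(hipY - state.baselineY) < 10) {
      state.inJump = false; // back within 10 px of baseline: landed
      return (state.baselineY - state.minY) * state.cmPerPixel;
    }
  }
  return null;
}
```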

6) Optional pitch correction

If devicePitchDeg is provided, the measured height is corrected for camera pitch; a sketch follows below.

The handler clamps the pitch it uses.
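
The exact correction is not spelled out on this page. A plausible form, assuming pitch simply foreshortens vertical pixel motion by cos(pitch), would be:

```ts
// Assumption: measured height is divided by cos(pitch); the clamp range is illustrative.
function correctForPitch(jumpHeightCm: number, devicePitchDeg: number): number {
  const clampedDeg = Math.max(-45, Math.min(45, devicePitchDeg));
  const rad = (clampedDeg * Math.PI) / 180;
  return jumpHeightCm / Math.cos(rad);
}
```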

Integration examples

Web (iframe)
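
A hedged embed sketch; the host domain and token are placeholders, and the query parameters are the ones documented above:

```ts
const params = new URLSearchParams({
  exercise: "jump_analysis",
  token: "YOUR_TOKEN",    // placeholder
  userHeightCm: "178",
  devicePitchDeg: "0",    // optional
});

const iframe = document.createElement("iframe");
iframe.src = `https://<your-pose-tracker-host>/pose_tracker/tracking?${params}`;
iframe.allow = "camera";  // camera permission is typically required for live tracking
document.body.appendChild(iframe);
```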

Upload Tracking (video URL)

When export=true, the last analyzed frame stays visible.
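
A similar sketch for video analysis (placeholder host and token; the parameter that carries the video URL is not documented in this section, so it is left out):

```ts
const params = new URLSearchParams({
  exercise: "jump_analysis",
  token: "YOUR_TOKEN", // placeholder
  userHeightCm: "178",
  export: "true",      // keep the last frame visible and enable the export UI
  skeleton: "true",
});
// Append the parameter your plan uses to pass the video URL.

const iframe = document.createElement("iframe");
iframe.src = `https://<your-pose-tracker-host>/pose_tracker/upload_tracking?${params}`;
document.body.appendChild(iframe);
```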

Best practices and limitations

Best practices

  • Keep the camera fixed and avoid zoom changes.

  • Keep the user fully visible (especially hips).

  • Ask the user to stand still for calibration.

Limitations

  • Absolute height depends on the userHeightCm you provide.

  • Any camera pitch reduces vertical pixel motion. Use devicePitchDeg when you can.

  • This measures hip vertical displacement, not true COM displacement.

Troubleshooting

  • Getting jump_analysis_missing_height:

    • Pass userHeightCm as a number in centimeters.

  • Calibration never becomes ready:

    • Improve lighting.

    • Ensure hips are visible and stable.

    • Reduce camera shake.
