# How does it work?

<mark style="color:$danger;">**As of January 28, 2026, the Pixel Tracking feature is deprecated and is no longer actively maintained.**</mark>\ <mark style="color:$danger;">**It will remain available for teams who still rely on it, but we won’t ship further improvements or provide ongoing maintenance for it.**</mark>

<mark style="color:$danger;">**If you need a custom version (or an alternative approach), we can build and support a tailored solution; we're happy to discuss your needs on a quick call:**</mark> [<mark style="color:$danger;">**https://calendly.com/fabrice-sepret/shaping-the-future-of-posetracker**</mark>](https://calendly.com/fabrice-sepret/shaping-the-future-of-posetracker)

We provide a pixel that receives images, detects human body keypoints, and analyzes motion in those images (for example, counting squats, giving recommendations during push-ups, or tracking lunge progression).

<figure><img src="https://2260623413-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FVN2exBTzCYvSfEZLPE2d%2Fuploads%2FHp1oznqFcRvYt9HIuNjt%2FPixel%20tracking%20PoseTracker%20schema%20(2).png?alt=media&#x26;token=276804d1-2588-43bd-8651-429aede45907" alt=""><figcaption></figcaption></figure>
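To give a feel for what this kind of motion analysis looks like, here is a minimal sketch of counting squats from 2D keypoints: compute the knee angle from the hip, knee, and ankle points, and count a rep each time the angle dips below a "down" threshold and returns above an "up" threshold. The keypoint shape and the angle thresholds below are illustrative assumptions, not the pixel's actual API.

```javascript
// Angle (in degrees) at point b, formed by the segments b->a and b->c.
function jointAngle(a, b, c) {
  const ab = { x: a.x - b.x, y: a.y - b.y };
  const cb = { x: c.x - b.x, y: c.y - b.y };
  const dot = ab.x * cb.x + ab.y * cb.y;
  const cross = ab.x * cb.y - ab.y * cb.x;
  return Math.abs((Math.atan2(cross, dot) * 180) / Math.PI);
}

// Minimal rep counter (thresholds are assumptions): a rep completes when
// the knee angle dips below `downAngle` and then rises above `upAngle`.
function makeSquatCounter({ downAngle = 100, upAngle = 160 } = {}) {
  let reps = 0;
  let down = false;
  return function update({ hip, knee, ankle }) {
    const angle = jointAngle(hip, knee, ankle);
    if (angle < downAngle) down = true;
    if (down && angle > upAngle) {
      down = false;
      reps += 1;
    }
    return reps;
  };
}
```

Feeding the counter one pose per frame is enough; because everything runs on plain numbers, the same logic works for push-ups or lunges by swapping in the relevant joints.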

**Everything stays client-side:**

* ❌ no streaming
* ❌ no latency
* ❌ no privacy problems

**Pose estimation**

We track 17 keypoints of the human body on a 2D plane:

<figure><img src="https://2260623413-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FVN2exBTzCYvSfEZLPE2d%2Fuploads%2F3IuFnvRXjRx6ulgppcMq%2Fimage.png?alt=media&#x26;token=e59f6dd9-3cd9-4185-87f5-a5359dc7c7f4" alt=""><figcaption><p>PoseTracker keypoints</p></figcaption></figure>
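A 17-point 2D skeleton like the one pictured typically corresponds to the standard COCO body-keypoint set. Assuming that convention (the names and ordering below are the COCO standard, not confirmed PoseTracker identifiers), the points can be referenced by name:

```javascript
// The standard COCO 17-keypoint set (assumed ordering, shown for reference).
const KEYPOINT_NAMES = [
  "nose",
  "left_eye", "right_eye",
  "left_ear", "right_ear",
  "left_shoulder", "right_shoulder",
  "left_elbow", "right_elbow",
  "left_wrist", "right_wrist",
  "left_hip", "right_hip",
  "left_knee", "right_knee",
  "left_ankle", "right_ankle",
];
```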

**You can try it here**: <https://codepen.io/Fabrice-Sepret/pen/OJeBpMm>
