Using the Jam Data Export

Create an API Key for your Organization

 
You need to be an Owner of your organization. Go to the organization page and create an API key.

API Documentation - data-export Endpoint
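For a quick first call, the sketch below fetches one page from an export endpoint using only the Python standard library. The base URL, the `X-API-KEY` header, and the `page`/`limit` query parameters mirror the full example script further down; the API key is a placeholder.

```python
# Minimal sketch of a data-export request (standard library only).
# BASE_URL, the X-API-KEY header, and the page/limit parameters mirror
# the full example script below; replace API_KEY with your own key.
import json
import urllib.request

API_KEY = "YOUR_JAM_API_KEY"
BASE_URL = "https://api.wejam.ai/api/v1/data-exports"


def export_url(entity: str, page: int = 1, limit: int = 100) -> str:
    """Build the paginated export URL for an entity such as 'users' or 'sessions'."""
    return f"{BASE_URL}/{entity}?page={page}&limit={limit}"


def fetch_page(entity: str, page: int = 1) -> dict:
    """Fetch one page; the response body contains 'data' and 'meta' keys."""
    request = urllib.request.Request(
        export_url(entity, page),
        headers={"accept": "application/json", "X-API-KEY": API_KEY},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.load(response)


# Usage (requires a valid key):
# first = fetch_page("users")
# print(len(first["data"]), first["meta"].get("hasNext"))
```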

 

Example Use: Get the Most Active User by Sessions

 
Create a file most_active_user.py:
```python
# /// script
# dependencies = [
#     "requests<3",
#     "pandas",
# ]
# ///
from typing import Any

import pandas as pd
import requests

API_KEY: str = "YOUR_JAM_API_KEY"
BASE_URL: str = "https://api.wejam.ai/api/v1/data-exports"
HEADERS: dict[str, str] = {"accept": "application/json", "X-API-KEY": API_KEY}


def fetch_paginated(endpoint: str) -> list[dict[str, Any]]:
    page: int = 1
    results: list[dict[str, Any]] = []
    while True:
        url: str = f"{BASE_URL}/{endpoint}?page={page}&limit=100"
        response: requests.Response = requests.get(url, headers=HEADERS, timeout=10)
        response.raise_for_status()
        data: dict[str, Any] = response.json()
        results.extend(data["data"])
        if not data["meta"].get("hasNext"):
            break
        page += 1
    return results


def main() -> None:
    users_data: list[dict[str, Any]] = fetch_paginated("users")
    sessions_data: list[dict[str, Any]] = fetch_paginated("sessions")

    # Convert to DataFrames
    df_sessions: pd.DataFrame = pd.DataFrame(sessions_data)
    df_users: pd.DataFrame = pd.DataFrame(users_data)

    # Count sessions per user
    session_counts: pd.DataFrame = (
        df_sessions["learnerUserId"].value_counts().rename_axis("userId").reset_index(name="sessionCount")
    )

    # Join with users
    df_users["userId"] = df_users["id"]
    df_merged: pd.DataFrame = session_counts.merge(df_users, on="userId", how="left")

    # Get most active user
    most_active_user: pd.Series = df_merged.sort_values(by="sessionCount", ascending=False).iloc[0]

    print("Most Active User:")
    print(f"Name: {most_active_user['firstName']} {most_active_user['lastName']}")
    print(f"Email: {most_active_user['email']}")
    print(f"Sessions: {most_active_user['sessionCount']}")


if __name__ == "__main__":
    main()
```
 
Execute using uv (https://docs.astral.sh/uv/)
 
```shell
uv run most_active_user.py
```
 
 

Email from Clemens

1. Data integration via Jam’s API (ready to use)

With Jam’s API, you can easily extract all training data and display it in your LMS, CRM, or BI tools. Setup is straightforward, and our tech team is available to support you.
All documentation and links you’ll need to get started:
The most relevant entities for reporting are:
  • Users: users, teams
  • Training content: missions, tracks
  • Training activity: sprints, track-assignments, mission-assignments, and sessions (which include the overall score 0–100, your main performance metric, as well as all of the feedback).
These endpoints give you everything needed to build dashboards and analytics.
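As a sketch of the kind of reporting these entities enable, the snippet below averages the sessions' overall score (0–100) per learner with pandas. The field names `learnerUserId` and `score` are assumptions here (the first appears in the example script above, the second stands in for wherever the overall score lives in the session payload); check the API documentation for the exact names.

```python
# Hypothetical reporting sketch: average overall score per learner.
# The field names `learnerUserId` and `score` are assumptions; verify
# them against the data-export API documentation.
import pandas as pd

# In practice this list would come from the sessions export endpoint.
sessions = [
    {"learnerUserId": "u1", "score": 80},
    {"learnerUserId": "u1", "score": 90},
    {"learnerUserId": "u2", "score": 60},
]

df = pd.DataFrame(sessions)
avg_scores = df.groupby("learnerUserId")["score"].mean().round(1)
print(avg_scores.to_dict())  # → {'u1': 85.0, 'u2': 60.0}
```

The same groupby pattern extends to per-team or per-mission aggregates once the other exports are joined in on their IDs.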
Our CTO, Tobias Hölzer, and I would be happy to support your team in setting up the integration.
If you can share what you’d like to report on, we’ll guide you directly to the relevant data.

2. Linking Jam role-plays from other systems (ready to use)

Each role-play (track or mission) has a unique URL.
You can paste these links into your LMS, LXP, or other learning tools, allowing learners to launch specific Jam role-plays directly from within your systems. The prerequisite is that learners have a Jam account and are logged in.
That said, we recommend using the Jam app as your primary learning environment — it offers full functionality for content management, user management, dashboards, reporting, and coaching. Your other systems can then serve as the system of record, receiving the resulting data via our API (option 1).

3. Custom integrations (on request)

We’re happy to collaborate on custom integrations with your LMS/LXP or CRM, using standards such as SCORM or, preferably, CMI5/xAPI for a seamless native experience.
The required effort and timeline depend on the specific systems you use — we can discuss this in more detail once we know your preferred integration setup.
Please let us know which option best fits your needs and which specific tools/platforms at Xiaomi you would need integrations for, and we'll take it from there.