diff --git a/images/observe-functions/local-inference-history.png b/images/observe-functions/local-inference-history.png
new file mode 100644
index 00000000..83a85a9d
Binary files /dev/null and b/images/observe-functions/local-inference-history.png differ
diff --git a/modus/quickstart.mdx b/modus/quickstart.mdx
index 82b61a9b..b41f7152 100644
--- a/modus/quickstart.mdx
+++ b/modus/quickstart.mdx
@@ -200,6 +200,20 @@ learn how to use the basic components of a Modus app and how to run it locally.
This links your project to the Hypermode platform, allowing you to use the model in your Modus app.
+
+
+
+ When testing an AI app locally, Modus records each inference and its related
+ metadata, which you can view in the `View Inferences` tab of the APIs explorer.
+
+
+ Local model tracing is only supported on Linux and macOS. Windows support is
+ coming soon.
+
+
+ ![local inference history](/images/observe-functions/local-inference-history.png)
+
+
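
For reference, here is a minimal sketch of the kind of Modus function whose local invocations appear in the `View Inferences` tab. It assumes an AssemblyScript Modus app with a model named `text-generator` declared in `modus.json`; the function name, model name, and prompt are illustrative and not taken from this change.

```ts
import { models } from "@hypermode/modus-sdk-as"
import {
  OpenAIChatModel,
  SystemMessage,
  UserMessage,
} from "@hypermode/modus-sdk-as/models/openai/chat"

// Invoking this function while `modus dev` is running produces an inference
// record (input, output, timestamp, duration) in the local APIs explorer's
// `View Inferences` tab.
export function generateText(prompt: string): string {
  // Look up the model declared as "text-generator" in modus.json (assumed name)
  const model = models.getModel<OpenAIChatModel>("text-generator")

  // Build the chat input for the model
  const input = model.createInput([
    new SystemMessage("You are a helpful assistant."),
    new UserMessage(prompt),
  ])
  input.temperature = 0.7

  // Invoke the model; Modus records this call and its metadata locally
  const output = model.invoke(input)
  return output.choices[0].message.content.trim()
}
```

Calling `generateText` through the GraphQL API that Modus generates locally is enough to produce an entry in the inference history.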
diff --git a/observe-functions.mdx b/observe-functions.mdx
index 2cd8b180..20792e9a 100644
--- a/observe-functions.mdx
+++ b/observe-functions.mdx
@@ -17,7 +17,7 @@ your functions.
For more information on recording info, warnings, and errors in your Modus app,
see [Error Handling](/modus/error-handling).
-## Inference history
+## Model tracing
For each model invocation, Hypermode records the model inference and related
metadata. This includes the input and output of the model, as well as the
@@ -26,4 +26,4 @@ timestamp and duration of the inference.
From your project, select the Inferences tab to view the inferences from your
app's models.
-
+