How can I view the output of LlamaIndex in a console or webpage format? LangGraph has LangSmith for this. Can LlamaIndex output be viewed in a similar way? Currently I see all the output in VS Code and have to cut and paste it to make it presentable. Is there a dedicated console for viewing the output? Just curious.
Replies: 1 comment 1 reply
LlamaIndex provides several ways to view and visualize output, both in the console and on a webpage.

Console Logging

You can enable basic logging in your application with the following code:

```python
import logging
import sys

logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))
```

Callback Handler

LlamaIndex offers a callback manager to help debug, track, and trace the inner workings of the library. You can set a simple callback handler like this:

```python
import llama_index.core

llama_index.core.set_global_handler("simple")
```

Observability Tools

LlamaIndex supports one-click integration with various observability tools, allowing you to view LLM/prompt inputs and outputs, verify that component outputs behave as expected, and inspect call traces for both indexing and querying. Supported tools include Langfuse, DeepEval, Weights and Biases Prompts, and more. You can configure these tools by setting environment variables and using the global handler.

Web-Based Visualization

For web-based visualization, you can use community-contributed plugins or extensions. For example, you can integrate LlamaIndex with Streamlit to create interactive web applications; more information can be found on the Streamlit blog. Additionally, LlamaIndex provides an example of visualizing knowledge graphs using Vis.js, which demonstrates how to create interactive graph visualizations that are particularly useful for web-based applications. You can find more details and the implementation in the LlamaIndex documentation [1][2][3][4]. For more detailed information, refer to the observability documentation.
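The console-logging setup described above can be verified with the standard library alone, no LlamaIndex install required. The logger name `llama_index.core` below merely stands in for the library's own loggers, which propagate to the root logger in the same way:

```python
import io
import logging


def capture_debug_logs() -> str:
    """Route DEBUG-level records to an in-memory stream (standing in for
    sys.stdout) and return what a library logger would print."""
    stream = io.StringIO()
    handler = logging.StreamHandler(stream)
    handler.setFormatter(logging.Formatter("%(name)s:%(levelname)s:%(message)s"))

    root = logging.getLogger()
    root.setLevel(logging.DEBUG)  # same effect as basicConfig(level=logging.DEBUG)
    root.addHandler(handler)
    try:
        # A child logger (e.g. one inside LlamaIndex) propagates to the root
        # handler, so its DEBUG output becomes visible on the stream.
        logging.getLogger("llama_index.core").debug("retrieved 3 nodes")
    finally:
        root.removeHandler(handler)
    return stream.getvalue()


print(capture_debug_logs(), end="")
```

Because child loggers propagate to the root by default, raising the root level to DEBUG and attaching a stdout handler is enough to surface the library's internal messages.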
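The "simple" global handler prints trace events as they happen. The pattern it follows can be illustrated with a minimal stand-in; note this class is a hypothetical sketch of the callback idea, not the real LlamaIndex API:

```python
from dataclasses import dataclass, field
from typing import Any, Callable


@dataclass
class SimpleTracer:
    """Illustrative stand-in for a 'simple' callback handler: prints each
    traced operation as it starts and finishes, and records the results."""
    events: list = field(default_factory=list)

    def trace(self, name: str, fn: Callable[..., Any], *args: Any) -> Any:
        print(f"** {name} started")
        result = fn(*args)
        print(f"** {name} finished -> {result!r}")
        self.events.append((name, result))
        return result


tracer = SimpleTracer()
tracer.trace("retrieve", lambda q: [q.upper()], "vector search")
```

Wrapping each operation this way is what lets a handler show prompt/response pairs in the console without changing the application code that calls the operations.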
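The Vis.js knowledge-graph approach can be sketched without LlamaIndex at all: emit an HTML page that loads the vis-network library and renders nodes and edges built from (subject, relation, object) triples. The CDN URL and the sample triple are illustrative:

```python
import json


def graph_html(triples: list[tuple[str, str, str]]) -> str:
    """Build a minimal vis-network page from (subject, relation, object)
    triples. The CDN URL is illustrative; pin a version in real use."""
    names = sorted({s for s, _, _ in triples} | {o for _, _, o in triples})
    nodes = [{"id": n, "label": n} for n in names]
    edges = [{"from": s, "to": o, "label": r} for s, r, o in triples]
    return f"""<!doctype html>
<html><head>
<script src="https://unpkg.com/vis-network/standalone/umd/vis-network.min.js"></script>
</head><body>
<div id="graph" style="height:600px"></div>
<script>
new vis.Network(document.getElementById("graph"),
  {{nodes: new vis.DataSet({json.dumps(nodes)}),
    edges: new vis.DataSet({json.dumps(edges)})}}, {{}});
</script>
</body></html>"""


html = graph_html([("LlamaIndex", "integrates with", "Langfuse")])
```

Writing the result to a file and opening it in a browser gives an interactive, draggable graph, which is the same delivery mechanism the Vis.js knowledge-graph example uses.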