adding readthedocs
Snell1224 committed Jul 16, 2024
1 parent 02c899f commit 44a834e
Showing 269 changed files with 37,748 additions and 0 deletions.
22 changes: 22 additions & 0 deletions rtd/.readthedocs.yaml
@@ -0,0 +1,22 @@
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the version of Python and other tools you might need
build:
  os: ubuntu-22.04
  tools:
    python: "3.11"

# Build documentation in the docs/ directory with Sphinx
sphinx:
  configuration: rtd/docs/source/conf.py

# We recommend specifying your dependencies to enable reproducible builds:
# https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
  install:
    - requirements: rtd/docs/requirements.txt
46 changes: 46 additions & 0 deletions rtd/README.rst
@@ -0,0 +1,46 @@
OVIS-HPC LDMS Documentation
##############################

This repository hosts all LDMS-related documentation, such as how-to tutorials, getting started with LDMS, Docker Hub links, APIs and much more. The documentation webpage can be found at the `LDMS readthedocs webpage <https://ovis-hpc.readthedocs.io/projects/ldms/en/latest/>`_.

Contributing to ReadTheDocs
############################
Instructions and documentation on how to use ReadTheDocs can be found here:
`readthedocs Help Guide <https://sublime-and-sphinx-guide.readthedocs.io/en/latest/images.html>`_


* Clone the repository:

  .. code-block:: console

     > git clone git@github.com:<current-repo>/ovis-docs.git

* Add any existing file name(s) you will be editing to ``paper.lock``:

  .. code-block:: console

     > vi paper.lock
     <add Name | Date | File(s)>
     <username> | mm/dd | <filename>

* Make the necessary changes, remove your entry from the ``paper.lock`` file, and push to the repository:

  .. code-block:: console

     > vi paper.lock
     <remove your Name | Date | File(s) line>
     > git add <files>
     > git commit -m "add message"
     > git push
Adding A New File
******************
For any new RST files created, please include them in ``rtd/docs/src/index.rst`` under their corresponding sections. Any RST file not included in ``index.rst`` will not appear on the official webpage (readthedocs).
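For example, assuming a new file named ``mytutorial.rst`` (a hypothetical name used only for illustration), it would be listed in the relevant ``toctree`` of ``index.rst`` alongside the existing entries:

```rst
.. toctree::
   :maxdepth: 2

   asf-quickstart
   mytutorial
```

The entry is the file name without the ``.rst`` extension, relative to the directory containing ``index.rst``.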

Paper Lock
************
This is for claiming any sections you are working on so there is no overlap.
Please use ``paper.lock`` to indicate that you are editing an existing RST file.


35 changes: 35 additions & 0 deletions rtd/docs/make.bat
@@ -0,0 +1,35 @@
@ECHO OFF

pushd %~dp0

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
    set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build

if "%1" == "" goto help

%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
    echo.
    echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
    echo.installed, then set the SPHINXBUILD environment variable to point
    echo.to the full path of the 'sphinx-build' executable. Alternatively you
    echo.may add the Sphinx directory to PATH.
    echo.
    echo.If you don't have Sphinx installed, grab it from
    echo.http://sphinx-doc.org/
    exit /b 1
)

%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end

:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%

:end
popd
2 changes: 2 additions & 0 deletions rtd/docs/requirements.txt
@@ -0,0 +1,2 @@
# compatible with the newest version of Sphinx (v7.2.1)
sphinx_rtd_theme==1.3.0rc1
130 changes: 130 additions & 0 deletions rtd/docs/source/asf/asf-quickstart.rst
@@ -0,0 +1,130 @@
AppSysFusion Quick Start
==================================================================

Create A Simple Analysis
------------------------
To start, please create a folder called ``graf_analysis`` in your home directory and copy the following contents to a python file called ``dsosTemplate.py``:

* This is a python analysis that queries the DSOS database and returns a DataFrame of the ``meminfo`` schema metrics along with the ``timestamp``, ``component_id`` and ``job_id``.

dsosTemplate.py:

.. code-block:: python

   import os, sys, traceback
   import datetime as dt
   from graf_analysis.grafanaAnalysis import Analysis
   from sosdb import Sos
   import pandas as pd
   import numpy as np

   class dsosTemplate(Analysis):
       def __init__(self, cont, start, end, schema='meminfo', maxDataPoints=4096):
           super().__init__(cont, start, end, schema, 1000000)

       def get_data(self, metrics, filters=[], params=None):
           try:
               self.sel = f'select {",".join(metrics)} from {self.schema}'
               where_clause = self.get_where(filters, res=False)
               order = 'time_job_comp'
               orderby = 'order_by ' + order
               self.query.select(f'{self.sel} {where_clause} {orderby}')
               res = self.get_all_data(self.query)
               # Fun stuff here!
               print(res.head())
               return res
           except Exception as e:
               a, b, c = sys.exc_info()
               print(str(e) + ' ' + str(c.tb_lineno))
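The ``get_where`` helper comes from the ``Analysis`` base class, and its implementation is not shown here; conceptually it joins the filter strings into a SQL-like where clause. A hypothetical sketch of that idea (``build_where`` is an invented name, not the real API):

```python
def build_where(filters):
    """Join filter strings into a SQL-like where clause.

    Hypothetical sketch of what a get_where-style helper might do;
    the real Analysis.get_where() implementation may differ.
    """
    if not filters:
        return ''
    return 'where ' + ' and '.join(filters)

# An empty filter list yields an empty clause; multiple filters are ANDed.
print(build_where(['job_id > 0', 'component_id == 5']))
```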
.. note::

If you want to use this analysis module in a Grafana dashboard, you will need to ask your administrator to copy your new analysis module(s) into the directory that Grafana points to. This is because Grafana is set up to look in a specific directory for the analysis modules it queries.

Test Analysis via Terminal Window
----------------------------------
You can easily test your module without the Grafana interface by creating a python script that mimics the Grafana query and formats the returned DataFrame into timeseries or table JSON.

First, create the following file in the same directory as your python analysis (i.e. ``/user/home/graf_analysis/``) and label it ``testDSOSanalysis.py``.

* This python script imitates the Grafana query that calls your analysis module and will return a timeseries DataFrame of the ``Active`` and ``Inactive`` meminfo metrics.

.. code-block:: python

   #!/usr/bin/python3
   import time, sys
   from sosdb import Sos
   from grafanaFormatter import DataFormatter
   from table_formatter import table_formatter
   from time_series_formatter import time_series_formatter
   from dsosTemplate import dsosTemplate

   sess = Sos.Session("/<DSOS_CONFIG_PATH>/config/dsos.conf")
   cont = '<PATH_TO_DATABASE>'
   cont = sess.open(cont)

   model = dsosTemplate(cont, time.time()-300, time.time(), schema='meminfo', maxDataPoints=4096)
   x = model.get_data(['Active','Inactive'], filters=['job_id'], params='')

   #fmt = table_formatter(x)
   fmt = time_series_formatter(x)
   x = fmt.ret_json()
   print(x)
.. note::

You will need to provide the path to the DSOS container and the ``Sos.Session()`` configuration file in order to run this python script. Please see `Python Analysis Creation <pyanalysis.rst>`_ for more details.

* Next, run the test script:

  .. code-block:: bash

     python3 testDSOSanalysis.py
.. note::

All imports are python scripts that need to reside in the same directory as the test analysis module in order for it to run successfully.


Expected Results & Output
+++++++++++++++++++++++++
The following is an example test of an analysis module that queries the ``meminfo`` schema and returns a timeseries dataframe of the ``Active`` and ``Inactive`` metrics:

.. image:: ../images/grafana/grafana_output.png
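The JSON printed by ``time_series_formatter`` is what Grafana consumes. While the exact fields depend on the formatter implementation, Grafana's common time-series convention is one series object per metric, with datapoints given as ``[value, timestamp-in-milliseconds]`` pairs. A rough, assumed sketch of that shape (illustrative values, not real query output):

```python
import json

# Assumed illustration of Grafana-style time-series JSON (not the exact
# output of time_series_formatter): one series per metric, datapoints as
# [value, unix time in milliseconds].
series = [
    {"target": "Active",   "datapoints": [[1048576.0, 1721145600000], [1049000.0, 1721145605000]]},
    {"target": "Inactive", "datapoints": [[524288.0, 1721145600000], [524300.0, 1721145605000]]},
]
print(json.dumps(series, indent=2))
```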

Test Analysis via Grafana Dashboard
-----------------------------------
You can optionally test the analysis in a Grafana dashboard. This is not the preferred method because it is a bit more time consuming and, if there is a lot of data to query, there can be additional wait time as well.

Create A New Dashboard
++++++++++++++++++++++++++
To create a new dashboard, click on the + sign on the left side of the home page and select Dashboard. This will create a blank dashboard with an empty panel in it. Click the Add Query button on the panel to begin configuring the query to be sent to an analysis module.

.. note::

For more information on how to navigate around the Grafana dashboard and what the variables and advanced settings do, please see `Grafana Panel <grafanapanel>`_ and `Grafana Usage <grafanause>`_.

* Next, add your analysis by filling out the required fields shown below:

.. image:: ../images/grafana/grafana_query.png

* These fields are identical to those in the python script used for testing in a terminal window, so please refer to :ref:`Test Analysis via Terminal Window` or `Grafana Panel <grafanapanel>`_ for more details.

* Now change the analysis to query the last 5 minutes by selecting the down arrow at the top right of the panel and choosing "Last 5 minutes".

.. image:: ../images/grafana/grafana_time.png
:height: 250
:width: 50

* Then change the refresh rate to 5 seconds so that Grafana will automatically query the data every 5 seconds.

.. image:: ../images/grafana/grafana_timerange.png

* Now you should be able to see the "Active" and "Inactive" values for each job_id.




2 changes: 2 additions & 0 deletions rtd/docs/source/asf/asf-tutorial.rst
@@ -0,0 +1,2 @@
Additional ASF Tutorial Material
===================================
9 changes: 9 additions & 0 deletions rtd/docs/source/asf/deployment/index.rst
@@ -0,0 +1,9 @@
ASF Deployment
===============
This section covers how to deploy and test AppSysFusion.

.. toctree::
:maxdepth: 2

test

4 changes: 4 additions & 0 deletions rtd/docs/source/asf/deployment/test.rst
@@ -0,0 +1,4 @@
Github
======

Documentation for this is currently under development.
47 changes: 47 additions & 0 deletions rtd/docs/source/asf/grafanapanel.rst
@@ -0,0 +1,47 @@
Grafana Panel Creation with DSOS Plugin
========================================

To create a new dashboard, click on the + sign on the left side of the Grafana home page and select dashboard.
This will create a blank dashboard with an empty panel in it. Panels can be thought of as a visualization of a single query. Select the Add Query button on the panel to begin configuring the query to be sent to an analysis module.

Configuring the Query and Visualization
---------------------------------------
.. image:: ../images/grafana/grafanapanel.png

Once you right click on the panel title and select edit, the panel settings will appear. The first tab is for configuring the query. There are 8 fields in the query tab, defined below:

* Query Type - the type of query to perform. The most commonly used is "analysis", which calls an analysis module. "metrics" returns raw data without an analysis module.
* Query Format - the type of visualization to be used on the dataset. It is used by the Grafana Formatter to properly JSON-ify the data returned from the analysis module. Can be time_series, table, or heatmap.
* Analysis - required if you choose the analysis query type. Specifies the python module called to transform the data.
* Container - the name of the container to be used. This can be the full path to the container, or the Django backend's get_container function can be changed to customize this for site settings.
* Schema - the LDMS schema that will be passed into the analysis module
* Metric - a metric, or a comma-separated list of metrics (without spaces), to pass into the analysis module
* Extra Params - (Optional) an arbitrary string to pass into the analysis module
* Filters - (Optional) a NoSQL-like syntax for filtering your query; can be a comma-separated list of filters, e.g. component_id == 5,job_id > 0
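As an illustration, a comma-separated filter string like the one above could be split into individual clauses along these lines (a sketch only; the DSOS plugin's actual parsing may differ):

```python
def split_filters(filter_str):
    """Split a comma-separated filter string into individual clauses.

    Sketch only -- the DSOS plugin's actual parser may differ.
    """
    return [clause.strip() for clause in filter_str.split(',') if clause.strip()]

# Each resulting clause compares one field against a value.
print(split_filters('component_id == 5,job_id > 0'))
```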

The second tab in the panel settings is for visualization. Graph, Table, and Heatmap are the available visualizations for a query output.

Text, which uses Markdown, can also be used for dashboard descriptions or details. If you use a graph visualization, the Query Format should be time_series. If you use a table visualization, the Query Format should be table.

Graphs have multiple draw modes: bars, lines, and points. You can turn on any or all of these draw modes. You can also stack multiple time_series using the stack toggle button.

For more information about how to view the data and configure the panels, please see Grafana's `Panels and Visualization Documentation <https://grafana.com/docs/grafana/latest/panels-visualizations/>`_

Dashboard Variables and Advanced Settings
-------------------------------------------
.. image:: ../images/grafana/grafanapanel_variables.png

Often we want users to be able to change inputs to the queries; however, users cannot edit the queries themselves. What they can edit in Grafana are variables, which are listed at the top of the dashboard. These variables can be referenced with a ``$`` in front of the variable name. For example, we can let the user switch the SOS container they are interested in by creating a variable called container and then putting ``$container`` in the container field of the query. To create variables, go to the dashboard settings (the gear button at the top right) and select Variables. Here you can create new variables. Common variable types are text boxes, which users fill in, or queries. We can create a pre-populated list of options for certain fields by querying the container. Below are the queryable fields and the information to put in the query field for each.

* Container - select the custom option in the **Type** field and add the name of the container being used to query from in the **custom options** field.
* Schema - ``query=schema&container=<cont_name>``
* Index - ``query=index&container=<cont_name>&schema=<schema_name>``
* Metrics - ``query=metrics&container=<cont_name>&schema=<schema_name>``
* Component IDs - ``query=components&container=<cont_name>&schema=<schema_name>``
* Jobs - ``query=jobs&container=<cont_name>&schema=<schema_name>``

You can put variables in queries as well. For example, if you already have a $container variable, you can set the schema variable query to be ``query=schema&container=$container``. Then the ``$schema`` variable can be used in other queries.

In the dashboard settings you can also change the dashboard name and folder location and load previously saved versions.

Other than the container variable, all other variables bulleted above are set to query in the **Type** field.
2 changes: 2 additions & 0 deletions rtd/docs/source/asf/grafanause.rst
@@ -0,0 +1,2 @@
Basic Grafana Usage
===================
22 changes: 22 additions & 0 deletions rtd/docs/source/asf/index.rst
@@ -0,0 +1,22 @@
.. image:: ../images/appsysfusion.png
:width: 300
:height: 125
:align: center

ASF
====
AppSysFusion provides analysis and visualization capabilities aimed at serving insights from HPC monitoring data gathered with LDMS, though it could be generalized beyond that scope.
It combines a Grafana front-end with a Django back-end to perform in-query analyses on raw data and return transformed information to the end user.
By performing in-query analyses, only the data of interest to the end user is operated on, rather than the entirety of the dataset for all analyses for all time.
This saves significant computation and storage resources with the penalty of slightly higher query times.
These analyses are modular python scripts that can be easily added or changed to suit evolving needs.
The current implementation is aimed at querying DSOS databases containing LDMS data, though efforts are in progress to abstract this functionality out to other databases and datatypes.

.. toctree::
:maxdepth: 2

asf-quickstart
asf-tutorial
grafanapanel
grafanause
pyanalysis