
[AMG] Migrate #7802

Merged · 80 commits · Aug 15, 2024 (diff shown: changes from 55 commits)

Commits
b0aa897
feat: initial migration support using backup & restore commands.
leozhang-msft Jun 25, 2024
d776567
merge
leozhang-msft Jul 2, 2024
ea5d07f
create backup_core to separate backup & make migrate
leozhang-msft Jul 2, 2024
42dabc0
port over dashboard for migrate and backup.
leozhang-msft Jul 3, 2024
2106c1d
rename get_dashboards to get_all_dashboards
leozhang-msft Jul 3, 2024
33a5c18
move over library panels
leozhang-msft Jul 3, 2024
994bd9a
in backup_core, support skipping external snapshots, only applicable …
leozhang-msft Jul 3, 2024
fc0d67c
refactor snapshots between backup_core & use in migrate/backup
leozhang-msft Jul 3, 2024
dfb4b1a
feat: change backup_core to have get_folders and have migrate and bac…
leozhang-msft Jul 5, 2024
c306bf1
Refactor backup_core to include get_all_annotations and update migrat…
leozhang-msft Jul 5, 2024
bb0cd42
refactor backup_core to have get, which returns and migrate and backu…
leozhang-msft Jul 5, 2024
c6ca1cb
refactor: restore of dashboards
leozhang-msft Jul 8, 2024
db4b0a1
refactor: make create_folder function to be used by migrate and restore.
leozhang-msft Jul 9, 2024
5c38026
refactor: library panels
leozhang-msft Jul 10, 2024
140ad9b
refactor: create_snapshot function for migrate & restore.
leozhang-msft Jul 10, 2024
6d5eb77
refactor: create_annotation for restore & migrate
leozhang-msft Jul 10, 2024
565187b
feat: datasources & mapping, fully port everything over to migrate.
leozhang-msft Jul 10, 2024
dd18a25
delete backup & restore from migrate.
leozhang-msft Jul 11, 2024
2ab8de1
Add dry_run flag to migrate & only create new folders.
leozhang-msft Jul 12, 2024
8bc5006
add dry_run logic and not sending request to remake datasource when w…
leozhang-msft Jul 12, 2024
d9d0731
add in summary
leozhang-msft Jul 15, 2024
e0d9e48
refactor: sync library panels during migration
leozhang-msft Jul 15, 2024
2eb57c7
Refactor migrate function to have dry run & summary
leozhang-msft Jul 15, 2024
e7d52f3
refactor migrate into having it's own functions.
leozhang-msft Jul 15, 2024
333e4a9
feat: Add override flags to migrate_grafana
leozhang-msft Jul 15, 2024
a001b63
add overwrite support for folders and library panels
leozhang-msft Jul 17, 2024
6a7882d
change override to overwrite to be more consistent.
leozhang-msft Jul 17, 2024
fddce63
update the summary text so that it better reflects what happens
leozhang-msft Jul 17, 2024
91db034
update library panel summary text to be correct.
leozhang-msft Jul 17, 2024
d3cd072
add better summary texts for dashboards.
leozhang-msft Jul 18, 2024
e6d7d3e
create new migrate test by copy and pasting the backup and restore te…
leozhang-msft Jul 22, 2024
4ac7edd
e2e except folders seem to work
leozhang-msft Jul 25, 2024
cafcf32
fix folder issue
leozhang-msft Jul 25, 2024
7ff1cd7
move the migrate tests to a new file.
leozhang-msft Jul 25, 2024
a346422
start writing the unit tests
leozhang-msft Jul 25, 2024
ee16c9b
migrate dry run flag test works
leozhang-msft Jul 25, 2024
5129a02
fix dashboard version edge case & finish writing all the tests
leozhang-msft Jul 29, 2024
4a4da3c
All recordings
leozhang-msft Jul 29, 2024
f4c5b6b
Add datasources warning
leozhang-msft Jul 30, 2024
5e9a810
fix edge case for library panels & folders & dashboards, depending on…
leozhang-msft Jul 30, 2024
0429ef8
cleanup dry run summary texts
leozhang-msft Jul 30, 2024
c1618f9
implement PR comment changes
leozhang-msft Jul 31, 2024
06169e0
add in support for --overwrite snapshots
leozhang-msft Aug 1, 2024
0c39c91
add in support for --overwrite for annotations
leozhang-msft Aug 1, 2024
8c80e69
Add in Jeremy's suggestions: refactor summary functions into more sum…
leozhang-msft Aug 1, 2024
bd28ba0
Update test recordings
leozhang-msft Aug 1, 2024
f908daa
Add header back to backup.py & change imports back to normal. Delete …
leozhang-msft Aug 1, 2024
12419cc
Clean up migrate logic
leozhang-msft Aug 1, 2024
f6c03c3
Make backup_core more consistent
leozhang-msft Aug 1, 2024
7ab8331
Use update_summary / update_summary_dict in migrate instead of duplic…
leozhang-msft Aug 1, 2024
5ad56ab
run autopep8 to fix style issues & manually fix some
leozhang-msft Aug 1, 2024
981d64d
fix most azdev style issues, dealing with too-many-locals
leozhang-msft Aug 1, 2024
b2f51e5
disable too-many-locals for migrate function
leozhang-msft Aug 1, 2024
b9498ad
Update recordings
leozhang-msft Aug 1, 2024
8438a17
add in accidentally removed prints
leozhang-msft Aug 1, 2024
97e1bb5
Merge branch 'main' of https://github.com/Azure/azure-cli-extensions
leozhang-msft Aug 6, 2024
90e7d39
change variable names, update HISTORY, make return more explict
leozhang-msft Aug 6, 2024
6efcf36
fix remapping test
leozhang-msft Aug 6, 2024
e9d230c
Update test recordings
leozhang-msft Aug 6, 2024
0b83a31
Add print statements
leozhang-msft Aug 6, 2024
00186f4
use different method to check for library panels for dashboards
leozhang-msft Aug 6, 2024
e23ecf8
Update tests to mock search_annotations.
leozhang-msft Aug 7, 2024
9a802d3
Update recordings
leozhang-msft Aug 7, 2024
ea57685
Delete resources after test complete.
leozhang-msft Aug 7, 2024
cf205a7
Update recordings
leozhang-msft Aug 7, 2024
7d4d077
add in copyright & get rid of useles imports and get rid of useless f…
leozhang-msft Aug 8, 2024
b32cce1
Change help message
leozhang-msft Aug 8, 2024
f2e2f43
do health check of source_url before calling the migrate function
leozhang-msft Aug 8, 2024
4d06109
Add check for same folders in include & exclude. Update valid folders…
leozhang-msft Aug 8, 2024
a7d8a69
fix a few minor bugs
leozhang-msft Aug 13, 2024
81205e6
re-record the tests
leozhang-msft Aug 13, 2024
e34cfe4
Merge branch 'main' of https://github.com/Azure/azure-cli-extensions
leozhang-msft Aug 14, 2024
bbfe9cd
redo recordings
leozhang-msft Aug 14, 2024
e88c0a3
fix style issues on custom.py (removing converting fstrings to string…
leozhang-msft Aug 14, 2024
3da0875
add 'meta' to fix an Grafana 8 migrate bug.
leozhang-msft Aug 14, 2024
295fc5b
azdev style line to long fix
leozhang-msft Aug 15, 2024
4800ab5
update history.rst
leozhang-msft Aug 15, 2024
78f92d9
rename migrate variables
leozhang-msft Aug 15, 2024
81a225a
cleanup the valid folder uids general edge case to be cleaner
leozhang-msft Aug 15, 2024
784c11b
fix indenting from azdev style amg
leozhang-msft Aug 15, 2024
9 changes: 9 additions & 0 deletions src/amg/azext_amg/_help.py
@@ -55,6 +55,15 @@
az grafana restore -g MyResourceGroup -n MyGrafana --archive-file backup\\dashboards\\ServiceHealth-202307051036.tar.gz --components dashboards folders --remap-data-sources
"""

helps['grafana migrate'] = """
type: command
short-summary: Migrate an existing Grafana instance to an Azure Managed Grafana instance.
examples:
- name: Migrate dashboards and folders from a local Grafana instance to an Azure Managed Grafana instance.
text: |
az grafana migrate -g MyResourceGroup -n MyGrafana -s http://localhost:3000 -t <token that starts with glsa_...>
"""

helps['grafana update'] = """
type: command
short-summary: Update an Azure Managed Grafana instance.
6 changes: 6 additions & 0 deletions src/amg/azext_amg/_params.py
@@ -67,6 +67,12 @@ def load_arguments(self, _):
c.argument("remap_data_sources", options_list=["-r", "--remap-data-sources"], arg_type=get_three_state_flag(),
help="during restoration, update dashboards to reference data sources defined at the destination workspace through name matching")

with self.argument_context("grafana migrate") as c:
c.argument("source_grafana_endpoint", options_list=["-s", "--src-endpoint"], help="Grafana instance endpoint to migrate from")
c.argument("source_grafana_token_or_api_key", options_list=["-t", "--src-token-or-key"], help="Grafana instance service token (or api key) to get access to migrate from")
c.argument("dry_run", options_list=["-d", "--dry-run"], arg_type=get_three_state_flag(), help="Preview changes without committing. Takes priority over --overwrite.")
c.argument("overwrite", options_list=["--overwrite"], arg_type=get_three_state_flag(), help="Overwrite previous dashboards, library panels, and folders with the same uid or title")

with self.argument_context("grafana dashboard") as c:
c.argument("uid", options_list=["--dashboard"], help="dashboard uid")
c.argument("title", help="title of a dashboard")
285 changes: 57 additions & 228 deletions src/amg/azext_amg/backup.py

Large diffs are not rendered by default.

238 changes: 238 additions & 0 deletions src/amg/azext_amg/backup_core.py
@@ -0,0 +1,238 @@
# --------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# --------------------------------------------------------------------------------------------

import json
import re
import time

from knack.log import get_logger

from .utils import search_dashboard, get_dashboard
from .utils import search_library_panels
from .utils import search_snapshot, get_snapshot
from .utils import search_folders, get_folder, get_folder_permissions
from .utils import search_datasource
from .utils import search_annotations

logger = get_logger(__name__)


def get_all_dashboards(grafana_url, http_headers, **kwargs):
limit = 5000 # limit is 5000 above V6.2+
current_page = 1

all_dashboards = []

# Go through all the pages, we are unsure how many pages there are
while True:
dashboards = _get_all_dashboards_in_grafana(current_page, limit, grafana_url, http_headers)

# only include what users want
folders_to_include = kwargs.get('folders_to_include')
folders_to_exclude = kwargs.get('folders_to_exclude')
if folders_to_include:
folders_to_include = [f.lower() for f in folders_to_include]
dashboards = [d for d in dashboards if (d.get('folderTitle', '').lower() in folders_to_include or
not d.get('folderTitle', '') and 'general' in folders_to_include)]
if folders_to_exclude:
folders_to_exclude = [f.lower() for f in folders_to_exclude]
dashboards = [d for d in dashboards if ((d.get('folderTitle', '')
and d.get('folderTitle', '').lower() not in folders_to_exclude)
or
(not d.get('folderTitle', '')
and 'general' not in folders_to_exclude))]

print_an_empty_line()
if len(dashboards) == 0:
break
current_page += 1
current_run_dashboards = _get_individual_dashboard_setting(dashboards, grafana_url, http_headers)
# add the previous list to the list where we added everything.
all_dashboards += current_run_dashboards
print_an_empty_line()

return all_dashboards
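
As context for reviewers, the paging strategy in `get_all_dashboards` can be sketched in isolation: request successive pages until one comes back empty, accumulating results. This is a minimal illustrative sketch, not code from the extension; `collect_all_pages` and `fetch_page` are hypothetical names standing in for the loop around `search_dashboard`.

```python
# Standalone sketch of the pagination pattern used above: keep requesting
# pages until an empty page is returned, then stop. fetch_page is a
# stand-in for search_dashboard (page, limit -> list of results).

def collect_all_pages(fetch_page, limit=5000):
    """Accumulate results from a paged API until a page comes back empty."""
    results = []
    page = 1
    while True:
        batch = fetch_page(page, limit)
        if not batch:
            break
        results.extend(batch)
        page += 1
    return results


if __name__ == "__main__":
    # Fake two pages of data; every later page is empty.
    pages = {1: ["a", "b"], 2: ["c"]}
    print(collect_all_pages(lambda p, _limit: pages.get(p, [])))  # prints ['a', 'b', 'c']
```

The empty-page sentinel avoids needing a total-count endpoint, at the cost of one extra request past the last page.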


def _get_all_dashboards_in_grafana(page, limit, grafana_url, http_headers):
(status, content) = search_dashboard(page,
limit,
grafana_url,
http_headers)
if status == 200:
dashboards = content
logger.info("There are %s dashboards:", len(dashboards))
for board in dashboards:
logger.info('name: %s', board['title'])
return dashboards
logger.warning("Get dashboards FAILED, status: %s, msg: %s", status, content)
return []


def _get_individual_dashboard_setting(dashboards, grafana_url, http_headers):
if not dashboards:
return []

all_individual_dashboards = []
for board in dashboards:
board_uri = "uid/" + board['uid']

(status, content) = get_dashboard(board_uri, grafana_url, http_headers)
if status == 200:
# do not back up provisioned dashboards
if content['meta']['provisioned']:
logger.warning("Dashboard: \"%s\" is provisioned, skipping...", board['title'])
continue

all_individual_dashboards.append(content)

return all_individual_dashboards
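
The provisioned-dashboard check above reduces to one predicate: a dashboard whose `meta.provisioned` flag is set is managed from files on disk and is skipped. A minimal sketch of that rule (the helper name `keep_for_backup` is illustrative, not part of the extension):

```python
# Illustrative predicate for the "skip provisioned dashboards" rule applied
# in _get_individual_dashboard_setting: True means the dashboard should be
# backed up, False means it is provisioned and is skipped.

def keep_for_backup(dashboard_content):
    return not dashboard_content.get('meta', {}).get('provisioned', False)


print(keep_for_backup({'meta': {'provisioned': True}}))   # prints False
print(keep_for_backup({'meta': {'provisioned': False}}))  # prints True
```

Using `.get` with defaults keeps the predicate safe on responses that omit `meta`, whereas the in-loop check assumes the key is present.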


def get_all_library_panels(grafana_url, http_headers):
all_panels = []
current_page = 1
while True:
panels = _get_all_library_panels_in_grafana(current_page, grafana_url, http_headers)

print_an_empty_line()
if len(panels) == 0:
break
current_page += 1

# Since we are not excluding anything. We can just add the panels to the
# list since this is all the data we need.
all_panels += panels
print_an_empty_line()

return all_panels


def _get_all_library_panels_in_grafana(page, grafana_url, http_headers):
(status, content) = search_library_panels(page, grafana_url, http_headers)
if status == 200:
library_panels = content
logger.info("There are %s library panels:", len(library_panels))
for panel in library_panels:
logger.info('name: %s', panel['name'])
return library_panels
logger.warning("Get library panel FAILED, status: %s, msg: %s", status, content)
return []


def get_all_snapshots(grafana_url, http_headers):
(status, content) = search_snapshot(grafana_url, http_headers)

if status != 200:
logger.warning("Query snapshot failed, status: %s, msg: %s", status, content)
return []

all_snapshots_metadata = []
for snapshot in content:
if not snapshot['external']:
all_snapshots_metadata.append(snapshot)
else:
logger.warning("External snapshot: %s is skipped", snapshot['name'])

logger.info("There are %s snapshots:", len(all_snapshots_metadata))

all_snapshots = []
for snapshot in all_snapshots_metadata:
logger.info(snapshot)

(individual_status, individual_content) = get_snapshot(snapshot['key'], grafana_url, http_headers)
if individual_status == 200:
all_snapshots.append((snapshot['key'], individual_content))
else:
logger.warning("Getting snapshot %s FAILED, status: %s, msg: %s",
snapshot['name'], individual_status, individual_content)

return all_snapshots


def get_all_folders(grafana_url, http_headers, **kwargs):
folders = _get_all_folders_in_grafana(grafana_url, http_get_headers=http_headers)

# only include what users want
folders_to_include = kwargs.get('folders_to_include')
folders_to_exclude = kwargs.get('folders_to_exclude')
if folders_to_include:
folders_to_include = [f.lower() for f in folders_to_include]
folders = [f for f in folders if f.get('title', '').lower() in folders_to_include]
if folders_to_exclude:
folders_to_exclude = [f.lower() for f in folders_to_exclude]
folders = [f for f in folders if f.get('title', '').lower() not in folders_to_exclude]

individual_folders = []
for folder in folders:
(status_folder_settings, content_folder_settings) = get_folder(folder['uid'], grafana_url, http_headers)
# TODO: get_folder_permissions usually requires admin permission but we
# don't save the permissions in backup or migrate. Figure out what to do.
(status_folder_permissions, content_folder_permissions) = get_folder_permissions(folder['uid'],
grafana_url,
http_headers)
if status_folder_settings == 200 and status_folder_permissions == 200:
individual_folders.append((content_folder_settings, content_folder_permissions))
else:
logger.warning("Getting folder %s FAILED", folder['title'])
logger.info("settings status: %s, settings content: %s, permissions status: %s, permissions content: %s",
status_folder_settings,
content_folder_settings,
status_folder_permissions,
content_folder_permissions)

return individual_folders
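
The case-insensitive include/exclude filtering in `get_all_folders` can be factored into a small pure function, which makes the precedence (include applied first, then exclude) easy to unit-test. A sketch under the assumption that folders are dicts with a `title` key; `filter_folders` is an illustrative name, not a helper in the extension:

```python
# Self-contained version of the include/exclude filtering above. Titles are
# matched case-insensitively; an include list is applied before an exclude
# list, so a folder named in both is excluded.

def filter_folders(folders, folders_to_include=None, folders_to_exclude=None):
    if folders_to_include:
        include = {f.lower() for f in folders_to_include}
        folders = [f for f in folders if f.get('title', '').lower() in include]
    if folders_to_exclude:
        exclude = {f.lower() for f in folders_to_exclude}
        folders = [f for f in folders if f.get('title', '').lower() not in exclude]
    return folders
```

Note that, unlike the dashboard filter earlier in this file, this folder filter has no special case mapping an empty title to the "General" folder.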


def _get_all_folders_in_grafana(grafana_url, http_get_headers):
status_and_content_of_all_folders = search_folders(grafana_url, http_get_headers)
status = status_and_content_of_all_folders[0]
content = status_and_content_of_all_folders[1]
if status == 200:
folders = content
logger.info("There are %s folders:", len(content))
for folder in folders:
logger.info("name: %s", folder['title'])
return folders
logger.warning("Get folders FAILED, status: %s, msg: %s", status, content)
return []


def get_all_annotations(grafana_url, http_headers):
all_annotations = []
now = int(round(time.time() * 1000))
one_month_in_ms = 31 * 24 * 60 * 60 * 1000

ts_to = now
ts_from = now - one_month_in_ms
thirteen_months_retention = now - (13 * one_month_in_ms)

while ts_from > thirteen_months_retention:
(status, content) = search_annotations(grafana_url, ts_from, ts_to, http_headers)
if status == 200:
annotations_batch = content
logger.info("There are %s annotations:", len(annotations_batch))
all_annotations += annotations_batch
else:
logger.warning("Query annotation FAILED, status: %s, msg: %s", status, content)

ts_to = ts_from
ts_from = ts_from - one_month_in_ms

return all_annotations
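
The retention loop in `get_all_annotations` walks backwards in fixed one-month (31-day) steps from "now". A sketch that computes just the `(ts_from, ts_to)` windows, assuming the same 31-day month and the same strict `>` floor comparison as above (`annotation_windows` is an illustrative name): note that with a floor of 13 months the strict comparison yields 12 windows, since the window starting exactly at the floor is not emitted.

```python
import time

ONE_MONTH_MS = 31 * 24 * 60 * 60 * 1000  # same 31-day "month" as above


def annotation_windows(now_ms=None, retention_months=13):
    """Return (ts_from, ts_to) pairs walking backwards one month at a time,
    mirroring the retention loop in get_all_annotations."""
    if now_ms is None:
        now_ms = int(round(time.time() * 1000))
    ts_to = now_ms
    ts_from = now_ms - ONE_MONTH_MS
    retention_floor = now_ms - retention_months * ONE_MONTH_MS
    windows = []
    while ts_from > retention_floor:
        windows.append((ts_from, ts_to))
        ts_to = ts_from
        ts_from -= ONE_MONTH_MS
    return windows
```

Windowing the query bounds each `search_annotations` response, at the cost of missing annotations older than the retention floor.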


def get_all_datasources(grafana_url, http_headers):
    (status, content) = search_datasource(grafana_url, http_headers)
    if status == 200:
        datasources = content
        logger.info("There are %s datasources:", len(datasources))
        return datasources

    logger.warning("Query datasource FAILED, status: %s, msg: %s", status, content)
    return []


def print_an_empty_line():
logger.info('')
1 change: 1 addition & 0 deletions src/amg/azext_amg/commands.py
@@ -18,6 +18,7 @@ def load_command_table(self, _):
g.custom_command('update', 'update_grafana')
g.custom_command('backup', 'backup_grafana', is_preview=True)
g.custom_command('restore', 'restore_grafana', is_preview=True)
g.custom_command('migrate', 'migrate_grafana', is_preview=True)

with self.command_group('grafana dashboard') as g:
g.custom_command('create', 'create_dashboard')
28 changes: 28 additions & 0 deletions src/amg/azext_amg/custom.py
@@ -297,6 +297,34 @@ def restore_grafana(cmd, grafana_name, archive_file, components=None, remap_data
destination_datasources=data_sources)


def migrate_grafana(cmd, grafana_name, source_grafana_endpoint, source_grafana_token_or_api_key, dry_run=False,
overwrite=False, folders_to_include=None, folders_to_exclude=None, resource_group_name=None):
from .migrate import migrate

# for source instance (backing up from)
headers_src = {
"content-type": "application/json",
"authorization": "Bearer " + source_grafana_token_or_api_key
}

# for destination instance (restoring to)
_health_endpoint_reachable(cmd, grafana_name, resource_group_name=resource_group_name)
creds_dest = _get_data_plane_creds(cmd, api_key_or_token=None, subscription=None)
headers_dest = {
"content-type": "application/json",
"authorization": "Bearer " + creds_dest[1]
}

migrate(backup_url=source_grafana_endpoint,
backup_headers=headers_src,
restore_url=_get_grafana_endpoint(cmd, resource_group_name, grafana_name, subscription=None),
restore_headers=headers_dest,
dry_run=dry_run,
overwrite=overwrite,
folders_to_include=folders_to_include,
folders_to_exclude=folders_to_exclude)


def sync_dashboard(cmd, source, destination, folders_to_include=None, folders_to_exclude=None,
dashboards_to_include=None, dashboards_to_exclude=None, dry_run=None):
from .sync import sync