From efb868881404995691309a7763c80c183088a362 Mon Sep 17 00:00:00 2001 From: miguelgfierro Date: Fri, 29 Dec 2023 09:03:13 +0100 Subject: [PATCH 1/5] change path hybrid Signed-off-by: miguelgfierro --- .../README.md | 2 + examples/02_model_hybrid/fm_deep_dive.ipynb | 912 -------- .../02_model_hybrid/lightfm_deep_dive.ipynb | 1956 ----------------- 3 files changed, 2 insertions(+), 2868 deletions(-) delete mode 100644 examples/02_model_hybrid/fm_deep_dive.ipynb delete mode 100755 examples/02_model_hybrid/lightfm_deep_dive.ipynb diff --git a/examples/02_model_collaborative_filtering/README.md b/examples/02_model_collaborative_filtering/README.md index 658af1f6f2..c4cb97cf8f 100644 --- a/examples/02_model_collaborative_filtering/README.md +++ b/examples/02_model_collaborative_filtering/README.md @@ -8,6 +8,8 @@ In this directory, notebooks are provided to give a deep dive of collaborative f | [baseline_deep_dive](baseline_deep_dive.ipynb) | --- | Deep dive on baseline performance estimation. | [cornac_bivae_deep_dive](cornac_bivae_deep_dive.ipynb) | Python CPU, GPU | Deep dive on the BiVAE algorithm and implementation. | [cornac_bpr_deep_dive](cornac_bpr_deep_dive.ipynb) | Python CPU | Deep dive on the BPR algorithm and implementation. +| [fm_deep_dive](fm_deep_dive.ipynb) | Python CPU | Deep dive into factorization machine (FM) and field-aware FM (FFM) algorithm. +| [lightfm_deep_dive](lightfm_deep_dive.ipynb) | Python CPU | Deep dive into matrix factorization model with LightFM. | [lightgcn_deep_dive](lightgcn_deep_dive.ipynb) | Python CPU, GPU | Deep dive on a LightGCN algorithm and implementation. | [multi_vae_deep_dive](multi_vae_deep_dive.ipynb) | Python CPU, GPU | Deep dive on the Multinomial VAE algorithm and implementation. | [ncf_deep_dive](ncf_deep_dive.ipynb) | Python CPU, GPU | Deep dive on a NCF algorithm and implementation. diff --git a/examples/02_model_hybrid/fm_deep_dive.ipynb b/examples/02_model_hybrid/fm_deep_dive.ipynb deleted file mode 100644 index 04d782e5a6..0000000000 --- a/examples/02_model_hybrid/fm_deep_dive.ipynb +++ /dev/null @@ -1,912 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Copyright (c) Recommenders contributors.\n", - "\n", - "Licensed under the MIT License." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Factorization Machine Deep Dive\n", - "\n", - "Factorization machine (FM) is one of the representative algorithms that are used for building hybrid recommenders model. The algorithm is powerful in terms of capturing the effects of not just the input features but also their interactions. The algorithm provides better generalization capability and expressiveness compared to other classic algorithms such as SVMs. The most recent research extends the basic FM algorithms by using deep learning techniques, which achieve remarkable improvement in a few practical use cases.\n", - "\n", - "This notebook presents a deep dive into the Factorization Machine algorithm, and demonstrates some best practices of using the contemporary FM implementations like [`xlearn`](https://github.com/aksnzhy/xlearn) for dealing with tasks like click-through rate prediction." 
- ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## 1 Factorization Machine" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 1.1 Factorization Machine" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "FM is an algorithm that uses factorization in prediction tasks with data sets of high sparsity. The algorithm was originally proposed in [\[1\]](https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf). Traditionally, algorithms such as SVM do not perform well in dealing with the highly sparse data that is usually seen in many contemporary problems, e.g., click-through rate prediction, recommendation, etc. FM handles the problem by modeling not just the first-order linear components for predicting the label, but also the cross-products of the feature variables, in order to capture more generalized correlations between the variables and the label. " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In many cases, the data that appears in recommendation problems, such as user, item, and feature vectors, can be encoded into a one-hot representation. Under this arrangement, classical algorithms like linear regression and SVM may suffer from the following problems:\n", - "1. The feature vectors are highly sparse, which makes it hard to optimize the parameters and fit the model efficiently\n", - "2. Cross-products of features will be sparse as well, and this, in turn, reduces the expressiveness of a model if it is designed to capture the high-order interactions between features" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The FM algorithm is designed to tackle the above two problems by factorizing the latent vectors that model the low- and high-order components. The general idea of an FM model is expressed in the following equation:" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "$$\hat{y}(\textbf{x})=w_{0}+\sum^{n}_{i=1}w_{i}x_{i}+\sum^{n}_{i=1}\sum^{n}_{j=i+1}<\textbf{v}_{i}, \textbf{v}_{j}>x_{i}x_{j}$$" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "where $\hat{y}$ and $\textbf{x}$ are the target to predict and the input feature vector, respectively. $w_{i}$ are the model parameters for the first-order component. $<\textbf{v}_{i}, \textbf{v}_{j}>$ is the dot product of two latent factors for the second-order interaction of feature variables, and it is defined as " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "$$<\textbf{v}_{i}, \textbf{v}_{j}>=\sum^{k}_{f=1}v_{i,f}\cdot v_{j,f}$$" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Compared to using fixed parameters for the high-order interaction components, using the factorized vectors increases the generalization as well as the expressiveness of the model. In addition to this, the computational complexity of the equation (above) is $O(kn)$, where $k$ and $n$ are the dimensionalities of the factorization vector and the input feature vector, respectively (see [the paper](https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf) for a detailed discussion). In practice, usually a two-way FM model is used, i.e., only the second-order feature interactions are considered, to favor computational efficiency."
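The pairwise term above can be evaluated in linear time by rewriting the double sum as $\frac{1}{2}\sum^{k}_{f=1}\left[\left(\sum^{n}_{i=1}v_{i,f}x_{i}\right)^{2}-\sum^{n}_{i=1}v_{i,f}^{2}x_{i}^{2}\right]$. The snippet below is a minimal NumPy sketch of a two-way FM score using this reformulation; it is not part of the original notebook, and all names are illustrative.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Two-way FM score for a single feature vector x of length n.

    w0 is the global bias, w holds the first-order weights (shape n),
    and V holds the latent factors (shape n x k).
    """
    linear = w0 + w @ x
    vx = V.T @ x  # shape (k,): sum_i v_{i,f} * x_i for each factor f
    pairwise = 0.5 * np.sum(vx**2 - (V**2).T @ (x**2))
    return linear + pairwise

# Toy example with random parameters and a sparse binary input
rng = np.random.default_rng(42)
n, k = 6, 3
x = rng.integers(0, 2, size=n).astype(float)
print(fm_predict(x, 0.1, rng.normal(size=n), rng.normal(size=(n, k))))
```

This reformulation is what keeps the prediction cost at $O(kn)$, as noted above.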
- ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 1.2 Field-Aware Factorization Machine" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Field-aware factorization machine (FFM) is an extension to FM. It was originally introduced in [\[2\]](https://www.csie.ntu.edu.tw/~cjlin/papers/ffm.pdf). The advantage of FFM over FM is that it uses different factorized latent factors for different groups of features. The \"group\" is called a \"field\" in the context of FFM. Putting features into fields resolves the issue that latent factors shared by features which intuitively represent different categories of information may not generalize the correlation well. \n", - "\n", - "Different from the formula for the second-order cross product seen above in the FM equation, in the FFM setting the equation changes to " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "$$\theta_{\text{FFM}}(\textbf{w}, \textbf{x})=\sum^{n}_{j_{1}=1}\sum^{n}_{j_{2}=j_{1}+1}<\textbf{v}_{j_{1},f_{2}}, \textbf{v}_{j_{2},f_{1}}>x_{j_{1}}x_{j_{2}}$$" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "where $f_1$ and $f_2$ are the fields of $j_1$ and $j_2$, respectively." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Compared to FM, the computational complexity increases to $O(n^2k)$. However, since the latent factors in FFM only need to learn the effect within a field, the value of $k$ in FFM is usually much smaller than that in FM." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 1.3 FM/FFM extensions" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In recent years, FM/FFM extensions have been proposed to further enhance model performance. These algorithms leverage deep neural networks to improve the generalization capability of the original FM/FFM algorithms. Representative algorithms are summarized below. Some of them are implemented and demonstrated in the microsoft/recommenders repository. 
\n", - "\n", - "|Algorithm|Notes|References|Example in Microsoft/Recommenders|\n", - "|---------|-----|----------|---------------------------------|\n", - "|DeepFM|Combination of FM and DNN where DNN handles high-order interactions|[\\[3\\]](https://arxiv.org/abs/1703.04247)|-|\n", - "|xDeepFM|Combination of FM, DNN, and Compressed Interaction Network, for vectorized feature interactions|[\\[4\\]](https://dl.acm.org/citation.cfm?id=3220023)|[notebook](../00_quick_start/xdeepfm_criteo.ipynb) / [utilities](../../recommenders/models/deeprec/models/xDeepFM.py)|\n", - "|Factorization Machine Supported Neural Network|Use FM user/item weight vectors as input layers for DNN model|[\\[5\\]](https://link.springer.com/chapter/10.1007/978-3-319-30671-1_4)|-|\n", - "|Product-based Neural Network|An additional product-wise layer between embedding layer and fully connected layer to improve expressiveness of interactions of features across fields|[\\[6\\]](https://ieeexplore.ieee.org/abstract/document/7837964)|-|\n", - "|Neural Factorization Machines|Improve the factorization part of FM by using stacks of NN layers to improve non-linear expressiveness|[\\[7\\]](https://dl.acm.org/citation.cfm?id=3080777)|-|\n", - "|Wide and deep|Combination of linear model (wide part) and deep neural network model (deep part) for memorisation and generalization|[\\[8\\]](https://dl.acm.org/citation.cfm?id=2988454)|[notebook](../00_quick_start/wide_deep_movielens.ipynb) / [utilities](../../recommenders/models/wide_deep)|" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## 2 Factorization Machine Implementation" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.1 Implementations" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The following table summarizes the implementations of FM/FFM. Some of them (e.g., xDeepFM and VW) are implemented and/or demonstrated in the microsoft/recommenders repository" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "|Implementation|Language|Notes|Examples in Microsoft/Recommenders|\n", - "|-----------------|------------------|------------------|---------------------|\n", - "|[libfm](https://github.com/srendle/libfm)|C++|Implementation of FM algorithm|-|\n", - "|[libffm](https://github.com/ycjuan/libffm)|C++|Original implemenation of FFM algorithm. It is handy in model building, but does not support Python interface|-|\n", - "|[xlearn](https://github.com/aksnzhy/xlearn)|C++ with Python interface|More computationally efficient compared to libffm without loss of modeling effectiveness|[notebook](fm_deep_dive.ipynb)|\n", - "|[Vowpal Wabbit FM](https://github.com/VowpalWabbit/vowpal_wabbit/wiki/Matrix-factorization-example)|Online library with estimator API|Easy to use by calling API|[notebook](../02_model_content_based_filtering/vowpal_wabbit_deep_dive.ipynb) / [utilities](../../recommenders/models/vowpal_wabbit)\n", - "|[microsoft/recommenders xDeepFM](../../recommenders/models/deeprec/models/xDeepFM.py)|Python|Support flexible interface with different configurations of FM and FM extensions, i.e., LR, FM, and/or CIN|[notebook](../00_quick_start/xdeepfm_criteo.ipynb) / [utilities](../../recommenders/models/deeprec/models/xDeepFM.py)|" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Other than `libfm` and `libffm`, all the other three can be used in a Python environment. 
\n", - "\n", - "* A deep dive of using Vowbal Wabbit for FM model can be found [here](../02_model_content_based_filtering/vowpal_wabbit_deep_dive.ipynb)\n", - "* A quick start of Microsoft xDeepFM algorithm can be found [here](../00_quick_start/xdeepfm_criteo.ipynb). \n", - "\n", - "Therefore, in the example below, only code examples and best practices of using `xlearn` are presented." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.2 xlearn" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Setups for using `xlearn`.\n", - "\n", - "1. `xlearn` is implemented in C++ and has Python bindings, so it can be directly installed as a Python package from PyPI. The installation of `xlearn` is enabled in the [Recommenders repo environment setup script](../../tools/generate_conda_file.py). One can follow the general setup steps to install the environment as required, in which `xlearn` is installed as well.\n", - "2. NOTE `xlearn` may require some base libraries installed as prerequisites in the system, e.g., `cmake`." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "After a succesful creation of the environment, one can load the packages to run `xlearn` in a Jupyter notebook or Python script." - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "System version: 3.6.13 |Anaconda, Inc.| (default, Feb 23 2021, 21:15:04) \n", - "[GCC 7.3.0]\n", - "Xlearn version: 0.4.0\n" - ] - } - ], - "source": [ - "import os\n", - "import sys\n", - "from tempfile import TemporaryDirectory\n", - "import xlearn as xl\n", - "from sklearn.metrics import roc_auc_score\n", - "import numpy as np\n", - "import pandas as pd\n", - "import seaborn as sns\n", - "%matplotlib notebook\n", - "from matplotlib import pyplot as plt\n", - "\n", - "from recommenders.utils.constants import SEED\n", - "from recommenders.utils.timer import Timer\n", - "from recommenders.datasets.download_utils import maybe_download, unzip_file\n", - "from recommenders.tuning.parameter_sweep import generate_param_grid\n", - "from recommenders.datasets.pandas_df_utils import LibffmConverter\n", - "from recommenders.utils.notebook_utils import store_metadata\n", - "\n", - "print(\"System version: {}\".format(sys.version))\n", - "print(\"Xlearn version: {}\".format(xl.__version__))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In the FM model building, data is usually represented in the libsvm data format. That is, `label feat1:val1 feat2:val2 ...`, where `label` is the target to predict, and `val` is the value to each feature `feat`.\n", - "\n", - "FFM algorithm requires data to be represented in the libffm format, where each vector is split into several fields with categorical/numerical features inside. That is, `label field1:feat1:val1 field2:feat2:val2 ...`." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In the Microsoft/Recommenders utility functions, [a libffm converter](../../recommenders/dataset/pandas_df_utils.py) is provided to achieve the transformation from a tabular feature vectors to the corresponding libffm representation. For example, the following shows how to transform the format of a synthesized data by using the module of `LibffmConverter`." - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
ratingfield1field2field3field4
011:1:12:4:33:5:1.04:6:1
101:2:12:4:43:5:2.04:7:1
201:3:12:4:53:5:3.04:8:1
311:3:12:4:63:5:4.04:9:1
411:3:12:4:73:5:5.04:10:1
\n", - "
" - ], - "text/plain": [ - " rating field1 field2 field3 field4\n", - "0 1 1:1:1 2:4:3 3:5:1.0 4:6:1\n", - "1 0 1:2:1 2:4:4 3:5:2.0 4:7:1\n", - "2 0 1:3:1 2:4:5 3:5:3.0 4:8:1\n", - "3 1 1:3:1 2:4:6 3:5:4.0 4:9:1\n", - "4 1 1:3:1 2:4:7 3:5:5.0 4:10:1" - ] - }, - "execution_count": 2, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "df_feature_original = pd.DataFrame(\n", - " {\n", - " \"rating\": [1, 0, 0, 1, 1],\n", - " \"field1\": [\"xxx1\", \"xxx2\", \"xxx4\", \"xxx4\", \"xxx4\"],\n", - " \"field2\": [3, 4, 5, 6, 7],\n", - " \"field3\": [1.0, 2.0, 3.0, 4.0, 5.0],\n", - " \"field4\": [\"1\", \"2\", \"3\", \"4\", \"5\"],\n", - " }\n", - ")\n", - "\n", - "converter = LibffmConverter().fit(df_feature_original, col_rating=\"rating\")\n", - "df_out = converter.transform(df_feature_original)\n", - "df_out\n" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "There are in total 4 fields and 10 features.\n" - ] - } - ], - "source": [ - "print(\n", - " \"There are in total {0} fields and {1} features.\".format(\n", - " converter.field_count, converter.feature_count\n", - " )\n", - ")\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "To illustrate the use of `xlearn`, the following example uses the [Criteo data set](https://labs.criteo.com/category/dataset/), which has already been processed in the libffm format, for building and evaluating a FFM model built by using `xlearn`. Sometimes, it is important to know the total numbers of fields and features. When building a FFM model, `xlearn` can count these numbers automatically." - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": { - "tags": [ - "parameters" - ] - }, - "outputs": [], - "source": [ - "# Model parameters\n", - "LEARNING_RATE = 0.2\n", - "LAMBDA = 0.002\n", - "EPOCH = 10\n", - "OPT_METHOD = \"sgd\" # options are \"sgd\", \"adagrad\" and \"ftrl\"\n", - "\n", - "# The metrics for binary classification options are \"acc\", \"prec\", \"f1\" and \"auc\"\n", - "# for regression, options are \"rmse\", \"mae\", \"mape\"\n", - "METRIC = \"auc\"\n" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "100%|██████████| 10.3k/10.3k [00:00<00:00, 55.9kKB/s]\n" - ] - } - ], - "source": [ - "# Paths\n", - "YAML_FILE_NAME = \"xDeepFM.yaml\"\n", - "TRAIN_FILE_NAME = \"cretio_tiny_train\"\n", - "VALID_FILE_NAME = \"cretio_tiny_valid\"\n", - "TEST_FILE_NAME = \"cretio_tiny_test\"\n", - "MODEL_FILE_NAME = \"model.out\"\n", - "OUTPUT_FILE_NAME = \"output.txt\"\n", - "\n", - "tmpdir = TemporaryDirectory()\n", - "\n", - "data_path = tmpdir.name\n", - "yaml_file = os.path.join(data_path, YAML_FILE_NAME)\n", - "train_file = os.path.join(data_path, TRAIN_FILE_NAME)\n", - "valid_file = os.path.join(data_path, VALID_FILE_NAME)\n", - "test_file = os.path.join(data_path, TEST_FILE_NAME)\n", - "model_file = os.path.join(data_path, MODEL_FILE_NAME)\n", - "output_file = os.path.join(data_path, OUTPUT_FILE_NAME)\n", - "\n", - "assets_url = (\n", - " \"https://recodatasets.z20.web.core.windows.net/deeprec/xdeepfmresources.zip\"\n", - ")\n", - "assets_file = maybe_download(assets_url, work_directory=data_path)\n", - "unzip_file(assets_file, data_path)\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The following steps are from the [official documentation 
of `xlearn`](https://xlearn-doc.readthedocs.io/en/latest/index.html) for building a model. To begin with, we do not modify any training parameter values. " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "NOTE, if `xlearn` is run through command line, the training process can be displayed in the console." - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Training time: 14.4424\n" - ] - } - ], - "source": [ - "# Training task\n", - "ffm_model = xl.create_ffm() # Use field-aware factorization machine (ffm)\n", - "ffm_model.setTrain(train_file) # Set the path of training dataset\n", - "ffm_model.setValidate(valid_file) # Set the path of validation dataset\n", - "\n", - "# Parameters:\n", - "# 0. task: binary classification\n", - "# 1. learning rate: 0.2\n", - "# 2. regular lambda: 0.002\n", - "# 3. evaluation metric: auc\n", - "# 4. number of epochs: 10\n", - "# 5. optimization method: sgd\n", - "param = {\n", - " \"task\": \"binary\",\n", - " \"lr\": LEARNING_RATE,\n", - " \"lambda\": LAMBDA,\n", - " \"metric\": METRIC,\n", - " \"epoch\": EPOCH,\n", - " \"opt\": OPT_METHOD,\n", - "}\n", - "\n", - "# Start to train\n", - "# The trained model will be stored in model.out\n", - "with Timer() as time_train:\n", - " ffm_model.fit(param, model_file)\n", - "print(f\"Training time: {time_train}\")\n" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Prediction time: 0.6435\n" - ] - } - ], - "source": [ - "# Prediction task\n", - "ffm_model.setTest(test_file) # Set the path of test dataset\n", - "ffm_model.setSigmoid() # Convert output to 0-1\n", - "\n", - "# Start to predict\n", - "# The output result will be stored in output.txt\n", - "with Timer() as time_predict:\n", - " ffm_model.predict(model_file, output_file)\n", - "print(f\"Prediction time: {time_predict}\")\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The output are the predicted labels (i.e., 1 or 0) for the testing data set. AUC score is calculated to evaluate the model performance." - ] - }, - { - "cell_type": "code", - "execution_count": 14, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "0.7485411618010794" - ] - }, - "execution_count": 14, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "with open(output_file) as f:\n", - " predictions = f.readlines()\n", - "\n", - "with open(test_file) as f:\n", - " truths = f.readlines()\n", - "\n", - "truths = np.array([float(truth.split(\" \")[0]) for truth in truths])\n", - "predictions = np.array([float(prediction.strip(\"\")) for prediction in predictions])\n", - "\n", - "auc_score = roc_auc_score(truths, predictions)\n", - "\n", - "print(auc_score)\n" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "data": { - "application/papermill.record+json": { - "auc_score": 0.7498803439718372 - } - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "store_metadata(\"auc_score\", auc_score)\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "It can be seen that the model building/scoring process is fast and the model performance is good. 
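The example above trains an FFM model. For comparison, the same `xlearn` Python API can be used to build a plain FM model via `xl.create_fm()`. The sketch below is illustrative only and is not part of the original notebook: per the `xlearn` documentation, the FM model consumes libsvm/CSV formatted data rather than libffm, so the file paths here are hypothetical placeholders.

```python
# Hypothetical libsvm-formatted files (placeholders, not provided by this notebook)
fm_train_file = os.path.join(data_path, "train.libsvm")
fm_valid_file = os.path.join(data_path, "valid.libsvm")
fm_model_file = os.path.join(data_path, "fm_model.out")

fm_model = xl.create_fm()          # factorization machine instead of FFM
fm_model.setTrain(fm_train_file)
fm_model.setValidate(fm_valid_file)

fm_param = {
    "task": "binary",
    "lr": LEARNING_RATE,
    "lambda": LAMBDA,
    "metric": METRIC,
    "epoch": EPOCH,
    "opt": OPT_METHOD,
    "k": 4,  # dimensionality of the latent factors
}

fm_model.fit(fm_param, fm_model_file)
```

Scoring would then follow the same `setTest`/`setSigmoid`/`predict` pattern shown above for the FFM model.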
" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.3 Hyperparameter tuning of `xlearn`" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The following presents a naive approach to tune the parameters of `xlearn`, which is using grid-search of parameter values to find the optimal combinations. It is worth noting that the original [FFM paper](https://www.csie.ntu.edu.tw/~cjlin/papers/ffm.pdf) gave some hints in terms of the impact of parameters on the sampled Criteo dataset. \n", - "\n", - "The following are the parameters that can be tuned in the `xlearn` implementation of FM/FFM algorithm." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "|Parameter|Description|Default value|Notes|\n", - "|-------------|-----------------|------------------|-----------------|\n", - "|`lr`|Learning rate|0.2|Higher learning rate helps fit a model more efficiently but may also result in overfitting.|\n", - "|`lambda`|Regularization parameter|0.00002|The value needs to be selected empirically to avoid overfitting.|\n", - "|`k`|Dimensionality of the latent factors|4|In FFM the effect of k is not that significant as the algorithm itself considers field where `k` can be small to capture the effect of features within each of the fields.|\n", - "|`init`|Model initialization|0.66|-|\n", - "|`epoch`|Number of epochs|10|Using a larger epoch size will help converge the model to its optimal point|" - ] - }, - { - "cell_type": "code", - "execution_count": 15, - "metadata": {}, - "outputs": [], - "source": [ - "param_dict = {\"lr\": [0.0001, 0.001, 0.01], \"lambda\": [0.001, 0.01, 0.1]}\n", - "\n", - "param_grid = generate_param_grid(param_dict)\n" - ] - }, - { - "cell_type": "code", - "execution_count": 16, - "metadata": {}, - "outputs": [], - "source": [ - "auc_scores = []\n", - "\n", - "with Timer() as time_tune:\n", - " for param in param_grid:\n", - " ffm_model = xl.create_ffm()\n", - " ffm_model.setTrain(train_file)\n", - " ffm_model.setValidate(valid_file)\n", - " ffm_model.fit(param, model_file)\n", - "\n", - " ffm_model.setTest(test_file)\n", - " ffm_model.setSigmoid()\n", - " ffm_model.predict(model_file, output_file)\n", - "\n", - " with open(output_file) as f:\n", - " predictions = f.readlines()\n", - "\n", - " with open(test_file) as f:\n", - " truths = f.readlines()\n", - "\n", - " truths = np.array([float(truth.split(\" \")[0]) for truth in truths])\n", - " predictions = np.array(\n", - " [float(prediction.strip(\"\")) for prediction in predictions]\n", - " )\n", - "\n", - " auc_scores.append(roc_auc_score(truths, predictions))\n" - ] - }, - { - "cell_type": "code", - "execution_count": 17, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Tuning by grid search takes 4.2 min\n" - ] - } - ], - "source": [ - "print(\"Tuning by grid search takes {0:.2} min\".format(time_tune.interval / 60))\n" - ] - }, - { - "cell_type": "code", - "execution_count": 18, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
Lambda0.0010.0100.100
LR
0.00010.54810.61220.7210
0.00100.54540.61030.7245
0.01000.54050.61500.7238
\n", - "
" - ], - "text/plain": [ - "Lambda 0.001 0.010 0.100\n", - "LR \n", - "0.0001 0.5481 0.6122 0.7210\n", - "0.0010 0.5454 0.6103 0.7245\n", - "0.0100 0.5405 0.6150 0.7238" - ] - }, - "execution_count": 18, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "auc_scores = [float(\"%.4f\" % x) for x in auc_scores]\n", - "auc_scores_array = np.reshape(\n", - " auc_scores, (len(param_dict[\"lr\"]), len(param_dict[\"lambda\"]))\n", - ")\n", - "\n", - "auc_df = pd.DataFrame(\n", - " data=auc_scores_array,\n", - " index=pd.Index(param_dict[\"lr\"], name=\"LR\"),\n", - " columns=pd.Index(param_dict[\"lambda\"], name=\"Lambda\"),\n", - ")\n", - "auc_df\n" - ] - }, - { - "cell_type": "code", - "execution_count": 21, - "metadata": {}, - "outputs": [ - { - "data": { - "application/javascript": "/* Put everything inside the global mpl namespace */\n/* global mpl */\nwindow.mpl = {};\n\nmpl.get_websocket_type = function () {\n if (typeof WebSocket !== 'undefined') {\n return WebSocket;\n } else if (typeof MozWebSocket !== 'undefined') {\n return MozWebSocket;\n } else {\n alert(\n 'Your browser does not have WebSocket support. ' +\n 'Please try Chrome, Safari or Firefox ≥ 6. ' +\n 'Firefox 4 and 5 are also supported but you ' +\n 'have to enable WebSockets in about:config.'\n );\n }\n};\n\nmpl.figure = function (figure_id, websocket, ondownload, parent_element) {\n this.id = figure_id;\n\n this.ws = websocket;\n\n this.supports_binary = this.ws.binaryType !== undefined;\n\n if (!this.supports_binary) {\n var warnings = document.getElementById('mpl-warnings');\n if (warnings) {\n warnings.style.display = 'block';\n warnings.textContent =\n 'This browser does not support binary websocket messages. ' +\n 'Performance may be slow.';\n }\n }\n\n this.imageObj = new Image();\n\n this.context = undefined;\n this.message = undefined;\n this.canvas = undefined;\n this.rubberband_canvas = undefined;\n this.rubberband_context = undefined;\n this.format_dropdown = undefined;\n\n this.image_mode = 'full';\n\n this.root = document.createElement('div');\n this.root.setAttribute('style', 'display: inline-block');\n this._root_extra_style(this.root);\n\n parent_element.appendChild(this.root);\n\n this._init_header(this);\n this._init_canvas(this);\n this._init_toolbar(this);\n\n var fig = this;\n\n this.waiting = false;\n\n this.ws.onopen = function () {\n fig.send_message('supports_binary', { value: fig.supports_binary });\n fig.send_message('send_image_mode', {});\n if (fig.ratio !== 1) {\n fig.send_message('set_dpi_ratio', { dpi_ratio: fig.ratio });\n }\n fig.send_message('refresh', {});\n };\n\n this.imageObj.onload = function () {\n if (fig.image_mode === 'full') {\n // Full images could contain transparency (where diff images\n // almost always do), so we need to clear the canvas so that\n // there is no ghosting.\n fig.context.clearRect(0, 0, fig.canvas.width, fig.canvas.height);\n }\n fig.context.drawImage(fig.imageObj, 0, 0);\n };\n\n this.imageObj.onunload = function () {\n fig.ws.close();\n };\n\n this.ws.onmessage = this._make_on_message_function(this);\n\n this.ondownload = ondownload;\n};\n\nmpl.figure.prototype._init_header = function () {\n var titlebar = document.createElement('div');\n titlebar.classList =\n 'ui-dialog-titlebar ui-widget-header ui-corner-all ui-helper-clearfix';\n var titletext = document.createElement('div');\n titletext.classList = 'ui-dialog-title';\n titletext.setAttribute(\n 'style',\n 'width: 100%; text-align: center; padding: 3px;'\n );\n 
attribute of output\n data = data.data;\n }\n if (data['text/html'] === html_output) {\n return [cell, data, j];\n }\n }\n }\n }\n};\n\n// Register the function which deals with the matplotlib target/channel.\n// The kernel may be null if the page has been refreshed.\nif (IPython.notebook.kernel !== null) {\n IPython.notebook.kernel.comm_manager.register_target(\n 'matplotlib',\n mpl.mpl_figure_comm\n );\n}\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/html": [ - "" - ], - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "text/plain": [ - "" - ] - }, - "execution_count": 21, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "fig, ax = plt.subplots()\n", - "sns.heatmap(auc_df, cbar=False, annot=True, fmt=\".4g\")\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "More advanced tuning methods like Bayesian Optimization can be used for searching for the optimal model efficiently. The benefit of using, for example, `HyperDrive` from Azure Machine Learning Services, for tuning the parameters, is that, the tuning tasks can be distributed across nodes of a cluster and the optimization can be run concurrently to save the total cost.\n", - "\n", - "* Details about how to tune hyper parameters by using Azure Machine Learning Services can be found [here](https://github.com/microsoft/recommenders/tree/master/notebooks/04_model_select_and_optimize).\n", - "* Note, to enable the tuning task on Azure Machine Learning Services by using HyperDrive, one needs a Docker image to containerize the environment where `xlearn` can be run. The Docker file provided [here](https://github.com/microsoft/recommenders/tree/master/docker) can be used for such purpose." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.4 Clean up" - ] - }, - { - "cell_type": "code", - "execution_count": 20, - "metadata": {}, - "outputs": [], - "source": [ - "tmpdir.cleanup()\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## References" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "\n", - "1. Rendle, Steffen. \"Factorization machines.\" 2010 IEEE International Conference on Data Mining. IEEE, 2010.\n", - "2. Juan, Yuchin, et al. \"Field-aware factorization machines for CTR prediction.\" Proceedings of the 10th ACM Conference on Recommender Systems. ACM, 2016.\n", - "3. Guo, Huifeng, et al. \"DeepFM: a factorization-machine based neural network for CTR prediction.\" arXiv preprint arXiv:1703.04247, 2017.\n", - "4. Lian, Jianxun, et al. \"xdeepfm: Combining explicit and implicit feature interactions for recommender systems.\" Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. ACM, 2018.\n", - "5. Qu, Yanru, et al. \"Product-based neural networks for user response prediction.\" 2016 IEEE 16th International Conference on Data Mining (ICDM). IEEE, 2016.\n", - "6. Zhang, Weinan, Tianming Du, and Jun Wang. \"Deep learning over multi-field categorical data.\" European conference on information retrieval. Springer, Cham, 2016.\n", - "7. He, Xiangnan, and Tat-Seng Chua. \"Neural factorization machines for sparse predictive analytics.\" Proceedings of the 40th International ACM SIGIR conference on Research and Development in Information Retrieval. ACM, 2017.\n", - "8. Cheng, Heng-Tze, et al. 
\"Wide & deep learning for recommender systems.\" Proceedings of the 1st workshop on deep learning for recommender systems. ACM, 2016.\n", - "9. Langford, John, Lihong Li, and Alex Strehl. \"Vowpal wabbit online learning project.\", 2007." - ] - } - ], - "metadata": { - "celltoolbar": "Tags", - "kernelspec": { - "display_name": "recommenders", - "language": "python", - "name": "conda-env-recommenders-py" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.6.13" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/examples/02_model_hybrid/lightfm_deep_dive.ipynb b/examples/02_model_hybrid/lightfm_deep_dive.ipynb deleted file mode 100755 index 5ce4b79151..0000000000 --- a/examples/02_model_hybrid/lightfm_deep_dive.ipynb +++ /dev/null @@ -1,1956 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Copyright (c) Recommenders contributors.\n", - "\n", - "Licensed under the MIT License." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# LightFM - hybrid matrix factorisation on MovieLens (Python, CPU)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "This notebook explains the concept of a hybrid matrix factorisation based model for recommendation, it also outlines the steps to construct a pure matrix factorisation and a hybrid models using the [LightFM](https://github.com/lyst/lightfm) package. It also demonstrates how to extract both user and item affinity from a fitted hybrid model.\n", - "\n", - "## 1. Hybrid matrix factorisation model\n", - "\n", - "### 1.1 Background\n", - "\n", - "In general, most recommendation models can be divided into two categories:\n", - "- Content based model,\n", - "- Collaborative filtering model.\n", - "\n", - "The content-based model recommends based on similarity of the items and/or users using their description/metadata/profile. On the other hand, collaborative filtering model (discussion is limited to matrix factorisation approach in this notebook) computes the latent factors of the users and items. It works based on the assumption that if a group of people expressed similar opinions on an item, these peole would tend to have similar opinions on other items. For further background and detailed explanation between these two approaches, the reader can refer to machine learning literatures [3, 4].\n", - "\n", - "The choice between the two models is largely based on the data availability. For example, the collaborative filtering model is usually adopted and effective when sufficient ratings/feedbacks have been recorded for a group of users and items.\n", - "\n", - "However, if there is a lack of ratings, content based model can be used provided that the metadata of the users and items are available. This is also the common approach to address the cold-start issues, where there are insufficient historical collaborative interactions available to model new users and/or items.\n", - "\n", - "\n", - "\n", - "### 1.2 Hybrid matrix factorisation algorithm\n", - "\n", - "In view of the above problems, there have been a number of proposals to address the cold-start issues by combining both content-based and collaborative filtering approaches. The hybrid matrix factorisation model is among one of the solutions proposed [1]. 
\n", - "\n", - "In general, most hybrid approaches proposed different ways of assessing and/or combining the feature data in conjunction with the collaborative information.\n", - "\n", - "### 1.3 LightFM package \n", - "\n", - "LightFM is a Python implementation of a hybrid recommendation algorithms for both implicit and explicit feedbacks [1].\n", - "\n", - "It is a hybrid content-collaborative model which represents users and items as linear combinations of their content features’ latent factors. The model learns **embeddings or latent representations of the users and items in such a way that it encodes user preferences over items**. These representations produce scores for every item for a given user; items scored highly are more likely to be interesting to the user.\n", - "\n", - "The user and item embeddings are estimated for every feature, and these features are then added together to be the final representations for users and items. \n", - "\n", - "For example, for user i, the model retrieves the i-th row of the feature matrix to find the features with non-zero weights. The embeddings for these features will then be added together to become the user representation e.g. if user 10 has weight 1 in the 5th column of the user feature matrix, and weight 3 in the 20th column, the user 10’s representation is the sum of embedding for the 5th and the 20th features multiplying their corresponding weights. The representation for each items is computed in the same approach. \n", - "\n", - "#### 1.3.1 Modelling approach\n", - "\n", - "Let $U$ be the set of users and $I$ be the set of items, and each user can be described by a set of user features $f_{u} \\subset F^{U}$ whilst each items can be described by item features $f_{i} \\subset F^{I}$. Both $F^{U}$ and $F^{I}$ are all the features which fully describe all users and items. \n", - "\n", - "The LightFM model operates based binary feedbacks, the ratings will be normalised into two groups. The user-item interaction pairs $(u,i) \\in U\\times I$ are the union of positive (favourable reviews) $S^+$ and negative interactions (negative reviews) $S^-$ for explicit ratings. For implicit feedbacks, these can be the observed and not observed interactions respectively.\n", - "\n", - "For each user and item feature, their embeddings are $e_{f}^{U}$ and $e_{f}^{I}$ respectively. Furthermore, each feature is also has a scalar bias term ($b_U^f$ for user and $b_I^f$ for item features). The embedding (latent representation) of user $u$ and item $i$ are the sum of its respective features’ latent vectors:\n", - "\n", - "$$ \n", - "q_{u} = \\sum_{j \\in f_{u}} e_{j}^{U}\n", - "$$\n", - "\n", - "$$\n", - "p_{i} = \\sum_{j \\in f_{i}} e_{j}^{I}\n", - "$$\n", - "\n", - "Similarly the biases for user $u$ and item $i$ are the sum of its respective bias vectors. These variables capture the variation in behaviour across users and items:\n", - "\n", - "$$\n", - "b_{u} = \\sum_{j \\in f_{u}} b_{j}^{U}\n", - "$$\n", - "\n", - "$$\n", - "b_{i} = \\sum_{j \\in f_{i}} b_{j}^{I}\n", - "$$\n", - "\n", - "In LightFM, the representation for each user/item is a linear weighted sum of its feature vectors.\n", - "\n", - "The prediction for user $u$ and item $i$ can be modelled as sigmoid of the dot product of user and item vectors, adjusted by its feature biases as follows:\n", - "\n", - "$$\n", - "\\hat{r}_{ui} = \\sigma (q_{u} \\cdot p_{i} + b_{u} + b_{i})\n", - "$$\n", - "\n", - "As the LightFM is constructed to predict binary outcomes e.g. 
$S^+$ and $S^-$, the function $\\sigma()$ is based on the [sigmoid function](https://mathworld.wolfram.com/SigmoidFunction.html). \n", - "\n", - "The LightFM algorithm estimates interaction latent vectors and bias for features. For model fitting, the cost function of the model consists of maximising the likelihood of data conditional on the parameters described above using stochastic gradient descent. The likelihood can be expressed as follows:\n", - "\n", - "$$\n", - "L = \\prod_{(u,i) \\in S+}\\hat{r}_{ui} \\times \\prod_{(u,i) \\in S-}1 - \\hat{r}_{ui}\n", - "$$\n", - "\n", - "Note that if the feature latent vectors are not available, the algorithm will behaves like a [logistic matrix factorisation model](http://stanford.edu/~rezab/nips2014workshop/submits/logmat.pdf)." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## 2. Movie recommender with LightFM using only explicit feedbacks" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.1 Import libraries" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "System version: 3.8.13 (default, Mar 28 2022, 11:38:47) \n", - "[GCC 7.5.0]\n", - "LightFM version: 1.16\n" - ] - } - ], - "source": [ - "import os\n", - "import sys\n", - "import itertools\n", - "import pandas as pd\n", - "import numpy as np\n", - "import matplotlib.pyplot as plt\n", - "import seaborn as sns\n", - "\n", - "import lightfm\n", - "from lightfm import LightFM\n", - "from lightfm.data import Dataset\n", - "from lightfm import cross_validation\n", - "from lightfm.evaluation import precision_at_k as lightfm_prec_at_k\n", - "from lightfm.evaluation import recall_at_k as lightfm_recall_at_k\n", - "\n", - "from recommenders.evaluation.python_evaluation import precision_at_k, recall_at_k\n", - "from recommenders.utils.timer import Timer\n", - "from recommenders.datasets import movielens\n", - "from recommenders.models.lightfm.lightfm_utils import (\n", - " track_model_metrics,\n", - " prepare_test_df,\n", - " prepare_all_predictions,\n", - " compare_metric,\n", - " similar_users,\n", - " similar_items,\n", - ")\n", - "from recommenders.utils.notebook_utils import store_metadata\n", - "\n", - "print(\"System version: {}\".format(sys.version))\n", - "print(\"LightFM version: {}\".format(lightfm.__version__))\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.2 Defining variables" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": { - "tags": [ - "parameters" - ] - }, - "outputs": [], - "source": [ - "# Select MovieLens data size\n", - "MOVIELENS_DATA_SIZE = '100k'\n", - "\n", - "# default number of recommendations\n", - "K = 10\n", - "# percentage of data used for testing\n", - "TEST_PERCENTAGE = 0.25\n", - "# model learning rate\n", - "LEARNING_RATE = 0.25\n", - "# no of latent factors\n", - "NO_COMPONENTS = 20\n", - "# no of epochs to fit model\n", - "NO_EPOCHS = 20\n", - "# no of threads to fit model\n", - "NO_THREADS = 32\n", - "# regularisation for both user and item features\n", - "ITEM_ALPHA = 1e-6\n", - "USER_ALPHA = 1e-6\n", - "\n", - "# seed for pseudonumber generations\n", - "SEED = 42" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.2 Retrieve data" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": 
[ - "100%|██████████| 4.81k/4.81k [00:00<00:00, 6.13kKB/s]\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
userIDitemIDratinggenre
153894772944.0Comedy
485285762592.0Children's|Comedy
200067161353.0Drama|Mystery|Sci-Fi|Thriller
2284222293.0Action|Adventure|Comedy|Crime
959756558964.0Drama
\n", - "
" - ], - "text/plain": [ - " userID itemID rating genre\n", - "15389 477 294 4.0 Comedy\n", - "48528 576 259 2.0 Children's|Comedy\n", - "20006 716 135 3.0 Drama|Mystery|Sci-Fi|Thriller\n", - "2284 222 29 3.0 Action|Adventure|Comedy|Crime\n", - "95975 655 896 4.0 Drama" - ] - }, - "execution_count": 3, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "data = movielens.load_pandas_df(\n", - " size=MOVIELENS_DATA_SIZE,\n", - " genres_col='genre',\n", - " header=[\"userID\", \"itemID\", \"rating\"]\n", - ")\n", - "# quick look at the data\n", - "data.sample(5, random_state=SEED)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.3 Prepare data" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Before fitting the LightFM model, we need to create an instance of `Dataset` which holds the interaction matrix." - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [], - "source": [ - "dataset = Dataset()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The `fit` method creates the user/item id mappings." - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Num users: 943, num_topics: 1682.\n" - ] - } - ], - "source": [ - "dataset.fit(users=data['userID'], \n", - " items=data['itemID'])\n", - "\n", - "# quick check to determine the number of unique users and items in the data\n", - "num_users, num_topics = dataset.interactions_shape()\n", - "print(f'Num users: {num_users}, num_topics: {num_topics}.')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Next is to build the interaction matrix. The `build_interactions` method returns 2 COO sparse matrices, namely the `interactions` and `weights` matrices." - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [], - "source": [ - "(interactions, weights) = dataset.build_interactions(data.iloc[:, 0:3].values)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "LightLM works slightly differently compared to other packages as it expects the train and test sets to have same dimension. Therefore the conventional train test split will not work.\n", - "\n", - "The package has included the `cross_validation.random_train_test_split` method to split the interaction data and splits it into two disjoint training and test sets. \n", - "\n", - "However, note that **it does not validate the interactions in the test set to guarantee all items and users have historical interactions in the training set**. Therefore this may result into a partial cold-start problem in the test set." - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [], - "source": [ - "train_interactions, test_interactions = cross_validation.random_train_test_split(\n", - " interactions, test_percentage=TEST_PERCENTAGE,\n", - " random_state=np.random.RandomState(SEED))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Double check the size of both the train and test sets." 
- ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Shape of train interactions: (943, 1682)\n", - "Shape of test interactions: (943, 1682)\n" - ] - } - ], - "source": [ - "print(f\"Shape of train interactions: {train_interactions.shape}\")\n", - "print(f\"Shape of test interactions: {test_interactions.shape}\")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.4 Fit the LightFM model" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In this notebook, the LightFM model will be using the weighted Approximate-Rank Pairwise (WARP) as the loss. Further explanation on the topic can be found [here](https://making.lyst.com/lightfm/docs/examples/warp_loss.html#learning-to-rank-using-the-warp-loss).\n", - "\n", - "\n", - "In general, it maximises the rank of positive examples by repeatedly sampling negative examples until a rank violation has been located. This approach is recommended when only positive interactions are present." - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": { - "scrolled": true - }, - "outputs": [], - "source": [ - "model1 = LightFM(loss='warp', no_components=NO_COMPONENTS, \n", - " learning_rate=LEARNING_RATE, \n", - " random_state=np.random.RandomState(SEED))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The LightFM model can be fitted with the following code:" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": {}, - "outputs": [], - "source": [ - "model1.fit(interactions=train_interactions,\n", - " epochs=NO_EPOCHS);" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.5 Prepare model evaluation data" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Before we can evaluate the fitted model and to get the data into a format which is compatible with the existing evaluation methods within this repo, the data needs to be massaged slightly.\n", - "\n", - "First the train/test indices need to be extracted from the `lightfm.cross_validation` method as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": {}, - "outputs": [], - "source": [ - "uids, iids, interaction_data = cross_validation._shuffle(\n", - " interactions.row, interactions.col, interactions.data, \n", - " random_state=np.random.RandomState(SEED))\n", - "\n", - "cutoff = int((1.0 - TEST_PERCENTAGE) * len(uids))\n", - "test_idx = slice(cutoff, None)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Then the the mapping between internal and external representation of the user and item are extracted as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": {}, - "outputs": [], - "source": [ - "uid_map, ufeature_map, iid_map, ifeature_map = dataset.mapping()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Once the train/test indices and mapping are ready, the test dataframe can be constructed as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Took 1.3 seconds for prepare and predict test data.\n" - ] - } - ], - "source": [ - "with Timer() as test_time:\n", - " test_df = prepare_test_df(test_idx, uids, iids, uid_map, iid_map, weights)\n", - "print(f\"Took {test_time.interval:.1f} seconds for prepare 
and predict test data.\") \n", - "time_reco1 = test_time.interval" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "And samples of the test dataframe:" - ] - }, - { - "cell_type": "code", - "execution_count": 14, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
userIDitemIDrating
145426163283.0
22507382403.0
160248082945.0
1545834625.0
15002110383.0
\n", - "
" - ], - "text/plain": [ - " userID itemID rating\n", - "14542 616 328 3.0\n", - "2250 738 240 3.0\n", - "16024 808 294 5.0\n", - "15458 346 2 5.0\n", - "15002 110 38 3.0" - ] - }, - "execution_count": 14, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "test_df.sample(5, random_state=SEED)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In addition, the predictions of all unseen user-item pairs (e.g. removing those seen in the training data) can be prepared as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 15, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Took 316.4 seconds for prepare and predict all data.\n" - ] - } - ], - "source": [ - "with Timer() as test_time:\n", - " all_predictions = prepare_all_predictions(data, uid_map, iid_map, \n", - " interactions=train_interactions,\n", - " model=model1, \n", - " num_threads=NO_THREADS)\n", - "print(f\"Took {test_time.interval:.1f} seconds for prepare and predict all data.\")\n", - "time_reco2 = test_time.interval" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Samples of the `all_predictions` dataframe:" - ] - }, - { - "cell_type": "code", - "execution_count": 16, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
userIDitemIDprediction
11000841031291-84.180405
9285266381274-28.313053
93421401576-49.781242
568732021177-55.072628
1270199751462-53.741127
\n", - "
" - ], - "text/plain": [ - " userID itemID prediction\n", - "1100084 103 1291 -84.180405\n", - "928526 638 1274 -28.313053\n", - "93421 40 1576 -49.781242\n", - "56873 202 1177 -55.072628\n", - "1270199 75 1462 -53.741127" - ] - }, - "execution_count": 16, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "all_predictions.sample(5, random_state=SEED)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Note that the **raw prediction values from the LightFM model are for ranking purposes only**, they should not be used directly. The magnitude and sign of these values do not have any specific interpretation." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 2.6 Model evaluation" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Once the evaluation data are ready, they can be passed into to the repo's evaluation methods as follows. The performance of the model will be tracked using both Precision@K and Recall@K.\n", - "\n", - "In addition, the results have also being compared with those computed from LightFM's own evaluation methods to ensure accuracy." - ] - }, - { - "cell_type": "code", - "execution_count": 17, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "------ Using Repo's evaluation methods ------\n", - "Precision@K:\t0.131601\n", - "Recall@K:\t0.038056\n", - "\n", - "------ Using LightFM evaluation methods ------\n", - "Precision@K:\t0.131601\n", - "Recall@K:\t0.038056\n" - ] - } - ], - "source": [ - "with Timer() as test_time:\n", - " eval_precision = precision_at_k(rating_true=test_df, \n", - " rating_pred=all_predictions, k=K)\n", - " eval_recall = recall_at_k(test_df, all_predictions, k=K)\n", - "time_reco3 = test_time.interval\n", - "\n", - "with Timer() as test_time:\n", - " eval_precision_lfm = lightfm_prec_at_k(model1, test_interactions, \n", - " train_interactions, k=K).mean()\n", - " eval_recall_lfm = lightfm_recall_at_k(model1, test_interactions, \n", - " train_interactions, k=K).mean()\n", - "time_lfm = test_time.interval\n", - " \n", - "print(\n", - " \"------ Using Repo's evaluation methods ------\",\n", - " f\"Precision@K:\\t{eval_precision:.6f}\",\n", - " f\"Recall@K:\\t{eval_recall:.6f}\",\n", - " \"\\n------ Using LightFM evaluation methods ------\",\n", - " f\"Precision@K:\\t{eval_precision_lfm:.6f}\",\n", - " f\"Recall@K:\\t{eval_recall_lfm:.6f}\", \n", - " sep='\\n')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## 3. Movie recommender with LightFM using explicit feedbacks and additional item and user features" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "As the LightFM was designed to incorporates both user and item metadata, the model can be extended to include additional features such as movie genres and user occupations." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 3.1 Extract and prepare movie genres" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In this notebook, the movie's genres will be used as the item metadata. 
As the genres have already been loaded during the initial data import, they can be processed directly as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 18, - "metadata": {}, - "outputs": [], - "source": [ - "# split the genre based on the separator\n", - "movie_genre = [x.split('|') for x in data['genre']]" - ] - }, - { - "cell_type": "code", - "execution_count": 19, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "['Action',\n", - " 'Adventure',\n", - " 'Animation',\n", - " \"Children's\",\n", - " 'Comedy',\n", - " 'Crime',\n", - " 'Documentary',\n", - " 'Drama',\n", - " 'Fantasy',\n", - " 'Film-Noir',\n", - " 'Horror',\n", - " 'Musical',\n", - " 'Mystery',\n", - " 'Romance',\n", - " 'Sci-Fi',\n", - " 'Thriller',\n", - " 'War',\n", - " 'Western',\n", - " 'unknown']" - ] - }, - "execution_count": 19, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "# retrieve all the unique genres in the data\n", - "all_movie_genre = sorted(list(set(itertools.chain.from_iterable(movie_genre))))\n", - "# quick look at all the genres within the data\n", - "all_movie_genre" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 3.2 Retrieve and prepare user features" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Further user features can be included as part of the model fitting process. In this notebook, **only the occupation of each user will be included**, but the feature list can be extended easily.\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### 3.2.1 Retrieve and merge data" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The user features can be retrieved directly from the grouplens website and merged with the existing data as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 21, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
userIDitemIDratinggenreoccupation
826946981743.0Action|Adventureprogrammer
987698247481.0Action|Romance|Thrillerother
633513877894.0Comedy|Dramaentertainment
680015415964.0Animation|Children's|Musicalstudent
750001611352.0Drama|Mystery|Sci-Fi|Thrillerlawyer
\n", - "
" - ], - "text/plain": [ - " userID itemID rating genre occupation\n", - "82694 698 174 3.0 Action|Adventure programmer\n", - "98769 824 748 1.0 Action|Romance|Thriller other\n", - "63351 387 789 4.0 Comedy|Drama entertainment\n", - "68001 541 596 4.0 Animation|Children's|Musical student\n", - "75000 161 135 2.0 Drama|Mystery|Sci-Fi|Thriller lawyer" - ] - }, - "execution_count": 21, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "user_feature_URL = 'http://files.grouplens.org/datasets/movielens/ml-100k/u.user'\n", - "columns = ['userID','age','gender','occupation','zipcode']\n", - "user_data = pd.read_table(user_feature_URL, sep='|', header=None, names=columns)\n", - "\n", - "# merging user feature with existing data\n", - "new_data = data.merge(user_data[['userID','occupation']], left_on='userID', right_on='userID')\n", - "# quick look at the merged data\n", - "new_data.sample(5, random_state=SEED)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "#### 3.2.2 Extract and prepare user occupations" - ] - }, - { - "cell_type": "code", - "execution_count": 22, - "metadata": {}, - "outputs": [], - "source": [ - "# retrieve all the unique occupations in the data\n", - "all_occupations = sorted(list(set(new_data['occupation'])))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 3.3 Prepare data and features" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Similar to the previous model, the data is required to be converted into a `Dataset` instance and then create a user/item id mapping with the `fit` method." - ] - }, - { - "cell_type": "code", - "execution_count": 23, - "metadata": {}, - "outputs": [], - "source": [ - "dataset2 = Dataset()\n", - "dataset2.fit(data['userID'], \n", - " data['itemID'], \n", - " item_features=all_movie_genre,\n", - " user_features=all_occupations)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The movie genres are then converted into a item feature matrix using the `build_item_features` method as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 24, - "metadata": {}, - "outputs": [], - "source": [ - "item_features = dataset2.build_item_features((x, y) for x,y in zip(data.itemID, movie_genre))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The user occupations are then converted into an user feature matrix using the `build_user_features` method as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 25, - "metadata": {}, - "outputs": [], - "source": [ - "user_features = dataset2.build_user_features((x, [y]) for x,y in zip(new_data.userID, new_data['occupation']))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Once the item and user features matrices have been completed, the next steps are similar as before, which is to build the interaction matrix and split the interactions into train and test sets as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 26, - "metadata": {}, - "outputs": [], - "source": [ - "interactions2, weights2 = dataset2.build_interactions(data.iloc[:, 0:3].values)\n", - "\n", - "train_interactions2, test_interactions2 = cross_validation.random_train_test_split(\n", - " interactions2, \n", - " test_percentage=TEST_PERCENTAGE,\n", - " random_state=np.random.RandomState(SEED)\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 3.3 Fit the LightFM model with additional 
user and item features" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The parameters of the second model will be similar to the first model to facilitates comparison.\n", - "\n", - "The model performance at each epoch is also tracked by the same metrics as before." - ] - }, - { - "cell_type": "code", - "execution_count": 27, - "metadata": {}, - "outputs": [], - "source": [ - "model2 = LightFM(loss='warp', no_components=NO_COMPONENTS, \n", - " learning_rate=LEARNING_RATE, \n", - " item_alpha=ITEM_ALPHA,\n", - " user_alpha=USER_ALPHA,\n", - " random_state=np.random.RandomState(SEED)\n", - " )" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The LightFM model can then be fitted:" - ] - }, - { - "cell_type": "code", - "execution_count": 28, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "" - ] - }, - "execution_count": 28, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "model2.fit(interactions=train_interactions2,\n", - " user_features=user_features,\n", - " item_features=item_features,\n", - " epochs=NO_EPOCHS\n", - " )" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 3.4 Prepare model evaluation data" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Similar to the previous model, the evaluation data needs to be prepared in order to get them into a format consumable with this repo's evaluation methods.\n", - "\n", - "Firstly the train/test indices and id mappings are extracted using the new interations matrix as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 29, - "metadata": {}, - "outputs": [], - "source": [ - "uids, iids, interaction_data = cross_validation._shuffle(\n", - " interactions2.row, \n", - " interactions2.col, \n", - " interactions2.data, \n", - " random_state=np.random.RandomState(SEED)\n", - ")\n", - "\n", - "uid_map, ufeature_map, iid_map, ifeature_map = dataset2.mapping()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The test dataframe is then constructed as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 30, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Took 1.3 seconds for prepare and predict test data.\n" - ] - } - ], - "source": [ - "with Timer() as test_time:\n", - " test_df2 = prepare_test_df(test_idx, uids, iids, uid_map, iid_map, weights2)\n", - "print(f\"Took {test_time.interval:.1f} seconds for prepare and predict test data.\") " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The predictions of all unseen user-item pairs can be prepared as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 31, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Took 161.9 seconds for prepare and predict all data.\n" - ] - } - ], - "source": [ - "with Timer() as test_time:\n", - " all_predictions2 = prepare_all_predictions(data, uid_map, iid_map, \n", - " interactions=train_interactions2,\n", - " user_features=user_features,\n", - " item_features=item_features,\n", - " model=model2,\n", - " num_threads=NO_THREADS)\n", - "\n", - "print(f\"Took {test_time.interval:.1f} seconds for prepare and predict all data.\")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 3.5 Model evaluation and comparison" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The 
predictive performance of the new model can be computed and compared with the previous model (which used only the explicit ratings) as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 32, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "------ Using only explicit ratings ------\n", - "Precision@K:\t0.131601\n", - "Recall@K:\t0.038056\n", - "\n", - "------ Using both implicit and explicit ratings ------\n", - "Precision@K:\t0.145599\n", - "Recall@K:\t0.051338\n" - ] - } - ], - "source": [ - "eval_precision2 = precision_at_k(rating_true=test_df2, \n", - " rating_pred=all_predictions2, k=K)\n", - "eval_recall2 = recall_at_k(test_df2, all_predictions2, k=K)\n", - "\n", - "print(\n", - " \"------ Using only explicit ratings ------\",\n", - " f\"Precision@K:\\t{eval_precision:.6f}\",\n", - " f\"Recall@K:\\t{eval_recall:.6f}\",\n", - " \"\\n------ Using both implicit and explicit ratings ------\",\n", - " f\"Precision@K:\\t{eval_precision2:.6f}\",\n", - " f\"Recall@K:\\t{eval_recall2:.6f}\",\n", - " sep='\\n')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The new model, which used the explicit ratings together with the additional user and item features, performed consistently better than the previous model, which used only the explicit ratings, highlighting the benefit of including such additional features in the model." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 3.6 Evaluation metrics comparison" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Note that the evaluation approaches here are for demonstration purposes only.\n", - "\n", - "If the reader is using the LightFM package and/or its models, LightFM's built-in evaluation methods are much more efficient and are the recommended approach for production usage, as they are designed and optimised to work with the package. If the reader wants to compare LightFM with other algorithms in the Recommenders repository, it is better to use the evaluation tools in Recommenders.\n", - "\n", - "As a comparison, the times recorded to evaluate model1 are shown as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 33, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "------ Using Repo's evaluation methods ------\n", - "Time [sec]:\t320.6\n", - "\n", - "------ Using LightFM evaluation methods ------\n", - "Time [sec]:\t0.2\n" - ] - } - ], - "source": [ - "print(\n", - " \"------ Using Repo's evaluation methods ------\",\n", - " f\"Time [sec]:\\t{(time_reco1+time_reco2+time_reco3):.1f}\",\n", - " \"\\n------ Using LightFM evaluation methods ------\",\n", - " f\"Time [sec]:\\t{time_lfm:.1f}\",\n", - " sep='\\n')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## 4. Evaluate model fitting process" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In addition to including the extra user and item features, the model fitting process can also be monitored in order to determine whether the model is being trained properly. \n", - "\n", - "This notebook also includes a `track_model_metrics` method which plots the model's metrics, e.g. 
Precision@K and Recall@K as model fitting progresses.\n", - "\n", - "For the first model (using only explicit data), the model fitting progress is shown as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 34, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAeIAAADQCAYAAADbLGKxAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAAqg0lEQVR4nO3deZRcZZ3/8fcnBAhmYQ2rhoDEYdUgzTIiCjPAhIwQ54ACjoq/QZFBZRyOHuMRIQI6IDMyII4SgRFQBhEHjRpBBNEMsjUQSJAtCRHDlkAQEiSQpL+/P+6tTqXSfe/trrpdS39e59RJVd3tqU499dxn+z6KCMzMzKw5RjQ7AWZmZsOZC2IzM7MmckFsZmbWRC6IzczMmsgFsZmZWRO5IDYzM2siF8QtSNJkSVMztndJuqSkax8q6WVJcyU9IunsBp13tqQtMrZfLmnPRlzLrFYL5alHJf17g88/UdL8qmv9vJHnt/KNbHYCrE+TgS5gdu0GSSMjohvoLvH6cyLifZJGA3Ml/Swi7q9Jw5qBnDAi+v0RTLd/fJBpNStiMq2RpzYDHpB0Y0TcUeL1rI24RlyC9A71UUnfk/S4pB9IOlzSHZKekHRAut9oSVdKukfSA5KmSdoEOAc4Pr2DPl7SDEnXSLoDuKb6rlfSGEn/LWmepIckHduozxERrwL3Abv1kYbxkn4s6d70cXBWeiQtlrRN+pl/IelBSfMlHZ9uv11SV/r8xPT4+ZIuqPq7rpT01fTYuyRt16jPaq2tg/LUa8BcYKf0WkdKulPS/ZJ+JGlM+v7+kn6fftfvkTQ2/RvMSfe9X9K7GpUua7KI8KPBD2AisAbYh+Rm5z7gSkDANOAn6X5fAz6cPt8CeBwYDXwMuLTqfDPSc2yWvj4U+Hn6/ALgP6v23bKP9FxEkvlrH9P72Lf63FsDi4G9+kjDtcC70+cTgEey0pOeZxvgWOC7Vds3T/+9naTGsiPwFDCepMXmNuD96T4BHJ0+/zpwZrP/r/0YmkcH5akt0+tun+aH3wGj021fAM4CNgEWAfun749L88KbgFHpe5OA7qq/zfzaa/nRPg83TZfnyYiYByDpYeDWiAhJ80gyDsCRwDGSPpe+HkVSqPVlViR307UOB06ovIiIl2p3iIh/HWDaD5H0ANADnB8RD0v6QE0aDgf2lFQ5Zlx6N5+XnnnAf6Q13Z9HxJya7fsDt0fEMgBJPwDeA/wEeAOo9H/dBxwxwM9l7a3d89SDJAXof0bEc5LeB+wJ3JHmo02AO4G/Ap6NiHvTa70CSW0fuFTSZGAt8LYBpsFalAvi8rxe9byn6nUP6/7uAo6NiMeqD5R0YB/ne3WwCZF0EXBYH5uui4jz+3h/TkS8LycNI4CDImJVzbUy0xIRj0t6JzAVOE/SrRFxTuZB66yOiEpw9LX4+zvctH2ekrQLcJek69O03hIRJ9ace59+LvuvwPPAO0jy36p+9rM24z7i5roZ+IzS0kvSvun7K4CxBc9xC/CpygtJW9buEBH/GhGT+3j09YNR1K+Az1Rdd3KR9EjaEfhLRHwfuBB4Z8157wHem/YnbwScCPy2jnTa8NLSeSoingTOJ2mGvgs4WNJu6XVGS3ob8Biwg6T90/fHShoJbE5SU+4BPgJsVPDzWItzQdxc5wIbAw+lTW3npu//hqTZd25lMFOG84At04FND9L3XXoZTge60sEsfwBOLZiefYB7JM0Fzk737xURzwLTSf4GDwL3RcRPy/sY1mHaIU99h6S7pdJ3/T+SHiJplt49It4Ajge+mV7/FpIm9v8CTkrf2506avTWWrSupc/MzMyGmmvEZmZmTeSC2MzMrIlcEJuZmTWRC2IzM7Mm6piCeMqUKUESeckPP5rxaEvON340+WF0UEH8wgsvNDsJZm3H+cas+TqmIDYzM2tHLojNzMyayLF6zXL09ASLX3yV519ZxXbjRjFx69GMGJEdU9tsuHO+Kc4FsVmGnp7gpoef44zr57JqdQ+jNh7BNz44mSl7be8fFbN+ON8MjJumbdjr6QkWLVvJnQtfYNGylfT0rBvMufjFV3t/TABWre7hjOvnsvhFh/k164/zzcC4RmzDWt6d+/OvrOr9MalYtbqHpStWsev4MU1KtVlrc74ZGNeIbVjLu3PfbtwoRm28fjYZtfEIth07asjTatYunG8GxgWxDQv9NT9n3bkDTNx6NN/44OTeH5VKjXni1qOH9gOYtZisLh3nm4Fx07R1hKwRmlnNz5U79+rCuPrOfcQIMWWv7dn99ENYumIV24716E+zvC4d55uBcY3Y2kLW3XflR2HqJXM48bt3M/WSOdz08HO9+2Q1Pxe5cx8xQuw6fgwH7boNu44f4x8TG/aKDMZyvinONWJreXl33/39KOx++iHsOn5M7sAR37mbDYwHYzWWa8TWMvqr9ebdfef18+YNHPGdu1nf+suTHozVWC6IbcgMtnm53oK2VQeOSJoi6TFJCyRN72P7ppJ+mG6/W9LE9P2Jkl6TNDd9fKfqmP0kzUuPuUSS7ypsULLyZKvmqXblpmkbEvU0L+cNqKr8KNSeu/Kj0IoDRyRtBHwLOAJYAtwraVZE/KFqt5OBlyJiN0knABcAx6fbFkbE5D5O/W3gE8DdwGxgCvDLcj6FdbK8Lp9Wy1PtzAWxNUzWyOV6+nEPmLh13QVtpfm5hfqvDgAWRMQiAEnXAdOA6oJ4GjAjfX4DcGlWDVfSDsC4iLgrfX018H5cEFs/svJsXj9wC+aptuWC2Aakv4xbb4SqrFpvmxa0eXYC/lT1eglwYH/7RMQaSS8DW6fbdpH0APAKcGZEzEn3X1Jzzp1qLyzpFOAUgAkTJtT/Sawt5eXZvJYoaxz3EVthWX1G9Uaoyutz8oCq9TwLTIiIfYEzgGsljSt6cETMjIiuiOgaP358aYm01paXZ90PPHRcI7b1DLZ5Oa/G2479uCV7GnhL1es3p+/1tc8SSSOBzYEXIyKA1wEi4j5JC4G3pfu/OeecZkCxpudhliebxgWx9aqnebkREarasHm5HvcCkyTtQlJYngB8qGafWcBJwJ3AccBtERGSxgPLI2KtpF2BScCiiFgu6RVJB5EM1voo8M0h+jzWgrJurIs0PQ+zPNk0bpq2XvU0LztC1cBExBrg08DNwCPA9RHxsKRzJB2T7nYFsLWkBSRN0JUpTu8BHpI0l2QQ16kRsTzddhpwObAAWIgHag1beRHn3PTcOpS0cpV0cmkKcDGwEXB5RJxfs/1U4FP
AWmAlcEpl+oakL5JM31gLnB4RN2ddq6urK7q7uxv/IdpQ1l1w1vY7F77Aid+9e4PzXXfKgRy06za5NebKeYdpM1ZbflDnm861aNlKpl4yZ4Ma7+x0pgLQCnm2LfNNo5XWNF1wnuS1EfGddP9jgG8AUyTtSdJUtxewI/BrSW+LiLVlpbdTFCksy1oAwc1YZkOvvxvrImEonWdbQ5lN073zJCPiDaAyT7JXRLxS9XI0UKmeTwOui4jXI+JJkma2A0pMa8fIa172AghmnSOr+dlhKNtHmYO1isyTRNKnSPq/NgH+purYu2qO9XzIVD2T8L0AglnnyJrJkDdTwVpH00dNR8S3gG9J+hBwJsko0aLHzgRmQtLXVU4Km2OwgTPympeLND+7qcqsddRz4+0b6/ZQZtN0kXmS1a4jCcc3mGM7Sj2BM/Kalz1S0qx95I189spinaHMGnHuPElJkyLiifTl3wOV57NIogV9g2Sw1iTgnhLT2lLqCZxRZECV75LN2kNejHY3P3eG0griNDZuZZ7kRsCVlXmSQHdEzAI+LelwYDXwEmmzdLrf9SQB8NcAn+q0EdODbW5qxCR8Nz+btY56F17wjXX7K7WPOCJmkyzFVv3eWVXP/yXj2K8CXy0vdc1TTz+v74DNOkcjFl7wjXX7c2StJqinn7dyBzz79EO47pQDmX36Ib2Z1szaixdeMGiBUdOdqszmJt8Bm7WXwQbdcNPz8OCCuARubjKzinqi2YF/C4YDN03XoacnWLRsJXcufIFFy1b2Tilwc5OZVdQbzc46n2vEg5R1l+vmJjOrcNANy+MacYb+aryQfZdbJMarJ9qbpCmSHpO0QNL0PrZvKumH6fa7JU2s2T5B0kpJn6t6b7GkeZLmSvKySi3AQTcsjwvifuRFtMm6y3Vzk+WpWp3sKGBP4MR01bFqJwMvRcRuwEXABTXbv0Hf6w0fFhGTI6Krwcm2fmTdtPv3wPK4abofeRFtsgZZuOnZCuhdnQxAUmV1suplQqcBM9LnNwCXSlJEhKT3A08Crw5Ziq1PeYMz/XtgeVwj7kdWjRfy73Ld3GQ5+lqdrHaFsd59ImIN8DKwtaQxwBeAr/Rx3gB+Jem+dHWyDUg6RVK3pO5ly5bV+TEsb3Am+PfAsg3rGnHWXN8iqxT5LteaZAZwUUSslDb4vr07Ip6WtC1wi6RHI+J31Tt08qplzZA3GMssz7AtiPOak4qEkvT8PqtDkRXGKvsskTQS2Bx4kWRd7+MkfR3YAuiRtCoiLo2IpwEiYqmkG0mawH+HlabIXGCzLB1fEPdX683rA3aN10qWuzoZySpkJwF3AscBt0VEAIdUdpA0A1gZEZdKGg2MiIgV6fMjgXNK/yTDQFbrmeO/W706uiCuZ64vuMZr5Sm4OtkVwDWSFgDLSQrrLNsBN6bN1SOBayPiptI+xDDhwVhWNiU32O2vq6srurvXnza5aNlKpl4yZ4Mmo9mnJxWK/ra54LVBaMtf3b7yzXDVX60363fEvxV1a8t802gdPWrac33NrIisuAF5MyjM6lVq07SkKcDFJE1vl0fE+TXbzwA+DqwBlgH/FBF/TLetBealuz4VEccM9Pqe62tmRWSNGfFgLCtbaTXigpGDHgC6IuLtJAELvl617bU0OtDkwRTC4Lm+ZlaMW8+smcqsEedGDoqI31Ttfxfw4UYmwLVeMyvCrWfWTGX2EReJHFTtZNaPmzsqjf5zVxrOb1Bc6zUzqC8etH9HrEwtMX1J0oeBLuC9VW/vnEYI2hW4TdK8iFhYc9wpwCkAEyZMGLL0mll78RQka2Vl1oiLRA5C0uHAl4BjIuL1yvtVEYIWAbcD+9YeGxEzI6IrIrrGjx/f2NSbWcdwPGhrZWUWxL2RgyRtQhKMYFb1DpL2BS4jKYSXVr2/paRN0+fbAAez/qo0ZmaFeQqStbLSmqYLRg66EBgD/CiNBlSZprQHcJmkHpKbhfMjwgWxmfWrnkVczJqp1D7iiJgNzK5576yq54f3c9zvgX3KTJuZdY5GLOJi1iwtMVjLzKweXsTF2pkLYjNre17ExdpZR8eaNrPhodIHXM19wNYuXBCbNYmkKZIek7RA0vQ+tm8q6Yfp9rslTazZPkHSSkmfK3rOTuUwlNbO3DRt1gRVsdiPIIk6d6+kWTWzA04GXoqI3SSdAFwAHF+1/RtURaMreM621t/IaPcBWztzQWzWHLmx2NPXM9LnNwCXSlJERBr29Ung1ar9i5yzbRWJjuU+YGtHbpo2a44isdh794mINcDLwNaSxgBfAL4yiHMi6ZQ0jnv3smXL6voQQ6lIdCyzduSC2Kz9zAAuioiVgzm4XUPDOjqWdSo3TZs1R5FY7JV9lkgaCWwOvAgcCBwn6evAFkCPpFXAfQXO2bYcHcs6lWvEZs2RG4s9fX1S+vw44LZIHBIREyNiIvCfwNci4tKC52xp9SxVaFZL0mclvanZ6ciTWyOWtB3wNWDHiDhK0p7AX0fEFaWnzqxDFYzFfgVwjaQFwHKSgnXA5yz1gzSQlyq0EnwW+D7wlyanI5MiInsH6ZfAfwNfioh3pE1kD0RES8WC7urqiu7u7mYnw4avtiwNWinfLFq2kqmXzNmg6Xl2GqbSOlLD8o2k0cD1JF0yGwE/Illi9zHghYg4TNK3gf2BzYAbIuLs9NipJNMBXwXuAHaNiPel5/wmsDewMTAjIn7aqDRXFGma3iYirgd6oHf05tpGJ8TMhjcPxrI6TQGeiYh3RMTeJN02zwCHRcRh6T5fiogu4O3AeyW9XdIokuV4j4qI/YDqEYxfIukSOgA4DLgwLZwbqkhB/KqkrYEAkHQQyTQKM7OGcZhKq9M84AhJF0g6JCL6Kqc+KOl+4AFgL2BPYHdgUUQ8me7zP1X7HwlMlzQXuB0YBUxodMKLjJo+g2TAx1sl3UFyt3BcoxNiZsOblyq0ekTE45LeCUwFzpN0a/V2SbsAnwP2j4iXJH2PpGDNIuDYiHisjDRX5BbEEXG/pPcCf5Um6rGIWF1mosysM/UXohLwYCyri6QdgeUR8X1JfwY+DqwAxgIvAONI+oBfTgchH0VSy30M2FXSxIhYzPphZG8GPiPpM2lEu30j4oFGp73IqOmP1rz1TklExNUFjp0CXEzScX55RJxfs/0Mkj/WGmAZ8E8R8cd020nAmemu50XEVXnXM7PWlTcqGrxUodVlH5I+3B5gNfDPwF8DN0l6Jh2s9QDwKEkEujsAIuI1Sael+71KMg2w4lySvuaHJI0gCSv7vkYnvMio6W9WvRwF/C1wf0RkNk+nAegfpyoAPXBidQB6SYcBd0fEXyT9M3BoRBwvaSugG+gi6Zu+D9gvIl7q73qtNPrThqW2rLYNZb7xqGjrQ0vkG0ljImKlJJEsnPJERFw0VNcv0jT9merXkrYAritw7twA9BHxm6r97wI+nD7/O+CWiFieHnsLyYi46k50M2sjWaOiXRBbk30ibYXdhGQg12VDefHBhLh8FdilwH59BaA/MGP/k1m3pFvh4PXAKQATJjR8IJuZDUJ//cAOUWmtKq
39DlkNuFaRPuKfkU5dIpnutCfJpOmGkfRhkmbo9w7kuIiYCcyEpImtkWkys4HL6gf2qGizvhWpEf971fM1wB8jYkmB44oEtUfS4SSTpt8bEa9XHXtozbG3F7immTVRf0sV7p72A3tUtNmGivQR/3aQ5+4NQE9SsJ4AfKh6B0n7krTFT4mIpVWbbga+JmnL9PWRwBcHmQ4zGyJ5/cAeFW22oX4LYkkrWNckvd4mICJiXNaJCwa1vxAYA/woGazGUxFxTEQsl3Qu64aRn1MZuGVmrcv9wGYDlzt9qV14+pI1WVu2rzY63xSZK2xWpSW+FOlsoA9FxH8N8LjZ6XF/ruf6hUdNS9qWqnBgEfFUPRc2G+4KBLzZFLga2A94ETg+IhZLOoB0kCLJD9mMiLgxPWYxSTShtcCaNMD9kHF0LGtTWwCnAesVxJJGpgsd9Skipjbi4kVGTR8D/AewI7AU2Bl4hCRgtpkNQhrw5ltUBbyRNKs64A3JlL6XImI3SScAF5CE35sPdKXdPzsAD0r6WdUPxmER8UJZac8KUwmOjmXlW7Fq9YRHnl1x7vOvrNpx+3Gjnt19h7Fnjh21cT2Vw/NJ1lOYSxKVaxXwEsmCEG+T9BOSwcejgIvTGTuVG98uki7WXwL/B7yLZFzUtIh4rcjFi9SIzwUOAn4dEfum0bA+nHOMmWXLDXiTvp6RPr8BuFSSIqJ6kfNR9D2WoxRuerZmW7Fq9YRfznvu12fNmj+p8h0855i9Dzpqn+0Pr6Mwng7sHRGTJR0K/CJ9XVmR6Z/SsUubkdw0/zgiXqw5xySS6JGfkHQ9cCzw/SIXL7IM4ur0giMkjUijYQ1pc5dZByoStKZ3n7S2+zKwNYCkAyU9TLL026lVteEAfiXpvjTgzQYknSKpW1L3smXLBpTo/qYnLX7x1QGdx2ywHnl2xbmVQhiS7+BZs+ZPeuTZFec28DL3VBXCAKdLepAkAuRbSArdWk9GxNz0+X3AxKIXK1IQ/1nSGGAO8ANJF5NE1zKzJomIuyNiL2B/4Ivp4uYA746Id5KsLPMpSe/p49iZEdEVEV3jx4+v3Zwpa3qS2VB4/pVVO/b5HXxl1Y4NvExvGZfWkA8H/joi3kESArOvaQCvVz1fywDGYBUpiH8DbA78C3ATsBA4uugFzKxPRQLe9O4jaSRJPlyvOSwiHgFWAnunr59O/10K3EjSBD5gPT3BomUruXPhCyxatpKenqT1uzI9qZqnJ9lQ2n7cqGf7/A6OG/VMHaetLJfYl81Jxmr8RdLuJF21DVWkIB4J/IokstVY4Id9tI2b2cD0BryRtAlJwJtZNfvMAk5Knx8H3JauibpLWjAjaWeSASWLJY2WNDZ9fzRJIJz5A01YpR946iVzOPG7dzP1kjnc9PBz9PREb5jKyg+hw1TaUNt9h7FnnnPM3k9UfwfPOWbvJ/bYYeyXB3vOtEy7Q9J8kvgW1W4CRkp6hGRQ112DvU5/Cs8jlvR2khGbxwJLIuLwRiemHp5HbE024JFKkqaSrHVaCXjz1eqAN2lz8zXAvsBy4ISIWCTpIySDS1YDPSQBb34iaVeSWjAkN9DXRsRXs9LQV77JW66wMmra05OsAQb1xamMml76yqodtx036pk9dhj75TpHTTfVQFZfWgo8R9I0tm05yTEbPiJiNjC75r2zqp6vAj7Qx3HXkBTQte8vAt5Rb7ocptJa3dhRGz91wC5bnZS/Z3vIbZqWdJqk24FbSUZsfiIi3l52wsysOdwPbDa0ivQRvwX4bETsFREzagIOmFmHcT+w2dAqsvqSVz0yG0YcptJsaA2kj9jMhgn3A5sNnSJN02ZmZlYSF8RmZtZ4PT3wwhPw5Jzk356e/GOaRNIWkk4b5LGflfSmeq7vgtjMzBqrpwce/Rlcdghc9b7k30d/1sqF8RYkyyAOxmeB1i2IJU2R9JikBZKm97H9PZLul7RG0nE129ZKmps+aiMOmZlZq1q+EG78JKxOVwFc/VryevnCxpx/1SsT+OPvr2L+/97CH++8mlWvTKjzjL3LIEq6UNLnJd0r6SFJX4EkWp2kX0h6UNJ8ScdLOp1kieDfSPrNYC9e2mCtguutPgV8DPhcH6d4LSIml5U+MzMryYrn1hXCFatfg5XPwTZ9LVw0AKtemcAjP/01sz8/idWvwcabwdQLD2KPaYczalwjlkE8kiSk7AEkkb9mpYunjAeeiYi/B5C0eUS8LOkM6lwDvMwace96qxHxBlBZb7VXRCyOiIdIwvSZmVm7yOoDHrt9UkBW23gzGLN9/dd9fv65vYUwJAX87M9P4vn5jVoG8cj08QBwP0ks90kkS44eIekCSYdExMsNul6pBXGR9VazjErXTL1L0vv72qGedVXNzGyQ8vqAt3or/MNl6wrjjTdLXm/11vqvveK5Hfusba94rlHLIAr4t4iYnD52i4grIuJx4J0kBfJ5ks7KPk1xrTyPeOeIeDoNZH+bpHkRsV4HQ0TMBGZCEry+GYk0M+tYPT1Jv+6K55Ja7lZvhREj+u8D/uSeSdPziBGw+9HJ65XPJTXhyrH1GrvDs2y82fpN3xtvBmO3b9QyiDcD50r6QUSslLQTyQIrI4HlEfF9SX8GPl5zbEs2TRdZb7VfVeuqLiJZgnHfRibOzMwyZNV6s/qAK0aMSArliYesK5wbYbu9zmTqhU+sV9ueeuETbLd3o5ZBPAK4FrhT0jzgBpKCdh/gHklzgbOB89LDZwI3teRgLarWWyUpgE8APlTkQElbAn+JiNclbQMcDHy9tJSaNYGkKcDFJMsgXh4R59ds3xS4GtiPZNWz4yNisaQDSFuCSJrRZkTEjUXOacNMfzXaItuzar2VPuDaWmkj+oDzjBr3FHtMO5yt3nouK57bkbHbP8N2e3+5joFaAEREbfl0cc3rhSS15drjvgl8s55rl1YQR8QaSZ8mSXhlvdWHa9Zb3Z9k/dQtgaMlfSUi9gL2AC6T1ENSaz/fi01YJyk4q+Bk4KWI2E3SCcAFJGuCzwe60jy2A/CgpJ8BUeCcNlxUarSVwrTST7v70Ulhm7c9q9Y74eBk39pjG9EHXMSocU+x87s6ZhnEUvuIC6y3ei9Jk3Xtcb8naQYw61S9swoAJFVmFVQXmtOAGenzG4BLJSki/lK1zyiSArjoOa3TDLYfN297Vq23zD7gYch/NbPmKDKroHefiFgDvEyyJjiSDpT0MMkIzlPT7YVmKni2QQeppx83b3veyOey+oCHoVYeNW1m/YiIu4G9JO0BXCXplwM41rMN2klZ/bh5213rHTL+i5o1R5FZBb37SBoJbE4yaKtXRDwCrAT2LnhOayd583WzarV5Ndoic31d6x0SrhGbNUeRWQWzgJOAO0lC7t0WEZEe86d0sNbOJJF/FgN/LnBOazWDrfHW24/rGm/LcEFs1gRFZhUAVwDXSFoALCcpWAHeDUyXtJokPOxplTi3fZ1zSD+YDUw9I5e3mbSuVtvf6OVKjba/+M55221IKKIzuoi6urqiu7u72cmw4UvNTsBgON802QtPJM3NtTXaT85JC
se87bCuRt2etdq2zDeN1jb/W2ZmLStrAYSs7fWOXAb343YAN02bmdWjnsAZHrlsuEZsZlaf/gZULV+Yv901XsM1YjOzfFkjm/MGVOVtd4132HNBbGaWJa/puRGBMzxyeVjzbZeZWZa8pudGBM6wYc01YjOzLHlNyw6cYXVyQWxmBv33AxdZe9eBM6wOviUzM8uK6eymZSuZa8RmNjzUE9PZTctWolK/SZKmSHpM0gJJ0/vY/h5J90taI+m4mm0nSXoifZxUZjrNrMPVs4oReC6vlaq0b5OkjYBvAUcBewInStqzZrengI8B19YcuxVwNnAgcABwtqQty0qrmXWArDCTeSOfK/3A1Wr7gc1KUuZt3QHAgohYFBFvANcB06p3iIjFEfEQyQoy1f4OuCUilkfES8AtwJQS02pm7azeGq/7ga2JyiyIdwL+VPV6Sfpew46VdIqkbkndy5YtG3RCzYZagW6bTSX9MN1+t6SJ6ftHSLpP0rz037+pOub29Jxz08e2Q/iRmqveGm/vFKM58LGfJ/9WAnaYlaytv2URMTMiuiKia/z48c1OjlkhBbttTgZeiojdgIuAC9L3XwCOjoh9gJOAa2qO+8eImJw+lpb2IZrFqxhZBypz1PTTwFuqXr85fa/osYfWHHt7Q1Jl1ny93TYAkirdNn+o2mcaMCN9fgNwqSRFxANV+zwMbCZp04h4vfxkN5lXMbIOVea38F5gkqRdJG0CnADMKnjszcCRkrZMB2kdmb5n1gmKdL307hMRa4CXga1r9jkWuL+mEP7vtFn6y5L6XHS9bbt0vIqRdajSasQRsUbSp0kK0I2AKyPiYUnnAN0RMUvS/sCNwJbA0ZK+EhF7RcRySeeSFOYA50TE8rLSatZuJO1F0lx9ZNXb/xgRT0saC/wY+Ahwde2xETETmAnQ1dUVQ5Dc4upZ5cg1XmtTpQb0iIjZwOya986qen4vSbNzX8deCVxZZvrMmqRIt01lnyWSRgKbAy8CSHozyQ3sRyNiYeWAiHg6/XeFpGtJmsA3KIhbVr2rHDmMpLUp3y6aDb0i3TazSAZjARwH3BYRIWkL4BfA9Ii4o7KzpJGStkmfbwy8D5hf7scYhHrm+nqKkXUoh7g0G2JFum2AK4BrJC0AlpMU1gCfBnYDzpJUaV06EngVuDkthDcCfg18d8g+VBF5Nd56Vzkya1MuiM2aoEC3zSrgA30cdx5wXj+n3a+RaWy4vHjOjVjlyKwN+VbSzIaGo1uZ9ck1YrOskbrWOJ7ra9YnF8TWOrIKxLzCcrDb8/oti5zb1tff36tS4639W/c119dNzzaMuCC29dVb4A323FkFImQXlnmFadb2vH7LIgW1rZP393KN12wDzgGdKGuKSNb2vBVsimzv77p5x2ZNXcmb1lLP9rx+y7xz2/ry/l6ObmW2AeeCeuQVePUcm1eoDbbAy9peT4FWT0EL2QViXmFZz/a8VXnyzj0cZX3//PcyGzAXxINVZu0wa3u9BV49tcOs7fUUtJBdIOYVlvVszxup6wXj15f3/fPfy2zAOr8gHmzNMm97mbXDepppy6wdZm2vp6CF7AIxr7CsZ3veOrTDdUpNf999R78ya7jOHqxV5gCgvIInawBQXgShrO1B9rF5U0SytueNas3bnnXdvGPzBvJkbcs7tsj2/kbqDscBRlnffUe/Mmu4zi6Is0bEQnZhWU8UoLwfq3oKy8rzwRZ4WdvrKdDqLWgr+2QViFnTWurdnmW4TanJ+u47+pVZw3V2QVxPzTKvMC2zdpi3vZ4Cr57aYdb2egtaax1Z3/0JB+fPBTazAensgriemmU9UYDqrR3W00xbOb6s2mEWF7SdIeu776Zns4ZTRHnrgkuaAlxMshrM5RFxfs32TUnWS92PZK3V4yNisaSJwCPAY+mud0XEqVnX6urqiu7u7vXfLLOPOE9lOpB/rIYLNTsBgzHgfOPvsDVWW+abRiutIJa0EfA4cASwhGQN1hMj4g9V+5wGvD0iTpV0AvAPEXF8WhD/PCL2Lnq9Pn9QILtAzCssXZhacQP+QanjRvUI4HxgE+AN4PMRcVt6zH7A94DNSFZ3+pfIyOSDyjdmjeOCmHKbpg8AFkTEIgBJ1wHTgD9U7TMNmJE+vwG4VFJj/2PKHABkNkjpjeq3qLpRlTSr+kYVOBl4KSJ2S29ULwCOB14Ajo6IZyTtTbKu8U7pMd8GPgHcTVIQTwF+OeAE+rtvNmTKvMXdCfhT1eslrPux2GCfiFgDvAxsnW7bRdIDkn4r6ZAS02nWDL03qhHxBlC5Ua02DbgqfX4D8LeSFBEPRMQz6fsPA5tJ2lTSDsC4iLgrrQVfDby/9E9iZnVp1bamZ4EJEbEvcAZwraRxtTtJOkVSt6TuZcuWDXkizepQ741qxbHA/RHxerr/kpxzOt+YtZgym6afBt5S9frN6Xt97bNE0khgc+DF9G7+dYCIuE/SQuBtwHqdWRExE5gJIGmZpD9mpGcbkia9VuN0DUyrpuumiJgylBeUtBdJc/WRAznO+aZUTtfADHm+aUVlFsT3ApMk7UJS4J4AfKhmn1nAScCdwHHAbRERksYDyyNiraRdgUnAoqyLRcT4rO2SuiOia3AfpTxO18C0aroGYdA3qgCS3gzcCHw0IhZW7f/mnHOux/mmsZwuG4zSmqbTprRPkwwkeQS4PiIelnSOpGPS3a4Atpa0gKQJenr6/nuAhyTNJekbOzUilpeVVrMm6L1RlbQJyY3qrJp9KjeqsP6N6hbAL4DpEXFHZeeIeBZ4RdJB6aDHjwI/LflzmFmdSg3oERGzSUZuVr93VtXzVcAH+jjux8CPy0ybWTNFxBpJlRvVjYArKzeqQHdEzCK5Ub0mvVFdTlJYQ3KDuxtwlqRKfjoyIpYCp7Fu+tIvGcyIaTMbUp0dWWt9M5udgH44XQPTqukasDpuVM8DzuvnnN1A4fn3BbTq39vpGphWTZdRcmQtMzMzy9aq05fMzMyGBRfEZmZmTdTxBbGkKZIek7RA0vT8I4aOpMWS5kmaK6mPgL9Dlo4rJS2VNL/qva0k3SLpifTfLVskXTMkPZ3+zeZKmjrU6RoOWjXftEqeSdPifGMN0dEFcVU836OAPYETJe3Z3FRt4LCImNzkOX7fI4lJXG06cGtETAJuZd3UsqH0PTZMF8BF6d9scjrgyRqoDfJNK+QZcL6xBunogphi8XyHvYj4Hcn0mGrVcY6vogkxi/tJl5XP+aYA5xtrlE4viIvE822mAH4l6T5JpzQ7MTW2SwNEADwHbNfMxNT4tKSH0ia4IW/6GwZaOd+0cp4B5xsbhE4viFvduyPinSRNgJ+S9J5mJ6gvaezvVpnn9m3grcBkksVB/qOpqbGh1hZ5BpxvrLhOL4iLxPNtmoh4Ov13KUnc4AOam6L1PJ8uq0f679ImpweAiHg+ItZGRA/wXVrrb9YpWjbftHieAecbG4ROL4iLxPNtCkmjJY2tPCdZQWd+9lFDqjrO8Um0SMziyo9c6h9orb9Zp2jJfNMGeQacb2wQOjrEZX/xfJucrIrtgBuT
2PyMBK6NiJuakRBJ/wMcCmwjaQlwNnA+cL2kk4E/Ah9skXQdKmkySZPfYuCTQ52uTtfC+aZl8gw431jjOMSlmZlZE3V607SZmVlLc0FsZmbWRC6IzczMmsgFsZmZWRO5IDYzM2siF8RWmKRDJf282ekwayfON5bHBbGZmVkTuSDuQJI+LOmedN3RyyRtJGmlpIskPSzpVknj030nS7orDQZ/YyUYvKTdJP1a0oOS7pf01vT0YyTdIOlRST9QGl3BrN0531izuCDuMJL2AI4HDo6IycBa4B+B0UB3ROwF/JYk2g7A1cAXIuLtwLyq938AfCsi3gG8iyRQPMC+wGdJ1qndFTi45I9kVjrnG2umjg5xOUz9LbAfcG96070ZSeD5HuCH6T7fB/5X0ubAFhHx2/T9q4AfpfF8d4qIGwEiYhVAer57ImJJ+nouMBH4v9I/lVm5nG+saVwQdx4BV0XEF9d7U/pyzX6DjW36etXztfg7ZJ3B+caaxk3TnedW4DhJ2wJI2krSziT/18el+3wI+L+IeBl4SdIh6fsfAX4bESuAJZLen55jU0lvGsoPYTbEnG+saXxX1mEi4g+SzgR+JWkEsBr4FPAqcEC6bSlJfxgkS7V9J/3BWAT8v/T9jwCXSTonPccHhvBjmA0p5xtrJq++NExIWhkRY5qdDrN24nxjQ8FN02ZmZk3kGrGZmVkTuUZsZmbWRC6IzczMmsgFsZmZWRO5IDYzM2siF8RmZmZN9P8B/2ALHFLwKzIAAAAASUVORK5CYII=", - "text/plain": [ - "
" - ] - }, - "metadata": { - "needs_background": "light" - }, - "output_type": "display_data" - } - ], - "source": [ - "output1, _ = track_model_metrics(model=model1, \n", - " train_interactions=train_interactions, \n", - " test_interactions=test_interactions, \n", - " k=K,\n", - " no_epochs=NO_EPOCHS, \n", - " no_threads=NO_THREADS)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The second model (with both implicit and explicit data) fitting progress:" - ] - }, - { - "cell_type": "code", - "execution_count": 35, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAeIAAADQCAYAAADbLGKxAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAAuQUlEQVR4nO3de5hcVZnv8e+vIdiYC5AQEkBjuERjUCdicxmZoB7BiYiAR+R2dFAZg6OImmGOOKMSQT1cnIAKMwaFEVFBxMGJTASRi0aHWwMRCBdJQoQguSO5QGNCveePvSupVLqrqrtq966q/n2ep5+u2te1O9n11lp7rXcpIjAzM7N8dORdADMzs6HMgdjMzCxHDsRmZmY5ciA2MzPLkQOxmZlZjhyIzczMcuRA3IQkTZV0VIX1XZK+mdG53y7peUkLJD0q6ZwGHXeepF0rrP+upCmNOJdZuSa6px6T9PUGH3+ipIdLznVjI49v2dsx7wJYr6YCXcC88hWSdoyIbqA7w/PPj4ijJQ0HFkj6eUTcX1aGzf05YET0+SGYrv/7AZbVrBZTaY57amfgAUk3RMTvMjyftRDXiDOQfkN9TNL3JP1B0g8lHSHpd5KekHRwut1wSVdKukfSA5KOlbQTcC5wYvoN+kRJsyRdLel3wNWl33oljZD0H5IekvSgpPc36joiYiNwH7B/L2UYK+mnku5Nfw6rVB5JSyXtnl7zf0v6vaSHJZ2Yrr9DUlf6+uR0/4clXVDyd90g6avpvndJGteoa7Xm1kb31IvAAmDv9FzvknSnpPsl/UTSiHT5QZL+J/2/fo+kkenfYH667f2S3tqoclnOIsI/Df4BJgKbgTeSfNm5D7gSEHAs8LN0u68BH0xf7wr8ARgOfBi4tOR4s9Jj7Jy+fztwY/r6AuCSkm1366U8F5Pc/OU/Z/eybemxxwBLgQN6KcOPgL9JX08AHq1UnvQ4uwPvB75Tsn6X9PcdJDWWvYCngLEkLTa3Acel2wTw3vT1hcAX8v639s/g/LTRPbVbet7x6f3wG2B4uu5zwJeAnYAlwEHp8lHpvfBKoDNdNgnoLvnbPFx+Lv+0zo+bprPzZEQ8BCBpIXBrRISkh0huHIB3AcdIOit930kS1HozN5Jv0+WOAE4qvomI58o3iIjP9rPs0yQ9ABSA8yNioaQPlJXhCGCKpOI+o9Jv89XK8xDwr2lN98aImF+2/iDgjohYBSDph8DhwM+AvwDF51/3AUf287qstbX6PfV7kgB6SUQsl3Q0MAX4XXof7QTcCbwOeDYi7k3PtQ6S2j5wqaSpwMvAa/tZBmtSDsTZeankdaHkfYGtf3cB74+Ix0t3lHRIL8fbONCCSLoYeEcvq66NiPN7WT4/Io6uUoYO4NCI6Ck7V8WyRMQfJB0IHAV8RdKtEXFuxZ222hQRxeToL+P/v0NNy99TkvYB7pJ0XVrWWyLi5LJjv7GP034WWAH8Fcn919PHdtZi/Iw4XzcDn1IavSS9OV2+HhhZ4zFuAT5ZfCNpt/INIuKzETG1l5/ePjBq9UvgUyXnnVpLeSTtBbwQET8ALgIOLDvuPcDb0ufJOwAnA7+uo5w2tDT1PRURTwLnkzRD3wUcJmn/9DzDJb0WeBzYU9JB6fKRknYEdiGpKReADwE71Hg91uQciPN1HjAMeDBtajsvXX47SbPvgmJnpgq+AuyWdmz6Pb1/S8/CmUBX2pnlEeDjNZbnjcA9khYA56TbbxERzwJnk/wNfg/cFxH/ld1lWJtphXvq2ySPW4rPrq+R9CBJs/TkiPgLcCLwrfT8t5A0sf8bcGq6bDJ11OituWhrS5+ZmZkNNteIzczMcuRAbGZmliMHYjMzsxw5EJuZmeWobQLx9OnTgyTzkn/8k8dPS/J945+cf4w2CsSrV6/OuwhmNZM0XdLjkhZJOruX9a+Q9ON0/d2SJqbLJ0p6MR2Gs0DSt0v2eUuaH3mRpG8Wx9JW4vvGLH9tE4jNWkWaqOQy4N0kKQ5P1vZTQJ4GPBcR+5PkNb6gZN3ikgQSHy9Z/u/Ax0jSKE4Cpmd1DWbWOA7EZoPvYGBRRCxJkzdcSzJxQaljgavS19cD76xUw5W0JzAqIu5K04B+Hziu4SU3s4ZzILaWUCgES1Zt4M7Fq1myagOFQks/XtobeLrk/bJ0Wa/bRDL38/Mks2EB7JNO8fdrSdNKtl9W5ZgASJohqVtS96pVq+q7EmtqbXbftC0nzbemVygENy1czszrFtCzqUDnsA5mnzCV6QeMp6Oj6mPQdvMsMCEi1kh6C/AzSQf05wARcTlwOUBXV5c/mdtU7vdNoQBrF8P65TByPIzeDzpc9+uN/yrW9Jau2bjlwwSgZ1OBmdctYOmalk21+wzw6pL3r0qX9bpNScL/NRHxUkSsAYiI+4DFJNPhPZMep9IxbQjJ9b4pFOCxn8OcaXDV0cnvx36eLLftOBBb01uxrmfLh0lRz6YCK9e37Cxw9wKTJO0jaSeSuW/nlm0zFzg1fX08cFs69+7YtLMXkvYl6ZS1JJ0sY52kQ9NnyX8HeLKMISzX+2btYrjhdNiUTve86cXk/drF2Z+7Bblp2preuFGddA7r2OZDpXNYB3uM7Kz5GIVCsHTNRlas62HcqE4mjhmeW7N2RGyWdAbJlH07AFdGxEJJ5wLdETEXuAK4WtIiYC1bJ6o/HDhX0iaSeXg/HhFr03WfAL4H7Az8Iv2xIaoR982ArV++NQgXbXoRNiyH3Sdlf/4Wk2kgljQd+AbJh813y+fqlPRxknk/XwY2ADMi4pF03edJhnC8DJwZETdnWVZrXhPHDGf2CVO3e9Y1cczwmvbP/VlZLyJiHjCvbNmXSl73AB/oZb+fAj/t45jdwBsaW1JrVfXeN3UZOR6G7bxtMB62M4wYn/25W1Bm0yCmzWd/AI4k6cF5L3ByMdCm24yKiHXp62
OAT0TE9HRM5TUkwzz2An4FvDYiXu7rfF1dXdHd3Z3JtVj+ijXalet72GPk9jXaSjXeJas2cNQ3529XM5h35jT2HTuiUUVsyV5jvm/aW7X7JsMTJ8+Ei83Tw3aG982Bye8t77DVkvdNo2VZI94yVhJAUnGs5JZAXAzCqeFsTXl2LHBtRLwEPJk2zx1MMnG2DUEdHWLfsSN6DZzVaryVnpXtO3ZE1WbrZmrWNuuPSvdNxidOgu7pU5Lm6BHuNV1JloG4t7GSh5RvJOmTwExgJ+B/lex7V9m+242JlDQDmAEwYcKEhhTaWk9fvUMnpzXeSs/KqgXxZmzWNhs09QxB6uhIngf7mXBVuX89iYjLImI/4HPAF/q57+UR0RURXWPHjs2mgNb0qvUOLT4r6xyW/HcvfVZWbYhHGw6dMquNhyANmixrxLWMlSx1LUmu3IHsaw2SZzPtQI9drXdoR4eYfsB4Jp85bbtnZdWarautN2tbfQ1BOn2Ka7kNlmUg3jJWkiSIngScUrqBpEkR8UT69j1A8fVc4EeSZpN01poE3JNhWY3qz1qzbKat59i19A7t61lZtSCe6xAQszx5CNKgyaxpOs2PWxwr+ShwXXGsZNpDGuAMSQslLSB5Tnxquu9C4DqSjl03AZ+s1GPaGiPrZtpKeW/rOXaxxjvvzGlcO+MQ5p05reYvB5WarWtZb9a2ikOQSnkIUiYyHUdcw1jJT1fY96vAV7MrXb6asSduls209fZsrmagvUMrNVvXst6sqdXT2Wr0fsmQo/IhSKP3q//Ytg1n1spB3j1x+/oSkGUzbT09m7NWLYjnNgTErB61j+XtXaUhSPUe27bhv1gOsmzirWXfmxYu56hvzufk79zNUd+cz00Ll1MoRN3NtJXKVU/PZjMbgEbkey4OQZo4LfldDLLOJd1QrhFXkFXzcZZNvNVUq5kOtJm2Wrnq6dlsZgOQZWcrd+RqKNeI+1Cp5li6zUBqpsWgVKreJt5aa9PVaqbFZthD992dfceO2C4Q9rW+WrlqqfFWO7eZ9UOWna3ckauhHIj7UC2w1BKo+1JLUOoryNc7tVk9XwIqqSXAD7Rns5kNQLGzVTFglne2atZjD0Fumu5Dtebjak28lVRrhq3UzFtvp6asZmSppVzu9GQ2iLLM9+xc0g3lQNyHaoEly+E2lYJ8vYE0q2exuU65Zma9qzPfc8V+Ms4l3TAOxH2oFliyHG5TLcjXG0izqJm6s5VZTjIaz5v3MMuhZEgH4krf9qoFlmqBup4e17X0MG7GJt5mLVezkjQd+AawA/DdiDi/bP0rgO8DbwHWACdGxNKS9RNIss/Nioivp8s+C/w9yZSiDwEfiYjaOhBYc6oUaDMcz1vt8VszJiVqVW0fiPv6z1LLt71KgaWeoTzVuJm3/UnaAbgMOJJkms97Jc2NiEdKNjsNeC4i9pd0EnABcGLJ+tnAL0qOuTdwJjAlIl6UdB1JjvfvZXoxlp1qgTbDiRkqtcxNHDPcteUGausn65V6NjdieruBDuWp5bj19DCuJ+GHDZqDgUURsSQi/kIy+9ixZdscC1yVvr4eeKckAUg6DngSWFi2z47AzpJ2BF4J/Cmb4tugqJY4o9J43jpVGmHh6UEbq60DcaX/LPUOA6qkEcce6JjaeoZV2aDaG3i65P2ydFmv26STqDwPjJE0gmT+7i+XbhwRzwBfB54CngWej4hflp9Y0gxJ3ZK6V61a1aDLaQGFAqx+Ap6cn/xuhXl1qwXaDMfzVhpmmeXn51DU1oG40n+WrMbTQnZjdWvhb6pDwizg4ojYULpQ0m4kteh9SKYPHS7pg+U7R8TlEdEVEV1jx44djPLmr1Unua8WaDMcz1upZS7Pz7h21NaBuNJ/lixzG+eZN9nfVFvGM8CrS96/Kl3W6zZpU/MuJJ22DgEulLQU+Azwz5LOAI4AnoyIVRGxCfhP4K0ZXkPraNXcyNUCbUcHhdcdzfpTb+fPJ9zA+lNvp/C6oxs2nrevljnnhm+stu6sVanTU5bDbfIcyuOJ7FvGvcAkSfuQBNyTgFPKtplLMkf3ncDxwG0REcC04gaSZgEbIuJSSYcAh0p6JfAi8E6gO+sLaQmtmhu5SuKMQiG46ZGVzLxuafoZt5TZJ+yaeacpD1dsrLYOxLXMNZvVcJu8hvK4x3VriIjNaS32ZpLhS1dGxEJJ5wLdETEXuAK4WtIiYC1JsK50zLslXQ/cD2wGHgAuz/I6Wkaxibc0GLdKbuQKiTPqyfBXf7E8XLFRlHzBbn1dXV3R3e0v/7B1yJa/qQ6qlvwDD5n7pk3nz71z8WpO/s7d2y2/dsYhHLrv7jmUqN9a8r5ptLauEQ9V/qZqVqZNcyP7UVR7aO3/hWZmteprkvsW5k5T7cE1YjOzFuVOU+3BgdjMrIU166Mo56KuXaaBuIak9jNJEtRvBlYBH42IP6brXiZJWg/wVEQck2VZzcysMTxzU/9k9pCkJKn9u4EpwMmSppRt9gDQFRFvIsmle2HJuhcjYmr64yBsZtYinOGvf7LsrVA1qX1E3B4RL6Rv7yLJLmRmZi3MGf76J8tAXEtS+1KnUTKlG9CZJqa/K51pxszMWoBzUfdPU/TfTxPTdwEXlSx+TUR0kaT9u0TSdlnMh+wsMmZmTczDqvony85atSS1R9IRwL8Ab4uIl4rL0yndiIglku4A3gxsk6E9Ii4nTeHX1dXVHinCzMxanIdV9U+WgbhqUntJbwbmANMjYmXJ8t2AFyLiJUm7A4exbUcuMzNrYs06rKoZZRaIa0xqfxEwAviJJNg6TOn1wBxJBZLm8/Mj4pGsympmNhR5rG9zyHQccUTMA+aVLftSyesj+tjvf4A3Zlk2M2tBhUIyh/D65cmMSs2SL7pZy1WBx/o2D2fWMrPWkOcMSpUCbYvO7LR0zUYuuukRvjqtk3H6MyvYjYtueoTJ40e6OXmQORCbWWtYu3hrsIPk9w2nJzMq9TJXb8NUC7T1liun2vRzG3u48pAV7DN/5pbrOnDabJ7b2AMOxIOqeb+umbUxSdMlPS5pkaSze1n/Ckk/TtffLWli2foJkjZIOqtk2a6Srpf0mKRHJf31IFzK4Fm/fGuwK9r0YjKtYZb6CrRrF9dfrmKQnzMNrjo6+f3Yz5PlGXvtjqu2BmGATS+yz/yZTNrRQ0EHmwOx2SCrMf3racBzEbE/cDFwQdn62WybAAeSvO43RcRk4K+ARxtd9lyNHJ/URksN2zmZWzhLVQJtYUTv5SoMr6Fc1YJ8hoZvWt3rdQ3ftDrzc9u2HIjNBl/V9K/p+6vS19cD71Q6tCDNNPcksLC4saRdgMOBKwAi4i8R8ecMr2Hwjd4vaRIuBr1iE/Ho7XL9NFaVLwBPazxLD5+9TbmWHj6bp1VDIM6rlg909HFdHSMz/mJj23EgNht8taR/3bJNRGwGngfGSBoBfA74ctn2+5DMYPYfkh6Q9F1J7ZXGqKMjeS57+nz48I3J78HoEFXlC8Cfnn+JU347lp8edA3zD7uKnx50Daf8dizPrnupwkFTedXyIb8vNrYdd
9Yyay2zgIsjYkNaQS7aETgQ+FRE3C3pG8DZwBfLDyBpBjADYMKECZkXuKE6OpIOUFl2zurtnJPfm3S+2rA8CZIlHarGjepk7Qub+cfb/gIMA16oPa9yMRiWdwQbjGBY5bps8DgQmw2+WtK/FrdZJmlHYBdgDXAIcLykC4FdgYKkHpLm62URcXe6//UkgXg7Tg07ABW+ABTzKpePx60pr3LewTCPLza2HQdis8FXNf0rMBc4FbgTOB64LSICmFbcQNIsYENEXJq+f1rS6yLiceCdgLPRDYK68yo7GA55DsRmg6zG9K9XAFdLWgSsJQnW1XwK+KGknYAlwEeyuQIr57zKVg8HYrMc1JD+tQf4QJVjzCp7v4BkOlEzayF+Km9mZpYjB2IzM7McORCbmZnlyIHYzMwsR+6sZWbWxAqFYOmajaxY18O4Uf0cGmUtwYHYzKxJFQrBTQuXb5csZPoB4x2MayDpM8DlEfFC3mWppGrTtKRxkq6Q9Iv0/RRJp2VfNDOzNlEowOon4Mn5ye8apzlcumbjliAM0LOpwMzrFrB0zcYsS9tOPgO8Mu9CVFPLM+LvkSQe2Ct9/weSizMzG1SFQrBk1QbuXLyaJas2UCi0QIbOOuYcXrGuZ0sQLurZVGDl+p6sStuyJA2X9N+Sfi/pYUnnkMSt2yXdnm7z75K6JS2U9OWSfY9K5/G+T9I3Jd1YcswrJd2TTqZSPktaQ9TSNL17RFwn6fOwJSvQy1kUxsysLy3bTNvXnMOnT6ma1nLcqE46h3VsE4xrnlBi6JkO/Cki3gNbpgb9CPCOiChOsvwvEbE2nRP8VklvIqlczgEOj4gnJV1Tcsx/IUkv+1FJuwL3SPpVRDS0SaKWGvFGSWOAAJB0KMmUbGZmg6Zlm2nrmHO4OKFE57Dko7pfE0oMPQ8BR0q6QNK0iOgtTp0g6X7gAeAAYAowGVgSEU+m25QG4ncBZ0taANwBdAINn7KslhrxTJIE9PtJ+h0wliQJvZnZoKnUTNvUOZ6Lcw6XBuMa5xyue0KJISQi/iDpQOAo4CuSbi1dn06ychZwUEQ8J+l7JIG1EgHvTydSyUzVGnFE3A+8DXgrcDpwQEQ8WMvBJU2X9LikRZK2m5JN0kxJj0h6UNKtkl5Tsu5USU+kP6fWfklm1o6KzbSlWqKZdvR+xHHfToIvwLCdk/c1zjlcnFDi0H13Z9+xIxyE+yBpL+CFiPgBcBHJ/NzrgZHpJqOAjcDzksYB706XPw7sK2li+v7EksPeDHxK6eTfkt6cRdmr1ogl/V3ZogMlERHfr7LfDsBlwJHAMuBeSXMjonRqtgeAroh4QdI/ABcCJ0oaDZxDksA+gPvSfZ+r+crMrK3UNe9vjgqIOzoOZd0h1zIm/swa7cqojsm8HTmjUmO9EbhIUgHYBPwD8NfATZL+FBHvkPQA8BjwNPA7gIh4UdIn0u02kkxTWnQecAnwoKQO4Eng6EYXvJam6YNKXneSzHN6P1AxEAMHA4siYgmApGuBYymZIzUibi/Z/i7gg+nrvwVuiYi16b63kDyIL227N7MhpFWbaZeu2cgnfvRA2qw+DNhI57AHmHfmtOZuUm8xEXEzSQ22VDfwrZJtPtzH7rdHxOS05ntZuh8R8SJJS3CmqgbiiPhU6fu059i1NRx7b5JvHUXLgEMqbH8a8IsK++5dvoOkGcAMgAkTGv783MyaTD3z/lbNUFUoJD2c1y9PnuuO3g866q+zVnu27cxZTeFj6SPQnUhaaucM5skHkllrI7BPIwsh6YMkzdBv689+EXE5cDlAV1dXCwwoNLM8VB36VBzrWxxmNGxneN8cmPzeuoNxpSFILTskq81ExMXAxXmdv5bMWj+XNDf9uZHkwfYNNRz7GeDVJe9flS4rP/4RJGO1jomIl/qzr1krq6Ez4ysk/Thdf3dJZ5Li+gmSNkg6q2z5DmnygRszvoSWUXXoU19jfdcurvvclYYgteyQLGuoWmrEXy95vRn4Y0Qsq2G/e4FJaZfxZ4CTgFNKN0h7oM0BpkfEypJVNwNfk7Rb+v5dwOdrOKdZS6ixM+NpwHMRsb+kk4AL2LZH52y2Ps4p9WngUZJeokYNQ58qjfWtknSjmkrPtlt2SJY1VC3PiH89kAOnGbjOIAmqOwBXRsRCSecC3RExl6SL+QjgJ2nv8Kci4pg088l5bO29dm6x45ZZm6jamTF9Pyt9fT1wqSRFREg6jqQH5zZVJ0mvAt4DfJUkB4BRQ4aqOsb61qKvZ9vOnGVQoWla0npJ63r5WS9pXS0Hj4h5EfHaiNgvIr6aLvtSGoSJiCMiYlxETE1/jinZ98qI2D/9+Y96L9SsydTSIXHLNhGxmSSj3RhJI4DPAV9me5cA/xfoM5GxpBlpvt3uVatWDfgCWknVDFWj90ueCZeM9eV9c2oe65tZuWxI6LNGHBEj+1pnZrmaBVwcERvSliQAJB0NrIyI+yS9va+dh2Inx6pDnzo6ko5Zp09JmqNHNK7XdF3lskGRjgY6JSL+rZ/7zUv3+3M956+517SkPShJBxYRT9VzYrMhrpYOicVtlknaEdgFWEMyDPB4SRcCuwIFST0kNehjJB1Fcq+OkvSDiPggVn3oU0dH8jy4l2fCWQ4xqmdIljXMrsAngG0CsaQd09aoXkXEUY04eS2ZtY4B/pVkOqmVwGtIOoIc0IgCmA1RVTszkuR4PxW4kyS/+20REcC04gaSZgEbIuLSdNHn0+VvB85quiCc0VjdLHmIUfNZ37NpwqPPrj9vxbqevcaP6nx28p4jvzCyc1g9lcPzSeZTWECSlasHeI5kQojXSvoZyZfiTuAbaasSkpaSDL0dQdJx8rck6aCfAY5NE4JUVUuN+DzgUOBXEfFmSe9gawYsMxuAGjszXgFcLWkRsJYkWLeuDMfqZqmvIUaTnRkrF+t7Nk34xUPLf/WluQ9PKn4xOveYNxz67jeOP6KOYHw28IaImJp+if3v9H1xRqaPpp2IdyYZ4fDTiFhTdoxJwMkR8TFJ1wHvB35Qy8lrCcSbImKNpA5JHRFxu6RLajm4mfUtIuYB88qWfankdQ/wgSrHmNXH8jtIpm1rHnXMy5snDzFqLo8+u/68YhCG5N/iS3MfnjRx9+HnHbzP6EZNEHRPSRAGOFPS+9LXryYJuuWB+MmIWJC+vg+YWOvJagnEf057ac4HfihpJWVDJszMqspwrG6WPMSouaxY17NXr1+M1vXs1cDTbIlxaQ35COCv0wmK7qD36RNfKnn9MrBzrSerpT3odpJOIp8GbgIWA++t9QRmZsDWsbqlGjhWNyseYtRcxo/qfLbX6TBHdf6pjsOWTpdYbheSxDovSJpM8qi2oWqpEe8I/JLkGdWPgR/30jZuZlZZcaxu+TPijMfq1stDjJrL5D1HfuHcY95waNkz4idev+fILw70mOnj199Jehh4EVhRsvom4OOSHiVJ8XxXfVewPSWdMGvYUHoTSXq99wPLIuKIRhemHl1dXdHd3Z13MWzoaslP5UG/b4q9pgdxrK41tQHdN8Ve0yvX9ey1x6jOP71+
z5FfrLPXdK76M/vSSmA5yQPqPbIpjpm1tQpjdc1qNbJz2FMN7JiVu1pmX/pE+nD6VmAM8LGIeFPWBTMzMxsKaqkRvxr4TEm3bDOzbLRgwg+zetUy+5KnHzSz7LVowg+zevl/t5k1h74SfqxdnG+5zDLmQGxmzaFSwg+zNuZAbGbNoUUTfljrk7SrpE8McN/PSHplPed3IDaz5lBM+FEMxi2S8MPawq4k0yAOxGeAugJxf8YRm5llp6Mj6Zh1+hQn/LDKetZNYMXD57F++V6M3PNZxh3wBTpHNWoaxFtI8macALwCuCEizpE0HLiOZO7wHUhmJhxHMkXw7ZJWR8Q7BnJyB2Izax51JPwoFIKlazayYl0P40Y5DWXb6lk3gUf/61fM+6dJW3rXH3XRobz+2CPqCMal0yC+i2T+74NJMn/NlXQ4MBb4U0S8B0DSLhHxvKSZwDsiYvVAL8lfNc2s5RUKwR2Pr+DB33fz8pL5PPj7+7jj8RUUCrWl8LUWsuLh87YEYUg69M37p0msePi8Bp3hXenPA8D9wGSSaQ8fAo6UdIGkaRHxfIPO5xqxmbW+p9duYN/VtzHx7plbxiAvPXw2T499L6/Zva9JdawlrV++V6+969cvb9Q0iAL+X0TM2W6FdCBwFPAVSbdGxLmNOKFrxGY5kDRd0uOSFkk6u5f1r5D043T93ZImlq2fIGmDpLPS96+WdLukRyQtlPTpQbqUprDz+qVM/M3MbcYgT/zNTF65fmmu5bIMjNzz2V57148c36hpEG8GPippBICkvSXtIWkv4IWI+AFwEXBgL/sOSKaBuIYPm8Ml3S9ps6Tjy9a9LGlB+jM3y3KaDSZJOwCXAe8GpgAnS5pSttlpJHOg7g9cDFxQtn428IuS95uBf4yIKSTzpX6yl2PmrlAIlqzawJ2LV7Nk1YaGNR2P2LSm1zHIwzd5xta2M+6AL3DURU9s07v+qIueYNwb6poGEShOg3gk8CPgTkkPAdeTBNo3AvekHbrOAb6S7n45cJOk2wd6/syapks+bI4ElgH3SpobEY+UbPYU8GHgrF4O8WJETM2qfGY5OhhYFBFLACRdCxwLlN4bxwKz0tfXA5dKUkSEpOOAJ4GNxY0j4lng2fT1+nTu1L3LjpmrQiG4aeFyZl63gOI8srNPmMr0A8bX3amqc7e9kw/k0mA8bOdkubWXzlFP8fpjj2D0fmmv6fF/Ytwbvlhnr2ki4pSyRd8oe7+YpLZcvt+3gG/Vc+4sa8RbPmwi4i9A8cNmi4hYGhEPAoUMy2HWbPYGni55vyxd1us2EbEZeB4YkzaXfQ74cl8HT5ux3wzc3cf6GZK6JXWvWrVqoNfQp75qvUvXbNwShAF6NhWYed0Clq7ZWOlwNekYsx9x3Le3GYMcx32bjjEeg9yWOkc9xWveeipv+N9H8pq3nlpvEM5blp21evuwOaQf+3dK6iZpcjs/In5WvoGkGcAMgAkTJgy8pGatYxZwcURskLavRaaB+qckM6at6+0AEXE5SXMaXV1dDe1WXKnWu2Jdz5YgXNSzqcDK9T3sO3ZEfSfu6ECvPwbGHbBlDLI8BtlaRDP3mn5NRDwjaV/gNkkPRcQ22d+z/EAxy9AzJNOLFr0qXdbbNssk7QjsAqwh+TJ7vKQLSbIBFST1RMSlkoaRBOEfRsR/ZnwNveqr1jv5zGmMG9VJ57CObYJx57AO9hjZ2ZiT1zEG2SxPWX5drOXDpk8R8Uz6ewlwB0lTm1k7uBeYJGkfSTsBJwHlHRLnAqemr48HbovEtIiYGBETgUuAr6VBWMAVwKMRMXtQrqIXlWq9E8cMZ/YJU+kclnzsFGvLE8cMz6OoZk0jyxrxlg8bkgB8ElD+MLxXknYj6Sb+kqTdgcOACzMrqdkgiojNks4g6fixA3BlRCyUdC7QHRFzSYLq1ZIWAWtJ7p9KDgM+BDyU9uoE+OeImJfJRfShUq23o0NMP2A8k8+cxsr1Pewx0tmvzAAUkV2LrqSjSL61Fz9svlr6YSPpIOAGYDegB1geEQdIeiswh6QTVwdwSURcUelcXV1d0d3dndm1mFXRktGk0fdNlj2jrS35PwUZB+LB5EBsOWvJD5Q+75tCAdYuTuYIHtm/yReKOZ9d67Ua+D8Gzd1Zy8zyUCjAYz+HG07fki6S981JZkaqIRh3dIh9x46ovye02RDhvv1mtq21i7cGYUh+33B6stzMGs6B2My2tX55r+ki2bA8n/KYtTkHYjPb1sjx9JpUf8T4fMpj1uYciM1sW6P3S54JlybVf9+cZHnGspoUwqyZubOWmW2royPpmHX6lC3pIvvTa3qg8h76VOztvWJdD+NGube3DR4HYjPbXg7pIiulx8y6B3beXwJsaHPTtJk1hUrpMbOW5cxQZtU4EJtZUyimxyzV0EkhKsjzS4CZA7GZNVahAKufgCfnJ78LtU03nuekEHl+CTDzM2Iza5w6snLlOSlE8UtA+TNizwxlg8GB2Mwap6+sXKdPqanjV17pMT0zlOXJgdjMGqdSVq5B7IE9EM6RbXnxM2Izaxxn5TLrNwdiM2ucHLNymbUqB2KznEiaLulxSYsknd3L+ldI+nG6/m5JE8vWT5C0QdJZtR6zVgNONbklK9d8+PCNye8ap080G6r8jNgsB5J2AC4DjgSWAfdKmhsRj5RsdhrwXETsL+kk4ALgxJL1s4Ff9POYVdWdZSqHrFxmrcxfU83ycTCwKCKWRMRfgGuBY8u2ORa4Kn19PfBOSQKQdBzwJLCwn8esylmmzAaXA7FZPvYGni55vyxd1us2EbEZeB4YI2kE8DngywM4JpJmSOqW1L1q1artCuYsU2aDy4HYrPXMAi6OiA0D2TkiLo+IrojoGjt27HbrnWXKbHA5EJvl4xng1SXvX5Uu63UbSTsCuwBrgEOACyUtBT4D/LOkM2o8ZlV5ppo0G4oy7awlaTrwDWAH4LsRcX7Z+sOBS4A3ASdFxPUl604FvpC+/UpEXIXZQBUKSdan9cuTsa79mV+3nn37di8wSdI+JMHyJOCUsm3mAqcCdwLHA7dFRADTihtImgVsiIhL02Bd7ZhVOcuU2eDKLBDX2IPzKeDDwFll+44GzgG6gADuS/d9LqvyWpOrN5AOMP9xXftWEBGb01rszSRfVK+MiIWSzgW6I2IucAVwtaRFwFqSwNrvYw6kfM4yZTZ4sqwRb+nBCSCp2INzSyCOiKXpuvLpWf4WuCUi1qbrbwGmA9dkWF6rV7VgOdBgWm8wrCf/cZ25kyuJiHnAvLJlXyp53QN8oMoxZlU7ppk1tyyfEdfUg7Oefav1/mxqA5wqrmnPXQyWc6bBVUcnvx/7+dZjV1tfSV/BcO3i2spWKf9xsWx9/T2q7WtmVqeW7qxVrfdn06onKBX3H2ggrffcfakWLGsJpn1dV73BsFL+42p/D+dONrOMZRmI6+nB2ZDen02rnhpevYG02rmrBfmBBstaaqV9XVctwbBSuSvlP67293DuZDPLWJbPiGvpFdqXm4GvSdotff8u4PONL2KdBvrMs5ap4vo6di3PLCu
Vq9K5R+9X+VlspWe1xWBZeuzSYFltfaXrKgbD8vMWg2G1Z8hb8h9PSa5zRMnfpNq/RaV9zcwaILNPkzQTULEH56PAdcVeoZKOAZB0kKRlJB1S5khamO67FjiPJJjfC5xb7LjVb/U24/a1bz0102o1vErHrqdmWe3c9TQvV6s5Vltf6bqqTSRQSwtDMf/xxGlbA2wt/xaV9h2q8uzfYNaGMh1HXEOv0HtJmp172/dK4Mq6ClCtplSp5lht33p601ar4VU6dj01y90nVT73H39XuXZYrfZYqeZYrWZZ7boqTSRQz2T01f4tbFsZDecyG8rae/alas2d9QTaah/+lYJ8taBU6dgTDqscOOppaq23ebnarDuV1tcTEKuVqxI3PfdPhsO5zIaq9g7E1Zpx6wm0lT78a6k1VApKlY5db82y0rmrBcMsa4/1BMR6y+Vp+2pWWL+cjl7ui8L65XT472c2IO0diCsFpXoCLVT+8K+31lAtsGRVs6wWDLOuPQ40ILpWO2g2Dtudkb3cFxuH7c7I/Ipl1tLaOxBXC0oDDbRQX0/cauoJLPUGpXqal/PUrOVqM3/YPJbR02azz/yZW+6LJ6fN5rnNYzkw78KZtaj2DsSVglI9gbb0+L19+NfzzLLasbPe16yC3YZ38tG7x3HGQdewh55nZezCpXcXuOL1nXkXzaxltXcghr6DUj2Bthr3xLU2NXHMcP5p+hRmXreAnk3D6BzW4ykSzerU/oG4kqxqjn5maW3KUySaNd7QDsRZcvOwtSlPkWjWWK6imZmZ5ciB2MzMLEeKiLzL0BCSVgF/rLDJ7sDqQSpOf7hc/dOs5VodEdPzLkR/+b5pOJerf1ryvmm0tgnE1UjqjoiuvMtRzuXqn2YtV7tq1r+3y9U/zVouS7hp2szMLEcOxGZmZjkaSoH48rwL0AeXq3+atVztqln/3i5X/zRruYwh9IzYzMysGQ2lGrGZmVnTcSA2MzPLUdsHYknTJT0uaZGks/MuTylJSyU9JGmBpO4cy3GlpJWSHi5ZNlrSLZKeSH/v1iTlmiXpmfRvtkDSUYNdrqGgWe+bZrln0rL4vrGGaOtALGkH4DLg3cAU4GRJU/It1XbeERFTcx7j9z2gfFD92cCtETEJuDV9P9i+x/blArg4/ZtNjYh5g1ymttcC900z3DPg+8YapK0DMXAwsCgilkTEX4BrgWNzLlPTiYjfAGvLFh8LXJW+vgo4bjDLBH2Wy7Ln+6YGvm+sUdo9EO8NPF3yflm6rFkE8EtJ90makXdhyoyLiGfT18uBcXkWpswZkh5Mm+AGvelvCGjm+6aZ7xnwfWMD0O6BuNn9TUQcSNIE+ElJh+ddoN5EMsatWca5/TuwHzAVeBb411xLY4OtJe4Z8H1jtWv3QPwM8OqS969KlzWFiHgm/b0SuIGkSbBZrJC0J0D6e2XO5QEgIlZExMsRUQC+Q3P9zdpF0943TX7PgO8bG4B2D8T3ApMk7SNpJ+AkYG7OZQJA0nBJI4uvgXcBD1fea1DNBU5NX58K/FeOZdmi+CGXeh/N9TdrF01537TAPQO+b2wAdsy7AFmKiM2SzgBuBnYAroyIhTkXq2gccIMkSP4dfhQRN+VREEnXAG8Hdpe0DDgHOB+4TtJpJNPkndAk5Xq7pKkkTX5LgdMHu1ztronvm6a5Z8D3jTWOU1yamZnlqN2bps3MzJqaA7GZmVmOHIjNzMxy5EBsZmaWIwdiMzOzHDkQW80kvV3SjXmXw6yV+L6xahyIzczMcuRA3IYkfVDSPem8o3Mk7SBpg6SLJS2UdKuksem2UyXdlSaDv6GYDF7S/pJ+Jen3ku6XtF96+BGSrpf0mKQfKs2uYNbqfN9YXhyI24yk1wMnAodFxFTgZeD/AMOB7og4APg1SbYdgO8Dn4uINwEPlSz/IXBZRPwV8FaSRPEAbwY+QzJP7b7AYRlfklnmfN9Ynto6xeUQ9U7gLcC96ZfunUkSzxeAH6fb/AD4T0m7ALtGxK/T5VcBP0nz+e4dETcAREQPQHq8eyJiWfp+ATAR+G3mV2WWLd83lhsH4vYj4KqI+Pw2C6Uvlm030NymL5W8fhn/H7L24PvGcuOm6fZzK3C8pD0AJI2W9BqSf+vj021OAX4bEc8Dz0mali7/EPDriFgPLJN0XHqMV0h65WBehNkg831jufG3sjYTEY9I+gLwS0kdwCbgk8BG4OB03UqS52GQTNX27fQDYwnwkXT5h4A5ks5Nj/GBQbwMs0Hl+8by5NmXhghJGyJiRN7lMGslvm9sMLhp2szMLEeuEZuZmeXINWIzM7McORCbmZnlyIHYzMwsRw7EZmZmOXIgNjMzy9H/B1Ck7YhL5z97AAAAAElFTkSuQmCC", - "text/plain": [ - "
" - ] - }, - "metadata": { - "needs_background": "light" - }, - "output_type": "display_data" - } - ], - "source": [ - "output2, _ = track_model_metrics(model=model2, \n", - " train_interactions=train_interactions2, \n", - " test_interactions=test_interactions2, \n", - " k=K, \n", - " no_epochs=NO_EPOCHS, \n", - " no_threads=NO_THREADS, \n", - " item_features=item_features,\n", - " user_features=user_features)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "These show slightly different behaviour with the two approaches, the reader can then tune the hyperparameters to improve the model fitting process.\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 4.1 Performance comparison" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In addition, the model's performance metrics (based on the test dataset) can be plotted together to facilitate easier comparison as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 36, - "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEWCAYAAABxMXBSAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAA9dElEQVR4nO3deXyU5b3//9d7khBCCARCwhZWAQVkKUTAtSrVorWira1Y63Jqtf219nQ77bHfb8/R2tYez+ne+j099qh1rbZupa3WulStGwoWQUBkhyBrEiAJBJLM5/fHfYPDZAIzJJOZJJ/n4zGPzFz39rnvmcxn7uu67+uSmeGcc84lK5LpAJxzznUunjicc86lxBOHc865lHjicM45lxJPHM4551LiicM551xKPHG4Yybpckl/TWK+X0n6t46IqTOQtEzSmZmO42iSfX9d9yO/j6NrkrQeGAg0A/XAk8D1ZlaXybicg0Ofz8+a2TNtXM/V4XpOa4+44tZtwFgzW93e6+7s/Iyja/uomfUGpgEVwLfjZ5CU2+FRdVN+rF1X4YmjGzCzzQRnHCdC8EtK0hclrQJWhWUXSFosaZekVyRNPri8pGGSHpW0Q1KVpF+G5VdLeil8Lkk/kbRd0h5JSyUd3N5vJH0vZn3XSlotqVrSfElDYqaZpM9LWhXGcpskJdovSTmS/o+kNZJqJS2SNCycdoqkNyTtDv+eErPc85K+F+5nnaQ/SiqRdH8Y+xuSRsbF9M+S1kraKem/JEXCacdJei48LjvDdRTHLLte0r9KWgLUS8oNyz4UTp8haWG43W2Sfhyz7IVhtdauMObxcev9F0lLwn18SFLPVo7TTZLui3k9Mtyn3Jj3cW14DNdJujz+/T3aexO+Fz8Kj8E6SdfHbiMunnuB4cAfw+P/zbB8Vvie7JL0lmKq8xLFGB6PXwEnh+vZ1cr+J9y/cNpnJK2QVCPpKUkjwvIXw1neCtd9aaJ1d1tm5o8u+ADWAx8Knw8DlgHfDV8b8DTQHygAPgBsB2YCOcBV4fL54eu3gJ8AhUBP4LRwPVcDL4XPPwwsAooBAeOBweG03wDfC5+fDewkOAvKB34BvBgTtwF/CtczHNgBzGllH78BLAWOD7c5BSgJ96sGuALIBS4LX5eEyz0PrAaOA/oCy4F3gQ+F898D3BUX09/C9Q4P5/1sOG0McE64L6XAi8BP496HxeF7UJDgvXkVuCJ83huYFT4fR1DFeA6QB3wzjLlHzDpeB4aEca0APt/KcboJuC/m9chwn3LD93QPcHw4bTAwMf79Pdp7A3w+PI7lQD/gmYPbONrnM3w9FKgCzif4QXtO+Lo0lRgTbOdIy84Nj+n48Fh8G3glbn/HZPp/ORsffsbRtT0e/gp7CXgBuCVm2g/MrNrM9gHXAf9jZgvMrNnM7gb2A7OAGQRfTt8ws3ozazCzl2ipESgCTiBoO1thZlsSzHc5cKeZvWlm+4FvEfxiHBkzz3+Y2S4z20jwhT21lf37LPBtM1tpgbfMrAr4CLDKzO41syYz+y3wDvDRmGXvMrM1Zrab4GxsjZk9Y2ZNwO8JkmmsW8PjtRH4KUEywsxWm9nTZrbfzHYAPwY+GLfsz81sU3isEx23MZIGmFmdmb0Wll8K/DlcdyPwQ4Ikf0rMsj83s/fMrBr44xGO09FEgRMlFZjZFjNbdoR5W3tvPgn8zMwqzawG+I8UY/g08ISZPWFmUTN7GlhIkEhSjTFea8t+nuD/YEX4vt8CTD141uFa54mja7vIzIrNbISZfSHui2tTzPMRwNfDKoJdYbIZRpAwhgEbwn+sVpnZc8AvgduA7ZJul9QnwaxDgA0xy9UR/LIcGjPP1pjnewl+iScyDFhztG2ENsRtY1vM830JXsdvM/Z4bQi3gaSBkh6UtFnSHuA+YMARlo13DcHZxTthFdkFifbBzKLheo7lOLXKzOoJktTngS2S/izphCMs0to2h3D4fh5pnxMZAXwi7jN4GsFZa6oxHnKUZUcAP4vZXjXBmevQhCtzh3ji6L5iL6fbBHw/TDIHH73CX+qbgOGJ6qpbrNDs52Y2HZhA8GX4jQSzvUfwDwuApEKC6qXNx7APmwiqm464jdDwY9zGQcPi1vVe+PwWgmM5ycz6EPxyjm+TafXSRTNbZWaXAWXArcDD4TGJP04KYziWfagHesW8HhQXw1Nmdg5BNc47wK+PYRtbCKqpDhrW2owHNxv3ehNwb9xnsNDM/uMoMR71stAjLLsJ+FzcNgvM7JWjrbO788ThIPhH+rykmQoUSvqIpCKCevQtwH+E5T0lnRq/AkknhcvnEXxRNRBUEcT7LfBPkqZKyif44l1gZuuPIe7/Bb4raWwY92RJJcATwDhJn1LQGH0pQTL70zFs46BvSOqnoPH9y8BDYXkRUAfsljSUxMmyVZI+Lak0PKPYFRZHgd8BH5E0OzymXyeoPjyWL7XF
wBmShkvqS1A9eHD7AyXNDZPV/nBfEr1vR/M74MuShiq4OOBfjzL/NmB0zOv7gI9K+nDY0N5T0pmSyo8S4zagXFKPRBs5yrK/Ar4laWI4b19JnzhCjC7kicNhZguBawmqmmoIGgyvDqc1E7QNjAE2ApUEp/7x+hAkoBqCKpYq4L8SbOsZ4N+ARwgS0nHAvGMM/ccEX1h/JWgAvYOgAboKuIDgy7aKoGH5AjPbeYzbAfgDQeP/YuDP4bYAvkPQ0L87LH80xfXOAZZJqgN+Bswzs31mtpLg7OUXBBcTfJTg8uoDqQYethc8BCwJ9yE2gUaArxGc4VQTtM/8f6lug+C9/2u4jX8QJO8mgvuIEvkB8O2wmuhfzGwTQWP1/yFodN9EkIQjR4nxOYILP7ZKSvT+trqsmT1GcJb3YFjN+DZwXsyyNwF3hzF+MpWD0dX5DYDOHYX8RrCUSToP+JWZeUNzF+RnHM65NpNUIOn8sGpwKHAj8Fim43Lp4YnDOdceRFBtV0NQVbUC+PeMRuTSxquqnHPOpcTPOJxzzqWkW3S6NmDAABs5cmSmw3DOuU5l0aJFO82sNL68WySOkSNHsnDhwkyH4ZxznYqk+B4YAK+qcs45lyJPHM4551LiicM551xKukUbRyKNjY1UVlbS0NCQ6VCyUs+ePSkvLycvLy/ToTjnsky3TRyVlZUUFRUxcuRIlHiAuW7LzKiqqqKyspJRo0ZlOhznXJbptomjoaHBk0YrJFFSUsKOHTsyHYpznZLtqYLtG7D9+1D/wVA2HOV07NetNdRD437o1afdt91tEwfgSeMI/Ng4d2xsTxXRP/4/2LY+eC0RmfslGD2lY7ZvUdj4DtEXHoLdO9D4WVAxBxWXtds2vHHcOefa0/YNh5IGAGZE//Ygtq+2Y7a/o5LoYz+FnZXQuB9b8gLR1/6INR1xEM+UpDVxSJojaaWk1ZJuSDD9DElvSmqSdEnctKskrQofV8WUT5e0NFznz9XJfhrfdNNN/PCHP2x1+uOPP87y5cs7MCLnXHuy/QmGlq+rhsaUh1I5tu1XvQfRuGFQVrwKdTXtto20JQ5JOQTjT59HMPraZZImxM22kWDAoAfilu1P0C3zTGAGcKOkfuHk/yYYdGhs+JiTpl3ICE8cznVu6j8Y4n7PauKpUNi3Y7afX9CysFcfyG2/KyTTecYxA1htZmvDUcseJBjh6xAzW29mS2g5VOWHgafNrNrMaoCngTmSBgN9zOw1C7r1vQe4KI370C6+//3vM27cOE477TRWrlwJwK9//WtOOukkpkyZwsc//nH27t3LK6+8wvz58/nGN77B1KlTWbNmTcL5nHNZrGx40KbRtwxyctHkD6KK8zqucbx0OAw+7rAinXkZ6l3cbptI554MJRj+8aBKgjOIY112aPioTFDegqTrgOsAhg8fnuRm29+iRYt48MEHWbx4MU1NTUybNo3p06fzsY99jGuvvRaAb3/729xxxx186Utf4sILL+SCCy7gkkuCmrvi4uKE8znnspNycmH0FCKDRwfVU4V9O/SKKhX1I3LB52DrBqyhDpUMgdL2HYixy15VZWa3A7cDVFRUZGzQkb///e9cfPHF9OrVC4ALL7wQgLfffptvf/vb7Nq1i7q6Oj784Q8nXD7Z+Zxz2UUFRZCg1qhDtl1UAkUlpKsBOJ1VVZuBYTGvy8Oytiy7OXx+LOvMKldffTW//OUvWbp0KTfeeGOrd7AnO59zznWUdCaON4CxkkZJ6gHMA+YnuexTwLmS+oWN4ucCT5nZFmCPpFnh1VRXAn9IR/Dt5YwzzuDxxx9n37591NbW8sc//hGA2tpaBg8eTGNjI/fff/+h+YuKiqitff+yvdbmc84dmR3Yj+2ry3QYXVLaqqrMrEnS9QRJIAe408yWSboZWGhm8yWdRDCgfT/go5K+Y2YTzaxa0ncJkg/AzWZWHT7/AvAbgpPAJ8NH1po2bRqXXnopU6ZMoaysjJNOOgmA7373u8ycOZPS0lJmzpx5KFnMmzePa6+9lp///Oc8/PDDrc7nnEvMolHY/C7RV/8AtTVo6lno+Bmod7+jL+yS0i3GHK+oqLD4gZxWrFjB+PHjMxRR5+DHyHVGtnUd0Qd/cNi9DDr1YiIzL8hgVJ2TpEVmVhFf7neOO+e6FNu+qcUNcPbm01jdrswE1AV54nDOdS15PVqW5RdCB3cy2JV54nDOdSkaNAri2jN0+sdRQe8MRdT1eAp2zrU727UN27oeDjSgsuFBt+KRnA7ZtvoNJHLJ17HNq6B+NyofBwN9XJn25InDOdeurGYr0Ud+DHuqgteKEPnYV2FEfFd16aP+g4M+o1xaeFWVc65d2eY1h5JGUBAl+vKjiXuNdZ2SJ44uYuTIkezcuTPpeT7zmc9QVlbGiSee2BHhuU7G9tZim1YSXbcU253iSJANCW66q9sFzY3tEpvLPE8c3dTVV1/NX/7yl0yH4bKQ1VYTffLXRH//n9hjPyX6wPexbRuSXl6DW7YnaMqZqFef9gzTZZAnjiQt2L6Ob73+OJ/7+wN86/XHWbB9XZvXuX79ek444QSuvvpqxo0bx+WXX84zzzzDqaeeytixY3n99deprq7moosuYvLkycyaNYslS5YAUFVVxbnnnsvEiRP57Gc/S+yNnPfddx8zZsxg6tSpfO5zn6O5ubnFts844wz69+/f5n1wXY+9txo2LHu/YF8t0TeewJqSPGMYOCroVrzfICjojU6eiyackp5gXUZ44kjCgu3ruG/V61TvD8bCqN6/l/tWvd4uyWP16tV8/etf55133uGdd97hgQce4KWXXuKHP/wht9xyCzfeeCMf+MAHWLJkCbfccgtXXnklAN/5znc47bTTWLZsGRdffDEbN24Egru9H3roIV5++WUWL15MTk6O93HlUlO9tWXZe2vgQHJtFMrNQ8dNJTLvW0Su+A6a9VFU5D9SuhK/qioJj69/iwNxd6IeiDbz+Pq3mFnWtsv8Ro0axaRJkwCYOHEis2fPRhKTJk1i/fr1bNiwgUceeQSAs88+m6qqKvbs2cOLL77Io48+CsBHPvIR+vULrlt/9tlnWbRo0aE+sfbt20dZWfsNUu+6Pg0cQXxHRBo7HXqmdh+E3zfRdXniSMLBM41ky1ORn59/6HkkEjn0OhKJ0NTURF5easM9mhlXXXUVP/jBD9ocm+umBh+HZl6AvfFk0HVH+Qloylko4hUULuCfhCT0z++VUnl7Ov300w9VNT3//PMMGDCAPn36cMYZZ/DAA8FQ7U8++SQ1NcFA9LNnz+bhhx9m+/btAFRXV7NhQ/INm86poDea9VEiV9yEPn0jkbnXo/6DMh2WyyKeOJJw0cgp9Ii767VHJIeLRk5J+7ZvuukmFi1axOTJk7nhhhu4++67Abjxxht58cUXmThxIo8++uih4XEnTJjA9773Pc4991wmT57MOeecw5YtW1qs97LLLuPkk09m5cqVlJeXc8cdd7R77NbYgB3Y3+7r7SyscX+nvXdBObmoZAiRsuEoP0PD2Lms5d2qJ2nB9nU8vv4tqvf
vpX9+Ly4aOaXN7RvZ7li7VbcDDbBxOdHXnwAzdNL5aMRElN8zDVFmH2tqhMqVRBf8CfbvQxVz0OjJqGdhpkNznYiZQTSKcjqmq5ZEWutW3ds4kjSzbFSXTxTtZvMqovNvO/TS/vT/0Nx/huPSf4aWFbauI/roTw69tL/8L5x/LTphVgaDcp2JbVuPLXkB27kZTTodjZqMCvtmOqxD0lpVJWmOpJWSVku6IcH0fEkPhdMXSBoZlveQdJekpZLeknRmzDLPh+tcHD78kqEsE132SsuyJS9kIJLMsLVvtSxb9DTW2H2r7VzyrOo9or//Ibb0RdiyBvvrb7ClL5JNtUNpSxyScoDbgPOACcBlkuJ7ObsGqDGzMcBPgFvD8msBzGwScA7wI0mxsV5uZlPDx/Z07YM7NipIUCXTnS7NTHTRRM9C6KDeYV3nZjs2tbhnxt54EmqrW1mi46XzjGMGsNrM1prZAeBBYG7cPHOBu8PnDwOzJYkg0TwHECaGXUCLejaXnTTh5MMHzYnkEJl8RuYC6mAadSLkxbTnSEROOg/5QEIuCQkve47kgLLnWqZ0fpKHAptiXlcCM1ubx8yaJO0GSoC3gAsl/RYYBkwP/74eLneXpGbgEeB7luAcTtJ1wHXAoSuOXAcZNJrIpTdgG1cEjXsjJsCgkZmOqsOobASRS/8V27Acmvaj4RNgkLePuSSVDodefWDvnkNFOuUiVNTvCAt1rGz9CXQnMB5YCGwAXgEO3rp9uZltllREkDiuAO6JX4GZ3Q7cDsFVVR0RtAtIgkGjgpHYuimVDQ8GMHIuRYcGolq9GKveQmTcdCg/PtNhHSad5z6bCc4SDioPyxLOIykX6AtUmVmTmX01bMOYCxQD7wKY2ebwby3wAEGVWLeXSrfqmzZt4qyzzmLChAlMnDiRn/3sZx0UZcex2hps/dtE1y7Bdh/5uDiXbTSgnMisC8g5/1o0ZlrWXcqdzjOON4CxkkYRJIh5wKfi5pkPXAW8ClwCPGdmJqkXwT0m9ZLOAZrMbHmYXIrNbKekPOAC4Jk07kOXlJuby49+9COmTZtGbW0t06dP55xzzmHChI4boS2drHor0T/eBlXvBa8Li4l87KuotDzDkblkWUM9VG3Bmvaj4oGo74BMh+RipO2Mw8yagOuBp4AVwO/MbJmkmyVdGM52B1AiaTXwNeDgJbtlwJuSVgD/SlAdBZAPPCVpCbCYICH9Ol37ECu64lWaf/1Nmn98Dc2//ibRFa+2eZ2Z6lZ98ODBTJs2DYCioiLGjx/P5s3xJ4Odl61bcihpAFC/C3v771l1OaNrndXvIvrsfUQf+gH2yI+J/ja18UBc+qW1md7MnjCzcWZ2nJl9Pyz7dzObHz5vMLNPmNkYM5thZmvD8vVmdryZjTezD5nZhrC83symm9lkM5toZl82s5aDTbSz6IpXsafvgdpwOMzaKuzpe9oleWS6W/X169fzj3/8g5kz469b6LwSfcnYljVBh30u+21ZBytff//13j1EX3kcazyQuZjcYbK1cTyr2EuPQVPch7bpQFA+/uQ2rTuT3arX1dXx8Y9/nJ/+9Kf06dN1RmfTqEnYO68dXnb8DL8ctpNIOFTtljWwfy/k9ej4gFwL/p+UjINnGsmWpyBT3ao3Njby8Y9/nMsvv5yPfexjqQeexTR8PFTMwd58GiwKE04JxpNwnYL6D2kxHgijJqc8HohLn+y5oySbFZWkVt6O0tGtuplxzTXXMH78eL72ta+lfR86mgr7olMvJnLld4hceTOR2VegPul/r1w7GTQSzfzo+ze8lY0gMuN8lOu/c7OFvxNJ0GkXB20csdVVuT3QaRenfds33XQTn/nMZ5g8eTK9evU6rFv1yy67jIkTJ3LKKack7FY9Go2Sl5fHbbfdxogRIw6t8+WXX+bee+9l0qRJTJ06FYBbbrmF888/P+3701GUkwv9B2c6jGNme6pgxyasuQkNGIpS3BerrQ6Wb2pEJUNRSec5FiroDbMuQMefhDUdQH1LMzKaoDU2QGMj6lXU4dvOdt6tepKiK14N2jRqq6CoBJ12MZE2tm9ku2PtVt21jdVsI/qHX0B1OI5Kj55ELvmXpG+otF3bif7hl1AVXimX15PIJV9Hg0enKeKuxSwKle8SfeVx2FONJn8QTTi5W46b7t2qt1Fk/Mltbgh3Lhm26Z33kwbAgQaiC58iMuezSVXXWOW77ycNgMYGoq//mchHPo9yU2sz65a2byT6yI8PXYVnLz8a1DacclHQK4LzNg7nss6ubS3LdmyC5iQvR92doMPonZvBu3VPiu2obHHptv3jWajflZmAslC3ThzdoZruWPmxyRwl6JdIE09FSY5xr6FjW5ZNOCUj7QSdUl5+y7KehRDxCpqDum3i6NmzJ1VVVf4FmYCZUVVVRc+e3WOo16wzZAz64Lyga3ZF0OQz0Qkp3KA5+Dh01qegR7j8pDPQhFPSF28Xo7IRENfFic74hDeSx+i2jeONjY1UVlbS0NCQoaiyW8+ePSkvL0/5PpL2ZHvC+2SK+qEOHovAqt7DtqyFaBMaOArKhndo/baZwZ6qoMqkT8kx3bxou3cGyxeV+KWsKbKabdjmVbB3DxpyHAwchbrhzYfeOB4nLy+PUaO6b7ff2cz21WHLXsZenQ8WRSedB5M+iHp3zJjLtmMT0d//FzTUB69zcol84hswZEyHbB/Crunb2LGfdwx47NRvIOo3MNNhZK1uW1XV1dmu7URX/wNbs7jzdSu+6R3sxd9BY0PQtcurf8DWL+mwzdvaJYeSBgDNTUQX/RXzvq6cA7rxGUdXZjsqiT7yI9i7J+i6oah/0K14yZBMh5aU6LsLW5TZ2y9jE05NPKxme6uraVm2pxqiUR833Dn8jKNLsuUvHzbsJLXV2Ko3MxdQijSgZYJT6bCOSRqAjpvasmzqWX4PhHMhTxxdjFkU27q+Zfn2zjOegcZMC8ZcPii/FzrxtI4LYMgYdP51UFwGvfuhsz6FRk/uuO07l+W8qqqLkSJo/Cxs87uHl3ei3mE1oJzIpd+CHRsBgwHlKffV1Kbt9+iJTpiJjZgYNM736jpdzjvXHtJ6xiFpjqSVklZLuiHB9HxJD4XTF0gaGZb3kHSXpKWS3pJ0Zswy08Py1ZJ+Lu8DoAWNnoymnRvUx+fkopkXoOGda1hY9StD4yrQuJM6NGkcFkNBb08aziWQtjMOSTnAbcA5QCXwhqT5ZrY8ZrZrgBozGyNpHnArcClwLYCZTZJUBjwp6SQziwL/HU5fADwBzAGeTNd+dEbq3Q9O/ziachYI6DOgw9oHnHNdXzq/TWYAq81srZkdAB4E5sbNMxe4O3z+MDA7PIOYADwHYGbbgV1AhaTBQB8ze82COxfvAS5K4z50WsrJDX61F5d50nDOtat0fqMMBTbFvK4MyxLOY2ZNwG6gBHgLuFBSrqRRwHRgWDh/5VHWCYCk6yQtlLRwx44EQ1E655w7Jtn6U/ROgqSwEPgp8AqQ0t1XZna7mVWYWUVpaWn7R+icc91UOq+q2kxwlnBQeViWaJ5KSb
lAX6AqrIb66sGZJL0CvAvUhOs50jq7BNu7B6veGlQz9RvkPZs657JGOhPHG8DYsKppMzAP+FTcPPOBq4BXgUuA58zMJPUi6ICxXtI5QNPBRnVJeyTNImgcvxL4RTqCt4Z6bNd2FMmB4oGoR4KultPEqrcSffJ22LYhuPN7+AQi51yJ+vqZk3Mu89KWOMysSdL1wFNADnCnmS2TdDOw0MzmA3cA90paDVQTJBeAMuApSVGCpHNFzKq/APwGKCC4mqrdr6iymm1E//ob2Pxu8MU98VQip14cXK3UAWzlAtgWc8PexuXY+reDq6Sccy7D0noDoJk9QXDJbGzZv8c8bwA+kWC59UDL0WyCaQuBE9s10MPXjy1/BWJvoFv2MjZ8Aho/K12bfX/7zU3YuqUtyyvfBU8czrkskK2N45nTeABbs7hledyd2OminFw0emrL8uHjO2T7zjl3NJ444uXlJf6SHjS6w0LQCTMgdvjQMdPQiIkdtn3nnDsS76sqjhSBSWcE1UU1W4PCERM79Be/isuIfPSL2K5twVVVxQNRfkGHbd85547EE0cCKhlC5BPfCC6HzcmB/oM7/HJYFRSigo47y3HOuWR54miFehej3sXHvLzt3YPVbAsu5+03EPUsbL/gnHMugzxxpEGL+zBGTSEy+3LUpyTToTnnXJt543ga2PJXDr8PY91b2MblrS/gnHOdiCeOdmaNB7D1Ce7D2LwqA9E451z788TRzpTXA42e0rJ86LgMROOcc+3PE0caaPwsGBxzRdSYaWiE38DnnOsavHE8DdRvEJGLvozVbH3/qqr8XpkOyznn2oUnjjRRQW9UMCbTYbRJ0Ls9+LDuzrlYnjhcC9bYCO+9S3Txc4CITD0bho5FuXmZDs05lwU8cbiW3ltF9JEfH3oZXfMPIpf8C3hHi845vHHcJRBd+kLLsrf/noFInHPZyBOHaymS4EQ0UZlzrltKa+KQNEfSSkmrJd2QYHq+pIfC6QskjQzL8yTdLWmppBWSvhWzzPqwfLGkhemMv7uKTD4DiGkQl4iceFrG4nHOZZe0/YyUlAPcBpwDVAJvSJp/cOzw0DVAjZmNkTQPuBW4lGBUwHwzmxSOP75c0m/DkQEBzjKznemKvdsbfByRT34TW/EqSMF9KR04HolzLruls/5hBrDazNYCSHoQmAvEJo65wE3h84eBXyq49tOAQkm5BGOLHwD2pDFWF0M5uVA+DpX73e7OuZbSWVU1FNgU87oyLEs4j5k1AbuBEoIkUg9sATYCPzSz6nAZA/4qaZGk61rbuKTrJC2UtHDHjh3tsT/OOefI3sbxGUAzMAQYBXxd0sG6ktPMbBpwHvBFSWckWoGZ3W5mFWZWUVpa2iFBO+dcd5DOxLEZGBbzujwsSzhPWC3VF6gCPgX8xcwazWw78DJQAWBmm8O/24HHCJKMc865DpLOxPEGMFbSKEk9gHnA/Lh55gNXhc8vAZ6zoJ+LjcDZAJIKgVnAO5IKJRXFlJ8LvJ3GfXDOORcnbY3jZtYk6XrgKSAHuNPMlkm6GVhoZvOBO4B7Ja0GqgmSCwRXY90laRnBdaF3mdmSsLrqsbDvpFzgATP7S7r2wTnnXEs62JFdV1ZRUWELF/otH845lwpJi8ysIr48WxvHnXPOZSlPHM4551LiicM551xKPHE455xLiScO55xzKTlq4pA0UNIdkp4MX0+QdE36Q3POOZeNkjnj+A3BvRhDwtfvAl9JUzzOOeeyXDKJY4CZ/Q6IwqHOCJvTGpVzzrmslUziqJdUQtArLZJmEfRi65xzrhtKpsuRrxH0KXWcpJeBUoJ+pZxzznVDR00cZvampA8CxxP0G7XSzBrTHplzzrmsdNTEIenKuKJpkjCze9IUk3POuSyWTFXVSTHPewKzgTcBTxzOOdcNJVNV9aXY15KKgQfTFZBzzrnsdix3jtcTDOfqnHOuG0qmjeOPhJfiEiSaCcDv0hmUc8657JVMG8cPY543ARvMrDJN8TjnnMtyR62qMrMXYh4vp5I0JM2RtFLSakk3JJieL+mhcPoCSSPD8jxJd0taKmmFpG8lu07nnHPp1WrikFQraU+CR62kPUdbsaQcgrHDzyOo3rpM0oS42a4BasxsDPAT4Naw/BNAvplNAqYDn5M0Msl1OuecS6NWE4eZFZlZnwSPIjPrk8S6ZwCrzWytmR0guBJrbtw8c4G7w+cPA7MliaBNpVBSLlAAHAD2JLlO55xzaZT0VVWSyiQNP/hIYpGhwKaY15VhWcJ5ws4TdwMlBEmkHtgCbAR+aGbVSa7zYLzXSVooaeGOHTuSCNc551wykhmP40JJq4B1wAvAeuDJNMc1g6AH3iEEl/5+XdLoVFZgZrebWYWZVZSWlqYjRuec65aSOeP4LjALeNfMRhHcOf5aEsttBobFvC4PyxLOE1ZL9QWqgE8BfzGzRjPbDrwMVCS5Tuecc2mUTOJoNLMqICIpYmZ/I/gSP5o3gLGSRknqAcwj6GU31nzgqvD5JcBzZmYE1VNnA0gqJEhc7yS5Tuecc2mUzH0cuyT1Bv4O3C9pO0H7wxGZWZOk6wlGD8wB7jSzZZJuBhaa2XzgDuBeSauBaoJEAMGVU3dJWkbQI+9dZrYEINE6U9hf55xzbaTgB/4RZpD+L8HwsVuBTxNUJ90fnoV0ChUVFbZw4cJMh+Gcc52KpEVm1qKGKZmqqlzgr8DzQBHwUGdKGs4559pXMneOf8fMJgJfBAYDL0h6Ju2ROeecy0qp9I67naC6qgooS084zjnnsl0y93F8QdLzwLMEN+dda2aT0x2Yc8657JTMVVXDgK+Y2eI0x+Kcc64TSGYEwG8dbR7nnHPdx7GMAOicc64b88ThnHMuJZ44nHPOpcQTh3POuZR44nDOOZcSTxzOOedS4onDOedcSjxxOOecS4knDueccynxxOGccy4laU0ckuZIWilptaQbEkzPl/RQOH2BpJFh+eWSFsc8opKmhtOeD9d5cJr31Ouccx0obYlDUg7BELDnAROAyyRNiJvtGqDGzMYAPwFuBTCz+81sqplNBa4A1sV1snj5welmtj1d++Ccc66ldJ5xzABWm9laMzsAPAjMjZtnLnB3+PxhYLYkxc1zWbisc865LJDOxDEU2BTzujIsSziPmTUBuwnG/Ih1KfDbuLK7wmqqf0uQaACQdJ2khZIW7tix41j3wTnnXJysbhyXNBPYa2ZvxxRfbmaTgNPDxxWJljWz282swswqSktLOyBa55zrHtKZODYTDAJ1UHlYlnAeSblAX4KhaQ+aR9zZhpltDv/WAg8QVIk555zrIOlMHG8AYyWNktSDIAnMj5tnPnBV+PwS4DkzMwBJEeCTxLRvSMqVNCB8ngdcALyNc865DpPM0LHHxMyaJF0PPAXkAHea2TJJNwMLzWw+cAdwr6TVQDVBcjnoDGCTma2NKcsHngqTRg7wDPDrdO2Dc865lhT+wO/SKioqbOHChZkOwznnOhVJi8ysIr48qxvHnXPOZR9PHM4551LiicM551xK0tY47pxz7tjs3r+XjfU17G1qZ
FCvPpQXFpOj7Pmd74nDOeeyyK79e7nr3Vd5Z9c2ACKIL078ICf2H5LhyN6XPSnMOeccG+uqDyUNgCjGg2veoLaxIYNRHc4Th3POZZH6pgMtynY27GV/c1MGoknME4dzzmWRQb36EN9za0XpcPrm9cxIPIl44nDOuXZmZuzYV0tlXQ37GlueQRzJsMJ+fG786RT3KEBAxYBhfHTEJPJyUmuS3rmvjsq6GvYmOINpK28cd865drS/qYkFO9bx8Lp/sL+5iTFFA/j0uJkM7tU3qeVzIzl8YMAwRvcZwIHmJop7FKSUNBqbm3hj50YeWrOQhuYmRvYu4cpxMxlaWHyMe9SSn3E451w72lhfzf2r3zjUJrG6did/WP8WjSm2UfTtUUBpQVHKZxqb6ndx97uv0RBub31dFY+s/Qf7mxtTWs+R+BmHcy7r1OyvZ1PdLg5EmxjUqw9DexXTyphtWWf7vtoWZW9Vb2ZPYwMlOb0zsv1lu7aw+0ADZQV57bINTxzOuaxS1VDH7SteZn1dMDRPriJ8ZdLZjO1bluHIktMnQSP20F7FFOT26Jjt92i5/YE9iyjIaZ+kAV5V5ZzLMutqqw4lDYAmi/LYusU0NCVf1bK38QBr9+xgec0WqhrqUo5hX1Ow/LKaLezcl9ryw4v6M61k+KHXPSI5XHrcdHp1UOIYVtiPWWWjDr3OVYRPjTmJogQJ5Vj5GYdzLqvsPrCvRdm2hloampvomXv0X817Duzj4XWLWbB9HQC98/L554lnMqKoJKnt7znQwGPrFvPK9mAooMLcHnzpxDMZVTQgqeX79ijg8rEnceaQsexrPsDAgj5JN4y3h6IePfnk6GmcOnA0e5sPUNazqN2372cczrmsUl7Yr0XZrLKR9OmRn9Ty62urDyUNgLrG/Ty2/q2kG4c31lUdShoQ3JD3aIpnPL3z8jm+eCBTS4Z1aNI4qDAvn3Hh9ocUtn/7UFoTh6Q5klZKWi3phgTT8yU9FE5fIGlkWH65pMUxj6ikqeG06ZKWhsv8XJ2lxcw5l5SRvUu4atwsCnN7IMSsslGcOXgckSQ7+aveX9+ibF3tTvYm+cVfs7/lGc+62qq03A/RWaWtqkpSDnAbcA5QCbwhab6ZLY+Z7RqgxszGSJoH3Apcamb3A/eH65kEPG5mi8Nl/hu4FlgAPAHMAZ5M13445zpWfm4upwwczfjigTRGo/TP70VuJCfp5csKilqUTew3hN5JtjGUFrS88mlC8SB6Z9Gd25mWzjOOGcBqM1trZgeAB4G5cfPMBe4Onz8MzE5wBnFZuCySBgN9zOw1C8a8vQe4KE3xO+cyqF9+IWUFRSklDYARvftzwfBJRMKvkvLCYi4YfmLS90MML+zP3BFTDi0/pFdf5o6cQo+c1OLoytLZOD4U2BTzuhKY2do8ZtYkaTdQAuyMmedS3k84Q8P1xK5zaKKNS7oOuA5g+PDhiWZxLmtVNdRTWV9DUzTK0MK+DMpAPXlnVZiXz3nDJjB9wHAORJsYkN+b3km2jwD0yuvBh8vHM7WknP3RJkpTXL47yOqrqiTNBPaa2dupLmtmtwO3A1RUVFh7x+ZcumzfW8tty19g6749APTMyeWrk2YzMsmrglzQbceQwmNPtjmRSJuW7+rSWVW1GRgW87o8LEs4j6RcoC9QFTN9HvDbuPnLj7JO5zq1lbu3HkoaAA3NTTxduZymaHMGo0rN3sYDrNmzgxU1W6luaNlY7Tq3dJ5xvAGMlTSK4Mt9HvCpuHnmA1cBrwKXAM+FbRdIigCfBE4/OLOZbZG0R9IsgsbxK4FfpHEfnOtw2xJ0GbGpfjcHmptTru/PhF379/H7tW+ycOcGAIp7FHD9xDMZ1rvlZbauc0rbGYeZNQHXA08BK4DfmdkySTdLujCc7Q6gRNJq4GtA7CW7ZwCbzGwth/sC8L/AamANfkWV62KOLx7YouzkslH0yuuYO4/bal3tzkNJA2DXgX08sentlDv5c9krrW0cZvYEwSWzsWX/HvO8AfhEK8s+D8xKUL4QOLFdA3Uui4wuKuWTo6fxhw1LaGxu5rRBxzGjbGRK66is28XK3VvZ19TICcUDGVFUQl4Hna1si6lmO2j17h3sa2pMuadXl538XXQuTWr21xM1o19+r6RvXgMozOvB2UOOZ0pJOVGzlO9j2Fxfw4+WPnPohrc/bVzKlyaeycT+Q1KKf9f+vTRZlH49epETST7+Ib2KW5Sd2H9IpzljckfnicO5dra38QCv71jPHza8xYHmZs4ecjxnDz2efvm9kl6HJAb0PLYuuFfu2n7YXdIG/Hnj24zpW0Z+Er/49zc1smjnRh5et5iG5kbOGDSGc8rHU9KzMKntj+pTwjlDx/PM5ncwjGGF/TinfHynaJ9xyfHE4Vw7W1O7g9+uWXjo9V83r6A4v4DZQ0/okO03JOiTqb7pAM0WTWr5dXVV3L1qwaHXf9vyLr3z8rlgxKSkli/K68lFIydzctlIDkSbKS0oonee3wfRlXgnh861s2XVW1qUvbx1bUqd5LXFuL4Die9+4Zzy8Ul36712z84WZS9vW0PtgYakY8iN5DC0dz9G9RngSaML8jMO59pZWYK+joYU9umwxumRRf355xPP4omNb1PXdIAPDT2BKSUJO1hIqDhBlVpZQZ+kqrlc9+CfBOfa2YR+g+mfX3iol9b8nFw+NHR8Sg3MbZEbyWFCv8GM6VNKczRKQYqN0mP7lDKooM+hmxDzIjl8dPgkenjicCGF99t1aRUVFbZw4cKjz+hcO9mxr5bK+l1hX1PFna77iqqGOjbV1XAg2syQwr4Jx8hwXZ+kRWZWEV/uPyGcS4PSgiJKE3Tv3VmU9OxNyTFe1eW6Pm8cd845lxI/43AugR37atlUX0Nz1Bha2JchhcWZDsm5rOGJw7k4W/bu5mdLn6PmQDCEqHdr7tzhvKrKuThLq987lDQg6Nb8b++9SzTJG+ic6+r8jMN1SfWN+9m6bw9RMwYWFNGnR0HSy25P0Enf5vAKqR45/lvLOU8crsvZ2VDHfateZ8WurQAML+zHNSecyqBefZJaflL/ofx965rDyk4fdJzfx+BcyH8+uS5nec2WQ0kDYGN9DQu2r0t6+TF9ypg3ejoFOXnkRXI4f9hEppaUH31B57oJ/wnlupxVu7e3KFtWs4WPDD8xqR5aC/N6cNbQ45k6oDzsFr2QiOJ7f3Ku+0rrGYekOZJWSlot6YYE0/MlPRROXyBpZMy0yZJelbRM0lJJPcPy58N1Lg4fZencB5cZW/buZsG2dby6bS2V9TUpLXtC8aAWZZP7D025W+9++YWU9OztScO5OGk745CUA9wGnANUAm9Imm9my2NmuwaoMbMxkuYBtwKXSsoF7gOuMLO3JJUAsV2LXh6OBOiy1PZ9e6is34WZUV7Yj4FJti8AbKqr4ccxAxHlR3L56uTZjEryctjxxYOYNmAYb+7cBMCYPqUpj6DnnGtdOquqZgCrD44ZLulBYC4QmzjmAjeFzx8GfilJwLnAEjN7C8DMqtIYZ5e0s6GO9+p3I4mhvfrSP8lB
eNrD5rpd/OTt56htDLrhLszN56uTzmZY7+T6O1q0c+NhAxHtjzbx0pZVSSeO/j0LuXLsLOaUTyBqRllBEYXetbdz7SadiWMosCnmdSUws7V5zKxJ0m6gBBgHmKSngFLgQTP7z5jl7pLUDDwCfM8S9NQo6TrgOoDhw4e3zx51Eu/V7+Lnb//t0L0Ipfm9+eKJH2Rwr47paG/hzg2HkgZAfdN+Fmxfl3Ti2LmvrkXZtn11RC2a9BCsBbl5jPAb9pxLi2y9qioXOA24PPx7saTZ4bTLzWwScHr4uCLRCszsdjOrMLOK0tLSjog5a7y2fd1hN7Dt2F/H4qrKDtt+Zf2uFmWb6pJvpzipbESLstMHH5fSuN3OufRJ53/iZmBYzOvysCzhPGG7Rl+giuDs5EUz22lme4EngGkAZrY5/FsLPEBQJeZCUTNWJxjBbX1tx9X2nVTa8ot/1sBRSS8/tk8ZV4yZQXGPAoryenLp6OlM7DekPUN0zrVBOquq3gDGShpFkCDmAZ+Km2c+cBXwKnAJ8JyZHayi+qakXsAB4IPAT8LkUmxmOyXlARcAz6RxHzqdiMTM0pGs2bPjsPIPDBjWyhItRc1YX1vF2zXvIeDEfkMYUVSS9NVF44sHcdGIKTy5aRlRjDnlE1L64u+V14PTBo9hcv+hmKBvCnd9O+fSL22JI2yzuB54CsgB7jSzZZJuBhaa2XzgDuBeSauBaoLkgpnVSPoxQfIx4Akz+7OkQuCpMGnkECSNX6drHzqrySVD2bx3F3/fshoEs4ecwPi+LS9Rbc3aPTv50dJniIZNR09uWs7XJ8/muD7JVfkV9ejJnGETwiuZjP75hegYLmntk+8Jw7ls5CMAdlFN0WZ2NtQjYEDP3ikNW/qbla/x6va1h5WdPmgMnx7rtYLOdSc+AmA3kxvJSbpvpnj7mva3KKtPUOac6578MhXXwmmDx7YsG3hcBiJxzmUjP+NwLYzrU8YXJpzBU5uWg2BO+QTG9PGeXZxzAU8croX83FymlJQzvngQIHrkpNbHk3Oua/PE4Vrl40845xLxNg7nnHMp8cThnHMuJV4XkaXeq9/FpvoaIohhvfsf86W1zjnX3jxxZKENtVX8aOmz7G9uAqB3Xj5fO3E2Q3sXZzYw55zDE0erqhrqeG/vbnIUYWivvvTN75XyOg40NxMRKY08Z2a8uGXVoaQBUNe4n8XVlZ44nHNZwRNHApV1Nfzs7b+xJxxTYnhhP64bfxqlBUVJLb+v6QDLa7by9OZ3yI/kcG75BMYVl5GXRAKJmrFl754W5dv2tSxzzrlM8MbxOFGL8vyWVYeSBsDG+hpW7Nqa9Dre2bWV2995iXW1O3ln9zZ+sexvrKtt2dV5IjmRCKcNHtOifPqA7jUYlXMue3niiNMYbWZtgvEskh2IqCnazLObVx5WZsCbOzYlXiCBE/sN4eKRU+iZk0dhbj6XHVfhd24757KGV1XFyc/Jo6J0OJs37Dqs/ITi5LolF6IgN69Fec8EZa3p06Mnc4ZNZGbZSIQoPob2FeecSxc/40hgRtlIKsKBjyISHy4fz9i+yf3iz4lE+NDQ8Yj3x5/Ii+QwtaQ85Tj65Rd60nDOZR0fj6MVB5qb2NFQR0SirGdRSuNZNEejrKutYkl1JT0iOZzYfygjevc/psGMnHMuUzIyHoekOcDPCEbr+18z+4+46fnAPcB0grHGLzWz9eG0ycD/AH2AKHCSmTVImg78BiggGIv8y5aG7NcjJ5ehhcXHtGxOJMKYvqWM6ZvciHnOOdeZpK2qSlIOcBtwHjABuEzShLjZrgFqzGwM8BPg1nDZXOA+4PNmNhE4E2gMl/lv4FpgbPiYk659cM4511I62zhmAKvNbK2ZHQAeBObGzTMXuDt8/jAwW0F9zrnAEjN7C8DMqsysWdJgoI+ZvRaeZdwDXJTGfXDOORcnnYljKBB7DWplWJZwHjNrAnYDJcA4wCQ9JelNSd+Mmb/yKOsEQNJ1khZKWrhjx44274xzzrlAtl6OmwucBpwE7AWelbSIILEkxcxuB26HoHE8HUE651x3lM4zjs3AsJjX5WFZwnnCdo2+BI3klcCLZrbTzPYSNIJPC+ePva410Tqdc86lUToTxxvAWEmjJPUA5gHz4+aZD1wVPr8EeC5su3gKmCSpV5hQPggsN7MtwB5Js8K2kCuBP6RxH5xzzsVJ630cks4HfkpwOe6dZvZ9STcDC81svqSewL3AB4BqYJ6ZrQ2X/TTwLYIeO54ws2+G5RW8fznuk8CXjnY5rqQdwIZj3I0BQHIdTWWGx9c2Hl/beHxtk+3xjTCzFvcVdIsbANtC0sJEN8BkC4+vbTy+tvH42ibb42uNdzninHMuJZ44nHPOpcQTx9HdnukAjsLjaxuPr208vrbJ9vgS8jYO55xzKfEzDueccynxxOGccy4lnjhCkuZIWilptaQbEkzPl/RQOH2BpJEdGNswSX+TtFzSMklfTjDPmZJ2S1ocPv69o+ILt79e0tJw2y0GP1Hg5+HxWyJpWgfGdnzMcVksaY+kr8TN06HHT9KdkrZLejumrL+kpyWtCv/2a2XZq8J5Vkm6KtE8aYrvvyS9E75/j0kqbmXZI34W0hjfTZI2x7yH57ey7BH/19MY30Mxsa2XtLiVZdN+/NrMzLr9g+AGxTXAaKAH8BYwIW6eLwC/Cp/PAx7qwPgGA9PC50XAuwniOxP4UwaP4XpgwBGmn09ww6aAWcCCDL7XWwlubMrY8QPOIOhG5+2Ysv8Ebgif3wDcmmC5/sDa8G+/8Hm/DorvXCA3fH5roviS+SykMb6bgH9J4v0/4v96uuKLm/4j4N8zdfza+vAzjkBbuoBPOzPbYmZvhs9rgRW00itwFpsL3GOB14DisJv8jjYbWGNmx9qTQLswsxcJekuIFfsZu5vEQwZ8GHjazKrNrAZ4mjSMSZMoPjP7qwW9WAO8xuH9xnWoVo5fMpL5X2+zI8UXfm98Evhte2+3o3jiCLSlC/gOFVaRfQBYkGDyyZLekvSkpIkdGxkG/FXSIknXJZiezDHuCPNo/R82k8cPYKAF/bFBcFY0MME82XIcP0NwBpnI0T4L6XR9WJV2ZytVfdlw/E4HtpnZqlamZ/L4JcUTRyciqTfwCPAVM9sTN/lNguqXKcAvgMc7OLzTzGwawYiPX5R0Rgdv/6gUdLZ5IfD7BJMzffwOY0GdRVZeKy/p/wJNwP2tzJKpz8J/A8cBU4EtBNVB2egyjny2kfX/S544Am3pAr5DSMojSBr3m9mj8dPNbI+Z1YXPnwDyJA3oqPjMbHP4dzvwGEGVQKxkjnG6nQe8aWbb4idk+viFth2svgv/bk8wT0aPo6SrgQuAy8Pk1kISn4W0MLNtZtZsZlHg161sN9PHLxf4GPBQa/Nk6vilwhNHoC1dwKddWCd6B7DCzH7cyjyDDra5SJpB8N52SGKTVCip6OBzgkbUt+Nmmw9cGV5dNQvYHVMt01Fa/aWXyeMXI/YzdhWJhwx4CjhXUr+wKubcsCztJM0BvglcaME4OYnmSeazkK74Ytv
MLm5lu8n8r6fTh4B3zKwy0cRMHr+UZLp1PlseBFf9vEtwxcX/DctuJvgnAehJUMWxGngdGN2BsZ1GUG2xBFgcPs4HPg98PpznemAZwVUirwGndGB8o8PtvhXGcPD4xcYn4Lbw+C4FKjr4/S0kSAR9Y8oydvwIEtgWoJGgnv0agjazZ4FVwDNA/3DeCuB/Y5b9TPg5XA38UwfGt5qgfeDgZ/DgVYZDCIY+aPWz0EHx3Rt+tpYQJIPB8fGFr1v8r3dEfGH5bw5+5mLm7fDj19aHdzninHMuJV5V5ZxzLiWeOJxzzqXEE4dzzrmUeOJwzjmXEk8czjnnUuKJw7ksFvba+6dMx+FcLE8czjnnUuKJw7l2IOnTkl4Px1D4H0k5kuok/UTBGCrPSioN550q6bWYcS36heVjJD0TdrT4pqTjwtX3lvRwOBbG/R3VK7NzrfHE4VwbSRoPXAqcamZTgWbgcoK71Rea2UTgBeDGcJF7gH81s8kEdzofLL8fuM2CjhZPIbjzGILekL8CTCC4s/jUNO+Sc0eUm+kAnOsCZgPTgTfCk4ECgg4Ko7zfmd19wKOS+gLFZvZCWH438Puwf6KhZvYYgJk1AITre93Cvo3CUeNGAi+lfa+ca4UnDufaTsDdZvatwwqlf4ub71j799kf87wZ/791GeZVVc613bPAJZLK4NDY4SMI/r8uCef5FPCSme0GaiSdHpZfAbxgwciOlZIuCteRL6lXR+6Ec8nyXy7OtZGZLZf0bYJR2yIEPaJ+EagHZoTTthO0g0DQZfqvwsSwFvinsPwK4H8k3Ryu4xMduBvOJc17x3UuTSTVmVnvTMfhXHvzqirnnHMp8TMO55xzKfEzDueccynxxOGccy4lnjicc86lxBOHc865lHjicM45l5L/Hy3RXgDUC4hgAAAAAElFTkSuQmCC", - "text/plain": [ - "
" - ] - }, - "metadata": { - "needs_background": "light" - }, - "output_type": "display_data" - }, - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEWCAYAAABxMXBSAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAA5+ElEQVR4nO3deXyU9bn//9d7khCWhBCSsG+RRfZdQMUVsdbWXSvWVj166ulp7anH2nPs9/RnqbWe0331nFNbrUv1aGvV0latu1ZrFVBAFlFWE2RNAiRAIMlcvz/umzgkE5gJmcyQXE8eeTDzmXu5ZjKZa+7P576vj8wM55xzLlGRdAfgnHPu2OKJwznnXFI8cTjnnEuKJw7nnHNJ8cThnHMuKZ44nHPOJcUTh2tXkl6S9I/h7WskvZrumNqSpP8n6VfpjiMRkmokHZfuONyxxxNHJyZpg6R94QfIFkn3SspLd1zHMjO7w8z+Md1xJMLM8sxsXVtvV9J8Sb9po22ZpBFtsa0m271X0u1tvd3OwhOHO8/M8oDJwBTga+kN59glKTvdMTjXHjxxOADMbAvwF4IEAoCkWZL+JmmnpKWSTo95rLekX0v6UFKVpCfC9kJJf5K0PWz/k6RBrYlJ0uyY/ZdJuiZsL5B0f7iPjZK+LikSPnaNpNck/Shcb52kk8L2MknbJF0ds497Jf2vpGclVUt6WdLQmMd/Eq63W9JiSafEPDZf0qOSfiNpN3BN7LdtSV3DxyrCWBZK6hs+NkDSAkmVktZI+lyT7f42fI7VklZImt7CazQs/FaeHdMW2x04InxOuyTtkPRIzHKN3+bD1+FOSX8O9/mGpOExy54taXW4nf8Ot9nsyErSOcD/Ay4Pj2SXxvzO7pa0WdImSbdLyjpcjJJeCTe7NNzW5XH2d7jnNzr8vVaGsX8qbL8euBL4t3C7f4z32rqWeeJwAIQf7h8H1oT3BwJ/Bm4HegM3A7+XVBKu8gDQHRgH9AF+FLZHgF8DQ4EhwD7g562IZyjwFPAzoIQgoS0JH/4ZUAAcB5wGXAX8Q8zqM4FlQBHwEPAwcAIwAvgM8HMd2iV3JfAtoDjcx4Mxjy0M99073NbvJHWNefwC4FGgV5P1AK4O4xwcxvJ5gteDMKZyYABwKXCHpDNj1j0/XKYXsIBWvIahbwHPAIXAIILXriXzgG+Gy64Bvg0gqZjgOX4tfB6rgZPibcDMngbuAB4Ju8ImhQ/dC9QT/A6mAGcDBxNP3BjN7NTw8UnhthqTwpGen6QewLMEv7M+4XP7b0ljzewugt/Vd8PtnneY18TF4YnDPSGpGigDtgHfCNs/AzxpZk+aWdTMngUWAedK6k+QZD5vZlVmVmdmLwOYWYWZ/d7M9ppZNcGHz2mtiOvTwHNm9n/h9ivMbEn4LXUe8DUzqzazDcAPgM/GrLvezH5tZg3AIwQf3LeZ2X4zewY4QPABdtCfzewVM9sP/AdwoqTB4fP5TbjvejP7AZALHB+z7utm9kT4Gu3jUHUEH7QjzKzBzBab2e5w2ycD/25mtWa2BPgVQQI86NXwtW8gSNKTaJ06giQ+INzX4U5GeNzM3jSzeoIP1slh+7nACjN7LHzsp8CWRAMIj7LOBW40sz1mto3gi8a8VsSY6PP7JLAhfB/Um9nbwO+By5LYtmuBJw53oZnlA6cDowm+dUPwx3hZ2MWyU9JOYDbQn+CDuNLMqppuTFJ3Sb9Q0IW0G3gF6HWwWyIJg4G1cdqLgRxgY0zbRmBgzP2tMbf3AZhZ07bYI46ygzfMrAaoJDgSQNLNklaFXSE7CY4giuOtG8cDBN1/Dyvo0vuupJxw25VhYm3pOcR+MO8Fuqp1Yyj/Bgh4M+zyuvYwyzbd58HXaACHvkZGcLSUqKEEv7PNMe+lXxAcCSQbY1MtrTsUmNnk/Xsl0C+JbbsW+GCeA8DMXpZ0L/B94EKCD4oHzOxzTZcNjzh6S+plZjubPPwVgm/kM81si6TJwNsEf9zJKANmxGnfwUffMleGbUOATUluP9bggzfCLqzewIcKxjP+DZhD8I07KqmKQ59Li+WlzayOoOvnm5KGAU8SdPM8Q/D65cckj9Y+hz3h/92B3eHtxg/HcOzqc+Fzmw08J+kVM1uTxD42E3QDEW5HsffjaPqalAH7geLwiOXQhY8ixpbWDff5spnNTTBGlwQ/4nCxfgzMlTQJ+A1wnqSPScpSMNB7uqRBZraZYPzhvxUMhudIOtgfnU/wjX6npN581PWVrAeBsyR9SlK2pCJJk8Oum98C35aUH46F3BTG21rnKhiI70LQZ/53MysLn0s9sB3IlnQr0DPRjUo6Q9KE8GhrN0HCi4bb/hvwn+HrOhG4rjXPwcy2EyScz4S/p2uB2EHty/TRyQlVBB+Y0SR382dggqQLw6OeL3L4b+5bgWEKT1gI3y/PAD+Q1FNSRNJwSaclEONWgrGsuA6z7p+AUZI+G74/cySdIGlMItt1h+eJwzUKP4TuB24NP9wuIDhDZjvBN7iv8tF75rMEH4TvEoyN3Bi2/xjoRnBk8Hfg6VbG8gFBv/hXCLqOlvBRP/+XCL5prwNeJRgAvac1+wk9RJDgKoFpBOM7EHQzPQ28R9CVVMvhu6aa6kcwqLwbWAW8TNB9BXAFMAz4EHgc+IaZPdfK+D9H8LupIDhZ4W8xj50AvCGphmCQ/cvJXrthZjsIxga+G+5jLMF41/4WVvld+H+FpLfC21cBXQiOEqsIXpf+CcQ4H7gv7G76VJx9xV03PJI7m2Ac5UOCbrjvEIxRAdwNjA23+0Sir4ULyCdycp1Z2D1XbmZfT3csx4rwSKIcuNLMXkx3PK79+RGHc+6Iwi7LXpJyCY5CRXBE6TohTxzOuUScSHCW2w7gPIKz8Zqefuw6Ce+qcs45lxQ/4nDOOZeUTnEdR3FxsQ0bNizdYTjn3DFl8eLFO8yspGl7p0gcw4YNY9GiRekOwznnjimSNsZr964q55xzSfHE4ZxzLimeOJxzziUlpYlD0jkKJlBZI+mWOI/nSnokfPyNsBDcwclp9klaEv78b8w60yS9E67z07DgmnPOuXaSssQRFna7k2DehrHAFZLGNlnsOqDKzEYQ1Of/Tsxja81scvjz+Zj2/yGozTMy/DknVc/BOedcc6k84pgBrAkLjh0gmM3sgibLXADcF95+FJhzuCOIsJx3TzP7ezgnwP0EJcCdc86FrLoKW/s20RWvYh+uxerr2nT7qTwddyCHVhItJ5jSM+4yZlYvaRfBjGkApZLeJqgs+nUz+2u4fOwEMuUcOvlNIwXzCl8PMGTIkKN7Js45d4yw6iqiT/4CNr0f3EfovH9GI6e12T4ydXB8MzDEzKYQzLXwkKSE50EAMLO7zGy6mU0vKWl2/YpzznVM28sak0bAsBcfwvbsarNdpDJxbCJm
ZjWCGcOaznDWuEw4QUwBUBHODV0BYGaLCYqrjQqXj515LN42nXMu7ay+Htvf/nUgbf/e5o17dkHdgTbbRyoTx0JgpKTScGa1eQQTrcRaAFwd3r4UeMHMTFJJOLiOpOMIBsHXhTOJ7ZY0KxwLuQr4Qwqfg3POJc0+XEv0yf8l+vB/En3rOaymqt32raL+oCYf7cfPhLxebbaPlI1xhGMWNxDMopYF3GNmKyTdBiwyswUEs3A9IGkNwexr88LVTwVuk1RHMA3k582sMnzsC8C9BLPMPRX+OOdcRrDtZUQf/R6EA9L20v9B7R448Xza5eqB4sFELvoy0Rcfgl07YMwsIjM+gbJz2mwXnaKs+vTp081rVTnn2kN05evY0786tDEnl8g1t6P83u0Wh+2rgbr90KMAZbXuGEHSYjOb3rS9UxQ5dM51LrZ1I/bum9ju7UTGnAiDRqGuPdpn5/E+pHNyIZKV8CZsexn23mKs8kMix8+AQaNR97ykwlC3POiW3DqJ8sThnOtQbEc50d99Dw4EA9PR9xejuVejCae2y/7VZwiWVwgx4xo65VLUoyCh9a1qC9FHvw/7aoAw/tPmoWlzUxJva3jicM51KLZ1Y2PSaGx7fQE2fDLqntRZ/a2iwr5ELrkJ+2AV7N6Bho6D/iMSXt+2lTUmjca2v/8BGzWtXbu6DscTh3Oug4k3bmvxm1NERQNQ0YDWrRxv3DnavvEfiScO51ybMzPYtR0a6qFnEcrJTW79PbthRzlWvx8V9kO9+ye8rvoMxXJyg4Hhg20zP4l6pP5ooy2oZBCW2x1irsfQjHMhvzCNUR3KE4dzrk3Z/n3Y8lex1x6D+gMwfAqRUy9DhX0TW7+6kujT90DZquB+Tlcil9yEBgxPaH2VDCZy2VexFa9hO7eh8aegIWNa/Xzam4oGELn05iD+HeVo3MmodEL7nMqbIE8crkOy6iqo2IRZFPUegAqK0x1S57FlHfbywx/dX/s2VlAMp34KRRK45njz+sakAUBdLdHXHidywQ2oS9eEQlC/UtSvNMnAM4f6DkV9h2JmGZUwDvLE4Tocq9pKdMGdUBFUo7G8QiIX34iKBx1hTdcWbGvzaapt9ZvohI9DAmcWWXVF88btZXCgFhJMHB1FJiYNyNwih861mq1f1pg0AKipCrpOOsHFrhmhZ5yju+JBCX/ox0vwGjUtZdckuOR54nAdjm3Z0Lxt0/sQbWj/YDohDRgOA0Z+1JDTlchJFyY+QN6vFJ0+D7K7BPdLJ6CpZ7f66mfX9vw34TocHTcRe/fvh7aNnukfPO1EPYuInPfPsD08K6p3/+TOisrtBlPOQsdNCuo99SxKeGzDtQ//S3IdjgaPhqlnY28/F5wTP+ZENHJqusPqVNSjIKiR1Nr1JejVp01jcm3HE4frcNSjAE65JCgxYVHoVYIOdns4546aJw7XISkrG4oS7x5xziXOB8edc84lxROHc865pHhXlUsZCyt8ys+/T1pQ62kHWAPkF6Ns/1N1mcPfja7N2f692Nol2OsLwAzNOg+NmNJ+E+lkANtXA5VbsGh9UKQvifmerXYP9s5fsdf/AA11MPZkIrPOa/eyKVa7B6IN7VKKPNNYdSVs3Ygd2BdUuS0Zkli5lE7CE4dre2Wrsafvbrxrz/waunRFo5rNQNkh2e4Kos89ABveCe4X9iNy/hcTL7O9aQ321999dH/Fq1ivEjTzkymItjmrr4MPVhJ99TGo3YOmnoXGzEI9erXL/tPNdlcQ/eP/wNb1wX1FiFz0ZRg2Ps2RZQ5Poa7NRVe93qzN3vlrGiJJDyt7tzFpAFC1BVv2MmbRxNYvX928beXrWEyZ7ZTaup7oEz+DHeVBuZZXfoe9+2b77DsTbN3QmDQAsCjRlx/BamtaXKWz8cTh2ly8WcpUUJSGSNIkXsmTD1bCgf3Nl40nXvnxksEfleBIMdu0hqazBtmS5xvHrDo6q42ToHdXQN2B9g8mQ3nicG1OY2ZBTkyJiOwuaPwp6QuovQ1oPk2ohk9OvMjf4DFQNPCjhtxuRKaf034lU+KNRXUvgE5SsiVel6LGnpRQZd/OonO8E1y7Ut9hRObdgm1eBxjqdxzqMyTdYbUbDR6FjZsNK14NGgaMRONmJ1wiW4V9iFx8Y1DrqaE+mIa0d7/UBdx0/wNHYT0KYM+usEFBkcLOUi+qzxB03hewF/8P9u6GsSehaWejSFa6I8sY6gylpqdPn26LFi1KdxiuE7ED+2Hn1uCDv7DvMXdGmVVuxjavhQO1wYRIfYd1ug9O27MrKLKY16vTFsiUtNjMmp3V0jlfDedSTF1yg2+u6Q6klZKtaNsRybumWuRjHM4555KS0sQh6RxJqyWtkXRLnMdzJT0SPv6GpGFNHh8iqUbSzTFtGyS9I2mJJO9/cq4FVrMT212BRRM7Ddi5RKWsq0pSFnAnMBcoBxZKWmBmK2MWuw6oMrMRkuYB3wEuj3n8h8BTcTZ/hpntSFHozh3T7EAt9v5i7JXfQV0tmnwWTDkz7mnSzrVGKo84ZgBrzGydmR0AHgYuaLLMBcB94e1HgTkKTz2RdCGwHliRwhid63i2rMP+cg/sq4b6OmzRU9i7b6Q7KteBpDJxDATKYu6Xh21xlzGzemAXUCQpD/h34JtxtmvAM5IWS7q+pZ1Lul7SIkmLtm/ffhRPw7lji5W917ztnVf8ymfXZjJ1cHw+8CMzi/dOn21mU4GPA1+UdGq8DZjZXWY23cyml5SUpDBU5zJMzzhdUr36ttuV567jS+XpuJuAwTH3B4Vt8ZYpl5QNFAAVwEzgUknfBXoBUUm1ZvZzM9sEYGbbJD1O0CX2Sgqfh+uEbM9uqPwQi0ZR737H1PiABh2PFRQHZdkBsnOIzPyET5/r2kwqE8dCYKSkUoIEMQ/4dJNlFgBXA68DlwIvWHBFYmN9CknzgRoz+7mkHkDEzKrD22cDt6XwObhOyHZtJ/rkr2DzmuB+QTGRC/4FFTftac1MKuxL5NKbYesHWEMdKh6ISgYfeUXnEpSyxGFm9ZJuAP4CZAH3mNkKSbcBi8xsAXA38ICkNUAlQXI5nL7A4+H4eTbwkJk9narn4Don27iiMWkAsGsH9s4rcPq8hMuGpJsKSqCg5Ji9ANFltpReOW5mTwJPNmm7NeZ2LXDZEbYxP+b2OmBS20bp3KEsXnXb8tWo/gDk5LZ/QM5lmEwdHHdtwKqrsJqqdIdxzNHg0c3bRk5DnjScA7xWVYdke3djK/6GvfEniETQieej0bN87u8EafBomHg6tuxlwOC4SWj0zHSH5VzG8MTRAdmGFYdMPWov/h/qUQijpqUxqmOH8nrB6ZfD5DMgGkUFJSi3W7rDci5jeFdVB2MWxZY3n6Y1utqvHE6GsrsQKR5EpM8QTxrONeGJo4ORIvFnMOvkJbKdc23HE0cHpPGnQJeYb8ld89CoZnOxHJFZFDOvrOqcO5SPcXRA6juUyLyvYTvKQUIlg5M64rC6Otj0HtG3n4eIiEyeE0x/mpOTwqidc8cKTxwdlIoHtv5K5w/fI/rYDxvvRtcuIXL
pV2DI2DaKzjl3LPOuKtdMdNnLzduWv5qGSJxzmcgTh2suK06XVMQPTp1zAU8crpnIhFMhtiaTRGT8ye0ag1VXYRuWY+vfwXZXtOu+nXOH518jXXMDhhP51L9hq94IBtdHz4T+x7Xb7q1yM9EFd0Ll5qChZxGRC798zFSnda6j88ThmlFWNgwchQaOSsv+be3Sj5IGwO4KbOXr6NRL0xKPc+5Q3lXlMo5tWde8bdN7WNSvKXEuE3jicBlHwyc3bxs9E0X87epcJvC/RJdxNGQsmnIWKBIM0o8/BY2YnO6wnHMhH+Noge2rwaq2oqws6NXXC921I+X1glMuQxNPByyYyS7br1p3LlN44ojDKrcQffpXsGU9BnD8DCKnXobye6c7tE5D2dlQ5IUZnctE3lXVhJlhK16DLes/alz9Jlb+XvqCcs65DOKJo6m6/dj6pc3bN3nicM458MTRXE4uGjqheXv/Ee0fi3POZSBPHE1IQuNnQ/GgjxqHT0KDj09fUM45l0F8cDwOFfUncslNH51VVdgPde3RrjHY7kqsYhNSBIoHBmcaOedcBvDE0QL1KEA9CtKyb9uxiegTPwlKbQAUDSRy/hdRYd+0xOOcc7G8qyoD2Yq/QWxF2IpN2Lo4A/bOOZcGnjgyjDU0YB82P4PLYk8Pds65NEpp4pB0jqTVktZIuiXO47mSHgkff0PSsCaPD5FUI+nmRLd5rFNWFho1o3n7cZPSEI1zzjWXssQhKQu4E/g4MBa4QlLTSauvA6rMbATwI+A7TR7/IfBUkts85mnkFBhzIiBQBE2ZgwaPTndYzjkHpHZwfAawxszWAUh6GLgAWBmzzAXA/PD2o8DPJcnMTNKFwHpgT5LbPOapZzGRuVfBCR8PivwV9AlKcDjnXAZIZVfVQKAs5n552BZ3GTOrB3YBRZLygH8HvtmKbQIg6XpJiyQt2r59e6ufRLoouwsqHoiKBnjScM5llEwdHJ8P/MjMalq7ATO7y8ymm9n0kpKStovMOec6uVR+ld0EDI65Pyhsi7dMuaRsoACoAGYCl0r6LtALiEqqBRYnsE3nnHMplMrEsRAYKamU4MN9HvDpJsssAK4GXgcuBV4wMwNOObiApPlAjZn9PEwuR9qmc865FEpZ4jCzekk3AH8BsoB7zGyFpNuARWa2ALgbeEDSGqCSIBEkvc1UPYfWsvo62LwOe28R5HRBI6dBv1IkpTs055w7agq+4Hds06dPt0WLFrXb/mzjSqK//yEQvrZZ2UQ+9e+o/3HtFoNzzh0tSYvNbHrT9kwdHD9mWUM90cXP0Jg0ABrqsTVvpy0m55xrS544UqH+QPO2hrr2j8M551LAE0cbU1Y2kalzm7aiEVPTEo9zzrU1v7IsFYaMIXL+l4i+/Szk5AaJxMc3nHMdhCeOFFCXrjBiMpHSCWG5qax0h+Scc23GE0cKKcsThnOu4/ExDuecc0nxxOGccy4pnjicc84lxROHc865pHjicM45lxRPHM4555LiicM551xSPHE455xLiicO55xzSTli4pDUV9Ldkp4K74+VdF3qQ3POOZeJEjniuJdgxr0B4f33gBtTFI9zzrkMl0jiKDaz3wJRCKZvBRpSGpVzzrmMlUji2COpiHBKO0mzgF0pjco551zGSqQ67k3AAmC4pNeAEuDSlEblnHMuYx0xcZjZW5JOA44HBKw2M58H1TnnOqkjJg5JVzVpmioJM7s/RTE555zLYIl0VZ0Qc7srMAd4C/DE4ZxznVAiXVVfir0vqRfwcKoCcs45l9lac+X4HqC0rQNxzjl3bEhkjOOPhKfiEiSascBvE9m4pHOAnwBZwK/M7L+aPJ5L0OU1DagALjezDZJmAHcdXAyYb2aPh+tsAKoJriWpN7PpicTinHOubSQyxvH9mNv1wEYzKz/SSpKygDuBuUA5sFDSAjNbGbPYdUCVmY2QNA/4DnA5sByYbmb1kvoDSyX9Mbz4EOAMM9uRQOzOOefaWCJjHC+3ctszgDVmtg5A0sPABUBs4rgAmB/efhT4uSSZ2d6YZbry0RGPc865NGtxjENStaTdcX6qJe1OYNsDgbKY++VhW9xlwqOJXUBRuP+ZklYA7wCfjznaMOAZSYslXZ/Ik3TOOdd2WjziMLP89gwkzv7fAMZJGgPcJ+kpM6sFZpvZJkl9gGclvWtmrzRdP0wq1wMMGTKkXWN3zrmOLOGzqiT1kTTk4E8Cq2wCBsfcHxS2xV1GUjZQQDBI3sjMVgE1wPjw/qbw/23A4wRdYs2Y2V1mNt3MppeUlCQQrnPOuUQkMh/H+ZLeB9YDLwMbgKcS2PZCYKSkUkldgHkENa9iLQCuDm9fCrxgZhaukx3ufygwGtggqYek/LC9B3A2wUC6c865dpLIWVXfAmYBz5nZFElnAJ850krhGVE3EMzlkQXcY2YrJN0GLDKzBcDdwAOS1gCVBMkFYDZwi6Q6gnLuXzCzHZKOAx6XdDD2h8zs6WSesHPOuaMjs8OfsCRpkZlNl7QUmGJmUUlLzWxS+4R49KZPn26LFi1KdxjOOXdMkbQ43rVyiRxx7JSUB/wVeFDSNoKrx51zznVCiQyOv0gwaP1l4GlgLXBeKoNyzjmXuRJJHNnAM8BLQD7wiJlVHHYN55xzHdYRE4eZfdPMxgFfBPoDL0t6LuWROeecy0jJVMfdBmwhuM6iT2rCcc45l+kSuY7jC5JeAp4nKAfyOTObmOrAnHPOZaZEzqoaDNxoZktSHItzzrljQCLVcb/WHoE455w7NrRmBkDnnHOdmCcO55xzSfHE4ZxzLimeOJxzziXFE4dzzrmkeOJwzjmXlESu43DOOXcMqanbz6Y9VeypO0Df7j3p372ASDCPUZvwxOGccx3I7gO1PLJ2EYt2fABAliJ8adxpjCns32b78K4q55zrQMr2VDUmDYAGi/LgmoVU19W22T48cTjnXAdSfaB5gtheW8O++ro224cnDuec60D6dstv1ja+sD8FXbq12T48cTjnXAaqa6hnb/2BpNcb1KMX1x5/It2zcwAY0bOES0qnkJvVdkPaPjjunHMZZu3u7TxVtoKt+6qZ3Xc4M/oMpTC3R0Lr5mRlM7NPKSN6llDbUE/v3O50y+7SpvF54nDOuQxSvqeKH73zAnXRBgAe27CEmrr9XFQ6iYgS7yQq6pqXqhC9q8o55zLJpj27GpPGQS9ufo+q/fvSFFFznjiccy4Fauvr2H2gFjNLar2cSPOP5dysbLLa8AK+o+VdVc4514aiZry/axt/2LCUHfv3MLvfcE7uO5yiromNUQzp0Zvi3Dx27K9pbLt42GR65XZPVchJ88ThnHNtqGxPFT9e/gLR8Ejjzx8s50BDPReXTk5ojKK4Wx7/Mv50Vu/aSmXtHkb16stx+cWpDjspKe2qknSOpNWS1ki6Jc7juZIeCR9/Q9KwsH2GpCXhz1JJFyW6Teecawt1DQ2tOh32wz07G5PGQS9tfp+dSYxR9O3ek1P7j+TC0smMLexP1/DU2kyRsiMOSVnAncBcoBxYKGmBma2MWew6oMrMRkiaB3wHuB
xYDkw3s3pJ/YGlkv4IWALbdM51chuqK3hty1q219Ywu99wxvTqR4+c3ITWNTPW7t7Ok2Ur2FG7h9P6j2Ba8ZCEu4q6RJp/rOZl55IdyUrqOWSyVHZVzQDWmNk6AEkPAxcAsR/yFwDzw9uPAj+XJDPbG7NMV4KEkeg2nXOdWHlNFT9c9jz7o/UArNq5hU+POIHT+o9MaP2y8HTYeosC8Nt1b1FbX8e5Q8ajBAaoh+b1pm/XfLbWVje2XXrcFHp26dqKZ5OZUpk4BgJlMffLgZktLRMeXewCioAdkmYC9wBDgc+GjyeyTefcMW5P3X42793F/oYG+nbLp7hb4tckfLCnsjFpHPTkB8uZUjSIngmU3SjfU9WYNA56dtO7nNxveEJHHcXd8vjS+NNZu3sHu+tqGZbXm9L8ooTjPxZk7OC4mb0BjJM0BrhP0lPJrC/peuB6gCFDhqQgQudcKuzcv5dH1i7mrYrgO2JeTi7/Mu4Mhub3TnALzY8KFP5LRE6crqau2TlkJXHxXUm3fEri1IzqKFI5OL4JGBxzf1DYFncZSdlAAVARu4CZrQJqgPEJbvPgeneZ2XQzm15SUnIUT8M51542VFc0Jg0IJiVasHEpBxrqD7PWR4bm9aZr1qGDyZ8cMoH8BLuKhuQVUtjkyOTiYZMTXr8zSOURx0JgpKRSgg/3ecCnmyyzALgaeB24FHjBzCxcpyzsnhoKjAY2ADsT2KZzLs2q9u+hrGYnB6L1DOhewIAevRJed0ftnmZt66sr2Ft/gC4JFOob2KMXX5k4h4XbN7J9Xw2z+gxjZEHfhPfft1tPbpxwJqt3bqXqwD5G9+pLaV5mnQ6bbilLHOGH/g3AX4As4B4zWyHpNmCRmS0A7gYekLQGqCRIBACzgVsk1QFR4AtmtgMg3jZT9Rycc8nbUVvDL1b+lQ/2VAGQE8nixglnMqJnYkf+/Xv0bNY2sfdA8hI8KwpgSF5vhuQl2rXVXL/uBfTrXtDq9Ts6JXs5/LFo+vTptmjRonSH4Vyn8Oa2Ddy9+m+HtB1f0JcvjjstodLee+v28/yH7/Fk2XKiZpTmFXH18bPo7x/k7U7SYjOb3rQ9YwfHnXPHpqr9e5u1bdm3m/0NdQklju45uZw7eBzTi4dwINpASbc8urdxWXB3dDxxOOea2bRnJ+/u3MLeugMcX9iP0vwichK8gG1onC6iWX2GkZeT+OByViRC/x5+hJGpvDquc+4Qm/bs5AfLnuO3697iT2XL+cGy51i9c0vC65fmF/GZETPolpWDELP6lHJKv5FEMqi6qzs6fsThnDvEml3b2dOkRtOCDe8womefhGom5WbncEr/EYwr7E+9Remd271Dldtwnjicc03UNtQ1a9vXUNfsauoj6Z1gGXF37PHE4VyGOtDQQNSiraqMGrUo22traIhGKe6al9D1DweNLOiDEMZHZ1yeNWh0UqfDuo7NE4dzGaY+2sB7u7bxdNkK9tQfYO7A0UzoPTDh6q576vbz8ub3+fMHy6m3KFOKBnFJ6ZSES2AMze/Nl8efwZMfLKe6vpY5A0YzuWjQ0Twl18F44nAuw2yoruSny19s/L7/6/f+zj+MOpFZfUsTWn/t7h38YeOyxvtvV5TTr1tPLhg2KaHqrlmKMKawH8N7FtMQjdItx0+FdYfys6qcyzArqzbT9LLcZzetora++dhDPOuqdzRrW7jjg2YD3kfSJSvbk4aLyxOHcxmmaYE+gO5ZXRI+nbVvt+YlO4blFdI1iXEO5w7H30nOpcAHNVWsrPqQ/Q31jCscQGl+EVmRxL6njSnsR9eybGrDarACPj5kXMID3CMLSijNL2J9dVBount2Fz42eJyfEuvajNeqcq6NfVBTyfeXPtc4mZCAGyecyehe/RLeRllNFauqtrC34QDjCvtTml+U1Af/rv372LR3J3XRBvp3L6BPB54bwqWO16pyrp0sr/zwkBnoDPhL2UqG55eQk5XYh//gvEIG5xW2OoaC3G4U5B55trvOoK6ujvLycmpra9MdSsbq2rUrgwYNIicnsVO/PXE418biXUC3p/7AIddFuPZTXl5Ofn4+w4YNS+isss7GzKioqKC8vJzS0sTO3PPBcefa2ITCgc0mKT1r4OikLsJzbae2tpaioiJPGi2QRFFRUVJHZP5Odq6NDetZxJfGncHTZSuobahj7qDRjCsckO6wOjVPGoeX7OvjicO5OHbsq6F8704aolEG9ChIahKhnEgW43r3Z2RBH4wouXFOr3XuWOaJw7kmtuzdxU+Xv0TF/mDu625ZOfzrhDkMzU9uKtIuWVkEMxy7zmj+/Pnk5eVx8803x338iSeeYNSoUYwdO7adIzt6PsbhXBPLKz9sTBoQVIZ9afNqoklWh3XucJ544glWrlyZ7jBaxROHc01s2be7WVtZzU7qo5443OF9+9vfZtSoUcyePZvVq1cD8Mtf/pITTjiBSZMmcckll7B3717+9re/sWDBAr761a8yefJk1q5dG3e5TOWJw3VYNXW1VNclf+7++N7NB7JP7nucnxXlDmvx4sU8/PDDLFmyhCeffJKFCxcCcPHFF7Nw4UKWLl3KmDFjuPvuuznppJM4//zz+d73vseSJUsYPnx43OUylf8luA6ntr6OZZWbWLBxGfUW5eODxzGteEjC80mM7NmHS0qn8KcP3qEhGuWMAaOYXDw4xVG7Y91f//pXLrroIrp37w7A+eefD8Dy5cv5+te/zs6dO6mpqeFjH/tY3PUTXS4TeOJwHc6a3du5e/XfGu8/tGYhXSPZzEywLHmPnFzmDhzNtOIhRC1K7649yJIfnLvWueaaa3jiiSeYNGkS9957Ly+99NJRLZcJ/K/BdThv7figWdvLm9fQEG1IeBuSKOrag5Ju+Z40XEJOPfVUnnjiCfbt20d1dTV//OMfAaiurqZ///7U1dXx4IMPNi6fn59PdXV14/2WlstEfsThMk59tIG1u3fwyub3qbcop/UfycieJeQkOMbQK7d7s7airt2RJwCXQlOnTuXyyy9n0qRJ9OnThxNOOAGAb33rW8ycOZOSkhJmzpzZmCzmzZvH5z73OX7605/y6KOPtrhcJvLquC7jvLdrGz9c9twhlZ2+PP4Mxhb2T2j9spoqfrDsOfaFNaOyFeGmiXMY3rMkBdG6TLdq1SrGjBmT7jAyXrzXKS3VcSWdA/yE4CqoX5nZfzV5PBe4H5gGVACXm9kGSXOB/wK6AAeAr5rZC+E6LwH9gX3hZs42s22pfB6ufb25bUOzcoAvfvgeo3v1S2gyo8F5hXx10lzWV1cQtSjD8osY3KP1lWadc4dKWeKQlAXcCcwFyoGFkhaYWewVL9cBVWY2QtI84DvA5cAO4Dwz+1DSeOAvwMCY9a40Mz+EyGCb9+6ivKaKKMaQvN5JleyIlxwi0Kxw4OEM7NGLgT16JbGGcy5RqTzimAGsMbN1AJIeBi4AYhPHBcD88PajwM8lyczejllmBdBNUq6Z7U9hvK6NlNVU8cN3nmdvOMd1t6wcbpo4hyF5iZXsOKFkKK9sXnNIGfLTBxzvh
eqcyxCpTBwDgbKY++XAzJaWMbN6SbuAIoIjjoMuAd5qkjR+LakB+D1wu3WGgZpjyMLtGxuTBgQlO17fuj7hxHFcz2JunjiH17euo96inNR3OMf1LE5VuM65JGX0WVWSxhF0X50d03ylmW2SlE+QOD5LME7SdN3rgesBhgwZ0g7RuoO27t3VrG3znuZtLclShBEFfRhR0Kctw3LOtZFUnp+4CYi93HZQ2BZ3GUnZQAHBIDmSBgGPA1eZ2dqDK5jZpvD/auAhgi6xZszsLjObbmbTS0r8bJr2NKNP8wvtTup3XBoicc6lQioTx0JgpKRSSV2AecCCJsssAK4Ob18KvGBmJqkX8GfgFjN77eDCkrIlFYe3c4BPAstT+Bw6rc17drFw2wYWbt/I5jhHEIdzfK++XH7cNHpk59I9uwuXlU5lTGG/FEXqXOcybNgwduzYkfAy1157LX369GH8+PFtFkPKuqrCMYsbCM6IygLuMbMVkm4DFpnZAuBu4AFJa4BKguQCcAMwArhV0q1h29nAHuAvYdLIAp4Dfpmq59BZfVBdyY+WP8/e+uA6iO7ZXfjXCXMYkpfYKa15ObmcOfB4pob1neJdkOdcpnpj23qe2LCUyv176Z3bnQuHTWJmnKPoY8U111zDDTfcwFVXXdVm20zppbRm9qSZjTKz4Wb27bDt1jBpYGa1ZnaZmY0wsxkHz8Ays9vNrIeZTY752WZme8xsmplNNLNxZvZlM0u8joRLyBvb1zcmDYC99QdYuG1D0tvpldvdk4Y7pryxbT2/ef9NKvcHJc0r9+/lN++/yRvb1h/Vdjds2MDo0aO55pprGDVqFFdeeSXPPfccJ598MiNHjuTNN9+ksrKSCy+8kIkTJzJr1iyWLVsGQEVFBWeffTbjxo3jH//xH4k9F+g3v/kNM2bMYPLkyfzTP/0TDQ3NPw5PPfVUevdObhKyI/EaDK6ZD/c0n48i2e4q545FT2xYyoEmNc0ORBt4YsPSo972mjVr+MpXvsK7777Lu+++y0MPPcSrr77K97//fe644w6+8Y1vMGXKFJYtW8Ydd9zReITwzW9+k9mzZ7NixQouuugiPvggqMW2atUqHnnkEV577TWWLFlCVlZWu9W4yuizqlzrNESjrKvewVs7ysiSmFI8mNL8IiIJ1mo6qW8pK3duPqRtVoKVZZ07lh080ki0PRmlpaVMmDABgHHjxjFnzhwkMWHCBDZs2MDGjRv5/e9/D8CZZ55JRUUFu3fv5pVXXuGxxx4D4BOf+ASFhUGX8fPPP8/ixYsba2Lt27ePPn3a50xETxwd0Nrd2/nhOy80XkD3/IeruXniWQnXahpT2J/LSqfy57J3APHJIeMZ3csHt13H1zu3e9wk0bsNulxzcz+aDyYSiTTej0Qi1NfXk5OTk9T2zIyrr76a//zP/zzq2JLlXVUd0Eub3zvkquuoGW8kMUaRl5PLWYNGc+vUT/CNqecyZ+DohCdBcu5YduGwSXSJZB3S1iWSxYXDJqV836ecckpjV9NLL71EcXExPXv25NRTT+Whhx4C4KmnnqKqqgqAOXPm8Oijj7JtW1Cqr7Kyko0bN6Y8TvDE0eGYWbM+WoADcQbNjqTQB7ddJzOzTymfGTmj8Qijd253PjNyRrucVTV//nwWL17MxIkTueWWW7jvvvsA+MY3vsErr7zCuHHjeOyxxxovaB47diy33347Z599NhMnTmTu3Lls3ry52XavuOIKTjzxRFavXs2gQYPaZEpaL6veAS2v/JCfrXjpkLabJpzJ8d7d5DohL6uemIwpq+7SY2RBH24YdxrPb1pNROKsgaM5Lt9rPTnn2oYnjg4oNyubCb0HMrZXP0BkRbxH0jnXdjxxdGBZTQb5nHOuLXjiyFD76g+wbV81EUXo0y2f3ATn23bOuVTzT6MMtH1fNQ+vXcTyquAMiRklw7i4dBKFuT3SHJlzzvnpuBnprR1ljUkD4M3tG1hVtSWNETnn3Ec8cWSYhmgDb1WUNWtfUdX8/GznXOeTTFn1srIyzjjjDMaOHcu4ceP4yU9+0iYxeFdVhsmKZDG2Vz82VFcc0j7SZ8Nzrl1EV72Ovfo4VFdAfhGafRGRMSemO6xWyc7O5gc/+AFTp06lurqaadOmMXfuXMaOHXtU2/UjjhTY31DP8soPuXPFy/xq1Wu8t3MbDRZNeP2ZfUrp371n4/3hPYsZV9g/FaE652JEV72OPXt/kDQAqiuwZ+8nuur1o9puusqq9+/fn6lTpwKQn5/PmDFj2LSp6USsyfPEkQLv7drKz1a8xLLKTSzcsZEfvvM863dXHHnFUL/uPfnX8Wdy04QzuXniWfzzmFMp6Zafwoidc0BwpFF/4NDG+gNB+1FKd1n1DRs28PbbbzNz5syjfi7eVXUYdQ0NSJCdxPUQDdEGnit/95A2w1hSUcaIgsTnPi/I7U6B14lyrn1Vt/AFr6X2JKSzrHpNTQ2XXHIJP/7xj+nZs2fcZZLhiSOOffUHWFG1mefK3yU3K5uPDR7LqII+CSUQAyQ1a4/EaXPOZZj8ovhJIr/oqDedrrLqdXV1XHLJJVx55ZVcfPHFyQceh3dVxbGiagu/fPc11tdU8O6urfx0+YusT/AbR3Yki7kDDy0UFpGYXDQ4FaE659qQZl8E2V0ObczuErSnWCrKqpsZ1113HWPGjOGmm25qs1j9iKOJuoYGntvUtKsJllZsSvjMplEFJdw4/kxe37aO3Eg2s/qWMiy/bef8dc61vciYE4lCWs6qmj9/Ptdeey0TJ06ke/fuh5RVv+KKKxg3bhwnnXRS3LLq0WiUnJwc7rzzToYOHdq4zddee40HHniACRMmMHnyZADuuOMOzj333KOK1cuqN1EfbeDOla+wssl1E+cPncgnhoxPRXjOuRTysuqJSaasundVNZEdyeLsgWOIHZHIiWQx3k+Hdc45wLuq4hpZUMJXJp7FsspN5EaymdB7AEPbYHDMOec6Ak8ccWRHshhZ0Mev1naugzCzuGc7ukCyQxbeVeWc69C6du1KRUVF0h+OnYWZUVFRQdeuXRNex484nHMd2qBBgygvL2f79u3pDiVjde3alUGDBiW8fEoTh6RzgJ8AWcCvzOy/mjyeC9wPTAMqgMvNbIOkucB/AV2AA8BXzeyFcJ1pwL1AN+BJ4MvmXyWccy3IycmhtLQ03WF0KCnrqpKUBdwJfBwYC1whqWlJxuuAKjMbAfwI+E7YvgM4z8wmAFcDD8Ss8z/A54CR4c85qXoOzjnnmkvlGMcMYI2ZrTOzA8DDwAVNlrkAuC+8/SgwR5LM7G0z+zBsXwF0k5QrqT/Q08z+Hh5l3A9cmMLn4JxzrolUJo6BQOyMROVhW9xlzKwe2AU0Pe/1EuAtM9sfLl9+hG0CIOl6SYskLfK+TeecazsZPTguaRxB99XZya5rZncBd4Xb2S5p4xFWaUkxQddZpvL4jo7Hd3Q8vqOT6fENjdeYysSxCYit7DcobIu3TLmkbKCAYJAcSYOAx4GrzGxtzPKxQ//xttmMmSVez7wJSYviXXKfKTy+o+Px
HR2P7+hkenwtSWVX1UJgpKRSSV2AecCCJsssIBj8BrgUeMHMTFIv4M/ALWb22sGFzWwzsFvSLAVX81wF/CGFz8E551wTKUsc4ZjFDcBfgFXAb81shaTbJJ0fLnY3UCRpDXATcEvYfgMwArhV0pLw5+Bl3F8AfgWsAdYCT6XqOTjnnGsupWMcZvYkwbUWsW23xtyuBS6Ls97twO0tbHMR0J5lau9qx321hsd3dDy+o+PxHZ1Mjy+uTlFW3TnnXNvxWlXOOeeS4onDOedcUjxxhCSdI2m1pDWSbonzeK6kR8LH35A0rB1jGyzpRUkrJa2Q9OU4y5wuaVfMyQS3xttWCmPcIOmdcN/NpltU4Kfh67dM0tR2jO34mNdliaTdkm5ssky7vn6S7pG0TdLymLbekp6V9H74f2EL614dLvO+pKvjLZOi+L4n6d3w9/d4ePZjvHUP+15IYXzzJW2K+R3GnR/1SH/rKYzvkZjYNkha0sK6KX/9jpqZdfofgiKMa4HjCAorLgXGNlnmC8D/hrfnAY+0Y3z9ganh7XzgvTjxnQ78KY2v4Qag+DCPn0twBpyAWcAbafxdbwGGpvP1A04FpgLLY9q+S3AKOgRnGH4nznq9gXXh/4Xh7cJ2iu9sIDu8/Z148SXyXkhhfPOBmxP4/R/2bz1V8TV5/AfArel6/Y72x484Aq2uq9UewZnZZjN7K7xdTXB6c9xSKxnsAuB+C/wd6BXWHmtvc4C1ZtbaSgJtwsxeASqbNMe+x+4jfh22jwHPmlmlmVUBz5KCQp/x4jOzZyw4zR7g7xx6MW67auH1S0Qif+tH7XDxhZ8bnwL+r6332148cQTaqq5WyoVdZFOAN+I8fKKkpZKeCsu1tCcDnpG0WNL1cR5P5DVuD/No+Q82na8fQF8LLnKF4Kiob5xlMuV1vJaWr6E60nshlW4Iu9LuaaGrLxNev1OArWb2fguPp/P1S4gnjmOIpDzg98CNZra7ycNvEXS/TAJ+BjzRzuHNNrOpBGX0vyjp1Hbe/xGFFQzOB34X5+F0v36HsKDPIiPPlZf0H0A98GALi6TrvfA/wHBgMrCZoDsoE13B4Y82Mv5vyRNHIJm6WqhJXa32ICmHIGk8aGaPNX3czHabWU14+0kgR1Jxe8VnZpvC/7cR1Bib0WSRRF7jVPs4QaXlrU0fSPfrF9p6sPsu/H9bnGXS+jpKugb4JHBlmNyaSeC9kBJmttXMGswsCvyyhf2m+/XLBi4GHmlpmXS9fsnwxBFodV2t9ggu7BO9G1hlZj9sYZl+B8dcJM0g+N22S2KT1ENS/sHbBIOoy5sstgC4Kjy7ahawK6Zbpr20+E0vna9fjNj32NXEr8P2F+BsSYVhV8zZYVvKKZjR89+A881sbwvLJPJeSFV8sWNmF7Ww30T+1lPpLOBdMyuP92A6X7+kpHt0PlN+CM76eY/gjIv/CNtuI/gjAehK0MWxBngTOK4dY5tN0G2xDFgS/pwLfB74fLjMDQSTXi0lGLg8qR3jOy7c79IwhoOvX2x8IpgRci3wDjC9nX+/PQgSQUFMW9peP4IEthmoI+hnv45gzOx54H3gOaB3uOx0gqmXD657bfg+XAP8QzvGt4ZgfODge/DgWYYDgCcP915op/geCN9bywiSQf+m8YX3m/2tt0d8Yfu9B99zMcu2++t3tD9ecsQ551xSvKvKOedcUjxxOOecS4onDuecc0nxxOGccy4pnjicc84lxROHcxksrNr7p3TH4VwsTxzOOeeS4onDuTYg6TOS3gznUPiFpCxJNZJ+pGAOlecllYTLTpb095h5LQrD9hGSngsLLb4laXi4+TxJj4ZzYTzYXlWZnWuJJw7njpKkMcDlwMlmNhloAK4kuFp9kZmNA14GvhGucj/w72Y2keBK54PtDwJ3WlBo8SSCK48hqIZ8IzCW4Mrik1P8lJw7rOx0B+BcBzAHmAYsDA8GuhEUKIzyUTG73wCPSSoAepnZy2H7fcDvwvpEA83scQAzqwUIt/emhbWNwlnjhgGvpvxZOdcCTxzOHT0B95nZ1w5plP6/Jsu1tr7P/pjbDfjfrUsz76py7ug9D1wqqQ80zh0+lODv69JwmU8Dr5rZLqBK0ilh+2eBly2Y2bFc0oXhNnIldW/PJ+Fcovybi3NHycxWSvo6waxtEYKKqF8E9gAzwse2EYyDQFAy/X/DxLAO+Iew/bPALyTdFm7jsnZ8Gs4lzKvjOpcikmrMLC/dcTjX1ryryjnnXFL8iMM551xS/IjDOedcUjxxOOecS4onDuecc0nxxOGccy4pnjicc84l5f8Ht8j8myVBLFwAAAAASUVORK5CYII=", - "text/plain": [ - "
" - ] - }, - "metadata": { - "needs_background": "light" - }, - "output_type": "display_data" - } - ], - "source": [ - "for i in ['Precision', 'Recall']:\n", - " sns.set_palette(\"Set2\")\n", - " plt.figure()\n", - " sns.scatterplot(x=\"epoch\", \n", - " y=\"value\", \n", - " hue='data',\n", - " data=compare_metric(df_list = [output1, output2], metric=i)\n", - " ).set_title(f'{i} comparison using test set');" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Referring to the figures above, it is rather obvious that the number of epochs is too low as the model's performances have not stabilised. Reader can decide on the number of epochs and other hyperparameters to adjust suit the application.\n", - "\n", - "As stated previously, it is interesting to see model2 (using both implicit and explicit data) performed consistently better than model1 (using only explicit ratings). " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## 5. Similar users and items" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "As the LightFM package operates based on latent embeddings, these can be retrieved once the model has been fitted to assess user-user and/or item-item affinity." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 5.1 User affinity" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The user-user affinity can be retrieved with the `get_user_representations` method from the fitted model as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 37, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "array([[ 0.17943075, -0.9845197 , 1.724939 , ..., 3.7842598 ,\n", - " -3.438375 , 3.6794803 ],\n", - " [-0.33647582, 0.7195082 , 2.8680375 , ..., 4.22038 ,\n", - " -4.610963 , 4.010645 ],\n", - " [ 0.14344296, 2.1440773 , 1.8434161 , ..., 1.9370167 ,\n", - " -5.640826 , 4.653452 ],\n", - " ...,\n", - " [ 1.4312286 , -1.0642868 , 2.8821077 , ..., 2.8192847 ,\n", - " -2.7393079 , 3.4289758 ],\n", - " [-0.33159262, 0.7337389 , 2.8301528 , ..., 4.112663 ,\n", - " -4.462565 , 3.8659678 ],\n", - " [-0.7364118 , 1.3901651 , 2.1960316 , ..., 3.8899298 ,\n", - " -4.5879855 , 4.744391 ]], dtype=float32)" - ] - }, - "execution_count": 37, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "_, user_embeddings = model2.get_user_representations(features=user_features)\n", - "user_embeddings" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In order to retrieve the top N similar users, we can use the `similar_users` from `recommenders`. For example, if we want to choose top 10 users most similar to the user 1:" - ] - }, - { - "cell_type": "code", - "execution_count": 38, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
userIDscore
05550.999998
1540.999997
23140.999995
34110.999993
43950.999992
54650.999992
64810.999990
72820.999990
85270.999990
9570.999989
\n", - "
" - ], - "text/plain": [ - " userID score\n", - "0 555 0.999998\n", - "1 54 0.999997\n", - "2 314 0.999995\n", - "3 411 0.999993\n", - "4 395 0.999992\n", - "5 465 0.999992\n", - "6 481 0.999990\n", - "7 282 0.999990\n", - "8 527 0.999990\n", - "9 57 0.999989" - ] - }, - "execution_count": 38, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "similar_users(user_id=1, \n", - " user_features=user_features, \n", - " model=model2)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### 5.2 Item affinity" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Similar to the user affinity, the item-item affinity can be retrieved with the `get_item_representations` method using the fitted model." - ] - }, - { - "cell_type": "code", - "execution_count": 39, - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "text/plain": [ - "array([[-0.07855016, -0.06326439, -0.24408759, ..., 0.91503495,\n", - " -1.1991384 , 0.6392026 ],\n", - " [ 0.02296161, 0.21057224, 0.52859396, ..., 0.6266738 ,\n", - " -0.5909869 , 0.48717606],\n", - " [-0.05290217, 0.21497665, 0.12442638, ..., 0.64513564,\n", - " -0.89034337, 0.47523445],\n", - " ...,\n", - " [ 0.37707207, 0.12548159, 0.74360174, ..., 0.19332102,\n", - " -0.24798231, -0.3791776 ],\n", - " [-0.27374834, -0.23832163, 0.9083196 , ..., 0.9711132 ,\n", - " -0.36962402, 0.20986083],\n", - " [-0.26275527, -0.3118822 , 0.60458297, ..., 0.52483046,\n", - " -0.46068186, 0.53892124]], dtype=float32)" - ] - }, - "execution_count": 39, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "_, item_embeddings = model2.get_item_representations(features=item_features)\n", - "item_embeddings" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The function to retrieve the top N similar items is similar to similar_users() above. For example, if we want to choose top 10 items most similar to the item 10:" - ] - }, - { - "cell_type": "code", - "execution_count": 40, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
itemIDscore
01810.996882
1140.996467
21460.996463
33730.995977
43210.995873
51140.995869
6440.995434
712510.994995
83520.994736
94170.994391
\n", - "
" - ], - "text/plain": [ - " itemID score\n", - "0 181 0.996882\n", - "1 14 0.996467\n", - "2 146 0.996463\n", - "3 373 0.995977\n", - "4 321 0.995873\n", - "5 114 0.995869\n", - "6 44 0.995434\n", - "7 1251 0.994995\n", - "8 352 0.994736\n", - "9 417 0.994391" - ] - }, - "execution_count": 40, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "similar_items(item_id=10, \n", - " item_features=item_features, \n", - " model=model2)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Record results for tests - ignore this cell\n", - "store_metadata(\"eval_precision\", eval_precision)\n", - "store_metadata(\"eval_recall\", eval_recall)\n", - "store_metadata(\"eval_precision2\", eval_precision2)\n", - "store_metadata(\"eval_recall2\", eval_recall2)\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## 6. Conclusion" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "In this notebook, the background of hybrid matrix factorisation model has been explained together with a detailed example of LightFM's implementation. \n", - "\n", - "The process of incorporating additional user and item metadata has also been demonstrated with performance comparison. Furthermore, the calculation of both user and item affinity scores have also been demonstrated and extracted from the fitted model.\n", - "\n", - "This notebook remains a fairly simple treatment on the subject and hopefully could serve as a good foundation for the reader." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## References" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "- [[1](https://arxiv.org/abs/1507.08439)]. Maciej Kula - Metadata Embeddings for User and Item Cold-start Recommendations, 2015. arXiv:1507.08439\n", - "- [[2](https://making.lyst.com/lightfm/docs/home.html)]. LightFM documentation,\n", - "- [3]. Charu C. Aggarwal - Recommender Systems: The Textbook, Springer, April 2016. ISBN 978-3-319-29659-3\n", - "- [4]. Deepak K. Agarwal, Bee-Chung Chen - Statistical Methods for Recommender Systems, 2016. ISBN: 9781107036079 \n" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "reco_cpu", - "language": "python", - "name": "conda-env-reco_cpu-py" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.8.13" - } - }, - "nbformat": 4, - "nbformat_minor": 4 -} From 4a023b5c747e2d2a8c2e739b5712704a27f75415 Mon Sep 17 00:00:00 2001 From: miguelgfierro Date: Fri, 29 Dec 2023 09:08:29 +0100 Subject: [PATCH 2/5] Update hybrid to CF Signed-off-by: miguelgfierro --- README.md | 12 ++++++------ examples/02_model_hybrid/README.md | 10 ---------- examples/README.md | 2 +- 3 files changed, 7 insertions(+), 17 deletions(-) delete mode 100644 examples/02_model_hybrid/README.md diff --git a/README.md b/README.md index 6c6a341bd3..87b9ef986a 100644 --- a/README.md +++ b/README.md @@ -83,12 +83,12 @@ The table below lists the recommender algorithms currently available in the repo | Cornac/Bilateral Variational Autoencoder (BiVAE) | Collaborative Filtering | Generative model for dyadic data (e.g., user-item interactions). It works in the CPU/GPU environment. 
| [Deep dive](examples/02_model_collaborative_filtering/cornac_bivae_deep_dive.ipynb) | | Convolutional Sequence Embedding Recommendation (Caser) | Collaborative Filtering | Algorithm based on convolutions that aim to capture both user’s general preferences and sequential patterns. It works in the CPU/GPU environment. | [Quick start](examples/00_quick_start/sequential_recsys_amazondataset.ipynb) | | Deep Knowledge-Aware Network (DKN)* | Content-Based Filtering | Deep learning algorithm incorporating a knowledge graph and article embeddings for providing news or article recommendations. It works in the CPU/GPU environment. | [Quick start](examples/00_quick_start/dkn_MIND.ipynb) / [Deep dive](examples/02_model_content_based_filtering/dkn_deep_dive.ipynb) | -| Extreme Deep Factorization Machine (xDeepFM)* | Hybrid | Deep learning based algorithm for implicit and explicit feedback with user/item features. It works in the CPU/GPU environment. | [Quick start](examples/00_quick_start/xdeepfm_criteo.ipynb) | +| Extreme Deep Factorization Machine (xDeepFM)* | Collaborative Filtering | Deep learning based algorithm for implicit and explicit feedback with user/item features. It works in the CPU/GPU environment. | [Quick start](examples/00_quick_start/xdeepfm_criteo.ipynb) | | FastAI Embedding Dot Bias (FAST) | Collaborative Filtering | General purpose algorithm with embeddings and biases for users and items. It works in the CPU/GPU environment. | [Quick start](examples/00_quick_start/fastai_movielens.ipynb) | -| LightFM/Hybrid Matrix Factorization | Hybrid | Hybrid matrix factorization algorithm for both implicit and explicit feedbacks. It works in the CPU environment. | [Quick start](examples/02_model_hybrid/lightfm_deep_dive.ipynb) | +| LightFM/Factorization Machine | Collaborative Filtering | Factorization Machine algorithm for both implicit and explicit feedbacks. It works in the CPU environment. | [Quick start](examples/02_model_collaborative_filtering/lightfm_deep_dive.ipynb) | | LightGBM/Gradient Boosting Tree* | Content-Based Filtering | Gradient Boosting Tree algorithm for fast training and low memory usage in content-based problems. It works in the CPU/GPU/PySpark environments. | [Quick start in CPU](examples/00_quick_start/lightgbm_tinycriteo.ipynb) / [Deep dive in PySpark](examples/02_model_content_based_filtering/mmlspark_lightgbm_criteo.ipynb) | | LightGCN | Collaborative Filtering | Deep learning algorithm which simplifies the design of GCN for predicting implicit feedback. It works in the CPU/GPU environment. | [Deep dive](examples/02_model_collaborative_filtering/lightgcn_deep_dive.ipynb) | -| GeoIMC* | Hybrid | Matrix completion algorithm that has into account user and item features using Riemannian conjugate gradients optimization and following a geometric approach. It works in the CPU environment. | [Quick start](examples/00_quick_start/geoimc_movielens.ipynb) | +| GeoIMC* | Collaborative Filtering | Matrix completion algorithm that has into account user and item features using Riemannian conjugate gradients optimization and following a geometric approach. It works in the CPU environment. | [Quick start](examples/00_quick_start/geoimc_movielens.ipynb) | | GRU | Collaborative Filtering | Sequential-based algorithm that aims to capture both long and short-term user preferences using recurrent neural networks. It works in the CPU/GPU environment. 
| [Quick start](examples/00_quick_start/sequential_recsys_amazondataset.ipynb) | | Multinomial VAE | Collaborative Filtering | Generative model for predicting user/item interactions. It works in the CPU/GPU environment. | [Deep dive](examples/02_model_collaborative_filtering/multi_vae_deep_dive.ipynb) | | Neural Recommendation with Long- and Short-term User Representations (LSTUR)* | Content-Based Filtering | Neural recommendation algorithm for recommending news articles with long- and short-term user interest modeling. It works in the CPU/GPU environment. | [Quick start](examples/00_quick_start/lstur_MIND.ipynb) | @@ -108,8 +108,8 @@ The table below lists the recommender algorithms currently available in the repo | Surprise/Singular Value Decomposition (SVD) | Collaborative Filtering | Matrix factorization algorithm for predicting explicit rating feedback in small datasets. It works in the CPU/GPU environment. | [Deep dive](examples/02_model_collaborative_filtering/surprise_svd_deep_dive.ipynb) | | Term Frequency - Inverse Document Frequency (TF-IDF) | Content-Based Filtering | Simple similarity-based algorithm for content-based recommendations with text datasets. It works in the CPU environment. | [Quick start](examples/00_quick_start/tfidf_covid.ipynb) | | Vowpal Wabbit (VW)* | Content-Based Filtering | Fast online learning algorithms, great for scenarios where user features / context are constantly changing. It uses the CPU for online learning. | [Deep dive](examples/02_model_content_based_filtering/vowpal_wabbit_deep_dive.ipynb) | -| Wide and Deep | Hybrid | Deep learning algorithm that can memorize feature interactions and generalize user features. It works in the CPU/GPU environment. | [Quick start](examples/00_quick_start/wide_deep_movielens.ipynb) | -| xLearn/Factorization Machine (FM) & Field-Aware FM (FFM) | Hybrid | Quick and memory efficient algorithm to predict labels with user/item features. It works in the CPU/GPU environment. | [Deep dive](examples/02_model_hybrid/fm_deep_dive.ipynb) | +| Wide and Deep | Collaborative Filtering | Deep learning algorithm that can memorize feature interactions and generalize user features. It works in the CPU/GPU environment. | [Quick start](examples/00_quick_start/wide_deep_movielens.ipynb) | +| xLearn/Factorization Machine (FM) & Field-Aware FM (FFM) | Collaborative Filtering | Quick and memory efficient algorithm to predict labels with user/item features. It works in the CPU/GPU environment. | [Deep dive](examples/02_model_collaborative_filtering/fm_deep_dive.ipynb) | **NOTE**: * indicates algorithms invented/contributed by Microsoft. 
@@ -130,7 +130,7 @@ We provide a [benchmark notebook](examples/06_benchmarks/movielens.ipynb) to ill | [BPR](examples/02_model_collaborative_filtering/cornac_bpr_deep_dive.ipynb) | 0.132478 | 0.441997 | 0.388229 | 0.212522 | N/A | N/A | N/A | N/A | | [FastAI](examples/00_quick_start/fastai_movielens.ipynb) | 0.025503 | 0.147866 | 0.130329 | 0.053824 | 0.943084 | 0.744337 | 0.285308 | 0.287671 | | [LightGCN](examples/02_model_collaborative_filtering/lightgcn_deep_dive.ipynb) | 0.088526 | 0.419846 | 0.379626 | 0.144336 | N/A | N/A | N/A | N/A | -| [NCF](examples/02_model_hybrid/ncf_deep_dive.ipynb) | 0.107720 | 0.396118 | 0.347296 | 0.180775 | N/A | N/A | N/A | N/A | +| [NCF](examples/02_model_collaborative_filtering/ncf_deep_dive.ipynb) | 0.107720 | 0.396118 | 0.347296 | 0.180775 | N/A | N/A | N/A | N/A | | [SAR](examples/00_quick_start/sar_movielens.ipynb) | 0.110591 | 0.382461 | 0.330753 | 0.176385 | 1.253805 | 1.048484 | -0.569363 | 0.030474 | | [SVD](examples/02_model_collaborative_filtering/surprise_svd_deep_dive.ipynb) | 0.012873 | 0.095930 | 0.091198 | 0.032783 | 0.938681 | 0.742690 | 0.291967 | 0.291971 | diff --git a/examples/02_model_hybrid/README.md b/examples/02_model_hybrid/README.md deleted file mode 100644 index c268cc0bf9..0000000000 --- a/examples/02_model_hybrid/README.md +++ /dev/null @@ -1,10 +0,0 @@ -# Deep dive in hybrid algorithms - -In this directory, notebooks are provided to give a deep dive of hybrid recommendation algorithms. The notebooks make use of the utility functions ([recommenders](../../recommenders)) available in the repo. - -| Notebook | Environment | Description | -| --- | --- | --- | -| [fm_deep_dive](fm_deep_dive.ipynb) | Python CPU | Deep dive into factorization machine (FM) and field-aware FM (FFM) algorithm. -| [lightfm_deep_dive](lightfm_deep_dive.ipynb) | Python CPU | Deep dive into hybrid matrix factorisation model with LightFM. - -Details on model training are best found inside each notebook. diff --git a/examples/README.md b/examples/README.md index 10967bdca3..365fcf08b0 100644 --- a/examples/README.md +++ b/examples/README.md @@ -17,11 +17,11 @@ The following summarizes each directory of the best practice notebooks. 
| [01_prepare_data](01_prepare_data) | Yes | Data preparation notebooks for each recommender algorithm| | [02_model_collaborative_filtering](02_model_collaborative_filtering) | Yes | Deep dive notebooks about model training and evaluation using collaborative filtering algorithms | | [02_model_content_based_filtering](02_model_content_based_filtering) | Yes |Deep dive notebooks about model training and evaluation using content-based filtering algorithms | -| [02_model_hybrid](02_model_hybrid) | Yes | Deep dive notebooks about model training and evaluation using hybrid algorithms | | [03_evaluate](03_evaluate) | Yes | Notebooks that introduce different evaluation methods for recommenders | | [04_model_select_and_optimize](04_model_select_and_optimize) | Some local, some on Azure | Best practice notebooks for model tuning and selecting by using Azure Machine Learning Service and/or open source technologies | | [05_operationalize](05_operationalize) | No, Run on Azure | Operationalization notebooks that illustrate an end-to-end pipeline by using a recommender algorithm for a certain real-world use case scenario | | [06_benchmarks](06_benchmarks) | Yes | Benchmark comparison of several recommender algorithms | +| [07_tutorials](07_tutorials) | Yes | Tutorials for using the Recommenders library | ## On-premise notebooks From 44e624be96b51f9bbac617a29a727d642007eab1 Mon Sep 17 00:00:00 2001 From: miguelgfierro Date: Fri, 29 Dec 2023 09:09:55 +0100 Subject: [PATCH 3/5] change path hybrid Signed-off-by: miguelgfierro --- tests/conftest.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/tests/conftest.py b/tests/conftest.py index 7063c47fc4..12c636d8f7 100644 --- a/tests/conftest.py +++ b/tests/conftest.py @@ -319,10 +319,10 @@ def notebooks(): "cornac_bivae_deep_dive.ipynb", ), "xlearn_fm_deep_dive": os.path.join( - folder_notebooks, "02_model_hybrid", "fm_deep_dive.ipynb" + folder_notebooks, "02_model_collaborative_filtering", "fm_deep_dive.ipynb" ), "lightfm_deep_dive": os.path.join( - folder_notebooks, "02_model_hybrid", "lightfm_deep_dive.ipynb" + folder_notebooks, "02_model_collaborative_filtering", "lightfm_deep_dive.ipynb" ), "evaluation": os.path.join(folder_notebooks, "03_evaluate", "evaluation.ipynb"), "evaluation_diversity": os.path.join( From d3fce82fae82f287621b3bf08ac25aac04aa0493 Mon Sep 17 00:00:00 2001 From: miguelgfierro Date: Fri, 29 Dec 2023 09:14:38 +0100 Subject: [PATCH 4/5] change path hybrid Signed-off-by: miguelgfierro --- GLOSSARY.md | 4 +- .../fm_deep_dive.ipynb | 912 ++++++++ .../lightfm_deep_dive.ipynb | 1956 +++++++++++++++++ 3 files changed, 2869 insertions(+), 3 deletions(-) create mode 100644 examples/02_model_collaborative_filtering/fm_deep_dive.ipynb create mode 100755 examples/02_model_collaborative_filtering/lightfm_deep_dive.ipynb diff --git a/GLOSSARY.md b/GLOSSARY.md index 1829c10575..7f325b4a27 100644 --- a/GLOSSARY.md +++ b/GLOSSARY.md @@ -9,7 +9,7 @@ Licensed under the MIT License. * **Click-through rate (CTR)**: Ratio of the number of users who click on a link over the total number of users that visited the page. CTR is a measure of the user engagement. -* **Cold-start problem**: The cold start problem concerns the recommendations for users with no or few past history (new users). Providing recommendations to users with small past history becomes a difficult problem for collaborative filtering models because their learning and predictive ability is limited. 
Multiple research have been conducted in this direction using content-based filtering models or hybrid models. These models use auxiliary information like user or item metadata to overcome the cold start problem. +* **Cold-start problem**: The cold start problem concerns the recommendations for users with no or few past history (new users). Providing recommendations to users with small past history becomes a difficult problem for collaborative filtering models because their learning and predictive ability is limited. Multiple research have been conducted in this direction using content-based filtering models. These models use auxiliary information like user or item metadata to overcome the cold start problem. * **Collaborative filtering algorithms (CF)**: CF algorithms make prediction of what is the likelihood of a user selecting an item based on the behavior of other users [1]. It assumes that if user A likes item X and Y, and user B likes item X, user B would probably like item Y. See the [list of CF examples in Recommenders repository](examples/02_model_collaborative_filtering). @@ -21,8 +21,6 @@ Licensed under the MIT License. * **Explicit interaction data**: When a user explicitly rate an item, typically between 1-5, the user is giving a value on the likeliness of the item. -* **Hybrid filtering algorithms**: This type of recommendation system can implement a combination of collaborative and content-based filtering models. See the [list of examples in Recommenders repository](examples/02_model_hybrid). - * **Implicit interaction data**: Implicit interactions are views or clicks that show a certain interest of the user about a specific items. These kind of data is more common but it doesn't define the intention of the user as clearly as the explicit data. * **Item information**: These include information about the item, some examples can be name, description, price, etc. diff --git a/examples/02_model_collaborative_filtering/fm_deep_dive.ipynb b/examples/02_model_collaborative_filtering/fm_deep_dive.ipynb new file mode 100644 index 0000000000..04d782e5a6 --- /dev/null +++ b/examples/02_model_collaborative_filtering/fm_deep_dive.ipynb @@ -0,0 +1,912 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Copyright (c) Recommenders contributors.\n", + "\n", + "Licensed under the MIT License." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Factorization Machine Deep Dive\n", + "\n", + "Factorization machine (FM) is one of the representative algorithms that are used for building hybrid recommenders model. The algorithm is powerful in terms of capturing the effects of not just the input features but also their interactions. The algorithm provides better generalization capability and expressiveness compared to other classic algorithms such as SVMs. The most recent research extends the basic FM algorithms by using deep learning techniques, which achieve remarkable improvement in a few practical use cases.\n", + "\n", + "This notebook presents a deep dive into the Factorization Machine algorithm, and demonstrates some best practices of using the contemporary FM implementations like [`xlearn`](https://github.com/aksnzhy/xlearn) for dealing with tasks like click-through rate prediction." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 1 Factorization Machine" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 1.1 Factorization Machine" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "FM is an algorithm that uses factorization in prediction tasks with data set of high sparsity. The algorithm was original proposed in [\\[1\\]](https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf). Traditionally, the algorithms such as SVM do not perform well in dealing with highly sparse data that is usually seen in many contemporary problems, e.g., click-through rate prediction, recommendation, etc. FM handles the problem by modeling not just first-order linear components for predicting the label, but also the cross-product of the feature variables in order to capture more generalized correlation between variables and label. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In certain occasions, the data that appears in recommendation problems, such as user, item, and feature vectors, can be encoded into a one-hot representation. Under this arrangement, classical algorithms like linear regression and SVM may suffer from the following problems:\n", + "1. The feature vectors are highly sparse, and thus it makes it hard to optimize the parameters to fit the model efficienly\n", + "2. Cross-product of features will be sparse as well, and this in turn, reduces the expressiveness of a model if it is designed to capture the high-order interactions between features" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The FM algorithm is designed to tackle the above two problems by factorizing latent vectors that model the low- and high-order components. The general idea of a FM model is expressed in the following equation:" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$\\hat{y}(\\textbf{x})=w_{0}+\\sum^{n}_{i=1}w_{i}x_{i}+\\sum^{n}_{i=1}\\sum^{n}_{j=i+1}<\\textbf{v}_{i}, \\textbf{v}_{j}>x_{i}x_{j}$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "where $\\hat{y}$ and $\\textbf{x}$ are the target to predict and input feature vectors, respectively. $w_{i}$ is the model parameters for the first-order component. $<\\textbf{v}_{i}, \\textbf{v}_{j}>$ is the dot product of two latent factors for the second-order interaction of feature variables, and it is defined as " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$<\\textbf{v}_{i}, \\textbf{v}_{j}>=\\sum^{k}_{f=1}v_{i,f}\\cdot v_{j,f}$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compared to using fixed parameter for the high-order interaction components, using the factorized vectors increase generalization as well as expressiveness of the model. In addition to this, the computation complexity of the equation (above) is $O(kn)$ where $k$ and $n$ are the dimensionalities of the factorization vector and input feature vector, respectively (see [the paper](https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf) for detailed discussion). In practice, usually a two-way FM model is used, i.e., only the second-order feature interactions are considered to favor computational efficiency." 
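To make the two-way FM scoring equation above concrete, here is a minimal NumPy sketch (not part of this notebook or of any FM library; the function name `fm_predict` and the random toy parameters are purely illustrative). It scores a single feature vector with the pairwise term rewritten in the $O(kn)$ form discussed above.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Two-way FM score for a single feature vector x.

    x : (n,) input features, w0 : bias, w : (n,) first-order weights,
    V : (n, k) latent factors. The pairwise term uses the O(kn)
    reformulation: 0.5 * sum_f [(V[:, f] @ x)^2 - (V[:, f]**2) @ (x**2)].
    """
    linear = w0 + w @ x
    interaction = 0.5 * np.sum((x @ V) ** 2 - (x ** 2) @ (V ** 2))
    return linear + interaction

# Toy example with random parameters (illustrative only)
rng = np.random.default_rng(42)
n, k = 8, 4                                  # feature count and latent dimensionality
x = rng.random(n)                            # dense toy vector; real FM inputs are typically sparse one-hot features
w0 = 0.1
w = rng.normal(size=n)
V = rng.normal(scale=0.1, size=(n, k))
print(fm_predict(x, w0, w, V))
```

The reformulation avoids looping over all $n(n-1)/2$ feature pairs explicitly, which is what makes two-way FMs practical on high-dimensional sparse inputs.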
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 1.2 Field-Aware Factorization Machine" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Field-aware factorization machine (FFM) is an extension to FM. It was originally introduced in [\\[2\\]](https://www.csie.ntu.edu.tw/~cjlin/papers/ffm.pdf). The advantage of FFM over FM is that, it uses different factorized latent factors for different groups of features. The \"group\" is called \"field\" in the context of FFM. Putting features into fields resolves the issue that the latent factors shared by features that intuitively represent different categories of information may not well generalize the correlation. \n", + "\n", + "Different from the formula for the 2-order cross product as can be seen above in the FM equation, in the FFM settings, the equation changes to " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "$$\\theta_{\\text{FFM}}(\\textbf{w}\\textbf{x})=\\sum^{n}_{j1=1}\\sum^{n}_{j2=j1+1}<\\textbf{v}_{j1,f2}, \\textbf{v}_{j2,f1}>x_{j1}x_{j2}$$" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "where $f_1$ and $f_2$ are the fields of $j_1$ and $j_2$, respectively." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compared to FM, the computational complexity increases to $O(n^2k)$. However, since the latent factors in FFM only need to learn the effect within the field, so the $k$ values in FFM is usually much smaller than that in FM." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 1.3 FM/FFM extensions" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the recent years, FM/FFM extensions were proposed to enhance the model performance further. The new algorithms leverage the powerful deep learning neural network to improve the generalization capability of the original FM/FFM algorithms. Representatives of the such algorithms are summarized as below. Some of them are implemented and demonstrated in the microsoft/recommenders repository. 
\n", + "\n", + "|Algorithm|Notes|References|Example in Microsoft/Recommenders|\n", + "|---------|-----|----------|---------------------------------|\n", + "|DeepFM|Combination of FM and DNN where DNN handles high-order interactions|[\\[3\\]](https://arxiv.org/abs/1703.04247)|-|\n", + "|xDeepFM|Combination of FM, DNN, and Compressed Interaction Network, for vectorized feature interactions|[\\[4\\]](https://dl.acm.org/citation.cfm?id=3220023)|[notebook](../00_quick_start/xdeepfm_criteo.ipynb) / [utilities](../../recommenders/models/deeprec/models/xDeepFM.py)|\n", + "|Factorization Machine Supported Neural Network|Use FM user/item weight vectors as input layers for DNN model|[\\[5\\]](https://link.springer.com/chapter/10.1007/978-3-319-30671-1_4)|-|\n", + "|Product-based Neural Network|An additional product-wise layer between embedding layer and fully connected layer to improve expressiveness of interactions of features across fields|[\\[6\\]](https://ieeexplore.ieee.org/abstract/document/7837964)|-|\n", + "|Neural Factorization Machines|Improve the factorization part of FM by using stacks of NN layers to improve non-linear expressiveness|[\\[7\\]](https://dl.acm.org/citation.cfm?id=3080777)|-|\n", + "|Wide and deep|Combination of linear model (wide part) and deep neural network model (deep part) for memorisation and generalization|[\\[8\\]](https://dl.acm.org/citation.cfm?id=2988454)|[notebook](../00_quick_start/wide_deep_movielens.ipynb) / [utilities](../../recommenders/models/wide_deep)|" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 2 Factorization Machine Implementation" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.1 Implementations" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The following table summarizes the implementations of FM/FFM. Some of them (e.g., xDeepFM and VW) are implemented and/or demonstrated in the microsoft/recommenders repository" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "|Implementation|Language|Notes|Examples in Microsoft/Recommenders|\n", + "|-----------------|------------------|------------------|---------------------|\n", + "|[libfm](https://github.com/srendle/libfm)|C++|Implementation of FM algorithm|-|\n", + "|[libffm](https://github.com/ycjuan/libffm)|C++|Original implemenation of FFM algorithm. It is handy in model building, but does not support Python interface|-|\n", + "|[xlearn](https://github.com/aksnzhy/xlearn)|C++ with Python interface|More computationally efficient compared to libffm without loss of modeling effectiveness|[notebook](fm_deep_dive.ipynb)|\n", + "|[Vowpal Wabbit FM](https://github.com/VowpalWabbit/vowpal_wabbit/wiki/Matrix-factorization-example)|Online library with estimator API|Easy to use by calling API|[notebook](../02_model_content_based_filtering/vowpal_wabbit_deep_dive.ipynb) / [utilities](../../recommenders/models/vowpal_wabbit)\n", + "|[microsoft/recommenders xDeepFM](../../recommenders/models/deeprec/models/xDeepFM.py)|Python|Support flexible interface with different configurations of FM and FM extensions, i.e., LR, FM, and/or CIN|[notebook](../00_quick_start/xdeepfm_criteo.ipynb) / [utilities](../../recommenders/models/deeprec/models/xDeepFM.py)|" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Other than `libfm` and `libffm`, all the other three can be used in a Python environment. 
\n", + "\n", + "* A deep dive of using Vowbal Wabbit for FM model can be found [here](../02_model_content_based_filtering/vowpal_wabbit_deep_dive.ipynb)\n", + "* A quick start of Microsoft xDeepFM algorithm can be found [here](../00_quick_start/xdeepfm_criteo.ipynb). \n", + "\n", + "Therefore, in the example below, only code examples and best practices of using `xlearn` are presented." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.2 xlearn" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Setups for using `xlearn`.\n", + "\n", + "1. `xlearn` is implemented in C++ and has Python bindings, so it can be directly installed as a Python package from PyPI. The installation of `xlearn` is enabled in the [Recommenders repo environment setup script](../../tools/generate_conda_file.py). One can follow the general setup steps to install the environment as required, in which `xlearn` is installed as well.\n", + "2. NOTE `xlearn` may require some base libraries installed as prerequisites in the system, e.g., `cmake`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "After a succesful creation of the environment, one can load the packages to run `xlearn` in a Jupyter notebook or Python script." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "System version: 3.6.13 |Anaconda, Inc.| (default, Feb 23 2021, 21:15:04) \n", + "[GCC 7.3.0]\n", + "Xlearn version: 0.4.0\n" + ] + } + ], + "source": [ + "import os\n", + "import sys\n", + "from tempfile import TemporaryDirectory\n", + "import xlearn as xl\n", + "from sklearn.metrics import roc_auc_score\n", + "import numpy as np\n", + "import pandas as pd\n", + "import seaborn as sns\n", + "%matplotlib notebook\n", + "from matplotlib import pyplot as plt\n", + "\n", + "from recommenders.utils.constants import SEED\n", + "from recommenders.utils.timer import Timer\n", + "from recommenders.datasets.download_utils import maybe_download, unzip_file\n", + "from recommenders.tuning.parameter_sweep import generate_param_grid\n", + "from recommenders.datasets.pandas_df_utils import LibffmConverter\n", + "from recommenders.utils.notebook_utils import store_metadata\n", + "\n", + "print(\"System version: {}\".format(sys.version))\n", + "print(\"Xlearn version: {}\".format(xl.__version__))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the FM model building, data is usually represented in the libsvm data format. That is, `label feat1:val1 feat2:val2 ...`, where `label` is the target to predict, and `val` is the value to each feature `feat`.\n", + "\n", + "FFM algorithm requires data to be represented in the libffm format, where each vector is split into several fields with categorical/numerical features inside. That is, `label field1:feat1:val1 field2:feat2:val2 ...`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the Microsoft/Recommenders utility functions, [a libffm converter](../../recommenders/dataset/pandas_df_utils.py) is provided to achieve the transformation from a tabular feature vectors to the corresponding libffm representation. For example, the following shows how to transform the format of a synthesized data by using the module of `LibffmConverter`." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
ratingfield1field2field3field4
011:1:12:4:33:5:1.04:6:1
101:2:12:4:43:5:2.04:7:1
201:3:12:4:53:5:3.04:8:1
311:3:12:4:63:5:4.04:9:1
411:3:12:4:73:5:5.04:10:1
\n", + "
" + ], + "text/plain": [ + " rating field1 field2 field3 field4\n", + "0 1 1:1:1 2:4:3 3:5:1.0 4:6:1\n", + "1 0 1:2:1 2:4:4 3:5:2.0 4:7:1\n", + "2 0 1:3:1 2:4:5 3:5:3.0 4:8:1\n", + "3 1 1:3:1 2:4:6 3:5:4.0 4:9:1\n", + "4 1 1:3:1 2:4:7 3:5:5.0 4:10:1" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "df_feature_original = pd.DataFrame(\n", + " {\n", + " \"rating\": [1, 0, 0, 1, 1],\n", + " \"field1\": [\"xxx1\", \"xxx2\", \"xxx4\", \"xxx4\", \"xxx4\"],\n", + " \"field2\": [3, 4, 5, 6, 7],\n", + " \"field3\": [1.0, 2.0, 3.0, 4.0, 5.0],\n", + " \"field4\": [\"1\", \"2\", \"3\", \"4\", \"5\"],\n", + " }\n", + ")\n", + "\n", + "converter = LibffmConverter().fit(df_feature_original, col_rating=\"rating\")\n", + "df_out = converter.transform(df_feature_original)\n", + "df_out\n" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "There are in total 4 fields and 10 features.\n" + ] + } + ], + "source": [ + "print(\n", + " \"There are in total {0} fields and {1} features.\".format(\n", + " converter.field_count, converter.feature_count\n", + " )\n", + ")\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To illustrate the use of `xlearn`, the following example uses the [Criteo data set](https://labs.criteo.com/category/dataset/), which has already been processed in the libffm format, for building and evaluating a FFM model built by using `xlearn`. Sometimes, it is important to know the total numbers of fields and features. When building a FFM model, `xlearn` can count these numbers automatically." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "tags": [ + "parameters" + ] + }, + "outputs": [], + "source": [ + "# Model parameters\n", + "LEARNING_RATE = 0.2\n", + "LAMBDA = 0.002\n", + "EPOCH = 10\n", + "OPT_METHOD = \"sgd\" # options are \"sgd\", \"adagrad\" and \"ftrl\"\n", + "\n", + "# The metrics for binary classification options are \"acc\", \"prec\", \"f1\" and \"auc\"\n", + "# for regression, options are \"rmse\", \"mae\", \"mape\"\n", + "METRIC = \"auc\"\n" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "100%|██████████| 10.3k/10.3k [00:00<00:00, 55.9kKB/s]\n" + ] + } + ], + "source": [ + "# Paths\n", + "YAML_FILE_NAME = \"xDeepFM.yaml\"\n", + "TRAIN_FILE_NAME = \"cretio_tiny_train\"\n", + "VALID_FILE_NAME = \"cretio_tiny_valid\"\n", + "TEST_FILE_NAME = \"cretio_tiny_test\"\n", + "MODEL_FILE_NAME = \"model.out\"\n", + "OUTPUT_FILE_NAME = \"output.txt\"\n", + "\n", + "tmpdir = TemporaryDirectory()\n", + "\n", + "data_path = tmpdir.name\n", + "yaml_file = os.path.join(data_path, YAML_FILE_NAME)\n", + "train_file = os.path.join(data_path, TRAIN_FILE_NAME)\n", + "valid_file = os.path.join(data_path, VALID_FILE_NAME)\n", + "test_file = os.path.join(data_path, TEST_FILE_NAME)\n", + "model_file = os.path.join(data_path, MODEL_FILE_NAME)\n", + "output_file = os.path.join(data_path, OUTPUT_FILE_NAME)\n", + "\n", + "assets_url = (\n", + " \"https://recodatasets.z20.web.core.windows.net/deeprec/xdeepfmresources.zip\"\n", + ")\n", + "assets_file = maybe_download(assets_url, work_directory=data_path)\n", + "unzip_file(assets_file, data_path)\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The following steps are from the [official documentation 
of `xlearn`](https://xlearn-doc.readthedocs.io/en/latest/index.html) for building a model. To begin with, we do not modify any training parameter values. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "NOTE, if `xlearn` is run through command line, the training process can be displayed in the console." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training time: 14.4424\n" + ] + } + ], + "source": [ + "# Training task\n", + "ffm_model = xl.create_ffm() # Use field-aware factorization machine (ffm)\n", + "ffm_model.setTrain(train_file) # Set the path of training dataset\n", + "ffm_model.setValidate(valid_file) # Set the path of validation dataset\n", + "\n", + "# Parameters:\n", + "# 0. task: binary classification\n", + "# 1. learning rate: 0.2\n", + "# 2. regular lambda: 0.002\n", + "# 3. evaluation metric: auc\n", + "# 4. number of epochs: 10\n", + "# 5. optimization method: sgd\n", + "param = {\n", + " \"task\": \"binary\",\n", + " \"lr\": LEARNING_RATE,\n", + " \"lambda\": LAMBDA,\n", + " \"metric\": METRIC,\n", + " \"epoch\": EPOCH,\n", + " \"opt\": OPT_METHOD,\n", + "}\n", + "\n", + "# Start to train\n", + "# The trained model will be stored in model.out\n", + "with Timer() as time_train:\n", + " ffm_model.fit(param, model_file)\n", + "print(f\"Training time: {time_train}\")\n" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Prediction time: 0.6435\n" + ] + } + ], + "source": [ + "# Prediction task\n", + "ffm_model.setTest(test_file) # Set the path of test dataset\n", + "ffm_model.setSigmoid() # Convert output to 0-1\n", + "\n", + "# Start to predict\n", + "# The output result will be stored in output.txt\n", + "with Timer() as time_predict:\n", + " ffm_model.predict(model_file, output_file)\n", + "print(f\"Prediction time: {time_predict}\")\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The output are the predicted labels (i.e., 1 or 0) for the testing data set. AUC score is calculated to evaluate the model performance." + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "0.7485411618010794" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "with open(output_file) as f:\n", + " predictions = f.readlines()\n", + "\n", + "with open(test_file) as f:\n", + " truths = f.readlines()\n", + "\n", + "truths = np.array([float(truth.split(\" \")[0]) for truth in truths])\n", + "predictions = np.array([float(prediction.strip(\"\")) for prediction in predictions])\n", + "\n", + "auc_score = roc_auc_score(truths, predictions)\n", + "\n", + "print(auc_score)\n" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "data": { + "application/papermill.record+json": { + "auc_score": 0.7498803439718372 + } + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "store_metadata(\"auc_score\", auc_score)\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It can be seen that the model building/scoring process is fast and the model performance is good. 
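For comparison, the same `xlearn` Python API also exposes a plain FM estimator via `xl.create_fm()`, which consumes libsvm-format data (`label feat1:val1 feat2:val2 ...`) rather than libffm. The sketch below is illustrative only: the file names are hypothetical placeholders (no such files are shipped with this notebook) and the hyperparameter values simply mirror the ones set earlier.

```python
import xlearn as xl

# Plain FM counterpart of the FFM run above (sketch; paths are placeholders).
fm_model = xl.create_fm()
fm_model.setTrain("fm_train.txt")      # hypothetical libsvm-format training file
fm_model.setValidate("fm_valid.txt")   # hypothetical libsvm-format validation file

param = {
    "task": "binary",
    "lr": 0.2,        # same learning rate as the FFM example above
    "lambda": 0.002,  # same regularization weight
    "metric": "auc",
    "epoch": 10,
    "opt": "sgd",
    "k": 4,           # dimensionality of the latent factors
}
fm_model.fit(param, "fm_model.out")

fm_model.setTest("fm_test.txt")        # hypothetical libsvm-format test file
fm_model.setSigmoid()                  # map raw scores to [0, 1]
fm_model.predict("fm_model.out", "fm_output.txt")
```

On Criteo-style click data the field-aware variant demonstrated above generally performs somewhat better, at the cost of the larger $O(n^2 k)$ interaction term discussed in Section 1.2.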
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.3 Hyperparameter tuning of `xlearn`" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The following presents a naive approach to tune the parameters of `xlearn`, which is using grid-search of parameter values to find the optimal combinations. It is worth noting that the original [FFM paper](https://www.csie.ntu.edu.tw/~cjlin/papers/ffm.pdf) gave some hints in terms of the impact of parameters on the sampled Criteo dataset. \n", + "\n", + "The following are the parameters that can be tuned in the `xlearn` implementation of FM/FFM algorithm." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "|Parameter|Description|Default value|Notes|\n", + "|-------------|-----------------|------------------|-----------------|\n", + "|`lr`|Learning rate|0.2|Higher learning rate helps fit a model more efficiently but may also result in overfitting.|\n", + "|`lambda`|Regularization parameter|0.00002|The value needs to be selected empirically to avoid overfitting.|\n", + "|`k`|Dimensionality of the latent factors|4|In FFM the effect of k is not that significant as the algorithm itself considers field where `k` can be small to capture the effect of features within each of the fields.|\n", + "|`init`|Model initialization|0.66|-|\n", + "|`epoch`|Number of epochs|10|Using a larger epoch size will help converge the model to its optimal point|" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [], + "source": [ + "param_dict = {\"lr\": [0.0001, 0.001, 0.01], \"lambda\": [0.001, 0.01, 0.1]}\n", + "\n", + "param_grid = generate_param_grid(param_dict)\n" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [], + "source": [ + "auc_scores = []\n", + "\n", + "with Timer() as time_tune:\n", + " for param in param_grid:\n", + " ffm_model = xl.create_ffm()\n", + " ffm_model.setTrain(train_file)\n", + " ffm_model.setValidate(valid_file)\n", + " ffm_model.fit(param, model_file)\n", + "\n", + " ffm_model.setTest(test_file)\n", + " ffm_model.setSigmoid()\n", + " ffm_model.predict(model_file, output_file)\n", + "\n", + " with open(output_file) as f:\n", + " predictions = f.readlines()\n", + "\n", + " with open(test_file) as f:\n", + " truths = f.readlines()\n", + "\n", + " truths = np.array([float(truth.split(\" \")[0]) for truth in truths])\n", + " predictions = np.array(\n", + " [float(prediction.strip(\"\")) for prediction in predictions]\n", + " )\n", + "\n", + " auc_scores.append(roc_auc_score(truths, predictions))\n" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tuning by grid search takes 4.2 min\n" + ] + } + ], + "source": [ + "print(\"Tuning by grid search takes {0:.2} min\".format(time_tune.interval / 60))\n" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
Lambda0.0010.0100.100
LR
0.00010.54810.61220.7210
0.00100.54540.61030.7245
0.01000.54050.61500.7238
\n", + "
" + ], + "text/plain": [ + "Lambda 0.001 0.010 0.100\n", + "LR \n", + "0.0001 0.5481 0.6122 0.7210\n", + "0.0010 0.5454 0.6103 0.7245\n", + "0.0100 0.5405 0.6150 0.7238" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "auc_scores = [float(\"%.4f\" % x) for x in auc_scores]\n", + "auc_scores_array = np.reshape(\n", + " auc_scores, (len(param_dict[\"lr\"]), len(param_dict[\"lambda\"]))\n", + ")\n", + "\n", + "auc_df = pd.DataFrame(\n", + " data=auc_scores_array,\n", + " index=pd.Index(param_dict[\"lr\"], name=\"LR\"),\n", + " columns=pd.Index(param_dict[\"lambda\"], name=\"Lambda\"),\n", + ")\n", + "auc_df\n" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "data": { + "application/javascript": "/* Put everything inside the global mpl namespace */\n/* global mpl */\nwindow.mpl = {};\n\nmpl.get_websocket_type = function () {\n if (typeof WebSocket !== 'undefined') {\n return WebSocket;\n } else if (typeof MozWebSocket !== 'undefined') {\n return MozWebSocket;\n } else {\n alert(\n 'Your browser does not have WebSocket support. ' +\n 'Please try Chrome, Safari or Firefox ≥ 6. ' +\n 'Firefox 4 and 5 are also supported but you ' +\n 'have to enable WebSockets in about:config.'\n );\n }\n};\n\nmpl.figure = function (figure_id, websocket, ondownload, parent_element) {\n this.id = figure_id;\n\n this.ws = websocket;\n\n this.supports_binary = this.ws.binaryType !== undefined;\n\n if (!this.supports_binary) {\n var warnings = document.getElementById('mpl-warnings');\n if (warnings) {\n warnings.style.display = 'block';\n warnings.textContent =\n 'This browser does not support binary websocket messages. ' +\n 'Performance may be slow.';\n }\n }\n\n this.imageObj = new Image();\n\n this.context = undefined;\n this.message = undefined;\n this.canvas = undefined;\n this.rubberband_canvas = undefined;\n this.rubberband_context = undefined;\n this.format_dropdown = undefined;\n\n this.image_mode = 'full';\n\n this.root = document.createElement('div');\n this.root.setAttribute('style', 'display: inline-block');\n this._root_extra_style(this.root);\n\n parent_element.appendChild(this.root);\n\n this._init_header(this);\n this._init_canvas(this);\n this._init_toolbar(this);\n\n var fig = this;\n\n this.waiting = false;\n\n this.ws.onopen = function () {\n fig.send_message('supports_binary', { value: fig.supports_binary });\n fig.send_message('send_image_mode', {});\n if (fig.ratio !== 1) {\n fig.send_message('set_dpi_ratio', { dpi_ratio: fig.ratio });\n }\n fig.send_message('refresh', {});\n };\n\n this.imageObj.onload = function () {\n if (fig.image_mode === 'full') {\n // Full images could contain transparency (where diff images\n // almost always do), so we need to clear the canvas so that\n // there is no ghosting.\n fig.context.clearRect(0, 0, fig.canvas.width, fig.canvas.height);\n }\n fig.context.drawImage(fig.imageObj, 0, 0);\n };\n\n this.imageObj.onunload = function () {\n fig.ws.close();\n };\n\n this.ws.onmessage = this._make_on_message_function(this);\n\n this.ondownload = ondownload;\n};\n\nmpl.figure.prototype._init_header = function () {\n var titlebar = document.createElement('div');\n titlebar.classList =\n 'ui-dialog-titlebar ui-widget-header ui-corner-all ui-helper-clearfix';\n var titletext = document.createElement('div');\n titletext.classList = 'ui-dialog-title';\n titletext.setAttribute(\n 'style',\n 'width: 100%; text-align: center; padding: 3px;'\n );\n 
\");\n if (!fig.cell_info) {\n console.error('Failed to find cell for figure', id, fig);\n return;\n }\n fig.cell_info[0].output_area.element.on(\n 'cleared',\n { fig: fig },\n fig._remove_fig_handler\n );\n};\n\nmpl.figure.prototype.handle_close = function (fig, msg) {\n var width = fig.canvas.width / fig.ratio;\n fig.cell_info[0].output_area.element.off(\n 'cleared',\n fig._remove_fig_handler\n );\n fig.resizeObserverInstance.unobserve(fig.canvas_div);\n\n // Update the output cell to use the data from the current canvas.\n fig.push_to_output();\n var dataURL = fig.canvas.toDataURL();\n // Re-enable the keyboard manager in IPython - without this line, in FF,\n // the notebook keyboard shortcuts fail.\n IPython.keyboard_manager.enable();\n fig.parent_element.innerHTML =\n '';\n fig.close_ws(fig, msg);\n};\n\nmpl.figure.prototype.close_ws = function (fig, msg) {\n fig.send_message('closing', msg);\n // fig.ws.close()\n};\n\nmpl.figure.prototype.push_to_output = function (_remove_interactive) {\n // Turn the data on the canvas into data in the output cell.\n var width = this.canvas.width / this.ratio;\n var dataURL = this.canvas.toDataURL();\n this.cell_info[1]['text/html'] =\n '';\n};\n\nmpl.figure.prototype.updated_canvas_event = function () {\n // Tell IPython that the notebook contents must change.\n IPython.notebook.set_dirty(true);\n this.send_message('ack', {});\n var fig = this;\n // Wait a second, then push the new image to the DOM so\n // that it is saved nicely (might be nice to debounce this).\n setTimeout(function () {\n fig.push_to_output();\n }, 1000);\n};\n\nmpl.figure.prototype._init_toolbar = function () {\n var fig = this;\n\n var toolbar = document.createElement('div');\n toolbar.classList = 'btn-toolbar';\n this.root.appendChild(toolbar);\n\n function on_click_closure(name) {\n return function (_event) {\n return fig.toolbar_button_onclick(name);\n };\n }\n\n function on_mouseover_closure(tooltip) {\n return function (event) {\n if (!event.currentTarget.disabled) {\n return fig.toolbar_button_onmouseover(tooltip);\n }\n };\n }\n\n fig.buttons = {};\n var buttonGroup = document.createElement('div');\n buttonGroup.classList = 'btn-group';\n var button;\n for (var toolbar_ind in mpl.toolbar_items) {\n var name = mpl.toolbar_items[toolbar_ind][0];\n var tooltip = mpl.toolbar_items[toolbar_ind][1];\n var image = mpl.toolbar_items[toolbar_ind][2];\n var method_name = mpl.toolbar_items[toolbar_ind][3];\n\n if (!name) {\n /* Instead of a spacer, we start a new button group. 
*/\n if (buttonGroup.hasChildNodes()) {\n toolbar.appendChild(buttonGroup);\n }\n buttonGroup = document.createElement('div');\n buttonGroup.classList = 'btn-group';\n continue;\n }\n\n button = fig.buttons[name] = document.createElement('button');\n button.classList = 'btn btn-default';\n button.href = '#';\n button.title = name;\n button.innerHTML = '';\n button.addEventListener('click', on_click_closure(method_name));\n button.addEventListener('mouseover', on_mouseover_closure(tooltip));\n buttonGroup.appendChild(button);\n }\n\n if (buttonGroup.hasChildNodes()) {\n toolbar.appendChild(buttonGroup);\n }\n\n // Add the status bar.\n var status_bar = document.createElement('span');\n status_bar.classList = 'mpl-message pull-right';\n toolbar.appendChild(status_bar);\n this.message = status_bar;\n\n // Add the close button to the window.\n var buttongrp = document.createElement('div');\n buttongrp.classList = 'btn-group inline pull-right';\n button = document.createElement('button');\n button.classList = 'btn btn-mini btn-primary';\n button.href = '#';\n button.title = 'Stop Interaction';\n button.innerHTML = '';\n button.addEventListener('click', function (_evt) {\n fig.handle_close(fig, {});\n });\n button.addEventListener(\n 'mouseover',\n on_mouseover_closure('Stop Interaction')\n );\n buttongrp.appendChild(button);\n var titlebar = this.root.querySelector('.ui-dialog-titlebar');\n titlebar.insertBefore(buttongrp, titlebar.firstChild);\n};\n\nmpl.figure.prototype._remove_fig_handler = function (event) {\n var fig = event.data.fig;\n if (event.target !== this) {\n // Ignore bubbled events from children.\n return;\n }\n fig.close_ws(fig, {});\n};\n\nmpl.figure.prototype._root_extra_style = function (el) {\n el.style.boxSizing = 'content-box'; // override notebook setting of border-box.\n};\n\nmpl.figure.prototype._canvas_extra_style = function (el) {\n // this is important to make the div 'focusable\n el.setAttribute('tabindex', 0);\n // reach out to IPython and tell the keyboard manager to turn it's self\n // off when our div gets focus\n\n // location in version 3\n if (IPython.notebook.keyboard_manager) {\n IPython.notebook.keyboard_manager.register_events(el);\n } else {\n // location in version 2\n IPython.keyboard_manager.register_events(el);\n }\n};\n\nmpl.figure.prototype._key_event_extra = function (event, _name) {\n var manager = IPython.notebook.keyboard_manager;\n if (!manager) {\n manager = IPython.keyboard_manager;\n }\n\n // Check for shift+enter\n if (event.shiftKey && event.which === 13) {\n this.canvas_div.blur();\n // select the cell after this one\n var index = IPython.notebook.find_cell_index(this.cell_info[0]);\n IPython.notebook.select(index + 1);\n }\n};\n\nmpl.figure.prototype.handle_save = function (fig, _msg) {\n fig.ondownload(fig, null);\n};\n\nmpl.find_output_cell = function (html_output) {\n // Return the cell and output element which can be found *uniquely* in the notebook.\n // Note - this is a bit hacky, but it is done because the \"notebook_saving.Notebook\"\n // IPython event is triggered only after the cells have been serialised, which for\n // our purposes (turning an active figure into a static one), is too late.\n var cells = IPython.notebook.get_cells();\n var ncells = cells.length;\n for (var i = 0; i < ncells; i++) {\n var cell = cells[i];\n if (cell.cell_type === 'code') {\n for (var j = 0; j < cell.output_area.outputs.length; j++) {\n var data = cell.output_area.outputs[j];\n if (data.data) {\n // IPython >= 3 moved mimebundle to data 
attribute of output\n data = data.data;\n }\n if (data['text/html'] === html_output) {\n return [cell, data, j];\n }\n }\n }\n }\n};\n\n// Register the function which deals with the matplotlib target/channel.\n// The kernel may be null if the page has been refreshed.\nif (IPython.notebook.kernel !== null) {\n IPython.notebook.kernel.comm_manager.register_target(\n 'matplotlib',\n mpl.mpl_figure_comm\n );\n}\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/html": [ + "" + ], + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "fig, ax = plt.subplots()\n", + "sns.heatmap(auc_df, cbar=False, annot=True, fmt=\".4g\")\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "More advanced tuning methods like Bayesian Optimization can be used for searching for the optimal model efficiently. The benefit of using, for example, `HyperDrive` from Azure Machine Learning Services, for tuning the parameters, is that, the tuning tasks can be distributed across nodes of a cluster and the optimization can be run concurrently to save the total cost.\n", + "\n", + "* Details about how to tune hyper parameters by using Azure Machine Learning Services can be found [here](https://github.com/microsoft/recommenders/tree/master/notebooks/04_model_select_and_optimize).\n", + "* Note, to enable the tuning task on Azure Machine Learning Services by using HyperDrive, one needs a Docker image to containerize the environment where `xlearn` can be run. The Docker file provided [here](https://github.com/microsoft/recommenders/tree/master/docker) can be used for such purpose." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.4 Clean up" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [], + "source": [ + "tmpdir.cleanup()\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## References" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "1. Rendle, Steffen. \"Factorization machines.\" 2010 IEEE International Conference on Data Mining. IEEE, 2010.\n", + "2. Juan, Yuchin, et al. \"Field-aware factorization machines for CTR prediction.\" Proceedings of the 10th ACM Conference on Recommender Systems. ACM, 2016.\n", + "3. Guo, Huifeng, et al. \"DeepFM: a factorization-machine based neural network for CTR prediction.\" arXiv preprint arXiv:1703.04247, 2017.\n", + "4. Lian, Jianxun, et al. \"xdeepfm: Combining explicit and implicit feature interactions for recommender systems.\" Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. ACM, 2018.\n", + "5. Qu, Yanru, et al. \"Product-based neural networks for user response prediction.\" 2016 IEEE 16th International Conference on Data Mining (ICDM). IEEE, 2016.\n", + "6. Zhang, Weinan, Tianming Du, and Jun Wang. \"Deep learning over multi-field categorical data.\" European conference on information retrieval. Springer, Cham, 2016.\n", + "7. He, Xiangnan, and Tat-Seng Chua. \"Neural factorization machines for sparse predictive analytics.\" Proceedings of the 40th International ACM SIGIR conference on Research and Development in Information Retrieval. ACM, 2017.\n", + "8. Cheng, Heng-Tze, et al. 
\"Wide & deep learning for recommender systems.\" Proceedings of the 1st workshop on deep learning for recommender systems. ACM, 2016.\n", + "9. Langford, John, Lihong Li, and Alex Strehl. \"Vowpal wabbit online learning project.\", 2007." + ] + } + ], + "metadata": { + "celltoolbar": "Tags", + "kernelspec": { + "display_name": "recommenders", + "language": "python", + "name": "conda-env-recommenders-py" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.13" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/examples/02_model_collaborative_filtering/lightfm_deep_dive.ipynb b/examples/02_model_collaborative_filtering/lightfm_deep_dive.ipynb new file mode 100755 index 0000000000..5ce4b79151 --- /dev/null +++ b/examples/02_model_collaborative_filtering/lightfm_deep_dive.ipynb @@ -0,0 +1,1956 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Copyright (c) Recommenders contributors.\n", + "\n", + "Licensed under the MIT License." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# LightFM - hybrid matrix factorisation on MovieLens (Python, CPU)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This notebook explains the concept of a hybrid matrix factorisation based model for recommendation, it also outlines the steps to construct a pure matrix factorisation and a hybrid models using the [LightFM](https://github.com/lyst/lightfm) package. It also demonstrates how to extract both user and item affinity from a fitted hybrid model.\n", + "\n", + "## 1. Hybrid matrix factorisation model\n", + "\n", + "### 1.1 Background\n", + "\n", + "In general, most recommendation models can be divided into two categories:\n", + "- Content based model,\n", + "- Collaborative filtering model.\n", + "\n", + "The content-based model recommends based on similarity of the items and/or users using their description/metadata/profile. On the other hand, collaborative filtering model (discussion is limited to matrix factorisation approach in this notebook) computes the latent factors of the users and items. It works based on the assumption that if a group of people expressed similar opinions on an item, these peole would tend to have similar opinions on other items. For further background and detailed explanation between these two approaches, the reader can refer to machine learning literatures [3, 4].\n", + "\n", + "The choice between the two models is largely based on the data availability. For example, the collaborative filtering model is usually adopted and effective when sufficient ratings/feedbacks have been recorded for a group of users and items.\n", + "\n", + "However, if there is a lack of ratings, content based model can be used provided that the metadata of the users and items are available. This is also the common approach to address the cold-start issues, where there are insufficient historical collaborative interactions available to model new users and/or items.\n", + "\n", + "\n", + "\n", + "### 1.2 Hybrid matrix factorisation algorithm\n", + "\n", + "In view of the above problems, there have been a number of proposals to address the cold-start issues by combining both content-based and collaborative filtering approaches. The hybrid matrix factorisation model is among one of the solutions proposed [1]. 
\n", + "\n", + "In general, most hybrid approaches proposed different ways of assessing and/or combining the feature data in conjunction with the collaborative information.\n", + "\n", + "### 1.3 LightFM package \n", + "\n", + "LightFM is a Python implementation of a hybrid recommendation algorithms for both implicit and explicit feedbacks [1].\n", + "\n", + "It is a hybrid content-collaborative model which represents users and items as linear combinations of their content features’ latent factors. The model learns **embeddings or latent representations of the users and items in such a way that it encodes user preferences over items**. These representations produce scores for every item for a given user; items scored highly are more likely to be interesting to the user.\n", + "\n", + "The user and item embeddings are estimated for every feature, and these features are then added together to be the final representations for users and items. \n", + "\n", + "For example, for user i, the model retrieves the i-th row of the feature matrix to find the features with non-zero weights. The embeddings for these features will then be added together to become the user representation e.g. if user 10 has weight 1 in the 5th column of the user feature matrix, and weight 3 in the 20th column, the user 10’s representation is the sum of embedding for the 5th and the 20th features multiplying their corresponding weights. The representation for each items is computed in the same approach. \n", + "\n", + "#### 1.3.1 Modelling approach\n", + "\n", + "Let $U$ be the set of users and $I$ be the set of items, and each user can be described by a set of user features $f_{u} \\subset F^{U}$ whilst each items can be described by item features $f_{i} \\subset F^{I}$. Both $F^{U}$ and $F^{I}$ are all the features which fully describe all users and items. \n", + "\n", + "The LightFM model operates based binary feedbacks, the ratings will be normalised into two groups. The user-item interaction pairs $(u,i) \\in U\\times I$ are the union of positive (favourable reviews) $S^+$ and negative interactions (negative reviews) $S^-$ for explicit ratings. For implicit feedbacks, these can be the observed and not observed interactions respectively.\n", + "\n", + "For each user and item feature, their embeddings are $e_{f}^{U}$ and $e_{f}^{I}$ respectively. Furthermore, each feature is also has a scalar bias term ($b_U^f$ for user and $b_I^f$ for item features). The embedding (latent representation) of user $u$ and item $i$ are the sum of its respective features’ latent vectors:\n", + "\n", + "$$ \n", + "q_{u} = \\sum_{j \\in f_{u}} e_{j}^{U}\n", + "$$\n", + "\n", + "$$\n", + "p_{i} = \\sum_{j \\in f_{i}} e_{j}^{I}\n", + "$$\n", + "\n", + "Similarly the biases for user $u$ and item $i$ are the sum of its respective bias vectors. These variables capture the variation in behaviour across users and items:\n", + "\n", + "$$\n", + "b_{u} = \\sum_{j \\in f_{u}} b_{j}^{U}\n", + "$$\n", + "\n", + "$$\n", + "b_{i} = \\sum_{j \\in f_{i}} b_{j}^{I}\n", + "$$\n", + "\n", + "In LightFM, the representation for each user/item is a linear weighted sum of its feature vectors.\n", + "\n", + "The prediction for user $u$ and item $i$ can be modelled as sigmoid of the dot product of user and item vectors, adjusted by its feature biases as follows:\n", + "\n", + "$$\n", + "\\hat{r}_{ui} = \\sigma (q_{u} \\cdot p_{i} + b_{u} + b_{i})\n", + "$$\n", + "\n", + "As the LightFM is constructed to predict binary outcomes e.g. 
$S^+$ and $S^-$, the function $\sigma()$ is based on the [sigmoid function](https://mathworld.wolfram.com/SigmoidFunction.html). \n", + "\n", + "The LightFM algorithm estimates the latent vectors and bias terms for the features. The model is fitted by maximising the likelihood of the data conditional on the parameters described above, using stochastic gradient descent. The likelihood can be expressed as follows:\n", + "\n", + "$$\n", + "L = \prod_{(u,i) \in S^+}\hat{r}_{ui} \times \prod_{(u,i) \in S^-}(1 - \hat{r}_{ui})\n", + "$$\n", + "\n", + "Note that if the feature latent vectors are not available, the algorithm behaves like a [logistic matrix factorisation model](http://stanford.edu/~rezab/nips2014workshop/submits/logmat.pdf)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 2. Movie recommender with LightFM using only explicit feedback" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.1 Import libraries" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "System version: 3.8.13 (default, Mar 28 2022, 11:38:47) \n", + "[GCC 7.5.0]\n", + "LightFM version: 1.16\n" + ] + } + ], + "source": [ + "import os\n", + "import sys\n", + "import itertools\n", + "import pandas as pd\n", + "import numpy as np\n", + "import matplotlib.pyplot as plt\n", + "import seaborn as sns\n", + "\n", + "import lightfm\n", + "from lightfm import LightFM\n", + "from lightfm.data import Dataset\n", + "from lightfm import cross_validation\n", + "from lightfm.evaluation import precision_at_k as lightfm_prec_at_k\n", + "from lightfm.evaluation import recall_at_k as lightfm_recall_at_k\n", + "\n", + "from recommenders.evaluation.python_evaluation import precision_at_k, recall_at_k\n", + "from recommenders.utils.timer import Timer\n", + "from recommenders.datasets import movielens\n", + "from recommenders.models.lightfm.lightfm_utils import (\n", + " track_model_metrics,\n", + " prepare_test_df,\n", + " prepare_all_predictions,\n", + " compare_metric,\n", + " similar_users,\n", + " similar_items,\n", + ")\n", + "from recommenders.utils.notebook_utils import store_metadata\n", + "\n", + "print(\"System version: {}\".format(sys.version))\n", + "print(\"LightFM version: {}\".format(lightfm.__version__))\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.2 Defining variables" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "tags": [ + "parameters" + ] + }, + "outputs": [], + "source": [ + "# Select MovieLens data size\n", + "MOVIELENS_DATA_SIZE = '100k'\n", + "\n", + "# default number of recommendations\n", + "K = 10\n", + "# percentage of data used for testing\n", + "TEST_PERCENTAGE = 0.25\n", + "# model learning rate\n", + "LEARNING_RATE = 0.25\n", + "# no of latent factors\n", + "NO_COMPONENTS = 20\n", + "# no of epochs to fit model\n", + "NO_EPOCHS = 20\n", + "# no of threads to fit model\n", + "NO_THREADS = 32\n", + "# regularisation for both user and item features\n", + "ITEM_ALPHA = 1e-6\n", + "USER_ALPHA = 1e-6\n", + "\n", + "# seed for pseudo-random number generation\n", + "SEED = 42" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.2 Retrieve data" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": 
[ + "100%|██████████| 4.81k/4.81k [00:00<00:00, 6.13kKB/s]\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
userIDitemIDratinggenre
153894772944.0Comedy
485285762592.0Children's|Comedy
200067161353.0Drama|Mystery|Sci-Fi|Thriller
2284222293.0Action|Adventure|Comedy|Crime
959756558964.0Drama
\n", + "
" + ], + "text/plain": [ + " userID itemID rating genre\n", + "15389 477 294 4.0 Comedy\n", + "48528 576 259 2.0 Children's|Comedy\n", + "20006 716 135 3.0 Drama|Mystery|Sci-Fi|Thriller\n", + "2284 222 29 3.0 Action|Adventure|Comedy|Crime\n", + "95975 655 896 4.0 Drama" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "data = movielens.load_pandas_df(\n", + " size=MOVIELENS_DATA_SIZE,\n", + " genres_col='genre',\n", + " header=[\"userID\", \"itemID\", \"rating\"]\n", + ")\n", + "# quick look at the data\n", + "data.sample(5, random_state=SEED)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.3 Prepare data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before fitting the LightFM model, we need to create an instance of `Dataset` which holds the interaction matrix." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [], + "source": [ + "dataset = Dataset()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `fit` method creates the user/item id mappings." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Num users: 943, num_topics: 1682.\n" + ] + } + ], + "source": [ + "dataset.fit(users=data['userID'], \n", + " items=data['itemID'])\n", + "\n", + "# quick check to determine the number of unique users and items in the data\n", + "num_users, num_topics = dataset.interactions_shape()\n", + "print(f'Num users: {num_users}, num_topics: {num_topics}.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next is to build the interaction matrix. The `build_interactions` method returns 2 COO sparse matrices, namely the `interactions` and `weights` matrices." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "(interactions, weights) = dataset.build_interactions(data.iloc[:, 0:3].values)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "LightLM works slightly differently compared to other packages as it expects the train and test sets to have same dimension. Therefore the conventional train test split will not work.\n", + "\n", + "The package has included the `cross_validation.random_train_test_split` method to split the interaction data and splits it into two disjoint training and test sets. \n", + "\n", + "However, note that **it does not validate the interactions in the test set to guarantee all items and users have historical interactions in the training set**. Therefore this may result into a partial cold-start problem in the test set." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [], + "source": [ + "train_interactions, test_interactions = cross_validation.random_train_test_split(\n", + " interactions, test_percentage=TEST_PERCENTAGE,\n", + " random_state=np.random.RandomState(SEED))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Double check the size of both the train and test sets." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Shape of train interactions: (943, 1682)\n", + "Shape of test interactions: (943, 1682)\n" + ] + } + ], + "source": [ + "print(f\"Shape of train interactions: {train_interactions.shape}\")\n", + "print(f\"Shape of test interactions: {test_interactions.shape}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.4 Fit the LightFM model" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this notebook, the LightFM model will be using the weighted Approximate-Rank Pairwise (WARP) as the loss. Further explanation on the topic can be found [here](https://making.lyst.com/lightfm/docs/examples/warp_loss.html#learning-to-rank-using-the-warp-loss).\n", + "\n", + "\n", + "In general, it maximises the rank of positive examples by repeatedly sampling negative examples until a rank violation has been located. This approach is recommended when only positive interactions are present." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "model1 = LightFM(loss='warp', no_components=NO_COMPONENTS, \n", + " learning_rate=LEARNING_RATE, \n", + " random_state=np.random.RandomState(SEED))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The LightFM model can be fitted with the following code:" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [], + "source": [ + "model1.fit(interactions=train_interactions,\n", + " epochs=NO_EPOCHS);" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.5 Prepare model evaluation data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before we can evaluate the fitted model and to get the data into a format which is compatible with the existing evaluation methods within this repo, the data needs to be massaged slightly.\n", + "\n", + "First the train/test indices need to be extracted from the `lightfm.cross_validation` method as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [], + "source": [ + "uids, iids, interaction_data = cross_validation._shuffle(\n", + " interactions.row, interactions.col, interactions.data, \n", + " random_state=np.random.RandomState(SEED))\n", + "\n", + "cutoff = int((1.0 - TEST_PERCENTAGE) * len(uids))\n", + "test_idx = slice(cutoff, None)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Then the the mapping between internal and external representation of the user and item are extracted as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [], + "source": [ + "uid_map, ufeature_map, iid_map, ifeature_map = dataset.mapping()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once the train/test indices and mapping are ready, the test dataframe can be constructed as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Took 1.3 seconds for prepare and predict test data.\n" + ] + } + ], + "source": [ + "with Timer() as test_time:\n", + " test_df = prepare_test_df(test_idx, uids, iids, uid_map, iid_map, weights)\n", + "print(f\"Took {test_time.interval:.1f} seconds for prepare 
and predict test data.\") \n", + "time_reco1 = test_time.interval" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And samples of the test dataframe:" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
userIDitemIDrating
145426163283.0
22507382403.0
160248082945.0
1545834625.0
15002110383.0
\n", + "
" + ], + "text/plain": [ + " userID itemID rating\n", + "14542 616 328 3.0\n", + "2250 738 240 3.0\n", + "16024 808 294 5.0\n", + "15458 346 2 5.0\n", + "15002 110 38 3.0" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "test_df.sample(5, random_state=SEED)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In addition, the predictions of all unseen user-item pairs (e.g. removing those seen in the training data) can be prepared as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Took 316.4 seconds for prepare and predict all data.\n" + ] + } + ], + "source": [ + "with Timer() as test_time:\n", + " all_predictions = prepare_all_predictions(data, uid_map, iid_map, \n", + " interactions=train_interactions,\n", + " model=model1, \n", + " num_threads=NO_THREADS)\n", + "print(f\"Took {test_time.interval:.1f} seconds for prepare and predict all data.\")\n", + "time_reco2 = test_time.interval" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Samples of the `all_predictions` dataframe:" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
userIDitemIDprediction
11000841031291-84.180405
9285266381274-28.313053
93421401576-49.781242
568732021177-55.072628
1270199751462-53.741127
\n", + "
" + ], + "text/plain": [ + " userID itemID prediction\n", + "1100084 103 1291 -84.180405\n", + "928526 638 1274 -28.313053\n", + "93421 40 1576 -49.781242\n", + "56873 202 1177 -55.072628\n", + "1270199 75 1462 -53.741127" + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "all_predictions.sample(5, random_state=SEED)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that the **raw prediction values from the LightFM model are for ranking purposes only**, they should not be used directly. The magnitude and sign of these values do not have any specific interpretation." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 2.6 Model evaluation" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once the evaluation data are ready, they can be passed into to the repo's evaluation methods as follows. The performance of the model will be tracked using both Precision@K and Recall@K.\n", + "\n", + "In addition, the results have also being compared with those computed from LightFM's own evaluation methods to ensure accuracy." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "------ Using Repo's evaluation methods ------\n", + "Precision@K:\t0.131601\n", + "Recall@K:\t0.038056\n", + "\n", + "------ Using LightFM evaluation methods ------\n", + "Precision@K:\t0.131601\n", + "Recall@K:\t0.038056\n" + ] + } + ], + "source": [ + "with Timer() as test_time:\n", + " eval_precision = precision_at_k(rating_true=test_df, \n", + " rating_pred=all_predictions, k=K)\n", + " eval_recall = recall_at_k(test_df, all_predictions, k=K)\n", + "time_reco3 = test_time.interval\n", + "\n", + "with Timer() as test_time:\n", + " eval_precision_lfm = lightfm_prec_at_k(model1, test_interactions, \n", + " train_interactions, k=K).mean()\n", + " eval_recall_lfm = lightfm_recall_at_k(model1, test_interactions, \n", + " train_interactions, k=K).mean()\n", + "time_lfm = test_time.interval\n", + " \n", + "print(\n", + " \"------ Using Repo's evaluation methods ------\",\n", + " f\"Precision@K:\\t{eval_precision:.6f}\",\n", + " f\"Recall@K:\\t{eval_recall:.6f}\",\n", + " \"\\n------ Using LightFM evaluation methods ------\",\n", + " f\"Precision@K:\\t{eval_precision_lfm:.6f}\",\n", + " f\"Recall@K:\\t{eval_recall_lfm:.6f}\", \n", + " sep='\\n')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 3. Movie recommender with LightFM using explicit feedbacks and additional item and user features" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the LightFM was designed to incorporates both user and item metadata, the model can be extended to include additional features such as movie genres and user occupations." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 3.1 Extract and prepare movie genres" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this notebook, the movie's genres will be used as the item metadata. 
As the genres have already been loaded during the initial data import, they can be processed directly as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [], + "source": [ + "# split the genre based on the separator\n", + "movie_genre = [x.split('|') for x in data['genre']]" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['Action',\n", + " 'Adventure',\n", + " 'Animation',\n", + " \"Children's\",\n", + " 'Comedy',\n", + " 'Crime',\n", + " 'Documentary',\n", + " 'Drama',\n", + " 'Fantasy',\n", + " 'Film-Noir',\n", + " 'Horror',\n", + " 'Musical',\n", + " 'Mystery',\n", + " 'Romance',\n", + " 'Sci-Fi',\n", + " 'Thriller',\n", + " 'War',\n", + " 'Western',\n", + " 'unknown']" + ] + }, + "execution_count": 19, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# retrieve all the unique genres in the data\n", + "all_movie_genre = sorted(list(set(itertools.chain.from_iterable(movie_genre))))\n", + "# quick look at all the genres within the data\n", + "all_movie_genre" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 3.2 Retrieve and prepare user occupations" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Further user features can be included as part of the model fitting process. In this notebook, **only the occupation of each user will be included**, but the feature list can be extended easily.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### 3.2.1 Retrieve and merge data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The user features can be retrieved directly from the GroupLens website and merged with the existing data as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
userIDitemIDratinggenreoccupation
826946981743.0Action|Adventureprogrammer
987698247481.0Action|Romance|Thrillerother
633513877894.0Comedy|Dramaentertainment
680015415964.0Animation|Children's|Musicalstudent
750001611352.0Drama|Mystery|Sci-Fi|Thrillerlawyer
\n", + "
" + ], + "text/plain": [ + " userID itemID rating genre occupation\n", + "82694 698 174 3.0 Action|Adventure programmer\n", + "98769 824 748 1.0 Action|Romance|Thriller other\n", + "63351 387 789 4.0 Comedy|Drama entertainment\n", + "68001 541 596 4.0 Animation|Children's|Musical student\n", + "75000 161 135 2.0 Drama|Mystery|Sci-Fi|Thriller lawyer" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "user_feature_URL = 'http://files.grouplens.org/datasets/movielens/ml-100k/u.user'\n", + "columns = ['userID','age','gender','occupation','zipcode']\n", + "user_data = pd.read_table(user_feature_URL, sep='|', header=None, names=columns)\n", + "\n", + "# merging user feature with existing data\n", + "new_data = data.merge(user_data[['userID','occupation']], left_on='userID', right_on='userID')\n", + "# quick look at the merged data\n", + "new_data.sample(5, random_state=SEED)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### 3.2.2 Extract and prepare user occupations" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [], + "source": [ + "# retrieve all the unique occupations in the data\n", + "all_occupations = sorted(list(set(new_data['occupation'])))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 3.3 Prepare data and features" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Similar to the previous model, the data is required to be converted into a `Dataset` instance and then create a user/item id mapping with the `fit` method." + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [], + "source": [ + "dataset2 = Dataset()\n", + "dataset2.fit(data['userID'], \n", + " data['itemID'], \n", + " item_features=all_movie_genre,\n", + " user_features=all_occupations)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The movie genres are then converted into a item feature matrix using the `build_item_features` method as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [], + "source": [ + "item_features = dataset2.build_item_features((x, y) for x,y in zip(data.itemID, movie_genre))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The user occupations are then converted into an user feature matrix using the `build_user_features` method as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [], + "source": [ + "user_features = dataset2.build_user_features((x, [y]) for x,y in zip(new_data.userID, new_data['occupation']))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Once the item and user features matrices have been completed, the next steps are similar as before, which is to build the interaction matrix and split the interactions into train and test sets as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": {}, + "outputs": [], + "source": [ + "interactions2, weights2 = dataset2.build_interactions(data.iloc[:, 0:3].values)\n", + "\n", + "train_interactions2, test_interactions2 = cross_validation.random_train_test_split(\n", + " interactions2, \n", + " test_percentage=TEST_PERCENTAGE,\n", + " random_state=np.random.RandomState(SEED)\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 3.3 Fit the LightFM model with additional 
user and item features" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The parameters of the second model will be similar to the first model to facilitates comparison.\n", + "\n", + "The model performance at each epoch is also tracked by the same metrics as before." + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [], + "source": [ + "model2 = LightFM(loss='warp', no_components=NO_COMPONENTS, \n", + " learning_rate=LEARNING_RATE, \n", + " item_alpha=ITEM_ALPHA,\n", + " user_alpha=USER_ALPHA,\n", + " random_state=np.random.RandomState(SEED)\n", + " )" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The LightFM model can then be fitted:" + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "model2.fit(interactions=train_interactions2,\n", + " user_features=user_features,\n", + " item_features=item_features,\n", + " epochs=NO_EPOCHS\n", + " )" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 3.4 Prepare model evaluation data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Similar to the previous model, the evaluation data needs to be prepared in order to get them into a format consumable with this repo's evaluation methods.\n", + "\n", + "Firstly the train/test indices and id mappings are extracted using the new interations matrix as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": {}, + "outputs": [], + "source": [ + "uids, iids, interaction_data = cross_validation._shuffle(\n", + " interactions2.row, \n", + " interactions2.col, \n", + " interactions2.data, \n", + " random_state=np.random.RandomState(SEED)\n", + ")\n", + "\n", + "uid_map, ufeature_map, iid_map, ifeature_map = dataset2.mapping()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The test dataframe is then constructed as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Took 1.3 seconds for prepare and predict test data.\n" + ] + } + ], + "source": [ + "with Timer() as test_time:\n", + " test_df2 = prepare_test_df(test_idx, uids, iids, uid_map, iid_map, weights2)\n", + "print(f\"Took {test_time.interval:.1f} seconds for prepare and predict test data.\") " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The predictions of all unseen user-item pairs can be prepared as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Took 161.9 seconds for prepare and predict all data.\n" + ] + } + ], + "source": [ + "with Timer() as test_time:\n", + " all_predictions2 = prepare_all_predictions(data, uid_map, iid_map, \n", + " interactions=train_interactions2,\n", + " user_features=user_features,\n", + " item_features=item_features,\n", + " model=model2,\n", + " num_threads=NO_THREADS)\n", + "\n", + "print(f\"Took {test_time.interval:.1f} seconds for prepare and predict all data.\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 3.5 Model evaluation and comparison" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The 
predictive performance of the new model can be computed and compared with the previous model (which used only the explicit ratings) as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "------ Using only explicit ratings ------\n", + "Precision@K:\t0.131601\n", + "Recall@K:\t0.038056\n", + "\n", + "------ Using both implicit and explicit ratings ------\n", + "Precision@K:\t0.145599\n", + "Recall@K:\t0.051338\n" + ] + } + ], + "source": [ + "eval_precision2 = precision_at_k(rating_true=test_df2, \n", + " rating_pred=all_predictions2, k=K)\n", + "eval_recall2 = recall_at_k(test_df2, all_predictions2, k=K)\n", + "\n", + "print(\n", + " \"------ Using only explicit ratings ------\",\n", + " f\"Precision@K:\\t{eval_precision:.6f}\",\n", + " f\"Recall@K:\\t{eval_recall:.6f}\",\n", + " \"\\n------ Using both implicit and explicit ratings ------\",\n", + " f\"Precision@K:\\t{eval_precision2:.6f}\",\n", + " f\"Recall@K:\\t{eval_recall2:.6f}\",\n", + " sep='\\n')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The new model, which used the explicit ratings together with the additional user and item features, performed consistently better than the previous model, which used only the explicit ratings, thus highlighting the benefit of including such additional features in the model." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 3.6 Evaluation metrics comparison" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that the evaluation approaches here are for demonstration purposes only.\n", + "\n", + "If the reader is using the LightFM package and/or its models, LightFM's built-in evaluation methods are much more efficient and are the recommended approach for production usage, as they are designed and optimised to work with the package. If the reader wants to compare LightFM with other algorithms in the Recommenders repository, it is better to use the evaluation tools in Recommenders.\n", + "\n", + "As a comparison, the times recorded to evaluate model1 are shown as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "------ Using Repo's evaluation methods ------\n", + "Time [sec]:\t320.6\n", + "\n", + "------ Using LightFM evaluation methods ------\n", + "Time [sec]:\t0.2\n" + ] + } + ], + "source": [ + "print(\n", + " \"------ Using Repo's evaluation methods ------\",\n", + " f\"Time [sec]:\\t{(time_reco1+time_reco2+time_reco3):.1f}\",\n", + " \"\\n------ Using LightFM evaluation methods ------\",\n", + " f\"Time [sec]:\\t{time_lfm:.1f}\",\n", + " sep='\\n')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 4. Evaluate model fitting process" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In addition to the inclusion of the extra user and item features, the model fitting process can also be monitored in order to determine whether the model is being trained properly. \n", + "\n", + "This notebook also uses the `track_model_metrics` method, which plots the model's metrics, e.g. 
Precision@K and Recall@K as model fitting progresses.\n",
+    "\n",
+    "For the first model (using only explicit data), the model fitting progress is shown as follows:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 34,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "image/png": "<base64-encoded PNG omitted: line plots of Precision@K and Recall@K per training epoch for the first model>",
+      "text/plain": [
+       "<Figure>"
+      ]
+     },
+     "metadata": {
+      "needs_background": "light"
+     },
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "output1, _ = track_model_metrics(model=model1, \n",
+    "                                 train_interactions=train_interactions, \n",
+    "                                 test_interactions=test_interactions, \n",
+    "                                 k=K,\n",
+    "                                 no_epochs=NO_EPOCHS, \n",
+    "                                 no_threads=NO_THREADS)"
+   ]
+  },
" + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "output1, _ = track_model_metrics(model=model1, \n", + " train_interactions=train_interactions, \n", + " test_interactions=test_interactions, \n", + " k=K,\n", + " no_epochs=NO_EPOCHS, \n", + " no_threads=NO_THREADS)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The second model (with both implicit and explicit data) fitting progress:" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAeIAAADQCAYAAADbLGKxAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAAuQUlEQVR4nO3de5hcVZnv8e+vIdiYC5AQEkBjuERjUCdicxmZoB7BiYiAR+R2dFAZg6OImmGOOKMSQT1cnIAKMwaFEVFBxMGJTASRi0aHWwMRCBdJQoQguSO5QGNCveePvSupVLqrqrtq966q/n2ep5+u2te1O9n11lp7rXcpIjAzM7N8dORdADMzs6HMgdjMzCxHDsRmZmY5ciA2MzPLkQOxmZlZjhyIzczMcuRA3IQkTZV0VIX1XZK+mdG53y7peUkLJD0q6ZwGHXeepF0rrP+upCmNOJdZuSa6px6T9PUGH3+ipIdLznVjI49v2dsx7wJYr6YCXcC88hWSdoyIbqA7w/PPj4ijJQ0HFkj6eUTcX1aGzf05YET0+SGYrv/7AZbVrBZTaY57amfgAUk3RMTvMjyftRDXiDOQfkN9TNL3JP1B0g8lHSHpd5KekHRwut1wSVdKukfSA5KOlbQTcC5wYvoN+kRJsyRdLel3wNWl33oljZD0H5IekvSgpPc36joiYiNwH7B/L2UYK+mnku5Nfw6rVB5JSyXtnl7zf0v6vaSHJZ2Yrr9DUlf6+uR0/4clXVDyd90g6avpvndJGteoa7Xm1kb31IvAAmDv9FzvknSnpPsl/UTSiHT5QZL+J/2/fo+kkenfYH667f2S3tqoclnOIsI/Df4BJgKbgTeSfNm5D7gSEHAs8LN0u68BH0xf7wr8ARgOfBi4tOR4s9Jj7Jy+fztwY/r6AuCSkm1366U8F5Pc/OU/Z/eybemxxwBLgQN6KcOPgL9JX08AHq1UnvQ4uwPvB75Tsn6X9PcdJDWWvYCngLEkLTa3Acel2wTw3vT1hcAX8v639s/g/LTRPbVbet7x6f3wG2B4uu5zwJeAnYAlwEHp8lHpvfBKoDNdNgnoLvnbPFx+Lv+0zo+bprPzZEQ8BCBpIXBrRISkh0huHIB3AcdIOit930kS1HozN5Jv0+WOAE4qvomI58o3iIjP9rPs0yQ9ABSA8yNioaQPlJXhCGCKpOI+o9Jv89XK8xDwr2lN98aImF+2/iDgjohYBSDph8DhwM+AvwDF51/3AUf287qstbX6PfV7kgB6SUQsl3Q0MAX4XXof7QTcCbwOeDYi7k3PtQ6S2j5wqaSpwMvAa/tZBmtSDsTZeankdaHkfYGtf3cB74+Ix0t3lHRIL8fbONCCSLoYeEcvq66NiPN7WT4/Io6uUoYO4NCI6Ck7V8WyRMQfJB0IHAV8RdKtEXFuxZ222hQRxeToL+P/v0NNy99TkvYB7pJ0XVrWWyLi5LJjv7GP034WWAH8Fcn919PHdtZi/Iw4XzcDn1IavSS9OV2+HhhZ4zFuAT5ZfCNpt/INIuKzETG1l5/ePjBq9UvgUyXnnVpLeSTtBbwQET8ALgIOLDvuPcDb0ufJOwAnA7+uo5w2tDT1PRURTwLnkzRD3wUcJmn/9DzDJb0WeBzYU9JB6fKRknYEdiGpKReADwE71Hg91uQciPN1HjAMeDBtajsvXX47SbPvgmJnpgq+AuyWdmz6Pb1/S8/CmUBX2pnlEeDjNZbnjcA9khYA56TbbxERzwJnk/wNfg/cFxH/ld1lWJtphXvq2ySPW4rPrq+R9CBJs/TkiPgLcCLwrfT8t5A0sf8bcGq6bDJ11OituWhrS5+ZmZkNNteIzczMcuRAbGZmliMHYjMzsxw5EJuZmeWobQLx9OnTgyTzkn/8k8dPS/J945+cf4w2CsSrV6/OuwhmNZM0XdLjkhZJOruX9a+Q9ON0/d2SJqbLJ0p6MR2Gs0DSt0v2eUuaH3mRpG8Wx9JW4vvGLH9tE4jNWkWaqOQy4N0kKQ5P1vZTQJ4GPBcR+5PkNb6gZN3ikgQSHy9Z/u/Ax0jSKE4Cpmd1DWbWOA7EZoPvYGBRRCxJkzdcSzJxQaljgavS19cD76xUw5W0JzAqIu5K04B+Hziu4SU3s4ZzILaWUCgES1Zt4M7Fq1myagOFQks/XtobeLrk/bJ0Wa/bRDL38/Mks2EB7JNO8fdrSdNKtl9W5ZgASJohqVtS96pVq+q7EmtqbXbftC0nzbemVygENy1czszrFtCzqUDnsA5mnzCV6QeMp6Oj6mPQdvMsMCEi1kh6C/AzSQf05wARcTlwOUBXV5c/mdtU7vdNoQBrF8P65TByPIzeDzpc9+uN/yrW9Jau2bjlwwSgZ1OBmdctYOmalk21+wzw6pL3r0qX9bpNScL/NRHxUkSsAYiI+4DFJNPhPZMep9IxbQjJ9b4pFOCxn8OcaXDV0cnvx36eLLftOBBb01uxrmfLh0lRz6YCK9e37Cxw9wKTJO0jaSeSuW/nlm0zFzg1fX08cFs69+7YtLMXkvYl6ZS1JJ0sY52kQ9NnyX8HeLKMISzX+2btYrjhdNiUTve86cXk/drF2Z+7Bblp2preuFGddA7r2OZDpXNYB3uM7Kz5GIVCsHTNRlas62HcqE4mjhmeW7N2RGyWdAbJlH07AFdGxEJJ5wLdETEXuAK4WtIiYC1bJ6o/HDhX0iaSeXg/HhFr03WfAL4H7Az8Iv2xIaoR982ArV++NQgXbXoRNiyH3Sdlf/4Wk2kgljQd+AbJh813y+fqlPRxknk/XwY2ADMi4pF03edJhnC8DJwZETdnWVZrXhPHDGf2CVO3e9Y1cczwmvbP/VlZLyJiHjCvbNmXSl73AB/oZb+fAj/t45jdwBsaW1JrVfXeN3UZOR6G7bxtMB62M4wYn/25W1Bm0yCmzWd/AI4k6cF5L3ByMdCm24yKiHXp62
OAT0TE9HRM5TUkwzz2An4FvDYiXu7rfF1dXdHd3Z3JtVj+ijXalet72GPk9jXaSjXeJas2cNQ3529XM5h35jT2HTuiUUVsyV5jvm/aW7X7JsMTJ8+Ei83Tw3aG982Bye8t77DVkvdNo2VZI94yVhJAUnGs5JZAXAzCqeFsTXl2LHBtRLwEPJk2zx1MMnG2DUEdHWLfsSN6DZzVaryVnpXtO3ZE1WbrZmrWNuuPSvdNxidOgu7pU5Lm6BHuNV1JloG4t7GSh5RvJOmTwExgJ+B/lex7V9m+242JlDQDmAEwYcKEhhTaWk9fvUMnpzXeSs/KqgXxZmzWNhs09QxB6uhIngf7mXBVuX89iYjLImI/4HPAF/q57+UR0RURXWPHjs2mgNb0qvUOLT4r6xyW/HcvfVZWbYhHGw6dMquNhyANmixrxLWMlSx1LUmu3IHsaw2SZzPtQI9drXdoR4eYfsB4Jp85bbtnZdWarautN2tbfQ1BOn2Ka7kNlmUg3jJWkiSIngScUrqBpEkR8UT69j1A8fVc4EeSZpN01poE3JNhWY3qz1qzbKat59i19A7t61lZtSCe6xAQszx5CNKgyaxpOs2PWxwr+ShwXXGsZNpDGuAMSQslLSB5Tnxquu9C4DqSjl03AZ+s1GPaGiPrZtpKeW/rOXaxxjvvzGlcO+MQ5p05reYvB5WarWtZb9a2ikOQSnkIUiYyHUdcw1jJT1fY96vAV7MrXb6asSduls209fZsrmagvUMrNVvXst6sqdXT2Wr0fsmQo/IhSKP3q//Ytg1n1spB3j1x+/oSkGUzbT09m7NWLYjnNgTErB61j+XtXaUhSPUe27bhv1gOsmzirWXfmxYu56hvzufk79zNUd+cz00Ll1MoRN3NtJXKVU/PZjMbgEbkey4OQZo4LfldDLLOJd1QrhFXkFXzcZZNvNVUq5kOtJm2Wrnq6dlsZgOQZWcrd+RqKNeI+1Cp5li6zUBqpsWgVKreJt5aa9PVaqbFZthD992dfceO2C4Q9rW+WrlqqfFWO7eZ9UOWna3ckauhHIj7UC2w1BKo+1JLUOoryNc7tVk9XwIqqSXAD7Rns5kNQLGzVTFglne2atZjD0Fumu5Dtebjak28lVRrhq3UzFtvp6asZmSppVzu9GQ2iLLM9+xc0g3lQNyHaoEly+E2lYJ8vYE0q2exuU65Zma9qzPfc8V+Ms4l3TAOxH2oFliyHG5TLcjXG0izqJm6s5VZTjIaz5v3MMuhZEgH4krf9qoFlmqBup4e17X0MG7GJt5mLVezkjQd+AawA/DdiDi/bP0rgO8DbwHWACdGxNKS9RNIss/Nioivp8s+C/w9yZSiDwEfiYjaOhBYc6oUaDMcz1vt8VszJiVqVW0fiPv6z1LLt71KgaWeoTzVuJm3/UnaAbgMOJJkms97Jc2NiEdKNjsNeC4i9pd0EnABcGLJ+tnAL0qOuTdwJjAlIl6UdB1JjvfvZXoxlp1qgTbDiRkqtcxNHDPcteUGausn65V6NjdieruBDuWp5bj19DCuJ+GHDZqDgUURsSQi/kIy+9ixZdscC1yVvr4eeKckAUg6DngSWFi2z47AzpJ2BF4J/Cmb4tugqJY4o9J43jpVGmHh6UEbq60DcaX/LPUOA6qkEcce6JjaeoZV2aDaG3i65P2ydFmv26STqDwPjJE0gmT+7i+XbhwRzwBfB54CngWej4hflp9Y0gxJ3ZK6V61a1aDLaQGFAqx+Ap6cn/xuhXl1qwXaDMfzVhpmmeXn51DU1oG40n+WrMbTQnZjdWvhb6pDwizg4ojYULpQ0m4kteh9SKYPHS7pg+U7R8TlEdEVEV1jx44djPLmr1Unua8WaDMcz1upZS7Pz7h21NaBuNJ/lixzG+eZN9nfVFvGM8CrS96/Kl3W6zZpU/MuJJ22DgEulLQU+Azwz5LOAI4AnoyIVRGxCfhP4K0ZXkPraNXcyNUCbUcHhdcdzfpTb+fPJ9zA+lNvp/C6oxs2nrevljnnhm+stu6sVanTU5bDbfIcyuOJ7FvGvcAkSfuQBNyTgFPKtplLMkf3ncDxwG0REcC04gaSZgEbIuJSSYcAh0p6JfAi8E6gO+sLaQmtmhu5SuKMQiG46ZGVzLxuafoZt5TZJ+yaeacpD1dsrLYOxLXMNZvVcJu8hvK4x3VriIjNaS32ZpLhS1dGxEJJ5wLdETEXuAK4WtIiYC1JsK50zLslXQ/cD2wGHgAuz/I6Wkaxibc0GLdKbuQKiTPqyfBXf7E8XLFRlHzBbn1dXV3R3e0v/7B1yJa/qQ6qlvwDD5n7pk3nz71z8WpO/s7d2y2/dsYhHLrv7jmUqN9a8r5ptLauEQ9V/qZqVqZNcyP7UVR7aO3/hWZmteprkvsW5k5T7cE1YjOzFuVOU+3BgdjMrIU166Mo56KuXaaBuIak9jNJEtRvBlYBH42IP6brXiZJWg/wVEQck2VZzcysMTxzU/9k9pCkJKn9u4EpwMmSppRt9gDQFRFvIsmle2HJuhcjYmr64yBsZtYinOGvf7LsrVA1qX1E3B4RL6Rv7yLJLmRmZi3MGf76J8tAXEtS+1KnUTKlG9CZJqa/K51pxszMWoBzUfdPU/TfTxPTdwEXlSx+TUR0kaT9u0TSdlnMh+wsMmZmTczDqvony85atSS1R9IRwL8Ab4uIl4rL0yndiIglku4A3gxsk6E9Ii4nTeHX1dXVHinCzMxanIdV9U+WgbhqUntJbwbmANMjYmXJ8t2AFyLiJUm7A4exbUcuMzNrYs06rKoZZRaIa0xqfxEwAviJJNg6TOn1wBxJBZLm8/Mj4pGsympmNhR5rG9zyHQccUTMA+aVLftSyesj+tjvf4A3Zlk2M2tBhUIyh/D65cmMSs2SL7pZy1WBx/o2D2fWMrPWkOcMSpUCbYvO7LR0zUYuuukRvjqtk3H6MyvYjYtueoTJ40e6OXmQORCbWWtYu3hrsIPk9w2nJzMq9TJXb8NUC7T1liun2vRzG3u48pAV7DN/5pbrOnDabJ7b2AMOxIOqeb+umbUxSdMlPS5pkaSze1n/Ckk/TtffLWli2foJkjZIOqtk2a6Srpf0mKRHJf31IFzK4Fm/fGuwK9r0YjKtYZb6CrRrF9dfrmKQnzMNrjo6+f3Yz5PlGXvtjqu2BmGATS+yz/yZTNrRQ0EHmwOx2SCrMf3racBzEbE/cDFwQdn62WybAAeSvO43RcRk4K+ARxtd9lyNHJ/URksN2zmZWzhLVQJtYUTv5SoMr6Fc1YJ8hoZvWt3rdQ3ftDrzc9u2HIjNBl/V9K/p+6vS19cD71Q6tCDNNPcksLC4saRdgMOBKwAi4i8R8ecMr2Hwjd4vaRIuBr1iE/Ho7XL9NFaVLwBPazxLD5+9TbmWHj6bp1VDIM6rlg909HFdHSMz/mJj23EgNht8taR/3bJNRGwGngfGSBoBfA74ctn2+5DMYPYfkh6Q9F1J7ZXGqKMjeS57+nz48I3J78HoEFXlC8Cfnn+JU347lp8edA3zD7uKnx50Daf8dizPrnupwkFTedXyIb8vNrYdd
9Yyay2zgIsjYkNaQS7aETgQ+FRE3C3pG8DZwBfLDyBpBjADYMKECZkXuKE6OpIOUFl2zurtnJPfm3S+2rA8CZIlHarGjepk7Qub+cfb/gIMA16oPa9yMRiWdwQbjGBY5bps8DgQmw2+WtK/FrdZJmlHYBdgDXAIcLykC4FdgYKkHpLm62URcXe6//UkgXg7Tg07ABW+ABTzKpePx60pr3LewTCPLza2HQdis8FXNf0rMBc4FbgTOB64LSICmFbcQNIsYENEXJq+f1rS6yLiceCdgLPRDYK68yo7GA55DsRmg6zG9K9XAFdLWgSsJQnW1XwK+KGknYAlwEeyuQIr57zKVg8HYrMc1JD+tQf4QJVjzCp7v4BkOlEzayF+Km9mZpYjB2IzM7McORCbmZnlyIHYzMwsR+6sZWbWxAqFYOmajaxY18O4Uf0cGmUtwYHYzKxJFQrBTQuXb5csZPoB4x2MayDpM8DlEfFC3mWppGrTtKRxkq6Q9Iv0/RRJp2VfNDOzNlEowOon4Mn5ye8apzlcumbjliAM0LOpwMzrFrB0zcYsS9tOPgO8Mu9CVFPLM+LvkSQe2Ct9/weSizMzG1SFQrBk1QbuXLyaJas2UCi0QIbOOuYcXrGuZ0sQLurZVGDl+p6sStuyJA2X9N+Sfi/pYUnnkMSt2yXdnm7z75K6JS2U9OWSfY9K5/G+T9I3Jd1YcswrJd2TTqZSPktaQ9TSNL17RFwn6fOwJSvQy1kUxsysLy3bTNvXnMOnT6ma1nLcqE46h3VsE4xrnlBi6JkO/Cki3gNbpgb9CPCOiChOsvwvEbE2nRP8VklvIqlczgEOj4gnJV1Tcsx/IUkv+1FJuwL3SPpVRDS0SaKWGvFGSWOAAJB0KMmUbGZmg6Zlm2nrmHO4OKFE57Dko7pfE0oMPQ8BR0q6QNK0iOgtTp0g6X7gAeAAYAowGVgSEU+m25QG4ncBZ0taANwBdAINn7KslhrxTJIE9PtJ+h0wliQJvZnZoKnUTNvUOZ6Lcw6XBuMa5xyue0KJISQi/iDpQOAo4CuSbi1dn06ychZwUEQ8J+l7JIG1EgHvTydSyUzVGnFE3A+8DXgrcDpwQEQ8WMvBJU2X9LikRZK2m5JN0kxJj0h6UNKtkl5Tsu5USU+kP6fWfklm1o6KzbSlWqKZdvR+xHHfToIvwLCdk/c1zjlcnFDi0H13Z9+xIxyE+yBpL+CFiPgBcBHJ/NzrgZHpJqOAjcDzksYB706XPw7sK2li+v7EksPeDHxK6eTfkt6cRdmr1ogl/V3ZogMlERHfr7LfDsBlwJHAMuBeSXMjonRqtgeAroh4QdI/ABcCJ0oaDZxDksA+gPvSfZ+r+crMrK3UNe9vjgqIOzoOZd0h1zIm/swa7cqojsm8HTmjUmO9EbhIUgHYBPwD8NfATZL+FBHvkPQA8BjwNPA7gIh4UdIn0u02kkxTWnQecAnwoKQO4Eng6EYXvJam6YNKXneSzHN6P1AxEAMHA4siYgmApGuBYymZIzUibi/Z/i7gg+nrvwVuiYi16b63kDyIL227N7MhpFWbaZeu2cgnfvRA2qw+DNhI57AHmHfmtOZuUm8xEXEzSQ22VDfwrZJtPtzH7rdHxOS05ntZuh8R8SJJS3CmqgbiiPhU6fu059i1NRx7b5JvHUXLgEMqbH8a8IsK++5dvoOkGcAMgAkTGv783MyaTD3z/lbNUFUoJD2c1y9PnuuO3g866q+zVnu27cxZTeFj6SPQnUhaaucM5skHkllrI7BPIwsh6YMkzdBv689+EXE5cDlAV1dXCwwoNLM8VB36VBzrWxxmNGxneN8cmPzeuoNxpSFILTskq81ExMXAxXmdv5bMWj+XNDf9uZHkwfYNNRz7GeDVJe9flS4rP/4RJGO1jomIl/qzr1krq6Ez4ysk/Thdf3dJZ5Li+gmSNkg6q2z5DmnygRszvoSWUXXoU19jfdcurvvclYYgteyQLGuoWmrEXy95vRn4Y0Qsq2G/e4FJaZfxZ4CTgFNKN0h7oM0BpkfEypJVNwNfk7Rb+v5dwOdrOKdZS6ixM+NpwHMRsb+kk4AL2LZH52y2Ps4p9WngUZJeokYNQ58qjfWtknSjmkrPtlt2SJY1VC3PiH89kAOnGbjOIAmqOwBXRsRCSecC3RExl6SL+QjgJ2nv8Kci4pg088l5bO29dm6x45ZZm6jamTF9Pyt9fT1wqSRFREg6jqQH5zZVJ0mvAt4DfJUkB4BRQ4aqOsb61qKvZ9vOnGVQoWla0npJ63r5WS9pXS0Hj4h5EfHaiNgvIr6aLvtSGoSJiCMiYlxETE1/jinZ98qI2D/9+Y96L9SsydTSIXHLNhGxmSSj3RhJI4DPAV9me5cA/xfoM5GxpBlpvt3uVatWDfgCWknVDFWj90ueCZeM9eV9c2oe65tZuWxI6LNGHBEj+1pnZrmaBVwcERvSliQAJB0NrIyI+yS9va+dh2Inx6pDnzo6ko5Zp09JmqNHNK7XdF3lskGRjgY6JSL+rZ/7zUv3+3M956+517SkPShJBxYRT9VzYrMhrpYOicVtlknaEdgFWEMyDPB4SRcCuwIFST0kNehjJB1Fcq+OkvSDiPggVn3oU0dH8jy4l2fCWQ4xqmdIljXMrsAngG0CsaQd09aoXkXEUY04eS2ZtY4B/pVkOqmVwGtIOoIc0IgCmA1RVTszkuR4PxW4kyS/+20REcC04gaSZgEbIuLSdNHn0+VvB85quiCc0VjdLHmIUfNZ37NpwqPPrj9vxbqevcaP6nx28p4jvzCyc1g9lcPzSeZTWECSlasHeI5kQojXSvoZyZfiTuAbaasSkpaSDL0dQdJx8rck6aCfAY5NE4JUVUuN+DzgUOBXEfFmSe9gawYsMxuAGjszXgFcLWkRsJYkWLeuDMfqZqmvIUaTnRkrF+t7Nk34xUPLf/WluQ9PKn4xOveYNxz67jeOP6KOYHw28IaImJp+if3v9H1xRqaPpp2IdyYZ4fDTiFhTdoxJwMkR8TFJ1wHvB35Qy8lrCcSbImKNpA5JHRFxu6RLajm4mfUtIuYB88qWfankdQ/wgSrHmNXH8jtIpm1rHnXMy5snDzFqLo8+u/68YhCG5N/iS3MfnjRx9+HnHbzP6EZNEHRPSRAGOFPS+9LXryYJuuWB+MmIWJC+vg+YWOvJagnEf057ac4HfihpJWVDJszMqspwrG6WPMSouaxY17NXr1+M1vXs1cDTbIlxaQ35COCv0wmK7qD36RNfKnn9MrBzrSerpT3odpJOIp8GbgIWA++t9QRmZsDWsbqlGjhWNyseYtRcxo/qfLbX6TBHdf6pjsOWTpdYbheSxDovSJpM8qi2oWqpEe8I/JLkGdWPgR/30jZuZlZZcaxu+TPijMfq1stDjJrL5D1HfuHcY95waNkz4idev+fILw70mOnj199Jehh4EVhRsvom4OOSHiVJ8XxXfVewPSWdMGvYUHoTSXq99wPLIuKIRhemHl1dXdHd3Z13MWzoaslP5UG/b4q9pgdxrK41tQHdN8Ve0yvX9ey1x6jOP71+
z5FfrLPXdK76M/vSSmA5yQPqPbIpjpm1tQpjdc1qNbJz2FMN7JiVu1pmX/pE+nD6VmAM8LGIeFPWBTMzMxsKaqkRvxr4TEm3bDOzbLRgwg+zetUy+5KnHzSz7LVowg+zevl/t5k1h74SfqxdnG+5zDLmQGxmzaFSwg+zNuZAbGbNoUUTfljrk7SrpE8McN/PSHplPed3IDaz5lBM+FEMxi2S8MPawq4k0yAOxGeAugJxf8YRm5llp6Mj6Zh1+hQn/LDKetZNYMXD57F++V6M3PNZxh3wBTpHNWoaxFtI8macALwCuCEizpE0HLiOZO7wHUhmJhxHMkXw7ZJWR8Q7BnJyB2Izax51JPwoFIKlazayYl0P40Y5DWXb6lk3gUf/61fM+6dJW3rXH3XRobz+2CPqCMal0yC+i2T+74NJMn/NlXQ4MBb4U0S8B0DSLhHxvKSZwDsiYvVAL8lfNc2s5RUKwR2Pr+DB33fz8pL5PPj7+7jj8RUUCrWl8LUWsuLh87YEYUg69M37p0msePi8Bp3hXenPA8D9wGSSaQ8fAo6UdIGkaRHxfIPO5xqxmbW+p9duYN/VtzHx7plbxiAvPXw2T499L6/Zva9JdawlrV++V6+969cvb9Q0iAL+X0TM2W6FdCBwFPAVSbdGxLmNOKFrxGY5kDRd0uOSFkk6u5f1r5D043T93ZImlq2fIGmDpLPS96+WdLukRyQtlPTpQbqUprDz+qVM/M3MbcYgT/zNTF65fmmu5bIMjNzz2V57148c36hpEG8GPippBICkvSXtIWkv4IWI+AFwEXBgL/sOSKaBuIYPm8Ml3S9ps6Tjy9a9LGlB+jM3y3KaDSZJOwCXAe8GpgAnS5pSttlpJHOg7g9cDFxQtn428IuS95uBf4yIKSTzpX6yl2PmrlAIlqzawJ2LV7Nk1YaGNR2P2LSm1zHIwzd5xta2M+6AL3DURU9s07v+qIueYNwb6poGEShOg3gk8CPgTkkPAdeTBNo3AvekHbrOAb6S7n45cJOk2wd6/syapks+bI4ElgH3SpobEY+UbPYU8GHgrF4O8WJETM2qfGY5OhhYFBFLACRdCxwLlN4bxwKz0tfXA5dKUkSEpOOAJ4GNxY0j4lng2fT1+nTu1L3LjpmrQiG4aeFyZl63gOI8srNPmMr0A8bX3amqc7e9kw/k0mA8bOdkubWXzlFP8fpjj2D0fmmv6fF/Ytwbvlhnr2ki4pSyRd8oe7+YpLZcvt+3gG/Vc+4sa8RbPmwi4i9A8cNmi4hYGhEPAoUMy2HWbPYGni55vyxd1us2EbEZeB4YkzaXfQ74cl8HT5ux3wzc3cf6GZK6JXWvWrVqoNfQp75qvUvXbNwShAF6NhWYed0Clq7ZWOlwNekYsx9x3Le3GYMcx32bjjEeg9yWOkc9xWveeipv+N9H8pq3nlpvEM5blp21evuwOaQf+3dK6iZpcjs/In5WvoGkGcAMgAkTJgy8pGatYxZwcURskLavRaaB+qckM6at6+0AEXE5SXMaXV1dDe1WXKnWu2Jdz5YgXNSzqcDK9T3sO3ZEfSfu6ECvPwbGHbBlDLI8BtlaRDP3mn5NRDwjaV/gNkkPRcQ22d+z/EAxy9AzJNOLFr0qXdbbNssk7QjsAqwh+TJ7vKQLSbIBFST1RMSlkoaRBOEfRsR/ZnwNveqr1jv5zGmMG9VJ57CObYJx57AO9hjZ2ZiT1zEG2SxPWX5drOXDpk8R8Uz6ewlwB0lTm1k7uBeYJGkfSTsBJwHlHRLnAqemr48HbovEtIiYGBETgUuAr6VBWMAVwKMRMXtQrqIXlWq9E8cMZ/YJU+kclnzsFGvLE8cMz6OoZk0jyxrxlg8bkgB8ElD+MLxXknYj6Sb+kqTdgcOACzMrqdkgiojNks4g6fixA3BlRCyUdC7QHRFzSYLq1ZIWAWtJ7p9KDgM+BDyU9uoE+OeImJfJRfShUq23o0NMP2A8k8+cxsr1Pewx0tmvzAAUkV2LrqSjSL61Fz9svlr6YSPpIOAGYDegB1geEQdIeiswh6QTVwdwSURcUelcXV1d0d3dndm1mFXRktGk0fdNlj2jrS35PwUZB+LB5EBsOWvJD5Q+75tCAdYuTuYIHtm/yReKOZ9d67Ua+D8Gzd1Zy8zyUCjAYz+HG07fki6S981JZkaqIRh3dIh9x46ovye02RDhvv1mtq21i7cGYUh+33B6stzMGs6B2My2tX55r+ki2bA8n/KYtTkHYjPb1sjx9JpUf8T4fMpj1uYciM1sW6P3S54JlybVf9+cZHnGspoUwqyZubOWmW2royPpmHX6lC3pIvvTa3qg8h76VOztvWJdD+NGube3DR4HYjPbXg7pIiulx8y6B3beXwJsaHPTtJk1hUrpMbOW5cxQZtU4EJtZUyimxyzV0EkhKsjzS4CZA7GZNVahAKufgCfnJ78LtU03nuekEHl+CTDzM2Iza5w6snLlOSlE8UtA+TNizwxlg8GB2Mwap6+sXKdPqanjV17pMT0zlOXJgdjMGqdSVq5B7IE9EM6RbXnxM2Izaxxn5TLrNwdiM2ucHLNymbUqB2KznEiaLulxSYsknd3L+ldI+nG6/m5JE8vWT5C0QdJZtR6zVgNONbklK9d8+PCNye8ap080G6r8jNgsB5J2AC4DjgSWAfdKmhsRj5RsdhrwXETsL+kk4ALgxJL1s4Ff9POYVdWdZSqHrFxmrcxfU83ycTCwKCKWRMRfgGuBY8u2ORa4Kn19PfBOSQKQdBzwJLCwn8esylmmzAaXA7FZPvYGni55vyxd1us2EbEZeB4YI2kE8DngywM4JpJmSOqW1L1q1artCuYsU2aDy4HYrPXMAi6OiA0D2TkiLo+IrojoGjt27HbrnWXKbHA5EJvl4xng1SXvX5Uu63UbSTsCuwBrgEOACyUtBT4D/LOkM2o8ZlV5ppo0G4oy7awlaTrwDWAH4LsRcX7Z+sOBS4A3ASdFxPUl604FvpC+/UpEXIXZQBUKSdan9cuTsa79mV+3nn37di8wSdI+JMHyJOCUsm3mAqcCdwLHA7dFRADTihtImgVsiIhL02Bd7ZhVOcuU2eDKLBDX2IPzKeDDwFll+44GzgG6gADuS/d9LqvyWpOrN5AOMP9xXftWEBGb01rszSRfVK+MiIWSzgW6I2IucAVwtaRFwFqSwNrvYw6kfM4yZTZ4sqwRb+nBCSCp2INzSyCOiKXpuvLpWf4WuCUi1qbrbwGmA9dkWF6rV7VgOdBgWm8wrCf/cZ25kyuJiHnAvLJlXyp53QN8oMoxZlU7ppk1tyyfEdfUg7Oefav1/mxqA5wqrmnPXQyWc6bBVUcnvx/7+dZjV1tfSV/BcO3i2spWKf9xsWx9/T2q7WtmVqeW7qxVrfdn06onKBX3H2ggrffcfakWLGsJpn1dV73BsFL+42p/D+dONrOMZRmI6+nB2ZDen02rnhpevYG02rmrBfmBBstaaqV9XVctwbBSuSvlP67293DuZDPLWJbPiGvpFdqXm4GvSdotff8u4PONL2KdBvrMs5ap4vo6di3PLCu
Vq9K5R+9X+VlspWe1xWBZeuzSYFltfaXrKgbD8vMWg2G1Z8hb8h9PSa5zRMnfpNq/RaV9zcwaILNPkzQTULEH56PAdcVeoZKOAZB0kKRlJB1S5khamO67FjiPJJjfC5xb7LjVb/U24/a1bz0102o1vErHrqdmWe3c9TQvV6s5Vltf6bqqTSRQSwtDMf/xxGlbA2wt/xaV9h2q8uzfYNaGMh1HXEOv0HtJmp172/dK4Mq6ClCtplSp5lht33p601ar4VU6dj01y90nVT73H39XuXZYrfZYqeZYrWZZ7boqTSRQz2T01f4tbFsZDecyG8rae/alas2d9QTaah/+lYJ8taBU6dgTDqscOOppaq23ebnarDuV1tcTEKuVqxI3PfdPhsO5zIaq9g7E1Zpx6wm0lT78a6k1VApKlY5db82y0rmrBcMsa4/1BMR6y+Vp+2pWWL+cjl7ui8L65XT472c2IO0diCsFpXoCLVT+8K+31lAtsGRVs6wWDLOuPQ40ILpWO2g2Dtudkb3cFxuH7c7I/Ipl1tLaOxBXC0oDDbRQX0/cauoJLPUGpXqal/PUrOVqM3/YPJbR02azz/yZW+6LJ6fN5rnNYzkw78KZtaj2DsSVglI9gbb0+L19+NfzzLLasbPe16yC3YZ38tG7x3HGQdewh55nZezCpXcXuOL1nXkXzaxltXcghr6DUj2Bthr3xLU2NXHMcP5p+hRmXreAnk3D6BzW4ykSzerU/oG4kqxqjn5maW3KUySaNd7QDsRZcvOwtSlPkWjWWK6imZmZ5ciB2MzMLEeKiLzL0BCSVgF/rLDJ7sDqQSpOf7hc/dOs5VodEdPzLkR/+b5pOJerf1ryvmm0tgnE1UjqjoiuvMtRzuXqn2YtV7tq1r+3y9U/zVouS7hp2szMLEcOxGZmZjkaSoH48rwL0AeXq3+atVztqln/3i5X/zRruYwh9IzYzMysGQ2lGrGZmVnTcSA2MzPLUdsHYknTJT0uaZGks/MuTylJSyU9JGmBpO4cy3GlpJWSHi5ZNlrSLZKeSH/v1iTlmiXpmfRvtkDSUYNdrqGgWe+bZrln0rL4vrGGaOtALGkH4DLg3cAU4GRJU/It1XbeERFTcx7j9z2gfFD92cCtETEJuDV9P9i+x/blArg4/ZtNjYh5g1ymttcC900z3DPg+8YapK0DMXAwsCgilkTEX4BrgWNzLlPTiYjfAGvLFh8LXJW+vgo4bjDLBH2Wy7Ln+6YGvm+sUdo9EO8NPF3yflm6rFkE8EtJ90makXdhyoyLiGfT18uBcXkWpswZkh5Mm+AGvelvCGjm+6aZ7xnwfWMD0O6BuNn9TUQcSNIE+ElJh+ddoN5EMsatWca5/TuwHzAVeBb411xLY4OtJe4Z8H1jtWv3QPwM8OqS969KlzWFiHgm/b0SuIGkSbBZrJC0J0D6e2XO5QEgIlZExMsRUQC+Q3P9zdpF0943TX7PgO8bG4B2D8T3ApMk7SNpJ+AkYG7OZQJA0nBJI4uvgXcBD1fea1DNBU5NX58K/FeOZdmi+CGXeh/N9TdrF01537TAPQO+b2wAdsy7AFmKiM2SzgBuBnYAroyIhTkXq2gccIMkSP4dfhQRN+VREEnXAG8Hdpe0DDgHOB+4TtJpJNPkndAk5Xq7pKkkTX5LgdMHu1ztronvm6a5Z8D3jTWOU1yamZnlqN2bps3MzJqaA7GZmVmOHIjNzMxy5EBsZmaWIwdiMzOzHDkQW80kvV3SjXmXw6yV+L6xahyIzczMcuRA3IYkfVDSPem8o3Mk7SBpg6SLJS2UdKuksem2UyXdlSaDv6GYDF7S/pJ+Jen3ku6XtF96+BGSrpf0mKQfKs2uYNbqfN9YXhyI24yk1wMnAodFxFTgZeD/AMOB7og4APg1SbYdgO8Dn4uINwEPlSz/IXBZRPwV8FaSRPEAbwY+QzJP7b7AYRlfklnmfN9Ynto6xeUQ9U7gLcC96ZfunUkSzxeAH6fb/AD4T0m7ALtGxK/T5VcBP0nz+e4dETcAREQPQHq8eyJiWfp+ATAR+G3mV2WWLd83lhsH4vYj4KqI+Pw2C6Uvlm030NymL5W8fhn/H7L24PvGcuOm6fZzK3C8pD0AJI2W9BqSf+vj021OAX4bEc8Dz0mali7/EPDriFgPLJN0XHqMV0h65WBehNkg831jufG3sjYTEY9I+gLwS0kdwCbgk8BG4OB03UqS52GQTNX27fQDYwnwkXT5h4A5ks5Nj/GBQbwMs0Hl+8by5NmXhghJGyJiRN7lMGslvm9sMLhp2szMLEeuEZuZmeXINWIzM7McORCbmZnlyIHYzMwsRw7EZmZmOXIgNjMzy9H/B1Ck7YhL5z97AAAAAElFTkSuQmCC", + "text/plain": [ + "
" + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "output2, _ = track_model_metrics(model=model2, \n", + " train_interactions=train_interactions2, \n", + " test_interactions=test_interactions2, \n", + " k=K, \n", + " no_epochs=NO_EPOCHS, \n", + " no_threads=NO_THREADS, \n", + " item_features=item_features,\n", + " user_features=user_features)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These show slightly different behaviour with the two approaches, the reader can then tune the hyperparameters to improve the model fitting process.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 4.1 Performance comparison" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In addition, the model's performance metrics (based on the test dataset) can be plotted together to facilitate easier comparison as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEWCAYAAABxMXBSAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAA9dElEQVR4nO3deXyU5b3//9d7khBCCARCwhZWAQVkKUTAtSrVorWira1Y63Jqtf219nQ77bHfb8/R2tYez+ne+j099qh1rbZupa3WulStGwoWQUBkhyBrEiAJBJLM5/fHfYPDZAIzJJOZJJ/n4zGPzFz39rnvmcxn7uu67+uSmeGcc84lK5LpAJxzznUunjicc86lxBOHc865lHjicM45lxJPHM4551LiicM551xKPHG4Yybpckl/TWK+X0n6t46IqTOQtEzSmZmO42iSfX9d9yO/j6NrkrQeGAg0A/XAk8D1ZlaXybicg0Ofz8+a2TNtXM/V4XpOa4+44tZtwFgzW93e6+7s/Iyja/uomfUGpgEVwLfjZ5CU2+FRdVN+rF1X4YmjGzCzzQRnHCdC8EtK0hclrQJWhWUXSFosaZekVyRNPri8pGGSHpW0Q1KVpF+G5VdLeil8Lkk/kbRd0h5JSyUd3N5vJH0vZn3XSlotqVrSfElDYqaZpM9LWhXGcpskJdovSTmS/o+kNZJqJS2SNCycdoqkNyTtDv+eErPc85K+F+5nnaQ/SiqRdH8Y+xuSRsbF9M+S1kraKem/JEXCacdJei48LjvDdRTHLLte0r9KWgLUS8oNyz4UTp8haWG43W2Sfhyz7IVhtdauMObxcev9F0lLwn18SFLPVo7TTZLui3k9Mtyn3Jj3cW14DNdJujz+/T3aexO+Fz8Kj8E6SdfHbiMunnuB4cAfw+P/zbB8Vvie7JL0lmKq8xLFGB6PXwEnh+vZ1cr+J9y/cNpnJK2QVCPpKUkjwvIXw1neCtd9aaJ1d1tm5o8u+ADWAx8Knw8DlgHfDV8b8DTQHygAPgBsB2YCOcBV4fL54eu3gJ8AhUBP4LRwPVcDL4XPPwwsAooBAeOBweG03wDfC5+fDewkOAvKB34BvBgTtwF/CtczHNgBzGllH78BLAWOD7c5BSgJ96sGuALIBS4LX5eEyz0PrAaOA/oCy4F3gQ+F898D3BUX09/C9Q4P5/1sOG0McE64L6XAi8BP496HxeF7UJDgvXkVuCJ83huYFT4fR1DFeA6QB3wzjLlHzDpeB4aEca0APt/KcboJuC/m9chwn3LD93QPcHw4bTAwMf79Pdp7A3w+PI7lQD/gmYPbONrnM3w9FKgCzif4QXtO+Lo0lRgTbOdIy84Nj+n48Fh8G3glbn/HZPp/ORsffsbRtT0e/gp7CXgBuCVm2g/MrNrM9gHXAf9jZgvMrNnM7gb2A7OAGQRfTt8ws3ozazCzl2ipESgCTiBoO1thZlsSzHc5cKeZvWlm+4FvEfxiHBkzz3+Y2S4z20jwhT21lf37LPBtM1tpgbfMrAr4CLDKzO41syYz+y3wDvDRmGXvMrM1Zrab4GxsjZk9Y2ZNwO8JkmmsW8PjtRH4KUEywsxWm9nTZrbfzHYAPwY+GLfsz81sU3isEx23MZIGmFmdmb0Wll8K/DlcdyPwQ4Ikf0rMsj83s/fMrBr44xGO09FEgRMlFZjZFjNbdoR5W3tvPgn8zMwqzawG+I8UY/g08ISZPWFmUTN7GlhIkEhSjTFea8t+nuD/YEX4vt8CTD141uFa54mja7vIzIrNbISZfSHui2tTzPMRwNfDKoJdYbIZRpAwhgEbwn+sVpnZc8AvgduA7ZJul9QnwaxDgA0xy9UR/LIcGjPP1pjnewl+iScyDFhztG2ENsRtY1vM830JXsdvM/Z4bQi3gaSBkh6UtFnSHuA+YMARlo13DcHZxTthFdkFifbBzKLheo7lOLXKzOoJktTngS2S/izphCMs0to2h3D4fh5pnxMZAXwi7jN4GsFZa6oxHnKUZUcAP4vZXjXBmevQhCtzh3ji6L5iL6fbBHw/TDIHH73CX+qbgOGJ6qpbrNDs52Y2HZhA8GX4jQSzvUfwDwuApEKC6qXNx7APmwiqm464jdDwY9zGQcPi1vVe+PwWgmM5ycz6EPxyjm+TafXSRTNbZWaXAWXArcDD4TGJP04KYziWfagHesW8HhQXw1Nmdg5BNc47wK+PYRtbCKqpDhrW2owHNxv3ehNwb9xnsNDM/uMoMR71stAjLLsJ+FzcNgvM7JWjrbO788ThIPhH+rykmQoUSvqIpCKCevQtwH+E5T0lnRq/AkknhcvnEXxRNRBUEcT7LfBPkqZKyif44l1gZuuPIe7/Bb4raWwY92RJJcATwDhJn1LQGH0pQTL70zFs46BvSOqnoPH9y8BDYXkRUAfsljSUxMmyVZI+Lak0PKPYFRZHgd8BH5E0OzymXyeoPjyWL7XF
wBmShkvqS1A9eHD7AyXNDZPV/nBfEr1vR/M74MuShiq4OOBfjzL/NmB0zOv7gI9K+nDY0N5T0pmSyo8S4zagXFKPRBs5yrK/Ar4laWI4b19JnzhCjC7kicNhZguBawmqmmoIGgyvDqc1E7QNjAE2ApUEp/7x+hAkoBqCKpYq4L8SbOsZ4N+ARwgS0nHAvGMM/ccEX1h/JWgAvYOgAboKuIDgy7aKoGH5AjPbeYzbAfgDQeP/YuDP4bYAvkPQ0L87LH80xfXOAZZJqgN+Bswzs31mtpLg7OUXBBcTfJTg8uoDqQYethc8BCwJ9yE2gUaArxGc4VQTtM/8f6lug+C9/2u4jX8QJO8mgvuIEvkB8O2wmuhfzGwTQWP1/yFodN9EkIQjR4nxOYILP7ZKSvT+trqsmT1GcJb3YFjN+DZwXsyyNwF3hzF+MpWD0dX5DYDOHYX8RrCUSToP+JWZeUNzF+RnHM65NpNUIOn8sGpwKHAj8Fim43Lp4YnDOdceRFBtV0NQVbUC+PeMRuTSxquqnHPOpcTPOJxzzqWkW3S6NmDAABs5cmSmw3DOuU5l0aJFO82sNL68WySOkSNHsnDhwkyH4ZxznYqk+B4YAK+qcs45lyJPHM4551LiicM551xKukUbRyKNjY1UVlbS0NCQ6VCyUs+ePSkvLycvLy/ToTjnsky3TRyVlZUUFRUxcuRIlHiAuW7LzKiqqqKyspJRo0ZlOhznXJbptomjoaHBk0YrJFFSUsKOHTsyHYpznZLtqYLtG7D9+1D/wVA2HOV07NetNdRD437o1afdt91tEwfgSeMI/Ng4d2xsTxXRP/4/2LY+eC0RmfslGD2lY7ZvUdj4DtEXHoLdO9D4WVAxBxWXtds2vHHcOefa0/YNh5IGAGZE//Ygtq+2Y7a/o5LoYz+FnZXQuB9b8gLR1/6INR1xEM+UpDVxSJojaaWk1ZJuSDD9DElvSmqSdEnctKskrQofV8WUT5e0NFznz9XJfhrfdNNN/PCHP2x1+uOPP87y5cs7MCLnXHuy/QmGlq+rhsaUh1I5tu1XvQfRuGFQVrwKdTXtto20JQ5JOQTjT59HMPraZZImxM22kWDAoAfilu1P0C3zTGAGcKOkfuHk/yYYdGhs+JiTpl3ICE8cznVu6j8Y4n7PauKpUNi3Y7afX9CysFcfyG2/KyTTecYxA1htZmvDUcseJBjh6xAzW29mS2g5VOWHgafNrNrMaoCngTmSBgN9zOw1C7r1vQe4KI370C6+//3vM27cOE477TRWrlwJwK9//WtOOukkpkyZwsc//nH27t3LK6+8wvz58/nGN77B1KlTWbNmTcL5nHNZrGx40KbRtwxyctHkD6KK8zqucbx0OAw+7rAinXkZ6l3cbptI554MJRj+8aBKgjOIY112aPioTFDegqTrgOsAhg8fnuRm29+iRYt48MEHWbx4MU1NTUybNo3p06fzsY99jGuvvRaAb3/729xxxx186Utf4sILL+SCCy7gkkuCmrvi4uKE8znnspNycmH0FCKDRwfVU4V9O/SKKhX1I3LB52DrBqyhDpUMgdL2HYixy15VZWa3A7cDVFRUZGzQkb///e9cfPHF9OrVC4ALL7wQgLfffptvf/vb7Nq1i7q6Oj784Q8nXD7Z+Zxz2UUFRZCg1qhDtl1UAkUlpKsBOJ1VVZuBYTGvy8Oytiy7OXx+LOvMKldffTW//OUvWbp0KTfeeGOrd7AnO59zznWUdCaON4CxkkZJ6gHMA+YnuexTwLmS+oWN4ucCT5nZFmCPpFnh1VRXAn9IR/Dt5YwzzuDxxx9n37591NbW8sc//hGA2tpaBg8eTGNjI/fff/+h+YuKiqitff+yvdbmc84dmR3Yj+2ry3QYXVLaqqrMrEnS9QRJIAe408yWSboZWGhm8yWdRDCgfT/go5K+Y2YTzaxa0ncJkg/AzWZWHT7/AvAbgpPAJ8NH1po2bRqXXnopU6ZMoaysjJNOOgmA7373u8ycOZPS0lJmzpx5KFnMmzePa6+9lp///Oc8/PDDrc7nnEvMolHY/C7RV/8AtTVo6lno+Bmod7+jL+yS0i3GHK+oqLD4gZxWrFjB+PHjMxRR5+DHyHVGtnUd0Qd/cNi9DDr1YiIzL8hgVJ2TpEVmVhFf7neOO+e6FNu+qcUNcPbm01jdrswE1AV54nDOdS15PVqW5RdCB3cy2JV54nDOdSkaNAri2jN0+sdRQe8MRdT1eAp2zrU727UN27oeDjSgsuFBt+KRnA7ZtvoNJHLJ17HNq6B+NyofBwN9XJn25InDOdeurGYr0Ud+DHuqgteKEPnYV2FEfFd16aP+g4M+o1xaeFWVc65d2eY1h5JGUBAl+vKjiXuNdZ2SJ44uYuTIkezcuTPpeT7zmc9QVlbGiSee2BHhuU7G9tZim1YSXbcU253iSJANCW66q9sFzY3tEpvLPE8c3dTVV1/NX/7yl0yH4bKQ1VYTffLXRH//n9hjPyX6wPexbRuSXl6DW7YnaMqZqFef9gzTZZAnjiQt2L6Ob73+OJ/7+wN86/XHWbB9XZvXuX79ek444QSuvvpqxo0bx+WXX84zzzzDqaeeytixY3n99deprq7moosuYvLkycyaNYslS5YAUFVVxbnnnsvEiRP57Gc/S+yNnPfddx8zZsxg6tSpfO5zn6O5ubnFts844wz69+/f5n1wXY+9txo2LHu/YF8t0TeewJqSPGMYOCroVrzfICjojU6eiyackp5gXUZ44kjCgu3ruG/V61TvD8bCqN6/l/tWvd4uyWP16tV8/etf55133uGdd97hgQce4KWXXuKHP/wht9xyCzfeeCMf+MAHWLJkCbfccgtXXnklAN/5znc47bTTWLZsGRdffDEbN24Egru9H3roIV5++WUWL15MTk6O93HlUlO9tWXZe2vgQHJtFMrNQ8dNJTLvW0Su+A6a9VFU5D9SuhK/qioJj69/iwNxd6IeiDbz+Pq3mFnWtsv8Ro0axaRJkwCYOHEis2fPRhKTJk1i/fr1bNiwgUceeQSAs88+m6qqKvbs2cOLL77Io48+CsBHPvIR+vULrlt/9tlnWbRo0aE+sfbt20dZWfsNUu+6Pg0cQXxHRBo7HXqmdh+E3zfRdXniSMLBM41ky1ORn59/6HkkEjn0OhKJ0NTURF5easM9mhlXXXUVP/jBD9ocm+umBh+HZl6AvfFk0HVH+Qloylko4hUULuCfhCT0z++VUnl7Ov300w9VNT3//PMMGDCAPn36cMYZZ/DAA8FQ7U8++SQ1NcFA9LNnz+bhhx9m+/btAFRXV7NhQ/INm86poDea9VEiV9yEPn0jkbnXo/6DMh2WyyKeOJJw0cgp9Ii767VHJIeLRk5J+7ZvuukmFi1axOTJk7nhhhu4++67Abjxxht58cUXmThxIo8++uih4XEnTJjA9773Pc4991wmT57MOeecw5YtW1qs97LLLuPkk09m5cqVlJeXc8cdd7R77NbYgB3Y3+7r7SyscX+nvXdBObmoZAiRsuEoP0PD2Lms5d2qJ2nB9nU8vv4tqvf
vpX9+Ly4aOaXN7RvZ7li7VbcDDbBxOdHXnwAzdNL5aMRElN8zDVFmH2tqhMqVRBf8CfbvQxVz0OjJqGdhpkNznYiZQTSKcjqmq5ZEWutW3ds4kjSzbFSXTxTtZvMqovNvO/TS/vT/0Nx/huPSf4aWFbauI/roTw69tL/8L5x/LTphVgaDcp2JbVuPLXkB27kZTTodjZqMCvtmOqxD0lpVJWmOpJWSVku6IcH0fEkPhdMXSBoZlveQdJekpZLeknRmzDLPh+tcHD78kqEsE132SsuyJS9kIJLMsLVvtSxb9DTW2H2r7VzyrOo9or//Ibb0RdiyBvvrb7ClL5JNtUNpSxyScoDbgPOACcBlkuJ7ObsGqDGzMcBPgFvD8msBzGwScA7wI0mxsV5uZlPDx/Z07YM7NipIUCXTnS7NTHTRRM9C6KDeYV3nZjs2tbhnxt54EmqrW1mi46XzjGMGsNrM1prZAeBBYG7cPHOBu8PnDwOzJYkg0TwHECaGXUCLejaXnTTh5MMHzYnkEJl8RuYC6mAadSLkxbTnSEROOg/5QEIuCQkve47kgLLnWqZ0fpKHAptiXlcCM1ubx8yaJO0GSoC3gAsl/RYYBkwP/74eLneXpGbgEeB7luAcTtJ1wHXAoSuOXAcZNJrIpTdgG1cEjXsjJsCgkZmOqsOobASRS/8V27Acmvaj4RNgkLePuSSVDodefWDvnkNFOuUiVNTvCAt1rGz9CXQnMB5YCGwAXgEO3rp9uZltllREkDiuAO6JX4GZ3Q7cDsFVVR0RtAtIgkGjgpHYuimVDQ8GMHIuRYcGolq9GKveQmTcdCg/PtNhHSad5z6bCc4SDioPyxLOIykX6AtUmVmTmX01bMOYCxQD7wKY2ebwby3wAEGVWLeXSrfqmzZt4qyzzmLChAlMnDiRn/3sZx0UZcex2hps/dtE1y7Bdh/5uDiXbTSgnMisC8g5/1o0ZlrWXcqdzjOON4CxkkYRJIh5wKfi5pkPXAW8ClwCPGdmJqkXwT0m9ZLOAZrMbHmYXIrNbKekPOAC4Jk07kOXlJuby49+9COmTZtGbW0t06dP55xzzmHChI4boS2drHor0T/eBlXvBa8Li4l87KuotDzDkblkWUM9VG3Bmvaj4oGo74BMh+RipO2Mw8yagOuBp4AVwO/MbJmkmyVdGM52B1AiaTXwNeDgJbtlwJuSVgD/SlAdBZAPPCVpCbCYICH9Ol37ECu64lWaf/1Nmn98Dc2//ibRFa+2eZ2Z6lZ98ODBTJs2DYCioiLGjx/P5s3xJ4Odl61bcihpAFC/C3v771l1OaNrndXvIvrsfUQf+gH2yI+J/ja18UBc+qW1md7MnjCzcWZ2nJl9Pyz7dzObHz5vMLNPmNkYM5thZmvD8vVmdryZjTezD5nZhrC83symm9lkM5toZl82s5aDTbSz6IpXsafvgdpwOMzaKuzpe9oleWS6W/X169fzj3/8g5kz469b6LwSfcnYljVBh30u+21ZBytff//13j1EX3kcazyQuZjcYbK1cTyr2EuPQVPch7bpQFA+/uQ2rTuT3arX1dXx8Y9/nJ/+9Kf06dN1RmfTqEnYO68dXnb8DL8ctpNIOFTtljWwfy/k9ej4gFwL/p+UjINnGsmWpyBT3ao3Njby8Y9/nMsvv5yPfexjqQeexTR8PFTMwd58GiwKE04JxpNwnYL6D2kxHgijJqc8HohLn+y5oySbFZWkVt6O0tGtuplxzTXXMH78eL72ta+lfR86mgr7olMvJnLld4hceTOR2VegPul/r1w7GTQSzfzo+ze8lY0gMuN8lOu/c7OFvxNJ0GkXB20csdVVuT3QaRenfds33XQTn/nMZ5g8eTK9evU6rFv1yy67jIkTJ3LKKack7FY9Go2Sl5fHbbfdxogRIw6t8+WXX+bee+9l0qRJTJ06FYBbbrmF888/P+3701GUkwv9B2c6jGNme6pgxyasuQkNGIpS3BerrQ6Wb2pEJUNRSec5FiroDbMuQMefhDUdQH1LMzKaoDU2QGMj6lXU4dvOdt6tepKiK14N2jRqq6CoBJ12MZE2tm9ku2PtVt21jdVsI/qHX0B1OI5Kj55ELvmXpG+otF3bif7hl1AVXimX15PIJV9Hg0enKeKuxSwKle8SfeVx2FONJn8QTTi5W46b7t2qt1Fk/Mltbgh3Lhm26Z33kwbAgQaiC58iMuezSVXXWOW77ycNgMYGoq//mchHPo9yU2sz65a2byT6yI8PXYVnLz8a1DacclHQK4LzNg7nss6ubS3LdmyC5iQvR92doMPonZvBu3VPiu2obHHptv3jWajflZmAslC3ThzdoZruWPmxyRwl6JdIE09FSY5xr6FjW5ZNOCUj7QSdUl5+y7KehRDxCpqDum3i6NmzJ1VVVf4FmYCZUVVVRc+e3WOo16wzZAz64Lyga3ZF0OQz0Qkp3KA5+Dh01qegR7j8pDPQhFPSF28Xo7IRENfFic74hDeSx+i2jeONjY1UVlbS0NCQoaiyW8+ePSkvL0/5PpL2ZHvC+2SK+qEOHovAqt7DtqyFaBMaOArKhndo/baZwZ6qoMqkT8kx3bxou3cGyxeV+KWsKbKabdjmVbB3DxpyHAwchbrhzYfeOB4nLy+PUaO6b7ff2cz21WHLXsZenQ8WRSedB5M+iHp3zJjLtmMT0d//FzTUB69zcol84hswZEyHbB/Crunb2LGfdwx47NRvIOo3MNNhZK1uW1XV1dmu7URX/wNbs7jzdSu+6R3sxd9BY0PQtcurf8DWL+mwzdvaJYeSBgDNTUQX/RXzvq6cA7rxGUdXZjsqiT7yI9i7J+i6oah/0K14yZBMh5aU6LsLW5TZ2y9jE05NPKxme6uraVm2pxqiUR833Dn8jKNLsuUvHzbsJLXV2Ko3MxdQijSgZYJT6bCOSRqAjpvasmzqWX4PhHMhTxxdjFkU27q+Zfn2zjOegcZMC8ZcPii/FzrxtI4LYMgYdP51UFwGvfuhsz6FRk/uuO07l+W8qqqLkSJo/Cxs87uHl3ei3mE1oJzIpd+CHRsBgwHlKffV1Kbt9+iJTpiJjZgYNM736jpdzjvXHtJ6xiFpjqSVklZLuiHB9HxJD4XTF0gaGZb3kHSXpKWS3pJ0Zswy08Py1ZJ+Lu8DoAWNnoymnRvUx+fkopkXoOGda1hY9StD4yrQuJM6NGkcFkNBb08aziWQtjMOSTnAbcA5QCXwhqT5ZrY8ZrZrgBozGyNpHnArcClwLYCZTZJUBjwp6SQziwL/HU5fADwBzAGeTNd+dEbq3Q9O/ziachYI6DOgw9oHnHNdXzq/TWYAq81srZkdAB4E5sbNMxe4O3z+MDA7PIOYADwHYGbbgV1AhaTBQB8ze82COxfvAS5K4z50WsrJDX61F5d50nDOtat0fqMMBTbFvK4MyxLOY2ZNwG6gBHgLuFBSrqRRwHRgWDh/5VHWCYCk6yQtlLRwx44EQ1E655w7Jtn6U/ROgqSwEPgp8AqQ0t1XZna7mVWYWUVpaWn7R+icc91UOq+q2kxwlnBQeViWaJ5KSb
lAX6AqrIb66sGZJL0CvAvUhOs50jq7BNu7B6veGlQz9RvkPZs657JGOhPHG8DYsKppMzAP+FTcPPOBq4BXgUuA58zMJPUi6ICxXtI5QNPBRnVJeyTNImgcvxL4RTqCt4Z6bNd2FMmB4oGoR4KultPEqrcSffJ22LYhuPN7+AQi51yJ+vqZk3Mu89KWOMysSdL1wFNADnCnmS2TdDOw0MzmA3cA90paDVQTJBeAMuApSVGCpHNFzKq/APwGKCC4mqrdr6iymm1E//ob2Pxu8MU98VQip14cXK3UAWzlAtgWc8PexuXY+reDq6Sccy7D0noDoJk9QXDJbGzZv8c8bwA+kWC59UDL0WyCaQuBE9s10MPXjy1/BWJvoFv2MjZ8Aho/K12bfX/7zU3YuqUtyyvfBU8czrkskK2N45nTeABbs7hledyd2OminFw0emrL8uHjO2T7zjl3NJ444uXlJf6SHjS6w0LQCTMgdvjQMdPQiIkdtn3nnDsS76sqjhSBSWcE1UU1W4PCERM79Be/isuIfPSL2K5twVVVxQNRfkGHbd85547EE0cCKhlC5BPfCC6HzcmB/oM7/HJYFRSigo47y3HOuWR54miFehej3sXHvLzt3YPVbAsu5+03EPUsbL/gnHMugzxxpEGL+zBGTSEy+3LUpyTToTnnXJt543ga2PJXDr8PY91b2MblrS/gnHOdiCeOdmaNB7D1Ce7D2LwqA9E451z788TRzpTXA42e0rJ86LgMROOcc+3PE0caaPwsGBxzRdSYaWiE38DnnOsavHE8DdRvEJGLvozVbH3/qqr8XpkOyznn2oUnjjRRQW9UMCbTYbRJ0Ls9+LDuzrlYnjhcC9bYCO+9S3Txc4CITD0bho5FuXmZDs05lwU8cbiW3ltF9JEfH3oZXfMPIpf8C3hHi845vHHcJRBd+kLLsrf/noFInHPZyBOHaymS4EQ0UZlzrltKa+KQNEfSSkmrJd2QYHq+pIfC6QskjQzL8yTdLWmppBWSvhWzzPqwfLGkhemMv7uKTD4DiGkQl4iceFrG4nHOZZe0/YyUlAPcBpwDVAJvSJp/cOzw0DVAjZmNkTQPuBW4lGBUwHwzmxSOP75c0m/DkQEBzjKznemKvdsbfByRT34TW/EqSMF9KR04HolzLruls/5hBrDazNYCSHoQmAvEJo65wE3h84eBXyq49tOAQkm5BGOLHwD2pDFWF0M5uVA+DpX73e7OuZbSWVU1FNgU87oyLEs4j5k1AbuBEoIkUg9sATYCPzSz6nAZA/4qaZGk61rbuKTrJC2UtHDHjh3tsT/OOefI3sbxGUAzMAQYBXxd0sG6ktPMbBpwHvBFSWckWoGZ3W5mFWZWUVpa2iFBO+dcd5DOxLEZGBbzujwsSzhPWC3VF6gCPgX8xcwazWw78DJQAWBmm8O/24HHCJKMc865DpLOxPEGMFbSKEk9gHnA/Lh55gNXhc8vAZ6zoJ+LjcDZAJIKgVnAO5IKJRXFlJ8LvJ3GfXDOORcnbY3jZtYk6XrgKSAHuNPMlkm6GVhoZvOBO4B7Ja0GqgmSCwRXY90laRnBdaF3mdmSsLrqsbDvpFzgATP7S7r2wTnnXEs62JFdV1ZRUWELF/otH845lwpJi8ysIr48WxvHnXPOZSlPHM4551LiicM551xKPHE455xLiScO55xzKTlq4pA0UNIdkp4MX0+QdE36Q3POOZeNkjnj+A3BvRhDwtfvAl9JUzzOOeeyXDKJY4CZ/Q6IwqHOCJvTGpVzzrmslUziqJdUQtArLZJmEfRi65xzrhtKpsuRrxH0KXWcpJeBUoJ+pZxzznVDR00cZvampA8CxxP0G7XSzBrTHplzzrmsdNTEIenKuKJpkjCze9IUk3POuSyWTFXVSTHPewKzgTcBTxzOOdcNJVNV9aXY15KKgQfTFZBzzrnsdix3jtcTDOfqnHOuG0qmjeOPhJfiEiSaCcDv0hmUc8657JVMG8cPY543ARvMrDJN8TjnnMtyR62qMrMXYh4vp5I0JM2RtFLSakk3JJieL+mhcPoCSSPD8jxJd0taKmmFpG8lu07nnHPp1WrikFQraU+CR62kPUdbsaQcgrHDzyOo3rpM0oS42a4BasxsDPAT4Naw/BNAvplNAqYDn5M0Msl1OuecS6NWE4eZFZlZnwSPIjPrk8S6ZwCrzWytmR0guBJrbtw8c4G7w+cPA7MliaBNpVBSLlAAHAD2JLlO55xzaZT0VVWSyiQNP/hIYpGhwKaY15VhWcJ5ws4TdwMlBEmkHtgCbAR+aGbVSa7zYLzXSVooaeGOHTuSCNc551wykhmP40JJq4B1wAvAeuDJNMc1g6AH3iEEl/5+XdLoVFZgZrebWYWZVZSWlqYjRuec65aSOeP4LjALeNfMRhHcOf5aEsttBobFvC4PyxLOE1ZL9QWqgE8BfzGzRjPbDrwMVCS5Tuecc2mUTOJoNLMqICIpYmZ/I/gSP5o3gLGSRknqAcwj6GU31nzgqvD5JcBzZmYE1VNnA0gqJEhc7yS5Tuecc2mUzH0cuyT1Bv4O3C9pO0H7wxGZWZOk6wlGD8wB7jSzZZJuBhaa2XzgDuBeSauBaoJEAMGVU3dJWkbQI+9dZrYEINE6U9hf55xzbaTgB/4RZpD+L8HwsVuBTxNUJ90fnoV0ChUVFbZw4cJMh+Gcc52KpEVm1qKGKZmqqlzgr8DzQBHwUGdKGs4559pXMneOf8fMJgJfBAYDL0h6Ju2ROeecy0qp9I67naC6qgooS084zjnnsl0y93F8QdLzwLMEN+dda2aT0x2Yc8657JTMVVXDgK+Y2eI0x+Kcc64TSGYEwG8dbR7nnHPdx7GMAOicc64b88ThnHMuJZ44nHPOpcQTh3POuZR44nDOOZcSTxzOOedS4onDOedcSjxxOOecS4knDueccynxxOGccy4laU0ckuZIWilptaQbEkzPl/RQOH2BpJFh+eWSFsc8opKmhtOeD9d5cJr31Ouccx0obYlDUg7BELDnAROAyyRNiJvtGqDGzMYAPwFuBTCz+81sqplNBa4A1sV1snj5welmtj1d++Ccc66ldJ5xzABWm9laMzsAPAjMjZtnLnB3+PxhYLYkxc1zWbisc865LJDOxDEU2BTzujIsSziPmTUBuwnG/Ih1KfDbuLK7wmqqf0uQaACQdJ2khZIW7tix41j3wTnnXJysbhyXNBPYa2ZvxxRfbmaTgNPDxxWJljWz282swswqSktLOyBa55zrHtKZODYTDAJ1UHlYlnAeSblAX4KhaQ+aR9zZhpltDv/WAg8QVIk555zrIOlMHG8AYyWNktSDIAnMj5tnPnBV+PwS4DkzMwBJEeCTxLRvSMqVNCB8ngdcALyNc865DpPM0LHHxMyaJF0PPAXkAHea2TJJNwMLzWw+cAdwr6TVQDVBcjnoDGCTma2NKcsHngqTRg7wDPDrdO2Dc865lhT+wO/SKioqbOHChZkOwznnOhVJi8ysIr48qxvHnXPOZR9PHM4551LiicM551xK0tY47pxz7tjs3r+XjfU17G1qZ
FCvPpQXFpOj7Pmd74nDOeeyyK79e7nr3Vd5Z9c2ACKIL078ICf2H5LhyN6XPSnMOeccG+uqDyUNgCjGg2veoLaxIYNRHc4Th3POZZH6pgMtynY27GV/c1MGoknME4dzzmWRQb36EN9za0XpcPrm9cxIPIl44nDOuXZmZuzYV0tlXQ37GlueQRzJsMJ+fG786RT3KEBAxYBhfHTEJPJyUmuS3rmvjsq6GvYmOINpK28cd865drS/qYkFO9bx8Lp/sL+5iTFFA/j0uJkM7tU3qeVzIzl8YMAwRvcZwIHmJop7FKSUNBqbm3hj50YeWrOQhuYmRvYu4cpxMxlaWHyMe9SSn3E451w72lhfzf2r3zjUJrG6did/WP8WjSm2UfTtUUBpQVHKZxqb6ndx97uv0RBub31dFY+s/Qf7mxtTWs+R+BmHcy7r1OyvZ1PdLg5EmxjUqw9DexXTyphtWWf7vtoWZW9Vb2ZPYwMlOb0zsv1lu7aw+0ADZQV57bINTxzOuaxS1VDH7SteZn1dMDRPriJ8ZdLZjO1bluHIktMnQSP20F7FFOT26Jjt92i5/YE9iyjIaZ+kAV5V5ZzLMutqqw4lDYAmi/LYusU0NCVf1bK38QBr9+xgec0WqhrqUo5hX1Ow/LKaLezcl9ryw4v6M61k+KHXPSI5XHrcdHp1UOIYVtiPWWWjDr3OVYRPjTmJogQJ5Vj5GYdzLqvsPrCvRdm2hloampvomXv0X817Duzj4XWLWbB9HQC98/L554lnMqKoJKnt7znQwGPrFvPK9mAooMLcHnzpxDMZVTQgqeX79ijg8rEnceaQsexrPsDAgj5JN4y3h6IePfnk6GmcOnA0e5sPUNazqN2372cczrmsUl7Yr0XZrLKR9OmRn9Ty62urDyUNgLrG/Ty2/q2kG4c31lUdShoQ3JD3aIpnPL3z8jm+eCBTS4Z1aNI4qDAvn3Hh9ocUtn/7UFoTh6Q5klZKWi3phgTT8yU9FE5fIGlkWH65pMUxj6ikqeG06ZKWhsv8XJ2lxcw5l5SRvUu4atwsCnN7IMSsslGcOXgckSQ7+aveX9+ibF3tTvYm+cVfs7/lGc+62qq03A/RWaWtqkpSDnAbcA5QCbwhab6ZLY+Z7RqgxszGSJoH3Apcamb3A/eH65kEPG5mi8Nl/hu4FlgAPAHMAZ5M13445zpWfm4upwwczfjigTRGo/TP70VuJCfp5csKilqUTew3hN5JtjGUFrS88mlC8SB6Z9Gd25mWzjOOGcBqM1trZgeAB4G5cfPMBe4Onz8MzE5wBnFZuCySBgN9zOw1C8a8vQe4KE3xO+cyqF9+IWUFRSklDYARvftzwfBJRMKvkvLCYi4YfmLS90MML+zP3BFTDi0/pFdf5o6cQo+c1OLoytLZOD4U2BTzuhKY2do8ZtYkaTdQAuyMmedS3k84Q8P1xK5zaKKNS7oOuA5g+PDhiWZxLmtVNdRTWV9DUzTK0MK+DMpAPXlnVZiXz3nDJjB9wHAORJsYkN+b3km2jwD0yuvBh8vHM7WknP3RJkpTXL47yOqrqiTNBPaa2dupLmtmtwO3A1RUVFh7x+ZcumzfW8tty19g6749APTMyeWrk2YzMsmrglzQbceQwmNPtjmRSJuW7+rSWVW1GRgW87o8LEs4j6RcoC9QFTN9HvDbuPnLj7JO5zq1lbu3HkoaAA3NTTxduZymaHMGo0rN3sYDrNmzgxU1W6luaNlY7Tq3dJ5xvAGMlTSK4Mt9HvCpuHnmA1cBrwKXAM+FbRdIigCfBE4/OLOZbZG0R9IsgsbxK4FfpHEfnOtw2xJ0GbGpfjcHmptTru/PhF379/H7tW+ycOcGAIp7FHD9xDMZ1rvlZbauc0rbGYeZNQHXA08BK4DfmdkySTdLujCc7Q6gRNJq4GtA7CW7ZwCbzGwth/sC8L/AamANfkWV62KOLx7YouzkslH0yuuYO4/bal3tzkNJA2DXgX08sentlDv5c9krrW0cZvYEwSWzsWX/HvO8AfhEK8s+D8xKUL4QOLFdA3Uui4wuKuWTo6fxhw1LaGxu5rRBxzGjbGRK66is28XK3VvZ19TICcUDGVFUQl4Hna1si6lmO2j17h3sa2pMuadXl538XXQuTWr21xM1o19+r6RvXgMozOvB2UOOZ0pJOVGzlO9j2Fxfw4+WPnPohrc/bVzKlyaeycT+Q1KKf9f+vTRZlH49epETST7+Ib2KW5Sd2H9IpzljckfnicO5dra38QCv71jPHza8xYHmZs4ecjxnDz2efvm9kl6HJAb0PLYuuFfu2n7YXdIG/Hnj24zpW0Z+Er/49zc1smjnRh5et5iG5kbOGDSGc8rHU9KzMKntj+pTwjlDx/PM5ncwjGGF/TinfHynaJ9xyfHE4Vw7W1O7g9+uWXjo9V83r6A4v4DZQ0/okO03JOiTqb7pAM0WTWr5dXVV3L1qwaHXf9vyLr3z8rlgxKSkli/K68lFIydzctlIDkSbKS0oonee3wfRlXgnh861s2XVW1qUvbx1bUqd5LXFuL4Die9+4Zzy8Ul36712z84WZS9vW0PtgYakY8iN5DC0dz9G9RngSaML8jMO59pZWYK+joYU9umwxumRRf355xPP4omNb1PXdIAPDT2BKSUJO1hIqDhBlVpZQZ+kqrlc9+CfBOfa2YR+g+mfX3iol9b8nFw+NHR8Sg3MbZEbyWFCv8GM6VNKczRKQYqN0mP7lDKooM+hmxDzIjl8dPgkenjicCGF99t1aRUVFbZw4cKjz+hcO9mxr5bK+l1hX1PFna77iqqGOjbV1XAg2syQwr4Jx8hwXZ+kRWZWEV/uPyGcS4PSgiJKE3Tv3VmU9OxNyTFe1eW6Pm8cd845lxI/43AugR37atlUX0Nz1Bha2JchhcWZDsm5rOGJw7k4W/bu5mdLn6PmQDCEqHdr7tzhvKrKuThLq987lDQg6Nb8b++9SzTJG+ic6+r8jMN1SfWN+9m6bw9RMwYWFNGnR0HSy25P0Enf5vAKqR45/lvLOU8crsvZ2VDHfateZ8WurQAML+zHNSecyqBefZJaflL/ofx965rDyk4fdJzfx+BcyH8+uS5nec2WQ0kDYGN9DQu2r0t6+TF9ypg3ejoFOXnkRXI4f9hEppaUH31B57oJ/wnlupxVu7e3KFtWs4WPDD8xqR5aC/N6cNbQ45k6oDzsFr2QiOJ7f3Ku+0rrGYekOZJWSlot6YYE0/MlPRROXyBpZMy0yZJelbRM0lJJPcPy58N1Lg4fZencB5cZW/buZsG2dby6bS2V9TUpLXtC8aAWZZP7D025W+9++YWU9OztScO5OGk745CUA9wGnANUAm9Imm9my2NmuwaoMbMxkuYBtwKXSsoF7gOuMLO3JJUAsV2LXh6OBOiy1PZ9e6is34WZUV7Yj4FJti8AbKqr4ccxAxHlR3L56uTZjEryctjxxYOYNmAYb+7cBMCYPqUpj6DnnGtdOquqZgCrD44ZLulBYC4QmzjmAjeFzx8GfilJwLnAEjN7C8DMqtIYZ5e0s6GO9+p3I4mhvfrSP8lB
eNrD5rpd/OTt56htDLrhLszN56uTzmZY7+T6O1q0c+NhAxHtjzbx0pZVSSeO/j0LuXLsLOaUTyBqRllBEYXetbdz7SadiWMosCnmdSUws7V5zKxJ0m6gBBgHmKSngFLgQTP7z5jl7pLUDDwCfM8S9NQo6TrgOoDhw4e3zx51Eu/V7+Lnb//t0L0Ipfm9+eKJH2Rwr47paG/hzg2HkgZAfdN+Fmxfl3Ti2LmvrkXZtn11RC2a9BCsBbl5jPAb9pxLi2y9qioXOA24PPx7saTZ4bTLzWwScHr4uCLRCszsdjOrMLOK0tLSjog5a7y2fd1hN7Dt2F/H4qrKDtt+Zf2uFmWb6pJvpzipbESLstMHH5fSuN3OufRJ53/iZmBYzOvysCzhPGG7Rl+giuDs5EUz22lme4EngGkAZrY5/FsLPEBQJeZCUTNWJxjBbX1tx9X2nVTa8ot/1sBRSS8/tk8ZV4yZQXGPAoryenLp6OlM7DekPUN0zrVBOquq3gDGShpFkCDmAZ+Km2c+cBXwKnAJ8JyZHayi+qakXsAB4IPAT8LkUmxmOyXlARcAz6RxHzqdiMTM0pGs2bPjsPIPDBjWyhItRc1YX1vF2zXvIeDEfkMYUVSS9NVF44sHcdGIKTy5aRlRjDnlE1L64u+V14PTBo9hcv+hmKBvCnd9O+fSL22JI2yzuB54CsgB7jSzZZJuBhaa2XzgDuBeSauBaoLkgpnVSPoxQfIx4Akz+7OkQuCpMGnkECSNX6drHzqrySVD2bx3F3/fshoEs4ecwPi+LS9Rbc3aPTv50dJniIZNR09uWs7XJ8/muD7JVfkV9ejJnGETwiuZjP75hegYLmntk+8Jw7ls5CMAdlFN0WZ2NtQjYEDP3ikNW/qbla/x6va1h5WdPmgMnx7rtYLOdSc+AmA3kxvJSbpvpnj7mva3KKtPUOac6578MhXXwmmDx7YsG3hcBiJxzmUjP+NwLYzrU8YXJpzBU5uWg2BO+QTG9PGeXZxzAU8croX83FymlJQzvngQIHrkpNbHk3Oua/PE4Vrl40845xLxNg7nnHMp8cThnHMuJV4XkaXeq9/FpvoaIohhvfsf86W1zjnX3jxxZKENtVX8aOmz7G9uAqB3Xj5fO3E2Q3sXZzYw55zDE0erqhrqeG/vbnIUYWivvvTN75XyOg40NxMRKY08Z2a8uGXVoaQBUNe4n8XVlZ44nHNZwRNHApV1Nfzs7b+xJxxTYnhhP64bfxqlBUVJLb+v6QDLa7by9OZ3yI/kcG75BMYVl5GXRAKJmrFl754W5dv2tSxzzrlM8MbxOFGL8vyWVYeSBsDG+hpW7Nqa9Dre2bWV2995iXW1O3ln9zZ+sexvrKtt2dV5IjmRCKcNHtOifPqA7jUYlXMue3niiNMYbWZtgvEskh2IqCnazLObVx5WZsCbOzYlXiCBE/sN4eKRU+iZk0dhbj6XHVfhd24757KGV1XFyc/Jo6J0OJs37Dqs/ITi5LolF6IgN69Fec8EZa3p06Mnc4ZNZGbZSIQoPob2FeecSxc/40hgRtlIKsKBjyISHy4fz9i+yf3iz4lE+NDQ8Yj3x5/Ii+QwtaQ85Tj65Rd60nDOZR0fj6MVB5qb2NFQR0SirGdRSuNZNEejrKutYkl1JT0iOZzYfygjevc/psGMnHMuUzIyHoekOcDPCEbr+18z+4+46fnAPcB0grHGLzWz9eG0ycD/AH2AKHCSmTVImg78BiggGIv8y5aG7NcjJ5ehhcXHtGxOJMKYvqWM6ZvciHnOOdeZpK2qSlIOcBtwHjABuEzShLjZrgFqzGwM8BPg1nDZXOA+4PNmNhE4E2gMl/lv4FpgbPiYk659cM4511I62zhmAKvNbK2ZHQAeBObGzTMXuDt8/jAwW0F9zrnAEjN7C8DMqsysWdJgoI+ZvRaeZdwDXJTGfXDOORcnnYljKBB7DWplWJZwHjNrAnYDJcA4wCQ9JelNSd+Mmb/yKOsEQNJ1khZKWrhjx44274xzzrlAtl6OmwucBpwE7AWelbSIILEkxcxuB26HoHE8HUE651x3lM4zjs3AsJjX5WFZwnnCdo2+BI3klcCLZrbTzPYSNIJPC+ePva410Tqdc86lUToTxxvAWEmjJPUA5gHz4+aZD1wVPr8EeC5su3gKmCSpV5hQPggsN7MtwB5Js8K2kCuBP6RxH5xzzsVJ630cks4HfkpwOe6dZvZ9STcDC81svqSewL3AB4BqYJ6ZrQ2X/TTwLYIeO54ws2+G5RW8fznuk8CXjnY5rqQdwIZj3I0BQHIdTWWGx9c2Hl/beHxtk+3xjTCzFvcVdIsbANtC0sJEN8BkC4+vbTy+tvH42ibb42uNdzninHMuJZ44nHPOpcQTx9HdnukAjsLjaxuPr208vrbJ9vgS8jYO55xzKfEzDueccynxxOGccy4lnjhCkuZIWilptaQbEkzPl/RQOH2BpJEdGNswSX+TtFzSMklfTjDPmZJ2S1ocPv69o+ILt79e0tJw2y0GP1Hg5+HxWyJpWgfGdnzMcVksaY+kr8TN06HHT9KdkrZLejumrL+kpyWtCv/2a2XZq8J5Vkm6KtE8aYrvvyS9E75/j0kqbmXZI34W0hjfTZI2x7yH57ey7BH/19MY30Mxsa2XtLiVZdN+/NrMzLr9g+AGxTXAaKAH8BYwIW6eLwC/Cp/PAx7qwPgGA9PC50XAuwniOxP4UwaP4XpgwBGmn09ww6aAWcCCDL7XWwlubMrY8QPOIOhG5+2Ysv8Ebgif3wDcmmC5/sDa8G+/8Hm/DorvXCA3fH5roviS+SykMb6bgH9J4v0/4v96uuKLm/4j4N8zdfza+vAzjkBbuoBPOzPbYmZvhs9rgRW00itwFpsL3GOB14DisJv8jjYbWGNmx9qTQLswsxcJekuIFfsZu5vEQwZ8GHjazKrNrAZ4mjSMSZMoPjP7qwW9WAO8xuH9xnWoVo5fMpL5X2+zI8UXfm98Evhte2+3o3jiCLSlC/gOFVaRfQBYkGDyyZLekvSkpIkdGxkG/FXSIknXJZiezDHuCPNo/R82k8cPYKAF/bFBcFY0MME82XIcP0NwBpnI0T4L6XR9WJV2ZytVfdlw/E4HtpnZqlamZ/L4JcUTRyciqTfwCPAVM9sTN/lNguqXKcAvgMc7OLzTzGwawYiPX5R0Rgdv/6gUdLZ5IfD7BJMzffwOY0GdRVZeKy/p/wJNwP2tzJKpz8J/A8cBU4EtBNVB2egyjny2kfX/S544Am3pAr5DSMojSBr3m9mj8dPNbI+Z1YXPnwDyJA3oqPjMbHP4dzvwGEGVQKxkjnG6nQe8aWbb4idk+viFth2svgv/bk8wT0aPo6SrgQuAy8Pk1kISn4W0MLNtZtZsZlHg161sN9PHLxf4GPBQa/Nk6vilwhNHoC1dwKddWCd6B7DCzH7cyjyDDra5SJpB8N52SGKTVCip6OBzgkbUt+Nmmw9cGV5dNQvYHVMt01Fa/aWXyeMXI/YzdhWJhwx4CjhXUr+wKubcsCztJM0BvglcaME4OYnmSeazkK74Ytv
MLm5lu8n8r6fTh4B3zKwy0cRMHr+UZLp1PlseBFf9vEtwxcX/DctuJvgnAehJUMWxGngdGN2BsZ1GUG2xBFgcPs4HPg98PpznemAZwVUirwGndGB8o8PtvhXGcPD4xcYn4Lbw+C4FKjr4/S0kSAR9Y8oydvwIEtgWoJGgnv0agjazZ4FVwDNA/3DeCuB/Y5b9TPg5XA38UwfGt5qgfeDgZ/DgVYZDCIY+aPWz0EHx3Rt+tpYQJIPB8fGFr1v8r3dEfGH5bw5+5mLm7fDj19aHdzninHMuJV5V5ZxzLiWeOJxzzqXEE4dzzrmUeOJwzjmXEk8czjnnUuKJw7ksFvba+6dMx+FcLE8czjnnUuKJw7l2IOnTkl4Px1D4H0k5kuok/UTBGCrPSioN550q6bWYcS36heVjJD0TdrT4pqTjwtX3lvRwOBbG/R3VK7NzrfHE4VwbSRoPXAqcamZTgWbgcoK71Rea2UTgBeDGcJF7gH81s8kEdzofLL8fuM2CjhZPIbjzGILekL8CTCC4s/jUNO+Sc0eUm+kAnOsCZgPTgTfCk4ECgg4Ko7zfmd19wKOS+gLFZvZCWH438Puwf6KhZvYYgJk1AITre93Cvo3CUeNGAi+lfa+ca4UnDufaTsDdZvatwwqlf4ub71j799kf87wZ/791GeZVVc613bPAJZLK4NDY4SMI/r8uCef5FPCSme0GaiSdHpZfAbxgwciOlZIuCteRL6lXR+6Ec8nyXy7OtZGZLZf0bYJR2yIEPaJ+EagHZoTTthO0g0DQZfqvwsSwFvinsPwK4H8k3Ryu4xMduBvOJc17x3UuTSTVmVnvTMfhXHvzqirnnHMp8TMO55xzKfEzDueccynxxOGccy4lnjicc86lxBOHc865lHjicM45l5L/Hy3RXgDUC4hgAAAAAElFTkSuQmCC", + "text/plain": [ + "
" + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEWCAYAAABxMXBSAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAA5+ElEQVR4nO3deXyU9bn//9d7khCWhBCSsG+RRfZdQMUVsdbWXSvWVj166ulp7anH2nPs9/RnqbWe0331nFNbrUv1aGvV0latu1ZrFVBAFlFWE2RNAiRAIMlcvz/umzgkE5gJmcyQXE8eeTDzmXu5ZjKZa+7P576vj8wM55xzLlGRdAfgnHPu2OKJwznnXFI8cTjnnEuKJw7nnHNJ8cThnHMuKZ44nHPOJcUTh2tXkl6S9I/h7WskvZrumNqSpP8n6VfpjiMRkmokHZfuONyxxxNHJyZpg6R94QfIFkn3SspLd1zHMjO7w8z+Md1xJMLM8sxsXVtvV9J8Sb9po22ZpBFtsa0m271X0u1tvd3OwhOHO8/M8oDJwBTga+kN59glKTvdMTjXHjxxOADMbAvwF4IEAoCkWZL+JmmnpKWSTo95rLekX0v6UFKVpCfC9kJJf5K0PWz/k6RBrYlJ0uyY/ZdJuiZsL5B0f7iPjZK+LikSPnaNpNck/Shcb52kk8L2MknbJF0ds497Jf2vpGclVUt6WdLQmMd/Eq63W9JiSafEPDZf0qOSfiNpN3BN7LdtSV3DxyrCWBZK6hs+NkDSAkmVktZI+lyT7f42fI7VklZImt7CazQs/FaeHdMW2x04InxOuyTtkPRIzHKN3+bD1+FOSX8O9/mGpOExy54taXW4nf8Ot9nsyErSOcD/Ay4Pj2SXxvzO7pa0WdImSbdLyjpcjJJeCTe7NNzW5XH2d7jnNzr8vVaGsX8qbL8euBL4t3C7f4z32rqWeeJwAIQf7h8H1oT3BwJ/Bm4HegM3A7+XVBKu8gDQHRgH9AF+FLZHgF8DQ4EhwD7g562IZyjwFPAzoIQgoS0JH/4ZUAAcB5wGXAX8Q8zqM4FlQBHwEPAwcAIwAvgM8HMd2iV3JfAtoDjcx4Mxjy0M99073NbvJHWNefwC4FGgV5P1AK4O4xwcxvJ5gteDMKZyYABwKXCHpDNj1j0/XKYXsIBWvIahbwHPAIXAIILXriXzgG+Gy64Bvg0gqZjgOX4tfB6rgZPibcDMngbuAB4Ju8ImhQ/dC9QT/A6mAGcDBxNP3BjN7NTw8UnhthqTwpGen6QewLMEv7M+4XP7b0ljzewugt/Vd8PtnneY18TF4YnDPSGpGigDtgHfCNs/AzxpZk+aWdTMngUWAedK6k+QZD5vZlVmVmdmLwOYWYWZ/d7M9ppZNcGHz2mtiOvTwHNm9n/h9ivMbEn4LXUe8DUzqzazDcAPgM/GrLvezH5tZg3AIwQf3LeZ2X4zewY4QPABdtCfzewVM9sP/AdwoqTB4fP5TbjvejP7AZALHB+z7utm9kT4Gu3jUHUEH7QjzKzBzBab2e5w2ycD/25mtWa2BPgVQQI86NXwtW8gSNKTaJ06giQ+INzX4U5GeNzM3jSzeoIP1slh+7nACjN7LHzsp8CWRAMIj7LOBW40sz1mto3gi8a8VsSY6PP7JLAhfB/Um9nbwO+By5LYtmuBJw53oZnlA6cDowm+dUPwx3hZ2MWyU9JOYDbQn+CDuNLMqppuTFJ3Sb9Q0IW0G3gF6HWwWyIJg4G1cdqLgRxgY0zbRmBgzP2tMbf3AZhZ07bYI46ygzfMrAaoJDgSQNLNklaFXSE7CY4giuOtG8cDBN1/Dyvo0vuupJxw25VhYm3pOcR+MO8Fuqp1Yyj/Bgh4M+zyuvYwyzbd58HXaACHvkZGcLSUqKEEv7PNMe+lXxAcCSQbY1MtrTsUmNnk/Xsl0C+JbbsW+GCeA8DMXpZ0L/B94EKCD4oHzOxzTZcNjzh6S+plZjubPPwVgm/kM81si6TJwNsEf9zJKANmxGnfwUffMleGbUOATUluP9bggzfCLqzewIcKxjP+DZhD8I07KqmKQ59Li+WlzayOoOvnm5KGAU8SdPM8Q/D65cckj9Y+hz3h/92B3eHtxg/HcOzqc+Fzmw08J+kVM1uTxD42E3QDEW5HsffjaPqalAH7geLwiOXQhY8ixpbWDff5spnNTTBGlwQ/4nCxfgzMlTQJ+A1wnqSPScpSMNB7uqRBZraZYPzhvxUMhudIOtgfnU/wjX6npN581PWVrAeBsyR9SlK2pCJJk8Oum98C35aUH46F3BTG21rnKhiI70LQZ/53MysLn0s9sB3IlnQr0DPRjUo6Q9KE8GhrN0HCi4bb/hvwn+HrOhG4rjXPwcy2EyScz4S/p2uB2EHty/TRyQlVBB+Y0SR382dggqQLw6OeL3L4b+5bgWEKT1gI3y/PAD+Q1FNSRNJwSaclEONWgrGsuA6z7p+AUZI+G74/cySdIGlMItt1h+eJwzUKP4TuB24NP9wuIDhDZjvBN7iv8tF75rMEH4TvEoyN3Bi2/xjoRnBk8Hfg6VbG8gFBv/hXCLqOlvBRP/+XCL5prwNeJRgAvac1+wk9RJDgKoFpBOM7EHQzPQ28R9CVVMvhu6aa6kcwqLwbWAW8TNB9BXAFMAz4EHgc+IaZPdfK+D9H8LupIDhZ4W8xj50AvCGphmCQ/cvJXrthZjsIxga+G+5jLMF41/4WVvld+H+FpLfC21cBXQiOEqsIXpf+CcQ4H7gv7G76VJx9xV03PJI7m2Ac5UOCbrjvEIxRAdwNjA23+0Sir4ULyCdycp1Z2D1XbmZfT3csx4rwSKIcuNLMXkx3PK79+RGHc+6Iwi7LXpJyCY5CRXBE6TohTxzOuUScSHCW2w7gPIKz8Zqefuw6Ce+qcs45lxQ/4nDOOZeUTnEdR3FxsQ0bNizdYTjn3DFl8eLFO8yspGl7p0gcw4YNY9GiRekOwznnjimSNsZr964q55xzSfHE4ZxzLimeOJxzziUlpYlD0jkKJlBZI+mWOI/nSnokfPyNsBDcwclp9klaEv78b8w60yS9E67z07DgmnPOuXaSssQRFna7k2DehrHAFZLGNlnsOqDKzEYQ1Of/Tsxja81scvjz+Zj2/yGozTMy/DknVc/BOedcc6k84pgBrAkLjh0gmM3sgibLXADcF95+FJhzuCOIsJx3TzP7ezgnwP0EJcCdc86FrLoKW/s20RWvYh+uxerr2nT7qTwddyCHVhItJ5jSM+4yZlYvaRfBjGkApZLeJqgs+nUz+2u4fOwEMuUcOvlNIwXzCl8PMGTIkKN7Js45d4yw6iqiT/4CNr0f3EfovH9GI6e12T4ydXB8MzDEzKYQzLXwkKSE50EAMLO7zGy6mU0vKWl2/YpzznVM28sak0bAsBcfwvbsarNdpDJxbCJm
ZjWCGcOaznDWuEw4QUwBUBHODV0BYGaLCYqrjQqXj515LN42nXMu7ay+Htvf/nUgbf/e5o17dkHdgTbbRyoTx0JgpKTScGa1eQQTrcRaAFwd3r4UeMHMTFJJOLiOpOMIBsHXhTOJ7ZY0KxwLuQr4Qwqfg3POJc0+XEv0yf8l+vB/En3rOaymqt32raL+oCYf7cfPhLxebbaPlI1xhGMWNxDMopYF3GNmKyTdBiwyswUEs3A9IGkNwexr88LVTwVuk1RHMA3k582sMnzsC8C9BLPMPRX+OOdcRrDtZUQf/R6EA9L20v9B7R448Xza5eqB4sFELvoy0Rcfgl07YMwsIjM+gbJz2mwXnaKs+vTp081rVTnn2kN05evY0786tDEnl8g1t6P83u0Wh+2rgbr90KMAZbXuGEHSYjOb3rS9UxQ5dM51LrZ1I/bum9ju7UTGnAiDRqGuPdpn5/E+pHNyIZKV8CZsexn23mKs8kMix8+AQaNR97ykwlC3POiW3DqJ8sThnOtQbEc50d99Dw4EA9PR9xejuVejCae2y/7VZwiWVwgx4xo65VLUoyCh9a1qC9FHvw/7aoAw/tPmoWlzUxJva3jicM51KLZ1Y2PSaGx7fQE2fDLqntRZ/a2iwr5ELrkJ+2AV7N6Bho6D/iMSXt+2lTUmjca2v/8BGzWtXbu6DscTh3Oug4k3bmvxm1NERQNQ0YDWrRxv3DnavvEfiScO51ybMzPYtR0a6qFnEcrJTW79PbthRzlWvx8V9kO9+ye8rvoMxXJyg4Hhg20zP4l6pP5ooy2oZBCW2x1irsfQjHMhvzCNUR3KE4dzrk3Z/n3Y8lex1x6D+gMwfAqRUy9DhX0TW7+6kujT90DZquB+Tlcil9yEBgxPaH2VDCZy2VexFa9hO7eh8aegIWNa/Xzam4oGELn05iD+HeVo3MmodEL7nMqbIE8crkOy6iqo2IRZFPUegAqK0x1S57FlHfbywx/dX/s2VlAMp34KRRK45njz+sakAUBdLdHXHidywQ2oS9eEQlC/UtSvNMnAM4f6DkV9h2JmGZUwDvLE4Tocq9pKdMGdUBFUo7G8QiIX34iKBx1hTdcWbGvzaapt9ZvohI9DAmcWWXVF88btZXCgFhJMHB1FJiYNyNwih861mq1f1pg0AKipCrpOOsHFrhmhZ5yju+JBCX/ox0vwGjUtZdckuOR54nAdjm3Z0Lxt0/sQbWj/YDohDRgOA0Z+1JDTlchJFyY+QN6vFJ0+D7K7BPdLJ6CpZ7f66mfX9vw34TocHTcRe/fvh7aNnukfPO1EPYuInPfPsD08K6p3/+TOisrtBlPOQsdNCuo99SxKeGzDtQ//S3IdjgaPhqlnY28/F5wTP+ZENHJqusPqVNSjIKiR1Nr1JejVp01jcm3HE4frcNSjAE65JCgxYVHoVYIOdns4546aJw7XISkrG4oS7x5xziXOB8edc84lxROHc865pHhXlUsZCyt8ys+/T1pQ62kHWAPkF6Ns/1N1mcPfja7N2f692Nol2OsLwAzNOg+NmNJ+E+lkANtXA5VbsGh9UKQvifmerXYP9s5fsdf/AA11MPZkIrPOa/eyKVa7B6IN7VKKPNNYdSVs3Ygd2BdUuS0Zkli5lE7CE4dre2Wrsafvbrxrz/waunRFo5rNQNkh2e4Kos89ABveCe4X9iNy/hcTL7O9aQ321999dH/Fq1ivEjTzkymItjmrr4MPVhJ99TGo3YOmnoXGzEI9erXL/tPNdlcQ/eP/wNb1wX1FiFz0ZRg2Ps2RZQ5Poa7NRVe93qzN3vlrGiJJDyt7tzFpAFC1BVv2MmbRxNYvX928beXrWEyZ7ZTaup7oEz+DHeVBuZZXfoe9+2b77DsTbN3QmDQAsCjRlx/BamtaXKWz8cTh2ly8WcpUUJSGSNIkXsmTD1bCgf3Nl40nXvnxksEfleBIMdu0hqazBtmS5xvHrDo6q42ToHdXQN2B9g8mQ3nicG1OY2ZBTkyJiOwuaPwp6QuovQ1oPk2ohk9OvMjf4DFQNPCjhtxuRKaf034lU+KNRXUvgE5SsiVel6LGnpRQZd/OonO8E1y7Ut9hRObdgm1eBxjqdxzqMyTdYbUbDR6FjZsNK14NGgaMRONmJ1wiW4V9iFx8Y1DrqaE+mIa0d7/UBdx0/wNHYT0KYM+usEFBkcLOUi+qzxB03hewF/8P9u6GsSehaWejSFa6I8sY6gylpqdPn26LFi1KdxiuE7ED+2Hn1uCDv7DvMXdGmVVuxjavhQO1wYRIfYd1ug9O27MrKLKY16vTFsiUtNjMmp3V0jlfDedSTF1yg2+u6Q6klZKtaNsRybumWuRjHM4555KS0sQh6RxJqyWtkXRLnMdzJT0SPv6GpGFNHh8iqUbSzTFtGyS9I2mJJO9/cq4FVrMT212BRRM7Ddi5RKWsq0pSFnAnMBcoBxZKWmBmK2MWuw6oMrMRkuYB3wEuj3n8h8BTcTZ/hpntSFHozh3T7EAt9v5i7JXfQV0tmnwWTDkz7mnSzrVGKo84ZgBrzGydmR0AHgYuaLLMBcB94e1HgTkKTz2RdCGwHliRwhid63i2rMP+cg/sq4b6OmzRU9i7b6Q7KteBpDJxDATKYu6Xh21xlzGzemAXUCQpD/h34JtxtmvAM5IWS7q+pZ1Lul7SIkmLtm/ffhRPw7lji5W917ztnVf8ymfXZjJ1cHw+8CMzi/dOn21mU4GPA1+UdGq8DZjZXWY23cyml5SUpDBU5zJMzzhdUr36ttuV567jS+XpuJuAwTH3B4Vt8ZYpl5QNFAAVwEzgUknfBXoBUUm1ZvZzM9sEYGbbJD1O0CX2Sgqfh+uEbM9uqPwQi0ZR737H1PiABh2PFRQHZdkBsnOIzPyET5/r2kwqE8dCYKSkUoIEMQ/4dJNlFgBXA68DlwIvWHBFYmN9CknzgRoz+7mkHkDEzKrD22cDt6XwObhOyHZtJ/rkr2DzmuB+QTGRC/4FFTftac1MKuxL5NKbYesHWEMdKh6ISgYfeUXnEpSyxGFm9ZJuAP4CZAH3mNkKSbcBi8xsAXA38ICkNUAlQXI5nL7A4+H4eTbwkJk9narn4Don27iiMWkAsGsH9s4rcPq8hMuGpJsKSqCg5Ji9ANFltpReOW5mTwJPNmm7NeZ2LXDZEbYxP+b2OmBS20bp3KEsXnXb8tWo/gDk5LZ/QM5lmEwdHHdtwKqrsJqqdIdxzNHg0c3bRk5DnjScA7xWVYdke3djK/6GvfEniETQieej0bN87u8EafBomHg6tuxlwOC4SWj0zHSH5VzG8MTRAdmGFYdMPWov/h/qUQijpqUxqmOH8nrB6ZfD5DMgGkUFJSi3W7rDci5jeFdVB2MWxZY3n6Y1utqvHE6GsrsQKR5EpM8QTxrONeGJo4ORIvFnMOvkJbKdc23HE0cHpPGnQJeYb8ld89CoZnOxHJFZFDOvrOqcO5SPcXRA6juUyLyvYTvKQUIlg5M64rC6Otj0HtG3n4eIiEyeE0x/mpOTwqidc8cKTxwdlIoHtv5K5w/fI/rYDxvvRtcuIXL
pV2DI2DaKzjl3LPOuKtdMdNnLzduWv5qGSJxzmcgTh2suK06XVMQPTp1zAU8crpnIhFMhtiaTRGT8ye0ag1VXYRuWY+vfwXZXtOu+nXOH518jXXMDhhP51L9hq94IBtdHz4T+x7Xb7q1yM9EFd0Ll5qChZxGRC798zFSnda6j88ThmlFWNgwchQaOSsv+be3Sj5IGwO4KbOXr6NRL0xKPc+5Q3lXlMo5tWde8bdN7WNSvKXEuE3jicBlHwyc3bxs9E0X87epcJvC/RJdxNGQsmnIWKBIM0o8/BY2YnO6wnHMhH+Noge2rwaq2oqws6NXXC921I+X1glMuQxNPByyYyS7br1p3LlN44ojDKrcQffpXsGU9BnD8DCKnXobye6c7tE5D2dlQ5IUZnctE3lXVhJlhK16DLes/alz9Jlb+XvqCcs65DOKJo6m6/dj6pc3bN3nicM458MTRXE4uGjqheXv/Ee0fi3POZSBPHE1IQuNnQ/GgjxqHT0KDj09fUM45l0F8cDwOFfUncslNH51VVdgPde3RrjHY7kqsYhNSBIoHBmcaOedcBvDE0QL1KEA9CtKyb9uxiegTPwlKbQAUDSRy/hdRYd+0xOOcc7G8qyoD2Yq/QWxF2IpN2Lo4A/bOOZcGnjgyjDU0YB82P4PLYk8Pds65NEpp4pB0jqTVktZIuiXO47mSHgkff0PSsCaPD5FUI+nmRLd5rFNWFho1o3n7cZPSEI1zzjWXssQhKQu4E/g4MBa4QlLTSauvA6rMbATwI+A7TR7/IfBUkts85mnkFBhzIiBQBE2ZgwaPTndYzjkHpHZwfAawxszWAUh6GLgAWBmzzAXA/PD2o8DPJcnMTNKFwHpgT5LbPOapZzGRuVfBCR8PivwV9AlKcDjnXAZIZVfVQKAs5n552BZ3GTOrB3YBRZLygH8HvtmKbQIg6XpJiyQt2r59e6ufRLoouwsqHoiKBnjScM5llEwdHJ8P/MjMalq7ATO7y8ymm9n0kpKStovMOec6uVR+ld0EDI65Pyhsi7dMuaRsoACoAGYCl0r6LtALiEqqBRYnsE3nnHMplMrEsRAYKamU4MN9HvDpJsssAK4GXgcuBV4wMwNOObiApPlAjZn9PEwuR9qmc865FEpZ4jCzekk3AH8BsoB7zGyFpNuARWa2ALgbeEDSGqCSIBEkvc1UPYfWsvo62LwOe28R5HRBI6dBv1IkpTs055w7agq+4Hds06dPt0WLFrXb/mzjSqK//yEQvrZZ2UQ+9e+o/3HtFoNzzh0tSYvNbHrT9kwdHD9mWUM90cXP0Jg0ABrqsTVvpy0m55xrS544UqH+QPO2hrr2j8M551LAE0cbU1Y2kalzm7aiEVPTEo9zzrU1v7IsFYaMIXL+l4i+/Szk5AaJxMc3nHMdhCeOFFCXrjBiMpHSCWG5qax0h+Scc23GE0cKKcsThnOu4/ExDuecc0nxxOGccy4pnjicc84lxROHc865pHjicM45lxRPHM4555LiicM551xSPHE455xLiicO55xzSTli4pDUV9Ldkp4K74+VdF3qQ3POOZeJEjniuJdgxr0B4f33gBtTFI9zzrkMl0jiKDaz3wJRCKZvBRpSGpVzzrmMlUji2COpiHBKO0mzgF0pjco551zGSqQ67k3AAmC4pNeAEuDSlEblnHMuYx0xcZjZW5JOA44HBKw2M58H1TnnOqkjJg5JVzVpmioJM7s/RTE555zLYIl0VZ0Qc7srMAd4C/DE4ZxznVAiXVVfir0vqRfwcKoCcs45l9lac+X4HqC0rQNxzjl3bEhkjOOPhKfiEiSascBvE9m4pHOAnwBZwK/M7L+aPJ5L0OU1DagALjezDZJmAHcdXAyYb2aPh+tsAKoJriWpN7PpicTinHOubSQyxvH9mNv1wEYzKz/SSpKygDuBuUA5sFDSAjNbGbPYdUCVmY2QNA/4DnA5sByYbmb1kvoDSyX9Mbz4EOAMM9uRQOzOOefaWCJjHC+3ctszgDVmtg5A0sPABUBs4rgAmB/efhT4uSSZ2d6YZbry0RGPc865NGtxjENStaTdcX6qJe1OYNsDgbKY++VhW9xlwqOJXUBRuP+ZklYA7wCfjznaMOAZSYslXZ/Ik3TOOdd2WjziMLP89gwkzv7fAMZJGgPcJ+kpM6sFZpvZJkl9gGclvWtmrzRdP0wq1wMMGTKkXWN3zrmOLOGzqiT1kTTk4E8Cq2wCBsfcHxS2xV1GUjZQQDBI3sjMVgE1wPjw/qbw/23A4wRdYs2Y2V1mNt3MppeUlCQQrnPOuUQkMh/H+ZLeB9YDLwMbgKcS2PZCYKSkUkldgHkENa9iLQCuDm9fCrxgZhaukx3ufygwGtggqYek/LC9B3A2wUC6c865dpLIWVXfAmYBz5nZFElnAJ850krhGVE3EMzlkQXcY2YrJN0GLDKzBcDdwAOS1gCVBMkFYDZwi6Q6gnLuXzCzHZKOAx6XdDD2h8zs6WSesHPOuaMjs8OfsCRpkZlNl7QUmGJmUUlLzWxS+4R49KZPn26LFi1KdxjOOXdMkbQ43rVyiRxx7JSUB/wVeFDSNoKrx51zznVCiQyOv0gwaP1l4GlgLXBeKoNyzjmXuRJJHNnAM8BLQD7wiJlVHHYN55xzHdYRE4eZfdPMxgFfBPoDL0t6LuWROeecy0jJVMfdBmwhuM6iT2rCcc45l+kSuY7jC5JeAp4nKAfyOTObmOrAnHPOZaZEzqoaDNxoZktSHItzzrljQCLVcb/WHoE455w7NrRmBkDnnHOdmCcO55xzSfHE4ZxzLimeOJxzziXFE4dzzrmkeOJwzjmXlESu43DOOXcMqanbz6Y9VeypO0Df7j3p372ASDCPUZvwxOGccx3I7gO1PLJ2EYt2fABAliJ8adxpjCns32b78K4q55zrQMr2VDUmDYAGi/LgmoVU19W22T48cTjnXAdSfaB5gtheW8O++ro224cnDuec60D6dstv1ja+sD8FXbq12T48cTjnXAaqa6hnb/2BpNcb1KMX1x5/It2zcwAY0bOES0qnkJvVdkPaPjjunHMZZu3u7TxVtoKt+6qZ3Xc4M/oMpTC3R0Lr5mRlM7NPKSN6llDbUE/v3O50y+7SpvF54nDOuQxSvqeKH73zAnXRBgAe27CEmrr9XFQ6iYgS7yQq6pqXqhC9q8o55zLJpj27GpPGQS9ufo+q/fvSFFFznjiccy4Fauvr2H2gFjNLar2cSPOP5dysbLLa8AK+o+VdVc4514aiZry/axt/2LCUHfv3MLvfcE7uO5yiromNUQzp0Zvi3Dx27K9pbLt42GR65XZPVchJ88ThnHNtqGxPFT9e/gLR8Ejjzx8s50BDPReXTk5ojKK4Wx7/Mv50Vu/aSmXtHkb16stx+cWpDjspKe2qknSOpNWS1ki6Jc7juZIeCR9/Q9KwsH2GpCXhz1JJFyW6Teecawt1DQ2tOh32wz07G5PGQS9tfp+dSYxR9O3ek1P7j+TC0smMLexP1/DU2kyRsiMOSVnAncBcoBxYKGmBma2MWew6oMrMRkiaB3wHuB
xYDkw3s3pJ/YGlkv4IWALbdM51chuqK3hty1q219Ywu99wxvTqR4+c3ITWNTPW7t7Ok2Ur2FG7h9P6j2Ba8ZCEu4q6RJp/rOZl55IdyUrqOWSyVHZVzQDWmNk6AEkPAxcAsR/yFwDzw9uPAj+XJDPbG7NMV4KEkeg2nXOdWHlNFT9c9jz7o/UArNq5hU+POIHT+o9MaP2y8HTYeosC8Nt1b1FbX8e5Q8ajBAaoh+b1pm/XfLbWVje2XXrcFHp26dqKZ5OZUpk4BgJlMffLgZktLRMeXewCioAdkmYC9wBDgc+GjyeyTefcMW5P3X42793F/oYG+nbLp7hb4tckfLCnsjFpHPTkB8uZUjSIngmU3SjfU9WYNA56dtO7nNxveEJHHcXd8vjS+NNZu3sHu+tqGZbXm9L8ooTjPxZk7OC4mb0BjJM0BrhP0lPJrC/peuB6gCFDhqQgQudcKuzcv5dH1i7mrYrgO2JeTi7/Mu4Mhub3TnALzY8KFP5LRE6crqau2TlkJXHxXUm3fEri1IzqKFI5OL4JGBxzf1DYFncZSdlAAVARu4CZrQJqgPEJbvPgeneZ2XQzm15SUnIUT8M51542VFc0Jg0IJiVasHEpBxrqD7PWR4bm9aZr1qGDyZ8cMoH8BLuKhuQVUtjkyOTiYZMTXr8zSOURx0JgpKRSgg/3ecCnmyyzALgaeB24FHjBzCxcpyzsnhoKjAY2ADsT2KZzLs2q9u+hrGYnB6L1DOhewIAevRJed0ftnmZt66sr2Ft/gC4JFOob2KMXX5k4h4XbN7J9Xw2z+gxjZEHfhPfft1tPbpxwJqt3bqXqwD5G9+pLaV5mnQ6bbilLHOGH/g3AX4As4B4zWyHpNmCRmS0A7gYekLQGqCRIBACzgVsk1QFR4AtmtgMg3jZT9Rycc8nbUVvDL1b+lQ/2VAGQE8nixglnMqJnYkf+/Xv0bNY2sfdA8hI8KwpgSF5vhuQl2rXVXL/uBfTrXtDq9Ts6JXs5/LFo+vTptmjRonSH4Vyn8Oa2Ddy9+m+HtB1f0JcvjjstodLee+v28/yH7/Fk2XKiZpTmFXH18bPo7x/k7U7SYjOb3rQ9YwfHnXPHpqr9e5u1bdm3m/0NdQklju45uZw7eBzTi4dwINpASbc8urdxWXB3dDxxOOea2bRnJ+/u3MLeugMcX9iP0vwichK8gG1onC6iWX2GkZeT+OByViRC/x5+hJGpvDquc+4Qm/bs5AfLnuO3697iT2XL+cGy51i9c0vC65fmF/GZETPolpWDELP6lHJKv5FEMqi6qzs6fsThnDvEml3b2dOkRtOCDe8womefhGom5WbncEr/EYwr7E+9Remd271Dldtwnjicc03UNtQ1a9vXUNfsauoj6Z1gGXF37PHE4VyGOtDQQNSiraqMGrUo22traIhGKe6al9D1DweNLOiDEMZHZ1yeNWh0UqfDuo7NE4dzGaY+2sB7u7bxdNkK9tQfYO7A0UzoPTDh6q576vbz8ub3+fMHy6m3KFOKBnFJ6ZSES2AMze/Nl8efwZMfLKe6vpY5A0YzuWjQ0Twl18F44nAuw2yoruSny19s/L7/6/f+zj+MOpFZfUsTWn/t7h38YeOyxvtvV5TTr1tPLhg2KaHqrlmKMKawH8N7FtMQjdItx0+FdYfys6qcyzArqzbT9LLcZzetora++dhDPOuqdzRrW7jjg2YD3kfSJSvbk4aLyxOHcxmmaYE+gO5ZXRI+nbVvt+YlO4blFdI1iXEO5w7H30nOpcAHNVWsrPqQ/Q31jCscQGl+EVmRxL6njSnsR9eybGrDarACPj5kXMID3CMLSijNL2J9dVBount2Fz42eJyfEuvajNeqcq6NfVBTyfeXPtc4mZCAGyecyehe/RLeRllNFauqtrC34QDjCvtTml+U1Af/rv372LR3J3XRBvp3L6BPB54bwqWO16pyrp0sr/zwkBnoDPhL2UqG55eQk5XYh//gvEIG5xW2OoaC3G4U5B55trvOoK6ujvLycmpra9MdSsbq2rUrgwYNIicnsVO/PXE418biXUC3p/7AIddFuPZTXl5Ofn4+w4YNS+isss7GzKioqKC8vJzS0sTO3PPBcefa2ITCgc0mKT1r4OikLsJzbae2tpaioiJPGi2QRFFRUVJHZP5Odq6NDetZxJfGncHTZSuobahj7qDRjCsckO6wOjVPGoeX7OvjicO5OHbsq6F8704aolEG9ChIahKhnEgW43r3Z2RBH4wouXFOr3XuWOaJw7kmtuzdxU+Xv0TF/mDu625ZOfzrhDkMzU9uKtIuWVkEMxy7zmj+/Pnk5eVx8803x338iSeeYNSoUYwdO7adIzt6PsbhXBPLKz9sTBoQVIZ9afNqoklWh3XucJ544glWrlyZ7jBaxROHc01s2be7WVtZzU7qo5443OF9+9vfZtSoUcyePZvVq1cD8Mtf/pITTjiBSZMmcckll7B3717+9re/sWDBAr761a8yefJk1q5dG3e5TOWJw3VYNXW1VNclf+7++N7NB7JP7nucnxXlDmvx4sU8/PDDLFmyhCeffJKFCxcCcPHFF7Nw4UKWLl3KmDFjuPvuuznppJM4//zz+d73vseSJUsYPnx43OUylf8luA6ntr6OZZWbWLBxGfUW5eODxzGteEjC80mM7NmHS0qn8KcP3qEhGuWMAaOYXDw4xVG7Y91f//pXLrroIrp37w7A+eefD8Dy5cv5+te/zs6dO6mpqeFjH/tY3PUTXS4TeOJwHc6a3du5e/XfGu8/tGYhXSPZzEywLHmPnFzmDhzNtOIhRC1K7649yJIfnLvWueaaa3jiiSeYNGkS9957Ly+99NJRLZcJ/K/BdThv7figWdvLm9fQEG1IeBuSKOrag5Ju+Z40XEJOPfVUnnjiCfbt20d1dTV//OMfAaiurqZ///7U1dXx4IMPNi6fn59PdXV14/2WlstEfsThMk59tIG1u3fwyub3qbcop/UfycieJeQkOMbQK7d7s7airt2RJwCXQlOnTuXyyy9n0qRJ9OnThxNOOAGAb33rW8ycOZOSkhJmzpzZmCzmzZvH5z73OX7605/y6KOPtrhcJvLquC7jvLdrGz9c9twhlZ2+PP4Mxhb2T2j9spoqfrDsOfaFNaOyFeGmiXMY3rMkBdG6TLdq1SrGjBmT7jAyXrzXKS3VcSWdA/yE4CqoX5nZfzV5PBe4H5gGVACXm9kGSXOB/wK6AAeAr5rZC+E6LwH9gX3hZs42s22pfB6ufb25bUOzcoAvfvgeo3v1S2gyo8F5hXx10lzWV1cQtSjD8osY3KP1lWadc4dKWeKQlAXcCcwFyoGFkhaYWewVL9cBVWY2QtI84DvA5cAO4Dwz+1DSeOAvwMCY9a40Mz+EyGCb9+6ivKaKKMaQvN5JleyIlxwi0Kxw4OEM7NGLgT16JbGGcy5RqTzimAGsMbN1AJIeBi4AYhPHBcD88PajwM8lyczejllmBdBNUq6Z7U9hvK6NlNVU8cN3nmdvOMd1t6wcbpo4hyF5iZXsOKFkKK9sXnNIGfLTBxzvh
eqcyxCpTBwDgbKY++XAzJaWMbN6SbuAIoIjjoMuAd5qkjR+LakB+D1wu3WGgZpjyMLtGxuTBgQlO17fuj7hxHFcz2JunjiH17euo96inNR3OMf1LE5VuM65JGX0WVWSxhF0X50d03ylmW2SlE+QOD5LME7SdN3rgesBhgwZ0g7RuoO27t3VrG3znuZtLclShBEFfRhR0Kctw3LOtZFUnp+4CYi93HZQ2BZ3GUnZQAHBIDmSBgGPA1eZ2dqDK5jZpvD/auAhgi6xZszsLjObbmbTS0r8bJr2NKNP8wvtTup3XBoicc6lQioTx0JgpKRSSV2AecCCJsssAK4Ob18KvGBmJqkX8GfgFjN77eDCkrIlFYe3c4BPAstT+Bw6rc17drFw2wYWbt/I5jhHEIdzfK++XH7cNHpk59I9uwuXlU5lTGG/FEXqXOcybNgwduzYkfAy1157LX369GH8+PFtFkPKuqrCMYsbCM6IygLuMbMVkm4DFpnZAuBu4AFJa4BKguQCcAMwArhV0q1h29nAHuAvYdLIAp4Dfpmq59BZfVBdyY+WP8/e+uA6iO7ZXfjXCXMYkpfYKa15ObmcOfB4pob1neJdkOdcpnpj23qe2LCUyv176Z3bnQuHTWJmnKPoY8U111zDDTfcwFVXXdVm20zppbRm9qSZjTKz4Wb27bDt1jBpYGa1ZnaZmY0wsxkHz8Ays9vNrIeZTY752WZme8xsmplNNLNxZvZlM0u8joRLyBvb1zcmDYC99QdYuG1D0tvpldvdk4Y7pryxbT2/ef9NKvcHJc0r9+/lN++/yRvb1h/Vdjds2MDo0aO55pprGDVqFFdeeSXPPfccJ598MiNHjuTNN9+ksrKSCy+8kIkTJzJr1iyWLVsGQEVFBWeffTbjxo3jH//xH4k9F+g3v/kNM2bMYPLkyfzTP/0TDQ3NPw5PPfVUevdObhKyI/EaDK6ZD/c0n48i2e4q545FT2xYyoEmNc0ORBt4YsPSo972mjVr+MpXvsK7777Lu+++y0MPPcSrr77K97//fe644w6+8Y1vMGXKFJYtW8Ydd9zReITwzW9+k9mzZ7NixQouuugiPvggqMW2atUqHnnkEV577TWWLFlCVlZWu9W4yuizqlzrNESjrKvewVs7ysiSmFI8mNL8IiIJ1mo6qW8pK3duPqRtVoKVZZ07lh080ki0PRmlpaVMmDABgHHjxjFnzhwkMWHCBDZs2MDGjRv5/e9/D8CZZ55JRUUFu3fv5pVXXuGxxx4D4BOf+ASFhUGX8fPPP8/ixYsba2Lt27ePPn3a50xETxwd0Nrd2/nhOy80XkD3/IeruXniWQnXahpT2J/LSqfy57J3APHJIeMZ3csHt13H1zu3e9wk0bsNulxzcz+aDyYSiTTej0Qi1NfXk5OTk9T2zIyrr76a//zP/zzq2JLlXVUd0Eub3zvkquuoGW8kMUaRl5PLWYNGc+vUT/CNqecyZ+DohCdBcu5YduGwSXSJZB3S1iWSxYXDJqV836ecckpjV9NLL71EcXExPXv25NRTT+Whhx4C4KmnnqKqqgqAOXPm8Oijj7JtW1Cqr7Kyko0bN6Y8TvDE0eGYWbM+WoADcQbNjqTQB7ddJzOzTymfGTmj8Qijd253PjNyRrucVTV//nwWL17MxIkTueWWW7jvvvsA+MY3vsErr7zCuHHjeOyxxxovaB47diy33347Z599NhMnTmTu3Lls3ry52XavuOIKTjzxRFavXs2gQYPaZEpaL6veAS2v/JCfrXjpkLabJpzJ8d7d5DohL6uemIwpq+7SY2RBH24YdxrPb1pNROKsgaM5Lt9rPTnn2oYnjg4oNyubCb0HMrZXP0BkRbxH0jnXdjxxdGBZTQb5nHOuLXjiyFD76g+wbV81EUXo0y2f3ATn23bOuVTzT6MMtH1fNQ+vXcTyquAMiRklw7i4dBKFuT3SHJlzzvnpuBnprR1ljUkD4M3tG1hVtSWNETnn3Ec8cWSYhmgDb1WUNWtfUdX8/GznXOeTTFn1srIyzjjjDMaOHcu4ceP4yU9+0iYxeFdVhsmKZDG2Vz82VFcc0j7SZ8Nzrl1EV72Ovfo4VFdAfhGafRGRMSemO6xWyc7O5gc/+AFTp06lurqaadOmMXfuXMaOHXtU2/UjjhTY31DP8soPuXPFy/xq1Wu8t3MbDRZNeP2ZfUrp371n4/3hPYsZV9g/FaE652JEV72OPXt/kDQAqiuwZ+8nuur1o9puusqq9+/fn6lTpwKQn5/PmDFj2LSp6USsyfPEkQLv7drKz1a8xLLKTSzcsZEfvvM863dXHHnFUL/uPfnX8Wdy04QzuXniWfzzmFMp6Zafwoidc0BwpFF/4NDG+gNB+1FKd1n1DRs28PbbbzNz5syjfi7eVXUYdQ0NSJCdxPUQDdEGnit/95A2w1hSUcaIgsTnPi/I7U6B14lyrn1Vt/AFr6X2JKSzrHpNTQ2XXHIJP/7xj+nZs2fcZZLhiSOOffUHWFG1mefK3yU3K5uPDR7LqII+CSUQAyQ1a4/EaXPOZZj8ovhJIr/oqDedrrLqdXV1XHLJJVx55ZVcfPHFyQceh3dVxbGiagu/fPc11tdU8O6urfx0+YusT/AbR3Yki7kDDy0UFpGYXDQ4FaE659qQZl8E2V0ObczuErSnWCrKqpsZ1113HWPGjOGmm25qs1j9iKOJuoYGntvUtKsJllZsSvjMplEFJdw4/kxe37aO3Eg2s/qWMiy/bef8dc61vciYE4lCWs6qmj9/Ptdeey0TJ06ke/fuh5RVv+KKKxg3bhwnnXRS3LLq0WiUnJwc7rzzToYOHdq4zddee40HHniACRMmMHnyZADuuOMOzj333KOK1cuqN1EfbeDOla+wssl1E+cPncgnhoxPRXjOuRTysuqJSaasundVNZEdyeLsgWOIHZHIiWQx3k+Hdc45wLuq4hpZUMJXJp7FsspN5EaymdB7AEPbYHDMOec6Ak8ccWRHshhZ0Mev1naugzCzuGc7ukCyQxbeVeWc69C6du1KRUVF0h+OnYWZUVFRQdeuXRNex484nHMd2qBBgygvL2f79u3pDiVjde3alUGDBiW8fEoTh6RzgJ8AWcCvzOy/mjyeC9wPTAMqgMvNbIOkucB/AV2AA8BXzeyFcJ1pwL1AN+BJ4MvmXyWccy3IycmhtLQ03WF0KCnrqpKUBdwJfBwYC1whqWlJxuuAKjMbAfwI+E7YvgM4z8wmAFcDD8Ss8z/A54CR4c85qXoOzjnnmkvlGMcMYI2ZrTOzA8DDwAVNlrkAuC+8/SgwR5LM7G0z+zBsXwF0k5QrqT/Q08z+Hh5l3A9cmMLn4JxzrolUJo6BQOyMROVhW9xlzKwe2AU0Pe/1EuAtM9sfLl9+hG0CIOl6SYskLfK+TeecazsZPTguaRxB99XZya5rZncBd4Xb2S5p4xFWaUkxQddZpvL4jo7Hd3Q8vqOT6fENjdeYysSxCYit7DcobIu3TLmkbKCAYJAcSYOAx4GrzGxtzPKxQ//xttmMmSVez7wJSYviXXKfKTy+o+Px
HR2P7+hkenwtSWVX1UJgpKRSSV2AecCCJsssIBj8BrgUeMHMTFIv4M/ALWb22sGFzWwzsFvSLAVX81wF/CGFz8E551wTKUsc4ZjFDcBfgFXAb81shaTbJJ0fLnY3UCRpDXATcEvYfgMwArhV0pLw5+Bl3F8AfgWsAdYCT6XqOTjnnGsupWMcZvYkwbUWsW23xtyuBS6Ls97twO0tbHMR0J5lau9qx321hsd3dDy+o+PxHZ1Mjy+uTlFW3TnnXNvxWlXOOeeS4onDOedcUjxxhCSdI2m1pDWSbonzeK6kR8LH35A0rB1jGyzpRUkrJa2Q9OU4y5wuaVfMyQS3xttWCmPcIOmdcN/NpltU4Kfh67dM0tR2jO34mNdliaTdkm5ssky7vn6S7pG0TdLymLbekp6V9H74f2EL614dLvO+pKvjLZOi+L4n6d3w9/d4ePZjvHUP+15IYXzzJW2K+R3GnR/1SH/rKYzvkZjYNkha0sK6KX/9jpqZdfofgiKMa4HjCAorLgXGNlnmC8D/hrfnAY+0Y3z9ganh7XzgvTjxnQ78KY2v4Qag+DCPn0twBpyAWcAbafxdbwGGpvP1A04FpgLLY9q+S3AKOgRnGH4nznq9gXXh/4Xh7cJ2iu9sIDu8/Z148SXyXkhhfPOBmxP4/R/2bz1V8TV5/AfArel6/Y72x484Aq2uq9UewZnZZjN7K7xdTXB6c9xSKxnsAuB+C/wd6BXWHmtvc4C1ZtbaSgJtwsxeASqbNMe+x+4jfh22jwHPmlmlmVUBz5KCQp/x4jOzZyw4zR7g7xx6MW67auH1S0Qif+tH7XDxhZ8bnwL+r6332148cQTaqq5WyoVdZFOAN+I8fKKkpZKeCsu1tCcDnpG0WNL1cR5P5DVuD/No+Q82na8fQF8LLnKF4Kiob5xlMuV1vJaWr6E60nshlW4Iu9LuaaGrLxNev1OArWb2fguPp/P1S4gnjmOIpDzg98CNZra7ycNvEXS/TAJ+BjzRzuHNNrOpBGX0vyjp1Hbe/xGFFQzOB34X5+F0v36HsKDPIiPPlZf0H0A98GALi6TrvfA/wHBgMrCZoDsoE13B4Y82Mv5vyRNHIJm6WqhJXa32ICmHIGk8aGaPNX3czHabWU14+0kgR1Jxe8VnZpvC/7cR1Bib0WSRRF7jVPs4QaXlrU0fSPfrF9p6sPsu/H9bnGXS+jpKugb4JHBlmNyaSeC9kBJmttXMGswsCvyyhf2m+/XLBi4GHmlpmXS9fsnwxBFodV2t9ggu7BO9G1hlZj9sYZl+B8dcJM0g+N22S2KT1ENS/sHbBIOoy5sstgC4Kjy7ahawK6Zbpr20+E0vna9fjNj32NXEr8P2F+BsSYVhV8zZYVvKKZjR89+A881sbwvLJPJeSFV8sWNmF7Ww30T+1lPpLOBdMyuP92A6X7+kpHt0PlN+CM76eY/gjIv/CNtuI/gjAehK0MWxBngTOK4dY5tN0G2xDFgS/pwLfB74fLjMDQSTXi0lGLg8qR3jOy7c79IwhoOvX2x8IpgRci3wDjC9nX+/PQgSQUFMW9peP4IEthmoI+hnv45gzOx54H3gOaB3uOx0gqmXD657bfg+XAP8QzvGt4ZgfODge/DgWYYDgCcP915op/geCN9bywiSQf+m8YX3m/2tt0d8Yfu9B99zMcu2++t3tD9ecsQ551xSvKvKOedcUjxxOOecS4onDuecc0nxxOGccy4pnjicc84lxROHcxksrNr7p3TH4VwsTxzOOeeS4onDuTYg6TOS3gznUPiFpCxJNZJ+pGAOlecllYTLTpb095h5LQrD9hGSngsLLb4laXi4+TxJj4ZzYTzYXlWZnWuJJw7njpKkMcDlwMlmNhloAK4kuFp9kZmNA14GvhGucj/w72Y2keBK54PtDwJ3WlBo8SSCK48hqIZ8IzCW4Mrik1P8lJw7rOx0B+BcBzAHmAYsDA8GuhEUKIzyUTG73wCPSSoAepnZy2H7fcDvwvpEA83scQAzqwUIt/emhbWNwlnjhgGvpvxZOdcCTxzOHT0B95nZ1w5plP6/Jsu1tr7P/pjbDfjfrUsz76py7ug9D1wqqQ80zh0+lODv69JwmU8Dr5rZLqBK0ilh+2eBly2Y2bFc0oXhNnIldW/PJ+Fcovybi3NHycxWSvo6waxtEYKKqF8E9gAzwse2EYyDQFAy/X/DxLAO+Iew/bPALyTdFm7jsnZ8Gs4lzKvjOpcikmrMLC/dcTjX1ryryjnnXFL8iMM551xS/IjDOedcUjxxOOecS4onDuecc0nxxOGccy4pnjicc84l5f8Ht8j8myVBLFwAAAAASUVORK5CYII=", + "text/plain": [ + "
" + ] + }, + "metadata": { + "needs_background": "light" + }, + "output_type": "display_data" + } + ], + "source": [ + "for i in ['Precision', 'Recall']:\n", + " sns.set_palette(\"Set2\")\n", + " plt.figure()\n", + " sns.scatterplot(x=\"epoch\", \n", + " y=\"value\", \n", + " hue='data',\n", + " data=compare_metric(df_list = [output1, output2], metric=i)\n", + " ).set_title(f'{i} comparison using test set');" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Referring to the figures above, it is rather obvious that the number of epochs is too low as the model's performances have not stabilised. Reader can decide on the number of epochs and other hyperparameters to adjust suit the application.\n", + "\n", + "As stated previously, it is interesting to see model2 (using both implicit and explicit data) performed consistently better than model1 (using only explicit ratings). " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 5. Similar users and items" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the LightFM package operates based on latent embeddings, these can be retrieved once the model has been fitted to assess user-user and/or item-item affinity." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 5.1 User affinity" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The user-user affinity can be retrieved with the `get_user_representations` method from the fitted model as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0.17943075, -0.9845197 , 1.724939 , ..., 3.7842598 ,\n", + " -3.438375 , 3.6794803 ],\n", + " [-0.33647582, 0.7195082 , 2.8680375 , ..., 4.22038 ,\n", + " -4.610963 , 4.010645 ],\n", + " [ 0.14344296, 2.1440773 , 1.8434161 , ..., 1.9370167 ,\n", + " -5.640826 , 4.653452 ],\n", + " ...,\n", + " [ 1.4312286 , -1.0642868 , 2.8821077 , ..., 2.8192847 ,\n", + " -2.7393079 , 3.4289758 ],\n", + " [-0.33159262, 0.7337389 , 2.8301528 , ..., 4.112663 ,\n", + " -4.462565 , 3.8659678 ],\n", + " [-0.7364118 , 1.3901651 , 2.1960316 , ..., 3.8899298 ,\n", + " -4.5879855 , 4.744391 ]], dtype=float32)" + ] + }, + "execution_count": 37, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "_, user_embeddings = model2.get_user_representations(features=user_features)\n", + "user_embeddings" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In order to retrieve the top N similar users, we can use the `similar_users` from `recommenders`. For example, if we want to choose top 10 users most similar to the user 1:" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
   userID     score
0     555  0.999998
1      54  0.999997
2     314  0.999995
3     411  0.999993
4     395  0.999992
5     465  0.999992
6     481  0.999990
7     282  0.999990
8     527  0.999990
9      57  0.999989
\n", + "
" + ], + "text/plain": [ + " userID score\n", + "0 555 0.999998\n", + "1 54 0.999997\n", + "2 314 0.999995\n", + "3 411 0.999993\n", + "4 395 0.999992\n", + "5 465 0.999992\n", + "6 481 0.999990\n", + "7 282 0.999990\n", + "8 527 0.999990\n", + "9 57 0.999989" + ] + }, + "execution_count": 38, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "similar_users(user_id=1, \n", + " user_features=user_features, \n", + " model=model2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### 5.2 Item affinity" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Similar to the user affinity, the item-item affinity can be retrieved with the `get_item_representations` method using the fitted model." + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[-0.07855016, -0.06326439, -0.24408759, ..., 0.91503495,\n", + " -1.1991384 , 0.6392026 ],\n", + " [ 0.02296161, 0.21057224, 0.52859396, ..., 0.6266738 ,\n", + " -0.5909869 , 0.48717606],\n", + " [-0.05290217, 0.21497665, 0.12442638, ..., 0.64513564,\n", + " -0.89034337, 0.47523445],\n", + " ...,\n", + " [ 0.37707207, 0.12548159, 0.74360174, ..., 0.19332102,\n", + " -0.24798231, -0.3791776 ],\n", + " [-0.27374834, -0.23832163, 0.9083196 , ..., 0.9711132 ,\n", + " -0.36962402, 0.20986083],\n", + " [-0.26275527, -0.3118822 , 0.60458297, ..., 0.52483046,\n", + " -0.46068186, 0.53892124]], dtype=float32)" + ] + }, + "execution_count": 39, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "_, item_embeddings = model2.get_item_representations(features=item_features)\n", + "item_embeddings" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The function to retrieve the top N similar items is similar to similar_users() above. For example, if we want to choose top 10 items most similar to the item 10:" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
   itemID     score
0     181  0.996882
1      14  0.996467
2     146  0.996463
3     373  0.995977
4     321  0.995873
5     114  0.995869
6      44  0.995434
7    1251  0.994995
8     352  0.994736
9     417  0.994391
\n", + "
" + ], + "text/plain": [ + " itemID score\n", + "0 181 0.996882\n", + "1 14 0.996467\n", + "2 146 0.996463\n", + "3 373 0.995977\n", + "4 321 0.995873\n", + "5 114 0.995869\n", + "6 44 0.995434\n", + "7 1251 0.994995\n", + "8 352 0.994736\n", + "9 417 0.994391" + ] + }, + "execution_count": 40, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "similar_items(item_id=10, \n", + " item_features=item_features, \n", + " model=model2)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Record results for tests - ignore this cell\n", + "store_metadata(\"eval_precision\", eval_precision)\n", + "store_metadata(\"eval_recall\", eval_recall)\n", + "store_metadata(\"eval_precision2\", eval_precision2)\n", + "store_metadata(\"eval_recall2\", eval_recall2)\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 6. Conclusion" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this notebook, the background of hybrid matrix factorisation model has been explained together with a detailed example of LightFM's implementation. \n", + "\n", + "The process of incorporating additional user and item metadata has also been demonstrated with performance comparison. Furthermore, the calculation of both user and item affinity scores have also been demonstrated and extracted from the fitted model.\n", + "\n", + "This notebook remains a fairly simple treatment on the subject and hopefully could serve as a good foundation for the reader." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## References" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "- [[1](https://arxiv.org/abs/1507.08439)]. Maciej Kula - Metadata Embeddings for User and Item Cold-start Recommendations, 2015. arXiv:1507.08439\n", + "- [[2](https://making.lyst.com/lightfm/docs/home.html)]. LightFM documentation,\n", + "- [3]. Charu C. Aggarwal - Recommender Systems: The Textbook, Springer, April 2016. ISBN 978-3-319-29659-3\n", + "- [4]. Deepak K. Agarwal, Bee-Chung Chen - Statistical Methods for Recommender Systems, 2016. ISBN: 9781107036079 \n" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "reco_cpu", + "language": "python", + "name": "conda-env-reco_cpu-py" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.13" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} From 2ca708eb13a1a8a34a9e2f8f622452bcde82dcfe Mon Sep 17 00:00:00 2001 From: miguelgfierro Date: Fri, 29 Dec 2023 09:20:39 +0100 Subject: [PATCH 5/5] :memo: Signed-off-by: miguelgfierro --- .../fm_deep_dive.ipynb | 2 +- .../lightfm_deep_dive.ipynb | 23 ++++++++----------- 2 files changed, 11 insertions(+), 14 deletions(-) diff --git a/examples/02_model_collaborative_filtering/fm_deep_dive.ipynb b/examples/02_model_collaborative_filtering/fm_deep_dive.ipynb index 04d782e5a6..190f3bc5d1 100644 --- a/examples/02_model_collaborative_filtering/fm_deep_dive.ipynb +++ b/examples/02_model_collaborative_filtering/fm_deep_dive.ipynb @@ -15,7 +15,7 @@ "source": [ "# Factorization Machine Deep Dive\n", "\n", - "Factorization machine (FM) is one of the representative algorithms that are used for building hybrid recommenders model. 
The algorithm is powerful in terms of capturing the effects of not just the input features but also their interactions. The algorithm provides better generalization capability and expressiveness compared to other classic algorithms such as SVMs. The most recent research extends the basic FM algorithms by using deep learning techniques, which achieve remarkable improvement in a few practical use cases.\n", + "Factorization machine (FM) is one of the representative algorithms that are used for building recommendation model. The algorithm is powerful in terms of capturing the effects of not just the input features but also their interactions. The algorithm provides better generalization capability and expressiveness compared to other classic algorithms such as SVMs. The most recent research extends the basic FM algorithms by using deep learning techniques, which achieve remarkable improvement in a few practical use cases.\n", "\n", "This notebook presents a deep dive into the Factorization Machine algorithm, and demonstrates some best practices of using the contemporary FM implementations like [`xlearn`](https://github.com/aksnzhy/xlearn) for dealing with tasks like click-through rate prediction." ] diff --git a/examples/02_model_collaborative_filtering/lightfm_deep_dive.ipynb b/examples/02_model_collaborative_filtering/lightfm_deep_dive.ipynb index 5ce4b79151..8e588760fa 100755 --- a/examples/02_model_collaborative_filtering/lightfm_deep_dive.ipynb +++ b/examples/02_model_collaborative_filtering/lightfm_deep_dive.ipynb @@ -13,16 +13,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# LightFM - hybrid matrix factorisation on MovieLens (Python, CPU)" + "# LightFM - Factorization Machine on MovieLens (Python, CPU)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "This notebook explains the concept of a hybrid matrix factorisation based model for recommendation, it also outlines the steps to construct a pure matrix factorisation and a hybrid models using the [LightFM](https://github.com/lyst/lightfm) package. It also demonstrates how to extract both user and item affinity from a fitted hybrid model.\n", + "This notebook explains the concept of a Factorization Machine based model for recommendation, it also outlines the steps to construct a pure matrix factorization and a Factorization Machine using the [LightFM](https://github.com/lyst/lightfm) package. It also demonstrates how to extract both user and item affinity from a fitted model.\n", "\n", - "## 1. Hybrid matrix factorisation model\n", + "## 1. Factorization Machine model\n", "\n", "### 1.1 Background\n", "\n", @@ -30,27 +30,24 @@ "- Content based model,\n", "- Collaborative filtering model.\n", "\n", - "The content-based model recommends based on similarity of the items and/or users using their description/metadata/profile. On the other hand, collaborative filtering model (discussion is limited to matrix factorisation approach in this notebook) computes the latent factors of the users and items. It works based on the assumption that if a group of people expressed similar opinions on an item, these peole would tend to have similar opinions on other items. For further background and detailed explanation between these two approaches, the reader can refer to machine learning literatures [3, 4].\n", + "The content-based model recommends based on similarity of the items and/or users using their description/metadata/profile. 
On the other hand, collaborative filtering model (discussion is limited to matrix factorization approach in this notebook) computes the latent factors of the users and items. It works based on the assumption that if a group of people expressed similar opinions on an item, these people would tend to have similar opinions on other items. For further background and detailed explanation between these two approaches, the reader can refer to machine learning literatures [3, 4].\n", "\n", "The choice between the two models is largely based on the data availability. For example, the collaborative filtering model is usually adopted and effective when sufficient ratings/feedbacks have been recorded for a group of users and items.\n", "\n", "However, if there is a lack of ratings, content based model can be used provided that the metadata of the users and items are available. This is also the common approach to address the cold-start issues, where there are insufficient historical collaborative interactions available to model new users and/or items.\n", "\n", - "\n", + "### 1.2 Factorization Machine algorithm\n", "\n", - "### 1.2 Hybrid matrix factorisation algorithm\n", + "In view of the above problems, there have been a number of proposals to address the cold-start issues by combining both content-based and collaborative filtering approaches. The Factorization Machine model is among one of the solutions proposed [1]. \n", "\n", - "In view of the above problems, there have been a number of proposals to address the cold-start issues by combining both content-based and collaborative filtering approaches. The hybrid matrix factorisation model is among one of the solutions proposed [1]. \n", - "\n", - "In general, most hybrid approaches proposed different ways of assessing and/or combining the feature data in conjunction with the collaborative information.\n", + "In general, most approaches proposed different ways of assessing and/or combining the feature data in conjunction with the collaborative information.\n", "\n", "### 1.3 LightFM package \n", "\n", - "LightFM is a Python implementation of a hybrid recommendation algorithms for both implicit and explicit feedbacks [1].\n", + "LightFM is a Python implementation of a Factorization Machine recommendation algorithm for both implicit and explicit feedbacks [1].\n", "\n", - "It is a hybrid content-collaborative model which represents users and items as linear combinations of their content features’ latent factors. The model learns **embeddings or latent representations of the users and items in such a way that it encodes user preferences over items**. These representations produce scores for every item for a given user; items scored highly are more likely to be interesting to the user.\n", + "It is a Factorization Machine model which represents users and items as linear combinations of their content features’ latent factors. The model learns **embeddings or latent representations of the users and items in such a way that it encodes user preferences over items**. These representations produce scores for every item for a given user; items scored highly are more likely to be interesting to the user.\n", "\n", "The user and item embeddings are estimated for every feature, and these features are then added together to be the final representations for users and items. 
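As a sketch of this construction, following the notation of [1] (the symbols below are illustrative): the latent representation of a user is the sum of the embeddings of the features describing that user, likewise for an item, and the predicted score is their dot product plus bias terms passed through a link function such as the sigmoid,

$$\textbf{q}_{u}=\sum_{j \in f_{u}}\textbf{e}_{j}^{U}, \qquad \textbf{p}_{i}=\sum_{j \in f_{i}}\textbf{e}_{j}^{I}, \qquad \hat{r}_{ui}=\sigma\left(\textbf{q}_{u}\cdot\textbf{p}_{i}+b_{u}+b_{i}\right)$$

where $f_{u}$ and $f_{i}$ are the feature sets of user $u$ and item $i$, and the biases $b_{u}$ and $b_{i}$ are likewise sums of per-feature bias terms. When each user and item has only a single indicator feature, this reduces to a classical matrix factorization model.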
\n", "\n", @@ -1907,7 +1904,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In this notebook, the background of hybrid matrix factorisation model has been explained together with a detailed example of LightFM's implementation. \n", + "In this notebook, the background of Factorization Machine model has been explained together with a detailed example of LightFM's implementation. \n", "\n", "The process of incorporating additional user and item metadata has also been demonstrated with performance comparison. Furthermore, the calculation of both user and item affinity scores have also been demonstrated and extracted from the fitted model.\n", "\n",