diff --git a/.github/prompts/cherry-pick-between-branches.prompt.md b/.github/prompts/cherry-pick-between-branches.prompt.md new file mode 100644 index 000000000000..f64cafb205ec --- /dev/null +++ b/.github/prompts/cherry-pick-between-branches.prompt.md @@ -0,0 +1,259 @@ +# Cherry-Pick Commits Between Branches + +Reusable prompt for automated agents cherry-picking commits from a **source branch** into a **target branch**. Designed to be repository-agnostic. + +--- + +## Inputs (provided by the user) + +| Variable | Example | Description | +|----------|---------|-------------| +| `SOURCE_BRANCH` | `develop-teak.3` | Branch containing the commits to cherry-pick | +| `TARGET_BRANCH` | `release/ulmo` | Branch that will receive the cherry-picks | +| `CONFLICT_STRATEGY` | `theirs` | How to resolve conflicts: `theirs` (prefer source), `ours` (prefer target), or `manual` | + +--- + +## Step-by-step procedure + +### 1. Set up the working branch + +```bash +git fetch origin $TARGET_BRANCH $SOURCE_BRANCH +git checkout -b cherry-pick/$SOURCE_BRANCH-into-$TARGET_BRANCH origin/$TARGET_BRANCH +``` + +> **CRITICAL — Always branch off `origin/$TARGET_BRANCH`.** +> Never branch from `master`, `main`, or any other branch. The new branch +> must share history *only* with the target branch so the PR diff is clean. + +### 2. Identify commits to cherry-pick + +```bash +# List commits on SOURCE that are NOT on TARGET (oldest first) +git log --oneline --reverse origin/$TARGET_BRANCH..origin/$SOURCE_BRANCH +``` + +Before cherry-picking, **filter out** commits that should NOT be applied: + +| Filter | Command | Why | +|--------|---------|-----| +| **Merge commits** | Exclude lines with more than one parent (`git log --no-merges ...`) | Merge commits bring in entire branch histories and cannot be cleanly cherry-picked. | +| **Already-present changes** | After cherry-picking, if `git cherry-pick` produces an empty commit, skip it with `git cherry-pick --skip`. 
| Changes may already be in the target branch via a different commit (e.g., a rebase or earlier backport). | + +**Recommended: build the filtered list first, then cherry-pick in one pass.** + +```bash +# Get non-merge commits only +COMMITS=$(git log --oneline --reverse --no-merges origin/$TARGET_BRANCH..origin/$SOURCE_BRANCH | awk '{print $1}') +``` + +### 3. Cherry-pick each commit one by one + +```bash +for COMMIT in $COMMITS; do + echo "Cherry-picking $COMMIT ..." + if ! git cherry-pick "$COMMIT" --no-commit; then + # Conflict occurred — resolve based on CONFLICT_STRATEGY + if [ "$CONFLICT_STRATEGY" = "theirs" ]; then + git checkout --theirs . + git add . + elif [ "$CONFLICT_STRATEGY" = "ours" ]; then + git checkout --ours . + git add . + else + echo "MANUAL resolution needed for $COMMIT" + echo "Conflicted files:" + git diff --name-only --diff-filter=U + # Resolve each file, then: git add + # After resolving all files, the loop continues to commit + fi + fi + + # Check if the cherry-pick produced any changes + if git diff --cached --quiet; then + echo "SKIP: $COMMIT is empty (already in target)" + git cherry-pick --skip 2>/dev/null || git reset --hard HEAD + continue + fi + + git commit -C "$COMMIT" # reuse original commit message +done +``` + +> **NEVER use `git merge` to resolve divergence or conflicts.** +> `git merge --allow-unrelated-histories` or any merge variant will create +> a merge commit with two parents, pulling hundreds of unrelated commits +> into the PR. This was the #1 mistake in previous attempts. + +### 4. Conflict resolution + +The cherry-pick loop in Step 3 handles conflicts inline. Below is a detailed +reference for each strategy: + +**If `CONFLICT_STRATEGY=theirs` (prefer source branch):** +```bash +git checkout --theirs . +git add . +git commit -C "$COMMIT" +``` + +**If `CONFLICT_STRATEGY=ours` (prefer target branch):** +```bash +git checkout --ours . +git add . 
+git commit -C "$COMMIT" +``` + +**If `CONFLICT_STRATEGY=manual`:** +- List conflicted files: `git diff --name-only --diff-filter=U` +- Resolve each file individually +- `git add <file>` after resolving +- `git commit -C "$COMMIT"` + +> **Record every conflict resolution.** Maintain a table of: +> `| Commit | Conflicted files | Resolution | Rationale |` +> This is essential for reviewers. + +### 5. Post-cherry-pick review (CRITICAL) + +After all cherry-picks are applied, review and fix branch-specific configuration: + +| What to check | Why | Example | +|---------------|-----|---------| +| **Release identifiers** | Source branch may have a different release name | `RELEASE_LINE = "teak"` must be reverted to `"ulmo"` on the ulmo branch | +| **Version pins / requirements files** | Dependency versions may differ between releases | `requirements/constraints.txt`, `setup.cfg`, `package.json` | +| **CI/CD configuration** | Workflow triggers or environment variables may be branch-specific | `.github/workflows/*.yml`, `Makefile` | +| **Feature flags / waffle switches** | Source may enable features not ready for target release | Django settings, waffle flag definitions | +| **Branch-name references in code** | Hardcoded branch names in scripts or configs | Deployment scripts, documentation links | + +```bash +# Quick check for the most common issue — release line: +git diff origin/$TARGET_BRANCH -- '**/release.py' '**/version.py' '**/release_line*' +``` + +**Fix any branch-specific values before pushing.** This was a known mistake: +cherry-picking a commit that changed `RELEASE_LINE` from the target release +name to the source release name went unnoticed and would have broken the +target branch. + +### 6. Push and create the PR + +```bash +git push origin cherry-pick/$SOURCE_BRANCH-into-$TARGET_BRANCH +``` + +When creating the PR: + +- **Set the base branch to `$TARGET_BRANCH`** — NOT `master` or `main`. 
+ Agents may not be able to change the base branch after PR creation, so + get this right the first time. +- Use a descriptive title: + `chore: cherry-pick $SOURCE_BRANCH into $TARGET_BRANCH` +- Include the summary tables (applied, skipped, conflicts) in the PR body. + +### 7. Write the PR description + +Include these sections in the PR body: + +1. **Summary** — one-line description of what was cherry-picked and why. +2. **Applied commits table** — `| # | SHA | Subject |` +3. **Skipped commits table** — `| SHA | Subject | Reason |` +4. **Conflict resolutions table** — `| Commit | Files | Strategy | Rationale |` +5. **Post-cherry-pick fixes** — any branch-specific config that was reverted. +6. **Testing instructions** — which areas to test based on the changes. + +--- + +## Mistakes to avoid (lessons learned) + +These are real mistakes from previous cherry-pick sessions. **Do not repeat them.** + +### ❌ DO NOT use `git merge` to fix push divergence + +**What happened:** The agent used `git merge --allow-unrelated-histories -s ours` +to resolve a push divergence. This created a merge commit with two parents — one +from the cherry-pick chain and one from an unrelated branch. The PR then showed +~200 commits instead of the expected 38. + +**Fix:** If your local branch diverges from the remote, use +`git reset --hard origin/` and redo the work. If force-push is not +available, start over on a new branch. NEVER merge to fix divergence. + +### ❌ DO NOT open the PR against the wrong base branch + +**What happened:** The PR was opened against `master` instead of `release/ulmo`. +The agent could not change the base branch after creation due to API/firewall +restrictions. + +**Fix:** Always specify the correct base branch when creating the PR. Double-check +before creating. If using `gh pr create`, use `--base $TARGET_BRANCH`. 
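The base-branch mistake can be guarded against mechanically. As a minimal sketch (the `create_pr_command` helper name is mine, and the use of the `gh` CLI is an assumption, not part of any repository tooling), an agent could refuse to build the PR-creation command unless an explicit, non-default base is supplied:

```shell
# Hypothetical guard around PR creation: reject an empty or default-branch
# base BEFORE any PR exists, since the base may be unchangeable afterwards.
create_pr_command() {
  base="$1"
  head="$2"
  case "$base" in
    ""|master|main)
      echo "refusing to create PR: base must be the target release branch (got: '$base')" >&2
      return 1
      ;;
  esac
  # Emit the command instead of running it, so it can be previewed first.
  echo "gh pr create --base $base --head $head"
}
```

Emitting the command rather than executing it keeps the final, irreversible step reviewable.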
+ +### ❌ DO NOT forget to revert branch-specific configuration + +**What happened:** A cherry-picked commit changed `RELEASE_LINE` from `"ulmo"` to +`"teak"`. This was not caught because all conflicts were resolved with `--theirs` +(preferring source). + +**Fix:** After cherry-picking, always run the post-cherry-pick review (Step 5). +Diff critical config files against the target branch. + +### ❌ DO NOT cherry-pick merge commits + +**What happened:** Merge commits were included in the initial commit list. They +cannot be cleanly cherry-picked and bring in entire branch histories. + +**Fix:** Always use `git log --no-merges` when building the commit list. + +### ❌ DO NOT skip recording conflict resolutions + +**What happened:** Conflicts were resolved in bulk with `--theirs` but not +individually documented. Reviewers had no way to verify correctness. + +**Fix:** Record every conflict: which commit, which files, what strategy was used, +and why. + +### ❌ DO NOT create situations requiring force-push + +**What happened:** A bad merge commit required force-push to remove, but the +agent sandbox does not support force-push. A second PR was needed. + +**Fix:** Be extremely careful with git operations. Preview every command before +running. If unsure, create a test branch first. Prefer starting over on a new +branch rather than trying to fix a broken history. 
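Several of the lessons above reduce to a single invariant: the finished branch must contain no merge commits on top of the target. A minimal sketch of that check (the function name is mine, and it assumes the base ref, e.g. `origin/$TARGET_BRANCH`, has already been fetched):

```shell
# Count merge commits between the base ref and HEAD; any non-zero count
# means the history is broken and the branch should be rebuilt, not pushed.
assert_no_merge_commits() {
  base="$1"   # e.g. origin/release/ulmo
  merges=$(git rev-list --merges --count "$base"..HEAD)
  if [ "$merges" -ne 0 ]; then
    echo "ERROR: found $merges merge commit(s) on top of $base; do not push" >&2
    return 1
  fi
}
```

Running this before `git push` turns the "no merge commits" checklist item into a hard failure instead of a manual inspection.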
+ +--- + +## Quick-reference checklist + +Use this checklist before, during, and after the cherry-pick operation: + +### Before starting +- [ ] Confirmed source and target branches exist +- [ ] Built the filtered commit list (`--no-merges`, oldest first) +- [ ] Confirmed conflict resolution strategy with the user + +### During cherry-picking +- [ ] Each commit cherry-picked individually (no `git merge`) +- [ ] Empty cherry-picks skipped and recorded +- [ ] Each conflict resolution recorded with files and rationale + +### After cherry-picking +- [ ] Reviewed and reverted branch-specific config (release line, versions, flags) +- [ ] Diff against target branch looks correct (`git diff origin/$TARGET_BRANCH --stat`) +- [ ] PR created with correct base branch (`$TARGET_BRANCH`) +- [ ] PR description includes applied/skipped/conflict tables +- [ ] No merge commits in the branch history + +--- + +## Example usage + +> **User prompt:** +> Cherry-pick all commits from `develop-teak.3` into `release/ulmo`. +> Resolve conflicts by preferring `develop-teak.3`. + +The agent should follow Steps 1–7 above with: +- `SOURCE_BRANCH=develop-teak.3` +- `TARGET_BRANCH=release/ulmo` +- `CONFLICT_STRATEGY=theirs` diff --git a/cms/djangoapps/contentstore/helpers.py b/cms/djangoapps/contentstore/helpers.py index c5890b5b818b..ff2020afd89f 100644 --- a/cms/djangoapps/contentstore/helpers.py +++ b/cms/djangoapps/contentstore/helpers.py @@ -2,7 +2,6 @@ Helper methods for Studio views. 
""" from __future__ import annotations -import json import logging import pathlib import urllib @@ -17,7 +16,6 @@ from django.utils.translation import gettext as _ from opaque_keys.edx.keys import CourseKey, UsageKey from opaque_keys.edx.locator import DefinitionLocator, LocalId -from openedx.core.djangoapps.content_tagging.types import TagValuesByObjectIdDict from xblock.core import XBlock from xblock.fields import ScopeIds from xblock.runtime import IdGenerator @@ -288,26 +286,6 @@ class StaticFileNotices: error_files: list[str] = Factory(list) -def _rewrite_static_asset_references(downstream_xblock: XBlock, substitutions: dict[str, str], user_id: int) -> None: - """ - Rewrite the static asset references in the OLX string to point to the new locations in the course. - """ - store = modulestore() - if hasattr(downstream_xblock, "data"): - data_with_substitutions = downstream_xblock.data - for old_static_ref, new_static_ref in substitutions.items(): - data_with_substitutions = _replace_strings( - data_with_substitutions, - old_static_ref, - new_static_ref, - ) - downstream_xblock.data = data_with_substitutions - store.update_item(downstream_xblock, user_id) - - for child in downstream_xblock.get_children(): - _rewrite_static_asset_references(child, substitutions, user_id) - - def _insert_static_files_into_downstream_xblock( downstream_xblock: XBlock, staged_content_id: int, request ) -> StaticFileNotices: @@ -330,12 +308,21 @@ def _insert_static_files_into_downstream_xblock( static_files=static_files, ) - if substitutions: - # Rewrite the OLX's static asset references to point to the new - # locations for those assets. See _import_files_into_course for more - # info on why this is necessary. - _rewrite_static_asset_references(downstream_xblock, substitutions, request.user.id) - + # Rewrite the OLX's static asset references to point to the new + # locations for those assets. See _import_files_into_course for more + # info on why this is necessary. 
+ store = modulestore() + if hasattr(downstream_xblock, "data") and substitutions: + data_with_substitutions = downstream_xblock.data + for old_static_ref, new_static_ref in substitutions.items(): + data_with_substitutions = _replace_strings( + data_with_substitutions, + old_static_ref, + new_static_ref, + ) + downstream_xblock.data = data_with_substitutions + if store is not None: + store.update_item(downstream_xblock, request.user.id) return notices @@ -388,10 +375,9 @@ def import_staged_content_from_user_clipboard(parent_key: UsageKey, request) -> parent_xblock, store, user=request.user, - slug_hint=( - user_clipboard.source_usage_key.block_id - if isinstance(user_clipboard.source_usage_key, UsageKey) else None - ), + slug_hint=user_clipboard.source_usage_key.block_id, + copied_from_block=str(user_clipboard.source_usage_key), + copied_from_version_num=user_clipboard.content.version_num, tags=user_clipboard.content.tags, ) @@ -455,7 +441,7 @@ def _fetch_and_set_upstream_link( Fetch and set upstream link for the given xblock which is being pasted. This function handles following cases: * the xblock is copied from a v2 library; the library block is set as upstream. * the xblock is copied from a course; no upstream is set, only copied_from_block is set. - * the xblock is copied from a course where the source block was imported from a library; the original library block + * the xblock is copied from a course where the source block was imported from a library; the original libary block is set as upstream. """ # Try to link the pasted block (downstream) to the copied block (upstream). @@ -494,14 +480,6 @@ def _fetch_and_set_upstream_link( # new values from the published upstream content. 
if isinstance(upstream_link.upstream_key, UsageKey): # only if upstream is a block, not a container fetch_customizable_fields_from_block(downstream=temp_xblock, user=user, upstream=temp_xblock) - # Although the above function will set all customisable fields to match its upstream_* counterpart - # We copy the downstream_customized list to the new block to avoid overriding user customisations on sync - # So we will have: - # temp_xblock.display_name == temp_xblock.upstream_display_name - # temp_xblock.data == temp_xblock.upstream_data # for html blocks - # Even then we want to set `downstream_customized` value to avoid overriding user customisations on sync - downstream_customized = temp_xblock.xml_attributes.get("downstream_customized", '[]') - temp_xblock.downstream_customized = json.loads(downstream_customized) def _import_xml_node_to_parent( @@ -513,8 +491,13 @@ def _import_xml_node_to_parent( user: User, # Hint to use as usage ID (block_id) for the new XBlock slug_hint: str | None = None, + # UsageKey of the XBlock that this one is a copy of + copied_from_block: str | None = None, + # Positive int version of source block, if applicable (e.g., library block). + # Zero if not applicable (e.g., course block). 
+ copied_from_version_num: int = 0, # Content tags applied to the source XBlock(s) - tags: TagValuesByObjectIdDict | None = None, + tags: dict[str, str] | None = None, ) -> XBlock: """ Given an XML node representing a serialized XBlock (OLX), import it into modulestore 'store' as a child of the @@ -525,11 +508,9 @@ def _import_xml_node_to_parent( runtime = parent_xblock.runtime parent_key = parent_xblock.scope_ids.usage_id block_type = node.tag - node_copied_from = node.attrib.get('copied_from_block', None) - node_copied_version = node.attrib.get('copied_from_version', None) # Modulestore's IdGenerator here is SplitMongoIdManager which is assigned - # by SplitModuleStoreRuntime and since we need our custom ImportIdGenerator + # by CachingDescriptorSystem Runtime and since we need our custom ImportIdGenerator # here we are temporaraliy swtiching it. original_id_generator = runtime.id_generator @@ -566,8 +547,7 @@ def _import_xml_node_to_parent( else: # We have to handle the children ourselves, because there are lots of complex interactions between # * the vanilla XBlock parse_xml() method, and its lack of API for "create and save a new XBlock" - # * the XmlMixin version of parse_xml() which only works with XMLImportingModuleStoreRuntime, - # not modulestore or the v2 runtime + # * the XmlMixin version of parse_xml() which only works with ImportSystem, not modulestore or the v2 runtime # * the modulestore APIs for creating and saving a new XBlock, which work but don't support XML parsing. # We can safely assume that if the XBLock class supports children, every child node will be the XML # serialization of a child block, in order. 
For blocks that don't support children, their XML content/nodes @@ -585,10 +565,8 @@ def _import_xml_node_to_parent( if xblock_class.has_children and temp_xblock.children: raise NotImplementedError("We don't yet support pasting XBlocks with children") - - if node_copied_from: - _fetch_and_set_upstream_link(node_copied_from, node_copied_version, temp_xblock, user) - + if copied_from_block: + _fetch_and_set_upstream_link(copied_from_block, copied_from_version_num, temp_xblock, user) # Save the XBlock into modulestore. We need to save the block and its parent for this to work: new_xblock = store.update_item(temp_xblock, user.id, allow_not_found=True) new_xblock.parent = parent_key @@ -604,23 +582,26 @@ def _import_xml_node_to_parent( if not children_handled: for child_node in child_nodes: + child_copied_from = _get_usage_key_from_node(child_node, copied_from_block) if copied_from_block else None _import_xml_node_to_parent( child_node, new_xblock, store, user=user, + copied_from_block=str(child_copied_from), tags=tags, ) # Copy content tags to the new xblock if new_xblock.upstream: + # If this block is synced from an upstream (e.g. 
library content), # copy the tags from the upstream as ready-only content_tagging_api.copy_tags_as_read_only( new_xblock.upstream, new_xblock.location, ) - elif tags and node_copied_from: - object_tags = tags.get(node_copied_from) + elif copied_from_block and tags: + object_tags = tags.get(str(copied_from_block)) if object_tags: content_tagging_api.set_all_object_tags( content_key=new_xblock.location, @@ -746,10 +727,10 @@ def _import_file_into_course( if thumbnail_content is not None: content.thumbnail_location = thumbnail_location contentstore().save(content) - return True, {clipboard_file_path: filename if not import_path else f"static/{import_path}"} + return True, {clipboard_file_path: f"static/{import_path}"} elif current_file.content_digest == file_data_obj.md5_hash: - # The file already exists and matches exactly, so no action is needed - return None, {} + # The file already exists and matches exactly, so no action is needed except substitutions + return None, {clipboard_file_path: f"static/{import_path}"} else: # There is a conflict with some other file that has the same name. return False, {} @@ -813,6 +794,27 @@ def is_item_in_course_tree(item): return ancestor is not None +def _get_usage_key_from_node(node, parent_id: str) -> UsageKey | None: + """ + Returns the UsageKey for the given node and parent ID. + + If the parent_id is not a valid UsageKey, or there's no "url_name" attribute in the node, then will return None. 
+ """ + parent_key = UsageKey.from_string(parent_id) + parent_context = parent_key.context_key + usage_key = None + block_id = node.attrib.get("url_name") + block_type = node.tag + + if parent_context and block_id and block_type: + usage_key = parent_context.make_usage_key( + block_type=block_type, + block_id=block_id, + ) + + return usage_key + + def concat_static_file_notices(notices: list[StaticFileNotices]) -> StaticFileNotices: """Combines multiple static file notices into a single object diff --git a/cms/djangoapps/contentstore/models.py b/cms/djangoapps/contentstore/models.py index a4f2ce3c6119..2c0a5d42cce9 100644 --- a/cms/djangoapps/contentstore/models.py +++ b/cms/djangoapps/contentstore/models.py @@ -1,18 +1,18 @@ """ Models for contentstore """ -import logging + + from datetime import datetime, timezone -from itertools import chain from config_models.models import ConfigurationModel from django.db import models -from django.db.models import Case, Exists, ExpressionWrapper, OuterRef, Q, QuerySet, Value, When -from django.db.models.fields import BooleanField, IntegerField, TextField +from django.db.models import Count, F, Q, QuerySet, Max +from django.db.models.fields import IntegerField, TextField from django.db.models.functions import Coalesce from django.db.models.lookups import GreaterThan from django.utils.translation import gettext_lazy as _ -from opaque_keys.edx.django.models import ContainerKeyField, CourseKeyField, UsageKeyField +from opaque_keys.edx.django.models import CourseKeyField, ContainerKeyField, UsageKeyField from opaque_keys.edx.keys import CourseKey, UsageKey from opaque_keys.edx.locator import LibraryContainerLocator from openedx_learning.api.authoring import get_published_version @@ -23,8 +23,6 @@ manual_date_time_field, ) -logger = logging.getLogger(__name__) - class VideoUploadConfig(ConfigurationModel): """ @@ -100,44 +98,14 @@ class EntityLinkBase(models.Model): downstream_usage_key = UsageKeyField(max_length=255, unique=True) 
# Search by course/downstream key downstream_context_key = CourseKeyField(max_length=255, db_index=True) - # This is present if the creation of this link is a consequence of - # importing a container that has one or more levels of children. - # This represents the parent (container) in the top level - # at the moment of the import. - top_level_parent = models.ForeignKey("ContainerLink", on_delete=models.SET_NULL, null=True, blank=True) version_synced = models.IntegerField() version_declined = models.IntegerField(null=True, blank=True) - downstream_customized = models.JSONField( - default=list, - help_text=( - 'Names of the fields which have values set on the upstream block yet have been explicitly' - ' overridden on this downstream block' - ), - ) created = manual_date_time_field() updated = manual_date_time_field() - @property - def upstream_context_title(self) -> str: - """ - Returns upstream context title. - """ - raise NotImplementedError - - @property - def published_at(self) -> str | None: - """ - Returns the published date of the entity - """ - raise NotImplementedError - class Meta: abstract = True - @classmethod - def get_by_downstream_usage_key(cls, downstream_usage_key: UsageKey): - return cls.objects.get(downstream_usage_key=downstream_usage_key) - class ComponentLink(EntityLinkBase): """ @@ -181,39 +149,20 @@ def upstream_context_title(self) -> str: """ return self.upstream_block.publishable_entity.learning_package.title - @property - def published_at(self) -> str | None: - """ - Returns the published date of the component - """ - if self.upstream_block.publishable_entity.published is None: - raise AttributeError(_("The component must be published to access `published_at`")) - return self.upstream_block.publishable_entity.published.publish_log_record.publish_log.published_at - @classmethod def filter_links( cls, - *, - use_top_level_parents=False, **link_filter, - ) -> QuerySet["EntityLinkBase"] | list["EntityLinkBase"]: + ) -> 
QuerySet["EntityLinkBase"]: """ Get all links along with sync flag, upstream context title and version, with optional filtering. - - `use_top_level_parents` is an special filter, replace any result with the top-level parent if exists. - Example: We have linkA and linkB with top-level parent as linkC, and linkD without top-level parent. - After all other filters: - Case 1: `use_top_level_parents` is False, the result is [linkA, linkB, linkC, linkD] - Case 2: `use_top_level_parents` is True, the result is [linkC, linkD] """ - RELATED_FIELDS = [ + ready_to_sync = link_filter.pop('ready_to_sync', None) + result = cls.objects.filter(**link_filter).select_related( "upstream_block__publishable_entity__published__version", "upstream_block__publishable_entity__learning_package", "upstream_block__publishable_entity__published__publish_log_record__publish_log", - ] - - ready_to_sync = link_filter.pop('ready_to_sync', None) - result = cls.objects.filter(**link_filter).select_related(*RELATED_FIELDS).annotate( + ).annotate( ready_to_sync=( GreaterThan( Coalesce("upstream_block__publishable_entity__published__version__version_num", 0), @@ -222,33 +171,44 @@ def filter_links( Coalesce("upstream_block__publishable_entity__published__version__version_num", 0), Coalesce("version_declined", 0) ) - ), - # This is alwys False, the components doens't have children - ready_to_sync_from_children=Value(False, output_field=BooleanField()) + ) ) if ready_to_sync is not None: result = result.filter(ready_to_sync=ready_to_sync) + return result - # Handle top-level parents logic - if use_top_level_parents: - # Get objects without top_level_parent - objects_without_top_level = result.filter(top_level_parent__isnull=True) - - # Get the top-level parent keys - top_level_keys = result.filter(top_level_parent__isnull=False).values_list( - 'top_level_parent', flat=True, + @classmethod + def summarize_by_downstream_context(cls, downstream_context_key: CourseKey) -> QuerySet: + """ + Returns a summary 
of links by upstream context for given downstream_context_key. + Example: + [ + { + "upstream_context_title": "CS problems 3", + "upstream_context_key": "lib:OpenedX:CSPROB3", + "ready_to_sync_count": 11, + "total_count": 14, + "last_published_at": "2025-05-02T20:20:44.989042Z" + }, + { + "upstream_context_title": "CS problems 2", + "upstream_context_key": "lib:OpenedX:CSPROB2", + "ready_to_sync_count": 15, + "total_count": 24, + "last_published_at": "2025-05-03T21:20:44.989042Z" + }, + ] + """ + result = cls.filter_links(downstream_context_key=downstream_context_key).values( + "upstream_context_key", + upstream_context_title=F("upstream_block__publishable_entity__learning_package__title"), + ).annotate( + ready_to_sync_count=Count("id", Q(ready_to_sync=True)), + total_count=Count("id"), + last_published_at=Max( + "upstream_block__publishable_entity__published__publish_log_record__publish_log__published_at" ) - - # Get the top-level parents - # Any top-level parent is a container - top_level_objects = ContainerLink.filter_links(**{ - "id__in": top_level_keys - }) - - # Returns a list of `EntityLinkBase` as can be a combination of `ComponentLink`` - # and `ContainerLink`` - return list(chain(top_level_objects, objects_without_top_level)) - + ) return result @classmethod @@ -261,9 +221,7 @@ def update_or_create( downstream_usage_key: UsageKey, downstream_context_key: CourseKey, version_synced: int, - top_level_parent_usage_key: UsageKey | None = None, version_declined: int | None = None, - downstream_customized: list[str] | None = None, created: datetime | None = None, ) -> "ComponentLink": """ @@ -271,15 +229,6 @@ def update_or_create( """ if not created: created = datetime.now(tz=timezone.utc) - top_level_parent = None - if top_level_parent_usage_key is not None: - try: - top_level_parent = ContainerLink.get_by_downstream_usage_key( - top_level_parent_usage_key, - ) - except ContainerLink.DoesNotExist: - logger.info(f"Unable to find the link for the container with 
the link: {top_level_parent_usage_key}") - new_values = { 'upstream_usage_key': upstream_usage_key, 'upstream_context_key': upstream_context_key, @@ -287,8 +236,6 @@ def update_or_create( 'downstream_context_key': downstream_context_key, 'version_synced': version_synced, 'version_declined': version_declined, - 'top_level_parent': top_level_parent, - 'downstream_customized': downstream_customized, } if upstream_block: new_values['upstream_block'] = upstream_block @@ -354,125 +301,20 @@ def upstream_context_title(self) -> str: """ return self.upstream_container.publishable_entity.learning_package.title - @property - def published_at(self) -> str | None: - """ - Returns the published date of the container - """ - if self.upstream_container.publishable_entity.published is None: - raise AttributeError(_("The container must be published to access `published_at`")) - return self.upstream_container.publishable_entity.published.publish_log_record.publish_log.published_at - @classmethod def filter_links( cls, - *, - use_top_level_parents=False, **link_filter, ) -> QuerySet["EntityLinkBase"]: """ Get all links along with sync flag, upstream context title and version, with optional filtering. - - `use_top_level_parents` is an special filter, replace any result with the top-level parent if exists. - Example: We have linkA and linkB with top-level parent as linkC and linkD without top-level parent. 
- After all other filters: - Case 1: `use_top_level_parents` is False, the result is [linkA, linkB, linkC, linkD] - Case 2: `use_top_level_parents` is True, the result is [linkC, linkD] """ - RELATED_FIELDS = [ + ready_to_sync = link_filter.pop('ready_to_sync', None) + result = cls.objects.filter(**link_filter).select_related( "upstream_container__publishable_entity__published__version", - "upstream_container__publishable_entity__learning_package", + "upstream_container__publishable_entity__learning_package" "upstream_container__publishable_entity__published__publish_log_record__publish_log", - ] - - ready_to_sync = link_filter.pop('ready_to_sync', None) - result = cls._annotate_query_with_ready_to_sync( - cls.objects.filter(**link_filter).select_related(*RELATED_FIELDS), - ) - if ready_to_sync is not None: - result = result.filter(Q(ready_to_sync=ready_to_sync) | Q(ready_to_sync_from_children=ready_to_sync)) - - # Handle top-level parents logic - if use_top_level_parents: - # Get objects without top_level_parent - objects_without_top_level = result.filter(top_level_parent__isnull=True) - - # Get the top-level parent keys - top_level_keys = result.filter(top_level_parent__isnull=False).values_list( - 'top_level_parent', flat=True, - ) - - # Get the top-level parents - # Any top-level parent is a container - top_level_objects = cls._annotate_query_with_ready_to_sync(cls.objects.filter( - id__in=top_level_keys, - ).select_related(*RELATED_FIELDS)) - - result = top_level_objects.union(objects_without_top_level) - - return result - - @classmethod - def _annotate_query_with_ready_to_sync(cls, query_set: QuerySet["EntityLinkBase"]) -> QuerySet["EntityLinkBase"]: - """ - Adds ready to sync related values to the query set: - * `ready_to_sync`: When the container is ready to sync. - * `ready_to_sync_from_children`: When any children is ready to sync. - """ - # SubQuery to verify if some container children (associated with top-level parent) - # needs sync. 
- subq_container = cls.objects.filter( - top_level_parent=OuterRef('pk') ).annotate( - child_ready=Case( - When( - GreaterThan( - Coalesce("upstream_container__publishable_entity__published__version__version_num", 0), - Coalesce("version_synced", 0) - ) & GreaterThan( - Coalesce("upstream_container__publishable_entity__published__version__version_num", 0), - Coalesce("version_declined", 0) - ), - then=1 - ), - # If upstream block was deleted, set ready_to_sync = True - When( - Q(upstream_container__publishable_entity__published__version__version_num__isnull=True), - then=1 - ), - default=0, - output_field=models.IntegerField() - ) - ).filter(child_ready=1) - - # SubQuery to verify if some component children (assisiated with top-level parent) - # needs sync. - subq_components = ComponentLink.objects.filter( - top_level_parent=OuterRef('pk') - ).annotate( - child_ready=Case( - When( - GreaterThan( - Coalesce("upstream_block__publishable_entity__published__version__version_num", 0), - Coalesce("version_synced", 0) - ) & GreaterThan( - Coalesce("upstream_block__publishable_entity__published__version__version_num", 0), - Coalesce("version_declined", 0) - ), - then=1 - ), - # If upstream block was deleted, set ready_to_sync = True - When( - Q(upstream_block__publishable_entity__published__version__version_num__isnull=True), - then=1 - ), - default=0, - output_field=models.IntegerField() - ) - ).filter(child_ready=1) - - # TODO: is there a way to run `subq_container` or `subq_components` depending on the container type? 
-        return query_set.annotate(
             ready_to_sync=(
                 GreaterThan(
                     Coalesce("upstream_container__publishable_entity__published__version__version_num", 0),
@@ -481,12 +323,45 @@ def _annotate_query_with_ready_to_sync(cls, query_set: QuerySet["EntityLinkBase"
                     Coalesce("upstream_container__publishable_entity__published__version__version_num", 0),
                     Coalesce("version_declined", 0)
                 )
-            ),
-            ready_to_sync_from_children=ExpressionWrapper(
-                Exists(subq_container) | Exists(subq_components),
-                output_field=BooleanField(),
-            ),
+            )
+        )
+        if ready_to_sync is not None:
+            result = result.filter(ready_to_sync=ready_to_sync)
+        return result
+
+    @classmethod
+    def summarize_by_downstream_context(cls, downstream_context_key: CourseKey) -> QuerySet:
+        """
+        Returns a summary of links by upstream context for given downstream_context_key.
+        Example:
+        [
+            {
+                "upstream_context_title": "CS problems 3",
+                "upstream_context_key": "lib:OpenedX:CSPROB3",
+                "ready_to_sync_count": 11,
+                "total_count": 14,
+                "last_published_at": "2025-05-02T20:20:44.989042Z"
+            },
+            {
+                "upstream_context_title": "CS problems 2",
+                "upstream_context_key": "lib:OpenedX:CSPROB2",
+                "ready_to_sync_count": 15,
+                "total_count": 24,
+                "last_published_at": "2025-05-03T21:20:44.989042Z"
+            },
+        ]
+        """
+        result = cls.filter_links(downstream_context_key=downstream_context_key).values(
+            "upstream_context_key",
+            upstream_context_title=F("upstream_container__publishable_entity__learning_package__title"),
+        ).annotate(
+            ready_to_sync_count=Count("id", filter=Q(ready_to_sync=True)),
+            total_count=Count('id'),
+            last_published_at=Max(
+                "upstream_container__publishable_entity__published__publish_log_record__publish_log__published_at"
+            )
+        )
+        return result
 
     @classmethod
     def update_or_create(
@@ -498,9 +373,7 @@ def update_or_create(
         downstream_usage_key: UsageKey,
         downstream_context_key: CourseKey,
         version_synced: int,
-        top_level_parent_usage_key: UsageKey | None = None,
         version_declined: int | None = None,
-        downstream_customized: list[str] | None = None,
         created: datetime | None = None,
     ) -> "ContainerLink":
         """
@@ -508,15 +381,6 @@ def update_or_create(
         """
         if not created:
             created = datetime.now(tz=timezone.utc)
-        top_level_parent = None
-        if top_level_parent_usage_key is not None:
-            try:
-                top_level_parent = ContainerLink.get_by_downstream_usage_key(
-                    top_level_parent_usage_key,
-                )
-            except ContainerLink.DoesNotExist:
-                logger.info(f"Unable to find the link for the container with the link: {top_level_parent_usage_key}")
-
         new_values = {
             'upstream_container_key': upstream_container_key,
             'upstream_context_key': upstream_context_key,
@@ -524,8 +388,6 @@ def update_or_create(
             'downstream_context_key': downstream_context_key,
             'version_synced': version_synced,
             'version_declined': version_declined,
-            'top_level_parent': top_level_parent,
-            'downstream_customized': downstream_customized,
         }
         if upstream_container_id:
             new_values['upstream_container_id'] = upstream_container_id
diff --git a/cms/djangoapps/contentstore/rest_api/v1/serializers/course_waffle_flags.py b/cms/djangoapps/contentstore/rest_api/v1/serializers/course_waffle_flags.py
index 31cf9c36f068..dca8e25cb435 100644
--- a/cms/djangoapps/contentstore/rest_api/v1/serializers/course_waffle_flags.py
+++ b/cms/djangoapps/contentstore/rest_api/v1/serializers/course_waffle_flags.py
@@ -30,7 +30,6 @@ class CourseWaffleFlagsSerializer(serializers.Serializer):
     enable_course_optimizer = serializers.SerializerMethodField()
     use_react_markdown_editor = serializers.SerializerMethodField()
     use_video_gallery_flow = serializers.SerializerMethodField()
-    enable_course_optimizer_check_prev_run_links = serializers.SerializerMethodField()
 
     def get_course_key(self):
         """
@@ -40,15 +39,9 @@ def get_use_new_home_page(self, obj):
         """
-        Method to indicate whether we should use the new home page.
-
-        This used to be based on a waffle flag but the flag is being removed so we
-        default it to true for now until we can remove the need for it from the consumers
-        of this serializer and the related APIs.
-
-        See https://github.com/openedx/edx-platform/issues/37497
+        Method to get the use_new_home_page switch
         """
-        return True
+        return toggles.use_new_home_page()
 
     def get_use_new_custom_pages(self, obj):
         """
@@ -102,11 +95,9 @@ def get_use_new_export_page(self, obj):
     def get_use_new_files_uploads_page(self, obj):
         """
         Method to get the use_new_files_uploads_page switch
-
-        Always true, because the switch is being removed an the new experience
-        should alawys be on.
         """
-        return True
+        course_key = self.get_course_key()
+        return toggles.use_new_files_uploads_page(course_key)
 
     def get_use_new_video_uploads_page(self, obj):
         """
@@ -118,12 +109,9 @@ def get_use_new_course_outline_page(self, obj):
     def get_use_new_course_outline_page(self, obj):
         """
         Method to get the use_new_course_outline_page switch
-
-        Always true, because the switch is being removed and the new experience
-        should always be on. This function will be removed in
-        https://github.com/openedx/edx-platform/issues/37497
         """
-        return True
+        course_key = self.get_course_key()
+        return toggles.use_new_course_outline_page(course_key)
 
     def get_use_new_unit_page(self, obj):
         """
@@ -179,10 +167,3 @@ def get_use_video_gallery_flow(self, obj):
         Method to get the use_video_gallery_flow waffle flag
         """
         return toggles.use_video_gallery_flow()
-
-    def get_enable_course_optimizer_check_prev_run_links(self, obj):
-        """
-        Method to get the enable_course_optimizer_check_prev_run_links waffle flag
-        """
-        course_key = self.get_course_key()
-        return toggles.enable_course_optimizer_check_prev_run_links(course_key)
diff --git a/cms/djangoapps/contentstore/rest_api/v1/views/course_rerun.py b/cms/djangoapps/contentstore/rest_api/v1/views/course_rerun.py
index fe39858c5380..70ea44c38f3e 100644
--- a/cms/djangoapps/contentstore/rest_api/v1/views/course_rerun.py
+++ b/cms/djangoapps/contentstore/rest_api/v1/views/course_rerun.py
@@ -9,6 +9,7 @@
 from cms.djangoapps.contentstore.utils import get_course_rerun_context
 from cms.djangoapps.contentstore.rest_api.v1.serializers import CourseRerunSerializer
 from common.djangoapps.student.roles import GlobalStaff
+from edly_features_app.roles import GlobalCourseCreatorRole
 from openedx.core.lib.api.view_utils import DeveloperErrorViewMixin, verify_course_exists, view_auth_classes
 from xmodule.modulestore.django import modulestore
 
@@ -60,10 +61,12 @@ def get(self, request: Request, course_id: str):
         ```
         """
-        if not GlobalStaff().has_user(request.user):
+        #EDLYCUSTOM: Permit global course creators to access rerun status
+        course_key = CourseKey.from_string(course_id)
+
+        if not GlobalStaff().has_user(request.user) and not GlobalCourseCreatorRole(course_key.org).has_user(request.user):
             self.permission_denied(request)
 
-        course_key = CourseKey.from_string(course_id)
         with modulestore().bulk_operations(course_key):
             course_block = modulestore().get_course(course_key)
             course_rerun_context = get_course_rerun_context(course_key, course_block, request.user)
diff --git a/cms/djangoapps/contentstore/rest_api/v1/views/tests/test_course_waffle_flags.py b/cms/djangoapps/contentstore/rest_api/v1/views/tests/test_course_waffle_flags.py
index f45cc48810d6..ad5696834af2 100644
--- a/cms/djangoapps/contentstore/rest_api/v1/views/tests/test_course_waffle_flags.py
+++ b/cms/djangoapps/contentstore/rest_api/v1/views/tests/test_course_waffle_flags.py
@@ -1,7 +1,6 @@
 """
 Unit tests for the course waffle flags view
 """
-
 from django.urls import reverse
 
 from cms.djangoapps.contentstore import toggles
@@ -14,30 +13,28 @@ class CourseWaffleFlagsViewTest(CourseTestCase):
     Basic test for the CourseWaffleFlagsView endpoint, which returns waffle
     flag states for a specific course or globally if no course ID is provided.
     """
-    maxDiff = None  # Show the whole dictionary in the diff
 
     defaults = {
-        "enable_course_optimizer": False,
-        "use_new_advanced_settings_page": True,
-        "use_new_certificates_page": True,
-        "use_new_course_outline_page": True,
-        "use_new_course_team_page": True,
-        "use_new_custom_pages": True,
-        "use_new_export_page": True,
-        "use_new_files_uploads_page": True,
-        "use_new_grading_page": True,
-        "use_new_group_configurations_page": True,
-        "use_new_home_page": True,
-        "use_new_import_page": True,
-        "use_new_schedule_details_page": True,
-        "use_new_textbooks_page": True,
-        "use_new_unit_page": True,
-        "use_new_updates_page": True,
-        "use_new_video_uploads_page": False,
-        "use_react_markdown_editor": False,
-        "use_video_gallery_flow": False,
-        "enable_course_optimizer_check_prev_run_links": False,
+        'enable_course_optimizer': False,
+        'use_new_advanced_settings_page': True,
+        'use_new_certificates_page': True,
+        'use_new_course_outline_page': True,
+        'use_new_course_team_page': True,
+        'use_new_custom_pages': True,
+        'use_new_export_page': True,
+        'use_new_files_uploads_page': True,
+        'use_new_grading_page': True,
+        'use_new_group_configurations_page': True,
+        'use_new_home_page': True,
+        'use_new_import_page': True,
+        'use_new_schedule_details_page': True,
+        'use_new_textbooks_page': True,
+        'use_new_unit_page': True,
+        'use_new_updates_page': True,
+        'use_new_video_uploads_page': False,
+        'use_react_markdown_editor': False,
+        'use_video_gallery_flow': False,
     }
 
     def setUp(self):
@@ -47,11 +44,6 @@ def setUp(self):
             course_id=self.course.id,
             enabled=True,
         )
-        WaffleFlagCourseOverrideModel.objects.create(
-            waffle_flag=toggles.ENABLE_COURSE_OPTIMIZER_CHECK_PREV_RUN_LINKS.name,
-            course_id=self.course.id,
-            enabled=True,
-        )
 
     def test_global_defaults(self):
         url = reverse("cms.djangoapps.contentstore:v1:course_waffle_flags")
@@ -67,5 +59,4 @@ def test_course_override(self):
         assert response.data == {
             **self.defaults,
             "enable_course_optimizer": True,
-            "enable_course_optimizer_check_prev_run_links": True,
         }
diff --git a/cms/djangoapps/contentstore/rest_api/v2/serializers/downstreams.py b/cms/djangoapps/contentstore/rest_api/v2/serializers/downstreams.py
index b6bb234ec12a..848e9e3a5c7f 100644
--- a/cms/djangoapps/contentstore/rest_api/v2/serializers/downstreams.py
+++ b/cms/djangoapps/contentstore/rest_api/v2/serializers/downstreams.py
@@ -4,26 +4,20 @@
 
 from rest_framework import serializers
 
-from cms.djangoapps.contentstore.models import ComponentLink, ContainerLink
+from cms.djangoapps.contentstore.models import ComponentLink
 
 
 class ComponentLinksSerializer(serializers.ModelSerializer):
     """
-    Serializer for publishable component entity links.
+    Serializer for publishable entity links.
     """
     upstream_context_title = serializers.CharField(read_only=True)
     upstream_version = serializers.IntegerField(read_only=True, source="upstream_version_num")
     ready_to_sync = serializers.BooleanField()
-    ready_to_sync_from_children = serializers.BooleanField()
-    top_level_parent_usage_key = serializers.CharField(
-        source='top_level_parent.downstream_usage_key',
-        read_only=True,
-        allow_null=True
-    )
 
     class Meta:
         model = ComponentLink
-        exclude = ['upstream_block', 'uuid', 'top_level_parent']
+        exclude = ['upstream_block', 'uuid']
 
 
 class PublishableEntityLinksSummarySerializer(serializers.Serializer):
@@ -35,46 +29,3 @@ class PublishableEntityLinksSummarySerializer(serializers.Serializer):
     ready_to_sync_count = serializers.IntegerField(read_only=True)
     total_count = serializers.IntegerField(read_only=True)
     last_published_at = serializers.DateTimeField(read_only=True)
-
-
-class ContainerLinksSerializer(serializers.ModelSerializer):
-    """
-    Serializer for publishable container entity links.
-    """
-    upstream_context_title = serializers.CharField(read_only=True)
-    upstream_version = serializers.IntegerField(read_only=True, source="upstream_version_num")
-    ready_to_sync = serializers.BooleanField()
-    ready_to_sync_from_children = serializers.BooleanField()
-    top_level_parent_usage_key = serializers.CharField(
-        source='top_level_parent.downstream_usage_key',
-        read_only=True,
-        allow_null=True
-    )
-
-    class Meta:
-        model = ContainerLink
-        exclude = ['upstream_container', 'uuid', 'top_level_parent']
-
-
-class PublishableEntityLinkSerializer(serializers.Serializer):
-    """
-    Serializer for publishable component or container entity links.
-    """
-    upstream_key = serializers.CharField(read_only=True)
-    upstream_type = serializers.ChoiceField(read_only=True, choices=['component', 'container'])
-
-    def to_representation(self, instance):
-        if isinstance(instance, ComponentLink):
-            data = ComponentLinksSerializer(instance).data
-            data['upstream_key'] = data.get('upstream_usage_key')
-            data['upstream_type'] = 'component'
-            del data['upstream_usage_key']
-        elif isinstance(instance, ContainerLink):
-            data = ContainerLinksSerializer(instance).data
-            data['upstream_key'] = data.get('upstream_container_key')
-            data['upstream_type'] = 'container'
-            del data['upstream_container_key']
-        else:
-            raise Exception("Unexpected type")
-
-        return data
diff --git a/cms/djangoapps/contentstore/rest_api/v2/views/tests/test_downstreams.py b/cms/djangoapps/contentstore/rest_api/v2/views/tests/test_downstreams.py
index b33d980732fa..950464839c09 100644
--- a/cms/djangoapps/contentstore/rest_api/v2/views/tests/test_downstreams.py
+++ b/cms/djangoapps/contentstore/rest_api/v2/views/tests/test_downstreams.py
@@ -3,27 +3,21 @@
 """
 import json
 from datetime import datetime, timezone
-from unittest.mock import MagicMock, patch
+from unittest.mock import patch, MagicMock
 
-import ddt
 from django.conf import settings
 from django.urls import reverse
 from freezegun import freeze_time
-from opaque_keys.edx.keys import ContainerKey, UsageKey
-from opaque_keys.edx.locator import LibraryLocatorV2, LibraryUsageLocatorV2
 from organizations.models import Organization
 
 from cms.djangoapps.contentstore.helpers import StaticFileNotices
+from cms.lib.xblock.upstream_sync import BadUpstream, UpstreamLink
 from cms.djangoapps.contentstore.tests.utils import CourseTestCase
 from cms.djangoapps.contentstore.xblock_storage_handlers import view_handlers as xblock_view_handlers
-from cms.djangoapps.contentstore.xblock_storage_handlers.xblock_helpers import get_block_key_string
-from cms.lib.xblock.upstream_sync import BadUpstream, UpstreamLink
-from 
common.djangoapps.student.auth import add_users -from common.djangoapps.student.roles import CourseStaffRole +from opaque_keys.edx.keys import UsageKey from common.djangoapps.student.tests.factories import UserFactory -from openedx.core.djangoapps.content_libraries import api as lib_api from xmodule.modulestore.django import modulestore -from xmodule.modulestore.tests.django_utils import ImmediateOnCommitMixin, SharedModuleStoreTestCase +from xmodule.modulestore.tests.django_utils import SharedModuleStoreTestCase from xmodule.modulestore.tests.factories import BlockFactory, CourseFactory from .. import downstreams as downstreams_views @@ -32,25 +26,18 @@ URL_PREFIX = '/api/libraries/v2/' URL_LIB_CREATE = URL_PREFIX URL_LIB_BLOCKS = URL_PREFIX + '{lib_key}/blocks/' -URL_LIB_BLOCK = URL_PREFIX + 'blocks/{block_key}/' URL_LIB_BLOCK_PUBLISH = URL_PREFIX + 'blocks/{block_key}/publish/' URL_LIB_BLOCK_OLX = URL_PREFIX + 'blocks/{block_key}/olx/' -URL_LIB_CONTAINER = URL_PREFIX + 'containers/{container_key}/' # Get a container in this library -URL_LIB_CONTAINERS = URL_PREFIX + '{lib_key}/containers/' # Create a new container in this library -URL_LIB_CONTAINER_PUBLISH = URL_LIB_CONTAINER + 'publish/' # Publish changes to the specified container + children def _get_upstream_link_good_and_syncable(downstream): return UpstreamLink( upstream_ref=downstream.upstream, - upstream_key=LibraryUsageLocatorV2.from_string(downstream.upstream), - downstream_key=str(downstream.usage_key), + upstream_key=UsageKey.from_string(downstream.upstream), version_synced=downstream.upstream_version, version_available=(downstream.upstream_version or 0) + 1, version_declined=downstream.upstream_version_declined, error_message=None, - downstream_customized=[], - has_top_level_parent=False, ) @@ -66,7 +53,6 @@ def setUp(self): """ Create a simple course with one unit and two videos, one of which is linked to an "upstream". 
""" - # pylint: disable=too-many-statements super().setUp() self.now = datetime.now(timezone.utc) freezer = freeze_time(self.now) @@ -79,9 +65,6 @@ def setUp(self): defaults={"name": "Content Libraries Tachyon Exploration & Survey Team"}, ) self.superuser = UserFactory(username="superuser", password="password", is_staff=True, is_superuser=True) - self.simple_user = UserFactory(username="simple_user", password="password") - self.course_user = UserFactory(username="course_user", password="password") - self.lib_user = UserFactory(username="lib_user", password="password") self.client.login(username=self.superuser.username, password="password") self.library_title = "Test Library 1" @@ -90,41 +73,12 @@ def setUp(self): title=self.library_title, description="Testing XBlocks" )["id"] - self.library_key = LibraryLocatorV2.from_string(self.library_id) - lib_api.set_library_user_permissions(self.library_key, self.lib_user, access_level="read") self.html_lib_id = self._add_block_to_library(self.library_id, "html", "html-baz")["id"] self.video_lib_id = self._add_block_to_library(self.library_id, "video", "video-baz")["id"] - self.unit_id = self._create_container(self.library_id, "unit", "unit-1", "Unit 1")["id"] - self.subsection_id = self._create_container(self.library_id, "subsection", "subsection-1", "Subsection 1")["id"] - self.section_id = self._create_container(self.library_id, "section", "section-1", "Section 1")["id"] - - # Creating container to test the top-level parent - self.top_level_unit_id = self._create_container(self.library_id, "unit", "unit-2", "Unit 2")["id"] - self.top_level_unit_id_2 = self._create_container(self.library_id, "unit", "unit-3", "Unit 3")["id"] - self.top_level_subsection_id = self._create_container( - self.library_id, - "subsection", - "subsection-2", - "Subsection 2", - )["id"] - self.top_level_section_id = self._create_container(self.library_id, "section", "section-2", "Section 2")["id"] - self.html_lib_id_2 = 
self._add_block_to_library(self.library_id, "html", "html-baz-2")["id"] - self.video_lib_id_2 = self._add_block_to_library(self.library_id, "video", "video-baz-2")["id"] - self._publish_library_block(self.html_lib_id) self._publish_library_block(self.video_lib_id) - self._publish_library_block(self.html_lib_id_2) - self._publish_library_block(self.video_lib_id_2) - self._publish_container(self.unit_id) - self._publish_container(self.subsection_id) - self._publish_container(self.section_id) - self._publish_container(self.top_level_unit_id) - self._publish_container(self.top_level_unit_id_2) - self._publish_container(self.top_level_subsection_id) - self._publish_container(self.top_level_section_id) self.mock_upstream_link = f"{settings.COURSE_AUTHORING_MICROFRONTEND_URL}/library/{self.library_id}/components?usageKey={self.video_lib_id}" # pylint: disable=line-too-long # noqa: E501 self.course = CourseFactory.create() - add_users(self.superuser, CourseStaffRole(self.course.id), self.course_user) chapter = BlockFactory.create(category='chapter', parent=self.course) sequential = BlockFactory.create(category='sequential', parent=chapter) unit = BlockFactory.create(category='vertical', parent=sequential) @@ -135,65 +89,6 @@ def setUp(self): self.downstream_html_key = BlockFactory.create( category='html', parent=unit, upstream=self.html_lib_id, upstream_version=1, ).usage_key - self.downstream_chapter_key = BlockFactory.create( - category='chapter', parent=self.course, upstream=self.section_id, upstream_version=1, - ).usage_key - self.downstream_sequential_key = BlockFactory.create( - category='sequential', parent=chapter, upstream=self.subsection_id, upstream_version=1, - ).usage_key - self.downstream_unit_key = BlockFactory.create( - category='vertical', parent=sequential, upstream=self.unit_id, upstream_version=1, - ).usage_key - - # Creating Blocks with top-level-parents - # Unit created as a top-level parent - self.top_level_downstream_unit = BlockFactory.create( - 
category='vertical', - parent=sequential, - upstream=self.top_level_unit_id, - upstream_version=1, - ) - self.top_level_downstream_html_key = BlockFactory.create( - category='html', - parent=self.top_level_downstream_unit, - upstream=self.html_lib_id_2, - upstream_version=1, - top_level_downstream_parent_key=get_block_key_string( - self.top_level_downstream_unit.usage_key, - ) - ).usage_key - - # Section created as a top-level parent - self.top_level_downstream_chapter = BlockFactory.create( - category='chapter', parent=self.course, upstream=self.top_level_section_id, upstream_version=1, - ) - self.top_level_downstream_sequential = BlockFactory.create( - category='sequential', - parent=self.top_level_downstream_chapter, - upstream=self.top_level_subsection_id, - upstream_version=1, - top_level_downstream_parent_key=get_block_key_string( - self.top_level_downstream_chapter.usage_key, - ), - ) - self.top_level_downstream_unit_2 = BlockFactory.create( - category='vertical', - parent=self.top_level_downstream_sequential, - upstream=self.top_level_unit_id_2, - upstream_version=1, - top_level_downstream_parent_key=get_block_key_string( - self.top_level_downstream_chapter.usage_key, - ), - ) - self.top_level_downstream_video_key = BlockFactory.create( - category='video', - parent=self.top_level_downstream_unit_2, - upstream=self.video_lib_id_2, - upstream_version=1, - top_level_downstream_parent_key=get_block_key_string( - self.top_level_downstream_chapter.usage_key, - ) - ).usage_key self.another_course = CourseFactory.create(display_name="Another Course") another_chapter = BlockFactory.create(category="chapter", parent=self.another_course) @@ -213,8 +108,6 @@ def setUp(self): self.fake_video_key = self.course.id.make_usage_key("video", "NoSuchVideo") self.learner = UserFactory(username="learner", password="password") - self._update_container(self.unit_id, display_name="Unit 2") - self._publish_container(self.unit_id) self._set_library_block_olx(self.html_lib_id, "Hello 
world!") self._publish_library_block(self.html_lib_id) self._publish_library_block(self.video_lib_id) @@ -224,7 +117,7 @@ def _api(self, method, url, data, expect_response): """ Call a REST API """ - response = getattr(self.client, method)(url, data, format="json", content_type="application/json") + response = getattr(self.client, method)(url, data, format="json") assert response.status_code == expect_response,\ 'Unexpected response code {}:\n{}'.format(response.status_code, getattr(response, 'data', '(no data)')) return response.data @@ -255,15 +148,6 @@ def _publish_library_block(self, block_key, expect_response=200): """ Publish changes from a specified XBlock """ return self._api('post', URL_LIB_BLOCK_PUBLISH.format(block_key=block_key), None, expect_response) - def _publish_container(self, container_key: ContainerKey | str, expect_response=200): - """ Publish all changes in the specified container + children """ - return self._api('post', URL_LIB_CONTAINER_PUBLISH.format(container_key=container_key), None, expect_response) - - def _update_container(self, container_key: ContainerKey | str, display_name: str, expect_response=200): - """ Update a container (unit etc.) """ - data = {"display_name": display_name} - return self._api('patch', URL_LIB_CONTAINER.format(container_key=container_key), data, expect_response) - def _set_library_block_olx(self, block_key, new_olx, expect_response=200): """ Overwrite the OLX of a specific block in the library """ return self._api('post', URL_LIB_BLOCK_OLX.format(block_key=block_key), {"olx": new_olx}, expect_response) @@ -271,17 +155,6 @@ def _set_library_block_olx(self, block_key, new_olx, expect_response=200): def call_api(self, usage_key_string): raise NotImplementedError - def _create_container(self, lib_key, container_type, slug: str | None, display_name: str, expect_response=200): - """ Create a container (unit etc.) 
""" - data = {"container_type": container_type, "display_name": display_name} - if slug: - data["slug"] = slug - return self._api('post', URL_LIB_CONTAINERS.format(lib_key=lib_key), data, expect_response) - - def _delete_component(self, block_key, expect_response=200): - """ Publish all changes in the specified container + children """ - return self._api('delete', URL_LIB_BLOCK.format(block_key=block_key), None, expect_response) - class SharedErrorTestCases(_BaseDownstreamViewTestMixin): """ @@ -306,7 +179,7 @@ def test_404_downstream_not_accessible(self): assert "not found" in response.data["developer_message"] -class GetComponentDownstreamViewTest(SharedErrorTestCases, SharedModuleStoreTestCase): +class GetDownstreamViewTest(SharedErrorTestCases, SharedModuleStoreTestCase): """ Test that `GET /api/v2/contentstore/downstreams/...` inspects a downstream's link to an upstream. """ @@ -411,7 +284,7 @@ def test_400(self, sync: str): assert video_after.upstream is None -class DeleteDownstreamViewTest(SharedErrorTestCases, ImmediateOnCommitMixin, SharedModuleStoreTestCase): +class DeleteDownstreamViewTest(SharedErrorTestCases, SharedModuleStoreTestCase): """ Test that `DELETE /api/v2/contentstore/downstreams/...` severs a downstream's link to an upstream. """ @@ -439,42 +312,6 @@ def test_204_no_upstream(self, mock_sever): assert response.status_code == 204 assert mock_sever.call_count == 1 - def test_unlink_parent_should_update_children_top_level_parent(self): - """ - If we unlink a parent block, do all children get the new top-level parent? 
- """ - self.client.login(username="superuser", password="password") - - all_downstreams = self.client.get( - "/api/contentstore/v2/downstreams/", - data={"course_id": str(self.course.id)}, - ) - assert all_downstreams.data["count"] == 11 - - response = self.call_api(self.top_level_downstream_chapter.usage_key) - assert response.status_code == 204 - - # Check that all children have their top_level_downstream_parent_key updated - subsection = modulestore().get_item(self.top_level_downstream_sequential.usage_key) - assert subsection.top_level_downstream_parent_key is None - - unit = modulestore().get_item(self.top_level_downstream_unit_2.usage_key) - # The sequential is the top-level parent for the unit - sequential_block_key = get_block_key_string( - self.top_level_downstream_sequential.usage_key - ) - assert unit.top_level_downstream_parent_key == sequential_block_key - - video = modulestore().get_item(self.top_level_downstream_video_key) - # The sequential is the top-level parent for the video - assert video.top_level_downstream_parent_key == sequential_block_key - - all_downstreams = self.client.get( - "/api/contentstore/v2/downstreams/", - data={"course_id": str(self.course.id)}, - ) - assert all_downstreams.data["count"] == 10 - class _DownstreamSyncViewTestMixin(SharedErrorTestCases): """ @@ -598,67 +435,33 @@ def test_204(self, mock_decline_sync): assert mock_decline_sync.call_count == 1 -@ddt.ddt class GetUpstreamViewTest( _BaseDownstreamViewTestMixin, - ImmediateOnCommitMixin, SharedModuleStoreTestCase, ): """ Test that `GET /api/v2/contentstore/downstreams?...` returns list of links based on the provided filter. 
""" - def call_api( self, - course_id: str | None = None, - ready_to_sync: bool | None = None, - upstream_key: str | None = None, - item_type: str | None = None, - use_top_level_parents: bool | None = None, + course_id: str = None, + ready_to_sync: bool = None, + upstream_usage_key: str = None, ): data = {} if course_id is not None: data["course_id"] = str(course_id) if ready_to_sync is not None: data["ready_to_sync"] = str(ready_to_sync) - if upstream_key is not None: - data["upstream_key"] = str(upstream_key) - if item_type is not None: - data["item_type"] = str(item_type) - if use_top_level_parents is not None: - data["use_top_level_parents"] = str(use_top_level_parents) + if upstream_usage_key is not None: + data["upstream_usage_key"] = str(upstream_usage_key) return self.client.get("/api/contentstore/v2/downstreams/", data=data) - def test_200_single_upstream_container(self): - """ - Test single upstream container link provides children info as well. - """ - self.client.login(username="superuser", password="password") - # Publish components - self._set_library_block_olx(self.html_lib_id_2, "Hello world!") - self._publish_library_block(self.html_lib_id_2) - - response = self.client.get(f"/api/contentstore/v2/downstreams/{self.top_level_downstream_unit.usage_key}") - assert response.status_code == 200 - data = response.json() - assert data['upstream_ref'] == self.top_level_unit_id - assert data['error_message'] is None - assert data['ready_to_sync'] is True - assert len(data['ready_to_sync_children']) == 1 - html_block = modulestore().get_item(self.top_level_downstream_html_key) - self.assertDictEqual(data['ready_to_sync_children'][0], { - 'name': html_block.display_name, - 'upstream': str(self.html_lib_id_2), - 'block_type': 'html', - 'downstream_customized': [], - 'id': str(html_block.usage_key), - }) - def test_200_all_downstreams_for_a_course(self): """ Returns all links for given course """ - self.client.login(username="course_user", password="password") + 
self.client.login(username="superuser", password="password") response = self.call_api(course_id=self.course.id) assert response.status_code == 200 data = response.json() @@ -670,17 +473,13 @@ def test_200_all_downstreams_for_a_course(self): 'downstream_usage_key': str(self.downstream_video_key), 'id': 1, 'ready_to_sync': False, - 'ready_to_sync_from_children': False, 'updated': date_format, 'upstream_context_key': self.library_id, 'upstream_context_title': self.library_title, - 'upstream_key': self.video_lib_id, - 'upstream_type': 'component', + 'upstream_usage_key': self.video_lib_id, 'upstream_version': 1, 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], + 'version_synced': 1 }, { 'created': date_format, @@ -688,764 +487,60 @@ def test_200_all_downstreams_for_a_course(self): 'downstream_usage_key': str(self.downstream_html_key), 'id': 2, 'ready_to_sync': True, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.html_lib_id, - 'upstream_type': 'component', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_html_key), - 'id': 3, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.html_lib_id_2, - 'upstream_type': 'component', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': str(self.top_level_downstream_unit.usage_key), - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), 
- 'downstream_usage_key': str(self.top_level_downstream_video_key), - 'id': 4, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.video_lib_id_2, - 'upstream_type': 'component', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_chapter_key), - 'id': 1, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.section_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_sequential_key), - 'id': 2, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.subsection_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_unit_key), - 'id': 3, - 'ready_to_sync': True, - 'ready_to_sync_from_children': False, 'updated': date_format, 'upstream_context_key': self.library_id, 'upstream_context_title': self.library_title, - 'upstream_key': self.unit_id, - 
'upstream_type': 'container', + 'upstream_usage_key': self.html_lib_id, 'upstream_version': 2, 'version_declined': None, 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_unit.usage_key), - 'id': 4, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_unit_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'id': 5, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_section_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_sequential.usage_key), - 'id': 6, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_subsection_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'downstream_customized': [], - }, - { - 
'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_unit_2.usage_key), - 'id': 7, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_unit_id_2, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'downstream_customized': [], }, ] self.assertListEqual(data["results"], expected) - self.assertEqual(data["count"], 11) + self.assertEqual(data["count"], 2) - def test_permission_denied_with_course_filter(self): - self.client.login(username="simple_user", password="password") - response = self.call_api(course_id=self.course.id) - assert response.status_code == 403 + def test_200_all_downstreams_ready_to_sync(self): + """ + Returns all links that are syncable + """ + self.client.login(username="superuser", password="password") + response = self.call_api(ready_to_sync=True) + assert response.status_code == 200 + data = response.json() + self.assertTrue(all(o["ready_to_sync"] for o in data["results"])) + self.assertEqual(data["count"], 1) - def test_200_component_downstreams_for_a_course(self): + def test_200_downstream_context_list(self): """ - Returns all component links for given course + Returns all downstream courses for given library block """ - self.client.login(username="course_user", password="password") - response = self.call_api( - course_id=self.course.id, - item_type='components', - ) + self.client.login(username="superuser", password="password") + response = self.call_api(upstream_usage_key=self.video_lib_id) assert response.status_code == 200 data = response.json() - date_format = self.now.isoformat().split("+")[0] + 'Z' - expected = [ - { - 'created': date_format, - 
'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_video_key), - 'id': 1, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.video_lib_id, - 'upstream_type': 'component', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_html_key), - 'id': 2, - 'ready_to_sync': True, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.html_lib_id, - 'upstream_type': 'component', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_html_key), - 'id': 3, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.html_lib_id_2, - 'upstream_type': 'component', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': str(self.top_level_downstream_unit.usage_key), - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_video_key), - 'id': 4, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': 
self.library_title, - 'upstream_key': self.video_lib_id_2, - 'upstream_type': 'component', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'downstream_customized': [], - }, - ] - self.assertListEqual(data["results"], expected) + expected = [str(self.downstream_video_key)] + [str(key) for key in self.another_video_keys] + got = [str(o["downstream_usage_key"]) for o in data["results"]] + self.assertListEqual(got, expected) self.assertEqual(data["count"], 4) - def test_200_container_downstreams_for_a_course(self): + +class GetDownstreamSummaryViewTest( + _BaseDownstreamViewTestMixin, + SharedModuleStoreTestCase, +): + """ + Test that `GET /api/v2/contentstore/downstreams//summary` returns summary of links in course. + """ + def call_api(self, course_id): + return self.client.get(f"/api/contentstore/v2/downstreams/{course_id}/summary") + + @patch.object(UpstreamLink, "get_for_block", _get_upstream_link_good_and_syncable) + def test_200_summary(self): """ - Returns all container links for given course + Does the happy path work? 
""" - self.client.login(username="course_user", password="password") - response = self.call_api( - course_id=self.course.id, - item_type='containers', - ) - assert response.status_code == 200 - data = response.json() - date_format = self.now.isoformat().split("+")[0] + 'Z' - expected = [ - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_chapter_key), - 'id': 1, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.section_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_sequential_key), - 'id': 2, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.subsection_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_unit_key), - 'id': 3, - 'ready_to_sync': True, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.unit_id, - 'upstream_type': 'container', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': 
str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_unit.usage_key), - 'id': 4, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_unit_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'id': 5, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_section_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_sequential.usage_key), - 'id': 6, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_subsection_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_unit_2.usage_key), - 'id': 7, - 'ready_to_sync': False, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': 
self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_unit_id_2, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'downstream_customized': [], - }, - ] - self.assertListEqual(data["results"], expected) - self.assertEqual(data["count"], 7) - - @ddt.data( - ('all', 2), - ('components', 1), - ('containers', 1), - ) - @ddt.unpack - def test_200_downstreams_ready_to_sync(self, item_type, expected_count): - """ - Returns all links that are syncable - """ - self.client.login(username="superuser", password="password") - response = self.call_api( - ready_to_sync=True, - item_type=item_type, - ) - assert response.status_code == 200 - data = response.json() - self.assertTrue(all(o["ready_to_sync"] for o in data["results"])) - self.assertEqual(data["count"], expected_count) - - def test_permission_denied_without_filter(self): - self.client.login(username="simple_user", password="password") - response = self.call_api() - assert response.status_code == 403 - - def test_200_component_downstream_context_list(self): - """ - Returns all entity downstream links for given component - """ - self.client.login(username="lib_user", password="password") - response = self.call_api(upstream_key=self.video_lib_id) - assert response.status_code == 200 - data = response.json() - expected = [str(self.downstream_video_key)] + [str(key) for key in self.another_video_keys] - got = [str(o["downstream_usage_key"]) for o in data["results"]] - self.assertListEqual(got, expected) - self.assertEqual(data["count"], 4) - - def test_200_container_downstream_context_list(self): - """ - Returns all entity downstream links for given container - """ - self.client.login(username="lib_user", password="password") - response = self.call_api(upstream_key=self.unit_id) - assert response.status_code == 200 - data = response.json() - 
expected = [str(self.downstream_unit_key)] - got = [str(o["downstream_usage_key"]) for o in data["results"]] - self.assertListEqual(got, expected) - self.assertEqual(data["count"], 1) - - def test_200_get_ready_to_sync_top_level_parents_with_components(self): - """ - Returns all links that are syncable using the top-level parents of components - """ - self.client.login(username="superuser", password="password") - - # Publish components - self._set_library_block_olx(self.html_lib_id_2, "Hello world!") - self._publish_library_block(self.html_lib_id_2) - self._set_library_block_olx(self.video_lib_id_2, "") - self._publish_library_block(self.video_lib_id_2) - - response = self.call_api( - ready_to_sync=True, - item_type="all", - use_top_level_parents=True, - ) - assert response.status_code == 200 - data = response.json() - self.assertEqual(data["count"], 4) - date_format = self.now.isoformat().split("+")[0] + 'Z' - - # The expected results are - # * The section that is the top-level parent of `video_lib_id_2` - # * The unit that is the top-level parent of `html_lib_id_2` - # * 2 links without top-level parents - expected = [ - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_unit.usage_key), - 'id': 4, - 'ready_to_sync': False, # <-- It's False because the container doesn't have changes - 'ready_to_sync_from_children': True, # <-- It's True because a child has changes - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_unit_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'id': 5, - 'ready_to_sync': 
False, # <-- It's False because the container doesn't have changes - 'ready_to_sync_from_children': True, # <-- It's True because a child has changes - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_section_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_html_key), - 'id': 2, - 'ready_to_sync': True, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.html_lib_id, - 'upstream_type': 'component', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_unit_key), - 'id': 3, - 'ready_to_sync': True, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.unit_id, - 'upstream_type': 'container', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - ] - self.assertListEqual(data["results"], expected) - - def test_200_get_ready_to_sync_top_level_parents_with_containers(self): - """ - Returns all links that are syncable using the top-level parents of containers - """ - self.client.login(username="superuser", password="password") - - # Publish Subsection - self._update_container(self.top_level_subsection_id, display_name="Subsection 3") - 
self._publish_container(self.top_level_subsection_id) - - response = self.call_api( - ready_to_sync=True, - item_type="all", - use_top_level_parents=True, - ) - assert response.status_code == 200 - data = response.json() - self.assertEqual(data["count"], 3) - date_format = self.now.isoformat().split("+")[0] + 'Z' - - # The expected results are - # * 2 links without top-level parents - # * The section that is the top-level parent of `top_level_subsection_id` - expected = [ - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_html_key), - 'id': 2, - 'ready_to_sync': True, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.html_lib_id, - 'upstream_type': 'component', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_unit_key), - 'id': 3, - 'ready_to_sync': True, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.unit_id, - 'upstream_type': 'container', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'id': 5, - 'ready_to_sync': False, # <-- It's False because the container doesn't have changes - 'ready_to_sync_from_children': True, # <-- It's True because a child has changes - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': 
self.library_title, - 'upstream_key': self.top_level_section_id, - 'upstream_type': 'container', - 'upstream_version': 1, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - ] - self.assertListEqual(data["results"], expected) - - def test_200_get_ready_to_sync_duplicated_top_level_parents(self): - """ - Returns all links that are syncable using the same top-level parents - - According to the requirements, only the top-level parents should be displayed. - Even if all containers and components within a section are updated, only the top-level parent, - which is the section, should be displayed. - This test checks that only the top-level parent is displayed and is not duplicated in the result. - """ - self.client.login(username="superuser", password="password") - - # Publish Section and component/subsection that has the same section as top-level parent - self._update_container(self.top_level_section_id, display_name="Section 3") - self._publish_container(self.top_level_section_id) - self._set_library_block_olx(self.video_lib_id_2, "") - self._publish_library_block(self.video_lib_id_2) - self._update_container(self.top_level_subsection_id, display_name="Subsection 3") - self._publish_container(self.top_level_subsection_id) - - response = self.call_api( - ready_to_sync=True, - item_type="all", - use_top_level_parents=True, - ) - assert response.status_code == 200 - data = response.json() - self.assertEqual(data["count"], 3) - date_format = self.now.isoformat().split("+")[0] + 'Z' - - # The expected results are - # * The section that is the top-level parent of `video_lib_id_2` and `top_level_subsection_id` - # * 2 links without top-level parents - expected = [ - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.top_level_downstream_chapter.usage_key), - 'id': 5, - 'ready_to_sync': True, # <-- It's True because the section has changes - 
'ready_to_sync_from_children': True, # <-- It's True because a child has changes - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.top_level_section_id, - 'upstream_type': 'container', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_html_key), - 'id': 2, - 'ready_to_sync': True, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.html_lib_id, - 'upstream_type': 'component', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - { - 'created': date_format, - 'downstream_context_key': str(self.course.id), - 'downstream_usage_key': str(self.downstream_unit_key), - 'id': 3, - 'ready_to_sync': True, - 'ready_to_sync_from_children': False, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': self.unit_id, - 'upstream_type': 'container', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 1, - 'top_level_parent_usage_key': None, - 'downstream_customized': [], - }, - ] - self.assertListEqual(data["results"], expected) - - -class GetDownstreamSummaryViewTest( - _BaseDownstreamViewTestMixin, - ImmediateOnCommitMixin, - SharedModuleStoreTestCase, -): - """ - Test that `GET /api/v2/contentstore/downstreams//summary` returns summary of links in course. 
- """ - def call_api(self, course_id): - return self.client.get(f"/api/contentstore/v2/downstreams/{course_id}/summary") - - @patch.object(UpstreamLink, "get_for_block", _get_upstream_link_good_and_syncable) - def test_200_summary(self): - """ - Does the happy path work? - """ - self.client.login(username="superuser", password="password") - response = self.call_api(str(self.another_course.id)) + self.client.login(username="superuser", password="password") + response = self.call_api(str(self.another_course.id)) assert response.status_code == 200 data = response.json() expected = [{ @@ -1459,158 +554,11 @@ def test_200_summary(self): response = self.call_api(str(self.course.id)) assert response.status_code == 200 data = response.json() - - # The `total_count` is 7 because the top-level logic: - # * The `section-2`, that is the top-level parent of `subsection-2`, `unit-3`, `html-baz-2` - # * The `unit-2`, that is the top-level parent of `video-baz-2` - # * The `section-1` - # * The `subsection-1` - # * The `unit-1` - # * The `html-baz-1` - # * The `video-baz-1` expected = [{ 'upstream_context_title': 'Test Library 1', 'upstream_context_key': self.library_id, - 'ready_to_sync_count': 2, - 'total_count': 7, + 'ready_to_sync_count': 1, + 'total_count': 2, 'last_published_at': self.now.strftime('%Y-%m-%dT%H:%M:%S.%fZ'), }] self.assertListEqual(data, expected) - - # Publish Subsection - self._update_container(self.top_level_subsection_id, display_name="Subsection 3") - self._publish_container(self.top_level_subsection_id) - - response = self.call_api(str(self.course.id)) - assert response.status_code == 200 - data = response.json() - expected = [{ - 'upstream_context_title': 'Test Library 1', - 'upstream_context_key': self.library_id, - 'ready_to_sync_count': 3, # <-- + the section (top-level parent of subsection) - 'total_count': 7, - 'last_published_at': self.now.strftime('%Y-%m-%dT%H:%M:%S.%fZ'), - }] - self.assertListEqual(data, expected) - - # Publish Section - 
self._update_container(self.top_level_section_id, display_name="Section 3") - self._publish_container(self.top_level_section_id) - - response = self.call_api(str(self.course.id)) - assert response.status_code == 200 - data = response.json() - expected = [{ - 'upstream_context_title': 'Test Library 1', - 'upstream_context_key': self.library_id, - 'ready_to_sync_count': 3, # <-- is the same value because the section is the top-level parent - 'total_count': 7, - 'last_published_at': self.now.strftime('%Y-%m-%dT%H:%M:%S.%fZ'), - }] - self.assertListEqual(data, expected) - - -class GetDownstreamDeletedUpstream( - _BaseDownstreamViewTestMixin, - ImmediateOnCommitMixin, - SharedModuleStoreTestCase, -): - """ - Test that parent container is marked ready_to_sync when even when the only change is a deleted component under it - """ - def call_api( - self, - course_id: str | None = None, - ready_to_sync: bool | None = None, - upstream_key: str | None = None, - item_type: str | None = None, - use_top_level_parents: bool | None = None, - ): - data = {} - if course_id is not None: - data["course_id"] = str(course_id) - if ready_to_sync is not None: - data["ready_to_sync"] = str(ready_to_sync) - if upstream_key is not None: - data["upstream_key"] = str(upstream_key) - if item_type is not None: - data["item_type"] = str(item_type) - if use_top_level_parents is not None: - data["use_top_level_parents"] = str(use_top_level_parents) - return self.client.get("/api/contentstore/v2/downstreams/", data=data) - - def test_delete_component_should_be_ready_to_sync(self): - """ - Test deleting a component from library should mark the entire section container ready to sync - """ - # Create blocks - section_id = self._create_container(self.library_id, "section", "section-12", "Section 12")["id"] - subsection_id = self._create_container(self.library_id, "subsection", "subsection-12", "Subsection 12")["id"] - unit_id = self._create_container(self.library_id, "unit", "unit-12", "Unit 12")["id"] - 
video_id = self._add_block_to_library(self.library_id, "video", "video-bar-13")["id"] - section_key = ContainerKey.from_string(section_id) - subsection_key = ContainerKey.from_string(subsection_id) - unit_key = ContainerKey.from_string(unit_id) - video_key = LibraryUsageLocatorV2.from_string(video_id) - - # Set children - lib_api.update_container_children(section_key, [subsection_key], None) - lib_api.update_container_children(subsection_key, [unit_key], None) - lib_api.update_container_children(unit_key, [video_key], None) - self._publish_container(unit_id) - self._publish_container(subsection_id) - self._publish_container(section_id) - self._publish_library_block(video_id) - course = CourseFactory.create(display_name="Course New") - add_users(self.superuser, CourseStaffRole(course.id), self.course_user) - chapter = BlockFactory.create( - category='chapter', parent=course, upstream=section_id, upstream_version=2, - ) - sequential = BlockFactory.create( - category='sequential', - parent=chapter, - upstream=subsection_id, - upstream_version=2, - top_level_downstream_parent_key=get_block_key_string(chapter.usage_key), - ) - vertical = BlockFactory.create( - category='vertical', - parent=sequential, - upstream=unit_id, - upstream_version=2, - top_level_downstream_parent_key=get_block_key_string(chapter.usage_key), - ) - BlockFactory.create( - category='video', - parent=vertical, - upstream=video_id, - upstream_version=1, - top_level_downstream_parent_key=get_block_key_string(chapter.usage_key), - ) - self._delete_component(video_id) - self._publish_container(unit_id) - response = self.call_api(course_id=course.id, ready_to_sync=True, use_top_level_parents=True) - assert response.status_code == 200 - data = response.json()['results'] - assert len(data) == 1 - date_format = self.now.isoformat().split("+")[0] + 'Z' - expected_results = { - 'created': date_format, - 'downstream_context_key': str(course.id), - 'downstream_usage_key': str(chapter.usage_key), - 
'downstream_customized': [], - 'id': 8, - 'ready_to_sync': False, - 'ready_to_sync_from_children': True, - 'top_level_parent_usage_key': None, - 'updated': date_format, - 'upstream_context_key': self.library_id, - 'upstream_context_title': self.library_title, - 'upstream_key': section_id, - 'upstream_type': 'container', - 'upstream_version': 2, - 'version_declined': None, - 'version_synced': 2, - } - - self.assertDictEqual(data[0], expected_results) diff --git a/cms/djangoapps/contentstore/utils.py b/cms/djangoapps/contentstore/utils.py index 78e41b7b1813..83f33e670608 100644 --- a/cms/djangoapps/contentstore/utils.py +++ b/cms/djangoapps/contentstore/utils.py @@ -15,7 +15,7 @@ from bs4 import BeautifulSoup from django.conf import settings -from django.core.exceptions import ImproperlyConfigured, ObjectDoesNotExist, ValidationError +from django.core.exceptions import ObjectDoesNotExist, ValidationError from django.urls import reverse from django.utils import translation from django.utils.text import Truncator @@ -26,13 +26,12 @@ from milestones import api as milestones_api from opaque_keys import InvalidKeyError from opaque_keys.edx.keys import CourseKey, UsageKey, UsageKeyV2 -from opaque_keys.edx.locator import BlockUsageLocator, LibraryContainerLocator, LibraryLocator +from opaque_keys.edx.locator import LibraryContainerLocator, LibraryLocator from openedx_events.content_authoring.data import DuplicatedXBlockData from openedx_events.content_authoring.signals import XBLOCK_DUPLICATED from openedx_events.learning.data import CourseNotificationData from openedx_events.learning.signals import COURSE_NOTIFICATION_REQUESTED from pytz import UTC -from rest_framework.fields import BooleanField from xblock.fields import Scope from cms.djangoapps.contentstore.toggles import ( @@ -43,21 +42,25 @@ split_library_view_on_dashboard, use_new_advanced_settings_page, use_new_certificates_page, + use_new_course_outline_page, use_new_course_team_page, use_new_custom_pages, 
     use_new_export_page,
+    use_new_files_uploads_page,
     use_new_grading_page,
     use_new_group_configurations_page,
+    use_new_home_page,
     use_new_import_page,
     use_new_schedule_details_page,
+    use_new_text_editor,
     use_new_textbooks_page,
     use_new_unit_page,
     use_new_updates_page,
+    use_new_video_editor,
     use_new_video_uploads_page,
 )
 from cms.djangoapps.models.settings.course_grading import CourseGradingModel
 from cms.djangoapps.models.settings.course_metadata import CourseMetadata
-from cms.djangoapps.modulestore_migrator.api import get_migration_info
 from common.djangoapps.course_action_state.managers import CourseActionStateItemNotFoundError
 from common.djangoapps.course_action_state.models import CourseRerunState, CourseRerunUIStateManager
 from common.djangoapps.course_modes.models import CourseMode
@@ -84,7 +87,6 @@
 from common.djangoapps.xblock_django.api import deprecated_xblocks
 from common.djangoapps.xblock_django.user_service import DjangoXBlockUserService
 from openedx.core import toggles as core_toggles
-from openedx.core.djangoapps.content.course_overviews.models import CourseOverview
 from openedx.core.djangoapps.content_libraries.api import get_container
 from openedx.core.djangoapps.content_tagging.toggles import is_tagging_feature_disabled
 from openedx.core.djangoapps.credit.api import get_credit_requirements, is_credit_course
@@ -112,7 +114,6 @@
     get_all_partitions_for_course,  # lint-amnesty, pylint: disable=wrong-import-order
 )
 from xmodule.services import ConfigurationService, SettingsService, TeamsConfigurationService
-from xmodule.util.keys import BlockKey

 from .models import ComponentLink, ContainerLink
@@ -284,10 +285,11 @@
 def get_editor_page_base_url(course_locator) -> str:
     """
     Gets course authoring microfrontend URL for links to the new base editors
     """
     editor_url = None
-    mfe_base_url = get_course_authoring_url(course_locator)
-    course_mfe_url = f'{mfe_base_url}/course/{course_locator}/editor'
-    if mfe_base_url:
-        editor_url = course_mfe_url
+    if use_new_text_editor(course_locator) or use_new_video_editor(course_locator):
+        mfe_base_url = get_course_authoring_url(course_locator)
+        course_mfe_url = f'{mfe_base_url}/course/{course_locator}/editor'
+        if mfe_base_url:
+            editor_url = course_mfe_url
     return editor_url
@@ -295,15 +297,12 @@
 def get_studio_home_url():
     """
     Gets course authoring microfrontend URL for Studio Home view.
     """
-    mfe_base_url = settings.COURSE_AUTHORING_MICROFRONTEND_URL
-    if mfe_base_url:
-        studio_home_url = f'{mfe_base_url}/home'
-        return studio_home_url
-
-    raise ImproperlyConfigured(
-        "The COURSE_AUTHORING_MICROFRONTEND_URL must be configured. "
-        "Please set it to the base url for your authoring MFE."
-    )
+    studio_home_url = None
+    if use_new_home_page():
+        mfe_base_url = settings.COURSE_AUTHORING_MICROFRONTEND_URL
+        if mfe_base_url:
+            studio_home_url = f'{mfe_base_url}/home'
+    return studio_home_url


 def get_schedule_details_url(course_locator) -> str:
     """
@@ -415,10 +414,11 @@
     Gets course authoring microfrontend URL for files and uploads page view.
     """
     files_uploads_url = None
-    mfe_base_url = get_course_authoring_url(course_locator)
-    course_mfe_url = f'{mfe_base_url}/course/{course_locator}/assets'
-    if mfe_base_url:
-        files_uploads_url = course_mfe_url
+    if use_new_files_uploads_page(course_locator):
+        mfe_base_url = get_course_authoring_url(course_locator)
+        course_mfe_url = f'{mfe_base_url}/course/{course_locator}/assets'
+        if mfe_base_url:
+            files_uploads_url = course_mfe_url
     return files_uploads_url
@@ -435,17 +435,16 @@
     return video_uploads_url


-def get_course_outline_url(course_locator, block_to_show=None) -> str:
+def get_course_outline_url(course_locator) -> str:
     """
     Gets course authoring microfrontend URL for course oultine page view.
     """
     course_outline_url = None
-    mfe_base_url = get_course_authoring_url(course_locator)
-    course_mfe_url = f'{mfe_base_url}/course/{course_locator}'
-    if block_to_show:
-        course_mfe_url += f'?show={quote_plus(block_to_show)}'
-    if mfe_base_url:
-        course_outline_url = course_mfe_url
+    if use_new_course_outline_page(course_locator):
+        mfe_base_url = get_course_authoring_url(course_locator)
+        course_mfe_url = f'{mfe_base_url}/course/{course_locator}'
+        if mfe_base_url:
+            course_outline_url = course_mfe_url
     return course_outline_url
@@ -703,13 +702,6 @@
             for subsection in section.get_children()]


-def create_course_info_usage_key(course, section_key):
-    """
-    Returns the usage key for the specified section's course info block.
-    """
-    return course.id.make_usage_key('course_info', section_key)
-
-
 def reverse_url(handler_name, key_name=None, key_value=None, kwargs=None):
     """
     Creates the URL for the given handler.
@@ -1582,12 +1574,12 @@
     It is used for both DRF and django views.
     """
     from cms.djangoapps.contentstore.views.course import (
-        _accessible_libraries_iter,
-        _format_library_for_view,
-        _get_course_creator_status,
         get_allowed_organizations,
         get_allowed_organizations_for_libraries,
         user_can_create_organizations,
+        _accessible_libraries_iter,
+        _get_course_creator_status,
+        _format_library_for_view,
     )
     from cms.djangoapps.contentstore.views.library import (
         user_can_view_create_library_button,
@@ -1596,22 +1588,9 @@
         user_can_create_library,
     )
-    libraries = list(_accessible_libraries_iter(request.user) if libraries_v1_enabled() else [])
-    library_keys = [lib.location.library_key for lib in libraries]
-    migration_info = get_migration_info(library_keys)
-    is_migrated_filter = request.GET.get('is_migrated', None)
+    libraries = _accessible_libraries_iter(request.user) if libraries_v1_enabled() else []
     data = {
-        'libraries': [
-            _format_library_for_view(
-                lib,
-                request,
-                migrated_to=migration_info.get(lib.location.library_key)
-            )
-            for lib in libraries
-            if is_migrated_filter is None or (
-                BooleanField().to_internal_value(is_migrated_filter) == (lib.location.library_key in migration_info)
-            )
-        ]
+        'libraries': [_format_library_for_view(lib, request) for lib in libraries],
     }

     if not request_is_json:
@@ -1727,7 +1706,9 @@
         get_allowed_organizations,
         get_allowed_organizations_for_libraries,
         user_can_create_organizations,
+        _accessible_libraries_iter,
         _get_course_creator_status,
+        _format_library_for_view,
     )
     from cms.djangoapps.contentstore.views.library import (
         user_can_view_create_library_button,
@@ -1765,7 +1746,8 @@
         'user': user,
         'request_course_creator_url': reverse('request_course_creator'),
         'course_creator_status': _get_course_creator_status(user),
-        'rerun_creator_status': GlobalStaff().has_user(user),
+        #EDLYCUSTOM: we need to have same permission for rerun creator as course creator
+        'rerun_creator_status': _get_course_creator_status(user) == 'granted',
         'allow_unicode_course_id': settings.FEATURES.get('ALLOW_UNICODE_COURSE_ID', False),
         'allow_course_reruns': settings.FEATURES.get('ALLOW_COURSE_RERUNS', True),
         'active_tab': 'courses',
@@ -1949,10 +1931,7 @@
     course_block.discussions_settings['discussion_configuration_url'] = (
         f'{get_pages_and_resources_url(course_block.id)}/discussion/settings'
     )
-    try:
-        course_overview = CourseOverview.objects.get(id=course_block.id)
-    except CourseOverview.DoesNotExist:
-        course_overview = None
+
     course_index_context = {
         'language_code': request.LANGUAGE_CODE,
         'context_course': course_block,
@@ -1979,8 +1958,8 @@
         'advance_settings_url': reverse_course_url('advanced_settings_handler', course_block.id),
         'proctoring_errors': proctoring_errors,
         'taxonomy_tags_widget_url': get_taxonomy_tags_widget_url(course_block.id),
-        'created_on': course_overview.created if course_overview else None,
     }
+
     return course_index_context
@@ -2142,7 +2121,11 @@
         handler_name='certificate_activation_handler',
         course_key=course_key
     )
-    course_modes = CertificateManager.get_course_modes(course)
+    course_modes = [
+        mode.slug for mode in CourseMode.modes_for_course(
+            course_id=course_key, include_expired=True
+        ) if mode.slug != 'audit'
+    ]

     has_certificate_modes = len(course_modes) > 0
@@ -2332,8 +2315,6 @@
         app_name="updates",
         audience_filters={},
     )
-    # .. event_implemented_name: COURSE_NOTIFICATION_REQUESTED
-    # .. event_type: org.openedx.learning.course.notification.requested.v1
     COURSE_NOTIFICATION_REQUESTED.send_event(course_notification_data=notification_data)
@@ -2394,7 +2375,7 @@
     return ""


-def _create_or_update_component_link(created: datetime | None, xblock):
+def _create_or_update_component_link(course_key: CourseKey, created: datetime | None, xblock):
     """
     Create or update upstream->downstream link for components in database for given xblock.
     """
@@ -2404,31 +2385,19 @@
     except ObjectDoesNotExist:
         log.error(f"Library component not found for {upstream_usage_key}")
         lib_component = None
-
-    top_level_parent_usage_key = None
-    if xblock.top_level_downstream_parent_key is not None:
-        block_key = BlockKey.from_string(xblock.top_level_downstream_parent_key)
-        top_level_parent_usage_key = BlockUsageLocator(
-            xblock.usage_key.course_key,
-            block_key.type,
-            block_key.id,
-        )
-
     ComponentLink.update_or_create(
         lib_component,
         upstream_usage_key=upstream_usage_key,
         upstream_context_key=str(upstream_usage_key.context_key),
-        downstream_context_key=xblock.usage_key.course_key,
+        downstream_context_key=course_key,
         downstream_usage_key=xblock.usage_key,
-        top_level_parent_usage_key=top_level_parent_usage_key,
         version_synced=xblock.upstream_version,
         version_declined=xblock.upstream_version_declined,
-        downstream_customized=getattr(xblock, "downstream_customized", []),
        created=created,
    )


-def _create_or_update_container_link(created: datetime | None, xblock):
+def _create_or_update_container_link(course_key: CourseKey, created: datetime | None, xblock):
     """
     Create or update upstream->downstream link for containers in database for given xblock.
     """
@@ -2438,31 +2407,19 @@
     except ObjectDoesNotExist:
         log.error(f"Library component not found for {upstream_container_key}")
         lib_component = None
-
-    top_level_parent_usage_key = None
-    if xblock.top_level_downstream_parent_key is not None:
-        block_key = BlockKey.from_string(xblock.top_level_downstream_parent_key)
-        top_level_parent_usage_key = BlockUsageLocator(
-            xblock.usage_key.course_key,
-            block_key.type,
-            block_key.id,
-        )
-
     ContainerLink.update_or_create(
         lib_component,
         upstream_container_key=upstream_container_key,
         upstream_context_key=str(upstream_container_key.context_key),
-        downstream_context_key=xblock.usage_key.course_key,
+        downstream_context_key=course_key,
         downstream_usage_key=xblock.usage_key,
         version_synced=xblock.upstream_version,
-        top_level_parent_usage_key=top_level_parent_usage_key,
         version_declined=xblock.upstream_version_declined,
-        downstream_customized=getattr(xblock, "downstream_customized", []),
         created=created,
     )


-def create_or_update_xblock_upstream_link(xblock, created: datetime | None = None) -> None:
+def create_or_update_xblock_upstream_link(xblock, course_key: CourseKey, created: datetime | None = None) -> None:
     """
     Create or update upstream->downstream link in database for given xblock.
     """
@@ -2470,44 +2427,8 @@
         return None
     try:
         # Try to create component link
-        _create_or_update_component_link(created, xblock)
+        _create_or_update_component_link(course_key, created, xblock)
     except InvalidKeyError:
         # It is possible that the upstream is a container and UsageKeyV2 parse failed
         # Create upstream container link and raise InvalidKeyError if xblock.upstream is a valid key.
-        _create_or_update_container_link(created, xblock)
-
-
-def get_previous_run_course_key(course_key):
-    """
-    Retrieves the course key of the previous run for a given course.
-    """
-    try:
-        rerun_state = CourseRerunState.objects.get(course_key=course_key)
-    except CourseRerunState.DoesNotExist:
-        log.warning(f'[Link Check] No rerun state found for course {course_key}. Cannot find previous run.')
-        return None
-
-    return rerun_state.source_course_key
-
-
-def contains_course_reference(url, course_key):
-    """
-    Checks if a URL contains an exact reference to the specified course key.
-    Uses specific delimiter matching to ensure exact matching and avoid partial matches.
-
-    Args:
-        url: The URL to check
-        course_key: The course key to look for
-
-    Returns:
-        bool: True if URL contains exact reference to the course
-    """
-    if not course_key or not url:
-        return False
-
-    course_key_pattern = re.escape(str(course_key))
-
-    # Ensure the course key is followed by '/' or end of string
-    pattern = course_key_pattern + r'(?=/|$)'
-
-    return bool(re.search(pattern, url, re.IGNORECASE))
+        _create_or_update_container_link(course_key, created, xblock)
diff --git a/cms/djangoapps/contentstore/views/course.py b/cms/djangoapps/contentstore/views/course.py
index 453e30e0aad0..00fd64d4272c 100644
--- a/cms/djangoapps/contentstore/views/course.py
+++ b/cms/djangoapps/contentstore/views/course.py
@@ -7,19 +7,14 @@
 import random
 import re
 import string
-from typing import Dict, NamedTuple, Optional
+from typing import Dict
 import django.utils
 from ccx_keys.locator import CCXLocator
 from django.conf import settings
 from django.contrib.auth import get_user_model
 from django.contrib.auth.decorators import login_required
-from django.core.exceptions import (
-    FieldError,
-    ImproperlyConfigured,
-    PermissionDenied,
-    ValidationError as DjangoValidationError,
-)
+from django.core.exceptions import FieldError, PermissionDenied, ValidationError as DjangoValidationError
 from django.db.models import QuerySet
 from django.http import Http404, HttpResponse, HttpResponseBadRequest, HttpResponseNotFound
 from django.shortcuts import redirect
@@ -27,7 +22,8 @@
 from django.utils.translation import gettext as _
 from django.views.decorators.csrf import ensure_csrf_cookie
 from django.views.decorators.http import require_GET, require_http_methods
-from drf_spectacular.utils import extend_schema, OpenApiParameter, OpenApiRequest, OpenApiResponse
+from edly_features_app.filters import CoursesRequested, OrganizationsRequested
+from edly_features_app.roles import GlobalCourseCreatorRole
 from edx_django_utils.monitoring import function_trace
 from opaque_keys import InvalidKeyError
 from opaque_keys.edx.keys import CourseKey
@@ -35,8 +31,6 @@
 from organizations.api import add_organization_course, ensure_organization
 from organizations.exceptions import InvalidOrganizationException
 from rest_framework.exceptions import ValidationError
-from rest_framework.decorators import api_view
-from openedx.core.lib.api.view_utils import view_auth_classes

 from cms.djangoapps.contentstore.xblock_storage_handlers.view_handlers import create_xblock_info
 from cms.djangoapps.course_creators.views import add_user_with_status_unrequested, get_course_creator_status
@@ -90,6 +84,8 @@
 from ..tasks import rerun_course as rerun_course_task
 from ..toggles import (
     default_enable_flexible_peer_openassessments,
+    use_new_course_outline_page,
+    use_new_home_page,
     use_new_updates_page,
     use_new_advanced_settings_page,
     use_new_grading_page,
@@ -101,12 +97,15 @@
     add_instructor,
     get_advanced_settings_url,
     get_course_grading,
+    get_course_index_context,
     get_course_outline_url,
     get_course_rerun_context,
     get_course_settings,
     get_grading_url,
     get_group_configurations_context,
     get_group_configurations_url,
+    get_home_context,
+    get_library_context,
     get_lms_link_for_item,
     get_proctored_exam_settings_url,
     get_schedule_details_url,
@@ -138,7 +137,7 @@
            'course_notifications_handler', 'textbooks_list_handler', 'textbooks_detail_handler',
            'group_configurations_list_handler', 'group_configurations_detail_handler',
-           'get_course_and_check_access', 'bulk_enable_disable_discussions']
+           'get_course_and_check_access']


 class AccessListFallback(Exception):
@@ -320,9 +319,12 @@
         html: return html page with form to rerun a course for the given course id
     """
     # Only global staff (PMs) are able to rerun courses during the soft launch
-    if not GlobalStaff().has_user(request.user):
-        raise PermissionDenied()
+    #EDLYCUSTOM: Permit global course creators to access rerun status
+    course_key = CourseKey.from_string(course_key_string)
+    if not GlobalStaff().has_user(request.user) and not GlobalCourseCreatorRole(course_key.org).has_user(request.user):
+        raise PermissionDenied()
+
     with modulestore().bulk_operations(course_key):
         course_block = get_course_and_check_access(course_key, request.user, depth=3)
         if request.method == 'GET':
@@ -537,7 +539,9 @@ def filter_ccx(course_access):
     instructor_courses = UserBasedRole(request.user, CourseInstructorRole.ROLE).courses_with_role()
     staff_courses = UserBasedRole(request.user, CourseStaffRole.ROLE).courses_with_role()
-    all_courses = list(filter(filter_ccx, instructor_courses | staff_courses))
+    #EDLYCUSTOM: Include Global Course Creators of org as site courses
+    site_courses = UserBasedRole(request.user, GlobalCourseCreatorRole.ROLE).courses_with_role()
+    all_courses = list(filter(filter_ccx, instructor_courses | staff_courses | site_courses))

     courses_list = []
     course_keys = {}
@@ -652,7 +656,11 @@
     """
     List all courses and libraries available to the logged in user
     """
-    return redirect(get_studio_home_url())
+    if use_new_home_page():
+        return redirect(get_studio_home_url())
+
+    home_context = get_home_context(request)
+    return render_to_response('index.html', home_context)


 @login_required
@@ -661,17 +669,11 @@
     """
     List all Libraries available to the logged in user
     """
-    mfe_base_url = settings.COURSE_AUTHORING_MICROFRONTEND_URL
-    if mfe_base_url:
-        return redirect(f'{mfe_base_url}/libraries')
-
-    raise ImproperlyConfigured(
-        "The COURSE_AUTHORING_MICROFRONTEND_URL must be configured. "
-        "Please set it to the base url for your authoring MFE."
-    )
+    data = get_library_context(request)
+    return render_to_response('index.html', data)


-def _format_library_for_view(library, request, migrated_to: Optional[NamedTuple]):
+def _format_library_for_view(library, request):
     """
     Return a dict of the data which the view requires for each library
     """
@@ -683,7 +685,6 @@
         'org': library.display_org_with_default,
         'number': library.display_number_with_default,
         'can_edit': has_studio_write_access(request.user, library.location.library_key),
-        **(migrated_to._asdict() if migrated_to is not None else {}),
     }
@@ -738,8 +739,17 @@
         org, course, name: Attributes of the Location for the item to edit
     """
-    block_to_show = request.GET.get("show")
-    return redirect(get_course_outline_url(course_key, block_to_show))
+    if use_new_course_outline_page(course_key):
+        return redirect(get_course_outline_url(course_key))
+    with modulestore().bulk_operations(course_key):
+        # A depth of None implies the whole course. The course outline needs this in order to compute has_changes.
+        # A unit may not have a draft version, but one of its components could, and hence the unit itself has changes.
+        course_block = get_course_and_check_access(course_key, request.user, depth=None)
+        if not course_block:
+            raise Http404
+        # should be under bulk_operations if course_block is passed
+        course_index_context = get_course_index_context(request, course_key, course_block)
+        return render_to_response('course_outline.html', course_index_context)


 @function_trace('get_courses_accessible_to_user')
@@ -761,6 +771,8 @@
         # user have some old groups or there was some error getting courses from django groups
         # so fallback to iterating through all courses
         courses, in_process_course_actions = _accessible_courses_summary_iter(request)
+
+    courses = CoursesRequested.run_filter(courses=courses)
     return courses, in_process_course_actions
@@ -1707,89 +1719,6 @@ def group_configurations_detail_handler(request, course_key_string, group_config
     )


-@extend_schema(
-    summary="Bulk enable/disable discussions for all units in a course.",
-    description="Enable or disable discussions for all verticals in the specified course.",
-    request=OpenApiRequest(
-        request={
-            "type": "object",
-            "properties": {"discussion_enabled": {"type": "boolean"}},
-            "required": ["discussion_enabled"],
-        }
-    ),
-    responses={
-        200: OpenApiResponse(
-            response={
-                "type": "object",
-                "properties": {"units_updated_and_republished": {"type": "integer"}},
-            }
-        ),
-        400: OpenApiResponse(description="Bad request"),
-        403: OpenApiResponse(description="Permission denied"),
-    },
-    methods=["PUT"],
-    parameters=[
-        OpenApiParameter(
-            name="course_key_string",
-            description="Course key string",
-            required=True,
-            type=str,
-            location=OpenApiParameter.PATH,
-        )
-    ],
-)
-@api_view(['PUT'])
-@view_auth_classes()
-@expect_json
-def bulk_enable_disable_discussions(request, course_key_string):
-    """
-    API endpoint to enable/disable discussions for all verticals in the course and republish them.
-
-    PUT
-        json: enable/disable discussions for all units and republish
-    """
-    try:
-        # Validate the course key
-        course_key = CourseKey.from_string(course_key_string)
-    except InvalidKeyError:
-        return JsonResponseBadRequest({"error": "Invalid course key format"})
-
-    user = request.user
-
-    # check that logged in user has permissions to update this course
-    if not has_studio_write_access(user, course_key):
-        raise PermissionDenied()
-
-    if 'discussion_enabled' not in request.json:
-        return JsonResponseBadRequest({"error": "Missing 'discussion_enabled' field in request body"})
-    discussion_enabled = request.json['discussion_enabled']
-    log.info(
-        "User %s is attempting to %s discussions for all verticals in course %s",
-        user.username,
-        "enable" if discussion_enabled else "disable",
-        course_key
-    )
-
-    if request.method == 'PUT':
-        try:
-            store = modulestore()
-            changed = 0
-            with store.bulk_operations(course_key):
-                verticals = store.get_items(course_key, qualifiers={'block_type': 'vertical'})
-                for vertical in verticals:
-                    if vertical.discussion_enabled != discussion_enabled:
-                        vertical.discussion_enabled = discussion_enabled
-                        store.update_item(vertical, user.id)
-
-                        if store.has_published_version(vertical):
-                            store.publish(vertical.location, user.id)
-                        changed += 1
-            return JsonResponse({"units_updated_and_republished": changed})
-        except Exception as e:  # lint-amnesty, pylint: disable=broad-except
-            log.exception("Exception occurred while enabling/disabling discussion: %s", str(e))
-            return JsonResponseBadRequest({"error": str(e)})
-
-
 def are_content_experiments_enabled(course):
     """
     Returns True if content experiments have been enabled for the course.
@@ -1879,4 +1808,5 @@ def get_organizations(user):
     else:
         organizations = course_creator.organizations.all().values_list('short_name', flat=True)
+    organizations = OrganizationsRequested.run_filter(organizations=organizations)
     return organizations
diff --git a/cms/djangoapps/contentstore/xblock_storage_handlers/view_handlers.py b/cms/djangoapps/contentstore/xblock_storage_handlers/view_handlers.py
index 78c393532305..31a17466769d 100644
--- a/cms/djangoapps/contentstore/xblock_storage_handlers/view_handlers.py
+++ b/cms/djangoapps/contentstore/xblock_storage_handlers/view_handlers.py
@@ -20,7 +20,7 @@
 from django.http import HttpResponse, HttpResponseBadRequest
 from django.utils.translation import gettext as _
 from edx_django_utils.plugins import pluggable_override
-from openedx.core.djangoapps.content_libraries.api import ContainerMetadata, ContainerType, LibraryXBlockMetadata
+from openedx.core.djangoapps.content_libraries.api import LibraryXBlockMetadata
 from openedx.core.djangoapps.content_tagging.api import get_object_tag_counts
 from edx_proctoring.api import (
     does_backend_support_onboarding,
@@ -33,7 +33,6 @@
 from pytz import UTC
 from xblock.core import XBlock
 from xblock.fields import Scope
-from .xblock_helpers import get_block_key_string
 from cms.djangoapps.contentstore.config.waffle import SHOW_REVIEW_RULES_FLAG
 from cms.djangoapps.contentstore.helpers import StaticFileNotices
@@ -301,11 +300,14 @@ def modify_xblock(usage_key, request):
     )


-def save_xblock_with_callback(xblock, user, old_metadata=None, old_content=None):
+def _update_with_callback(xblock, user, old_metadata=None, old_content=None):
     """
     Updates the xblock in the modulestore.
     But before doing so, it calls the xblock's editor_saved callback function,
     and after doing so, it calls the xblock's post_editor_saved callback function.
+
+    TODO: Remove getattrs from this function.
+    See https://github.com/openedx/edx-platform/issues/33715
     """
     if old_metadata is None:
         old_metadata = own_metadata(xblock)
@@ -374,7 +376,7 @@
         if old_parent_location:
             old_parent = store.get_item(old_parent_location)
             old_parent.children.remove(new_child)
-            old_parent = save_xblock_with_callback(old_parent, user)
+            old_parent = _update_with_callback(old_parent, user)
         else:
             # the Studio UI currently doesn't present orphaned children, so assume this is an error
             return JsonResponse(
@@ -444,7 +446,7 @@
         validate_and_update_xblock_due_date(xblock)
         # update the xblock and call any xblock callbacks
-        xblock = save_xblock_with_callback(xblock, user, old_metadata, old_content)
+        xblock = _update_with_callback(xblock, user, old_metadata, old_content)
         # for static tabs, their containing course also records their display name
         course = store.get_course(xblock.location.course_key)
@@ -526,33 +528,16 @@
     return _create_block(request)


-def sync_library_content(
-    downstream: XBlock,
-    request,
-    store,
-    top_level_parent: XBlock | None = None,
-) -> StaticFileNotices:
+def sync_library_content(downstream: XBlock, request, store) -> StaticFileNotices:
     """
     Handle syncing library content for given xblock depending on its upstream type.

     It can sync unit containers and lower level xblocks.
     """
     link = UpstreamLink.get_for_block(downstream)
     upstream_key = link.upstream_key
-    request_data = getattr(request, "json", getattr(request, "data", {}))
-    override_customizations = request_data.get("override_customizations", False)
-    keep_custom_fields = request_data.get("keep_custom_fields", [])
     if isinstance(upstream_key, LibraryUsageLocatorV2):
-        lib_block = sync_from_upstream_block(
-            downstream=downstream,
-            user=request.user,
-            top_level_parent=top_level_parent,
-            override_customizations=override_customizations,
-            keep_custom_fields=keep_custom_fields,
-        )
-        if lib_block:
-            static_file_notices = import_static_assets_for_library_sync(downstream, lib_block, request)
-        else:
-            static_file_notices = StaticFileNotices()
+        lib_block = sync_from_upstream_block(downstream=downstream, user=request.user)
+        static_file_notices = import_static_assets_for_library_sync(downstream, lib_block, request)
         store.update_item(downstream, request.user.id)
     else:
         with store.bulk_operations(downstream.usage_key.context_key):
@@ -561,66 +546,30 @@
             downstream_children_keys = [child.upstream for child in downstream_children]
             # Sync the children:
             notices = []
-            # Store final children keys to update order of items in containers
+            # Store final children keys to update order of components in unit
             children = []
-
-            top_level_downstream_parent = top_level_parent or downstream
-
             for i, upstream_child in enumerate(upstream_children):
-                if isinstance(upstream_child, LibraryXBlockMetadata):
-                    upstream_key = str(upstream_child.usage_key)
-                    block_type = upstream_child.usage_key.block_type
-                elif isinstance(upstream_child, ContainerMetadata):
-                    upstream_key = str(upstream_child.container_key)
-                    match upstream_child.container_type:
-                        case ContainerType.Unit:
-                            block_type = "vertical"
-                        case ContainerType.Subsection:
-                            block_type = "sequential"
-                        case _:
-                            # We don't support other container types for now.
-                            log.error(
-                                "Unexpected upstream child container type: %s",
-                                upstream_child.container_type,
-                            )
-                            continue
-                else:
-                    log.error(
-                        "Unexpected type of upstream child: %s",
-                        type(upstream_child),
-                    )
-                    continue
-
-                if upstream_key not in downstream_children_keys:
+                assert isinstance(upstream_child, LibraryXBlockMetadata)  # for now we only support units
+                if upstream_child.usage_key not in downstream_children_keys:
                     # This upstream_child is new, create it.
                     downstream_child = store.create_child(
                         parent_usage_key=downstream.usage_key,
                         position=i,
                         user_id=request.user.id,
-                        block_type=block_type,
+                        block_type=upstream_child.usage_key.block_type,
                         # TODO: Can we generate a unique but friendly block_id, perhaps using upstream block_id
-                        block_id=f"{block_type}{uuid4().hex[:8]}",
+                        block_id=f"{upstream_child.usage_key.block_type}{uuid4().hex[:8]}",
                         fields={
-                            "upstream": upstream_key,
-                            "top_level_downstream_parent_key": get_block_key_string(
-                                top_level_downstream_parent.usage_key,
-                            ),
+                            "upstream": str(upstream_child.usage_key),
                         },
                     )
                 else:
-                    downstream_child_old_index = downstream_children_keys.index(upstream_key)
+                    downstream_child_old_index = downstream_children_keys.index(upstream_child.usage_key)
                     downstream_child = downstream_children[downstream_child_old_index]
+                    result = sync_library_content(downstream=downstream_child, request=request, store=store)
                 children.append(downstream_child.usage_key)
-
-                result = sync_library_content(
-                    downstream=downstream_child,
-                    request=request,
-                    store=store,
-                    top_level_parent=top_level_downstream_parent,
-                )
                 notices.append(result)
-
             for child in downstream_children:
                 if child.usage_key not in children:
                     # This downstream block was added, or deleted from upstream block.
@@ -1127,14 +1076,11 @@ create_xblock_info(  # lint-amnesty, pylint: disable=too-many-statements
     # defining the default value 'True' for delete, duplicate, drag and add new child actions
     # in xblock_actions for each xblock.
-    # The unlinkable action is set to None by default, which means the action is not applicable for
-    # any xblock unless explicitly set to True or False for a specific xblock condition.
     xblock_actions = {
         "deletable": True,
         "draggable": True,
         "childAddable": True,
         "duplicable": True,
-        "unlinkable": None,
     }

     explanatory_message = None
@@ -1329,20 +1275,13 @@
         # Update with gating info
         xblock_info.update(_get_gating_info(course, xblock))
-        # Also add upstream info
-        upstream_info = UpstreamLink.try_get_for_block(xblock, log_error=False).to_json()
-        xblock_info["upstream_info"] = upstream_info
-
-        if upstream_info["upstream_ref"]:
-            # Disable adding or removing children component if xblock is imported from library
-            xblock_actions["childAddable"] = False
-            # Enable unlinking only for top level imported components
-            xblock_actions["unlinkable"] = not upstream_info["has_top_level_parent"]
-
         if is_xblock_unit:
             # if xblock is a Unit we add the discussion_enabled option
             xblock_info["discussion_enabled"] = xblock.discussion_enabled

+        # Also add upstream info
+        xblock_info["upstream_info"] = UpstreamLink.try_get_for_block(xblock, log_error=False).to_json()
+
         if xblock.category == "sequential":
             # Entrance exam subsection should be hidden. in_entrance_exam is
             # inherited metadata, all children will have it.
@@ -1687,7 +1626,7 @@ def _get_release_date(xblock, user=None):
     if reset_to_default and user:
         xblock.start = DEFAULT_START_DATE
-        xblock = save_xblock_with_callback(xblock, user)
+        xblock = _update_with_callback(xblock, user)

     # Treat DEFAULT_START_DATE as a magic number that means the release date has not been set
     return (
diff --git a/cms/djangoapps/course_creators/admin.py b/cms/djangoapps/course_creators/admin.py
index a1d34b3897fc..8ecf878e1b4f 100644
--- a/cms/djangoapps/course_creators/admin.py
+++ b/cms/djangoapps/course_creators/admin.py
@@ -13,6 +13,8 @@
 from django.core.mail import send_mail
 from django.db.models.signals import m2m_changed
 from django.dispatch import receiver
+from edly_features_app.overrides.emails import should_send_email, create_user_unsubscribe_url
+from edly_features_app.constants import ROLE_ASSIGNED, ROLE_REVOKED

 from cms.djangoapps.course_creators.models import (
     CourseCreator,
@@ -22,6 +24,9 @@
 )
 from cms.djangoapps.course_creators.views import update_course_creator_group, update_org_content_creator_role
 from common.djangoapps.edxmako.shortcuts import render_to_string
+from openedx.core.djangoapps.ace_common.template_context import get_base_template_context
+from openedx.core.djangoapps.theming.helpers import get_current_site
+import types

 log = logging.getLogger("studio.coursecreatoradmin")
@@ -144,6 +149,18 @@ def send_user_notification_callback(sender, **kwargs):  # pylint: disable=unused
     studio_request_email = settings.FEATURES.get('STUDIO_REQUEST_EMAIL', '')
     context = {'studio_request_email': studio_request_email}
+
+    # we need to mock Edx ace context to check if the email should be sent
+    site = get_current_site()
+    dummy_context = get_base_template_context(site)
+    dummy_context['tenant_key'] = settings.EDNX_TENANT_KEY
+    dummy_message = types.SimpleNamespace(
+        name=ROLE_ASSIGNED if updated_state == CourseCreator.GRANTED else ROLE_REVOKED,
+        context=dummy_context,
+        recipient=types.SimpleNamespace(email_address=user.email),
+    )
+    if not should_send_email(dummy_message):
+        return
     subject = render_to_string('emails/course_creator_subject.txt', context)
     subject = ''.join(subject.splitlines())
@@ -154,6 +171,12 @@
     else:
         # changed to unrequested or pending
         message_template = 'emails/course_creator_revoked.txt'
+
+        context['unsubscribe_url'] = create_user_unsubscribe_url(
+            user.email,
+            getattr(settings, 'PANEL_NOTIFICATIONS_BASE_URL', None),
+            getattr(settings, 'EDNX_TENANT_KEY', None),
+        )
     message = render_to_string(message_template, context)

     try:
diff --git a/cms/static/sass/course-unit-mfe-iframe-bundle.scss b/cms/static/sass/course-unit-mfe-iframe-bundle.scss
index 6c9b43cf0b25..2cb071f2ce6a 100644
--- a/cms/static/sass/course-unit-mfe-iframe-bundle.scss
+++ b/cms/static/sass/course-unit-mfe-iframe-bundle.scss
@@ -633,6 +633,42 @@ input.xblock-inline-title-editor {
   }
 }

+.lib-edit-warning-tooltipbox {
+  .tooltiptext {
+    visibility: hidden;
+    position: absolute;
+    width: 200px;
+    background-color: $black;
+    color: $white;
+    text-align: center;
+    padding: 5px;
+    border-radius: 6px;
+    z-index: 1;
+    top: 50%;
+    right: 100%;
+    margin-right: 10px;
+    transform: translateY(-50%);
+    opacity: 0;
+    transition: opacity 0.3s;
+  }
+
+  .tooltiptext::after {
+    content: "";
+    position: absolute;
+    top: 50%;
+    left: 100%;
+    transform: translateY(-50%);
+    border-width: 5px;
+    border-style: solid;
+    border-color: transparent transparent transparent $black;
+  }
+
+  &:hover .tooltiptext {
+    visibility: visible;
+    opacity: 1;
+  }
+}
+
 .action-edit {
   .action-button-text {
     display: none;
@@ -647,6 +683,13 @@
     display: none;
   }

+  &.disabled-button {
+    pointer-events: all;
+    opacity: .5;
+    cursor: default;
+    border-color: $transparent
+  }
+
   &::before {
     @extend %icon-position;
diff --git a/cms/templates/studio_xblock_wrapper.html b/cms/templates/studio_xblock_wrapper.html
index 8929b225cc91..e5ec43434747 100644
--- a/cms/templates/studio_xblock_wrapper.html
+++ b/cms/templates/studio_xblock_wrapper.html
@@ -7,11 +7,12 @@
 from openedx.core.djangolib.js_utils import (
     dump_js_escaped_json, js_escaped_string
 )
-from cms.djangoapps.contentstore.toggles import use_new_problem_editor, use_new_video_editor, use_video_gallery_flow
+from cms.djangoapps.contentstore.toggles import use_new_text_editor, use_new_problem_editor, use_new_video_editor, use_video_gallery_flow
 from cms.lib.xblock.upstream_sync import UpstreamLink
 from openedx.core.djangoapps.content_tagging.toggles import is_tagging_feature_disabled
 %>
 <%
+use_new_editor_text = use_new_text_editor(xblock.context_key)
 use_new_editor_video = use_new_video_editor(xblock.context_key)
 use_new_editor_problem = use_new_problem_editor(xblock.context_key)
 use_new_video_gallery_flow = use_video_gallery_flow()
@@ -25,7 +26,6 @@
 block_is_unit = is_unit(xblock)

 upstream_info = UpstreamLink.try_get_for_block(xblock, log_error=False)
-can_unlink = upstream_info.upstream_ref and not upstream_info.has_top_level_parent
 %>

 <%namespace name='static' file='static_content.html'/>
@@ -82,6 +82,7 @@
         is-collapsed
       % endif
     "
+    use-new-editor-text = ${use_new_editor_text}
     use-new-editor-video = ${use_new_editor_video}
     use-new-editor-problem = ${use_new_editor_problem}
     use-video-gallery-flow = ${use_new_video_gallery_flow}
@@ -91,7 +92,6 @@
     % if upstream_info.upstream_ref:
     data-upstream-ref = ${upstream_info.upstream_ref}
    data-version-synced = ${upstream_info.version_synced}
-    data-is-modified = ${len(upstream_info.downstream_customized) > 0}
    %endif
 >
@@ -119,14 +119,10 @@ ${_("Sourced from a library - but the upstream link is broken/invalid.")} % else: - 0 else "Sourced from a library.")}" aria-hidden="true" xmlns="http://www.w3.org/2000/svg" width="24px" height="24px" viewBox="0 -960 960 960" fill="currentColor" style="vertical-align: middle; padding-bottom: 4px;"> + - % if len(upstream_info.downstream_customized) > 0: - ${_("Sourced from a library - but has been modified locally.")} - % else: ${_("Sourced from a library.")} - % endif % endif % endif ${label} @@ -138,94 +134,95 @@
- - % if is_reorderable and can_move: -
-
+ % if is_reorderable: +
+ +
+ % endif + % else: + % if not show_inline: +
+
    + + ${_('Components within a library referenced object cannot be edited')} +
    +
  • + % endif % endif % endif diff --git a/common/djangoapps/student/auth.py b/common/djangoapps/student/auth.py index e199142fe377..bf291b767fd3 100644 --- a/common/djangoapps/student/auth.py +++ b/common/djangoapps/student/auth.py @@ -97,6 +97,12 @@ def get_user_permissions(user, course_key, org=None, service_variant=None): # global staff, org instructors, and course instructors have all permissions: if GlobalStaff().has_user(user) or OrgInstructorRole(org=org).has_user(user): return all_perms + + #EDLYCUSTOM: Provide all course permissions to Global Course Creators + from edly_features_app.roles import GlobalCourseCreatorRole + if GlobalCourseCreatorRole(org).has_user(user): + return all_perms + if course_key and user_has_role(user, CourseInstructorRole(course_key)): return all_perms # HACK: Limited Staff should not have studio read access. However, since many LMS views depend on the diff --git a/common/templates/xblock_v2/xblock_iframe.html b/common/templates/xblock_v2/xblock_iframe.html index 9bf0ab2df94f..cd3096aa4626 100644 --- a/common/templates/xblock_v2/xblock_iframe.html +++ b/common/templates/xblock_v2/xblock_iframe.html @@ -196,11 +196,6 @@ event listeners below, in certain situations. Resetting it to the default "auto" skirts the problem.-->
    - {% if show_title %} -
    - {{ display_name | safe }} -
    - {% endif %} {{ fragment.body_html | safe }} diff --git a/lms/djangoapps/discussion/tasks.py b/lms/djangoapps/discussion/tasks.py index d483388f54ae..c88fc674a447 100644 --- a/lms/djangoapps/discussion/tasks.py +++ b/lms/djangoapps/discussion/tasks.py @@ -179,10 +179,9 @@ def _track_notification_sent(message, context): def _should_send_message(context): cc_thread_author = cc.User(id=context['thread_author_id'], course_id=context['course_id']) + # EDLYCUSTOM: we want to enable email notifications for thread replies return ( - _is_user_subscribed_to_thread(cc_thread_author, context['thread_id']) and - _is_not_subcomment(context['comment_id']) and - not _comment_author_is_thread_author(context) + _is_user_subscribed_to_thread(cc_thread_author, context['thread_id']) ) diff --git a/lms/djangoapps/discussion/templates/discussion/edx_ace/responsenotification/email/body.html b/lms/djangoapps/discussion/templates/discussion/edx_ace/responsenotification/email/body.html index fcfe2ef9024b..7a74c07fffc7 100644 --- a/lms/djangoapps/discussion/templates/discussion/edx_ace/responsenotification/email/body.html +++ b/lms/djangoapps/discussion/templates/discussion/edx_ace/responsenotification/email/body.html @@ -17,7 +17,7 @@ padding: 1px 1px 1px 15px; margin: 20px 20px 30px 30px; color: rgba(0,0,0,.75);"> - {{ comment_body }} + {{ comment_body|safe }}
    {% filter force_escape %} diff --git a/lms/djangoapps/grades/course_grade.py b/lms/djangoapps/grades/course_grade.py index f5639873a809..60b0becf2030 100644 --- a/lms/djangoapps/grades/course_grade.py +++ b/lms/djangoapps/grades/course_grade.py @@ -30,6 +30,7 @@ def __init__( letter_grade=None, passed=False, force_update_subsections=False, + passed_timestamp=None, last_updated=None ): self.user = user @@ -44,6 +45,9 @@ def __init__( self.last_updated = last_updated + # EDLYCUSTOM: this timestamp is used to mark a course as completed in edly panel via figures LCGM + self.passed_timestamp = passed_timestamp + def __str__(self): return 'Course Grade: percent: {}, letter_grade: {}, passed: {}'.format( str(self.percent), diff --git a/lms/djangoapps/grades/course_grade_factory.py b/lms/djangoapps/grades/course_grade_factory.py index a6665bf67b84..b9d3bb278c33 100644 --- a/lms/djangoapps/grades/course_grade_factory.py +++ b/lms/djangoapps/grades/course_grade_factory.py @@ -144,6 +144,7 @@ def _read(user, course_data): persistent_grade.percent_grade, persistent_grade.letter_grade, persistent_grade.letter_grade != '', + passed_timestamp=persistent_grade.passed_timestamp, last_updated=persistent_grade.modified ) diff --git a/lms/djangoapps/instructor/views/api.py b/lms/djangoapps/instructor/views/api.py index eae657ed19e7..21d1c9f5e515 100644 --- a/lms/djangoapps/instructor/views/api.py +++ b/lms/djangoapps/instructor/views/api.py @@ -19,6 +19,7 @@ import edx_api_doc_tools as apidocs from django.conf import settings from django.contrib.auth.models import User # lint-amnesty, pylint: disable=imported-auth-user +from django_countries.fields import Country from django.core.exceptions import MultipleObjectsReturned, ObjectDoesNotExist, PermissionDenied, ValidationError from django.core.validators import validate_email from django.db import IntegrityError, transaction @@ -37,6 +38,8 @@ from opaque_keys import InvalidKeyError from opaque_keys.edx.keys import CourseKey, 
UsageKey from openedx.core.djangoapps.course_groups.cohorts import get_cohort_by_name +from openedx_events.learning.data import UserData, UserPersonalData +from openedx_events.learning.signals import STUDENT_REGISTRATION_COMPLETED from rest_framework.exceptions import MethodNotAllowed from rest_framework import serializers, status # lint-amnesty, pylint: disable=wrong-import-order from rest_framework.permissions import IsAdminUser, IsAuthenticated, BasePermission # lint-amnesty, pylint: disable=wrong-import-order @@ -416,6 +419,14 @@ def post(self, request, course_id): # pylint: disable=too-many-statements else: cohort_name = None course_mode = None + + if not Country(country).name: + row_errors.append({ + 'username': username, + 'email': email, + 'response': _('Invalid country: {country}. Please enter a valid country code. e.g., US, GB').format(country=country) + }) + continue # Validate cohort name, and get the cohort object. Skip if course # is not cohorted. @@ -641,6 +652,18 @@ def create_user_and_user_profile(email, username, name, country, password): profile.country = country profile.save() + STUDENT_REGISTRATION_COMPLETED.send_event( + user=UserData( + pii=UserPersonalData( + username=user.username, + email=user.email, + name=user.profile.name, + ), + id=user.id, + is_active=user.is_active, + ), + ) + return user diff --git a/lms/djangoapps/learner_dashboard/programs.py b/lms/djangoapps/learner_dashboard/programs.py index dc334c0ce34e..c5472ac097d3 100644 --- a/lms/djangoapps/learner_dashboard/programs.py +++ b/lms/djangoapps/learner_dashboard/programs.py @@ -102,7 +102,7 @@ def render_to_fragment(self, request, program_uuid, **kwargs): # lint-amnesty, program_data, course_data = get_program_and_course_data(site, user, program_uuid, mobile_only) - if not program_data: + if not program_data or program_data.get('status', '') == 'unpublished': raise Http404 certificate_data = get_certificates(user, program_data) diff --git 
a/lms/static/js/learner_dashboard/models/program_model.js b/lms/static/js/learner_dashboard/models/program_model.js index 4bfe9f6df334..284cd6b55c71 100644 --- a/lms/static/js/learner_dashboard/models/program_model.js +++ b/lms/static/js/learner_dashboard/models/program_model.js @@ -12,6 +12,7 @@ class ProgramModel extends Backbone.Model { subtitle: data.subtitle, authoring_organizations: data.authoring_organizations, detailUrl: data.detail_url, + cardImageUrl: data.card_image_url, xsmallBannerUrl: (data.banner_image && data.banner_image['x-small']) ? data.banner_image['x-small'].url : '', smallBannerUrl: (data.banner_image && data.banner_image.small) ? data.banner_image.small.url : '', mediumBannerUrl: (data.banner_image && data.banner_image.medium) ? data.banner_image.medium.url : '', diff --git a/lms/templates/learner_dashboard/program_card.underscore b/lms/templates/learner_dashboard/program_card.underscore index de98c952dd15..a4ba23915657 100644 --- a/lms/templates/learner_dashboard/program_card.underscore +++ b/lms/templates/learner_dashboard/program_card.underscore @@ -57,7 +57,7 @@ - + diff --git a/openedx/core/djangoapps/catalog/management/commands/cache_programs.py b/openedx/core/djangoapps/catalog/management/commands/cache_programs.py index 4af05da89280..6b3150345d1d 100644 --- a/openedx/core/djangoapps/catalog/management/commands/cache_programs.py +++ b/openedx/core/djangoapps/catalog/management/commands/cache_programs.py @@ -10,6 +10,7 @@ from django.contrib.sites.models import Site from django.core.cache import cache from django.core.management import BaseCommand +from edly_features_app.utils import get_active_tenant_sites_with_catalog_urls from openedx.core.djangoapps.catalog.cache import ( CATALOG_COURSE_PROGRAMS_CACHE_KEY_TPL, @@ -78,17 +79,18 @@ def handle(self, *args, **options): # lint-amnesty, pylint: disable=too-many-st programs_by_type_slug = {} organizations = {} - sites = Site.objects.filter(domain=domain) if domain else Site.objects.all() 
+ # EDLYCUSTOM: we need to filter active site and grab the catalog api url from tenant config + sites, domain_to_catalog_url = get_active_tenant_sites_with_catalog_urls(domain) for site in sites: - site_config = getattr(site, 'configuration', None) - if site_config is None or not site_config.get_value('COURSE_CATALOG_API_URL'): + catalog_api_url = domain_to_catalog_url[site.domain] + if not catalog_api_url: logger.info(f'Skipping site {site.domain}. No configuration.') cache.set(SITE_PROGRAM_UUIDS_CACHE_KEY_TPL.format(domain=site.domain), [], None) cache.set(SITE_PATHWAY_IDS_CACHE_KEY_TPL.format(domain=site.domain), [], None) continue client = get_catalog_api_client(user) - api_base_url = get_catalog_api_base_url(site=site) + api_base_url = catalog_api_url uuids, program_uuids_failed = self.get_site_program_uuids(client, site, api_base_url) new_programs, program_details_failed = self.fetch_program_details(client, uuids, api_base_url) new_pathways, pathways_failed = self.get_pathways(client, site, api_base_url) @@ -154,7 +156,7 @@ def get_site_program_uuids(self, client, site, api_base_url): # lint-amnesty, p try: querystring = { 'exclude_utm': 1, - 'status': ('active', 'retired'), + 'status': ('active', 'retired', 'unpublished'), 'uuids_only': 1, } api_url = urljoin(f"{api_base_url}/", "programs/") diff --git a/openedx/core/djangoapps/content/search/documents.py b/openedx/core/djangoapps/content/search/documents.py index df673db55dd6..28b5e74450a6 100644 --- a/openedx/core/djangoapps/content/search/documents.py +++ b/openedx/core/djangoapps/content/search/documents.py @@ -48,9 +48,8 @@ class Fields: org = "org" access_id = "access_id" # .models.SearchAccess.id # breadcrumbs: an array of {"display_name": "..."} entries. First one is the name of the course/library itself. - # After that is the name of any parent Section/Subsection/Unit/etc and its usage_key. + # After that is the name of any parent Section/Subsection/Unit/etc. 
# It's a list of dictionaries because for now we just include the name of each but in future we may add their IDs. - # Example: [{"display_name": "My course"}, {"display_name": "Section1", "usage_key": "..."}]} breadcrumbs = "breadcrumbs" # tags (dictionary) # See https://blog.meilisearch.com/nested-hierarchical-facets-guide/ @@ -68,15 +67,6 @@ class Fields: collections = "collections" collections_display_name = "display_name" collections_key = "key" - # Containers (dictionaries) that this object belongs to. - units = "units" - subsections = "subsections" - sections = "sections" - containers_display_name = "display_name" - containers_key = "key" - - sections_display_name = "display_name" - sections_key = "key" # The "content" field is a dictionary of arbitrary data, depending on the block_type. # It comes from each XBlock's index_dictionary() method (if present) plus some processing. @@ -101,11 +91,6 @@ class Fields: published_content = "content" published_num_children = "num_children" - # List of children keys - child_usage_keys = "child_usage_keys" - # List of children display names - child_display_names = "child_display_names" - # Note: new fields or values can be added at any time, but if they need to be indexed for filtering or keyword # search, the index configuration will need to be changed, which is only done as part of the 'reindex_studio' # command (changing those settings on an large active index is not recommended). @@ -263,92 +248,7 @@ class implementation returns only: return block_data -def _published_data_from_block(block_published) -> dict: - """ - Given an library block get the published data. 
- """ - result = { - Fields.published: { - Fields.published_display_name: xblock_api.get_block_display_name(block_published), - } - } - - try: - content_data = _get_content_from_block(block_published) - - description = _get_description_from_block_content( - block_published.scope_ids.block_type, - content_data, - ) - - if description: - result[Fields.published][Fields.published_description] = description - except Exception as err: # pylint: disable=broad-except - log.exception(f"Failed to process index_dictionary for {block_published.usage_key}: {err}") - - return result - - -def searchable_doc_for_library_block(xblock_metadata: lib_api.LibraryXBlockMetadata) -> dict: - """ - Generate a dictionary document suitable for ingestion into a search engine - like Meilisearch or Elasticsearch, so that the given library block can be - found using faceted search. - - Datetime fields (created, modified, last_published) are serialized to POSIX timestamps so that they can be used to - sort the search results. 
- """ - library_name = lib_api.get_library(xblock_metadata.usage_key.context_key).title - block = xblock_api.load_block(xblock_metadata.usage_key, user=None) - - publish_status = PublishStatus.published - try: - block_published = xblock_api.load_block(xblock_metadata.usage_key, user=None, version=LatestVersion.PUBLISHED) - if xblock_metadata.last_published and xblock_metadata.last_published < xblock_metadata.modified: - publish_status = PublishStatus.modified - except NotFound: - # Never published - block_published = None - publish_status = PublishStatus.never - - doc = searchable_doc_for_key(xblock_metadata.usage_key) - doc.update({ - Fields.type: DocType.library_block, - Fields.breadcrumbs: [], - Fields.created: xblock_metadata.created.timestamp(), - Fields.modified: xblock_metadata.modified.timestamp(), - Fields.last_published: xblock_metadata.last_published.timestamp() if xblock_metadata.last_published else None, - Fields.publish_status: publish_status, - }) - - doc.update(_fields_from_block(block)) - - if block_published: - doc.update(_published_data_from_block(block_published)) - - # Add the breadcrumbs. In v2 libraries, the library itself is not a "parent" of the XBlocks so we add it here: - doc[Fields.breadcrumbs] = [{"display_name": library_name}] - - return doc - - -def searchable_doc_for_course_block(block) -> dict: - """ - Generate a dictionary document suitable for ingestion into a search engine - like Meilisearch or Elasticsearch, so that the given course block can be - found using faceted search. - """ - doc = searchable_doc_for_key(block.usage_key) - doc.update({ - Fields.type: DocType.course_block, - }) - - doc.update(_fields_from_block(block)) - - return doc - - -def searchable_doc_tags(object_id: OpaqueKey) -> dict: +def _tags_for_content_object(object_id: OpaqueKey) -> dict: """ Given an XBlock, course, library, etc., get the tag data for its index doc. 
@@ -413,7 +313,7 @@ def searchable_doc_tags(object_id: OpaqueKey) -> dict: return {Fields.tags: result} -def searchable_doc_collections(object_id: OpaqueKey) -> dict: +def _collections_for_content_object(object_id: OpaqueKey) -> dict: """ Given an XBlock, course, library, etc., get the collections for its index doc. @@ -469,55 +369,126 @@ def searchable_doc_collections(object_id: OpaqueKey) -> dict: return result -def searchable_doc_containers(object_id: OpaqueKey, container_type: str) -> dict: +def _published_data_from_block(block_published) -> dict: """ - Given an XBlock, course, library, etc., get the containers that it is part of for its index doc. - - e.g. for something in Units "UNIT_A" and "UNIT_B", this would return: - { - "units": { - "display_name": ["Unit A", "Unit B"], - "key": ["UNIT_A", "UNIT_B"], - } - } - - If the object is in no containers, returns: - { - "sections": { - "display_name": [], - "key": [], - }, - } + Given an library block get the published data. """ - container_field = getattr(Fields, container_type) result = { - container_field: { - Fields.containers_display_name: [], - Fields.containers_key: [], + Fields.published: { + Fields.published_display_name: xblock_api.get_block_display_name(block_published), } } - # Gather the units associated with this object - containers = None try: - if isinstance(object_id, OpaqueKey): - containers = lib_api.get_containers_contains_item(object_id) - else: - log.warning(f"Unexpected key type for {object_id}") - - except ObjectDoesNotExist: - log.warning(f"No library item found for {object_id}") + content_data = _get_content_from_block(block_published) - if not containers: - return result + description = _get_description_from_block_content( + block_published.scope_ids.block_type, + content_data, + ) - for container in containers: - result[container_field][Fields.containers_display_name].append(container.display_name) - result[container_field][Fields.containers_key].append(str(container.container_key)) + 
if description: + result[Fields.published][Fields.published_description] = description + except Exception as err: # pylint: disable=broad-except + log.exception(f"Failed to process index_dictionary for {block_published.usage_key}: {err}") return result +def searchable_doc_for_library_block(xblock_metadata: lib_api.LibraryXBlockMetadata) -> dict: + """ + Generate a dictionary document suitable for ingestion into a search engine + like Meilisearch or Elasticsearch, so that the given library block can be + found using faceted search. + + Datetime fields (created, modified, last_published) are serialized to POSIX timestamps so that they can be used to + sort the search results. + """ + library_name = lib_api.get_library(xblock_metadata.usage_key.context_key).title + block = xblock_api.load_block(xblock_metadata.usage_key, user=None) + + publish_status = PublishStatus.published + try: + block_published = xblock_api.load_block(xblock_metadata.usage_key, user=None, version=LatestVersion.PUBLISHED) + if xblock_metadata.last_published and xblock_metadata.last_published < xblock_metadata.modified: + publish_status = PublishStatus.modified + except NotFound: + # Never published + block_published = None + publish_status = PublishStatus.never + + doc = searchable_doc_for_key(xblock_metadata.usage_key) + doc.update({ + Fields.type: DocType.library_block, + Fields.breadcrumbs: [], + Fields.created: xblock_metadata.created.timestamp(), + Fields.modified: xblock_metadata.modified.timestamp(), + Fields.last_published: xblock_metadata.last_published.timestamp() if xblock_metadata.last_published else None, + Fields.publish_status: publish_status, + }) + + doc.update(_fields_from_block(block)) + + if block_published: + doc.update(_published_data_from_block(block_published)) + + # Add the breadcrumbs. 
In v2 libraries, the library itself is not a "parent" of the XBlocks so we add it here: + doc[Fields.breadcrumbs] = [{"display_name": library_name}] + + return doc + + +def searchable_doc_tags(key: OpaqueKey) -> dict: + """ + Generate a dictionary document suitable for ingestion into a search engine + like Meilisearch or Elasticsearch, with the tags data for the given content object. + """ + doc = searchable_doc_for_key(key) + doc.update(_tags_for_content_object(key)) + + return doc + + +def searchable_doc_collections(opaque_key: OpaqueKey) -> dict: + """ + Generate a dictionary document suitable for ingestion into a search engine + like Meilisearch or Elasticsearch, with the collections data for the given content object. + """ + doc = searchable_doc_for_key(opaque_key) + doc.update(_collections_for_content_object(opaque_key)) + + return doc + + +def searchable_doc_tags_for_collection( + collection_key: LibraryCollectionLocator +) -> dict: + """ + Generate a dictionary document suitable for ingestion into a search engine + like Meilisearch or Elasticsearch, with the tags data for the given library collection. + """ + doc = searchable_doc_for_key(collection_key) + doc.update(_tags_for_content_object(collection_key)) + + return doc + + +def searchable_doc_for_course_block(block) -> dict: + """ + Generate a dictionary document suitable for ingestion into a search engine + like Meilisearch or Elasticsearch, so that the given course block can be + found using faceted search. 
+ """ + doc = searchable_doc_for_key(block.usage_key) + doc.update({ + Fields.type: DocType.course_block, + }) + + doc.update(_fields_from_block(block)) + + return doc + + def searchable_doc_for_collection( collection_key: LibraryCollectionLocator, *, @@ -622,32 +593,16 @@ def searchable_doc_for_container( elif container.has_unpublished_changes: publish_status = PublishStatus.modified - container_type = lib_api.ContainerType(container_key.container_type) - - def get_child_keys(children) -> list[str]: - match container_type: - case lib_api.ContainerType.Unit: - return [ - str(child.usage_key) - for child in children - ] - case lib_api.ContainerType.Subsection | lib_api.ContainerType.Section: - return [ - str(child.container_key) - for child in children - ] - - def get_child_names(children) -> list[str]: - return [child.display_name for child in children] - doc.update({ Fields.display_name: container.display_name, Fields.created: container.created.timestamp(), Fields.modified: container.modified.timestamp(), Fields.num_children: len(draft_children), Fields.content: { - Fields.child_usage_keys: get_child_keys(draft_children), - Fields.child_display_names: get_child_names(draft_children), + "child_usage_keys": [ + str(child.usage_key) + for child in draft_children + ], }, Fields.publish_status: publish_status, Fields.last_published: container.last_published.timestamp() if container.last_published else None, @@ -665,8 +620,10 @@ def get_child_names(children) -> list[str]: Fields.published_display_name: container.published_display_name, Fields.published_num_children: len(published_children), Fields.published_content: { - Fields.child_usage_keys: get_child_keys(published_children), - Fields.child_display_names: get_child_names(published_children), + "child_usage_keys": [ + str(child.usage_key) + for child in published_children + ], }, } diff --git a/openedx/core/djangoapps/content/search/handlers.py b/openedx/core/djangoapps/content/search/handlers.py index 
38dac3153596..315d3cde53fd 100644 --- a/openedx/core/djangoapps/content/search/handlers.py +++ b/openedx/core/djangoapps/content/search/handlers.py @@ -12,55 +12,48 @@ from openedx_events.content_authoring.data import ( ContentLibraryData, ContentObjectChangedData, - CourseData, LibraryBlockData, LibraryCollectionData, LibraryContainerData, XBlockData, ) from openedx_events.content_authoring.signals import ( - CONTENT_LIBRARY_CREATED, CONTENT_LIBRARY_DELETED, CONTENT_LIBRARY_UPDATED, - CONTENT_OBJECT_ASSOCIATIONS_CHANGED, - COURSE_IMPORT_COMPLETED, - COURSE_RERUN_COMPLETED, LIBRARY_BLOCK_CREATED, LIBRARY_BLOCK_DELETED, - LIBRARY_BLOCK_PUBLISHED, LIBRARY_BLOCK_UPDATED, + LIBRARY_BLOCK_PUBLISHED, LIBRARY_COLLECTION_CREATED, LIBRARY_COLLECTION_DELETED, LIBRARY_COLLECTION_UPDATED, LIBRARY_CONTAINER_CREATED, LIBRARY_CONTAINER_DELETED, - LIBRARY_CONTAINER_PUBLISHED, LIBRARY_CONTAINER_UPDATED, + LIBRARY_CONTAINER_PUBLISHED, XBLOCK_CREATED, XBLOCK_DELETED, XBLOCK_UPDATED, + CONTENT_OBJECT_ASSOCIATIONS_CHANGED, ) from openedx.core.djangoapps.content.course_overviews.models import CourseOverview from openedx.core.djangoapps.content.search.models import SearchAccess from openedx.core.djangoapps.content_libraries import api as lib_api -from xmodule.modulestore.django import SignalHandler from .api import ( only_if_meilisearch_enabled, upsert_content_object_tags_index_doc, + upsert_collection_tags_index_docs, upsert_item_collections_index_docs, - upsert_item_containers_index_docs, ) from .tasks import ( - delete_course_index_docs, delete_library_block_index_doc, delete_library_container_index_doc, delete_xblock_index_doc, update_content_library_index_docs, update_library_collection_index_doc, update_library_container_index_doc, - upsert_course_blocks_docs, upsert_library_block_index_doc, upsert_xblock_index_doc, ) @@ -188,21 +181,6 @@ def library_block_deleted(**kwargs) -> None: delete_library_block_index_doc.apply(args=[str(library_block_data.usage_key)]) 
-@receiver(CONTENT_LIBRARY_CREATED) -@only_if_meilisearch_enabled -def content_library_created_handler(**kwargs) -> None: - """ - Create the index for the content library - """ - content_library_data = kwargs.get("content_library", None) - if not content_library_data or not isinstance(content_library_data, ContentLibraryData): # pragma: no cover - log.error("Received null or incorrect data for event") - return - library_key = content_library_data.library_key - - update_content_library_index_docs.apply(args=[str(library_key), True]) - - @receiver(CONTENT_LIBRARY_UPDATED) @only_if_meilisearch_enabled def content_library_updated_handler(**kwargs) -> None: @@ -279,15 +257,12 @@ def content_object_associations_changed_handler(**kwargs) -> None: # This event's changes may contain both "tags" and "collections", but this will happen rarely, if ever. # So we allow a potential double "upsert" here. if not content_object.changes or "tags" in content_object.changes: - upsert_content_object_tags_index_doc(opaque_key) + if isinstance(opaque_key, LibraryCollectionLocator): + upsert_collection_tags_index_docs(opaque_key) + else: + upsert_content_object_tags_index_doc(opaque_key) if not content_object.changes or "collections" in content_object.changes: upsert_item_collections_index_docs(opaque_key) - if not content_object.changes or "units" in content_object.changes: - upsert_item_containers_index_docs(opaque_key, "units") - if not content_object.changes or "sections" in content_object.changes: - upsert_item_containers_index_docs(opaque_key, "sections") - if not content_object.changes or "subsections" in content_object.changes: - upsert_item_containers_index_docs(opaque_key, "subsections") @receiver(LIBRARY_CONTAINER_CREATED) @@ -349,25 +324,3 @@ def library_container_deleted(**kwargs) -> None: # TODO: post-Teak, move all the celery tasks directly inline into this handlers? 
Because now the # events are emitted in an [async] worker, so it doesn't matter if the handlers are synchronous. # See https://github.com/openedx/edx-platform/pull/36640 discussion. - - -@receiver([COURSE_IMPORT_COMPLETED, COURSE_RERUN_COMPLETED]) -def handle_reindex_on_signal(**kwargs): - """ - Automatically update Meiliesearch index for course in database on new import or rerun. - """ - course_data = kwargs.get("course", None) - if not course_data or not isinstance(course_data, CourseData): - log.error("Received null or incorrect data for event") - return - - upsert_course_blocks_docs.delay(str(course_data.course_key)) - - -@receiver(SignalHandler.course_deleted) -def listen_for_course_delete(sender, course_key, **kwargs): # pylint: disable=unused-argument - """ - Catches the signal that a course has been deleted - and removes its entry from the Course About Search index. - """ - delete_course_index_docs.delay(str(course_key)) diff --git a/openedx/core/djangoapps/content/search/tasks.py b/openedx/core/djangoapps/content/search/tasks.py index 8d18c2aae6ee..5015f6912b10 100644 --- a/openedx/core/djangoapps/content/search/tasks.py +++ b/openedx/core/djangoapps/content/search/tasks.py @@ -10,7 +10,7 @@ from celery_utils.logged_task import LoggedTask from edx_django_utils.monitoring import set_code_owner_attribute from meilisearch.errors import MeilisearchError -from opaque_keys.edx.keys import CourseKey, UsageKey +from opaque_keys.edx.keys import UsageKey from opaque_keys.edx.locator import ( LibraryCollectionLocator, LibraryContainerLocator, @@ -36,19 +36,6 @@ def upsert_xblock_index_doc(usage_key_str: str, recursive: bool) -> None: api.upsert_xblock_index_doc(usage_key, recursive) -@shared_task(base=LoggedTask, autoretry_for=(MeilisearchError, ConnectionError)) -@set_code_owner_attribute -def upsert_course_blocks_docs(course_key_str: str) -> None: - """ - Celery task to update the content index document for all XBlocks in a course. 
- """ - course_key = CourseKey.from_string(course_key_str) - - log.info("Updating content index documents for XBlocks in course with id: %s", course_key) - - api.index_course(course_key) - - @shared_task(base=LoggedTask, autoretry_for=(MeilisearchError, ConnectionError)) @set_code_owner_attribute def delete_xblock_index_doc(usage_key_str: str) -> None: @@ -59,8 +46,7 @@ def delete_xblock_index_doc(usage_key_str: str) -> None: log.info("Updating content index document for XBlock with id: %s", usage_key) - # Delete children index data for course blocks. - api.delete_index_doc(usage_key, delete_children=True) + api.delete_index_doc(usage_key) @shared_task(base=LoggedTask, autoretry_for=(MeilisearchError, ConnectionError)) @@ -91,7 +77,7 @@ def delete_library_block_index_doc(usage_key_str: str) -> None: @shared_task(base=LoggedTask, autoretry_for=(MeilisearchError, ConnectionError)) @set_code_owner_attribute -def update_content_library_index_docs(library_key_str: str, full_index: bool = False) -> None: +def update_content_library_index_docs(library_key_str: str) -> None: """ Celery task to update the content index documents for all library blocks in a library """ @@ -99,8 +85,7 @@ def update_content_library_index_docs(library_key_str: str, full_index: bool = F log.info("Updating content index documents for library with id: %s", library_key) - # If full_index is True, also update collections and containers data - api.upsert_content_library_index_docs(library_key, full_index=full_index) + api.upsert_content_library_index_docs(library_key) @shared_task(base=LoggedTask, autoretry_for=(MeilisearchError, ConnectionError)) @@ -170,17 +155,3 @@ def delete_library_container_index_doc(container_key_str: str) -> None: log.info("Deleting content index document for library block with id: %s", container_key) api.delete_index_doc(container_key) - - -@shared_task(base=LoggedTask, autoretry_for=(MeilisearchError, ConnectionError)) -@set_code_owner_attribute -def 
delete_course_index_docs(course_key_str: str) -> None: - """ - Celery task to delete the content index documents for a Course - """ - course_key = CourseKey.from_string(course_key_str) - - log.info("Deleting all index documents related to course_key: %s", course_key) - - # Delete children index data for course blocks. - api.delete_docs_with_context_key(course_key) diff --git a/openedx/core/djangoapps/content/search/tests/test_api.py b/openedx/core/djangoapps/content/search/tests/test_api.py index e1b6f8fe16b0..90b6a407e717 100644 --- a/openedx/core/djangoapps/content/search/tests/test_api.py +++ b/openedx/core/djangoapps/content/search/tests/test_api.py @@ -8,7 +8,7 @@ from datetime import datetime, timezone from unittest.mock import MagicMock, Mock, call, patch from opaque_keys.edx.keys import UsageKey -from opaque_keys.edx.locator import LibraryCollectionLocator, LibraryContainerLocator +from opaque_keys.edx.locator import LibraryCollectionLocator import ddt import pytest @@ -47,8 +47,7 @@ class TestSearchApi(ModuleStoreTestCase): MODULESTORE = TEST_DATA_SPLIT_MODULESTORE - def setUp(self) -> None: - # pylint: disable=too-many-statements + def setUp(self): super().setUp() self.user = UserFactory.create() self.user_id = self.user.id @@ -135,8 +134,8 @@ def setUp(self) -> None: lib_access, _ = SearchAccess.objects.get_or_create(context_key=self.library.key) # Populate it with 2 problems, freezing the date so we can verify created date serializes correctly. - self.created_date = datetime(2023, 4, 5, 6, 7, 8, tzinfo=timezone.utc) - with freeze_time(self.created_date): + created_date = datetime(2023, 4, 5, 6, 7, 8, tzinfo=timezone.utc) + with freeze_time(created_date): self.problem1 = library_api.create_library_block(self.library.key, "problem", "p1") self.problem2 = library_api.create_library_block(self.library.key, "problem", "p2") # Update problem1, freezing the date so we can verify modified date serializes correctly. 
@@ -156,7 +155,7 @@ def setUp(self) -> None:
             "type": "library_block",
             "access_id": lib_access.id,
             "last_published": None,
-            "created": self.created_date.timestamp(),
+            "created": created_date.timestamp(),
             "modified": modified_date.timestamp(),
             "publish_status": "never",
         }
@@ -173,8 +172,8 @@ def setUp(self) -> None:
             "type": "library_block",
             "access_id": lib_access.id,
             "last_published": None,
-            "created": self.created_date.timestamp(),
-            "modified": self.created_date.timestamp(),
+            "created": created_date.timestamp(),
+            "modified": created_date.timestamp(),
             "publish_status": "never",
         }
@@ -190,7 +189,7 @@ def setUp(self) -> None:
 
         # Create a collection:
         self.learning_package = authoring_api.get_learning_package_by_key(self.library.key)
-        with freeze_time(self.created_date):
+        with freeze_time(created_date):
             self.collection = authoring_api.create_collection(
                 learning_package_id=self.learning_package.id,
                 key="MYCOL",
@@ -211,8 +210,8 @@ def setUp(self) -> None:
             "num_children": 0,
             "context_key": "lib:org1:lib",
             "org": "org1",
-            "created": self.created_date.timestamp(),
-            "modified": self.created_date.timestamp(),
+            "created": created_date.timestamp(),
+            "modified": created_date.timestamp(),
             "access_id": lib_access.id,
             "published": {
                 "num_children": 0
@@ -220,8 +219,8 @@ def setUp(self) -> None:
             "breadcrumbs": [{"display_name": "Library"}],
         }
 
-        # Create a container:
-        with freeze_time(self.created_date):
+        # Create a unit:
+        with freeze_time(created_date):
             self.unit = library_api.create_container(
                 library_key=self.library.key,
                 container_type=library_api.ContainerType.Unit,
@@ -230,33 +229,6 @@ def setUp(self) -> None:
                 user_id=None,
             )
         self.unit_key = "lct:org1:lib:unit:unit-1"
-        self.subsection = library_api.create_container(
-            self.library.key,
-            container_type=library_api.ContainerType.Subsection,
-            slug="subsection-1",
-            title="Subsection 1",
-            user_id=None,
-        )
-        library_api.update_container_children(
-            self.subsection.container_key,
-            [self.unit.container_key],
-            None,
-        )
-        self.subsection_key = "lct:org1:lib:subsection:subsection-1"
-        self.section = library_api.create_container(
-            self.library.key,
-            container_type=library_api.ContainerType.Section,
-            slug="section-1",
-            title="Section 1",
-            user_id=None,
-        )
-        self.section_key = "lct:org1:lib:section:section-1"
-        library_api.update_container_children(
-            self.section.container_key,
-            [self.subsection.container_key],
-            None,
-        )
-
         self.unit_dict = {
             "id": "lctorg1libunitunit-1-e4527f7c",
             "block_id": "unit-1",
@@ -266,61 +238,12 @@ def setUp(self) -> None:
             "display_name": "Unit 1",
             # description is not set for containers
             "num_children": 0,
-            "content": {
-                "child_usage_keys": [],
-                "child_display_names": [],
-            },
-            "publish_status": "never",
-            "context_key": "lib:org1:lib",
-            "org": "org1",
-            "created": self.created_date.timestamp(),
-            "modified": self.created_date.timestamp(),
-            "last_published": None,
-            "access_id": lib_access.id,
-            "breadcrumbs": [{"display_name": "Library"}],
-            # "published" is not set since we haven't published it yet
-        }
-        self.subsection_dict = {
-            "id": "lctorg1libsubsectionsubsection-1-cf808309",
-            "block_id": "subsection-1",
-            "block_type": "subsection",
-            "usage_key": self.subsection_key,
-            "type": "library_container",
-            "display_name": "Subsection 1",
-            # description is not set for containers
-            "num_children": 1,
-            "content": {
-                "child_usage_keys": ["lct:org1:lib:unit:unit-1"],
-                "child_display_names": ["Unit 1"],
-            },
+            "content": {"child_usage_keys": []},
             "publish_status": "never",
             "context_key": "lib:org1:lib",
             "org": "org1",
-            "created": self.created_date.timestamp(),
-            "modified": self.created_date.timestamp(),
-            "last_published": None,
-            "access_id": lib_access.id,
-            "breadcrumbs": [{"display_name": "Library"}],
-            # "published" is not set since we haven't published it yet
-        }
-        self.section_dict = {
-            "id": "lctorg1libsectionsection-1-dc4791a4",
-            "block_id": "section-1",
-            "block_type": "section",
-            "usage_key": self.section_key,
-            "type": "library_container",
-            "display_name": "Section 1",
-            # description is not set for containers
-            "num_children": 1,
-            "content": {
-                "child_usage_keys": ["lct:org1:lib:subsection:subsection-1"],
-                "child_display_names": ["Subsection 1"],
-            },
-            "publish_status": "never",
-            "context_key": "lib:org1:lib",
-            "org": "org1",
-            "created": self.created_date.timestamp(),
-            "modified": self.created_date.timestamp(),
+            "created": created_date.timestamp(),
+            "modified": created_date.timestamp(),
             "last_published": None,
             "access_id": lib_access.id,
             "breadcrumbs": [{"display_name": "Library"}],
@@ -328,14 +251,14 @@ def setUp(self) -> None:
         }
 
     @override_settings(MEILISEARCH_ENABLED=False)
-    def test_reindex_meilisearch_disabled(self, mock_meilisearch) -> None:
+    def test_reindex_meilisearch_disabled(self, mock_meilisearch):
         with self.assertRaises(RuntimeError):
             api.rebuild_index()
 
         mock_meilisearch.return_value.swap_indexes.assert_not_called()
 
     @override_settings(MEILISEARCH_ENABLED=True)
-    def test_reindex_meilisearch(self, mock_meilisearch) -> None:
+    def test_reindex_meilisearch(self, mock_meilisearch):
 
         # Add tags field to doc, since reindex calls includes tags
         doc_sequential = copy.deepcopy(self.doc_sequential)
@@ -345,24 +268,14 @@ def test_reindex_meilisearch(self, mock_meilisearch) -> None:
         doc_problem1 = copy.deepcopy(self.doc_problem1)
         doc_problem1["tags"] = {}
         doc_problem1["collections"] = {'display_name': [], 'key': []}
-        doc_problem1["units"] = {'display_name': [], 'key': []}
         doc_problem2 = copy.deepcopy(self.doc_problem2)
         doc_problem2["tags"] = {}
         doc_problem2["collections"] = {'display_name': [], 'key': []}
-        doc_problem2["units"] = {'display_name': [], 'key': []}
         doc_collection = copy.deepcopy(self.collection_dict)
         doc_collection["tags"] = {}
         doc_unit = copy.deepcopy(self.unit_dict)
         doc_unit["tags"] = {}
         doc_unit["collections"] = {'display_name': [], 'key': []}
-        doc_unit["subsections"] = {'display_name': ['Subsection 1'], 'key': ['lct:org1:lib:subsection:subsection-1']}
-        doc_subsection = copy.deepcopy(self.subsection_dict)
-        doc_subsection["tags"] = {}
-        doc_subsection["collections"] = {'display_name': [], 'key': []}
-        doc_subsection["sections"] = {'display_name': ['Section 1'], 'key': ['lct:org1:lib:section:section-1']}
-        doc_section = copy.deepcopy(self.section_dict)
-        doc_section["tags"] = {}
-        doc_section["collections"] = {'display_name': [], 'key': []}
 
         api.rebuild_index()
 
         assert mock_meilisearch.return_value.index.return_value.add_documents.call_count == 4
@@ -371,13 +284,13 @@ def test_reindex_meilisearch(self, mock_meilisearch) -> None:
                 call([doc_sequential, doc_vertical]),
                 call([doc_problem1, doc_problem2]),
                 call([doc_collection]),
-                call([doc_unit, doc_subsection, doc_section]),
+                call([doc_unit]),
             ],
             any_order=True,
         )
 
     @override_settings(MEILISEARCH_ENABLED=True)
-    def test_reindex_meilisearch_incremental(self, mock_meilisearch) -> None:
+    def test_reindex_meilisearch_incremental(self, mock_meilisearch):
 
         # Add tags field to doc, since reindex calls includes tags
         doc_sequential = copy.deepcopy(self.doc_sequential)
@@ -387,24 +300,14 @@ def test_reindex_meilisearch_incremental(self, mock_meilisearch) -> None:
         doc_problem1 = copy.deepcopy(self.doc_problem1)
         doc_problem1["tags"] = {}
         doc_problem1["collections"] = {"display_name": [], "key": []}
-        doc_problem1["units"] = {'display_name': [], 'key': []}
         doc_problem2 = copy.deepcopy(self.doc_problem2)
         doc_problem2["tags"] = {}
         doc_problem2["collections"] = {"display_name": [], "key": []}
-        doc_problem2["units"] = {'display_name': [], 'key': []}
         doc_collection = copy.deepcopy(self.collection_dict)
         doc_collection["tags"] = {}
         doc_unit = copy.deepcopy(self.unit_dict)
         doc_unit["tags"] = {}
         doc_unit["collections"] = {"display_name": [], "key": []}
-        doc_unit["subsections"] = {'display_name': ['Subsection 1'], 'key': ['lct:org1:lib:subsection:subsection-1']}
-        doc_subsection = copy.deepcopy(self.subsection_dict)
-        doc_subsection["tags"] = {}
- doc_subsection["collections"] = {'display_name': [], 'key': []} - doc_subsection["sections"] = {'display_name': ['Section 1'], 'key': ['lct:org1:lib:section:section-1']} - doc_section = copy.deepcopy(self.section_dict) - doc_section["tags"] = {} - doc_section["collections"] = {'display_name': [], 'key': []} api.rebuild_index(incremental=True) assert mock_meilisearch.return_value.index.return_value.add_documents.call_count == 4 @@ -413,7 +316,7 @@ def test_reindex_meilisearch_incremental(self, mock_meilisearch) -> None: call([doc_sequential, doc_vertical]), call([doc_problem1, doc_problem2]), call([doc_collection]), - call([doc_unit, doc_subsection, doc_section]), + call([doc_unit]), ], any_order=True, ) @@ -436,7 +339,7 @@ def simulated_interruption(message): assert mock_meilisearch.return_value.index.return_value.add_documents.call_count == 8 @override_settings(MEILISEARCH_ENABLED=True) - def test_reset_meilisearch_index(self, mock_meilisearch) -> None: + def test_reset_meilisearch_index(self, mock_meilisearch): api.reset_index() mock_meilisearch.return_value.swap_indexes.assert_called_once() mock_meilisearch.return_value.create_index.assert_called_once() @@ -445,7 +348,7 @@ def test_reset_meilisearch_index(self, mock_meilisearch) -> None: mock_meilisearch.return_value.delete_index.call_count = 4 @override_settings(MEILISEARCH_ENABLED=True) - def test_init_meilisearch_index(self, mock_meilisearch) -> None: + def test_init_meilisearch_index(self, mock_meilisearch): # Test index already exists api.init_index() mock_meilisearch.return_value.swap_indexes.assert_not_called() @@ -476,7 +379,7 @@ def test_init_meilisearch_index(self, mock_meilisearch) -> None: "openedx.core.djangoapps.content.search.api.searchable_doc_for_collection", Mock(side_effect=Exception("Failed to generate document")), ) - def test_reindex_meilisearch_collection_error(self, mock_meilisearch) -> None: + def test_reindex_meilisearch_collection_error(self, mock_meilisearch): mock_logger = Mock() 
api.rebuild_index(mock_logger) @@ -492,7 +395,7 @@ def test_reindex_meilisearch_collection_error(self, mock_meilisearch) -> None: "openedx.core.djangoapps.content.search.api.searchable_doc_for_container", Mock(side_effect=Exception("Failed to generate document")), ) - def test_reindex_meilisearch_container_error(self, mock_meilisearch) -> None: + def test_reindex_meilisearch_container_error(self, mock_meilisearch): mock_logger = Mock() api.rebuild_index(mock_logger) @@ -504,7 +407,7 @@ def test_reindex_meilisearch_container_error(self, mock_meilisearch) -> None: ) @override_settings(MEILISEARCH_ENABLED=True) - def test_reindex_meilisearch_library_block_error(self, mock_meilisearch) -> None: + def test_reindex_meilisearch_library_block_error(self, mock_meilisearch): # Add tags field to doc, since reindex calls includes tags doc_sequential = copy.deepcopy(self.doc_sequential) @@ -514,7 +417,6 @@ def test_reindex_meilisearch_library_block_error(self, mock_meilisearch) -> None doc_problem2 = copy.deepcopy(self.doc_problem2) doc_problem2["tags"] = {} doc_problem2["collections"] = {'display_name': [], 'key': []} - doc_problem2["units"] = {'display_name': [], 'key': []} orig_from_component = library_api.LibraryXBlockMetadata.from_component @@ -562,7 +464,7 @@ def mocked_from_component(lib_key, component): False ) @override_settings(MEILISEARCH_ENABLED=True) - def test_index_xblock_metadata(self, recursive, mock_meilisearch) -> None: + def test_index_xblock_metadata(self, recursive, mock_meilisearch): """ Test indexing an XBlock. 
""" @@ -576,13 +478,13 @@ def test_index_xblock_metadata(self, recursive, mock_meilisearch) -> None: mock_meilisearch.return_value.index.return_value.update_documents.assert_called_once_with(expected_docs) @override_settings(MEILISEARCH_ENABLED=True) - def test_no_index_excluded_xblocks(self, mock_meilisearch) -> None: + def test_no_index_excluded_xblocks(self, mock_meilisearch): api.upsert_xblock_index_doc(UsageKey.from_string(self.course_block_key)) mock_meilisearch.return_value.index.return_value.update_document.assert_not_called() @override_settings(MEILISEARCH_ENABLED=True) - def test_index_xblock_tags(self, mock_meilisearch) -> None: + def test_index_xblock_tags(self, mock_meilisearch): """ Test indexing an XBlock with tags. """ @@ -616,22 +518,18 @@ def test_index_xblock_tags(self, mock_meilisearch) -> None: ) @override_settings(MEILISEARCH_ENABLED=True) - def test_delete_index_xblock(self, mock_meilisearch) -> None: + def test_delete_index_xblock(self, mock_meilisearch): """ - Test deleting an XBlock doc and its children docs from the index. + Test deleting an XBlock doc from the index. """ - api.delete_index_doc(self.sequential.usage_key, delete_children=True) + api.delete_index_doc(self.sequential.usage_key) mock_meilisearch.return_value.index.return_value.delete_document.assert_called_once_with( self.doc_sequential['id'] ) - mock_meilisearch.return_value.index.return_value.delete_documents.assert_called_once_with( - filter=f'breadcrumbs.usage_key = "{self.sequential.usage_key}"' - ) - @override_settings(MEILISEARCH_ENABLED=True) - def test_index_library_block_metadata(self, mock_meilisearch) -> None: + def test_index_library_block_metadata(self, mock_meilisearch): """ Test indexing a Library Block. 
""" @@ -640,7 +538,7 @@ def test_index_library_block_metadata(self, mock_meilisearch) -> None: mock_meilisearch.return_value.index.return_value.update_documents.assert_called_once_with([self.doc_problem1]) @override_settings(MEILISEARCH_ENABLED=True) - def test_index_library_block_tags(self, mock_meilisearch) -> None: + def test_index_library_block_tags(self, mock_meilisearch): """ Test indexing an Library Block with tags. """ @@ -675,7 +573,7 @@ def test_index_library_block_tags(self, mock_meilisearch) -> None: ) @override_settings(MEILISEARCH_ENABLED=True) - def test_index_library_block_and_collections(self, mock_meilisearch) -> None: + def test_index_library_block_and_collections(self, mock_meilisearch): """ Test indexing an Library Block and the Collections it's in. """ @@ -815,7 +713,7 @@ def test_index_library_block_and_collections(self, mock_meilisearch) -> None: ) @override_settings(MEILISEARCH_ENABLED=True) - def test_delete_index_library_block(self, mock_meilisearch) -> None: + def test_delete_index_library_block(self, mock_meilisearch): """ Test deleting a Library Block doc from the index. """ @@ -826,18 +724,7 @@ def test_delete_index_library_block(self, mock_meilisearch) -> None: ) @override_settings(MEILISEARCH_ENABLED=True) - def test_delete_docs_with_context_key(self, mock_meilisearch) -> None: - """ - Test deleting a all Block docs from the index using context_key. - """ - api.delete_docs_with_context_key(self.course.id) - - mock_meilisearch.return_value.index.return_value.delete_documents.assert_called_once_with( - filter=f'context_key = "{self.course.id}"' - ) - - @override_settings(MEILISEARCH_ENABLED=True) - def test_index_content_library_metadata(self, mock_meilisearch) -> None: + def test_index_content_library_metadata(self, mock_meilisearch): """ Test indexing a whole content library. 
""" @@ -848,7 +735,7 @@ def test_index_content_library_metadata(self, mock_meilisearch) -> None: ) @override_settings(MEILISEARCH_ENABLED=True) - def test_index_tags_in_collections(self, mock_meilisearch) -> None: + def test_index_tags_in_collections(self, mock_meilisearch): # Tag collection tagging_api.tag_object(str(self.collection_key), self.taxonomyA, ["one", "two"]) tagging_api.tag_object(str(self.collection_key), self.taxonomyB, ["three", "four"]) @@ -879,7 +766,7 @@ def test_index_tags_in_collections(self, mock_meilisearch) -> None: ) @override_settings(MEILISEARCH_ENABLED=True) - def test_delete_collection(self, mock_meilisearch) -> None: + def test_delete_collection(self, mock_meilisearch): """ Test soft-deleting, restoring, and hard-deleting a collection. """ @@ -1002,123 +889,42 @@ def test_delete_collection(self, mock_meilisearch) -> None: any_order=True, ) - @ddt.data( - "unit", - "subsection", - "section", - ) @override_settings(MEILISEARCH_ENABLED=True) - def test_delete_index_container(self, container_type, mock_meilisearch) -> None: + def test_delete_index_container(self, mock_meilisearch): """ Test delete a container index. 
""" - container = getattr(self, container_type) - container_dict = getattr(self, f"{container_type}_dict") - update_doc_calls = [] - - def clear_contents(data: dict): - return { - **data, - "num_children": 0, - "content": { - "child_usage_keys": [], - "child_display_names": [], - }, - } - if container_type == "unit": - update_doc_calls.append(call([clear_contents(self.subsection_dict)])) - elif container_type == "subsection": - update_doc_calls.append(call([clear_contents(self.section_dict)])) - update_doc_calls.append(call([{ - 'id': self.unit_dict['id'], - 'subsections': {'display_name': [], 'key': []}, - }])) - elif container_type == "section": - update_doc_calls.append(call([{ - 'id': self.subsection_dict['id'], - 'sections': {'display_name': [], 'key': []}, - }])) - - library_api.delete_container(container.container_key) + library_api.delete_container(self.unit.container_key) mock_meilisearch.return_value.index.return_value.delete_document.assert_called_once_with( - container_dict["id"], + self.unit_dict["id"], ) - # Parent containers index data is updated. - if update_doc_calls: - mock_meilisearch.return_value.index.return_value.update_documents.assert_has_calls( - update_doc_calls, - any_order=True, - ) - # Restore - library_api.restore_container(container.container_key) - if container_type == "unit": - update_doc_calls.append(call([self.subsection_dict])) - elif container_type == "subsection": - update_doc_calls.append(call([self.section_dict])) - update_doc_calls.append(call([{ - 'id': self.unit_dict['id'], - 'subsections': { - 'display_name': [self.subsection_dict['display_name']], - 'key': [self.subsection_key], - }, - }])) - elif container_type == "section": - update_doc_calls.append(call([{ - 'id': self.subsection_dict['id'], - 'sections': { - 'display_name': [self.section_dict['display_name']], - 'key': [self.section_key], - }, - }])) - # Parent containers index data is updated on restore again. 
- if update_doc_calls: - mock_meilisearch.return_value.index.return_value.update_documents.assert_has_calls( - update_doc_calls, - any_order=True, - ) - - @ddt.data( - "unit", - "subsection", - "section", - ) @override_settings(MEILISEARCH_ENABLED=True) - def test_index_library_container_metadata(self, container_type, mock_meilisearch) -> None: + def test_index_library_container_metadata(self, mock_meilisearch): """ Test indexing a Library Container. """ - container = getattr(self, container_type) - container_dict = getattr(self, f"{container_type}_dict") - api.upsert_library_container_index_doc(container.container_key) + api.upsert_library_container_index_doc(self.unit.container_key) - mock_meilisearch.return_value.index.return_value.update_documents.assert_called_once_with([container_dict]) + mock_meilisearch.return_value.index.return_value.update_documents.assert_called_once_with([self.unit_dict]) - @ddt.data( - ("unit", "lctorg1libunitunit-1-e4527f7c"), - ("subsection", "lctorg1libsubsectionsubsection-1-cf808309"), - ("section", "lctorg1libsectionsection-1-dc4791a4"), - ) - @ddt.unpack @override_settings(MEILISEARCH_ENABLED=True) - def test_index_tags_in_containers(self, container_type, container_id, mock_meilisearch) -> None: - container_key = getattr(self, f"{container_type}_key") - - # Tag container - tagging_api.tag_object(container_key, self.taxonomyA, ["one", "two"]) - tagging_api.tag_object(container_key, self.taxonomyB, ["three", "four"]) + def test_index_tags_in_containers(self, mock_meilisearch): + # Tag collection + tagging_api.tag_object(self.unit_key, self.taxonomyA, ["one", "two"]) + tagging_api.tag_object(self.unit_key, self.taxonomyB, ["three", "four"]) # Build expected docs with tags at each stage doc_unit_with_tags1 = { - "id": container_id, + "id": "lctorg1libunitunit-1-e4527f7c", "tags": { 'taxonomy': ['A'], 'level0': ['A > one', 'A > two'] } } doc_unit_with_tags2 = { - "id": container_id, + "id": "lctorg1libunitunit-1-e4527f7c", "tags": { 
                 'taxonomy': ['A', 'B'],
                 'level0': ['A > one', 'A > two', 'B > four', 'B > three']
@@ -1133,103 +939,3 @@ def test_index_tags_in_containers(self, container_type, container_id, mock_meili
             ],
             any_order=True,
         )
-
-    @override_settings(MEILISEARCH_ENABLED=True)
-    def test_block_in_units(self, mock_meilisearch) -> None:
-        with freeze_time(self.created_date):
-            library_api.update_container_children(
-                LibraryContainerLocator.from_string(self.unit_key),
-                [self.problem1.usage_key],
-                None,
-            )
-
-        doc_block_with_units = {
-            "id": self.doc_problem1["id"],
-            "units": {
-                "display_name": [self.unit.display_name],
-                "key": [self.unit_key],
-            },
-        }
-        new_unit_dict = {
-            **self.unit_dict,
-            "num_children": 1,
-            'content': {
-                'child_usage_keys': [self.doc_problem1["usage_key"]],
-                'child_display_names': [self.doc_problem1["display_name"]],
-            }
-        }
-
-        assert mock_meilisearch.return_value.index.return_value.update_documents.call_count == 2
-        mock_meilisearch.return_value.index.return_value.update_documents.assert_has_calls(
-            [
-                call([doc_block_with_units]),
-                call([new_unit_dict]),
-            ],
-            any_order=True,
-        )
-
-    @override_settings(MEILISEARCH_ENABLED=True)
-    def test_units_in_subsection(self, mock_meilisearch) -> None:
-        with freeze_time(self.created_date):
-            library_api.update_container_children(
-                LibraryContainerLocator.from_string(self.subsection_key),
-                [LibraryContainerLocator.from_string(self.unit_key)],
-                None,
-            )
-
-        doc_block_with_subsections = {
-            "id": self.unit_dict["id"],
-            "subsections": {
-                "display_name": [self.subsection.display_name],
-                "key": [self.subsection_key],
-            },
-        }
-        new_subsection_dict = {
-            **self.subsection_dict,
-            "num_children": 1,
-            'content': {
-                'child_usage_keys': [self.unit_key],
-                'child_display_names': [self.unit.display_name]
-            }
-        }
-        assert mock_meilisearch.return_value.index.return_value.update_documents.call_count == 2
-        mock_meilisearch.return_value.index.return_value.update_documents.assert_has_calls(
-            [
-                call([doc_block_with_subsections]),
-                call([new_subsection_dict]),
-            ],
-            any_order=True,
-        )
-
-    @override_settings(MEILISEARCH_ENABLED=True)
-    def test_section_in_usbsections(self, mock_meilisearch) -> None:
-        with freeze_time(self.created_date):
-            library_api.update_container_children(
-                LibraryContainerLocator.from_string(self.section_key),
-                [LibraryContainerLocator.from_string(self.subsection_key)],
-                None,
-            )
-
-        doc_block_with_sections = {
-            "id": self.subsection_dict["id"],
-            "sections": {
-                "display_name": [self.section.display_name],
-                "key": [self.section_key],
-            },
-        }
-        new_section_dict = {
-            **self.section_dict,
-            "num_children": 1,
-            'content': {
-                'child_usage_keys': [self.subsection_key],
-                'child_display_names': [self.subsection.display_name],
-            }
-        }
-        assert mock_meilisearch.return_value.index.return_value.update_documents.call_count == 2
-        mock_meilisearch.return_value.index.return_value.update_documents.assert_has_calls(
-            [
-                call([doc_block_with_sections]),
-                call([new_section_dict]),
-            ],
-            any_order=True,
-        )
diff --git a/openedx/core/djangoapps/content/search/tests/test_documents.py b/openedx/core/djangoapps/content/search/tests/test_documents.py
index 5a8a2231dcdd..74772c89c017 100644
--- a/openedx/core/djangoapps/content/search/tests/test_documents.py
+++ b/openedx/core/djangoapps/content/search/tests/test_documents.py
@@ -1,7 +1,6 @@
 """
 Tests for the Studio content search documents (what gets stored in the index)
 """
-import ddt
 from dataclasses import replace
 from datetime import datetime, timezone
 
@@ -26,11 +25,13 @@
         searchable_doc_for_course_block,
         searchable_doc_for_library_block,
         searchable_doc_tags,
+        searchable_doc_tags_for_collection,
     )
     from ..models import SearchAccess
 except RuntimeError:
     searchable_doc_for_course_block = lambda x: x
     searchable_doc_tags = lambda x: x
+    searchable_doc_tags_for_collection = lambda x: x
     searchable_doc_for_collection = lambda x: x
     searchable_doc_for_container = lambda x: x
searchable_doc_for_library_block = lambda x: x @@ -41,7 +42,6 @@ @skip_unless_cms -@ddt.ddt class StudioDocumentsTest(SharedModuleStoreTestCase): """ Tests for the Studio content search documents (what gets stored in the @@ -100,26 +100,6 @@ def setUpClass(cls): cls.container_key = LibraryContainerLocator.from_string( "lct:edX:2012_Fall:unit:unit1", ) - cls.subsection = library_api.create_container( - cls.library.key, - container_type=library_api.ContainerType.Subsection, - slug="subsection1", - title="A Subsection in the Search Index", - user_id=None, - ) - cls.subsection_key = LibraryContainerLocator.from_string( - "lct:edX:2012_Fall:subsection:subsection1", - ) - cls.section = library_api.create_container( - cls.library.key, - container_type=library_api.ContainerType.Section, - slug="section1", - title="A Section in the Search Index", - user_id=None, - ) - cls.section_key = LibraryContainerLocator.from_string( - "lct:edX:2012_Fall:section:section1", - ) # Add the problem block to the collection library_api.update_library_collection_items( @@ -150,8 +130,6 @@ def setUpClass(cls): tagging_api.tag_object(str(cls.library_block.usage_key), cls.difficulty_tags, tags=["Normal"]) tagging_api.tag_object(str(cls.collection_key), cls.difficulty_tags, tags=["Normal"]) tagging_api.tag_object(str(cls.container_key), cls.difficulty_tags, tags=["Normal"]) - tagging_api.tag_object(str(cls.subsection_key), cls.difficulty_tags, tags=["Normal"]) - tagging_api.tag_object(str(cls.section_key), cls.difficulty_tags, tags=["Normal"]) @property def toy_course_access_id(self): @@ -482,7 +460,7 @@ def test_html_published_library_block(self): def test_collection_with_library(self): doc = searchable_doc_for_collection(self.collection_key) - doc.update(searchable_doc_tags(self.collection_key)) + doc.update(searchable_doc_tags_for_collection(self.collection_key)) assert doc == { "id": "lib-collectionedx2012_falltoy_collection-d1d907a4", @@ -511,7 +489,7 @@ def 
test_collection_with_published_library(self): library_api.publish_changes(self.library.key) doc = searchable_doc_for_collection(self.collection_key) - doc.update(searchable_doc_tags(self.collection_key)) + doc.update(searchable_doc_tags_for_collection(self.collection_key)) assert doc == { "id": "lib-collectionedx2012_falltoy_collection-d1d907a4", @@ -536,33 +514,25 @@ def test_collection_with_published_library(self): } } - @ddt.data( - ("container", "unit1", "unit", "edd13a0c"), - ("subsection", "subsection1", "subsection", "c6c172be"), - ("section", "section1", "section", "79ee8fa2"), - ) - @ddt.unpack - def test_draft_container(self, container_name, container_slug, container_type, doc_id): + def test_draft_container(self): """ Test creating a search document for a draft-only container """ - container = getattr(self, container_name) - doc = searchable_doc_for_container(container.container_key) - doc.update(searchable_doc_tags(container.container_key)) + doc = searchable_doc_for_container(self.container.container_key) + doc.update(searchable_doc_tags(self.container.container_key)) assert doc == { - "id": f"lctedx2012_fall{container_type}{container_slug}-{doc_id}", - "block_id": container_slug, - "block_type": container_type, - "usage_key": f"lct:edX:2012_Fall:{container_type}:{container_slug}", + "id": "lctedx2012_fallunitunit1-edd13a0c", + "block_id": "unit1", + "block_type": "unit", + "usage_key": "lct:edX:2012_Fall:unit:unit1", "type": "library_container", "org": "edX", - "display_name": container.display_name, + "display_name": "A Unit in the Search Index", # description is not set for containers "num_children": 0, "content": { "child_usage_keys": [], - "child_display_names": [], }, "publish_status": "never", "context_key": "lib:edX:2012_Fall", @@ -608,9 +578,6 @@ def test_published_container(self): "child_usage_keys": [ "lb:edX:2012_Fall:html:text2", ], - "child_display_names": [ - "Text", - ], }, "publish_status": "published", "context_key": 
"lib:edX:2012_Fall", @@ -630,9 +597,6 @@ def test_published_container(self): "child_usage_keys": [ "lb:edX:2012_Fall:html:text2", ], - "child_display_names": [ - "Text", - ], }, }, } @@ -681,10 +645,6 @@ def test_published_container_with_changes(self): "lb:edX:2012_Fall:html:text2", "lb:edX:2012_Fall:html:text3", ], - "child_display_names": [ - "Text", - "Text", - ], }, "publish_status": "modified", "context_key": "lib:edX:2012_Fall", @@ -704,9 +664,6 @@ def test_published_container_with_changes(self): "child_usage_keys": [ "lb:edX:2012_Fall:html:text2", ], - "child_display_names": [ - "Text", - ], }, }, } diff --git a/openedx/core/djangoapps/content/search/tests/test_handlers.py b/openedx/core/djangoapps/content/search/tests/test_handlers.py index 7ccbec687e98..33d0e4db8378 100644 --- a/openedx/core/djangoapps/content/search/tests/test_handlers.py +++ b/openedx/core/djangoapps/content/search/tests/test_handlers.py @@ -11,11 +11,7 @@ from common.djangoapps.student.tests.factories import UserFactory from openedx.core.djangoapps.content_libraries import api as library_api from openedx.core.djangolib.testing.utils import skip_unless_cms -from xmodule.modulestore.tests.django_utils import ( - TEST_DATA_SPLIT_MODULESTORE, - ModuleStoreTestCase, - ImmediateOnCommitMixin, -) +from xmodule.modulestore.tests.django_utils import TEST_DATA_SPLIT_MODULESTORE, ModuleStoreTestCase try: @@ -30,7 +26,7 @@ @patch("openedx.core.djangoapps.content.search.api.MeilisearchClient") @override_settings(MEILISEARCH_ENABLED=True) @skip_unless_cms -class TestUpdateIndexHandlers(ImmediateOnCommitMixin, ModuleStoreTestCase, LiveServerTestCase): +class TestUpdateIndexHandlers(ModuleStoreTestCase, LiveServerTestCase): """ Test that the search index is updated when XBlocks and Library Blocks are modified """ @@ -84,9 +80,7 @@ def test_create_delete_xblock(self, meilisearch_client): "access_id": course_access.id, "modified": created_date.timestamp(), } - 
         meilisearch_client.return_value.index.return_value.update_documents.assert_called_with([doc_sequential])
-
         with freeze_time(created_date):
             vertical = self.store.create_child(self.user_id, sequential.location, "vertical", "test_vertical")
         doc_vertical = {
@@ -125,7 +119,6 @@ def test_create_delete_xblock(self, meilisearch_client):
         doc_sequential["display_name"] = "Updated Sequential"
         doc_vertical["breadcrumbs"][1]["display_name"] = "Updated Sequential"
         doc_sequential["modified"] = modified_date.timestamp()
-
         meilisearch_client.return_value.index.return_value.update_documents.assert_called_with([
             doc_sequential,
             doc_vertical,
diff --git a/openedx/core/djangoapps/content_libraries/api/blocks.py b/openedx/core/djangoapps/content_libraries/api/blocks.py
index be5b296147fd..d693ff30d7e5 100644
--- a/openedx/core/djangoapps/content_libraries/api/blocks.py
+++ b/openedx/core/djangoapps/content_libraries/api/blocks.py
@@ -19,8 +19,7 @@
 from django.utils.text import slugify
 from django.utils.translation import gettext as _
 from lxml import etree
-from opaque_keys import InvalidKeyError
-from opaque_keys.edx.locator import LibraryContainerLocator, LibraryLocatorV2, LibraryUsageLocatorV2
+from opaque_keys.edx.locator import LibraryLocatorV2, LibraryUsageLocatorV2
 from opaque_keys.edx.keys import LearningContextKey, UsageKeyV2
 from openedx_events.content_authoring.data import (
     ContentObjectChangedData,
@@ -37,10 +36,7 @@
     LIBRARY_CONTAINER_UPDATED
 )
 from openedx_learning.api import authoring as authoring_api
-from openedx_learning.api.authoring_models import (
-    Component, ComponentVersion, LearningPackage, MediaType,
-    Container, Collection
-)
+from openedx_learning.api.authoring_models import Component, ComponentVersion, LearningPackage, MediaType
 from xblock.core import XBlock
 
 from openedx.core.djangoapps.xblock.api import (
@@ -62,7 +58,7 @@
 from .containers import (
     create_container,
     get_container,
-    get_containers_contains_item,
+    get_containers_contains_component,
     update_container_children,
     ContainerMetadata,
     ContainerType,
@@ -83,8 +79,6 @@
 __all__ = [
     # API methods
     "get_library_components",
-    "get_library_containers",
-    "get_library_collections",
     "get_library_block",
     "set_library_block_olx",
     "get_component_from_usage_key",
@@ -126,33 +120,6 @@ def get_library_components(
     return components
 
 
-def get_library_containers(library_key: LibraryLocatorV2) -> QuerySet[Container]:
-    """
-    Get all containers in the given content library.
-    """
-    lib = ContentLibrary.objects.get_by_key(library_key)  # type: ignore[attr-defined]
-    learning_package = lib.learning_package
-    assert learning_package is not None
-    containers: QuerySet[Container] = authoring_api.get_containers(
-        learning_package.id
-    )
-
-    return containers
-
-
-def get_library_collections(library_key: LibraryLocatorV2) -> QuerySet[Collection]:
-    """
-    Get all collections in the given content library.
-    """
-    lib = ContentLibrary.objects.get_by_key(library_key)  # type: ignore[attr-defined]
-    learning_package = lib.learning_package
-    assert learning_package is not None
-    collections = authoring_api.get_collections(
-        learning_package.id
-    )
-    return collections
-
-
 def get_library_block(usage_key: LibraryUsageLocatorV2, include_collections=False) -> LibraryXBlockMetadata:
     """
     Get metadata about (the draft version of) one specific XBlock in a library.
@@ -253,8 +220,6 @@ def set_library_block_olx(usage_key: LibraryUsageLocatorV2, new_olx_str: str) ->
         created=now,
     )
 
-    # .. event_implemented_name: LIBRARY_BLOCK_UPDATED
-    # .. event_type: org.openedx.content_authoring.library_block.updated.v1
     LIBRARY_BLOCK_UPDATED.send_event(
         library_block=LibraryBlockData(
             library_key=usage_key.context_key,
@@ -264,10 +229,8 @@ def set_library_block_olx(usage_key: LibraryUsageLocatorV2, new_olx_str: str) ->
     # For each container, trigger LIBRARY_CONTAINER_UPDATED signal and set background=True to trigger
     # container indexing asynchronously.
-    affected_containers = get_containers_contains_item(usage_key)
+    affected_containers = get_containers_contains_component(usage_key)
     for container in affected_containers:
-        # .. event_implemented_name: LIBRARY_CONTAINER_UPDATED
-        # .. event_type: org.openedx.content_authoring.content_library.container.updated.v1
         LIBRARY_CONTAINER_UPDATED.send_event(
             library_container=LibraryContainerData(
                 container_key=container.container_key,
@@ -346,9 +309,6 @@ def create_library_block(
     _create_component_for_block(content_library, usage_key, user_id, can_stand_alone)
 
     # Now return the metadata about the new block:
-
-    # .. event_implemented_name: LIBRARY_BLOCK_CREATED
-    # .. event_type: org.openedx.content_authoring.library_block.created.v1
     LIBRARY_BLOCK_CREATED.send_event(
         library_block=LibraryBlockData(
             library_key=content_library.library_key,
@@ -489,8 +449,6 @@ def _import_staged_block(
     )
 
     # Emit library block created event
-    # .. event_implemented_name: LIBRARY_BLOCK_CREATED
-    # .. event_type: org.openedx.content_authoring.library_block.created.v1
     LIBRARY_BLOCK_CREATED.send_event(
         library_block=LibraryBlockData(
             library_key=content_library.library_key,
@@ -502,36 +460,22 @@ def _import_staged_block(
     return get_library_block(usage_key)
 
 
-def _is_container(block_type: str) -> bool:
-    """
-    Return True if the block type is a container.
-    """
-    return block_type in ["vertical", "sequential", "chapter"]
-
-
 def _import_staged_block_as_container(
+    olx_str: str,
     library_key: LibraryLocatorV2,
     source_context_key: LearningContextKey,
     user,
     staged_content_id: int,
    staged_content_files: list[StagedContentFileData],
     now: datetime,
-    *,
-    olx_str: str | None = None,
-    olx_node: etree.Element | None = None,
-    copied_from_map: dict[str, LibraryUsageLocatorV2 | LibraryContainerLocator] | None = None,
 ) -> ContainerMetadata:
     """
     Convert the given XBlock (e.g. "vertical") to a Container (e.g. Unit) and import it into the
     library, along with all its child XBlocks.
     """
-    if olx_node is None:
-        if olx_str is None:
-            raise ValueError("Either olx_str or olx_node must be provided")
-        olx_node = etree.fromstring(olx_str)
-
-    assert olx_node is not None  # This assert to make sure olx_node has the correct type
-
+    olx_node = etree.fromstring(olx_str)
+    if olx_node.tag != "vertical":
+        raise ValueError("This method is only designed to work with XBlocks (units).")
     # The olx_str looks like this: ...[XML]......[XML]......
     # Ideally we could split it up and preserve the strings, but that is difficult to do correctly, so we'll split
@@ -540,91 +484,34 @@ def _import_staged_block_as_container(
     title = _title_from_olx_node(olx_node)
 
-    container = create_container(
-        library_key=library_key,
-        container_type=ContainerType.from_source_olx_tag(olx_node.tag),
-        slug=None,  # auto-generate slug from title
-        title=title,
-        user_id=user.id,
-    )
-
-    # Keep track of which blocks were copied from the library, so we don't duplicate them
-    if copied_from_map is None:
-        copied_from_map = {}
-
-    # Handle children
-    new_child_keys: list[LibraryUsageLocatorV2 | LibraryContainerLocator] = []
-    for child_node in olx_node:
-        child_is_container = _is_container(child_node.tag)
-        copied_from_block = child_node.attrib.get('copied_from_block', None)
-        if copied_from_block:
-            # Get the key of the child block
+    # Start an atomic section so the whole paste succeeds or fails together:
+    with transaction.atomic():
+        container = create_container(
+            library_key=library_key,
+            container_type=ContainerType.Unit,
+            slug=None,  # auto-generate slug from title
+            title=title,
+            user_id=user.id,
+        )
+        new_child_keys: list[UsageKeyV2] = []
+        for child_node in olx_node:
             try:
-                child_key: LibraryContainerLocator | LibraryUsageLocatorV2
-                if child_is_container:
-                    child_key = LibraryContainerLocator.from_string(copied_from_block)
-                else:
-                    child_key = LibraryUsageLocatorV2.from_string(copied_from_block)
-
-                if child_key.context_key == library_key:
-                    # This is a block that was copied from the library, so we just link it to the container
-                    new_child_keys.append(child_key)
-                    continue
-
-            except InvalidKeyError:
-                # This is a XBlock copied from a course, so we need to create a new copy of it.
-                pass
-
-        # This block is not copied from a course, or it was copied from a different library.
-        # We need to create a new copy of it.
-        if child_is_container:
-            if copied_from_block in copied_from_map:
-                # This container was already copied from the library, so we just link it to the container
-                new_child_keys.append(copied_from_map[copied_from_block])
-                continue
-
-            child_container = _import_staged_block_as_container(
-                library_key=library_key,
-                source_context_key=source_context_key,
-                user=user,
-                staged_content_id=staged_content_id,
-                staged_content_files=staged_content_files,
-                now=now,
-                olx_node=child_node,
-                copied_from_map=copied_from_map,
-            )
-            if copied_from_block:
-                copied_from_map[copied_from_block] = child_container.container_key
-            new_child_keys.append(child_container.container_key)
-            continue
-
-        # This is not a container, so we import it as a standalone block
-        try:
-            if copied_from_block in copied_from_map:
-                # This block was already copied from the library, so we just link it to the container
-                new_child_keys.append(copied_from_map[copied_from_block])
-                continue
-
-            child_metadata = _import_staged_block(
-                block_type=child_node.tag,
-                olx_str=etree.tostring(child_node, encoding='unicode'),
-                library_key=library_key,
-                source_context_key=source_context_key,
-                user=user,
-                staged_content_id=staged_content_id,
-                staged_content_files=staged_content_files,
-                now=now,
-            )
-            if copied_from_block:
-                copied_from_map[copied_from_block] = child_metadata.usage_key
-            new_child_keys.append(child_metadata.usage_key)
-        except IncompatibleTypesError:
-            continue  # Skip blocks that won't work in libraries
-
-    update_container_children(container.container_key, new_child_keys, user_id=user.id)  # type: ignore[arg-type]
-    # Re-fetch the container because the 'last_draft_created' will have changed when we added children
-    container = get_container(container.container_key)
-
+                child_metadata = _import_staged_block(
+                    block_type=child_node.tag,
+                    olx_str=etree.tostring(child_node, encoding='unicode'),
+                    library_key=library_key,
+                    source_context_key=source_context_key,
+                    user=user,
+                    staged_content_id=staged_content_id,
+                    staged_content_files=staged_content_files,
+                    now=now,
+                )
+                new_child_keys.append(child_metadata.usage_key)
+            except IncompatibleTypesError:
+                continue  # Skip blocks that won't work in libraries
+        update_container_children(container.container_key, new_child_keys, user_id=user.id)
+        # Re-fetch the container because the 'last_draft_created' will have changed when we added children
+        container = get_container(container.container_key)
     return container
 
 
@@ -642,7 +529,7 @@ def import_staged_content_from_user_clipboard(library_key: LibraryLocatorV2, use
         raise ValidationError("The user's clipboard is empty")
 
     staged_content_id = user_clipboard.content.id
-    source_context_key = user_clipboard.source_context_key
+    source_context_key: LearningContextKey = user_clipboard.source_context_key
 
     staged_content_files = content_staging_api.get_staged_content_static_files(staged_content_id)
 
@@ -652,19 +539,17 @@ def import_staged_content_from_user_clipboard(library_key: LibraryLocatorV2, use
 
     now = datetime.now(tz=timezone.utc)
 
-    if _is_container(user_clipboard.content.block_type):
-        # This is a container and we can import it as such.
-        # Start an atomic section so the whole paste succeeds or fails together:
-        with transaction.atomic():
-            return _import_staged_block_as_container(
-                library_key,
-                source_context_key,
-                user,
-                staged_content_id,
-                staged_content_files,
-                now,
-                olx_str=olx_str,
-            )
+    if user_clipboard.content.block_type == "vertical":
+        # This is a Unit. To import it into a library, we have to create it as a container.
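The `_import_staged_block_as_container` function above parses the pasted OLX and walks its direct children. A self-contained sketch of that parse-and-walk pattern (using the stdlib `ElementTree` in place of `lxml`, with an invented OLX snippet):

```python
import xml.etree.ElementTree as ET

# Invented OLX snippet, for illustration only.
olx_str = '<vertical display_name="My Unit"><html/><problem/><video/></vertical>'
olx_node = ET.fromstring(olx_str)

# Mirror the guard in the diff: only <vertical> (Unit) roots are accepted.
if olx_node.tag != "vertical":
    raise ValueError("This method is only designed to work with XBlocks (units).")

# Iterating the root element yields its direct children, one per child XBlock.
child_types = [child.tag for child in olx_node]
print(child_types)  # ['html', 'problem', 'video']
```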
+        return _import_staged_block_as_container(
+            olx_str,
+            library_key,
+            source_context_key,
+            user,
+            staged_content_id,
+            staged_content_files,
+            now,
+        )
     else:
         return _import_staged_block(
             user_clipboard.content.block_type,
@@ -700,12 +585,10 @@ def delete_library_block(
     component = get_component_from_usage_key(usage_key)
     library_key = usage_key.context_key
     affected_collections = authoring_api.get_entity_collections(component.learning_package_id, component.key)
-    affected_containers = get_containers_contains_item(usage_key)
+    affected_containers = get_containers_contains_component(usage_key)
 
     authoring_api.soft_delete_draft(component.pk, deleted_by=user_id)
 
-    # .. event_implemented_name: LIBRARY_BLOCK_DELETED
-    # .. event_type: org.openedx.content_authoring.library_block.deleted.v1
     LIBRARY_BLOCK_DELETED.send_event(
         library_block=LibraryBlockData(
             library_key=library_key,
@@ -718,8 +601,6 @@ def delete_library_block(
     #
     # To delete the component on collections
     for collection in affected_collections:
-        # .. event_implemented_name: LIBRARY_COLLECTION_UPDATED
-        # .. event_type: org.openedx.content_authoring.content_library.collection.updated.v1
         LIBRARY_COLLECTION_UPDATED.send_event(
             library_collection=LibraryCollectionData(
                 collection_key=library_collection_locator(
@@ -735,8 +616,6 @@ def delete_library_block(
     #
     # To update the components count in containers
     for container in affected_containers:
-        # .. event_implemented_name: LIBRARY_CONTAINER_UPDATED
-        # .. event_type: org.openedx.content_authoring.content_library.container.updated.v1
         LIBRARY_CONTAINER_UPDATED.send_event(
             library_container=LibraryContainerData(
                 container_key=container.container_key,
@@ -760,8 +639,6 @@ def restore_library_block(usage_key: LibraryUsageLocatorV2, user_id: int | None
         set_by=user_id,
     )
 
-    # .. event_implemented_name: LIBRARY_BLOCK_CREATED
-    # .. event_type: org.openedx.content_authoring.library_block.created.v1
     LIBRARY_BLOCK_CREATED.send_event(
         library_block=LibraryBlockData(
             library_key=library_key,
@@ -770,12 +647,10 @@ def restore_library_block(usage_key: LibraryUsageLocatorV2, user_id: int | None
     )
 
     # Add tags and collections back to index
-    # .. event_implemented_name: CONTENT_OBJECT_ASSOCIATIONS_CHANGED
-    # .. event_type: org.openedx.content_authoring.content.object.associations.changed.v1
     CONTENT_OBJECT_ASSOCIATIONS_CHANGED.send_event(
         content_object=ContentObjectChangedData(
             object_id=str(usage_key),
-            changes=["collections", "tags", "units"],
+            changes=["collections", "tags"],
         ),
     )
 
@@ -784,8 +659,6 @@ def restore_library_block(usage_key: LibraryUsageLocatorV2, user_id: int | None
     #
     # To restore the component in the collections
     for collection in affected_collections:
-        # .. event_implemented_name: LIBRARY_COLLECTION_UPDATED
-        # .. event_type: org.openedx.content_authoring.content_library.collection.updated.v1
         LIBRARY_COLLECTION_UPDATED.send_event(
             library_collection=LibraryCollectionData(
                 collection_key=library_collection_locator(
@@ -800,10 +673,8 @@ def restore_library_block(usage_key: LibraryUsageLocatorV2, user_id: int | None
     # container indexing asynchronously.
     #
     # To update the components count in containers
-    affected_containers = get_containers_contains_item(usage_key)
+    affected_containers = get_containers_contains_component(usage_key)
     for container in affected_containers:
-        # .. event_implemented_name: LIBRARY_CONTAINER_UPDATED
-        # .. event_type: org.openedx.content_authoring.content_library.container.updated.v1
         LIBRARY_CONTAINER_UPDATED.send_event(
             library_container=LibraryContainerData(
                 container_key=container.container_key,
@@ -897,8 +768,6 @@ def add_library_block_static_asset_file(
         created_by=user.id if user else None,
     )
     transaction.on_commit(
-        # .. event_implemented_name: LIBRARY_BLOCK_UPDATED
-        # .. event_type: org.openedx.content_authoring.library_block.updated.v1
         lambda: LIBRARY_BLOCK_UPDATED.send_event(
             library_block=LibraryBlockData(
                 library_key=usage_key.context_key,
@@ -945,8 +814,6 @@ def delete_library_block_static_asset_file(usage_key, file_path, user=None):
         created_by=user.id if user else None,
     )
     transaction.on_commit(
-        # .. event_implemented_name: LIBRARY_BLOCK_UPDATED
-        # .. event_type: org.openedx.content_authoring.library_block.updated.v1
         lambda: LIBRARY_BLOCK_UPDATED.send_event(
             library_block=LibraryBlockData(
                 library_key=usage_key.context_key,
@@ -956,7 +823,7 @@ def delete_library_block_static_asset_file(usage_key, file_path, user=None):
     )
 
 
-def publish_component_changes(usage_key: LibraryUsageLocatorV2, user_id: int):
+def publish_component_changes(usage_key: LibraryUsageLocatorV2, user: UserType):
     """
     Publish all pending changes in a single component.
     """
@@ -969,7 +836,7 @@ def publish_component_changes(usage_key: LibraryUsageLocatorV2, user_id: int):
     drafts_to_publish = authoring_api.get_all_drafts(learning_package.id).filter(entity__key=component.key)
     # Publish the component and update anything that needs to be updated (e.g. search index):
     publish_log = authoring_api.publish_from_drafts(
-        learning_package.id, draft_qset=drafts_to_publish, published_by=user_id,
+        learning_package.id, draft_qset=drafts_to_publish, published_by=user.id,
     )
     # Since this is a single component, it should be safe to process synchronously and in-process:
     tasks.send_events_after_publish(publish_log.pk, str(library_key))
diff --git a/openedx/core/djangoapps/content_libraries/api/containers.py b/openedx/core/djangoapps/content_libraries/api/containers.py
index 5cca0b3a994b..d97a6100a648 100644
--- a/openedx/core/djangoapps/content_libraries/api/containers.py
+++ b/openedx/core/djangoapps/content_libraries/api/containers.py
@@ -3,12 +3,14 @@
 """
 from __future__ import annotations
 
+from dataclasses import dataclass
 from datetime import datetime, timezone
+from enum import Enum
 import logging
 from uuid import uuid4
-import typing
 
 from django.utils.text import slugify
+from opaque_keys.edx.keys import UsageKeyV2
 from opaque_keys.edx.locator import LibraryContainerLocator, LibraryLocatorV2, LibraryUsageLocatorV2
 from openedx_events.content_authoring.data import (
     ContentObjectChangedData,
@@ -23,58 +25,181 @@
     LIBRARY_CONTAINER_UPDATED,
 )
 from openedx_learning.api import authoring as authoring_api
-from openedx_learning.api.authoring_models import Container, ContainerVersion, Component
+from openedx_learning.api.authoring_models import Container
 
 from openedx.core.djangoapps.content_libraries.api.collections import library_collection_locator
 from openedx.core.djangoapps.xblock.api import get_component_from_usage_key
-from .. import tasks
 from ..models import ContentLibrary
+from .exceptions import ContentLibraryContainerNotFound
+from .libraries import PublishableItem
 from .block_metadata import LibraryXBlockMetadata
-from .container_metadata import (
-    ContainerHierarchy,
-    ContainerMetadata,
-    ContainerType,
-    library_container_locator,
-    get_container_from_key,
-)
-from .serializers import ContainerSerializer
-
-
-if typing.TYPE_CHECKING:
-    from openedx.core.djangoapps.content_staging.api import UserClipboardData
-
+from .. import tasks
 
-# 🛑 UNSTABLE: All APIs related to containers are unstable until we've figured
-# out our approach to dynamic content (randomized, A/B tests, etc.)
+# The public API is only the following symbols:
 __all__ = [
+    # Models
+    "ContainerMetadata",
+    "ContainerType",
+    # API methods
     "get_container",
     "create_container",
     "get_container_children",
     "get_container_children_count",
+    "library_container_locator",
     "update_container",
     "delete_container",
     "restore_container",
    "update_container_children",
-    "get_containers_contains_item",
+    "get_containers_contains_component",
     "publish_container_changes",
-    "get_library_object_hierarchy",
-    "copy_container",
-    "library_container_locator",
 ]
 
 log = logging.getLogger(__name__)
 
 
+class ContainerType(Enum):
+    """
+    The container types supported by content_libraries, and logic to map them to OLX.
+    """
+    Unit = "unit"
+    Subsection = "subsection"
+    Section = "section"
+
+    @property
+    def olx_tag(self) -> str:
+        """
+        Canonical XML tag to use when representing this container as OLX.
+
+        For example, Units are encoded as <vertical>...</vertical>.
+
+        These tag names are historical. We keep them around for the backwards compatibility of OLX
+        and for easier interaction with legacy modulestore-powered structural XBlocks
+        (e.g., copy-paste of Units between courses and V2 libraries).
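The new `ContainerType` enum maps container types to their historical OLX tags. A simplified standalone replica of that mapping (illustrative only — the real class lives in `openedx/core/djangoapps/content_libraries/api/containers.py`), showing the round-trip and the `<unit>` alias handling:

```python
from enum import Enum

# Simplified standalone copy of ContainerType from the diff, for illustration.
class ContainerType(Enum):
    Unit = "unit"
    Subsection = "subsection"
    Section = "section"

    @property
    def olx_tag(self) -> str:
        # Historical tag names, kept for OLX backwards compatibility.
        return {"unit": "vertical", "subsection": "sequential", "section": "chapter"}[self.value]

    @classmethod
    def from_source_olx_tag(cls, olx_tag: str) -> "ContainerType":
        if olx_tag == "unit":
            # <unit> (UnitBlock) is accepted as an alias for <vertical>.
            return cls.Unit
        try:
            return next(ct for ct in cls if olx_tag == ct.olx_tag)
        except StopIteration:
            raise ValueError(f"no container_type for XML tag: <{olx_tag}>") from None

print(ContainerType.Unit.olx_tag)                     # vertical
print(ContainerType.from_source_olx_tag("vertical"))  # ContainerType.Unit
print(ContainerType.from_source_olx_tag("unit"))      # ContainerType.Unit
```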
+ """ + match self: + case self.Unit: + return "vertical" + case self.Subsection: + return "sequential" + case self.Section: + return "chapter" + raise TypeError(f"unexpected ContainerType: {self!r}") + + @classmethod + def from_source_olx_tag(cls, olx_tag: str) -> 'ContainerType': + """ + Get the ContainerType that this OLX tag maps to. + """ + if olx_tag == "unit": + # There is an alternative implementation to VerticalBlock called UnitBlock whose + # OLX tag is . When converting from OLX, we want to handle both + # and as Unit containers, although the canonical serialization is still . + return cls.Unit + try: + return next(ct for ct in cls if olx_tag == ct.olx_tag) + except StopIteration: + raise ValueError(f"no container_type for XML tag: <{olx_tag}>") from None + + +@dataclass(frozen=True, kw_only=True) +class ContainerMetadata(PublishableItem): + """ + Class that represents the metadata about a Container (e.g. Unit) in a content library. + """ + container_key: LibraryContainerLocator + container_type: ContainerType + container_pk: int + + @classmethod + def from_container(cls, library_key, container: Container, associated_collections=None): + """ + Construct a ContainerMetadata object from a Container object. 
+ """ + last_publish_log = container.versioning.last_publish_log + container_key = library_container_locator( + library_key, + container=container, + ) + container_type = ContainerType(container_key.container_type) + published_by = "" + if last_publish_log and last_publish_log.published_by: + published_by = last_publish_log.published_by.username + + draft = container.versioning.draft + published = container.versioning.published + last_draft_created = draft.created if draft else None + if draft and draft.publishable_entity_version.created_by: + last_draft_created_by = draft.publishable_entity_version.created_by.username + else: + last_draft_created_by = "" + + return cls( + container_key=container_key, + container_type=container_type, + container_pk=container.pk, + display_name=draft.title, + created=container.created, + modified=draft.created, + draft_version_num=draft.version_num, + published_version_num=published.version_num if published else None, + published_display_name=published.title if published else None, + last_published=None if last_publish_log is None else last_publish_log.published_at, + published_by=published_by, + last_draft_created=last_draft_created, + last_draft_created_by=last_draft_created_by, + has_unpublished_changes=authoring_api.contains_unpublished_changes(container.pk), + collections=associated_collections or [], + ) + + +def library_container_locator( + library_key: LibraryLocatorV2, + container: Container, +) -> LibraryContainerLocator: + """ + Returns a LibraryContainerLocator for the given library + container. + + Currently only supports Unit-type containers; will support other container types in future. 
+ """ + assert container.unit is not None + container_type = ContainerType.Unit + + return LibraryContainerLocator( + library_key, + container_type=container_type.value, + container_id=container.publishable_entity.key, + ) + + +def _get_container_from_key(container_key: LibraryContainerLocator, isDeleted=False) -> Container: + """ + Internal method to fetch the Container object from its LibraryContainerLocator + + Raises ContentLibraryContainerNotFound if no container found, or if the container has been soft deleted. + """ + assert isinstance(container_key, LibraryContainerLocator) + content_library = ContentLibrary.objects.get_by_key(container_key.lib_key) + learning_package = content_library.learning_package + assert learning_package is not None + container = authoring_api.get_container_by_key( + learning_package.id, + key=container_key.container_id, + ) + if container and (isDeleted or container.versioning.draft): + return container + raise ContentLibraryContainerNotFound + + def get_container( container_key: LibraryContainerLocator, *, include_collections=False, ) -> ContainerMetadata: """ - [ 🛑 UNSTABLE ] Get a container (a Section, Subsection, or Unit). + Get a container (a Section, Subsection, or Unit). """ - container = get_container_from_key(container_key) + container = _get_container_from_key(container_key) if include_collections: associated_collections = authoring_api.get_entity_collections( container.publishable_entity.learning_package_id, @@ -100,7 +225,7 @@ def create_container( created: datetime | None = None, ) -> ContainerMetadata: """ - [ 🛑 UNSTABLE ] Create a container (a Section, Subsection, or Unit) in the specified content library. + Create a container (e.g. a Unit) in the specified content library. It will initially be empty. 
""" @@ -116,13 +241,6 @@ def create_container( container_type=container_type.value, container_id=slug, ) - - if not created: - created = datetime.now(tz=timezone.utc) - - container: Container - _initial_version: ContainerVersion - # Then try creating the actual container: match container_type: case ContainerType.Unit: @@ -130,30 +248,12 @@ def create_container( content_library.learning_package_id, key=slug, title=title, - created=created, - created_by=user_id, - ) - case ContainerType.Subsection: - container, _initial_version = authoring_api.create_subsection_and_version( - content_library.learning_package_id, - key=slug, - title=title, - created=created, - created_by=user_id, - ) - case ContainerType.Section: - container, _initial_version = authoring_api.create_section_and_version( - content_library.learning_package_id, - key=slug, - title=title, - created=created, + created=created or datetime.now(tz=timezone.utc), created_by=user_id, ) case _: - raise NotImplementedError(f"Library does not support {container_type} yet") + raise NotImplementedError(f"Library support for {container_type} is in progress") - # .. event_implemented_name: LIBRARY_CONTAINER_CREATED - # .. event_type: org.openedx.content_authoring.content_library.container.created.v1 LIBRARY_CONTAINER_CREATED.send_event( library_container=LibraryContainerData( container_key=container_key, @@ -169,116 +269,45 @@ def update_container( user_id: int | None, ) -> ContainerMetadata: """ - [ 🛑 UNSTABLE ] Update a container (a Section, Subsection, or Unit) title. + Update a container (e.g. a Unit) title. 
""" - container = get_container_from_key(container_key) + container = _get_container_from_key(container_key) library_key = container_key.lib_key - created = datetime.now(tz=timezone.utc) - - container_type = ContainerType(container_key.container_type) - version: ContainerVersion - affected_containers: list[ContainerMetadata] = [] - # Get children containers or components to update their index data - children = get_container_children( - container_key, - published=False, + assert container.unit + unit_version = authoring_api.create_next_unit_version( + container.unit, + title=display_name, + created=datetime.now(tz=timezone.utc), + created_by=user_id, ) - child_key_name = 'container_key' - match container_type: - case ContainerType.Unit: - version = authoring_api.create_next_unit_version( - container.unit, - title=display_name, - created=created, - created_by=user_id, - ) - affected_containers = get_containers_contains_item(container_key) - # Components have usage_key instead of container_key - child_key_name = 'usage_key' - case ContainerType.Subsection: - version = authoring_api.create_next_subsection_version( - container.subsection, - title=display_name, - created=created, - created_by=user_id, - ) - affected_containers = get_containers_contains_item(container_key) - case ContainerType.Section: - version = authoring_api.create_next_section_version( - container.section, - title=display_name, - created=created, - created_by=user_id, - ) - - # The `affected_containers` are not obtained, because the sections are - # not contained in any container. - case _: - raise NotImplementedError(f"Library does not support {container_type} yet") - - # Send event related to the updated container - # .. event_implemented_name: LIBRARY_CONTAINER_UPDATED - # .. 
event_type: org.openedx.content_authoring.content_library.container.updated.v1 LIBRARY_CONTAINER_UPDATED.send_event( library_container=LibraryContainerData( container_key=container_key, ) ) - # Send events related to the containers that contains the updated container. - # This is to update the children display names used in the section/subsection previews. - for affected_container in affected_containers: - # .. event_implemented_name: LIBRARY_CONTAINER_UPDATED - # .. event_type: org.openedx.content_authoring.content_library.container.updated.v1 - LIBRARY_CONTAINER_UPDATED.send_event( - library_container=LibraryContainerData( - container_key=affected_container.container_key, - ) - ) - # Update children components and containers index data, for example, - # All subsections under a section have section key in index that needs to be updated. - # So if parent section name has been changed, it needs to be reflected in sections key of children - for child in children: - # .. event_implemented_name: CONTENT_OBJECT_ASSOCIATIONS_CHANGED - # .. event_type: org.openedx.content_authoring.content.object.associations.changed.v1 - CONTENT_OBJECT_ASSOCIATIONS_CHANGED.send_event( - content_object=ContentObjectChangedData( - object_id=str(getattr(child, child_key_name)), - changes=[container_key.container_type + "s"], - ), - ) - - return ContainerMetadata.from_container(library_key, version.container) + return ContainerMetadata.from_container(library_key, unit_version.container) def delete_container( container_key: LibraryContainerLocator, ) -> None: """ - [ 🛑 UNSTABLE ] Delete a container (a Section, Subsection, or Unit) (soft delete). + Delete a container (e.g. a Unit) (soft delete). No-op if container doesn't exist or has already been soft-deleted. 
""" library_key = container_key.lib_key - container = get_container_from_key(container_key) + container = _get_container_from_key(container_key) - # Fetch related collections and containers before soft-delete affected_collections = authoring_api.get_entity_collections( container.publishable_entity.learning_package_id, container.key, ) - affected_containers = get_containers_contains_item(container_key) - # Get children containers or components to update their index data - children = get_container_children( - container_key, - published=False, - ) authoring_api.soft_delete_draft(container.pk) - # .. event_implemented_name: LIBRARY_CONTAINER_DELETED - # .. event_type: org.openedx.content_authoring.content_library.container.deleted.v1 LIBRARY_CONTAINER_DELETED.send_event( library_container=LibraryContainerData( container_key=container_key, @@ -290,8 +319,6 @@ def delete_container( # # To delete the container on collections for collection in affected_collections: - # .. event_implemented_name: LIBRARY_COLLECTION_UPDATED - # .. event_type: org.openedx.content_authoring.content_library.collection.updated.v1 LIBRARY_COLLECTION_UPDATED.send_event( library_collection=LibraryCollectionData( collection_key=library_collection_locator( @@ -301,41 +328,14 @@ def delete_container( background=True, ) ) - # Send events related to the containers that contains the updated container. - # This is to update the children display names used in the section/subsection previews. - for affected_container in affected_containers: - # .. event_implemented_name: LIBRARY_CONTAINER_UPDATED - # .. 
event_type: org.openedx.content_authoring.content_library.container.updated.v1 - LIBRARY_CONTAINER_UPDATED.send_event( - library_container=LibraryContainerData( - container_key=affected_container.container_key, - ) - ) - container_type = ContainerType(container_key.container_type) - key_name = 'container_key' - if container_type == ContainerType.Unit: - # Components have usage_key instead of container_key - key_name = 'usage_key' - # Update children components and containers index data, for example, - # All subsections under a section have section key in index that needs to be updated. - # So if parent section is deleted, it needs to be removed from sections key of children - for child in children: - # .. event_implemented_name: CONTENT_OBJECT_ASSOCIATIONS_CHANGED - # .. event_type: org.openedx.content_authoring.content.object.associations.changed.v1 - CONTENT_OBJECT_ASSOCIATIONS_CHANGED.send_event( - content_object=ContentObjectChangedData( - object_id=str(getattr(child, key_name)), - changes=[container_key.container_type + "s"], - ), - ) def restore_container(container_key: LibraryContainerLocator) -> None: """ - [ 🛑 UNSTABLE ] Restore the specified library container. + Restore the specified library container. """ library_key = container_key.lib_key - container = get_container_from_key(container_key, include_deleted=True) + container = _get_container_from_key(container_key, isDeleted=True) affected_collections = authoring_api.get_entity_collections( container.publishable_entity.learning_package_id, @@ -343,33 +343,18 @@ def restore_container(container_key: LibraryContainerLocator) -> None: ) authoring_api.set_draft_version(container.pk, container.versioning.latest.pk) - # Fetch related containers after restore - affected_containers = get_containers_contains_item(container_key) - # Get children containers or components to update their index data - children = get_container_children( - container_key, - published=False, - ) - # .. 
event_implemented_name: LIBRARY_CONTAINER_CREATED - # .. event_type: org.openedx.content_authoring.content_library.container.created.v1 LIBRARY_CONTAINER_CREATED.send_event( library_container=LibraryContainerData( container_key=container_key, ) ) - content_changes = ["collections", "tags"] - if affected_containers and len(affected_containers) > 0: - # Update parent key data in index. Eg. `sections` key in index for subsection - content_changes.append(str(affected_containers[0].container_type.value) + "s") - # Add tags, collections and parent data back to index - # .. event_implemented_name: CONTENT_OBJECT_ASSOCIATIONS_CHANGED - # .. event_type: org.openedx.content_authoring.content.object.associations.changed.v1 + # Add tags and collections back to index CONTENT_OBJECT_ASSOCIATIONS_CHANGED.send_event( content_object=ContentObjectChangedData( object_id=str(container_key), - changes=content_changes, + changes=["collections", "tags"], ), ) @@ -378,8 +363,6 @@ def restore_container(container_key: LibraryContainerLocator) -> None: # # To restore the container on collections for collection in affected_collections: - # .. event_implemented_name: LIBRARY_COLLECTION_UPDATED - # .. event_type: org.openedx.content_authoring.content_library.collection.updated.v1 LIBRARY_COLLECTION_UPDATED.send_event( library_collection=LibraryCollectionData( collection_key=library_collection_locator( @@ -388,32 +371,6 @@ def restore_container(container_key: LibraryContainerLocator) -> None: ), ) ) - # Send events related to the containers that contains the updated container. - # This is to update the children display names used in the section/subsection previews. - for affected_container in affected_containers: - # .. event_implemented_name: LIBRARY_CONTAINER_UPDATED - # .. 
event_type: org.openedx.content_authoring.content_library.container.updated.v1 - LIBRARY_CONTAINER_UPDATED.send_event( - library_container=LibraryContainerData( - container_key=affected_container.container_key, - ) - ) - container_type = ContainerType(container_key.container_type) - key_name = 'container_key' - if container_type == ContainerType.Unit: - key_name = 'usage_key' - # Update children components and containers index data, for example, - # All subsections under a section have section key in index that needs to be updated. - # Should restore removed parent section in sections key of children subsections - for child in children: - # .. event_implemented_name: CONTENT_OBJECT_ASSOCIATIONS_CHANGED - # .. event_type: org.openedx.content_authoring.content.object.associations.changed.v1 - CONTENT_OBJECT_ASSOCIATIONS_CHANGED.send_event( - content_object=ContentObjectChangedData( - object_id=str(getattr(child, key_name)), - changes=[container_key.container_type + "s"], - ), - ) def get_container_children( @@ -422,37 +379,21 @@ def get_container_children( published=False, ) -> list[LibraryXBlockMetadata | ContainerMetadata]: """ - [ 🛑 UNSTABLE ] Get the entities contained in the given container - (e.g. the components/xblocks in a unit, units in a subsection, subsections in a section) + Get the entities contained in the given container (e.g. 
the components/xblocks in a unit) """ - container = get_container_from_key(container_key) - container_type = ContainerType(container_key.container_type) - - match container_type: - case ContainerType.Unit: - child_components = authoring_api.get_components_in_unit(container.unit, published=published) - return [LibraryXBlockMetadata.from_component( - container_key.lib_key, - entry.component - ) for entry in child_components] - case ContainerType.Subsection: - child_units = authoring_api.get_units_in_subsection(container.subsection, published=published) - return [ContainerMetadata.from_container( - container_key.lib_key, - entry.unit - ) for entry in child_units] - case ContainerType.Section: - child_subsections = authoring_api.get_subsections_in_section(container.section, published=published) - return [ContainerMetadata.from_container( - container_key.lib_key, - entry.subsection, - ) for entry in child_subsections] - case _: - child_entities = authoring_api.get_entities_in_container(container, published=published) - return [ContainerMetadata.from_container( - container_key.lib_key, - entry.entity - ) for entry in child_entities] + container = _get_container_from_key(container_key) + if container_key.container_type == ContainerType.Unit.value: + child_components = authoring_api.get_components_in_unit(container.unit, published=published) + return [LibraryXBlockMetadata.from_component( + container_key.lib_key, + entry.component + ) for entry in child_components] + else: + child_entities = authoring_api.get_entities_in_container(container, published=published) + return [ContainerMetadata.from_container( + container_key.lib_key, + entry.entity + ) for entry in child_entities] def get_container_children_count( @@ -460,89 +401,37 @@ def get_container_children_count( published=False, ) -> int: """ - [ 🛑 UNSTABLE ] Get the count of entities contained in the given container (e.g. 
the components/xblocks in a unit) + Get the count of entities contained in the given container (e.g. the components/xblocks in a unit) """ - container = get_container_from_key(container_key) + container = _get_container_from_key(container_key) return authoring_api.get_container_children_count(container, published=published) def update_container_children( container_key: LibraryContainerLocator, - children_ids: list[LibraryUsageLocatorV2] | list[LibraryContainerLocator], + children_ids: list[UsageKeyV2] | list[LibraryContainerLocator], user_id: int | None, entities_action: authoring_api.ChildrenEntitiesAction = authoring_api.ChildrenEntitiesAction.REPLACE, ): """ - [ 🛑 UNSTABLE ] Adds children components or containers to given container. + Adds children components or containers to given container. """ library_key = container_key.lib_key - container_type = ContainerType(container_key.container_type) - container = get_container_from_key(container_key) - created = datetime.now(tz=timezone.utc) - new_version: ContainerVersion + container_type = container_key.container_type + container = _get_container_from_key(container_key) match container_type: - case ContainerType.Unit: + case ContainerType.Unit.value: components = [get_component_from_usage_key(key) for key in children_ids] # type: ignore[arg-type] new_version = authoring_api.create_next_unit_version( container.unit, components=components, # type: ignore[arg-type] - created=created, + created=datetime.now(tz=timezone.utc), created_by=user_id, entities_action=entities_action, ) - - for key in children_ids: - # .. event_implemented_name: CONTENT_OBJECT_ASSOCIATIONS_CHANGED - # .. 
event_type: org.openedx.content_authoring.content.object.associations.changed.v1 - CONTENT_OBJECT_ASSOCIATIONS_CHANGED.send_event( - content_object=ContentObjectChangedData( - object_id=str(key), - changes=["units"], - ), - ) - case ContainerType.Subsection: - units = [get_container_from_key(key).unit for key in children_ids] # type: ignore[arg-type] - new_version = authoring_api.create_next_subsection_version( - container.subsection, - units=units, # type: ignore[arg-type] - created=created, - created_by=user_id, - entities_action=entities_action, - ) - - for key in children_ids: - # .. event_implemented_name: CONTENT_OBJECT_ASSOCIATIONS_CHANGED - # .. event_type: org.openedx.content_authoring.content.object.associations.changed.v1 - CONTENT_OBJECT_ASSOCIATIONS_CHANGED.send_event( - content_object=ContentObjectChangedData( - object_id=str(key), - changes=["subsections"], - ), - ) - case ContainerType.Section: - subsections = [get_container_from_key(key).subsection for key in children_ids] # type: ignore[arg-type] - new_version = authoring_api.create_next_section_version( - container.section, - subsections=subsections, # type: ignore[arg-type] - created=created, - created_by=user_id, - entities_action=entities_action, - ) - - for key in children_ids: - # .. event_implemented_name: CONTENT_OBJECT_ASSOCIATIONS_CHANGED - # .. event_type: org.openedx.content_authoring.content.object.associations.changed.v1 - CONTENT_OBJECT_ASSOCIATIONS_CHANGED.send_event( - content_object=ContentObjectChangedData( - object_id=str(key), - changes=["sections"], - ), - ) case _: raise ValueError(f"Invalid container type: {container_type}") - # .. event_implemented_name: LIBRARY_CONTAINER_UPDATED - # .. 
event_type: org.openedx.content_authoring.content_library.container.updated.v1 LIBRARY_CONTAINER_UPDATED.send_event( library_container=LibraryContainerData( container_key=container_key, @@ -552,39 +441,29 @@ def update_container_children( return ContainerMetadata.from_container(library_key, new_version.container) -def get_containers_contains_item( - key: LibraryUsageLocatorV2 | LibraryContainerLocator +def get_containers_contains_component( + usage_key: LibraryUsageLocatorV2 ) -> list[ContainerMetadata]: """ - [ 🛑 UNSTABLE ] Get containers that contains the item, that can be a component or another container. + Get containers that contains the component. """ - item: Component | Container - - if isinstance(key, LibraryUsageLocatorV2): - item = get_component_from_usage_key(key) - - elif isinstance(key, LibraryContainerLocator): - item = get_container_from_key(key) - + assert isinstance(usage_key, LibraryUsageLocatorV2) + component = get_component_from_usage_key(usage_key) containers = authoring_api.get_containers_with_entity( - item.publishable_entity.pk, + component.publishable_entity.pk, ) return [ - ContainerMetadata.from_container(key.lib_key, container) + ContainerMetadata.from_container(usage_key.context_key, container) for container in containers ] -def publish_container_changes( - container_key: LibraryContainerLocator, - user_id: int | None, - call_post_publish_events_sync=False, -) -> None: +def publish_container_changes(container_key: LibraryContainerLocator, user_id: int | None) -> None: """ - [ 🛑 UNSTABLE ] Publish all unpublished changes in a container and all its child + Publish all unpublished changes in a container and all its child containers/blocks. 
""" - container = get_container_from_key(container_key) + container = _get_container_from_key(container_key) library_key = container_key.lib_key content_library = ContentLibrary.objects.get_by_key(library_key) # type: ignore[attr-defined] learning_package = content_library.learning_package @@ -599,43 +478,4 @@ def publish_container_changes( ) # Update the search index (and anything else) for the affected container + blocks # This is mostly synchronous but may complete some work asynchronously if there are a lot of changes. - if call_post_publish_events_sync: - tasks.send_events_after_publish(publish_log.pk, str(library_key)) - else: - tasks.wait_for_post_publish_events(publish_log, library_key) - - -def copy_container(container_key: LibraryContainerLocator, user_id: int) -> UserClipboardData: - """ - [ 🛑 UNSTABLE ] Copy a container (a Section, Subsection, or Unit) to the content staging. - """ - container_metadata = get_container(container_key) - container_serializer = ContainerSerializer(container_metadata) - block_type = ContainerType(container_key.container_type).olx_tag - - from openedx.core.djangoapps.content_staging import api as content_staging_api - - return content_staging_api.save_content_to_user_clipboard( - user_id=user_id, - block_type=block_type, - olx=container_serializer.olx_str, - display_name=container_metadata.display_name, - suggested_url_name=str(container_key), - tags=container_serializer.tags, - copied_from=container_key, - version_num=container_metadata.published_version_num, - static_files=container_serializer.static_files, - ) - - -def get_library_object_hierarchy( - object_key: LibraryUsageLocatorV2 | LibraryContainerLocator, -) -> ContainerHierarchy: - """ - [ 🛑 UNSTABLE ] Returns the full ancestry and descendents of the library object with the given object_key. - - TODO: We intend to replace this implementation with a more efficient one that makes fewer - database queries in the future. 
More details being discussed in - https://github.com/openedx/edx-platform/pull/36813#issuecomment-3136631767 - """ - return ContainerHierarchy.create_from_library_object_key(object_key) + tasks.wait_for_post_publish_events(publish_log, library_key) diff --git a/openedx/core/djangoapps/content_libraries/api/libraries.py b/openedx/core/djangoapps/content_libraries/api/libraries.py index 66281addb643..290c88a16a64 100644 --- a/openedx/core/djangoapps/content_libraries/api/libraries.py +++ b/openedx/core/djangoapps/content_libraries/api/libraries.py @@ -41,10 +41,9 @@ """ from __future__ import annotations -import logging -from dataclasses import dataclass -from dataclasses import field as dataclass_field +from dataclasses import dataclass, field as dataclass_field from datetime import datetime +import logging from django.conf import settings from django.contrib.auth.models import AbstractUser, AnonymousUser, Group @@ -54,28 +53,29 @@ from django.db.models import Q, QuerySet from django.utils.translation import gettext as _ from opaque_keys.edx.locator import LibraryLocatorV2, LibraryUsageLocatorV2 -from openedx_authz import api as authz_api -from openedx_authz.api import assign_role_to_user_in_scope -from openedx_authz.constants import permissions as authz_permissions -from openedx_events.content_authoring.data import ContentLibraryData +from openedx_events.content_authoring.data import ( + ContentLibraryData, +) from openedx_events.content_authoring.signals import ( CONTENT_LIBRARY_CREATED, CONTENT_LIBRARY_DELETED, CONTENT_LIBRARY_UPDATED, ) from openedx_learning.api import authoring as authoring_api -from openedx_learning.api.authoring_models import Component, LearningPackage +from openedx_learning.api.authoring_models import Component from organizations.models import Organization -from user_tasks.models import UserTaskArtifact, UserTaskStatus from xblock.core import XBlock from openedx.core.types import User as UserType -from .. import permissions, tasks +from .. 
import permissions from ..constants import ALL_RIGHTS_RESERVED from ..models import ContentLibrary, ContentLibraryPermission -from .exceptions import LibraryAlreadyExists, LibraryPermissionIntegrityError -from .permissions import LEGACY_LIB_PERMISSIONS +from .. import tasks +from .exceptions import ( + LibraryAlreadyExists, + LibraryPermissionIntegrityError, +) log = logging.getLogger(__name__) @@ -105,9 +105,6 @@ "get_allowed_block_types", "publish_changes", "revert_changes", - "get_backup_task_status", - "assign_library_role_to_user", - "user_has_permission_across_lib_authz_systems", ] @@ -156,12 +153,6 @@ class AccessLevel: NO_ACCESS = None -ACCESS_LEVEL_TO_LIBRARY_ROLE = { - AccessLevel.ADMIN_LEVEL: "library_admin", - AccessLevel.AUTHOR_LEVEL: "library_author", -} - - @dataclass(frozen=True) class ContentLibraryPermissionEntry: """ @@ -203,7 +194,7 @@ class PublishableItem(LibraryItem): published_display_name: str | None last_published: datetime | None = None # The username of the user who last published this. - published_by: str | None = "" + published_by: str = "" last_draft_created: datetime | None = None # The username of the user who created the last draft. last_draft_created_by: str = "" @@ -244,18 +235,7 @@ def user_can_create_library(user: AbstractUser) -> bool: """ Check if the user has permission to create a content library. """ - library_permission = permissions.CAN_CREATE_CONTENT_LIBRARY - lib_permission_in_authz = _transform_legacy_lib_permission_to_authz_permission(library_permission) - # The authz_api.is_user_allowed check only validates permissions within a specific library context. Since - # creating a library is not tied to an existing one, we use user.has_perm (via Bridgekeeper) to check if the user - # can create libraries, meaning they have the course creator role. In the future, this should rely on a global (*) - # role defined in the Authorization Framework for instance-level resource creation. 
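An editorial aside on the dual-authorization check removed/added above (which the diff's own comments label temporary): access is granted if *either* the legacy Bridgekeeper check or the new openedx-authz check passes. A minimal sketch of that bridging pattern, where `legacy_check` and `authz_check` are hypothetical stand-ins for `user.has_perm` and `authz_api.is_user_allowed` (not the real APIs):

```python
# Sketch of the "grant if either system allows" pattern used while migrating
# from a legacy permission system to a new one. The two checker callables are
# hypothetical stand-ins for user.has_perm (Bridgekeeper) and
# authz_api.is_user_allowed (openedx-authz).
from typing import Callable


def user_allowed_across_systems(
    legacy_check: Callable[[str], bool],
    authz_check: Callable[[str], bool],
    permission: str,
) -> bool:
    # Short-circuits: the legacy check runs first, and the new system is only
    # consulted when the legacy system denies access.
    return legacy_check(permission) or authz_check(permission)


# Usage: legacy denies, authz allows -> access is still granted.
allowed = user_allowed_across_systems(
    legacy_check=lambda perm: False,
    authz_check=lambda perm: perm == "view_library",
    permission="view_library",
)
```

Because the check is a plain `or`, a user kept in only one of the two systems retains access throughout the migration; the `distinct()` call mentioned elsewhere in this diff exists for the same overlap reason.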
- has_perms = user.has_perm(library_permission) or authz_api.is_user_allowed( - user, - lib_permission_in_authz, - authz_api.data.GLOBAL_SCOPE_WILDCARD, - ) - return has_perms + return user.has_perm(permissions.CAN_CREATE_CONTENT_LIBRARY) def get_libraries_for_user(user, org=None, text_search=None, order=None) -> QuerySet[ContentLibrary]: @@ -277,11 +257,7 @@ def get_libraries_for_user(user, org=None, text_search=None, order=None) -> Quer Q(learning_package__description__icontains=text_search) ) - # Using distinct() temporarily to avoid duplicate results caused by overlapping permission checks - # between Bridgekeeper and the new authorization framework. This ensures correct results for now, - # but it should be removed once Bridgekeeper support is fully dropped and all permission logic - # is handled through openedx-authz. - filtered = permissions.perms[permissions.CAN_VIEW_THIS_CONTENT_LIBRARY].filter(user, qs).distinct() + filtered = permissions.perms[permissions.CAN_VIEW_THIS_CONTENT_LIBRARY].filter(user, qs) if order: order_query = 'learning_package__' @@ -316,6 +292,7 @@ def get_metadata(queryset: QuerySet[ContentLibrary], text_search: str | None = N key=lib.library_key, title=lib.learning_package.title if lib.learning_package else "", description="", + version=0, allow_public_learning=lib.allow_public_learning, allow_public_read=lib.allow_public_read, @@ -343,10 +320,8 @@ def require_permission_for_library_key(library_key: LibraryLocatorV2, user: User Raises django.core.exceptions.PermissionDenied if the user doesn't have permission. 
""" - library_obj = ContentLibrary.objects.get_by_key(library_key) - # obj should be able to read any valid model object but mypy thinks it can only be - # "User | AnonymousUser | None" - if not user_has_permission_across_lib_authz_systems(user, permission, library_obj): + library_obj = ContentLibrary.objects.get_by_key(library_key) # type: ignore[attr-defined] + if not user.has_perm(permission, obj=library_obj): raise PermissionDenied return library_obj @@ -378,6 +353,22 @@ def get_library(library_key: LibraryLocatorV2) -> ContentLibraryMetadata: has_unpublished_deletes = authoring_api.get_entities_with_unpublished_deletes(learning_package.id) \ .exists() + # Learning Core doesn't really have a notion of a global version number,but + # we can sort of approximate it by using the primary key of the last publish + # log entry, in the sense that it will be a monotonically increasing + # integer, though there will be large gaps. We use 0 to denote that nothing + # has been done, since that will never be a valid value for a PublishLog pk. + # + # That being said, we should figure out if we really even want to keep a top + # level version indicator for the Library as a whole. In the v1 libs + # implemention, this served as a way to know whether or not there was an + # updated version of content that a course could pull in. But more recently, + # we've decided to do those version references at the level of the + # individual blocks being used, since a Learning Core backed library is + # intended to be referenced in multiple course locations and not 1:1 like v1 + # libraries. The top level version stays for now because LegacyLibraryContentBlock + # uses it, but that should hopefully change before the Redwood release. 
+ version = 0 if last_publish_log is None else last_publish_log.pk published_by = "" if last_publish_log and last_publish_log.published_by: published_by = last_publish_log.published_by.username @@ -387,6 +378,7 @@ def get_library(library_key: LibraryLocatorV2) -> ContentLibraryMetadata: title=learning_package.title, description=learning_package.description, num_blocks=num_blocks, + version=version, last_published=None if last_publish_log is None else last_publish_log.published_at, published_by=published_by, last_draft_created=last_draft_created, @@ -411,7 +403,6 @@ def create_library( allow_public_learning: bool = False, allow_public_read: bool = False, library_license: str = ALL_RIGHTS_RESERVED, - learning_package: LearningPackage | None = None, ) -> ContentLibraryMetadata: """ Create a new content library. @@ -428,13 +419,10 @@ def create_library( allow_public_read: Allow anyone to view blocks (including source) in Studio? - learning_package: A learning package to associate with this library. - Returns a ContentLibraryMetadata instance. """ assert isinstance(org, Organization) validate_unicode_slug(slug) - is_learning_package_loaded = learning_package is not None try: with transaction.atomic(): ref = ContentLibrary.objects.create( @@ -444,41 +432,28 @@ def create_library( allow_public_read=allow_public_read, license=library_license, ) - - if learning_package: - # A temporary LearningPackage was passed in, so update its key to match the library, - # and also update its title/description in case they differ. - authoring_api.update_learning_package( - learning_package.id, - key=str(ref.library_key), - title=title, - description=description, - ) - else: - # We have to generate a new LearningPackage for this library. 
- learning_package = authoring_api.create_learning_package( - key=str(ref.library_key), - title=title, - description=description, - ) + learning_package = authoring_api.create_learning_package( + key=str(ref.library_key), + title=title, + description=description, + ) ref.learning_package = learning_package ref.save() + except IntegrityError: raise LibraryAlreadyExists(slug) # lint-amnesty, pylint: disable=raise-missing-from - # .. event_implemented_name: CONTENT_LIBRARY_CREATED - # .. event_type: org.openedx.content_authoring.content_library.created.v1 CONTENT_LIBRARY_CREATED.send_event( content_library=ContentLibraryData( library_key=ref.library_key ) ) - return ContentLibraryMetadata( key=ref.library_key, title=title, description=description, num_blocks=0, + version=0, last_published=None, allow_public_learning=ref.allow_public_learning, allow_public_read=ref.allow_public_read, @@ -540,30 +515,6 @@ def set_library_user_permissions(library_key: LibraryLocatorV2, user: UserType, ) -def assign_library_role_to_user(library_key: LibraryLocatorV2, user: UserType, access_level: str): - """Grant a role to the specified user for this library. - - Args: - library_key (LibraryLocatorV2): The key of the content library. - user (UserType): The user to whom the role will be granted. - access_level (str | None): The access level to be granted. This access level maps to a specific role. - - Raises: - TypeError: If the user is an instance of AnonymousUser. 
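The `assign_library_role_to_user` helper above maps an access level to a role name via a plain dict and rejects unknown levels. A self-contained sketch of that lookup-with-validation shape (the level and role strings here are illustrative, mirroring the `library_admin`/`library_author` names visible in the diff, not necessarily the real `AccessLevel` values):

```python
# Sketch of the access-level -> role lookup used by assign_library_role_to_user:
# known levels map to role names; an unknown level raises ValueError instead of
# silently assigning nothing. Keys are illustrative stand-ins for the
# AccessLevel constants.
ACCESS_LEVEL_TO_LIBRARY_ROLE = {
    "admin": "library_admin",
    "author": "library_author",
}


def role_for_access_level(access_level: str) -> str:
    role = ACCESS_LEVEL_TO_LIBRARY_ROLE.get(access_level)
    if role is None:
        raise ValueError(f"Invalid access level: {access_level}")
    return role
```

Failing fast on an unmapped level matches the diff's behavior and avoids the subtler bug of a role grant that appears to succeed but does nothing.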
- """ - if isinstance(user, AnonymousUser): - raise TypeError("Invalid user type") - - role = ACCESS_LEVEL_TO_LIBRARY_ROLE.get(access_level) - if role is None: - raise ValueError(f"Invalid access level: {access_level}") - - if assign_role_to_user_in_scope(user.username, role, str(library_key)): - log.info(f"Assigned role '{role}' to user '{user.username}' for library '{library_key}'") - else: - log.warning(f"Failed to assign role '{role}' to user '{user.username}' for library '{library_key}'") - - def set_library_group_permissions(library_key: LibraryLocatorV2, group, access_level: str): """ Change the specified group's level of access to this library. @@ -629,8 +580,6 @@ def update_library( description=description, ) - # .. event_implemented_name: CONTENT_LIBRARY_UPDATED - # .. event_type: org.openedx.content_authoring.content_library.updated.v1 CONTENT_LIBRARY_UPDATED.send_event( content_library=ContentLibraryData( library_key=content_lib.library_key @@ -656,8 +605,6 @@ def delete_library(library_key: LibraryLocatorV2) -> None: if learning_package: learning_package.delete() - # .. event_implemented_name: CONTENT_LIBRARY_DELETED - # .. event_type: org.openedx.content_authoring.content_library.deleted.v1 CONTENT_LIBRARY_DELETED.send_event( content_library=ContentLibraryData( library_key=library_key @@ -737,117 +684,3 @@ def revert_changes(library_key: LibraryLocatorV2, user_id: int | None = None) -> # Call the event handlers as needed. tasks.wait_for_post_revert_events(draft_change_log, library_key) - - -def get_backup_task_status( - user_id: int, - task_id: str -) -> dict | None: - """ - Get the status of a library backup task. - - Returns a dictionary with the following keys: - - state: One of "Pending", "Exporting", "Succeeded", "Failed" - - file: If state is "Succeeded", the FileField of the exported .zip. Otherwise, None. - If no task is found, returns None. 
- """ - - try: - task_status = UserTaskStatus.objects.get(task_id=task_id, user_id=user_id) - except UserTaskStatus.DoesNotExist: - return None - - result = {'state': task_status.state, 'file': None} - - if task_status.state == UserTaskStatus.SUCCEEDED: - artifact = UserTaskArtifact.objects.get(status=task_status, name='Output') - result['file'] = artifact.file - - return result - - -def _transform_legacy_lib_permission_to_authz_permission(permission: str) -> str: - """ - Transform a legacy content library permission to an openedx-authz permission. - """ - # There is no dedicated permission or role for can_create_content_library in openedx-authz yet, - # so we reuse the same permission to rely on user.has_perm via Bridgekeeper. - return { - permissions.CAN_CREATE_CONTENT_LIBRARY: permissions.CAN_CREATE_CONTENT_LIBRARY, - permissions.CAN_DELETE_THIS_CONTENT_LIBRARY: authz_permissions.DELETE_LIBRARY.identifier, - permissions.CAN_EDIT_THIS_CONTENT_LIBRARY: authz_permissions.EDIT_LIBRARY_CONTENT.identifier, - permissions.CAN_EDIT_THIS_CONTENT_LIBRARY_TEAM: authz_permissions.MANAGE_LIBRARY_TEAM.identifier, - permissions.CAN_VIEW_THIS_CONTENT_LIBRARY: authz_permissions.VIEW_LIBRARY.identifier, - permissions.CAN_VIEW_THIS_CONTENT_LIBRARY_TEAM: authz_permissions.VIEW_LIBRARY_TEAM.identifier, - }.get(permission, permission) - - -def _transform_authz_permission_to_legacy_lib_permission(permission: str) -> str: - """ - Transform an openedx-authz permission to a legacy content library permission. 
- """ - return { - authz_permissions.PUBLISH_LIBRARY_CONTENT.identifier: permissions.CAN_EDIT_THIS_CONTENT_LIBRARY, - authz_permissions.CREATE_LIBRARY_COLLECTION.identifier: permissions.CAN_EDIT_THIS_CONTENT_LIBRARY, - authz_permissions.EDIT_LIBRARY_COLLECTION.identifier: permissions.CAN_EDIT_THIS_CONTENT_LIBRARY, - authz_permissions.DELETE_LIBRARY_COLLECTION.identifier: permissions.CAN_EDIT_THIS_CONTENT_LIBRARY, - }.get(permission, permission) - - -def user_has_permission_across_lib_authz_systems( - user: UserType, - permission: str | authz_api.data.PermissionData, - library_obj: ContentLibrary, -) -> bool: - """ - Check whether a user has a given permission on a content library across both the - legacy edx-platform permission system and the newer openedx-authz system. - - The provided permission name is normalized to both systems (legacy and authz), and - authorization is granted if either: - - the user holds the legacy object-level permission on the ContentLibrary instance, or - - the openedx-authz API allows the user for the corresponding permission on the library. - - **Note:** - Temporary: this function uses Bridgekeeper-based logic for cases not yet modeled in openedx-authz. - - Current gaps covered here: - - CAN_CREATE_CONTENT_LIBRARY: we call user.has_perm via Bridgekeeper to verify the user is a course creator. - - CAN_VIEW_THIS_CONTENT_LIBRARY: we respect the allow_public_read flag via Bridgekeeper. - - Replace these with authz_api.is_user_allowed once openedx-authz supports - these conditions natively (including global (*) roles). - - Args: - user: The Django user (or user-like object) to check. - permission: The permission identifier (either a legacy codename or an openedx-authz name). - library_obj: The ContentLibrary instance to check against. - - Returns: - bool: True if the user is authorized by either system; otherwise False. 
- """ - if isinstance(permission, authz_api.data.PermissionData): - permission = permission.identifier - if _is_legacy_permission(permission): - legacy_permission = permission - authz_permission = _transform_legacy_lib_permission_to_authz_permission(permission) - else: - authz_permission = permission - legacy_permission = _transform_authz_permission_to_legacy_lib_permission(permission) - return ( - # Check both the legacy and the new openedx-authz permissions - user.has_perm(perm=legacy_permission, obj=library_obj) - or authz_api.is_user_allowed( - user, - authz_permission, - str(library_obj.library_key), - ) - ) - - -def _is_legacy_permission(permission: str) -> bool: - """ - Determine if the specified library permission is part of the legacy - or the new openedx-authz system. - """ - return permission in LEGACY_LIB_PERMISSIONS diff --git a/openedx/core/djangoapps/content_libraries/rest_api/blocks.py b/openedx/core/djangoapps/content_libraries/rest_api/blocks.py index e72980f6ba0d..bc314099893c 100644 --- a/openedx/core/djangoapps/content_libraries/rest_api/blocks.py +++ b/openedx/core/djangoapps/content_libraries/rest_api/blocks.py @@ -9,7 +9,6 @@ from django.utils.decorators import method_decorator from drf_yasg.utils import swagger_auto_schema from opaque_keys.edx.locator import LibraryLocatorV2, LibraryUsageLocatorV2 -from openedx_authz.constants import permissions as authz_permissions from openedx_learning.api import authoring as authoring_api from rest_framework import status from rest_framework.exceptions import NotFound, ValidationError @@ -19,7 +18,14 @@ from rest_framework.views import APIView from openedx.core.djangoapps.content_libraries import api, permissions -from openedx.core.djangoapps.content_libraries.rest_api import serializers +from openedx.core.djangoapps.content_libraries.rest_api.serializers import ( + ContentLibraryItemCollectionsUpdateSerializer, + LibraryXBlockCreationSerializer, + LibraryXBlockMetadataSerializer, + 
LibraryXBlockOlxSerializer, + LibraryXBlockStaticFileSerializer, + LibraryXBlockStaticFilesSerializer, +) from openedx.core.djangoapps.xblock import api as xblock_api from openedx.core.lib.api.view_utils import view_auth_classes from openedx.core.types.http import RestRequest @@ -34,7 +40,7 @@ class LibraryBlocksView(GenericAPIView): """ Views to work with XBlocks in a specific content library. """ - serializer_class = serializers.LibraryXBlockMetadataSerializer + serializer_class = LibraryXBlockMetadataSerializer @apidocs.schema( parameters=[ @@ -68,13 +74,13 @@ def get(self, request, lib_key_str): api.LibraryXBlockMetadata.from_component(key, component) for component in self.paginate_queryset(components) ] - serializer = self.serializer_class(paginated_xblock_metadata, many=True) + serializer = LibraryXBlockMetadataSerializer(paginated_xblock_metadata, many=True) return self.get_paginated_response(serializer.data) @convert_exceptions @swagger_auto_schema( - request_body=serializers.LibraryXBlockCreationSerializer, - responses={200: serializers.LibraryXBlockMetadataSerializer} + request_body=LibraryXBlockCreationSerializer, + responses={200: LibraryXBlockMetadataSerializer} ) def post(self, request, lib_key_str): """ @@ -82,7 +88,7 @@ def post(self, request, lib_key_str): """ library_key = LibraryLocatorV2.from_string(lib_key_str) api.require_permission_for_library_key(library_key, request.user, permissions.CAN_EDIT_THIS_CONTENT_LIBRARY) - serializer = serializers.LibraryXBlockCreationSerializer(data=request.data) + serializer = LibraryXBlockCreationSerializer(data=request.data) serializer.is_valid(raise_exception=True) # Create a new regular top-level block: @@ -93,7 +99,7 @@ def post(self, request, lib_key_str): detail={'block_type': str(err)}, ) - return Response(self.serializer_class(result).data) + return Response(LibraryXBlockMetadataSerializer(result).data) @method_decorator(non_atomic_requests, name="dispatch") @@ -102,8 +108,6 @@ class 
LibraryBlockView(APIView): """ Views to work with an existing XBlock in a content library. """ - serializer_class = serializers.LibraryXBlockMetadataSerializer - @convert_exceptions def get(self, request, usage_key_str): """ @@ -119,7 +123,7 @@ def get(self, request, usage_key_str): api.require_permission_for_library_key(key.lib_key, request.user, permissions.CAN_VIEW_THIS_CONTENT_LIBRARY) result = api.get_library_block(key, include_collections=True) - return Response(self.serializer_class(result).data) + return Response(LibraryXBlockMetadataSerializer(result).data) @convert_exceptions def delete(self, request, usage_key_str): # pylint: disable=unused-argument @@ -146,8 +150,6 @@ class LibraryBlockAssetListView(APIView): """ Views to list an existing XBlock's static asset files """ - serializer_class = serializers.LibraryXBlockStaticFilesSerializer - @convert_exceptions def get(self, request, usage_key_str): """ @@ -156,7 +158,7 @@ def get(self, request, usage_key_str): key = LibraryUsageLocatorV2.from_string(usage_key_str) api.require_permission_for_library_key(key.lib_key, request.user, permissions.CAN_VIEW_THIS_CONTENT_LIBRARY) files = api.get_library_block_static_asset_files(key) - return Response(self.serializer_class({"files": files}).data) + return Response(LibraryXBlockStaticFilesSerializer({"files": files}).data) @method_decorator(non_atomic_requests, name="dispatch") @@ -166,7 +168,6 @@ class LibraryBlockAssetView(APIView): Views to work with an existing XBlock's static asset files """ parser_classes = (MultiPartParser, ) - serializer_class = serializers.LibraryXBlockStaticFileSerializer @convert_exceptions def get(self, request, usage_key_str, file_path): @@ -178,7 +179,7 @@ def get(self, request, usage_key_str, file_path): files = api.get_library_block_static_asset_files(key) for f in files: if f.path == file_path: - return Response(self.serializer_class(f).data) + return Response(LibraryXBlockStaticFileSerializer(f).data) raise NotFound 
     @convert_exceptions
@@ -205,7 +206,7 @@ def put(self, request, usage_key_str, file_path):
             result = api.add_library_block_static_asset_file(usage_key, file_path, file_content, request.user)
         except ValueError:
             raise ValidationError("Invalid file path")  # lint-amnesty, pylint: disable=raise-missing-from
-        return Response(self.serializer_class(result).data)
+        return Response(LibraryXBlockStaticFileSerializer(result).data)

     @convert_exceptions
     def delete(self, request, usage_key_str, file_path):
@@ -239,9 +240,9 @@ def post(self, request, usage_key_str):
         api.require_permission_for_library_key(
             key.lib_key,
             request.user,
-            authz_permissions.PUBLISH_LIBRARY_CONTENT
+            permissions.CAN_EDIT_THIS_CONTENT_LIBRARY
         )
-        api.publish_component_changes(key, request.user.id)
+        api.publish_component_changes(key, request.user)
         return Response({})

@@ -264,7 +265,7 @@ def patch(self, request: RestRequest, usage_key_str) -> Response:
             request.user,
             permissions.CAN_EDIT_THIS_CONTENT_LIBRARY
         )
-        serializer = serializers.ContentLibraryItemCollectionsUpdateSerializer(data=request.data)
+        serializer = ContentLibraryItemCollectionsUpdateSerializer(data=request.data)
         serializer.is_valid(raise_exception=True)

         component = api.get_component_from_usage_key(key)
@@ -308,8 +309,6 @@ class LibraryBlockOlxView(APIView):
     """
     Views to work with an existing XBlock's OLX
     """
-    serializer_class = serializers.LibraryXBlockOlxSerializer
-
     @convert_exceptions
     def get(self, request, usage_key_str):
         """
@@ -321,7 +320,7 @@ def get(self, request, usage_key_str):
         key = LibraryUsageLocatorV2.from_string(usage_key_str)
         api.require_permission_for_library_key(key.lib_key, request.user, permissions.CAN_VIEW_THIS_CONTENT_LIBRARY)
         xml_str = xblock_api.get_block_draft_olx(key)
-        return Response(self.serializer_class({"olx": xml_str}).data)
+        return Response(LibraryXBlockOlxSerializer({"olx": xml_str}).data)

     @convert_exceptions
     def post(self, request, usage_key_str):
@@ -333,14 +332,14 @@ def post(self, request, usage_key_str):
         """
         key = LibraryUsageLocatorV2.from_string(usage_key_str)
         api.require_permission_for_library_key(key.lib_key, request.user, permissions.CAN_EDIT_THIS_CONTENT_LIBRARY)
-        serializer = self.serializer_class(data=request.data)
+        serializer = LibraryXBlockOlxSerializer(data=request.data)
         serializer.is_valid(raise_exception=True)
         new_olx_str = serializer.validated_data["olx"]
         try:
             version_num = api.set_library_block_olx(key, new_olx_str).version_num
         except ValueError as err:
             raise ValidationError(detail=str(err))  # lint-amnesty, pylint: disable=raise-missing-from
-        return Response(self.serializer_class({"olx": new_olx_str, "version_num": version_num}).data)
+        return Response(LibraryXBlockOlxSerializer({"olx": new_olx_str, "version_num": version_num}).data)


 @view_auth_classes()
@@ -359,25 +358,6 @@ def post(self, request, usage_key_str) -> Response:
         return Response(None, status=status.HTTP_204_NO_CONTENT)


-@method_decorator(non_atomic_requests, name="dispatch")
-@view_auth_classes()
-class LibraryBlockHierarchy(GenericAPIView):
-    """
-    View to return the full hierarchy of containers that contain a library block.
-    """
-    serializer_class = serializers.ContainerHierarchySerializer
-
-    @convert_exceptions
-    def get(self, request, usage_key_str) -> Response:
-        """
-        Fetches and returns the full container hierarchy for the given library block.
-        """
-        key = LibraryUsageLocatorV2.from_string(usage_key_str)
-        api.require_permission_for_library_key(key.lib_key, request.user, permissions.CAN_VIEW_THIS_CONTENT_LIBRARY)
-        hierarchy = api.get_library_object_hierarchy(key)
-        return Response(self.serializer_class(hierarchy).data)
-
-
 def get_component_version_asset(request, component_version_uuid, asset_path):
     """
     Serves static assets associated with particular Component versions.
diff --git a/openedx/core/djangoapps/content_libraries/rest_api/serializers.py b/openedx/core/djangoapps/content_libraries/rest_api/serializers.py
index c0bf07d087fc..38765f0b320f 100644
--- a/openedx/core/djangoapps/content_libraries/rest_api/serializers.py
+++ b/openedx/core/djangoapps/content_libraries/rest_api/serializers.py
@@ -2,33 +2,29 @@
 Serializers for the content libraries REST API
 """
 # pylint: disable=abstract-method
-import json
-import logging
-
 from django.core.validators import validate_unicode_slug
-from opaque_keys import InvalidKeyError, OpaqueKey
-from opaque_keys.edx.locator import LibraryContainerLocator, LibraryUsageLocatorV2
-from openedx_learning.api.authoring_models import Collection, LearningPackage
 from rest_framework import serializers
 from rest_framework.exceptions import ValidationError
-from user_tasks.models import UserTaskStatus

-from openedx.core.djangoapps.content_libraries.tasks import LibraryRestoreTask
-from openedx.core.djangoapps.content_libraries import api
+from opaque_keys import OpaqueKey
+from opaque_keys.edx.locator import LibraryContainerLocator, LibraryUsageLocatorV2
+from opaque_keys import InvalidKeyError
+
+from openedx_learning.api.authoring_models import Collection
 from openedx.core.djangoapps.content_libraries.api.containers import ContainerType
-from openedx.core.djangoapps.content_libraries.constants import ALL_RIGHTS_RESERVED, LICENSE_OPTIONS
+from openedx.core.djangoapps.content_libraries.constants import (
+    ALL_RIGHTS_RESERVED,
+    LICENSE_OPTIONS,
+)
 from openedx.core.djangoapps.content_libraries.models import (
-    ContentLibrary,
-    ContentLibraryBlockImportTask,
-    ContentLibraryPermission
+    ContentLibraryPermission, ContentLibraryBlockImportTask,
+    ContentLibrary
 )
 from openedx.core.lib.api.serializers import CourseKeyField
-
 from .. import permissions

-DATETIME_FORMAT = '%Y-%m-%dT%H:%M:%SZ'
-log = logging.getLogger(__name__)
+DATETIME_FORMAT = '%Y-%m-%dT%H:%M:%SZ'


 class ContentLibraryMetadataSerializer(serializers.Serializer):
@@ -45,8 +41,8 @@ class ContentLibraryMetadataSerializer(serializers.Serializer):
     slug = serializers.CharField(source="key.slug", validators=(validate_unicode_slug, ))
     title = serializers.CharField()
     description = serializers.CharField(allow_blank=True)
-    learning_package = serializers.PrimaryKeyRelatedField(queryset=LearningPackage.objects.all(), required=False)
     num_blocks = serializers.IntegerField(read_only=True)
+    version = serializers.IntegerField(read_only=True)
     last_published = serializers.DateTimeField(format=DATETIME_FORMAT, read_only=True)
     published_by = serializers.CharField(read_only=True)
     last_draft_created = serializers.DateTimeField(format=DATETIME_FORMAT, read_only=True)
@@ -76,8 +72,7 @@ def get_can_edit_library(self, obj):
             return False
         library_obj = ContentLibrary.objects.get_by_key(obj.key)
-        return api.user_has_permission_across_lib_authz_systems(
-            user, permissions.CAN_EDIT_THIS_CONTENT_LIBRARY, library_obj)
+        return user.has_perm(permissions.CAN_EDIT_THIS_CONTENT_LIBRARY, obj=library_obj)


 class ContentLibraryUpdateSerializer(serializers.Serializer):
@@ -344,6 +339,14 @@ def to_internal_value(self, value: str) -> LibraryUsageLocatorV2:
             raise ValidationError from err


+class ContentLibraryComponentKeysSerializer(serializers.Serializer):
+    """
+    Serializer for adding/removing Components to/from a Collection.
+    """
+
+    usage_keys = serializers.ListField(child=UsageKeyV2Serializer(), allow_empty=False)
+
+
 class OpaqueKeySerializer(serializers.BaseSerializer):
     """
     Serializes a OpaqueKey with the correct class.
@@ -369,14 +372,6 @@ def to_internal_value(self, value: str) -> OpaqueKey:
         raise ValidationError from err


-class ContentLibraryItemContainerKeysSerializer(serializers.Serializer):
-    """
-    Serializer for adding/removing items to/from a Container.
- """ - - usage_keys = serializers.ListField(child=OpaqueKeySerializer(), allow_empty=False) - - class ContentLibraryItemKeysSerializer(serializers.Serializer): """ Serializer for adding/removing items to/from a Collection. @@ -391,156 +386,3 @@ class ContentLibraryItemCollectionsUpdateSerializer(serializers.Serializer): """ collection_keys = serializers.ListField(child=serializers.CharField(), allow_empty=True) - - -class UnionLibraryMetadataSerializer(serializers.Serializer): - """ - Union serializer for swagger api response. - """ - - type_a = LibraryXBlockMetadataSerializer(many=True, required=False) - type_b = LibraryContainerMetadataSerializer(many=True, required=False) - - -class ContainerHierarchyMemberSerializer(serializers.Serializer): - """ - Serializer for the members of a hierarchy, which can be either Components or Containers. - """ - id = OpaqueKeySerializer() - display_name = serializers.CharField() - has_unpublished_changes = serializers.BooleanField() - - -class ContainerHierarchySerializer(serializers.Serializer): - """ - Serializer which represents the full hierarchy of containers and components that contain and are contained by a - given library container or library block. - """ - sections = serializers.ListField(child=ContainerHierarchyMemberSerializer(), allow_empty=True) - subsections = serializers.ListField(child=ContainerHierarchyMemberSerializer(), allow_empty=True) - units = serializers.ListField(child=ContainerHierarchyMemberSerializer(), allow_empty=True) - components = serializers.ListField(child=ContainerHierarchyMemberSerializer(), allow_empty=True) - object_key = OpaqueKeySerializer() - - -class LibraryBackupResponseSerializer(serializers.Serializer): - """ - Serializer for the response after requesting a backup of a content library. - """ - task_id = serializers.CharField() - - -class LibraryBackupTaskStatusSerializer(serializers.Serializer): - """ - Serializer for checking the status of a library backup task. 
- """ - state = serializers.CharField() - url = serializers.FileField(source='file', allow_null=True, use_url=True) - - -class LibraryRestoreFileSerializer(serializers.Serializer): - """ - Serializer for restoring a library from a backup file. - """ - # input only fields - file = serializers.FileField(write_only=True, help_text="A ZIP file containing a library backup.") - - # output only fields - task_id = serializers.UUIDField(read_only=True) - - def validate_file(self, value): - """ - Validate that the uploaded file is a ZIP file. - """ - if value.content_type != 'application/zip': - raise serializers.ValidationError("Only ZIP files are allowed.") - return value - - -class LibraryRestoreTaskRequestSerializer(serializers.Serializer): - """ - Serializer for requesting the status of a library restore task. - """ - task_id = serializers.UUIDField(write_only=True, help_text="The ID of the restore task to check.") - - -class RestoreSuccessDataSerializer(serializers.Serializer): - """ - Serializer for the data returned upon successful restoration of a library. - """ - learning_package_id = serializers.IntegerField(source="lp_restored_data.id") - title = serializers.CharField(source="lp_restored_data.title") - org = serializers.CharField(source="lp_restored_data.archive_org_key") - slug = serializers.CharField(source="lp_restored_data.archive_slug") - - # The `key` is a unique temporary key assigned to the learning package during the restore process, - # whereas the `archive_key` is the original key of the learning package from the backup. - # The temporary learning package key is replaced with a standard key once it is added to a content library. 
-    key = serializers.CharField(source="lp_restored_data.key")
-    archive_key = serializers.CharField(source="lp_restored_data.archive_lp_key")
-
-    containers = serializers.IntegerField(source="lp_restored_data.num_containers")
-    components = serializers.IntegerField(source="lp_restored_data.num_components")
-    collections = serializers.IntegerField(source="lp_restored_data.num_collections")
-    sections = serializers.IntegerField(source="lp_restored_data.num_sections")
-    subsections = serializers.IntegerField(source="lp_restored_data.num_subsections")
-    units = serializers.IntegerField(source="lp_restored_data.num_units")
-
-    created_on_server = serializers.CharField(source="backup_metadata.original_server", required=False)
-    created_at = serializers.DateTimeField(source="backup_metadata.created_at", format=DATETIME_FORMAT)
-    created_by = serializers.SerializerMethodField()
-
-    def get_created_by(self, obj):
-        """
-        Get the user information of the archive creator, if available.
-
-        The information is stored in the backup metadata of the archive and references
-        a user that may not exist in the system where the restore is being performed.
-        """
-        username = obj["backup_metadata"].get("created_by")
-        email = obj["backup_metadata"].get("created_by_email")
-        return {"username": username, "email": email}
-
-
-class LibraryRestoreTaskResultSerializer(serializers.Serializer):
-    """
-    Serializer for the result of a library restore task.
-    """
-    state = serializers.CharField()
-    result = RestoreSuccessDataSerializer(required=False, allow_null=True, default=None)
-    error = serializers.CharField(required=False, allow_blank=True, default=None)
-    error_log = serializers.FileField(source='error_log_url', allow_null=True, use_url=True, default=None)
-
-    @classmethod
-    def from_task_status(cls, task_status, request):
-        """Build serializer input from task status object."""
-
-        # If the task did not complete, just return the state.
-        if task_status.state not in {UserTaskStatus.SUCCEEDED, UserTaskStatus.FAILED}:
-            return cls({
-                "state": task_status.state,
-            })
-
-        artifact_name = LibraryRestoreTask.ARTIFACT_NAMES.get(task_status.state, '')
-        artifact = task_status.artifacts.filter(name=artifact_name).first()
-
-        # If the task failed, include the log artifact if it exists
-        if task_status.state == UserTaskStatus.FAILED:
-            return cls({
-                "state": UserTaskStatus.FAILED,
-                "error": "Library restore failed. See error log for details.",
-                "error_log_url": artifact.file if artifact else None,
-            }, context={'request': request})
-
-        if task_status.state == UserTaskStatus.SUCCEEDED:
-            input_data = {
-                "state": UserTaskStatus.SUCCEEDED,
-            }
-            try:
-                result = json.loads(artifact.text) if artifact else {}
-                input_data["result"] = result
-            except json.JSONDecodeError:
-                log.error("Failed to decode JSON from artifact (%s): %s", artifact.id, artifact.text)
-                input_data["error"] = f'Could not decode artifact JSON. Artifact Text: {artifact.text}'
-
-        return cls(input_data)
diff --git a/openedx/core/djangoapps/content_libraries/tasks.py b/openedx/core/djangoapps/content_libraries/tasks.py
index c54779a8e621..b472126e8ce7 100644
--- a/openedx/core/djangoapps/content_libraries/tasks.py
+++ b/openedx/core/djangoapps/content_libraries/tasks.py
@@ -16,52 +16,38 @@
 """
 from __future__ import annotations

-from io import StringIO
 import logging
-import os
-from datetime import datetime
-from tempfile import mkdtemp, NamedTemporaryFile
-import json
-import shutil
-
-from django.core.files.base import ContentFile
-from django.contrib.auth import get_user_model
-from django.core.serializers.json import DjangoJSONEncoder
-from django.conf import settings
+
 from celery import shared_task
-from celery.utils.log import get_task_logger
 from celery_utils.logged_task import LoggedTask
-from django.core.files import File
-from django.utils.text import slugify
-from edx_django_utils.monitoring import (
-    set_code_owner_attribute,
-    set_code_owner_attribute_from_module,
-    set_custom_attribute
-)
+from celery.utils.log import get_task_logger
+from edx_django_utils.monitoring import set_code_owner_attribute, set_code_owner_attribute_from_module
 from opaque_keys.edx.keys import CourseKey
 from opaque_keys.edx.locator import (
     BlockUsageLocator,
     LibraryCollectionLocator,
     LibraryContainerLocator,
-    LibraryLocatorV2
+    LibraryLocatorV2,
+)
+from openedx_learning.api import authoring as authoring_api
+from openedx_learning.api.authoring_models import DraftChangeLog, PublishLog
+from openedx_events.content_authoring.data import (
+    LibraryBlockData,
+    LibraryCollectionData,
+    LibraryContainerData,
 )
-from openedx_events.content_authoring.data import LibraryBlockData, LibraryCollectionData, LibraryContainerData
 from openedx_events.content_authoring.signals import (
     LIBRARY_BLOCK_CREATED,
     LIBRARY_BLOCK_DELETED,
-    LIBRARY_BLOCK_PUBLISHED,
     LIBRARY_BLOCK_UPDATED,
+    LIBRARY_BLOCK_PUBLISHED,
     LIBRARY_COLLECTION_UPDATED,
     LIBRARY_CONTAINER_CREATED,
     LIBRARY_CONTAINER_DELETED,
+    LIBRARY_CONTAINER_UPDATED,
     LIBRARY_CONTAINER_PUBLISHED,
-    LIBRARY_CONTAINER_UPDATED
 )
-from openedx_learning.api import authoring as authoring_api
-from openedx_learning.api.authoring import create_zip_file as create_lib_zip_file
-from openedx_learning.api.authoring_models import DraftChangeLog, PublishLog
-from path import Path
-from user_tasks.models import UserTaskArtifact
+
 from user_tasks.tasks import UserTask, UserTaskStatus
 from xblock.fields import Scope
@@ -73,18 +59,12 @@
 from xmodule.modulestore.exceptions import ItemNotFoundError
 from xmodule.modulestore.mixed import MixedModuleStore

-from cms.djangoapps.contentstore.storage import course_import_export_storage
-
 from . import api
 from .models import ContentLibraryBlockImportTask

 log = logging.getLogger(__name__)
 TASK_LOGGER = get_task_logger(__name__)

-User = get_user_model()
-
-DATETIME_FORMAT = '%Y-%m-%dT%H:%M:%SZ'  # Should match serializer format. Redefined to avoid circular import.
-

 @shared_task(base=LoggedTask)
 @set_code_owner_attribute
@@ -113,30 +93,18 @@ def send_events_after_publish(publish_log_pk: int, library_key_str: str) -> None
             usage_key = api.library_component_usage_key(library_key, record.entity.component)
             # Note that this item may be newly created, updated, or even deleted - but all we care about for this event
             # is that the published version is now different. Only for draft changes do we send differentiated events.
-
-            # .. event_implemented_name: LIBRARY_BLOCK_PUBLISHED
-            # .. event_type: org.openedx.content_authoring.library_block.published.v1
             LIBRARY_BLOCK_PUBLISHED.send_event(
                 library_block=LibraryBlockData(library_key=library_key, usage_key=usage_key)
             )
             # Publishing a container will auto-publish its children, but publishing a single component or all changes
             # in the library will NOT usually include any parent containers. But we do need to notify listeners that the
             # parent container(s) have changed, e.g. so the search index can update the "has_unpublished_changes"
-            for parent_container in api.get_containers_contains_item(usage_key):
+            for parent_container in api.get_containers_contains_component(usage_key):
                 affected_containers.add(parent_container.container_key)
             # TODO: should this be a CONTAINER_CHILD_PUBLISHED event instead of CONTAINER_PUBLISHED ?
         elif hasattr(record.entity, "container"):
             container_key = api.library_container_locator(library_key, record.entity.container)
             affected_containers.add(container_key)
-
-            try:
-                # We do need to notify listeners that the parent container(s) have changed,
-                # e.g. so the search index can update the "has_unpublished_changes"
-                for parent_container in api.get_containers_contains_item(container_key):
-                    affected_containers.add(parent_container.container_key)
-            except api.ContentLibraryContainerNotFound:
-                # The deleted children remains in the entity, so, in this case, the container may not be found.
-                pass
         else:
             log.warning(
                 f"PublishableEntity {record.entity.pk} / {record.entity.key} was modified during publish operation "
@@ -144,8 +112,6 @@ def send_events_after_publish(publish_log_pk: int, library_key_str: str) -> None
             )

     for container_key in affected_containers:
-        # .. event_implemented_name: LIBRARY_CONTAINER_PUBLISHED
-        # .. event_type: org.openedx.content_authoring.content_library.container.published.v1
         LIBRARY_CONTAINER_PUBLISHED.send_event(
             library_container=LibraryContainerData(container_key=container_key)
         )
@@ -211,20 +177,11 @@ def send_events_after_revert(draft_change_log_id: int, library_key_str: str) ->
             event = LIBRARY_BLOCK_DELETED
         elif is_undeleted:
             event = LIBRARY_BLOCK_CREATED
-
-        # .. event_implemented_name: LIBRARY_BLOCK_UPDATED
-        # .. event_type: org.openedx.content_authoring.library_block.updated.v1
-
-        # .. event_implemented_name: LIBRARY_BLOCK_DELETED
-        # .. event_type: org.openedx.content_authoring.library_block.deleted.v1
-
-        # .. event_implemented_name: LIBRARY_BLOCK_CREATED
-        # .. event_type: org.openedx.content_authoring.library_block.created.v1
         event.send_event(library_block=LibraryBlockData(library_key=library_key, usage_key=usage_key))
         # If any containers contain this component, their child list / component count may need to be updated
         # e.g. if this was a newly created component in the container and is now deleted, or this was deleted and
         # is now restored.
-        for parent_container in api.get_containers_contains_item(usage_key):
+        for parent_container in api.get_containers_contains_component(usage_key):
             updated_container_keys.add(parent_container.container_key)

         # TODO: do we also need to send CONTENT_OBJECT_ASSOCIATIONS_CHANGED for this component, or is
@@ -254,8 +211,6 @@ def send_events_after_revert(draft_change_log_id: int, library_key_str: str) ->
         affected_collection_keys.add(collection_key)

     for container_key in deleted_container_keys:
-        # .. event_implemented_name: LIBRARY_CONTAINER_DELETED
-        # .. event_type: org.openedx.content_authoring.content_library.container.deleted.v1
         LIBRARY_CONTAINER_DELETED.send_event(
             library_container=LibraryContainerData(container_key=container_key)
         )
@@ -263,22 +218,16 @@ def send_events_after_revert(draft_change_log_id: int, library_key_str: str) ->
         created_container_keys.discard(container_key)

     for container_key in created_container_keys:
-        # .. event_implemented_name: LIBRARY_CONTAINER_CREATED
-        # .. event_type: org.openedx.content_authoring.content_library.container.created.v1
         LIBRARY_CONTAINER_CREATED.send_event(
             library_container=LibraryContainerData(container_key=container_key)
         )

     for container_key in updated_container_keys:
-        # .. event_implemented_name: LIBRARY_CONTAINER_UPDATED
-        # .. event_type: org.openedx.content_authoring.content_library.container.updated.v1
         LIBRARY_CONTAINER_UPDATED.send_event(
             library_container=LibraryContainerData(container_key=container_key)
         )

     for collection_key in affected_collection_keys:
-        # .. event_implemented_name: LIBRARY_COLLECTION_UPDATED
-        # .. event_type: org.openedx.content_authoring.content_library.collection.updated.v1
         LIBRARY_COLLECTION_UPDATED.send_event(
             library_collection=LibraryCollectionData(collection_key=collection_key)
         )
@@ -506,186 +455,3 @@ def _copy_overrides(
         dest_block=store.get_item(dest_child_key),
     )
     store.update_item(dest_block, user_id)
-
-
-class LibraryBackupTask(UserTask):  # pylint: disable=abstract-method
-    """
-    Base class for tasks related with Library backup functionality.
-    """
-
-    @classmethod
-    def generate_name(cls, arguments_dict) -> str:
-        """
-        Create a name for this particular backup task instance.
-
-        Should be both:
-        a. semi human-friendly
-        b. something we can query in order to determine whether the library has a task in progress
-
-        Arguments:
-            arguments_dict (dict): The arguments given to the task function
-
-        Returns:
-            str: The generated name
-        """
-        key = arguments_dict['library_key_str']
-        return f'Backup of {key}'
-
-
-@shared_task(base=LibraryBackupTask, bind=True)
-# Note: The decorator @set_code_owner_attribute cannot be used here because the UserTaskMixin
-# does stack inspection and can't handle additional decorators.
-def backup_library(self, user_id: int, library_key_str: str) -> None:
-    """
-    Export a library to a .zip archive and prepare it for download.
-    Possible Task states:
-    - Pending: Task is created but not started yet.
-    - Exporting: Task is running and the library is being exported.
-    - Succeeded: Task completed successfully and the exported file is available for download.
-    - Failed: Task failed and the export did not complete.
-    """
-    ensure_cms("backup_library may only be executed in a CMS context")
-    set_code_owner_attribute_from_module(__name__)
-    library_key = LibraryLocatorV2.from_string(library_key_str)
-
-    try:
-        self.status.set_state('Exporting')
-        set_custom_attribute("exporting_started", str(library_key))
-
-        root_dir = Path(mkdtemp())
-        sanitized_lib_key = str(library_key).replace(":", "-")
-        sanitized_lib_key = slugify(sanitized_lib_key, allow_unicode=True)
-        timestamp = datetime.now().strftime("%Y-%m-%d-%H%M%S")
-        filename = f'{sanitized_lib_key}-{timestamp}.zip'
-        file_path = os.path.join(root_dir, filename)
-        user = User.objects.get(id=user_id)
-        origin_server = getattr(settings, 'CMS_BASE', None)
-        create_lib_zip_file(lp_key=str(library_key), path=file_path, user=user, origin_server=origin_server)
-        set_custom_attribute("exporting_completed", str(library_key))
-
-        with open(file_path, 'rb') as zipfile:
-            artifact = UserTaskArtifact(status=self.status, name='Output')
-            artifact.file.save(name=os.path.basename(zipfile.name), content=File(zipfile))
-            artifact.save()
-    except Exception as exception:  # pylint: disable=broad-except
-        TASK_LOGGER.exception('Error exporting library %s', library_key, exc_info=True)
-        if self.status.state != UserTaskStatus.FAILED:
-            self.status.fail({'raw_error_msg': str(exception)})
-
-
-class LibraryRestoreLoadError(Exception):
-    def __init__(self, message, logfile=None):
-        super().__init__(message)
-        self.logfile = logfile
-
-
-class LibraryRestoreTask(UserTask):
-    """
-    Base class for library restore tasks.
-    """
-
-    ARTIFACT_NAMES = {
-        UserTaskStatus.FAILED: 'Error log',
-        UserTaskStatus.SUCCEEDED: 'Library Restore',
-    }
-
-    ERROR_LOG_ARTIFACT_NAME = 'Error log'
-
-    @classmethod
-    def generate_name(cls, arguments_dict):
-        storage_path = arguments_dict['storage_path']
-        return f'learning package restore of {storage_path}'
-
-    def fail_with_error_log(self, logfile) -> None:
-        """
-        Helper method to create an error log artifact and fail the task.
-
-        Args:
-            logfile (io.StringIO): The error log content
-        """
-        # Prepare the error log to be saved as a file
-        error_log_file = ContentFile(logfile.getvalue().encode("utf-8"))
-
-        # Save the error log as an artifact
-        artifact = UserTaskArtifact(status=self.status, name=self.ERROR_LOG_ARTIFACT_NAME)
-        artifact.file.save(name=f'{self.status.task_id}-error.log', content=error_log_file)
-        artifact.save()
-
-        self.status.fail(json.dumps({'error': 'Error(s) restoring learning package'}))
-
-    def load_learning_package(self, storage_path, user):
-        """
-        Load learning package from a backup file in storage.
-
-        Args:
-            storage_path (str): The path to the backup file in storage
-
-        Returns:
-            dict: The result of loading the learning package, including status and info
-        Raises:
-            LibraryRestoreLoadError: If there is an error loading the learning package
-        """
-        # First ensure the backup file exists
-        if not course_import_export_storage.exists(storage_path):
-            raise LibraryRestoreLoadError(f'Uploaded file {storage_path} not found')
-
-        # Temporarily copy the file locally, and then load the learning package from it
-        with NamedTemporaryFile(suffix=".zip") as tmp_file:
-            with course_import_export_storage.open(storage_path, "rb") as storage_file:
-                shutil.copyfileobj(storage_file, tmp_file)
-                tmp_file.flush()
-
-            TASK_LOGGER.info('Restoring learning package from temporary file %s', tmp_file.name)
-
-            result = authoring_api.load_learning_package(tmp_file.name, user=user)
-
-            # If there was an error during the load, fail the task with the error log
-            if result.get("status") == "error":
-                raise LibraryRestoreLoadError(
-                    "Error(s) loading learning package",
-                    logfile=result.get("log_file_error")
-                )
-
-        return result
-
-
-@shared_task(base=LibraryRestoreTask, bind=True)
-def restore_library(self, user_id, storage_path):
-    """
-    Restore a learning package from a backup file.
- """ - ensure_cms("restore_library may only be executed in a CMS context") - set_code_owner_attribute_from_module(__name__) - - TASK_LOGGER.info('Starting restore of learning package from %s', storage_path) - - try: - # Load the learning package from the backup file - user = User.objects.get(id=user_id) - result = self.load_learning_package(storage_path, user=user) - learning_package_data = result.get("lp_restored_data", {}) - - TASK_LOGGER.info( - 'Restored learning package (id: %s) with key %s', - learning_package_data.get('id'), - learning_package_data.get('key') - ) - - # Save the restore details as an artifact in JSON format - restore_data = json.dumps(result, cls=DjangoJSONEncoder) - - UserTaskArtifact.objects.create( - status=self.status, - name=self.ARTIFACT_NAMES[UserTaskStatus.SUCCEEDED], - text=restore_data - ) - TASK_LOGGER.info('Finished restore of learning package from %s', storage_path) - - except Exception as exc: # pylint: disable=broad-except - TASK_LOGGER.exception('Error restoring learning package from %s', storage_path) - logfile = getattr(exc, 'logfile', StringIO("Unexpected error during library restore: " + str(exc))) - self.fail_with_error_log(logfile) - finally: - # Make sure to clean up the uploaded file from storage - course_import_export_storage.delete(storage_path) - TASK_LOGGER.info('Deleted uploaded file %s after restore', storage_path) diff --git a/openedx/core/djangoapps/content_libraries/tests/base.py b/openedx/core/djangoapps/content_libraries/tests/base.py index 9ccd33f942c2..e5a9f5f12ec9 100644 --- a/openedx/core/djangoapps/content_libraries/tests/base.py +++ b/openedx/core/djangoapps/content_libraries/tests/base.py @@ -32,23 +32,16 @@ URL_LIB_TEAM_USER = URL_LIB_TEAM + 'user/{username}/' # Add/edit/remove a user's permission to use this library URL_LIB_TEAM_GROUP = URL_LIB_TEAM + 'group/{group_name}/' # Add/edit/remove a group's permission to use this library URL_LIB_PASTE_CLIPBOARD = URL_LIB_DETAIL + 'paste_clipboard/' # 
Paste user clipboard (POST) containing Xblock data -URL_LIB_BACKUP = URL_LIB_DETAIL + 'backup/' # Start a backup task for this library -URL_LIB_BACKUP_GET = URL_LIB_BACKUP + '?{query_params}' # Get status on a backup task for this library -URL_LIB_RESTORE = URL_PREFIX + 'restore/' # Restore a library from a learning package backup file -URL_LIB_RESTORE_GET = URL_LIB_RESTORE + '?{query_params}' # Get status/result of a library restore task URL_LIB_BLOCK = URL_PREFIX + 'blocks/{block_key}/' # Get data about a block, or delete it URL_LIB_BLOCK_PUBLISH = URL_LIB_BLOCK + 'publish/' # Publish changes from a specified XBlock URL_LIB_BLOCK_OLX = URL_LIB_BLOCK + 'olx/' # Get or set the OLX of the specified XBlock URL_LIB_BLOCK_ASSETS = URL_LIB_BLOCK + 'assets/' # List the static asset files of the specified XBlock URL_LIB_BLOCK_ASSET_FILE = URL_LIB_BLOCK + 'assets/{file_name}' # Get, delete, or upload a specific static asset file -URL_LIB_BLOCK_HIERARCHY = URL_LIB_BLOCK + 'hierarchy/' # Get a library block's full hierarchy URL_LIB_CONTAINER = URL_PREFIX + 'containers/{container_key}/' # Get a container in this library -URL_LIB_CONTAINER_CHILDREN = URL_LIB_CONTAINER + 'children/' # Get, add or delete a component in this container -URL_LIB_CONTAINER_HIERARCHY = URL_LIB_CONTAINER + 'hierarchy/' # Get a container's full hierarchy +URL_LIB_CONTAINER_COMPONENTS = URL_LIB_CONTAINER + 'children/' # Get, add or delete a component in this container URL_LIB_CONTAINER_RESTORE = URL_LIB_CONTAINER + 'restore/' # Restore a deleted container URL_LIB_CONTAINER_COLLECTIONS = URL_LIB_CONTAINER + 'collections/' # Handle associated collections URL_LIB_CONTAINER_PUBLISH = URL_LIB_CONTAINER + 'publish/' # Publish changes to the specified container + children -URL_LIB_CONTAINER_COPY = URL_LIB_CONTAINER + 'copy/' # Copy the specified container to the clipboard URL_LIB_COLLECTION = URL_LIB_COLLECTIONS + '{collection_key}/' # Get a collection in this library URL_LIB_COLLECTION_ITEMS = 
 URL_LIB_COLLECTION + 'items/'  # Get a collection in this library
@@ -141,21 +134,18 @@ def as_user(self, user):

     def _create_library(
         self, slug, title, description="", org=None,
-        license_type=ALL_RIGHTS_RESERVED, expect_response=200, learning_package=None
+        license_type=ALL_RIGHTS_RESERVED, expect_response=200,
     ):
         """ Create a library """
         if org is None:
             org = self.organization.short_name
-        data = {
+        return self._api('post', URL_LIB_CREATE, {
             "org": org,
             "slug": slug,
             "title": title,
             "description": description,
             "license": license_type,
-        }
-        if learning_package is not None:
-            data["learning_package"] = learning_package
-        return self._api('post', URL_LIB_CREATE, data, expect_response)
+        }, expect_response)

     def _list_libraries(self, query_params_dict=None, expect_response=200):
         """ List libraries """
@@ -326,32 +316,6 @@ def _paste_clipboard_content_in_library(self, lib_key, expect_response=200):
         url = URL_LIB_PASTE_CLIPBOARD.format(lib_key=lib_key)
         return self._api('post', url, {}, expect_response)

-    def _start_library_backup_task(self, lib_key, expect_response=200):
-        """ Start a backup task for this library """
-        url = URL_LIB_BACKUP.format(lib_key=lib_key)
-        return self._api('post', url, {}, expect_response)
-
-    def _get_library_backup_task(self, lib_key, task_id, expect_response=200):
-        """ Get the status of a backup task for this library """
-        query_params = urlencode({"task_id": task_id})
-        url = URL_LIB_BACKUP_GET.format(lib_key=lib_key, query_params=query_params)
-        return self._api('get', url, None, expect_response)
-
-    def _start_library_restore_task(self, file, expect_response=200):
-        """ Start a library restore task from a backup file """
-        url = URL_LIB_RESTORE
-        data = {"file": file}
-        response = self.client.post(url, data, format='multipart')
-        assert response.status_code == expect_response, \
-            f'Unexpected response code {response.status_code}:\n{getattr(response, "data", "(no data)")}'
-        return response.data
-
-    def _get_library_restore_task(self, task_id, expect_response=200):
-        """ Get the status/result of a library restore task """
-        query_params = urlencode({"task_id": task_id})
-        url = URL_LIB_RESTORE_GET.format(query_params=query_params)
-        return self._api('get', url, None, expect_response)
-
     def _render_block_view(self, block_key, view_name, version=None, expect_response=200):
         """
         Render an XBlock's view in the active application's runtime.
@@ -432,53 +396,53 @@ def _restore_container(self, container_key: ContainerKey | str, expect_response=
         """ Restore a deleted a container (unit etc.) """
         return self._api('post', URL_LIB_CONTAINER_RESTORE.format(container_key=container_key), None, expect_response)

-    def _get_container_children(self, container_key: ContainerKey | str, expect_response=200):
-        """ Get container children"""
+    def _get_container_components(self, container_key: ContainerKey | str, expect_response=200):
+        """ Get container components"""
         return self._api(
             'get',
-            URL_LIB_CONTAINER_CHILDREN.format(container_key=container_key),
+            URL_LIB_CONTAINER_COMPONENTS.format(container_key=container_key),
             None,
             expect_response
         )

-    def _add_container_children(
+    def _add_container_components(
         self,
         container_key: ContainerKey | str,
         children_ids: list[str],
         expect_response=200,
     ):
-        """ Add container children"""
+        """ Add container components"""
         return self._api(
             'post',
-            URL_LIB_CONTAINER_CHILDREN.format(container_key=container_key),
+            URL_LIB_CONTAINER_COMPONENTS.format(container_key=container_key),
             {'usage_keys': children_ids},
             expect_response
         )

-    def _remove_container_children(
+    def _remove_container_components(
         self,
         container_key: ContainerKey | str,
         children_ids: list[str],
         expect_response=200,
     ):
-        """ Remove container children"""
+        """ Remove container components"""
         return self._api(
             'delete',
-            URL_LIB_CONTAINER_CHILDREN.format(container_key=container_key),
+            URL_LIB_CONTAINER_COMPONENTS.format(container_key=container_key),
             {'usage_keys': children_ids},
             expect_response
         )

-    def _patch_container_children(
+    def _patch_container_components(
         self,
         container_key: ContainerKey | str,
         children_ids: list[str],
         expect_response=200,
     ):
-        """ Update container children"""
+        """ Update container components"""
         return self._api(
             'patch',
-            URL_LIB_CONTAINER_CHILDREN.format(container_key=container_key),
+            URL_LIB_CONTAINER_COMPONENTS.format(container_key=container_key),
             {'usage_keys': children_ids},
             expect_response
         )
@@ -501,31 +465,6 @@ def _publish_container(self, container_key: ContainerKey | str, expect_response=
         """ Publish all changes in the specified container + children """
         return self._api('post', URL_LIB_CONTAINER_PUBLISH.format(container_key=container_key), None, expect_response)

-    def _copy_container(self, container_key: ContainerKey | str, expect_response=200):
-        """ Copy the specified container to the clipboard """
-        return self._api('post', URL_LIB_CONTAINER_COPY.format(container_key=container_key), None, expect_response)
-
-    @staticmethod
-    def _hierarchy_member(obj) -> dict:
-        """
-        Returns the subset of metadata fields used by the container hierarchy.
-        """
-        return {
-            "id": obj["id"],
-            "display_name": obj["display_name"],
-            "has_unpublished_changes": obj["has_unpublished_changes"],
-        }
-
-    def _get_block_hierarchy(self, block_key, expect_response=200):
-        """ Returns the hierarchy of containers that contain the given block """
-        url = URL_LIB_BLOCK_HIERARCHY.format(block_key=block_key)
-        return self._api('get', url, None, expect_response)
-
-    def _get_container_hierarchy(self, container_key, expect_response=200):
-        """ Returns the hierarchy of containers that contain and are contained by the given container """
-        url = URL_LIB_CONTAINER_HIERARCHY.format(container_key=container_key)
-        return self._api('get', url, None, expect_response)
-
     def _create_collection(
         self,
         lib_key: LibraryLocatorV2 | str,
diff --git a/openedx/core/djangoapps/content_libraries/tests/test_api.py b/openedx/core/djangoapps/content_libraries/tests/test_api.py
index 1c78597db970..3a1121da38c2 100644
--- a/openedx/core/djangoapps/content_libraries/tests/test_api.py
+++ b/openedx/core/djangoapps/content_libraries/tests/test_api.py
@@ -4,18 +4,15 @@
 import base64
 import hashlib
-import uuid
 from unittest import mock

 from django.test import TestCase
-from user_tasks.models import UserTaskStatus

 from opaque_keys.edx.keys import (
     CourseKey,
     UsageKey,
-    UsageKeyV2,
 )
-from opaque_keys.edx.locator import LibraryContainerLocator, LibraryLocatorV2, LibraryUsageLocatorV2
+from opaque_keys.edx.locator import LibraryContainerLocator, LibraryLocatorV2
 from openedx_events.content_authoring.data import (
     ContentObjectChangedData,
     LibraryCollectionData,
@@ -28,10 +25,8 @@
     LIBRARY_COLLECTION_UPDATED,
     LIBRARY_CONTAINER_UPDATED,
 )
-from openedx_authz.api.users import get_user_role_assignments_in_scope
 from openedx_learning.api import authoring as authoring_api

-from common.djangoapps.student.tests.factories import UserFactory
 from .. import api
 from ..models import ContentLibrary
 from .base import ContentLibrariesRestApiTest
@@ -270,7 +265,7 @@ class ContentLibraryCollectionsTest(ContentLibrariesRestApiTest):
     Same guidelines as ContentLibrariesTestCase.
     """

-    def setUp(self) -> None:
+    def setUp(self):
         super().setUp()

         # Create Content Libraries
@@ -317,20 +312,12 @@ def setUp(self) -> None:
             "unit", 'unit-1', 'Unit 1'
         )

-        # Create a subsection container
-        self.subsection1 = api.create_container(
-            self.lib1.library_key,
-            api.ContainerType.Subsection,
-            'subsection-1',
-            'Subsection 1',
-            None,
-        )
-
         # Create some library blocks in lib2
         self.lib2_problem_block = self._add_block_to_library(
             self.lib2.library_key, "problem", "problem2",
         )

-    def test_create_library_collection(self) -> None:
+    def test_create_library_collection(self):
         event_receiver = mock.Mock()
         LIBRARY_COLLECTION_CREATED.connect(event_receiver)

@@ -347,8 +334,7 @@ def test_create_library_collection(self) -> None:
         assert collection.created_by == self.user

         assert event_receiver.call_count == 1
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[0].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": LIBRARY_COLLECTION_CREATED,
                 "sender": None,
@@ -359,9 +345,10 @@ def test_create_library_collection(self) -> None:
                     ),
                 ),
             },
+            event_receiver.call_args_list[0].kwargs,
         )

-    def test_create_library_collection_invalid_library(self) -> None:
+    def test_create_library_collection_invalid_library(self):
         library_key = LibraryLocatorV2.from_string("lib:INVALID:test-lib-does-not-exist")
         with self.assertRaises(api.ContentLibraryNotFound) as exc:
             api.create_library_collection(
@@ -370,7 +357,7 @@ def test_create_library_collection_invalid_library(self) -> None:
                 title="Collection 3",
             )

-    def test_update_library_collection(self) -> None:
+    def test_update_library_collection(self):
         event_receiver = mock.Mock()
         LIBRARY_COLLECTION_UPDATED.connect(event_receiver)

@@ -385,8 +372,7 @@ def test_update_library_collection(self) -> None:
         assert self.col1.created_by == self.user

         assert event_receiver.call_count == 1
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[0].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": LIBRARY_COLLECTION_UPDATED,
                 "sender": None,
@@ -397,20 +383,20 @@ def test_update_library_collection(self) -> None:
                     ),
                 ),
             },
+            event_receiver.call_args_list[0].kwargs,
         )

-    def test_update_library_collection_wrong_library(self) -> None:
+    def test_update_library_collection_wrong_library(self):
         with self.assertRaises(api.ContentLibraryCollectionNotFound) as exc:
             api.update_library_collection(
                 self.lib1.library_key,
                 self.col2.key,
             )

-    def test_delete_library_collection(self) -> None:
+    def test_delete_library_collection(self):
         event_receiver = mock.Mock()
         LIBRARY_COLLECTION_DELETED.connect(event_receiver)

-        assert self.lib1.learning_package_id is not None
         authoring_api.delete_collection(
             self.lib1.learning_package_id,
             self.col1.key,
@@ -418,8 +404,7 @@ def test_delete_library_collection(self) -> None:
         )

         assert event_receiver.call_count == 1
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[0].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": LIBRARY_COLLECTION_DELETED,
                 "sender": None,
@@ -430,17 +415,18 @@ def test_delete_library_collection(self) -> None:
                     ),
                 ),
             },
+            event_receiver.call_args_list[0].kwargs,
         )

-    def test_update_library_collection_items(self) -> None:
+    def test_update_library_collection_items(self):
         assert not list(self.col1.entities.all())

         self.col1 = api.update_library_collection_items(
             self.lib1.library_key,
             self.col1.key,
             opaque_keys=[
-                LibraryUsageLocatorV2.from_string(self.lib1_problem_block["id"]),
-                LibraryUsageLocatorV2.from_string(self.lib1_html_block["id"]),
+                UsageKey.from_string(self.lib1_problem_block["id"]),
+                UsageKey.from_string(self.lib1_html_block["id"]),
                 LibraryContainerLocator.from_string(self.unit1["id"]),
             ],
         )
@@ -450,13 +436,13 @@ def test_update_library_collection_items(self) -> None:
             self.lib1.library_key,
             self.col1.key,
             opaque_keys=[
-                LibraryUsageLocatorV2.from_string(self.lib1_html_block["id"]),
+                UsageKey.from_string(self.lib1_html_block["id"]),
             ],
             remove=True,
         )
         assert len(self.col1.entities.all()) == 2

-    def test_update_library_collection_components_event(self) -> None:
+    def test_update_library_collection_components_event(self):
         """
         Check that a CONTENT_OBJECT_ASSOCIATIONS_CHANGED event is raised for each added/removed component.
         """
@@ -468,15 +454,14 @@ def test_update_library_collection_components_event(self) -> None:
             self.lib1.library_key,
             self.col1.key,
             opaque_keys=[
-                LibraryUsageLocatorV2.from_string(self.lib1_problem_block["id"]),
-                LibraryUsageLocatorV2.from_string(self.lib1_html_block["id"]),
+                UsageKey.from_string(self.lib1_problem_block["id"]),
+                UsageKey.from_string(self.lib1_html_block["id"]),
                 LibraryContainerLocator.from_string(self.unit1["id"]),
             ],
         )

         assert event_receiver.call_count == 4
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[0].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
                 "sender": None,
@@ -485,9 +470,9 @@ def test_update_library_collection_components_event(self) -> None:
                     changes=["collections"],
                 ),
             },
+            event_receiver.call_args_list[0].kwargs,
         )
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[1].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
                 "sender": None,
@@ -496,9 +481,9 @@ def test_update_library_collection_components_event(self) -> None:
                     changes=["collections"],
                 ),
             },
+            event_receiver.call_args_list[1].kwargs,
         )
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[2].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
                 "sender": None,
@@ -507,9 +492,9 @@ def test_update_library_collection_components_event(self) -> None:
                     changes=["collections"],
                 ),
             },
+            event_receiver.call_args_list[2].kwargs,
        )
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[3].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": LIBRARY_COLLECTION_UPDATED,
                 "sender": None,
@@ -520,49 +505,45 @@ def test_update_library_collection_components_event(self) -> None:
                     ),
                 ),
             },
+            event_receiver.call_args_list[3].kwargs,
         )

-    def test_update_collection_components_from_wrong_library(self) -> None:
+    def test_update_collection_components_from_wrong_library(self):
         with self.assertRaises(api.ContentLibraryBlockNotFound) as exc:
             api.update_library_collection_items(
                 self.lib2.library_key,
                 self.col2.key,
                 opaque_keys=[
-                    LibraryUsageLocatorV2.from_string(self.lib1_problem_block["id"]),
-                    LibraryUsageLocatorV2.from_string(self.lib1_html_block["id"]),
+                    UsageKey.from_string(self.lib1_problem_block["id"]),
+                    UsageKey.from_string(self.lib1_html_block["id"]),
                     LibraryContainerLocator.from_string(self.unit1["id"]),
                 ],
             )
        assert self.lib1_problem_block["id"] in str(exc.exception)

-    def test_set_library_component_collections(self) -> None:
+    def test_set_library_component_collections(self):
         event_receiver = mock.Mock()
         CONTENT_OBJECT_ASSOCIATIONS_CHANGED.connect(event_receiver)
         collection_update_event_receiver = mock.Mock()
         LIBRARY_COLLECTION_UPDATED.connect(collection_update_event_receiver)
         assert not list(self.col2.entities.all())
-        component = api.get_component_from_usage_key(UsageKeyV2.from_string(self.lib2_problem_block["id"]))
+        component = api.get_component_from_usage_key(UsageKey.from_string(self.lib2_problem_block["id"]))
         api.set_library_item_collections(
             library_key=self.lib2.library_key,
             entity_key=component.publishable_entity.key,
             collection_keys=[self.col2.key, self.col3.key],
         )

-        assert self.lib2.learning_package_id is not None
         assert len(authoring_api.get_collection(self.lib2.learning_package_id, self.col2.key).entities.all()) == 1
         assert len(authoring_api.get_collection(self.lib2.learning_package_id, self.col3.key).entities.all()) == 1
-
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[0].kwargs,
-            {
-                "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
-                "sender": None,
-                "content_object": ContentObjectChangedData(
-                    object_id=self.lib2_problem_block["id"],
-                    changes=["collections"],
-                ),
-            },
-        )
+        assert {
+            "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
+            "sender": None,
+            "content_object": ContentObjectChangedData(
+                object_id=self.lib2_problem_block["id"],
+                changes=["collections"],
+            ),
+        }.items() <= event_receiver.call_args_list[0].kwargs.items()

         assert len(collection_update_event_receiver.call_args_list) == 2
         collection_update_events = [call.kwargs for call in collection_update_event_receiver.call_args_list]
@@ -578,24 +559,23 @@ def test_set_library_component_collections(self) -> None:
             )
         }

-    def test_delete_library_block(self) -> None:
+    def test_delete_library_block(self):
         api.update_library_collection_items(
             self.lib1.library_key,
             self.col1.key,
             opaque_keys=[
-                LibraryUsageLocatorV2.from_string(self.lib1_problem_block["id"]),
-                LibraryUsageLocatorV2.from_string(self.lib1_html_block["id"]),
+                UsageKey.from_string(self.lib1_problem_block["id"]),
+                UsageKey.from_string(self.lib1_html_block["id"]),
             ],
         )

         event_receiver = mock.Mock()
         LIBRARY_COLLECTION_UPDATED.connect(event_receiver)

-        api.delete_library_block(LibraryUsageLocatorV2.from_string(self.lib1_problem_block["id"]))
+        api.delete_library_block(UsageKey.from_string(self.lib1_problem_block["id"]))

         assert event_receiver.call_count == 1
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[0].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": LIBRARY_COLLECTION_UPDATED,
                 "sender": None,
@@ -607,34 +587,27 @@ def test_delete_library_block(self) -> None:
                     background=True,
                 ),
             },
+            event_receiver.call_args_list[0].kwargs,
         )

-    def test_delete_library_container(self) -> None:
+    def test_delete_library_container(self):
         api.update_library_collection_items(
             self.lib1.library_key,
             self.col1.key,
             opaque_keys=[
-                LibraryUsageLocatorV2.from_string(self.lib1_problem_block["id"]),
-                LibraryUsageLocatorV2.from_string(self.lib1_html_block["id"]),
+                UsageKey.from_string(self.lib1_problem_block["id"]),
+                UsageKey.from_string(self.lib1_html_block["id"]),
                 LibraryContainerLocator.from_string(self.unit1["id"]),
             ],
         )

-        # Add container under another container
-        api.update_container_children(
-            self.subsection1.container_key,
-            [LibraryContainerLocator.from_string(self.unit1["id"])],
-            None,
-        )
-
         event_receiver = mock.Mock()
         LIBRARY_COLLECTION_UPDATED.connect(event_receiver)
-        LIBRARY_CONTAINER_UPDATED.connect(event_receiver)

         api.delete_container(LibraryContainerLocator.from_string(self.unit1["id"]))

-        assert event_receiver.call_count == 2
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[0].kwargs,
+        assert event_receiver.call_count == 1
+        self.assertDictContainsSubset(
             {
                 "signal": LIBRARY_COLLECTION_UPDATED,
                 "sender": None,
@@ -646,37 +619,26 @@ def test_delete_library_container(self) -> None:
                     background=True,
                 ),
             },
-        )
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[1].kwargs,
-            {
-                "signal": LIBRARY_CONTAINER_UPDATED,
-                "sender": None,
-                "library_container": LibraryContainerData(
-                    container_key=self.subsection1.container_key,
-                    background=False,
-                )
-            },
+            event_receiver.call_args_list[0].kwargs,
         )

-    def test_restore_library_block(self) -> None:
+    def test_restore_library_block(self):
         api.update_library_collection_items(
             self.lib1.library_key,
             self.col1.key,
             opaque_keys=[
-                LibraryUsageLocatorV2.from_string(self.lib1_problem_block["id"]),
-                LibraryUsageLocatorV2.from_string(self.lib1_html_block["id"]),
+                UsageKey.from_string(self.lib1_problem_block["id"]),
+                UsageKey.from_string(self.lib1_html_block["id"]),
             ],
         )

         event_receiver = mock.Mock()
         LIBRARY_COLLECTION_UPDATED.connect(event_receiver)

-        api.restore_library_block(LibraryUsageLocatorV2.from_string(self.lib1_problem_block["id"]))
+        api.restore_library_block(UsageKey.from_string(self.lib1_problem_block["id"]))

         assert event_receiver.call_count == 1
-        self.assertDictContainsEntries(
-            event_receiver.call_args_list[0].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": LIBRARY_COLLECTION_UPDATED,
                 "sender": None,
@@ -688,9 +650,10 @@ def test_restore_library_block(self) -> None:
                     background=True,
                 ),
             },
+            event_receiver.call_args_list[0].kwargs,
         )

-    def test_add_component_and_revert(self) -> None:
+    def test_add_component_and_revert(self):
         # Publish changes
         api.publish_changes(self.lib1.library_key)

@@ -704,8 +667,8 @@ def test_add_component_and_revert(self) -> None:
             self.lib1.library_key,
             self.col1.key,
             opaque_keys=[
-                LibraryUsageLocatorV2.from_string(self.lib1_html_block["id"]),
-                LibraryUsageLocatorV2.from_string(new_problem_block["id"]),
+                UsageKey.from_string(self.lib1_html_block["id"]),
+                UsageKey.from_string(new_problem_block["id"]),
             ],
         )

@@ -715,21 +678,18 @@ def test_add_component_and_revert(self) -> None:
         api.revert_changes(self.lib1.library_key)

         assert collection_update_event_receiver.call_count == 1
-        self.assertDictContainsEntries(
-            collection_update_event_receiver.call_args_list[0].kwargs,
-            {
-                "signal": LIBRARY_COLLECTION_UPDATED,
-                "sender": None,
-                "library_collection": LibraryCollectionData(
-                    collection_key=api.library_collection_locator(
-                        self.lib1.library_key,
-                        collection_key=self.col1.key,
-                    ),
+        assert {
+            "signal": LIBRARY_COLLECTION_UPDATED,
+            "sender": None,
+            "library_collection": LibraryCollectionData(
+                collection_key=api.library_collection_locator(
+                    self.lib1.library_key,
+                    collection_key=self.col1.key,
                 ),
-            },
-        )
+            ),
+        }.items() <= collection_update_event_receiver.call_args_list[0].kwargs.items()

-    def test_delete_component_and_revert(self) -> None:
+    def test_delete_component_and_revert(self):
         """
         When a component is deleted and then the delete is reverted, signals
         will be emitted to update any containing collections.
@@ -739,14 +699,14 @@ def test_delete_component_and_revert(self) -> None:
             self.lib1.library_key,
             self.col1.key,
             opaque_keys=[
-                LibraryUsageLocatorV2.from_string(self.lib1_problem_block["id"]),
-                LibraryUsageLocatorV2.from_string(self.lib1_html_block["id"])
+                UsageKey.from_string(self.lib1_problem_block["id"]),
+                UsageKey.from_string(self.lib1_html_block["id"])
             ],
         )
         api.publish_changes(self.lib1.library_key)

         # Delete component and revert
-        api.delete_library_block(LibraryUsageLocatorV2.from_string(self.lib1_problem_block["id"]))
+        api.delete_library_block(UsageKey.from_string(self.lib1_problem_block["id"]))

         collection_update_event_receiver = mock.Mock()
         LIBRARY_COLLECTION_UPDATED.connect(collection_update_event_receiver)
@@ -754,19 +714,16 @@ def test_delete_component_and_revert(self) -> None:
         api.revert_changes(self.lib1.library_key)

         assert collection_update_event_receiver.call_count == 1
-        self.assertDictContainsEntries(
-            collection_update_event_receiver.call_args_list[0].kwargs,
-            {
-                "signal": LIBRARY_COLLECTION_UPDATED,
-                "sender": None,
-                "library_collection": LibraryCollectionData(
-                    collection_key=api.library_collection_locator(
-                        self.lib1.library_key,
-                        collection_key=self.col1.key,
-                    ),
+        assert {
+            "signal": LIBRARY_COLLECTION_UPDATED,
+            "sender": None,
+            "library_collection": LibraryCollectionData(
+                collection_key=api.library_collection_locator(
+                    self.lib1.library_key,
+                    collection_key=self.col1.key,
                 ),
-            },
-        )
+            ),
+        }.items() <= collection_update_event_receiver.call_args_list[0].kwargs.items()


 class ContentLibraryContainersTest(ContentLibrariesRestApiTest):
@@ -774,7 +731,7 @@ class ContentLibraryContainersTest(ContentLibrariesRestApiTest):
     Tests for Content Library API containers methods.
     """

-    def setUp(self) -> None:
+    def setUp(self):
         super().setUp()

         # Create Content Libraries
@@ -786,53 +743,17 @@ def setUp(self) -> None:
         # Create Units
         self.unit1 = api.create_container(self.lib1.library_key, api.ContainerType.Unit, 'unit-1', 'Unit 1', None)
         self.unit2 = api.create_container(self.lib1.library_key, api.ContainerType.Unit, 'unit-2', 'Unit 2', None)
-        self.unit3 = api.create_container(self.lib1.library_key, api.ContainerType.Unit, 'unit-3', 'Unit 3', None)
-
-        # Create Subsections
-        self.subsection1 = api.create_container(
-            self.lib1.library_key,
-            api.ContainerType.Subsection,
-            'subsection-1',
-            'Subsection 1',
-            None,
-        )
-        self.subsection2 = api.create_container(
-            self.lib1.library_key,
-            api.ContainerType.Subsection,
-            'subsection-2',
-            'Subsection 2',
-            None,
-        )
-
-        # Create Sections
-        self.section1 = api.create_container(
-            self.lib1.library_key,
-            api.ContainerType.Section,
-            'section-1',
-            'Section 1',
-            None,
-        )
-        self.section2 = api.create_container(
-            self.lib1.library_key,
-            api.ContainerType.Section,
-            'section-2',
-            'Section 2',
-            None,
-        )

         # Create XBlocks
         # Create some library blocks in lib1
         self.problem_block = self._add_block_to_library(
             self.lib1.library_key, "problem", "problem1",
         )
-        self.problem_block_usage_key = LibraryUsageLocatorV2.from_string(self.problem_block["id"])
-        self.problem_block_2 = self._add_block_to_library(
-            self.lib1.library_key, "problem", "problem2",
-        )
+        self.problem_block_usage_key = UsageKey.from_string(self.problem_block["id"])
         self.html_block = self._add_block_to_library(
             self.lib1.library_key, "html", "html1",
         )
-        self.html_block_usage_key = LibraryUsageLocatorV2.from_string(self.html_block["id"])
+        self.html_block_usage_key = UsageKey.from_string(self.html_block["id"])

         # Add content to units
         api.update_container_children(
@@ -846,37 +767,9 @@ def setUp(self) -> None:
             None,
         )

-        # Add units to subsections
-        api.update_container_children(
-            self.subsection1.container_key,
-            [self.unit1.container_key, self.unit2.container_key],
-            None,
-        )
-        api.update_container_children(
-            self.subsection2.container_key,
-            [self.unit1.container_key],
-            None,
-        )
-
-        # Add subsections to sections
-        api.update_container_children(
-            self.section1.container_key,
-            [self.subsection1.container_key, self.subsection2.container_key],
-            None,
-        )
-        api.update_container_children(
-            self.section2.container_key,
-            [self.subsection1.container_key],
-            None,
-        )
-
-    def test_get_containers_contains_item(self):
-        problem_block_containers = api.get_containers_contains_item(self.problem_block_usage_key)
-        html_block_containers = api.get_containers_contains_item(self.html_block_usage_key)
-        unit_1_containers = api.get_containers_contains_item(self.unit1.container_key)
-        unit_2_containers = api.get_containers_contains_item(self.unit2.container_key)
-        subsection_1_containers = api.get_containers_contains_item(self.subsection1.container_key)
-        subsection_2_containers = api.get_containers_contains_item(self.subsection2.container_key)
+    def test_get_containers_contains_component(self):
+        problem_block_containers = api.get_containers_contains_component(self.problem_block_usage_key)
+        html_block_containers = api.get_containers_contains_component(self.html_block_usage_key)

         assert len(problem_block_containers) == 1
         assert problem_block_containers[0].container_key == self.unit1.container_key
@@ -885,28 +778,13 @@ def test_get_containers_contains_item(self):
         assert html_block_containers[0].container_key == self.unit1.container_key
         assert html_block_containers[1].container_key == self.unit2.container_key

-        assert len(unit_1_containers) == 2
-        assert unit_1_containers[0].container_key == self.subsection1.container_key
-        assert unit_1_containers[1].container_key == self.subsection2.container_key
-
-        assert len(unit_2_containers) == 1
-        assert unit_2_containers[0].container_key == self.subsection1.container_key
-
-        assert len(subsection_1_containers) == 2
-        assert subsection_1_containers[0].container_key == self.section1.container_key
-        assert subsection_1_containers[1].container_key == self.section2.container_key
-
-        assert len(subsection_2_containers) == 1
-        assert subsection_2_containers[0].container_key == self.section1.container_key
-
     def _validate_calls_of_html_block(self, event_mock):
         """
         Validate that the `event_mock` has been called twice using the
         `LIBRARY_CONTAINER_UPDATED` signal.
         """
         assert event_mock.call_count == 2
-        self.assertDictContainsEntries(
-            event_mock.call_args_list[0].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": LIBRARY_CONTAINER_UPDATED,
                 "sender": None,
@@ -915,9 +793,9 @@ def _validate_calls_of_html_block(self, event_mock):
                     background=True,
                 )
             },
+            event_mock.call_args_list[0].kwargs,
         )
-        self.assertDictContainsEntries(
-            event_mock.call_args_list[1].kwargs,
+        self.assertDictContainsSubset(
             {
                 "signal": LIBRARY_CONTAINER_UPDATED,
                 "sender": None,
@@ -926,16 +804,17 @@ def _validate_calls_of_html_block(self, event_mock):
                     background=True,
                 )
             },
+            event_mock.call_args_list[1].kwargs,
         )

-    def test_call_container_update_signal_when_delete_component(self) -> None:
+    def test_call_container_update_signal_when_delete_component(self):
         container_update_event_receiver = mock.Mock()
         LIBRARY_CONTAINER_UPDATED.connect(container_update_event_receiver)
         api.delete_library_block(self.html_block_usage_key)

         self._validate_calls_of_html_block(container_update_event_receiver)

-    def test_call_container_update_signal_when_restore_component(self) -> None:
+    def test_call_container_update_signal_when_restore_component(self):
         api.delete_library_block(self.html_block_usage_key)

         container_update_event_receiver = mock.Mock()
@@ -944,7 +823,7 @@ def test_call_container_update_signal_when_restore_component(self) -> None:

         self._validate_calls_of_html_block(container_update_event_receiver)

-    def test_call_container_update_signal_when_update_olx(self) -> None:
+    def test_call_container_update_signal_when_update_olx(self):
         block_olx = "Hello world!"
         container_update_event_receiver = mock.Mock()
         LIBRARY_CONTAINER_UPDATED.connect(container_update_event_receiver)
@@ -952,7 +831,7 @@ def test_call_container_update_signal_when_update_olx(self) -> None:
         self._set_library_block_olx(self.html_block_usage_key, block_olx)

         self._validate_calls_of_html_block(container_update_event_receiver)

-    def test_call_container_update_signal_when_update_component(self) -> None:
+    def test_call_container_update_signal_when_update_component(self):
         block_olx = "Hello world!"
         container_update_event_receiver = mock.Mock()
         LIBRARY_CONTAINER_UPDATED.connect(container_update_event_receiver)
@@ -960,342 +839,19 @@ def test_call_container_update_signal_when_update_component(self) -> None:
         self._set_library_block_fields(self.html_block_usage_key, {"data": block_olx, "metadata": {}})

         self._validate_calls_of_html_block(container_update_event_receiver)

-    def test_call_container_update_signal_when_update_unit(self) -> None:
-        container_update_event_receiver = mock.Mock()
-        LIBRARY_CONTAINER_UPDATED.connect(container_update_event_receiver)
-        self._update_container(self.unit1.container_key, 'New Unit Display Name')
-
-        assert container_update_event_receiver.call_count == 3
-        self.assertDictContainsEntries(
-            container_update_event_receiver.call_args_list[0].kwargs,
-            {
-                "signal": LIBRARY_CONTAINER_UPDATED,
-                "sender": None,
-                "library_container": LibraryContainerData(
-                    container_key=self.unit1.container_key,
-                )
-            },
-        )
-        self.assertDictContainsEntries(
-            container_update_event_receiver.call_args_list[1].kwargs,
-            {
-                "signal": LIBRARY_CONTAINER_UPDATED,
-                "sender": None,
-                "library_container": LibraryContainerData(
-                    container_key=self.subsection1.container_key,
-                )
-            },
-        )
-        self.assertDictContainsEntries(
-            container_update_event_receiver.call_args_list[2].kwargs,
-            {
-                "signal": LIBRARY_CONTAINER_UPDATED,
-                "sender": None,
-                "library_container": LibraryContainerData(
-                    container_key=self.subsection2.container_key,
-                )
-            },
-        )
-
-    def test_call_container_update_signal_when_update_subsection(self) -> None:
-        container_update_event_receiver = mock.Mock()
-        LIBRARY_CONTAINER_UPDATED.connect(container_update_event_receiver)
-        self._update_container(self.subsection1.container_key, 'New Subsection Display Name')
-
-        assert container_update_event_receiver.call_count == 3
-        self.assertDictContainsEntries(
-            container_update_event_receiver.call_args_list[0].kwargs,
-            {
-                "signal": LIBRARY_CONTAINER_UPDATED,
-                "sender": None,
-                "library_container": LibraryContainerData(
-                    container_key=self.subsection1.container_key,
-                )
-            },
-        )
-        self.assertDictContainsEntries(
-            container_update_event_receiver.call_args_list[1].kwargs,
-            {
-                "signal": LIBRARY_CONTAINER_UPDATED,
-                "sender": None,
-                "library_container": LibraryContainerData(
-                    container_key=self.section1.container_key,
-                )
-            },
-        )
-        self.assertDictContainsEntries(
-            container_update_event_receiver.call_args_list[2].kwargs,
-            {
-                "signal": LIBRARY_CONTAINER_UPDATED,
-                "sender": None,
-                "library_container": LibraryContainerData(
-                    container_key=self.section2.container_key,
-                )
-            },
-        )
-
-    def test_call_container_update_signal_when_update_section(self) -> None:
-        container_update_event_receiver = mock.Mock()
-        LIBRARY_CONTAINER_UPDATED.connect(container_update_event_receiver)
-        self._update_container(self.section1.container_key, 'New Section Display Name')
-
-        assert container_update_event_receiver.call_count == 1
-        self.assertDictContainsEntries(
-            container_update_event_receiver.call_args_list[0].kwargs,
-            {
-                "signal": LIBRARY_CONTAINER_UPDATED,
-                "sender": None,
-                "library_container": LibraryContainerData(
-                    container_key=self.section1.container_key,
-                )
-            },
-        )
-
-    def test_call_object_changed_signal_when_remove_component(self) -> None:
-        html_block_1 = self._add_block_to_library(
-            self.lib1.library_key, "html", "html3",
-        )
-        api.update_container_children(
-            self.unit2.container_key,
-            [LibraryUsageLocatorV2.from_string(html_block_1["id"])],
-            None,
-            entities_action=authoring_api.ChildrenEntitiesAction.APPEND,
-        )
-
-        event_reciver = mock.Mock()
-        CONTENT_OBJECT_ASSOCIATIONS_CHANGED.connect(event_reciver)
-        api.update_container_children(
-            self.unit2.container_key,
-            [LibraryUsageLocatorV2.from_string(html_block_1["id"])],
-            None,
-            entities_action=authoring_api.ChildrenEntitiesAction.REMOVE,
-        )
-
-        assert event_reciver.call_count == 1
-        self.assertDictContainsEntries(
-            event_reciver.call_args_list[0].kwargs,
-            {
-                "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
-                "sender": None,
-                "content_object": ContentObjectChangedData(
-                    object_id=html_block_1["id"],
-                    changes=["units"],
-                ),
-            },
-        )
-
-    def test_call_object_changed_signal_when_remove_unit(self) -> None:
-        unit4 = api.create_container(self.lib1.library_key, api.ContainerType.Unit, 'unit-4', 'Unit 4', None)
-
-        api.update_container_children(
-            self.subsection2.container_key,
-            [unit4.container_key],
-            None,
-            entities_action=authoring_api.ChildrenEntitiesAction.APPEND,
-        )
-
-        event_reciver = mock.Mock()
-        CONTENT_OBJECT_ASSOCIATIONS_CHANGED.connect(event_reciver)
-        api.update_container_children(
-            self.subsection2.container_key,
-            [unit4.container_key],
-            None,
-            entities_action=authoring_api.ChildrenEntitiesAction.REMOVE,
-        )
-
-        assert event_reciver.call_count == 1
-        self.assertDictContainsEntries(
-            event_reciver.call_args_list[0].kwargs,
-            {
-                "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
-                "sender": None,
-                "content_object": ContentObjectChangedData(
-                    object_id=str(unit4.container_key),
-                    changes=["subsections"],
-                ),
-            },
-        )
-
-    def test_call_object_changed_signal_when_remove_subsection(self) -> None:
-        subsection3 = api.create_container(
-            self.lib1.library_key,
-            api.ContainerType.Subsection,
-            'subsection-3',
-            'Subsection 3',
-            None,
-        )
-
-        api.update_container_children(
-            self.section2.container_key,
-            [subsection3.container_key],
-            None,
-            entities_action=authoring_api.ChildrenEntitiesAction.APPEND,
-        )
-
-        event_reciver = mock.Mock()
-        CONTENT_OBJECT_ASSOCIATIONS_CHANGED.connect(event_reciver)
-        api.update_container_children(
-            self.section2.container_key,
-            [subsection3.container_key],
-            None,
-            entities_action=authoring_api.ChildrenEntitiesAction.REMOVE,
-        )
-
-        assert event_reciver.call_count == 1
-        self.assertDictContainsEntries(
-            event_reciver.call_args_list[0].kwargs,
-            {
-                "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
-                "sender": None,
-                "content_object": ContentObjectChangedData(
-                    object_id=str(subsection3.container_key),
-                    changes=["sections"],
-                ),
-            },
-        )
-
-    def test_call_object_changed_signal_when_add_component(self) -> None:
-        event_reciver = mock.Mock()
-        CONTENT_OBJECT_ASSOCIATIONS_CHANGED.connect(event_reciver)
-        html_block_1 = self._add_block_to_library(
-            self.lib1.library_key, "html", "html4",
-        )
-        html_block_2 = self._add_block_to_library(
-            self.lib1.library_key, "html", "html5",
-        )
-
-        api.update_container_children(
-            self.unit2.container_key,
-            [
-                LibraryUsageLocatorV2.from_string(html_block_1["id"]),
-                LibraryUsageLocatorV2.from_string(html_block_2["id"])
-            ],
-            None,
-            entities_action=authoring_api.ChildrenEntitiesAction.APPEND,
-        )
-
-        assert event_reciver.call_count == 2
-        self.assertDictContainsEntries(
-            event_reciver.call_args_list[0].kwargs,
-            {
-                "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
-                "sender": None,
-                "content_object": ContentObjectChangedData(
-                    object_id=html_block_1["id"],
-                    changes=["units"],
-                ),
-            },
-        )
-        self.assertDictContainsEntries(
-            event_reciver.call_args_list[1].kwargs,
-            {
-                "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
-                "sender": None,
-                "content_object": ContentObjectChangedData(
-                    object_id=html_block_2["id"],
-                    changes=["units"],
-                ),
-            },
-        )
-
-    def test_call_object_changed_signal_when_add_unit(self) -> None:
-        event_reciver = mock.Mock()
-        CONTENT_OBJECT_ASSOCIATIONS_CHANGED.connect(event_reciver)
-
-        unit4 = api.create_container(self.lib1.library_key, api.ContainerType.Unit, 'unit-4', 'Unit 4', None)
-        unit5 = api.create_container(self.lib1.library_key, api.ContainerType.Unit, 'unit-5', 'Unit 5', None)
-
-        api.update_container_children(
-            self.subsection2.container_key,
-            [unit4.container_key, unit5.container_key],
-            None,
-            entities_action=authoring_api.ChildrenEntitiesAction.APPEND,
-        )
-        assert event_reciver.call_count == 2
-        self.assertDictContainsEntries(
-            event_reciver.call_args_list[0].kwargs,
-            {
-                "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
-                "sender": None,
-                "content_object": ContentObjectChangedData(
-                    object_id=str(unit4.container_key),
-                    changes=["subsections"],
-                ),
-            },
-        )
-        self.assertDictContainsEntries(
-            event_reciver.call_args_list[1].kwargs,
-            {
-                "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
-                "sender": None,
-                "content_object": ContentObjectChangedData(
-                    object_id=str(unit5.container_key),
-                    changes=["subsections"],
-                ),
-            },
-        )
-
-    def test_call_object_changed_signal_when_add_subsection(self) -> None:
-        event_reciver = mock.Mock()
-        CONTENT_OBJECT_ASSOCIATIONS_CHANGED.connect(event_reciver)
-
-        subsection3 = api.create_container(
-            self.lib1.library_key,
-            api.ContainerType.Subsection,
-            'subsection-3',
-            'Subsection 3',
-            None,
-        )
-        subsection4 = api.create_container(
-            self.lib1.library_key,
-            api.ContainerType.Subsection,
-            'subsection-4',
-            'Subsection 4',
-            None,
-        )
-        api.update_container_children(
-            self.section2.container_key,
-            [subsection3.container_key, subsection4.container_key],
-            None,
-            entities_action=authoring_api.ChildrenEntitiesAction.APPEND,
-        )
-        assert event_reciver.call_count == 2
-        self.assertDictContainsEntries(
-            event_reciver.call_args_list[0].kwargs,
-            {
-                "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED,
-                "sender": None,
-                "content_object": ContentObjectChangedData(
-                    object_id=str(subsection3.container_key),
-                    changes=["sections"],
-                ),
-            },
-        )
-        self.assertDictContainsEntries(
event_reciver.call_args_list[1].kwargs, - { - "signal": CONTENT_OBJECT_ASSOCIATIONS_CHANGED, - "sender": None, - "content_object": ContentObjectChangedData( - object_id=str(subsection4.container_key), - changes=["sections"], - ), - }, - ) - - def test_delete_component_and_revert(self) -> None: + def test_delete_component_and_revert(self): """ When a component is deleted and then the delete is reverted, signals will be emitted to update any containing containers. """ # Add components and publish - api.update_container_children(self.unit3.container_key, [ - LibraryUsageLocatorV2.from_string(self.problem_block_2["id"]), + api.update_container_children(self.unit1.container_key, [ + UsageKey.from_string(self.problem_block["id"]), ], user_id=None) api.publish_changes(self.lib1.library_key) # Delete component and revert - api.delete_library_block(LibraryUsageLocatorV2.from_string(self.problem_block_2["id"])) + api.delete_library_block(UsageKey.from_string(self.problem_block["id"])) container_event_receiver = mock.Mock() LIBRARY_CONTAINER_UPDATED.connect(container_event_receiver) @@ -1303,304 +859,8 @@ def test_delete_component_and_revert(self) -> None: api.revert_changes(self.lib1.library_key) assert container_event_receiver.call_count == 1 - self.assertDictContainsEntries( - container_event_receiver.call_args_list[0].kwargs, - { - "signal": LIBRARY_CONTAINER_UPDATED, - "sender": None, - "library_container": LibraryContainerData( - container_key=self.unit3.container_key - ), - }, - ) - - def test_copy_and_paste_container_same_library(self) -> None: - # Copy a section with children - api.copy_container(self.section1.container_key, self.user.id) - # Paste the container - new_container: api.ContainerMetadata = ( - api.import_staged_content_from_user_clipboard(self.lib1.library_key, self.user) # type: ignore[assignment] - ) - - # Verify that the container is copied - assert new_container.container_type == self.section1.container_type - assert new_container.display_name == 
self.section1.display_name - - # Verify that the children are linked - subsections = api.get_container_children(new_container.container_key) - assert len(subsections) == 2 - assert isinstance(subsections[0], api.ContainerMetadata) - assert subsections[0].container_key == self.subsection1.container_key - assert isinstance(subsections[1], api.ContainerMetadata) - assert subsections[1].container_key == self.subsection2.container_key - - def test_copy_and_paste_container_another_library(self) -> None: - # Copy a section with children - api.copy_container(self.section1.container_key, self.user.id) - - self._create_library("test-lib-cont-2", "Test Library 2") - lib2 = ContentLibrary.objects.get(slug="test-lib-cont-2") - # Paste the container - new_container: api.ContainerMetadata = ( - api.import_staged_content_from_user_clipboard(lib2.library_key, self.user) # type: ignore[assignment] - ) - - # Verify that the container is copied - assert new_container.container_type == self.section1.container_type - assert new_container.display_name == self.section1.display_name - - # Verify that the children are copied - subsections = api.get_container_children(new_container.container_key) - assert len(subsections) == 2 - assert isinstance(subsections[0], api.ContainerMetadata) - assert subsections[0].container_key != self.subsection1.container_key # This subsection was copied - assert subsections[0].display_name == self.subsection1.display_name - units_subsection1 = api.get_container_children(subsections[0].container_key) - assert len(units_subsection1) == 2 - assert isinstance(units_subsection1[0], api.ContainerMetadata) - assert units_subsection1[0].container_key != self.unit1.container_key # This unit was copied - assert units_subsection1[0].display_name == self.unit1.display_name == "Unit 1" - unit1_components = api.get_container_children(units_subsection1[0].container_key) - assert len(unit1_components) == 2 - assert isinstance(unit1_components[0], api.LibraryXBlockMetadata) - 
assert unit1_components[0].usage_key != self.problem_block_usage_key # This component was copied - assert isinstance(unit1_components[1], api.LibraryXBlockMetadata) - assert unit1_components[1].usage_key != self.html_block_usage_key # This component was copied - - assert isinstance(units_subsection1[1], api.ContainerMetadata) - assert units_subsection1[1].container_key != self.unit2.container_key # This unit was copied - assert units_subsection1[1].display_name == self.unit2.display_name == "Unit 2" - unit2_components = api.get_container_children(units_subsection1[1].container_key) - assert len(unit2_components) == 1 - assert isinstance(unit2_components[0], api.LibraryXBlockMetadata) - assert unit2_components[0].usage_key != self.html_block_usage_key - - # This is the same component, so it should not be duplicated - assert unit1_components[1].usage_key == unit2_components[0].usage_key - - assert isinstance(subsections[1], api.ContainerMetadata) - assert subsections[1].container_key != self.subsection2.container_key # This subsection was copied - assert subsections[1].display_name == self.subsection2.display_name - units_subsection2 = api.get_container_children(subsections[1].container_key) - assert len(units_subsection2) == 1 - assert isinstance(units_subsection2[0], api.ContainerMetadata) - assert units_subsection2[0].container_key != self.unit1.container_key # This unit was copied - assert units_subsection2[0].display_name == self.unit1.display_name - - # This is the same unit, so it should not be duplicated - assert units_subsection1[0].container_key == units_subsection2[0].container_key - - -class ContentLibraryExportTest(ContentLibrariesRestApiTest): - """ - Tests for Content Library API export methods. 
- """ - - def setUp(self) -> None: - super().setUp() - - # Create Content Libraries - self._create_library("test-lib-exp-1", "Test Library Export 1") - - # Fetch the created ContentLibrary objects so we can access their learning_package.id - self.lib1 = ContentLibrary.objects.get(slug="test-lib-exp-1") - self.wrong_task_id = '11111111-1111-1111-1111-111111111111' - - def test_get_backup_task_status_no_task(self) -> None: - status = api.get_backup_task_status(self.user.id, "") - assert status is None - - def test_get_backup_task_status_wrong_task_id(self) -> None: - status = api.get_backup_task_status(self.user.id, task_id=self.wrong_task_id) - assert status is None - - def test_get_backup_task_status_in_progress(self) -> None: - # Create a mock UserTaskStatus in IN_PROGRESS state - task_id = str(uuid.uuid4()) - mock_task = UserTaskStatus( - task_id=task_id, - user_id=self.user.id, - name=f"Export of {self.lib1.library_key}", - state=UserTaskStatus.IN_PROGRESS - ) - - with mock.patch( - 'openedx.core.djangoapps.content_libraries.api.libraries.UserTaskStatus.objects.get' - ) as mock_get: - mock_get.return_value = mock_task - - status = api.get_backup_task_status(self.user.id, task_id=task_id) - assert status is not None - assert status['state'] == UserTaskStatus.IN_PROGRESS - assert status['file'] is None - - def test_get_backup_task_status_succeeded(self) -> None: - # Create a mock UserTaskStatus in SUCCEEDED state - task_id = str(uuid.uuid4()) - mock_task = UserTaskStatus( - task_id=task_id, - user_id=self.user.id, - name=f"Export of {self.lib1.library_key}", - state=UserTaskStatus.SUCCEEDED - ) - - # Create a mock UserTaskArtifact - mock_artifact = mock.Mock() - mock_artifact.file.url = "/media/user_tasks/2025/10/01/library-libOEXCSPROB_mOw1rPL.zip" - - with mock.patch( - 'openedx.core.djangoapps.content_libraries.api.libraries.UserTaskStatus.objects.get' - ) as mock_get, mock.patch( - 
'openedx.core.djangoapps.content_libraries.api.libraries.UserTaskArtifact.objects.get' - ) as mock_artifact_get: - - mock_get.return_value = mock_task - mock_artifact_get.return_value = mock_artifact - - status = api.get_backup_task_status(self.user.id, task_id=task_id) - assert status is not None - assert status['state'] == UserTaskStatus.SUCCEEDED - assert status['file'].url == "/media/user_tasks/2025/10/01/library-libOEXCSPROB_mOw1rPL.zip" - - def test_get_backup_task_status_failed(self) -> None: - # Create a mock UserTaskStatus in FAILED state - task_id = str(uuid.uuid4()) - mock_task = UserTaskStatus( - task_id=task_id, - user_id=self.user.id, - name=f"Export of {self.lib1.library_key}", - state=UserTaskStatus.FAILED - ) - - with mock.patch( - 'openedx.core.djangoapps.content_libraries.api.libraries.UserTaskStatus.objects.get' - ) as mock_get: - mock_get.return_value = mock_task - - status = api.get_backup_task_status(self.user.id, task_id=task_id) - assert status is not None - assert status['state'] == UserTaskStatus.FAILED - assert status['file'] is None - - -class ContentLibraryAuthZRoleAssignmentTest(ContentLibrariesRestApiTest): - """ - Tests for Content Library role assignment via the AuthZ Authorization Framework. - - These tests verify that library roles are correctly assigned to users through - the openedx-authz (AuthZ) Authorization Framework when libraries are created or when - explicit role assignments are made. - - See: https://github.com/openedx/openedx-authz/ - """ - - def setUp(self) -> None: - super().setUp() - - # Create Content Libraries - self._create_library("test-lib-role-1", "Test Library Role 1") - - # Fetch the created ContentLibrary objects so we can access their learning_package.id - self.lib1 = ContentLibrary.objects.get(slug="test-lib-role-1") - - def test_assign_library_admin_role_to_user_via_authz(self) -> None: - """ - Test assigning a library admin role to a user via the AuthZ Authorization Framework. 
- - This test verifies that the openedx-authz Authorization Framework correctly - assigns the library_admin role to a user when explicitly called. - """ - api.assign_library_role_to_user(self.lib1.library_key, self.user, api.AccessLevel.ADMIN_LEVEL) - - roles = get_user_role_assignments_in_scope(self.user.username, str(self.lib1.library_key)) - assert len(roles) == 1 - assert "library_admin" in repr(roles[0].roles[0]) - - def test_assign_library_author_role_to_user_via_authz(self) -> None: - """ - Test assigning a library author role to a user via the AuthZ Authorization Framework. - - This test verifies that the openedx-authz Authorization Framework correctly - assigns the library_author role to a user when explicitly called. - """ - # Create a new user to avoid conflicts with roles assigned during library creation - author_user = UserFactory.create(username="Author", email="author@example.com") - - api.assign_library_role_to_user(self.lib1.library_key, author_user, api.AccessLevel.AUTHOR_LEVEL) - - roles = get_user_role_assignments_in_scope(author_user.username, str(self.lib1.library_key)) - assert len(roles) == 1 - assert "library_author" in repr(roles[0].roles[0]) - - @mock.patch("openedx.core.djangoapps.content_libraries.api.libraries.assign_role_to_user_in_scope") - def test_library_creation_assigns_admin_role_via_authz( - self, - mock_assign_role - ) -> None: - """ - Test that creating a library via REST API assigns admin role via AuthZ. - - This test verifies that when a library is created via the REST API, - the creator is automatically assigned the library_admin role through - the openedx-authz Authorization Framework. 
- """ - mock_assign_role.return_value = True - - # Create a new library (this should trigger role assignment in the REST API) - self._create_library("test-lib-role-2", "Test Library Role 2") - - # Verify that assign_role_to_user_in_scope was called - mock_assign_role.assert_called_once() - call_args = mock_assign_role.call_args - assert call_args[0][0] == self.user.username # username - assert call_args[0][1] == "library_admin" # role - assert "test-lib-role-2" in call_args[0][2] # library_key (contains slug) - - @mock.patch("openedx.core.djangoapps.content_libraries.api.libraries.assign_role_to_user_in_scope") - def test_library_creation_handles_authz_failure_gracefully( - self, - mock_assign_role - ) -> None: - """ - Test that library creation succeeds even if AuthZ role assignment fails. - - This test verifies that if the openedx-authz Authorization Framework fails to assign - a role (returns False), the library creation still succeeds. This ensures that - the system degrades gracefully and doesn't break library creation if there are - issues with the Authorization Framework. - """ - # Simulate openedx-authz failing to assign the role - mock_assign_role.return_value = False - - # Library creation should still succeed - result = self._create_library("test-lib-role-3", "Test Library Role 3") - assert result is not None - assert result["slug"] == "test-lib-role-3" - - # Verify that the library was created successfully - lib3 = ContentLibrary.objects.get(slug="test-lib-role-3") - assert lib3 is not None - assert lib3.slug == "test-lib-role-3" - - @mock.patch("openedx.core.djangoapps.content_libraries.api.libraries.assign_role_to_user_in_scope") - def test_library_creation_handles_authz_exception( - self, - mock_assign_role - ) -> None: - """ - Test that library creation succeeds even if AuthZ raises an exception. - - This test verifies that if the openedx-authz Authorization Framework raises an - exception during role assignment, the library creation still succeeds. 
This ensures - robust error handling when the Authorization Framework is unavailable or misconfigured. - """ - # Simulate openedx-authz raising an exception for unknown issues - mock_assign_role.side_effect = Exception("AuthZ unavailable") - - # Library creation should still succeed (the exception should be caught/handled) - # Note: Currently, the code doesn't catch this exception, so we expect it to propagate. - # This test documents the current behavior and can be updated if error handling is added. - with self.assertRaises(Exception) as context: - self._create_library("test-lib-role-4", "Test Library Role 4") - - assert "AuthZ unavailable" in str(context.exception) + assert { + "signal": LIBRARY_CONTAINER_UPDATED, + "sender": None, + "library_container": LibraryContainerData(container_key=self.unit1.container_key), + }.items() <= container_event_receiver.call_args_list[0].kwargs.items() diff --git a/openedx/core/djangoapps/content_libraries/tests/test_containers.py b/openedx/core/djangoapps/content_libraries/tests/test_containers.py index 7e6eac3beda8..6c59c8c086e4 100644 --- a/openedx/core/djangoapps/content_libraries/tests/test_containers.py +++ b/openedx/core/djangoapps/content_libraries/tests/test_containers.py @@ -2,7 +2,6 @@ Tests for Learning-Core-based Content Libraries """ from datetime import datetime, timezone -import textwrap import ddt from freezegun import freeze_time @@ -12,7 +11,6 @@ from common.djangoapps.student.tests.factories import UserFactory from openedx.core.djangoapps.content_libraries import api from openedx.core.djangoapps.content_libraries.tests.base import ContentLibrariesRestApiTest -from openedx.core.djangoapps.content_tagging import api as tagging_api from openedx.core.djangolib.testing.utils import skip_unless_cms @@ -37,144 +35,23 @@ class ContainersTestCase(ContentLibrariesRestApiTest): break any tests, but backwards-incompatible API changes will. 
""" - def setUp(self) -> None: - super().setUp() - self.create_date = datetime(2024, 9, 8, 7, 6, 5, tzinfo=timezone.utc) - self.modified_date = datetime(2024, 10, 9, 8, 7, 6, tzinfo=timezone.utc) - self.lib = self._create_library( - slug="containers", - title="Container Test Library", - description="Units and more", - ) - self.lib_key = LibraryLocatorV2.from_string(self.lib["id"]) - - self.taxonomy = tagging_api.create_taxonomy('New Taxonomy') - tagging_api.set_taxonomy_orgs(self.taxonomy, all_orgs=True) - tagging_api.add_tag_to_taxonomy(self.taxonomy, "one") - tagging_api.add_tag_to_taxonomy(self.taxonomy, "two") - tagging_api.add_tag_to_taxonomy(self.taxonomy, "three") - tagging_api.add_tag_to_taxonomy(self.taxonomy, "four") - - # Create containers - with freeze_time(self.create_date): - # Unit - self.unit = self._create_container(self.lib["id"], "unit", display_name="Alpha Bravo", slug=None) - self.unit_with_components = self._create_container( - self.lib["id"], - "unit", - display_name="Alpha Charly", - slug=None, - ) - self.unit_2 = self._create_container(self.lib["id"], "unit", display_name="Test Unit 2", slug=None) - self.unit_3 = self._create_container(self.lib["id"], "unit", display_name="Test Unit 3", slug=None) - - # Subsection - self.subsection = self._create_container( - self.lib["id"], - "subsection", - display_name="Subsection Alpha", - slug=None, - ) - self.subsection_with_units = self._create_container( - self.lib["id"], - "subsection", - display_name="Subsection with units", - slug=None, - ) - self.subsection_2 = self._create_container( - self.lib["id"], - "subsection", - display_name="Test Subsection 2", - slug=None, - ) - self.subsection_3 = self._create_container( - self.lib["id"], - "subsection", - display_name="Test Subsection 3", - slug=None, - ) - - # Section - self.section = self._create_container(self.lib["id"], "section", display_name="Section Alpha", slug=None) - self.section_with_subsections = self._create_container( - self.lib["id"], 
- "section", - display_name="Section with subsections", - slug=None, - ) - - # Create blocks - self.problem_block = self._add_block_to_library(self.lib["id"], "problem", "Problem1", can_stand_alone=False) - self.html_block = self._add_block_to_library(self.lib["id"], "html", "Html1", can_stand_alone=False) - self.problem_block_2 = self._add_block_to_library(self.lib["id"], "problem", "Problem2", can_stand_alone=False) - self.html_block_2 = self._add_block_to_library(self.lib["id"], "html", "Html2") - - with freeze_time(self.modified_date): - # Add components to `unit_with_components` - self._add_container_children( - self.unit_with_components["id"], - children_ids=[ - self.problem_block["id"], - self.html_block["id"], - self.problem_block_2["id"], - self.html_block_2["id"], - ], - ) - # Refetch to update modified dates - self.unit_with_components = self._get_container(self.unit_with_components["id"]) - - # Add units to `subsection_with_units` - self._add_container_children( - self.subsection_with_units["id"], - children_ids=[ - self.unit["id"], - self.unit_with_components["id"], - self.unit_2["id"], - self.unit_3["id"], - ], - ) - # Refetch to update modified dates - self.subsection_with_units = self._get_container(self.subsection_with_units["id"]) - - # Add subsections to `section_with_subsections` - self._add_container_children( - self.section_with_subsections["id"], - children_ids=[ - self.subsection["id"], - self.subsection_with_units["id"], - self.subsection_2["id"], - self.subsection_3["id"], - ], - ) - # Refetch to update modified dates - self.section_with_subsections = self._get_container(self.section_with_subsections["id"]) - - @ddt.data( - ("unit", "u1", "Test Unit"), - ("subsection", "subs1", "Test Subsection"), - ("section", "s1", "Test Section"), - ) - @ddt.unpack - def test_container_crud(self, container_type, slug, display_name) -> None: + def test_unit_crud(self): """ - Test Create, Read, Update, and Delete of a Containers + Test Create, Read, 
Update, and Delete of a Unit """ - # Create container: + lib = self._create_library(slug="containers", title="Container Test Library", description="Units and more") + lib_key = LibraryLocatorV2.from_string(lib["id"]) + + # Create a unit: create_date = datetime(2024, 9, 8, 7, 6, 5, tzinfo=timezone.utc) with freeze_time(create_date): - container_data = self._create_container( - self.lib["id"], - container_type, - slug=slug, - display_name=display_name - ) - container_id = f"lct:CL-TEST:containers:{container_type}:{slug}" + container_data = self._create_container(lib["id"], "unit", slug="u1", display_name="Test Unit") expected_data = { - "id": container_id, - "container_type": container_type, - "display_name": display_name, + "id": "lct:CL-TEST:containers:unit:u1", + "container_type": "unit", + "display_name": "Test Unit", "last_published": None, - "published_by": None, + "published_by": "", "last_draft_created": "2024-09-08T07:06:05Z", "last_draft_created_by": 'Bob', 'has_unpublished_changes': True, @@ -185,108 +62,94 @@ def test_container_crud(self, container_type, slug, display_name) -> None: self.assertDictContainsEntries(container_data, expected_data) - # Fetch the container: - container_as_read = self._get_container(container_data["id"]) + # Fetch the unit: + unit_as_read = self._get_container(container_data["id"]) # make sure it contains the same data when we read it back: - self.assertDictContainsEntries(container_as_read, expected_data) + self.assertDictContainsEntries(unit_as_read, expected_data) - # Update the container: + # Update the unit: modified_date = datetime(2024, 10, 9, 8, 7, 6, tzinfo=timezone.utc) with freeze_time(modified_date): - container_data = self._update_container(container_id, display_name=f"New Display Name for {container_type}") - expected_data["last_draft_created"] = expected_data["modified"] = "2024-10-09T08:07:06Z" - expected_data["display_name"] = f"New Display Name for {container_type}" + container_data = 
self._update_container("lct:CL-TEST:containers:unit:u1", display_name="Unit ABC") + expected_data['last_draft_created'] = expected_data['modified'] = '2024-10-09T08:07:06Z' + expected_data['display_name'] = 'Unit ABC' self.assertDictContainsEntries(container_data, expected_data) - # Re-fetch the container - container_as_re_read = self._get_container(container_data["id"]) + # Re-fetch the unit + unit_as_re_read = self._get_container(container_data["id"]) # make sure it contains the same data when we read it back: - self.assertDictContainsEntries(container_as_re_read, expected_data) + self.assertDictContainsEntries(unit_as_re_read, expected_data) - # Delete the container + # Delete the unit self._delete_container(container_data["id"]) self._get_container(container_data["id"], expect_response=404) - @ddt.data( - ("unit", "u2", "Test Unit"), - ("subsection", "subs2", "Test Subsection"), - ("section", "s2", "Test Section"), - ) - @ddt.unpack - def test_container_permissions(self, container_type, slug, display_name) -> None: + def test_unit_permissions(self): """ - Test that a regular user with read-only permissions on the library cannot create, update, or delete containers. + Test that a regular user with read-only permissions on the library cannot create, update, or delete units. 
""" - container_data = self._create_container(self.lib["id"], container_type, slug=slug, display_name=display_name) + lib = self._create_library(slug="containers2", title="Container Test Library 2", description="Unit permissions") + container_data = self._create_container(lib["id"], "unit", slug="u2", display_name="Test Unit") random_user = UserFactory.create(username="Random", email="random@example.com") with self.as_user(random_user): - self._create_container( - self.lib["id"], - container_type, - slug="new_slug", - display_name=display_name, - expect_response=403, - ) + self._create_container(lib["id"], "unit", slug="u3", display_name="Test Unit", expect_response=403) self._get_container(container_data["id"], expect_response=403) - self._update_container(container_data["id"], display_name="New Display Name", expect_response=403) + self._update_container(container_data["id"], display_name="Unit ABC", expect_response=403) self._delete_container(container_data["id"], expect_response=403) # Granting read-only permissions on the library should only allow retrieval, nothing else. 
- self._add_user_by_email(self.lib["id"], random_user.email, access_level="read") + self._add_user_by_email(lib["id"], random_user.email, access_level="read") with self.as_user(random_user): - self._create_container( - self.lib["id"], - container_type, - slug=slug, - display_name=display_name, - expect_response=403, - ) + self._create_container(lib["id"], "unit", slug="u2", display_name="Test Unit", expect_response=403) self._get_container(container_data["id"], expect_response=200) - self._update_container(container_data["id"], display_name="New Display Name", expect_response=403) + self._update_container(container_data["id"], display_name="Unit ABC", expect_response=403) self._delete_container(container_data["id"], expect_response=403) - @ddt.data( - ("unit", "Alpha Bravo", "lct:CL-TEST:containers:unit:alpha-bravo-"), - ("subsection", "Subsection Alpha", "lct:CL-TEST:containers:subsection:subsection-alpha-"), - ("section", "Section Alpha", "lct:CL-TEST:containers:section:section-alpha-"), - ) - @ddt.unpack - def test_containers_gets_auto_slugs(self, container_type, display_name, expected_id) -> None: + def test_unit_gets_auto_slugs(self): """ - Test that we can create containers by specifying only a title, and they get + Test that we can create units by specifying only a title, and they get unique slugs assigned automatically. 
""" - container_1 = getattr(self, container_type) - container_2 = self._create_container(self.lib["id"], container_type, display_name=display_name, slug=None) + lib = self._create_library(slug="containers", title="Container Test Library", description="Units and more") - assert container_1["id"].startswith(expected_id) - assert container_2["id"].startswith(expected_id) - assert container_1["id"] != container_2["id"] + # Create two units, specifying their titles but not their slugs/keys: + container1_data = self._create_container(lib["id"], "unit", display_name="Alpha Bravo", slug=None) + container2_data = self._create_container(lib["id"], "unit", display_name="Alpha Bravo", slug=None) + # Notice the container IDs below are slugified from the title: "alpha-bravo-NNNNN" + assert container1_data["id"].startswith("lct:CL-TEST:containers:unit:alpha-bravo-") + assert container2_data["id"].startswith("lct:CL-TEST:containers:unit:alpha-bravo-") + assert container1_data["id"] != container2_data["id"] - def test_unit_add_children(self) -> None: + def test_unit_add_children(self): """ Test that we can add and get unit children components """ - # Add some components - self._add_container_children( - self.unit["id"], - children_ids=[self.problem_block["id"], self.html_block["id"]] - ) - data = self._get_container_children(self.unit["id"]) + lib = self._create_library(slug="containers", title="Container Test Library", description="Units and more") + lib_key = LibraryLocatorV2.from_string(lib["id"]) + + # Create container and add some components + container_data = self._create_container(lib["id"], "unit", display_name="Alpha Bravo", slug=None) + problem_block = self._add_block_to_library(lib["id"], "problem", "Problem1", can_stand_alone=False) + html_block = self._add_block_to_library(lib["id"], "html", "Html1", can_stand_alone=False) + self._add_container_components( + container_data["id"], + children_ids=[problem_block["id"], html_block["id"]] + ) + data = 
self._get_container_components(container_data["id"]) assert len(data) == 2 - assert data[0]['id'] == self.problem_block['id'] + assert data[0]['id'] == problem_block['id'] assert not data[0]['can_stand_alone'] - assert data[1]['id'] == self.html_block['id'] + assert data[1]['id'] == html_block['id'] assert not data[1]['can_stand_alone'] - problem_block_2 = self._add_block_to_library(self.lib["id"], "problem", "Problem_2", can_stand_alone=False) - html_block_2 = self._add_block_to_library(self.lib["id"], "html", "Html_2") + problem_block_2 = self._add_block_to_library(lib["id"], "problem", "Problem2", can_stand_alone=False) + html_block_2 = self._add_block_to_library(lib["id"], "html", "Html2") # Add two more components - self._add_container_children( - self.unit["id"], + self._add_container_components( + container_data["id"], children_ids=[problem_block_2["id"], html_block_2["id"]] ) - data = self._get_container_children(self.unit["id"]) + data = self._get_container_components(container_data["id"]) # Verify total number of components to be 2 + 2 = 4 assert len(data) == 4 assert data[2]['id'] == problem_block_2['id'] @@ -294,264 +157,107 @@ def test_unit_add_children(self) -> None: assert data[3]['id'] == html_block_2['id'] assert data[3]['can_stand_alone'] - def test_subsection_add_children(self) -> None: - # Create units - child_unit_1 = self._create_container(self.lib["id"], "unit", display_name="Child unit 1", slug=None) - child_unit_2 = self._create_container(self.lib["id"], "unit", display_name="Child unit 2", slug=None) - - # Add the units to subsection - self._add_container_children( - self.subsection["id"], - children_ids=[child_unit_1["id"], child_unit_2["id"]] - ) - data = self._get_container_children(self.subsection["id"]) - assert len(data) == 2 - assert data[0]['id'] == child_unit_1['id'] - assert data[1]['id'] == child_unit_2['id'] - - child_unit_3 = self._create_container(self.lib["id"], "unit", display_name="Child unit 3", slug=None) - child_unit_4 
= self._create_container(self.lib["id"], "unit", display_name="Child unit 4", slug=None) - - # Add two more units to subsection - self._add_container_children( - self.subsection["id"], - children_ids=[child_unit_3["id"], child_unit_4["id"]] - ) - data = self._get_container_children(self.subsection["id"]) - # Verify total number of units to be 2 + 2 = 4 - assert len(data) == 4 - assert data[2]['id'] == child_unit_3['id'] - assert data[3]['id'] == child_unit_4['id'] - - def test_section_add_children(self) -> None: - # Create Subsections - child_subsection_1 = self._create_container( - self.lib["id"], - "subsection", - display_name="Child Subsection 1", - slug=None, - ) - child_subsection_2 = self._create_container( - self.lib["id"], - "subsection", - display_name="Child Subsection 2", - slug=None, - ) - - # Add the subsections to section - self._add_container_children( - self.section["id"], - children_ids=[child_subsection_1["id"], child_subsection_2["id"]] - ) - data = self._get_container_children(self.section["id"]) - assert len(data) == 2 - assert data[0]['id'] == child_subsection_1['id'] - assert data[1]['id'] == child_subsection_2['id'] - - child_subsection_3 = self._create_container( - self.lib["id"], - "subsection", - display_name="Child Subsection 3", - slug=None, - ) - child_subsection_4 = self._create_container( - self.lib["id"], - "subsection", - display_name="Child Subsection 4", - slug=None, - ) - - # Add two more subsections to section - self._add_container_children( - self.section["id"], - children_ids=[child_subsection_3["id"], child_subsection_4["id"]] - ) - data = self._get_container_children(self.section["id"]) - # Verify total number of subsections to be 2 + 2 = 4 - assert len(data) == 4 - assert data[2]['id'] == child_subsection_3['id'] - assert data[3]['id'] == child_subsection_4['id'] - - @ddt.data( - ("unit_with_components", ["problem_block_2", "problem_block"], ["html_block", "html_block_2"]), - ("subsection_with_units", ["unit", 
"unit_with_components"], ["unit_2", "unit_3"]), - ("section_with_subsections", ["subsection", "subsection_with_units"], ["subsection_2", "subsection_3"]), - ) - @ddt.unpack - def test_container_remove_children(self, container_name, items_to_remove, expected_items) -> None: + def test_unit_remove_children(self): """ - Test that we can remove container children + Test that we can remove unit children components """ - container = getattr(self, container_name) - item_to_remove_1 = getattr(self, items_to_remove[0]) - item_to_remove_2 = getattr(self, items_to_remove[1]) - expected_item_1 = getattr(self, expected_items[0]) - expected_item_2 = getattr(self, expected_items[1]) - data = self._get_container_children(container["id"]) + lib = self._create_library(slug="containers", title="Container Test Library", description="Units and more") + lib_key = LibraryLocatorV2.from_string(lib["id"]) + + # Create container and add some components + container_data = self._create_container(lib["id"], "unit", display_name="Alpha Bravo", slug=None) + problem_block = self._add_block_to_library(lib["id"], "problem", "Problem1", can_stand_alone=False) + html_block = self._add_block_to_library(lib["id"], "html", "Html1", can_stand_alone=False) + problem_block_2 = self._add_block_to_library(lib["id"], "problem", "Problem2", can_stand_alone=False) + html_block_2 = self._add_block_to_library(lib["id"], "html", "Html2") + self._add_container_components( + container_data["id"], + children_ids=[problem_block["id"], html_block["id"], problem_block_2["id"], html_block_2["id"]] + ) + data = self._get_container_components(container_data["id"]) assert len(data) == 4 - # Remove items. - self._remove_container_children( - container["id"], - children_ids=[item_to_remove_1["id"], item_to_remove_2["id"]] + # Remove both problem blocks. 
+ self._remove_container_components( + container_data["id"], + children_ids=[problem_block_2["id"], problem_block["id"]] ) - data = self._get_container_children(container["id"]) + data = self._get_container_components(container_data["id"]) assert len(data) == 2 - assert data[0]['id'] == expected_item_1['id'] - assert data[1]['id'] == expected_item_2['id'] + assert data[0]['id'] == html_block['id'] + assert data[1]['id'] == html_block_2['id'] - def test_unit_replace_children(self) -> None: + def test_unit_replace_children(self): """ Test that we can completely replace/reorder unit children components. """ - data = self._get_container_children(self.unit_with_components["id"]) + lib = self._create_library(slug="containers", title="Container Test Library", description="Units and more") + lib_key = LibraryLocatorV2.from_string(lib["id"]) + + # Create container and add some components + container_data = self._create_container(lib["id"], "unit", display_name="Alpha Bravo", slug=None) + problem_block = self._add_block_to_library(lib["id"], "problem", "Problem1", can_stand_alone=False) + html_block = self._add_block_to_library(lib["id"], "html", "Html1", can_stand_alone=False) + problem_block_2 = self._add_block_to_library(lib["id"], "problem", "Problem2", can_stand_alone=False) + html_block_2 = self._add_block_to_library(lib["id"], "html", "Html2") + self._add_container_components( + container_data["id"], + children_ids=[problem_block["id"], html_block["id"], problem_block_2["id"], html_block_2["id"]] + ) + data = self._get_container_components(container_data["id"]) assert len(data) == 4 - assert data[0]['id'] == self.problem_block['id'] - assert data[1]['id'] == self.html_block['id'] - assert data[2]['id'] == self.problem_block_2['id'] - assert data[3]['id'] == self.html_block_2['id'] + assert data[0]['id'] == problem_block['id'] + assert data[1]['id'] == html_block['id'] + assert data[2]['id'] == problem_block_2['id'] + assert data[3]['id'] == html_block_2['id'] # 
Reorder the components - self._patch_container_children( - self.unit_with_components["id"], - children_ids=[ - self.problem_block["id"], - self.problem_block_2["id"], - self.html_block["id"], - self.html_block_2["id"], - ] + self._patch_container_components( + container_data["id"], + children_ids=[problem_block["id"], problem_block_2["id"], html_block["id"], html_block_2["id"]] ) - data = self._get_container_children(self.unit_with_components["id"]) + data = self._get_container_components(container_data["id"]) assert len(data) == 4 - assert data[0]['id'] == self.problem_block['id'] - assert data[1]['id'] == self.problem_block_2['id'] - assert data[2]['id'] == self.html_block['id'] - assert data[3]['id'] == self.html_block_2['id'] + assert data[0]['id'] == problem_block['id'] + assert data[1]['id'] == problem_block_2['id'] + assert data[2]['id'] == html_block['id'] + assert data[3]['id'] == html_block_2['id'] # Replace with new components - new_problem_block = self._add_block_to_library(self.lib["id"], "problem", "New_Problem", can_stand_alone=False) - new_html_block = self._add_block_to_library(self.lib["id"], "html", "New_Html", can_stand_alone=False) - self._patch_container_children( - self.unit_with_components["id"], + new_problem_block = self._add_block_to_library(lib["id"], "problem", "New_Problem", can_stand_alone=False) + new_html_block = self._add_block_to_library(lib["id"], "html", "New_Html", can_stand_alone=False) + self._patch_container_components( + container_data["id"], children_ids=[new_problem_block["id"], new_html_block["id"]], ) - data = self._get_container_children(self.unit_with_components["id"]) + data = self._get_container_components(container_data["id"]) assert len(data) == 2 assert data[0]['id'] == new_problem_block['id'] assert data[1]['id'] == new_html_block['id'] - def test_subsection_replace_children(self) -> None: + def test_restore_unit(self): """ - Test that we can completely replace/reorder subsection children. 
+ Test restore a deleted unit. """ - data = self._get_container_children(self.subsection_with_units["id"]) - assert len(data) == 4 - assert data[0]['id'] == self.unit['id'] - assert data[1]['id'] == self.unit_with_components['id'] - assert data[2]['id'] == self.unit_2['id'] - assert data[3]['id'] == self.unit_3['id'] - - # Reorder the units - self._patch_container_children( - self.subsection_with_units["id"], - children_ids=[ - self.unit_2["id"], - self.unit["id"], - self.unit_3["id"], - self.unit_with_components["id"], - ] - ) - data = self._get_container_children(self.subsection_with_units["id"]) - assert len(data) == 4 - assert data[0]['id'] == self.unit_2['id'] - assert data[1]['id'] == self.unit['id'] - assert data[2]['id'] == self.unit_3['id'] - assert data[3]['id'] == self.unit_with_components['id'] - - # Replace with new units - new_unit_1 = self._create_container(self.lib["id"], "unit", display_name="New Unit 1", slug=None) - new_unit_2 = self._create_container(self.lib["id"], "unit", display_name="New Unit 2", slug=None) - self._patch_container_children( - self.subsection_with_units["id"], - children_ids=[new_unit_1["id"], new_unit_2["id"]], - ) - data = self._get_container_children(self.subsection_with_units["id"]) - assert len(data) == 2 - assert data[0]['id'] == new_unit_1['id'] - assert data[1]['id'] == new_unit_2['id'] + lib = self._create_library(slug="containers", title="Container Test Library", description="Units and more") + lib_key = LibraryLocatorV2.from_string(lib["id"]) - def test_section_replace_children(self) -> None: - """ - Test that we can completely replace/reorder section children. 
- """ - data = self._get_container_children(self.section_with_subsections["id"]) - assert len(data) == 4 - assert data[0]['id'] == self.subsection['id'] - assert data[1]['id'] == self.subsection_with_units['id'] - assert data[2]['id'] == self.subsection_2['id'] - assert data[3]['id'] == self.subsection_3['id'] - - # Reorder the subsections - self._patch_container_children( - self.section_with_subsections["id"], - children_ids=[ - self.subsection_2["id"], - self.subsection["id"], - self.subsection_3["id"], - self.subsection_with_units["id"], - ] - ) - data = self._get_container_children(self.section_with_subsections["id"]) - assert len(data) == 4 - assert data[0]['id'] == self.subsection_2['id'] - assert data[1]['id'] == self.subsection['id'] - assert data[2]['id'] == self.subsection_3['id'] - assert data[3]['id'] == self.subsection_with_units['id'] - - # Replace with new subsections - new_subsection_1 = self._create_container( - self.lib["id"], - "subsection", - display_name="New Subsection 1", - slug=None, - ) - new_subsection_2 = self._create_container( - self.lib["id"], - "subsection", - display_name="New Subsection 2", - slug=None, - ) - self._patch_container_children( - self.section_with_subsections["id"], - children_ids=[new_subsection_1["id"], new_subsection_2["id"]], - ) - data = self._get_container_children(self.section_with_subsections["id"]) - assert len(data) == 2 - assert data[0]['id'] == new_subsection_1['id'] - assert data[1]['id'] == new_subsection_2['id'] - - @ddt.data( - "unit", - "subsection", - "section", - ) - def test_restore_containers(self, container_type) -> None: - """ - Test restore a deleted container. 
- """ - container = getattr(self, container_type) + # Create a unit: + create_date = datetime(2024, 9, 8, 7, 6, 5, tzinfo=timezone.utc) + with freeze_time(create_date): + container_data = self._create_container(lib["id"], "unit", slug="u1", display_name="Test Unit") - # Delete container - self._delete_container(container["id"]) + # Delete the unit + self._delete_container(container_data["id"]) # Restore container - self._restore_container(container["id"]) - new_container_data = self._get_container(container["id"]) + self._restore_container(container_data["id"]) + new_container_data = self._get_container(container_data["id"]) expected_data = { - "id": container["id"], - "container_type": container_type, - "display_name": container["display_name"], + "id": "lct:CL-TEST:containers:unit:u1", + "container_type": "unit", + "display_name": "Test Unit", "last_published": None, - "published_by": None, + "published_by": "", "last_draft_created": "2024-09-08T07:06:05Z", "last_draft_created_by": 'Bob', 'has_unpublished_changes': True, @@ -560,30 +266,17 @@ def test_restore_containers(self, container_type) -> None: 'collections': [], } - self.assertDictContainsEntries(new_container_data, expected_data) - - @ddt.data( - "unit", - "subsection", - "section", - ) - def test_tag_containers(self, container_type) -> None: - container = getattr(self, container_type) - - assert container["tags_count"] == 0 - tagging_api.tag_object( - container["id"], - self.taxonomy, - ['one', 'three', 'four'], - ) + def test_container_collections(self): + # Create a library + lib = self._create_library(slug="containers", title="Container Test Library", description="Units and more") + lib_key = LibraryLocatorV2.from_string(lib["id"]) - new_container_data = self._get_container(container["id"]) - assert new_container_data["tags_count"] == 3 + # Create a unit + container_data = self._create_container(lib["id"], "unit", display_name="Alpha Bravo", slug=None) - def test_unit_collections(self) -> None: # 
Create a collection col1 = api.create_library_collection( - self.lib_key, + lib_key, "COL1", title="Collection 1", created_by=self.user.id, @@ -591,424 +284,69 @@ def test_unit_collections(self) -> None: ) result = self._patch_container_collections( - self.unit["id"], + container_data["id"], collection_keys=[col1.key], ) assert result['count'] == 1 # Fetch the unit - unit_as_read = self._get_container(self.unit["id"]) + unit_as_read = self._get_container(container_data["id"]) # Verify the collections assert unit_as_read['collections'] == [{"title": col1.title, "key": col1.key}] - def test_section_hierarchy(self): - with self.assertNumQueries(133): - hierarchy = self._get_container_hierarchy(self.section_with_subsections["id"]) - assert hierarchy["object_key"] == self.section_with_subsections["id"] - assert hierarchy["components"] == [ - self._hierarchy_member(self.problem_block), - self._hierarchy_member(self.html_block), - self._hierarchy_member(self.problem_block_2), - self._hierarchy_member(self.html_block_2), - ] - assert hierarchy["units"] == [ - self._hierarchy_member(self.unit), - self._hierarchy_member(self.unit_with_components), - self._hierarchy_member(self.unit_2), - self._hierarchy_member(self.unit_3), - ] - assert hierarchy["subsections"] == [ - self._hierarchy_member(self.subsection), - self._hierarchy_member(self.subsection_with_units), - self._hierarchy_member(self.subsection_2), - self._hierarchy_member(self.subsection_3), - ] - assert hierarchy["sections"] == [ - self._hierarchy_member(self.section_with_subsections), - ] - - def test_subsection_hierarchy(self): - with self.assertNumQueries(95): - hierarchy = self._get_container_hierarchy(self.subsection_with_units["id"]) - assert hierarchy["object_key"] == self.subsection_with_units["id"] - assert hierarchy["components"] == [ - self._hierarchy_member(self.problem_block), - self._hierarchy_member(self.html_block), - self._hierarchy_member(self.problem_block_2), - 
self._hierarchy_member(self.html_block_2), - ] - assert hierarchy["units"] == [ - self._hierarchy_member(self.unit), - self._hierarchy_member(self.unit_with_components), - self._hierarchy_member(self.unit_2), - self._hierarchy_member(self.unit_3), - ] - assert hierarchy["subsections"] == [ - self._hierarchy_member(self.subsection_with_units), - ] - assert hierarchy["sections"] == [ - self._hierarchy_member(self.section_with_subsections), - ] - - def test_units_hierarchy(self): - with self.assertNumQueries(60): - hierarchy = self._get_container_hierarchy(self.unit_with_components["id"]) - assert hierarchy["object_key"] == self.unit_with_components["id"] - assert hierarchy["components"] == [ - self._hierarchy_member(self.problem_block), - self._hierarchy_member(self.html_block), - self._hierarchy_member(self.problem_block_2), - self._hierarchy_member(self.html_block_2), - ] - assert hierarchy["units"] == [ - self._hierarchy_member(self.unit_with_components), - ] - assert hierarchy["subsections"] == [ - self._hierarchy_member(self.subsection_with_units), - ] - assert hierarchy["sections"] == [ - self._hierarchy_member(self.section_with_subsections), - ] - - def test_container_hierarchy_not_found(self): - self._get_container_hierarchy( - "lct:CL-TEST:containers:section:does-not-exist", - expect_response=404, - ) - - def test_block_hierarchy(self): - with self.assertNumQueries(27): - hierarchy = self._get_block_hierarchy(self.problem_block["id"]) - assert hierarchy["object_key"] == self.problem_block["id"] - assert hierarchy["components"] == [ - self._hierarchy_member(self.problem_block), - ] - assert hierarchy["units"] == [ - self._hierarchy_member(self.unit_with_components), - ] - assert hierarchy["subsections"] == [ - self._hierarchy_member(self.subsection_with_units), - ] - assert hierarchy["sections"] == [ - self._hierarchy_member(self.section_with_subsections), - ] - - def test_block_hierarchy_not_found(self): - self._get_block_hierarchy( - 
"lb:CL-TEST:problem:does-not-exist", - expect_response=404, - ) - - def _verify_publish_state( - self, - container_id, - expected_childen_len, - expected_has_unpublished_changes, - expected_published_by, - expected_children_ids, - ) -> None: - """ - Verify the publish state of a container - """ - container = self._get_container(container_id) - assert container["has_unpublished_changes"] == expected_has_unpublished_changes - container_children = self._get_container_children(container_id) - assert len(container_children) == expected_childen_len - - for index in range(expected_childen_len): - assert container_children[index]["id"] == expected_children_ids[index] - assert container_children[index]["has_unpublished_changes"] == expected_has_unpublished_changes - assert container_children[index]["published_by"] == expected_published_by - - def test_publish_unit(self) -> None: + def test_publish_container(self): # pylint: disable=too-many-statements """ - Test that we can publish the changes to a specific unit + Test that we can publish the changes to a specific container """ - html_block_3 = self._add_block_to_library(self.lib["id"], "html", "Html3") - self._add_container_children( - self.unit["id"], - children_ids=[ - self.html_block["id"], - html_block_3["id"], - ] - ) - + lib = self._create_library(slug="containers", title="Container Test Library", description="Units and more") + + # Create two containers and add some components + container1 = self._create_container(lib["id"], "unit", display_name="Alpha Unit", slug=None) + container2 = self._create_container(lib["id"], "unit", display_name="Bravo Unit", slug=None) + problem_block = self._add_block_to_library(lib["id"], "problem", "Problem1", can_stand_alone=False) + html_block = self._add_block_to_library(lib["id"], "html", "Html1", can_stand_alone=False) + html_block2 = self._add_block_to_library(lib["id"], "html", "Html2", can_stand_alone=False) + self._add_container_components(container1["id"], 
children_ids=[problem_block["id"], html_block["id"]]) + self._add_container_components(container2["id"], children_ids=[html_block["id"], html_block2["id"]]) # At first everything is unpublished: - self._verify_publish_state( - container_id=self.unit_with_components["id"], - expected_childen_len=4, - expected_has_unpublished_changes=True, - expected_published_by=None, - expected_children_ids=[ - self.problem_block["id"], - self.html_block["id"], - self.problem_block_2["id"], - self.html_block_2["id"], - ], - ) - self._verify_publish_state( - container_id=self.unit["id"], - expected_childen_len=2, - expected_has_unpublished_changes=True, - expected_published_by=None, - expected_children_ids=[ - self.html_block["id"], - html_block_3["id"], - ], - ) - - # Now publish only unit 1 - self._publish_container(self.unit_with_components["id"]) + c1_before = self._get_container(container1["id"]) + assert c1_before["has_unpublished_changes"] + c1_components_before = self._get_container_components(container1["id"]) + assert len(c1_components_before) == 2 + assert c1_components_before[0]["id"] == problem_block["id"] + assert c1_components_before[0]["has_unpublished_changes"] + assert c1_components_before[0]["published_by"] is None + assert c1_components_before[1]["id"] == html_block["id"] + assert c1_components_before[1]["has_unpublished_changes"] + assert c1_components_before[1]["published_by"] is None + c2_before = self._get_container(container2["id"]) + assert c2_before["has_unpublished_changes"] + + # Now publish only Container 1 + self._publish_container(container1["id"]) # Now it is published: - self._verify_publish_state( - container_id=self.unit_with_components["id"], - expected_childen_len=4, - expected_has_unpublished_changes=False, - expected_published_by=self.user.username, - expected_children_ids=[ - self.problem_block["id"], - self.html_block["id"], - self.problem_block_2["id"], - self.html_block_2["id"], - ], - ) - - # and unit 2 is still unpublished, except for 
the shared HTML block that is also in unit 1: - c2_after = self._get_container(self.unit["id"]) + c1_after = self._get_container(container1["id"]) + assert c1_after["has_unpublished_changes"] is False + c1_components_after = self._get_container_components(container1["id"]) + assert len(c1_components_after) == 2 + assert c1_components_after[0]["id"] == problem_block["id"] + assert c1_components_after[0]["has_unpublished_changes"] is False + assert c1_components_after[0]["published_by"] == self.user.username + assert c1_components_after[1]["id"] == html_block["id"] + assert c1_components_after[1]["has_unpublished_changes"] is False + assert c1_components_after[1]["published_by"] == self.user.username + + # and container 2 is still unpublished, except for the shared HTML block that is also in container 1: + c2_after = self._get_container(container2["id"]) assert c2_after["has_unpublished_changes"] - c2_components_after = self._get_container_children(self.unit["id"]) + c2_components_after = self._get_container_components(container2["id"]) assert len(c2_components_after) == 2 - assert c2_components_after[0]["id"] == self.html_block["id"] - assert c2_components_after[0]["has_unpublished_changes"] is False # published since it's also in unit 1 + assert c2_components_after[0]["id"] == html_block["id"] + assert c2_components_after[0]["has_unpublished_changes"] is False # published since it's also in container 1 assert c2_components_after[0]["published_by"] == self.user.username - assert c2_components_after[1]["id"] == html_block_3["id"] + assert c2_components_after[1]["id"] == html_block2["id"] assert c2_components_after[1]["has_unpublished_changes"] # unaffected assert c2_components_after[1]["published_by"] is None - - def test_copy_container(self) -> None: - """ - Test that we can copy a container and its children. 
- """ - tagging_api.tag_object( - self.section_with_subsections["id"], - self.taxonomy, - ['one', 'three', 'four'], - ) - tagging_api.tag_object( - self.subsection_with_units["id"], - self.taxonomy, - ['one', 'two'], - ) - tagging_api.tag_object( - self.unit_with_components["id"], - self.taxonomy, - ['one'], - ) - self._copy_container(self.section_with_subsections["id"]) - - from openedx.core.djangoapps.content_staging import api as staging_api - - clipboard_data = staging_api.get_user_clipboard(self.user.id) - - assert clipboard_data is not None - assert clipboard_data.content.display_name == "Section with subsections" - assert clipboard_data.content.status == "ready" - assert clipboard_data.content.purpose == "clipboard" - assert clipboard_data.content.block_type == "chapter" - assert str(clipboard_data.source_usage_key) == self.section_with_subsections["id"] - - # Check the tags on the clipboard content: - assert clipboard_data.content.tags == { - 'lb:CL-TEST:containers:html:Html1': {}, - 'lb:CL-TEST:containers:html:Html2': {}, - 'lb:CL-TEST:containers:problem:Problem1': {}, - 'lb:CL-TEST:containers:problem:Problem2': {}, - self.section_with_subsections["id"]: { - str(self.taxonomy.id): ['one', 'three', 'four'], - }, - self.subsection_with_units["id"]: { - str(self.taxonomy.id): ['one', 'two'], - }, - self.unit_with_components["id"]: { - str(self.taxonomy.id): ['one'], - }, - } - - # Test the actual OLX in the clipboard: - olx_data = staging_api.get_staged_content_olx(clipboard_data.content.id) - assert olx_data is not None - assert olx_data == textwrap.dedent(f"""\ - - - - - - - - - - - - - - - - - """) - - def test_publish_subsection(self) -> None: - """ - Test that we can publish the changes to a specific subsection - """ - unit_4 = self._create_container(self.lib["id"], "unit", display_name="Test Unit 4", slug=None) - - # Add units on subsection - self._add_container_children( - self.subsection["id"], - children_ids=[ - self.unit["id"], - unit_4["id"], - ] - 
) - - # TODO -- remove this when containers publish their children: - # https://github.com/openedx/openedx-learning/pull/307 - # Removing the unit with components because the components (children of children) are not published. - # If the unit is kept, the subsection continues to have changes even after it is published. - self._remove_container_children( - self.subsection_with_units["id"], - children_ids=[ - self.unit_with_components["id"], - ] - ) - # /TODO - - # At first everything is unpublished: - self._verify_publish_state( - container_id=self.subsection_with_units["id"], - expected_childen_len=3, - expected_has_unpublished_changes=True, - expected_published_by=None, - expected_children_ids=[ - self.unit["id"], - self.unit_2["id"], - self.unit_3["id"], - ], - ) - self._verify_publish_state( - container_id=self.subsection["id"], - expected_childen_len=2, - expected_has_unpublished_changes=True, - expected_published_by=None, - expected_children_ids=[ - self.unit["id"], - unit_4["id"], - ], - ) - - # Now publish only subsection 1 - self._publish_container(self.subsection_with_units["id"]) - - # Now it is published: - self._verify_publish_state( - container_id=self.subsection_with_units["id"], - expected_childen_len=3, - expected_has_unpublished_changes=False, - expected_published_by=self.user.username, - expected_children_ids=[ - self.unit["id"], - self.unit_2["id"], - self.unit_3["id"], - ], - ) - - # and subsection 2 is still unpublished, except for the shared unit that is also in subsection 1: - c2_after = self._get_container(self.subsection["id"]) - assert c2_after["has_unpublished_changes"] - c2_units_after = self._get_container_children(self.subsection["id"]) - assert len(c2_units_after) == 2 - assert c2_units_after[0]["id"] == self.unit["id"] - assert c2_units_after[0]["has_unpublished_changes"] is False # published since it's also in subsection 1 - assert c2_units_after[0]["published_by"] == self.user.username - assert c2_units_after[1]["id"] == 
unit_4["id"] - assert c2_units_after[1]["has_unpublished_changes"] # unaffected - assert c2_units_after[1]["published_by"] is None - - def test_publish_section(self) -> None: - """ - Test that we can publish the changes to a specific section - """ - subsection_4 = self._create_container(self.lib["id"], "subsection", display_name="Test Subsection 4", slug=None) - self._add_container_children( - self.section["id"], - children_ids=[ - self.subsection["id"], - subsection_4["id"], - ] - ) - - # TODO -- remove this when containers publish their children: - # https://github.com/openedx/openedx-learning/pull/307 - # Removing the subsection with units because the units (children of children) are not published. - # If the subsection is kept, the section continues to have changes even after it is published. - self._remove_container_children( - self.section_with_subsections["id"], - children_ids=[ - self.subsection_with_units["id"], - ] - ) - # /TODO - - # At first everything is unpublished: - self._verify_publish_state( - container_id=self.section_with_subsections["id"], - expected_childen_len=3, - expected_has_unpublished_changes=True, - expected_published_by=None, - expected_children_ids=[ - self.subsection["id"], - self.subsection_2["id"], - self.subsection_3["id"], - ], - ) - self._verify_publish_state( - container_id=self.section["id"], - expected_childen_len=2, - expected_has_unpublished_changes=True, - expected_published_by=None, - expected_children_ids=[ - self.subsection["id"], - subsection_4["id"], - ], - ) - - # Now publish only section 1 - self._publish_container(self.section_with_subsections["id"]) - - # Now it is published: - self._verify_publish_state( - container_id=self.section_with_subsections["id"], - expected_childen_len=3, - expected_has_unpublished_changes=False, - expected_published_by=self.user.username, - expected_children_ids=[ - self.subsection["id"], - self.subsection_2["id"], - self.subsection_3["id"], - ], - ) - - # and section 2 is still 
unpublished, except for the shared subsection that is also in section 1:
-        c2_after = self._get_container(self.section["id"])
-        assert c2_after["has_unpublished_changes"]
-        c2_units_after = self._get_container_children(self.section["id"])
-        assert len(c2_units_after) == 2
-        assert c2_units_after[0]["id"] == self.subsection["id"]
-        assert c2_units_after[0]["has_unpublished_changes"] is False # published since it's also in section 1
-        assert c2_units_after[0]["published_by"] == self.user.username
-        assert c2_units_after[1]["id"] == subsection_4["id"]
-        assert c2_units_after[1]["has_unpublished_changes"] # unaffected
-        assert c2_units_after[1]["published_by"] is None
diff --git a/openedx/core/djangoapps/content_libraries/tests/test_content_libraries.py b/openedx/core/djangoapps/content_libraries/tests/test_content_libraries.py
index 91a9c29a3754..e2fec3aee1ff 100644
--- a/openedx/core/djangoapps/content_libraries/tests/test_content_libraries.py
+++ b/openedx/core/djangoapps/content_libraries/tests/test_content_libraries.py
@@ -2,33 +2,20 @@
 Tests for Learning-Core-based Content Libraries
 """
 from datetime import datetime, timezone
-import os
-import zipfile
-import uuid
-import tempfile
-from io import StringIO
 from unittest import skip
-from unittest.mock import ANY, patch
+from unittest.mock import patch

 import ddt
-import tomlkit
-from bridgekeeper import perms
-from django.core.files.uploadedfile import SimpleUploadedFile
 from django.contrib.auth.models import Group
-from django.db.models import Q
 from django.test import override_settings
 from django.test.client import Client
 from freezegun import freeze_time
-from opaque_keys.edx.locator import LibraryLocatorV2, LibraryUsageLocatorV2, LibraryCollectionLocator
+from opaque_keys.edx.locator import LibraryLocatorV2, LibraryUsageLocatorV2
 from organizations.models import Organization
 from rest_framework.test import APITestCase
-from rest_framework import status
-from openedx_learning.api.authoring_models import 
LearningPackage -from user_tasks.models import UserTaskStatus, UserTaskArtifact from common.djangoapps.student.tests.factories import UserFactory from openedx.core.djangoapps.content_libraries.constants import CC_4_BY -from openedx.core.djangoapps.content_libraries.tasks import LibraryRestoreTask from openedx.core.djangoapps.content_libraries.tests.base import ( URL_BLOCK_GET_HANDLER_URL, URL_BLOCK_METADATA_URL, @@ -36,15 +23,8 @@ URL_BLOCK_XBLOCK_HANDLER, ContentLibrariesRestApiTest, ) -from openedx_authz import api as authz_api -from openedx_authz.constants import roles -from openedx_authz.engine.enforcer import AuthzEnforcer from openedx.core.djangoapps.xblock import api as xblock_api from openedx.core.djangolib.testing.utils import skip_unless_cms -from openedx_authz.constants.permissions import VIEW_LIBRARY - -from ..models import ContentLibrary, ContentLibraryPermission -from ..permissions import CAN_VIEW_THIS_CONTENT_LIBRARY, HasPermissionInContentLibraryScope @skip_unless_cms @@ -89,6 +69,7 @@ def test_library_crud(self): "slug": "téstlꜟط", "title": "A Tést Lꜟطrary", "description": "Just Téstꜟng", + "version": 0, "license": CC_4_BY, "has_unpublished_changes": False, "has_unpublished_deletes": False, @@ -434,11 +415,11 @@ def test_library_blocks_studio_view(self): # Add a 'html' XBlock to the library: create_date = datetime(2024, 6, 6, 6, 6, 6, tzinfo=timezone.utc) with freeze_time(create_date): - block_data = self._add_block_to_library(lib_id, "problem", "problem1") + block_data = self._add_block_to_library(lib_id, "html", "html1") self.assertDictContainsEntries(block_data, { - "id": "lb:CL-TEST:testlib2:problem:problem1", - "display_name": "Blank Problem", - "block_type": "problem", + "id": "lb:CL-TEST:testlib2:html:html1", + "display_name": "Text", + "block_type": "html", "has_unpublished_changes": True, "last_published": None, "published_by": None, @@ -460,14 +441,14 @@ def test_library_blocks_studio_view(self): block_data["has_unpublished_changes"] = 
False block_data["last_published"] = publish_date.isoformat().replace('+00:00', 'Z') block_data["published_by"] = "Bob" - block_data["published_display_name"] = "Blank Problem" + block_data["published_display_name"] = "Text" self.assertDictContainsEntries(self._get_library_block(block_id), block_data) assert self._get_library_blocks(lib_id)['results'] == [block_data] # Now update the block's OLX: orig_olx = self._get_library_block_olx(block_id) - assert '", - expect_response=status.HTTP_200_OK) - self._set_library_block_fields( - block_data["id"], - {"data": "", "metadata": {}}, - expect_response=status.HTTP_200_OK) - self._set_library_block_asset( - block_data["id"], - "static/test.txt", - b"data", - expect_response=status.HTTP_200_OK) - # They can remove blocks - self._delete_library_block(block_data["id"], expect_response=status.HTTP_200_OK) - # Verify deletion - self._get_library_block(block_data["id"], expect_response=404) - - # Recreate blocks for further tests - block_data = self._add_block_to_library(self.lib_id, "problem", "new_problem") - - for user in self._all_users_excluding(self.library_editors): - with self.as_user(user): - self._add_block_to_library( - self.lib_id, - "problem", - "problem1", - expect_response=status.HTTP_403_FORBIDDEN) - # They can't modify blocks - self._set_library_block_olx( - block_data["id"], - "", - expect_response=status.HTTP_403_FORBIDDEN) - self._set_library_block_fields( - block_data["id"], - {"data": "", "metadata": {}}, - expect_response=status.HTTP_403_FORBIDDEN) - self._set_library_block_asset( - block_data["id"], - "static/test.txt", - b"data", - expect_response=status.HTTP_403_FORBIDDEN) - # They can't remove blocks - self._delete_library_block(block_data["id"], expect_response=status.HTTP_403_FORBIDDEN) - - def test_publish_permissions(self): - """ - Verify that only users with publish permissions can publish. 
-        """
-        # Test publish access
-        for user in self.library_publishers:
-            with self.as_user(user):
-                block_data = self._add_block_to_library(self.lib_id, "problem", f"problem_{user.username}_1")
-                self._publish_library_block(block_data["id"], expect_response=status.HTTP_200_OK)
-                block_data = self._add_block_to_library(self.lib_id, "problem", f"problem_{user.username}_2")
-                assert self._get_library(self.lib_id)['has_unpublished_changes'] is True
-                self._commit_library_changes(self.lib_id, expect_response=status.HTTP_200_OK)
-                assert self._get_library(self.lib_id)['has_unpublished_changes'] is False
-
-        block_data = self._add_block_to_library(self.lib_id, "problem", "draft_problem")
-        assert self._get_library(self.lib_id)['has_unpublished_changes'] is True
-
-        for user in self._all_users_excluding(self.library_publishers):
-            with self.as_user(user):
-                self._publish_library_block(block_data["id"], expect_response=status.HTTP_403_FORBIDDEN)
-                self._commit_library_changes(self.lib_id, expect_response=status.HTTP_403_FORBIDDEN)
-                # Verify that no changes were published
-                assert self._get_library(self.lib_id)['has_unpublished_changes'] is True
-
-    def test_collection_permissions(self):
-        """
-        Verify that only users with collection permissions can perform collection actions.
-        """
-        library_key = LibraryLocatorV2.from_string(self.lib_id)
-        block_data = self._add_block_to_library(self.lib_id, "problem", "collection_problem")
-        # Test library collection access
-        for user in self.library_collection_editors:
-            with self.as_user(user):
-                # Create collection
-                collection_data = self._create_collection(
-                    self.lib_id,
-                    title=f"Temp Collection {user.username}",
-                    expect_response=status.HTTP_200_OK)
-                collection_id = collection_data["key"]
-                collection_key = LibraryCollectionLocator(lib_key=library_key, collection_id=collection_id)
-                # Update collection
-                self._update_collection(collection_key, title="Updated Collection", expect_response=status.HTTP_200_OK)
-                self._add_items_to_collection(
-                    collection_key,
-                    item_keys=[block_data["id"]],
-                    expect_response=status.HTTP_200_OK)
-                # Delete collection
-                self._soft_delete_collection(collection_key, expect_response=status.HTTP_204_NO_CONTENT)
-
-        collection_data = self._create_collection(
-            self.lib_id,
-            title="New Temp Collection",
-            expect_response=status.HTTP_200_OK)
-        collection_id = collection_data["key"]
-        collection_key = LibraryCollectionLocator(lib_key=library_key, collection_id=collection_id)
-
-        for user in self._all_users_excluding(self.library_collection_editors):
-            with self.as_user(user):
-                # Attempt to create collection
-                self._create_collection(
-                    self.lib_id,
-                    title="Unauthorized Collection",
-                    expect_response=status.HTTP_403_FORBIDDEN)
-                # Attempt to update collection
-                self._update_collection(
-                    collection_key,
-                    title="Unauthorized Change",
-                    expect_response=status.HTTP_403_FORBIDDEN)
-                self._add_items_to_collection(
-                    collection_key,
-                    item_keys=[block_data["id"]],
-                    expect_response=status.HTTP_403_FORBIDDEN)
-                # Attempt to delete collection
-                self._soft_delete_collection(collection_key, expect_response=status.HTTP_403_FORBIDDEN)
-
-    def test_delete_library_permissions(self):
-        """
-        Verify that only users with delete permissions can delete a library.
-        """
-        # Test library delete access
-        for user in self._all_users_excluding(self.library_deleters):
-            with self.as_user(user):
-                result = self._delete_library(self.lib_id, expect_response=status.HTTP_403_FORBIDDEN)
-                assert 'detail' in result  # Error message
-                assert 'permission' in result['detail'].lower()
-
-        for user in self.library_deleters:
-            with self.as_user(user):
-                result = self._delete_library(self.lib_id, expect_response=status.HTTP_200_OK)
-                assert result == {}
diff --git a/openedx/core/djangoapps/content_libraries/tests/test_events.py b/openedx/core/djangoapps/content_libraries/tests/test_events.py
index 975cfbafb4d9..e0e3e7392765 100644
--- a/openedx/core/djangoapps/content_libraries/tests/test_events.py
+++ b/openedx/core/djangoapps/content_libraries/tests/test_events.py
@@ -308,8 +308,8 @@ def test_publish_all_lib_changes(self) -> None:
         problem_block = self._add_block_to_library(self.lib1_key, "problem", "Problem1", can_stand_alone=False)
         html_block = self._add_block_to_library(self.lib1_key, "html", "Html1", can_stand_alone=False)
         html_block2 = self._add_block_to_library(self.lib1_key, "html", "Html2", can_stand_alone=False)
-        self._add_container_children(container1["id"], children_ids=[problem_block["id"], html_block["id"]])
-        self._add_container_children(container2["id"], children_ids=[html_block["id"], html_block2["id"]])
+        self._add_container_components(container1["id"], children_ids=[problem_block["id"], html_block["id"]])
+        self._add_container_components(container2["id"], children_ids=[html_block["id"], html_block2["id"]])
 
         # Now publish only Container 2 (which will auto-publish both HTML blocks since they're children)
         self._publish_container(container2["id"])
@@ -351,7 +351,7 @@ def test_publish_child_block(self) -> None:
         # Create a container and a block
         container1 = self._create_container(self.lib1_key, "unit", display_name="Alpha Unit", slug=None)
         problem_block = self._add_block_to_library(self.lib1_key, "problem", "Problem1", can_stand_alone=False)
-        self._add_container_children(container1["id"], children_ids=[problem_block["id"]])
+        self._add_container_components(container1["id"], children_ids=[problem_block["id"]])
 
         # Publish all changes
         self._commit_library_changes(self.lib1_key)
         assert self._get_container(container1["id"])["has_unpublished_changes"] is False
@@ -396,8 +396,8 @@ def test_publish_container(self) -> None:
         problem_block = self._add_block_to_library(self.lib1_key, "problem", "Problem1", can_stand_alone=False)
         html_block = self._add_block_to_library(self.lib1_key, "html", "Html1", can_stand_alone=False)
         html_block2 = self._add_block_to_library(self.lib1_key, "html", "Html2", can_stand_alone=False)
-        self._add_container_children(container1["id"], children_ids=[problem_block["id"], html_block["id"]])
-        self._add_container_children(container2["id"], children_ids=[html_block["id"], html_block2["id"]])
+        self._add_container_components(container1["id"], children_ids=[problem_block["id"], html_block["id"]])
+        self._add_container_components(container2["id"], children_ids=[html_block["id"], html_block2["id"]])
 
         # At first everything is unpublished:
         c1_before = self._get_container(container1["id"])
         assert c1_before["has_unpublished_changes"]
@@ -449,53 +449,6 @@ def test_publish_container(self) -> None:
         c2_after = self._get_container(container2["id"])
         assert c2_after["has_unpublished_changes"]
 
-    def test_publish_child_container(self):
-        """
-        Test the events that get emitted when we publish the changes to a container that is child of another container
-        """
-        # Create some containers
-        unit = self._create_container(self.lib1_key, "unit", display_name="Alpha Unit", slug=None)
-        subsection = self._create_container(self.lib1_key, "subsection", display_name="Bravo Subsection", slug=None)
-
-        # Add one container as child
-        self._add_container_children(subsection["id"], children_ids=[unit["id"]])
-
-        # At first everything is unpublished:
-        c1_before = self._get_container(unit["id"])
-        assert c1_before["has_unpublished_changes"]
-        c2_before = self._get_container(subsection["id"])
-        assert c2_before["has_unpublished_changes"]
-
-        # clear event log after the initial mock data setup is complete:
-        self.clear_events()
-
-        # Now publish only the unit
-        self._publish_container(unit["id"])
-
-        # Now it is published:
-        c1_after = self._get_container(unit["id"])
-        assert c1_after["has_unpublished_changes"] is False
-
-        # And publish events were emitted:
-        self.expect_new_events(
-            {  # An event for the unit being published:
-                "signal": LIBRARY_CONTAINER_PUBLISHED,
-                "library_container": LibraryContainerData(
-                    container_key=LibraryContainerLocator.from_string(unit["id"]),
-                ),
-            },
-            {  # An event for parent (subsection):
-                "signal": LIBRARY_CONTAINER_PUBLISHED,
-                "library_container": LibraryContainerData(
-                    container_key=LibraryContainerLocator.from_string(subsection["id"]),
-                ),
-            },
-        )
-
-        # note that subsection is still unpublished
-        c2_after = self._get_container(subsection["id"])
-        assert c2_after["has_unpublished_changes"]
-
     def test_restore_unit(self) -> None:
         """
         Test restoring a deleted unit via the "restore" API.
diff --git a/openedx/core/djangoapps/notifications/email/utils.py b/openedx/core/djangoapps/notifications/email/utils.py
index 1f761c086095..cfc9791ae98a 100644
--- a/openedx/core/djangoapps/notifications/email/utils.py
+++ b/openedx/core/djangoapps/notifications/email/utils.py
@@ -2,31 +2,32 @@
 Email Notifications Utils
 """
 import datetime
+import json
 
 from bs4 import BeautifulSoup
 from django.conf import settings
 from django.contrib.auth import get_user_model
-from django.core.exceptions import BadRequest
 from django.shortcuts import get_object_or_404
-from django.utils.translation import gettext as _
 from pytz import utc
 from waffle import get_waffle_flag_model  # pylint: disable=invalid-django-waffle-import
 
+from common.djangoapps.student.models import CourseEnrollment
 from lms.djangoapps.branding.api import get_logo_url_for_email
-from lms.djangoapps.discussion.notification_prefs.views import UsernameCipher, UsernameDecryptionException
-from openedx.core.djangoapps.lang_pref import LANGUAGE_KEY
+from lms.djangoapps.discussion.notification_prefs.views import UsernameCipher
 from openedx.core.djangoapps.notifications.base_notification import COURSE_NOTIFICATION_APPS, COURSE_NOTIFICATION_TYPES
 from openedx.core.djangoapps.notifications.config.waffle import ENABLE_EMAIL_NOTIFICATIONS
 from openedx.core.djangoapps.notifications.email import ONE_CLICK_EMAIL_UNSUB_KEY
 from openedx.core.djangoapps.notifications.email_notifications import EmailCadence
 from openedx.core.djangoapps.notifications.events import notification_preference_unsubscribe_event
-from openedx.core.djangoapps.notifications.models import NotificationPreference
+from openedx.core.djangoapps.notifications.models import (
+    CourseNotificationPreference,
+    get_course_notification_preference_config_version
+)
 from openedx.core.djangoapps.user_api.models import UserPreference
 from xmodule.modulestore.django import modulestore
 
 from .notification_icons import NotificationTypeIcons
 
-
 User = get_user_model()
@@ -66,12 +67,13 @@ def get_icon_url_for_notification_type(notification_type):
     return NotificationTypeIcons.get_icon_url_for_notification_type(notification_type)
 
 
-def get_unsubscribe_link(username):
+def get_unsubscribe_link(username, patch):
     """
     Returns unsubscribe url for username with patch preferences
     """
     encrypted_username = encrypt_string(username)
-    return f"{settings.LEARNING_MICROFRONTEND_URL}/preferences-unsubscribe/{encrypted_username}/"
+    encrypted_patch = encrypt_object(patch)
+    return f"{settings.LEARNING_MICROFRONTEND_URL}/preferences-unsubscribe/{encrypted_username}/{encrypted_patch}"
 
 
 def create_email_template_context(username):
@@ -97,10 +99,9 @@ def create_email_template_context(username):
         "platform_name": settings.PLATFORM_NAME,
         "mailing_address": settings.CONTACT_MAILING_ADDRESS,
         "logo_url": get_logo_url_for_email(),
-        "logo_notification_cadence_url": settings.NOTIFICATION_DIGEST_LOGO,
         "social_media": social_media_info,
         "notification_settings_url": f"{account_base_url}/#notifications",
-        "unsubscribe_url": get_unsubscribe_link(username)
+        "unsubscribe_url": get_unsubscribe_link(username, patch)
     }
 
 
@@ -117,25 +118,17 @@ def create_email_digest_context(app_notifications_dict, username, start_date, en
     context = create_email_template_context(username)
     start_date_str = create_datetime_string(start_date)
     end_date_str = create_datetime_string(end_date if end_date else start_date)
-    email_digest_updates = [
+    email_digest_updates = [{
+        'title': 'Total Notifications',
+        'count': sum(value['count'] for value in app_notifications_dict.values())
+    }]
+    email_digest_updates.extend([
         {
             'title': value['title'],
             'count': value['count'],
-            'translated_title': value.get('translated_title', value['title']),
         }
         for key, value in app_notifications_dict.items()
-    ]
-    lookup = {
-        'Updates': 1,
-        'Grading': 2,
-        'Discussion': 3,
-    }
-    email_digest_updates.sort(key=lambda x: lookup.get(x['title'], 4), reverse=False)
-    email_digest_updates.append({
-        'title': 'Total Notifications',
-        'translated_title': _('Total Notifications'),
-        'count': sum(value['count'] for value in app_notifications_dict.values())
-    })
+    ])
     email_content = []
     notifications_in_app = 5
@@ -143,7 +136,6 @@ def create_email_digest_context(app_notifications_dict, username, start_date, en
         total = value['count']
         app_content = {
             'title': value['title'],
-            'translated_title': value.get('translated_title', value['title']),
             'help_text': value.get('help_text', ''),
             'help_text_url': value.get('help_text_url', ''),
             'notifications': add_additional_attributes_to_notifications(
@@ -209,7 +201,7 @@ def get_time_ago(datetime_obj):
     current_date = utc.localize(datetime.datetime.today())
     days_diff = (current_date - datetime_obj).days
     if days_diff == 0:
-        return _("Today")
+        return "Today"
     if days_diff >= 7:
         return f"{int(days_diff / 7)}w"
     return f"{days_diff}d"
@@ -248,7 +240,6 @@ def add_additional_attributes_to_notifications(notifications, courses_data=None)
         notification.time_ago = get_time_ago(notification.created)
         notification.email_content = add_zero_margin_to_root(notification.content)
         notification.details = add_zero_margin_to_root(notification.content_context.get('email_content', ''))
-        notification.view_text = get_text_for_notification_type(notification_type)
     return notifications
 
 
@@ -262,7 +253,6 @@ def create_app_notifications_dict(notifications):
         name: {
             'count': 0,
             'title': name.title(),
-            'translated_title': get_translated_app_title(name),
             'notifications': []
         }
         for name in app_names
@@ -320,57 +310,6 @@ def filter_notification_with_email_enabled_preferences(notifications, preference
     return filtered_notifications
 
 
-def create_missing_account_level_preferences(notifications, preferences, user):
-    """
-    Creates missing account level preferences for notifications
-    """
-    preferences = list(preferences)
-    notification_types = list(set(notification.notification_type for notification in notifications))
-    missing_prefs = []
-    for notification_type in notification_types:
-        if not any(preference.type == notification_type for preference in preferences):
-            type_pref = COURSE_NOTIFICATION_TYPES.get(notification_type, {})
-            app_name = type_pref["notification_app"]
-            if type_pref.get('is_core', False):
-                app_pref = COURSE_NOTIFICATION_APPS.get(app_name, {})
-                default_pref = {
-                    "web": app_pref["core_web"],
-                    "push": app_pref["core_push"],
-                    "email": app_pref["core_email"],
-                    "email_cadence": app_pref["core_email_cadence"]
-                }
-            else:
-                default_pref = COURSE_NOTIFICATION_TYPES.get(notification_type, {})
-            missing_prefs.append(
-                NotificationPreference(
-                    user=user, type=notification_type, app=app_name, web=default_pref['web'],
-                    push=default_pref['push'], email=default_pref['email'], email_cadence=default_pref['email_cadence'],
-                )
-            )
-    if missing_prefs:
-        created_prefs = NotificationPreference.objects.bulk_create(missing_prefs, ignore_conflicts=True)
-        preferences = preferences + list(created_prefs)
-    return preferences
-
-
-def filter_email_enabled_notifications(notifications, preferences, user, cadence_type=EmailCadence.DAILY):
-    """
-    Filter notifications with email enabled in account level preferences
-    """
-    preferences = create_missing_account_level_preferences(notifications, preferences, user)
-    enabled_course_prefs = [
-        preference.type
-        for preference in preferences
-        if preference.email and preference.email_cadence == cadence_type
-    ]
-    filtered_notifications = []
-    for notification in notifications:
-        if notification.notification_type in enabled_course_prefs:
-            filtered_notifications.append(notification)
-    filtered_notifications.sort(key=lambda elem: elem.created, reverse=True)
-    return filtered_notifications
-
-
 def encrypt_string(string):
     """
     Encrypts input string
@@ -385,20 +324,23 @@ def decrypt_string(string):
     return UsernameCipher.decrypt(string).decode()
 
 
-def username_from_hash(group, request):
+def encrypt_object(obj):
     """
-    Django ratelimit key to return username from hash
+    Returns hashed string of object
     """
-    username = request.resolver_match.kwargs.get("username")
-    if username:
-        try:
-            return decrypt_string(username)
-        except UsernameDecryptionException as exc:
-            raise BadRequest("Bad request") from exc
-    return None
+    string = json.dumps(obj)
+    return encrypt_string(string)
+
+
+def decrypt_object(string):
+    """
+    Decrypts input string and returns an object
+    """
+    decoded = decrypt_string(string)
+    return json.loads(decoded)
 
 
-def update_user_preferences_from_patch(encrypted_username):
+def update_user_preferences_from_patch(encrypted_username, encrypted_patch):
     """
     Decrypt username and patch and updates user preferences
     Allowed parameters for decrypted patch
@@ -409,59 +351,84 @@ def update_user_preferences_from_patch(encrypted_username):
         course_id: course key string
     """
     username = decrypt_string(encrypted_username)
-    user = get_object_or_404(User, username=username)
-
-    NotificationPreference.create_default_preferences_for_user(user.id)
+    patch = decrypt_object(encrypted_patch)
 
-    updated_count = NotificationPreference.objects.filter(user=user, email=True).update(email=False)
-    is_preference_updated = updated_count > 0
+    app_value = patch.get("app_name")
+    type_value = patch.get("notification_type")
+    channel_value = patch.get("channel")
+    pref_value = bool(patch.get("value", False))
+    user = get_object_or_404(User, username=username)
 
-    notification_preference_unsubscribe_event(user, is_preference_updated)
-    UserPreference.objects.get_or_create(user_id=user.id, key=ONE_CLICK_EMAIL_UNSUB_KEY)
+    kwargs = {'user': user}
+    if 'course_id' in patch.keys():
+        kwargs['course_id'] = patch['course_id']
+
+    def is_name_match(name, param_name):
+        """
+        Name is match if strings are equal or param_name is None
+        """
+        return True if param_name is None else name == param_name
+
+    def get_default_cadence_value(app_name, notification_type):
+        """
+        Returns default email cadence value
+        """
+        if notification_type == 'core':
+            return COURSE_NOTIFICATION_APPS[app_name]['core_email_cadence']
+        return COURSE_NOTIFICATION_TYPES[notification_type]['email_cadence']
+
+    def get_updated_preference(pref):
+        """
+        Update preference if config version doesn't match
+        """
+        if pref.config_version != get_course_notification_preference_config_version():
+            pref = pref.get_user_course_preference(pref.user_id, pref.course_id)
+        return pref
+
+    course_ids = CourseEnrollment.objects.filter(user=user, is_active=True).values_list('course_id', flat=True)
+    CourseNotificationPreference.objects.bulk_create(
+        [
+            CourseNotificationPreference(user=user, course_id=course_id)
+            for course_id in course_ids
+        ],
+        ignore_conflicts=True
+    )
+    preferences = CourseNotificationPreference.objects.filter(**kwargs)
+    is_preference_updated = False
+
+    # pylint: disable=too-many-nested-blocks
+    for preference in preferences:
+        preference = get_updated_preference(preference)
+        preference_json = preference.notification_preference_config
+        for app_name, app_prefs in preference_json.items():
+            if not is_name_match(app_name, app_value):
+                continue
+            for noti_type, type_prefs in app_prefs['notification_types'].items():
+                if not is_name_match(noti_type, type_value):
+                    continue
+                for channel in ['web', 'email', 'push']:
+                    if not is_name_match(channel, channel_value):
+                        continue
+                    if is_notification_type_channel_editable(app_name, noti_type, channel):
+                        if type_prefs[channel] != pref_value:
+                            type_prefs[channel] = pref_value
+                            is_preference_updated = True
+
+                        if channel == 'email' and pref_value and type_prefs.get('email_cadence') == EmailCadence.NEVER:
+                            default_cadence = get_default_cadence_value(app_name, noti_type)
+                            if type_prefs['email_cadence'] != default_cadence:
+                                type_prefs['email_cadence'] = default_cadence
+                                is_preference_updated = True
+        preference.save()
+    notification_preference_unsubscribe_event(user, is_preference_updated)
+    if app_value is None and type_value is None and channel_value == 'email' and not pref_value:
+        UserPreference.objects.get_or_create(user_id=user.id, key=ONE_CLICK_EMAIL_UNSUB_KEY)
 
 
 def is_notification_type_channel_editable(app_name, notification_type, channel):
     """
     Returns if notification type channel is editable
     """
-    notification_type = 'core'\
-        if COURSE_NOTIFICATION_TYPES.get(notification_type, {}).get("is_core", False)\
-        else notification_type
     if notification_type == 'core':
         return channel not in COURSE_NOTIFICATION_APPS[app_name]['non_editable']
     return channel not in COURSE_NOTIFICATION_TYPES[notification_type]['non_editable']
-
-
-def get_translated_app_title(name):
-    """
-    Returns translated string from notification app_name key
-    """
-    mapping = {
-        'discussion': _('Discussion'),
-        'updates': _('Updates'),
-        'grading': _('Grades'),
-    }
-    return mapping.get(name, '')
-
-
-def get_language_preference_for_users(user_ids):
-    """
-    Returns mapping of user_id and language preference for users
-    """
-    prefs = UserPreference.get_preference_for_users(user_ids, LANGUAGE_KEY)
-    return {pref.user_id: pref.value for pref in prefs}
-
-
-def get_text_for_notification_type(notification_type):
-    """
-    Returns text for notification type
-    """
-    app_name = COURSE_NOTIFICATION_TYPES.get(notification_type, {}).get('notification_app')
-    if not app_name:
-        return ""
-    mapping = {
-        'discussion': _('discussion'),
-        'updates': _('update'),
-        'grading': _('assessment'),
-    }
-    return mapping.get(app_name, "")
diff --git a/openedx/core/djangoapps/user_api/accounts/views.py b/openedx/core/djangoapps/user_api/accounts/views.py
index c3ff6ce7a2f2..ee00c9bc1ddd 100644
--- a/openedx/core/djangoapps/user_api/accounts/views.py
+++ b/openedx/core/djangoapps/user_api/accounts/views.py
@@ -67,6 +67,7 @@
 from openedx.core.djangoapps.user_api import accounts
 from openedx.core.djangoapps.user_api.accounts.image_helpers import get_profile_image_names, set_has_profile_image
 from openedx.core.djangoapps.user_api.accounts.utils import handle_retirement_cancellation
+from openedx.core.djangoapps.user_authn.cookies import delete_logged_in_cookies
 from openedx.core.djangoapps.user_authn.exceptions import AuthFailedError
 from openedx.core.lib.api.authentication import BearerAuthentication, BearerAuthenticationAllowInactiveUser
 from openedx.core.lib.api.parsers import MergePatchParser
@@ -593,7 +594,11 @@ def post(self, request):
 
             # Log the user out.
             logout(request)
-            return Response(status=status.HTTP_204_NO_CONTENT)
+
+            # EDLYCUSTOM: delete cookies after account deletion, to restrict wordpress access
+            response = Response(status=status.HTTP_204_NO_CONTENT)
+            delete_logged_in_cookies(response)
+            return response
         except KeyError:
             log.exception(f"Username not specified {request.user}")
             return Response("Username not specified.", status=status.HTTP_404_NOT_FOUND)
diff --git a/openedx/core/djangoapps/user_authn/views/password_reset.py b/openedx/core/djangoapps/user_authn/views/password_reset.py
index 321101fb9a18..9a0aa18aa141 100644
--- a/openedx/core/djangoapps/user_authn/views/password_reset.py
+++ b/openedx/core/djangoapps/user_authn/views/password_reset.py
@@ -21,6 +21,7 @@
 from django.utils.translation import gettext_lazy as _
 from django.views.decorators.csrf import csrf_exempt, ensure_csrf_cookie
 from django.views.decorators.http import require_POST
+from edly_features_app.filters import ResetPasswordRequested
 from edx_ace import ace
 from edx_ace.recipient import Recipient
 from eventtracking import tracker
@@ -199,6 +200,14 @@ def clean_email(self):
         # The line below contains the only change, removing is_active=True
         self.users_cache = User.objects.filter(email__iexact=email)
 
+        # EDLYCUSTOM: This filter allows filtering users before resetting password
+        try:
+            self.users_cache = ResetPasswordRequested.run_filter(
+                users=self.users_cache,
+            )
+        except ResetPasswordRequested.PreventResetPassword as exc:
+            raise forms.ValidationError(str(exc)) from exc
+
         if not self.users_cache and is_secondary_email_feature_enabled():
             # Check if user has entered the secondary email.
             self.users_cache = User.objects.filter(
diff --git a/openedx/core/djangoapps/user_authn/views/register.py b/openedx/core/djangoapps/user_authn/views/register.py
index 2fc0818a3ab9..c19de97924e0 100644
--- a/openedx/core/djangoapps/user_authn/views/register.py
+++ b/openedx/core/djangoapps/user_authn/views/register.py
@@ -22,11 +22,12 @@
 from django.views.decorators.csrf import csrf_exempt, ensure_csrf_cookie
 from django.views.decorators.debug import sensitive_post_parameters
 from django_countries import countries
+from edly_features_app.filters import RegistrationValidationRequested
 from edx_django_utils.monitoring import set_custom_attribute
 from openedx_events.learning.data import UserData, UserPersonalData
 from openedx_events.learning.signals import STUDENT_REGISTRATION_COMPLETED
 from openedx_filters.learning.filters import StudentRegistrationRequested
-from zoneinfo import ZoneInfo
+from pytz import UTC
 from django_ratelimit.decorators import ratelimit
 from requests import HTTPError
 from rest_framework.response import Response
@@ -371,7 +372,7 @@ def _track_user_registration(user, profile, params, third_party_provider, regist
         'name': profile.name,
         # Mailchimp requires the age & yearOfBirth to be integers, we send a sane integer default if falsey.
         'age': profile.age or -1,
-        'yearOfBirth': profile.year_of_birth or datetime.datetime.now(ZoneInfo("UTC")).year,
+        'yearOfBirth': profile.year_of_birth or datetime.datetime.now(UTC).year,
         'education': profile.level_of_education_display,
         'address': profile.mailing_address,
         'gender': profile.gender_display,
@@ -530,9 +531,7 @@ def _record_utm_registration_attribution(request, user):
         # We divide by 1000 here because the javascript timestamp generated is in milliseconds not seconds.
         # PYTHON: time.time() => 1475590280.823698
         # JS: new Date().getTime() => 1475590280823
-        created_at_datetime = datetime.datetime.fromtimestamp(
-            int(created_at_unixtime) / float(1000), tz=ZoneInfo("UTC")
-        )
+        created_at_datetime = datetime.datetime.fromtimestamp(int(created_at_unixtime) / float(1000), tz=UTC)
         UserAttribute.set_user_attribute(
             user,
             REGISTRATION_UTM_CREATED_AT,
@@ -595,15 +594,12 @@ def post(self, request):
             data['username'] = get_auto_generated_username(data)
 
         try:
-            # .. filter_implemented_name: StudentRegistrationRequested
-            # .. filter_type: org.openedx.learning.student.registration.requested.v1
             data = StudentRegistrationRequested.run_filter(form_data=data)
         except StudentRegistrationRequested.PreventRegistration as exc:
             errors = {
                 "error_message": [{"user_message": str(exc)}],
             }
-            error_code = getattr(exc, "error_code", None)
-            return self._create_response(request, errors, status_code=exc.status_code, error_code=error_code)
+            return self._create_response(request, errors, status_code=exc.status_code, error_code=exc.error_code)
 
         response = self._handle_duplicate_email_username(request, data)
         if response:
@@ -922,4 +918,10 @@ def update_validations(field_name):
         if self.username_suggestions:
             response_dict['username_suggestions'] = self.username_suggestions
 
+        # EDLYCUSTOM: we need to return extra info when adding user from panel
+        response_dict, _ = RegistrationValidationRequested.run_filter(
+            response_dict=response_dict,
+            request=request,
+        )
+
         return Response(response_dict)
diff --git a/openedx/core/release.py b/openedx/core/release.py
index dda3add782a0..9df0e2f15c5a 100644
--- a/openedx/core/release.py
+++ b/openedx/core/release.py
@@ -8,7 +8,7 @@
 # The release line: an Open edX release name ("ficus"), or "master".
 # This should always be "master" on the master branch, and will be changed
 # manually when we start release-line branches, like open-release/ficus.master.
-RELEASE_LINE = "ulmo"
+RELEASE_LINE = "teak"
 
 
 def doc_version():
diff --git a/requirements/common_constraints.txt b/requirements/common_constraints.txt
index 1f3e81f50334..fc92ee019407 100644
--- a/requirements/common_constraints.txt
+++ b/requirements/common_constraints.txt
@@ -16,16 +16,13 @@
 # this file from Github directly. It does not require packaging in edx-lint.
 
 # using LTS django version
-Django<6.0
+Django<5.0
 
 # elasticsearch>=7.14.0 includes breaking changes in it which caused issues in discovery upgrade process.
 # elastic search changelog: https://www.elastic.co/guide/en/enterprise-search/master/release-notes-7.14.0.html
 # See https://github.com/openedx/edx-platform/issues/35126 for more info
 elasticsearch<7.14.0
 
-# pip 25.3 is incompatible with pip-tools hence causing failures during the build process
-# Make upgrade command and all requirements upgrade jobs are broken due to this.
-# See issue https://github.com/openedx/public-engineering/issues/440 for details regarding the ongoing fix.
-# The constraint can be removed once a release (pip-tools > 7.5.1) is available with support for pip 25.3
-# Issue to track this dependency and unpin later on: https://github.com/openedx/edx-lint/issues/503
-pip<25.3
+# Cause: https://github.com/openedx/edx-lint/issues/458
+# This can be unpinned once https://github.com/openedx/edx-lint/issues/459 has been resolved.
+pip<24.3
diff --git a/requirements/constraints.txt b/requirements/constraints.txt
index 6b5f3e38c2ed..ba888ac023ba 100644
--- a/requirements/constraints.txt
+++ b/requirements/constraints.txt
@@ -13,22 +13,41 @@
 # This file contains all common constraints for edx-repos
 -c common_constraints.txt
 
-# Date: 2025-10-07
-# Stay on LTS version, remove once this is added to common constraint
-Django<6.0
-
 # Date: 2020-02-26
 # As it is not clarified what exact breaking changes will be introduced as per
 # the next major release, ensure the installed version is within boundaries.
 # Issue for unpinning: https://github.com/openedx/edx-platform/issues/35280
 celery>=5.2.2,<6.0.0
 
+# Date: 2022-07-20
+# edx-enterprise, snowflake-connector-python require charset-normalizer==2.0.0
+# Can be removed once snowflake-connector-python>2.7.9 is released with the fix.
+# Issue for unpinning: https://github.com/openedx/edx-platform/issues/35278
+charset-normalizer<2.1.0
+
+# Date: 2024-02-02
+# Stay on LTS version, remove once this is added to common constraint
+Django<5.0
+
 # Date: 2020-02-10
 # django-oauth-toolkit version >=2.0.0 has breaking changes. More details
 # mentioned on this issue https://github.com/openedx/edx-platform/issues/32884
 # Issue for unpinning: https://github.com/openedx/edx-platform/issues/35277
 django-oauth-toolkit==1.7.1
 
+# Date: 2024-02-02
+# incremental upgrade
+django-simple-history==3.4.0
+
+# Date: 2021-05-17
+# greater version has breaking changes and requires some migration steps.
+# Issue for unpinning: https://github.com/openedx/edx-platform/issues/35276
+django-webpack-loader==0.7.0
+
+# Date: 2023-06-20
+# Adding pin to avoid any major upgrade
+djangorestframework<3.15.0
+
 # Date: 2024-07-19
 # Generally speaking, the major version of django-stubs must either match the major version
 # of django, or exceed it by 1. So, we will need to perpetually constrain django-stubs and
@@ -38,11 +57,32 @@ django-oauth-toolkit==1.7.1
 # Issue: https://github.com/openedx/edx-platform/issues/35275
 django-stubs<6
 
+# Date: 2024-07-23
+# django-storages==1.14.4 breaks course imports
+# Two lines were added in 1.14.4 that make file_exists_in_storage function always return False,
+# as the default value of AWS_S3_FILE_OVERWRITE is True
+# Issue for unpinning: https://github.com/openedx/edx-platform/issues/35170
+django-storages<1.14.4
+
 # Date: 2019-08-16
 # The team that owns this package will manually bump this package rather than having it pulled in automatically.
 # This is to allow them to better control its deployment and to do it in a process that works better
 # for them.
-edx-enterprise==6.5.1
+edx-enterprise==5.12.7
+
+# Date: 2024-05-09
+# This has to be constrained as well because newer versions of edx-i18n-tools need the
+# newer version of lxml but that requirement was not made expilict in the 1.6.0 version
+# of the package. This can be un-pinned when we're upgrading lxml.
+# Issue for unpinning: https://github.com/openedx/edx-platform/issues/35274
+edx-i18n-tools<1.6.0
+
+# Date: 2024-07-26
+# To override the constraint of edx-lint
+# This can be removed once https://github.com/openedx/edx-platform/issues/34586 is resolved
+# and the upstream constraint in edx-lint has been removed.
+# Issue for unpinning: https://github.com/openedx/edx-platform/issues/35273
+event-tracking==3.0.0
 
 # Date: 2023-07-26
 # Our legacy Sass code is incompatible with anything except this ancient libsass version.
@@ -51,6 +91,17 @@ edx-enterprise==6.5.1
 # https://github.com/openedx/edx-platform/issues/31616
 libsass==0.10.0
 
+# Date: 2018-12-14
+# markdown>=3.4.0 has failures due to internal refactorings which causes the tests to fail
+# pinning the version untill the issue gets resolved in the package itself
+# Issue for unpinning: https://github.com/openedx/edx-platform/issues/35271
+markdown<3.4.0
+
+# Date: 2024-04-24
+# moto==5.0 contains breaking changes. Needs to be updated separately.
+# Issue for unpinning: https://github.com/openedx/edx-platform/issues/35270
+moto<5.0
+
 # Date: 2024-07-16
 # We need to upgrade the version of elasticsearch to atleast 7.15 before we can upgrade to Numpy 2.0.0
 # Otherwise we see a failure while running the following command:
@@ -61,7 +112,7 @@ numpy<2.0.0
 
 # Date: 2023-09-18
 # pinning this version to avoid updates while the library is being developed
 # Issue for unpinning: https://github.com/openedx/edx-platform/issues/35269
-openedx-learning==0.30.1
+openedx-learning==0.26.0
 
 # Date: 2023-11-29
 # Open AI version 1.0.0 dropped support for openai.ChatCompletion which is currently in use in enterprise.
@@ -110,29 +161,10 @@ social-auth-app-django<=5.4.1
 # Issue for unpinning: https://github.com/openedx/edx-platform/issues/35126
 elasticsearch==7.9.1
 
-# Date 2025-05-09
-# lxml and xmlsec need to be constrained because the latest version builds against a newer
-# version of libxml2 than what we're running with. This leads to a version mismatch error
-# at runtime. You can re-produce it by running any test.
-# If lxml is pinned in the future and you see this error, it may be that the system libxml2
-# is now shipping the correct version and we can un-pin this.
-# Issue: https://github.com/openedx/edx-platform/issues/36695
-lxml==5.3.2
+# Date 2025-03-21
+# xmlsec==1.3.15 breaks the test due to incompatible lxml binary version
+# social-auth-core>4.5.4 breaks tests with authorization on LinkedIn API
+# Both of these constraints will be updated in a follow up PR under the following issue:
+# https://github.com/openedx/edx-platform/issues/36425
 xmlsec==1.3.14
-
-# Date 2025-08-12
-# The newest version of the debug toolbar has a bug in it
-# https://github.com/django-commons/django-debug-toolbar/issues/2172
-# Pin this back to the previous version until that bug is fixed.
-django-debug-toolbar<6.0.0
-
-# Date 2025-10-07
-# Cryptography 46.0.0 conflicts with system dependencies needed for snowflake-connector-python
-# snowflake-connector-python comes as a dependency of edx-enterprise so it can not be directly pinned here.
-# See issue https://github.com/openedx/edx-platform/issues/37417 for details on this.
-# This can be unpinned once snowflake-connector-python==4.0.0 is available (contains the fix).
-# pact-python==3.0.0 also removes cffi dependency and is causing the upgrade build to fail
-# This should also be removed together with cryptography constraint.
-# Issue: https://github.com/openedx/edx-platform/issues/37435
-cryptography<46.0.0
-pact-python<3.0.0
+social-auth-core==4.5.4
diff --git a/requirements/edx/base.txt b/requirements/edx/base.txt
index 60e19cd22853..492d3c193c69 100644
--- a/requirements/edx/base.txt
+++ b/requirements/edx/base.txt
@@ -4,15 +4,17 @@
 #
 # make upgrade
 #
+-e git+https://github.com/anupdhabarde/edx-proctoring-proctortrack.git@31c6c9923a51c903ae83760ecbbac191363aa2a2#egg=edx_proctoring_proctortrack
+    # via -r requirements/edx/github.in
 acid-xblock==0.4.1
     # via -r requirements/edx/kernel.in
 aiohappyeyeballs==2.6.1
     # via aiohttp
-aiohttp==3.13.2
+aiohttp==3.11.18
     # via
     #   geoip2
     #   openai
-aiosignal==1.4.0
+aiosignal==1.3.2
     # via aiohttp
 amqp==5.3.1
     # via kombu
@@ -22,25 +24,22 @@ aniso8601==10.0.1
     # via edx-tincan-py35
 annotated-types==0.7.0
     # via pydantic
-anyio==4.11.0
-    # via httpx
 appdirs==1.4.4
     # via fs
-asgiref==3.10.0
+asgiref==3.8.1
     # via
     #   django
     #   django-cors-headers
     #   django-countries
 asn1crypto==1.5.1
     # via snowflake-connector-python
-attrs==25.4.0
+attrs==25.3.0
     # via
     #   -r requirements/edx/kernel.in
     #   aiohttp
     #   edx-ace
     #   jsonschema
     #   lti-consumer-xblock
-    #   openedx-authz
     #   openedx-events
     #   openedx-learning
     #   referencing
@@ -51,15 +50,15 @@ babel==2.17.0
     #   enmerkar-underscore
 backoff==1.10.0
     # via analytics-python
-bcrypt==5.0.0
+bcrypt==4.3.0
     # via paramiko
-beautifulsoup4==4.14.2
+beautifulsoup4==4.13.4 # via # openedx-forum # pynliner -billiard==4.2.2 +billiard==4.2.1 # via celery -bleach[css]==6.3.0 +bleach[css]==6.2.0 # via # edx-enterprise # lti-consumer-xblock @@ -69,64 +68,59 @@ bleach[css]==6.3.0 # xblock-poll boto==2.49.0 # via -r requirements/edx/kernel.in -boto3==1.40.62 +boto3==1.37.38 # via # -r requirements/edx/kernel.in # django-ses # fs-s3fs # ora2 # snowflake-connector-python -botocore==1.40.62 +botocore==1.37.38 # via # -r requirements/edx/kernel.in # boto3 # s3transfer # snowflake-connector-python -bracex==2.6 - # via wcmatch bridgekeeper==0.9 # via -r requirements/edx/kernel.in -cachecontrol==0.14.3 +cachecontrol==0.14.2 # via firebase-admin -cachetools==6.2.1 +cachetools==5.5.2 # via # edxval # google-auth -camel-converter[pydantic]==5.0.0 +camel-converter[pydantic]==4.0.1 # via meilisearch -casbin-django-orm-adapter==1.7.0 - # via openedx-authz -celery==5.5.3 +celery==5.5.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # django-celery-results # django-user-tasks # edx-celeryutils # edx-enterprise - # enterprise-integrated-channels # event-tracking # openedx-learning -certifi==2025.10.5 +certifi==2025.1.31 # via # elasticsearch - # httpcore - # httpx # requests # snowflake-connector-python -cffi==2.0.0 +cffi==1.17.1 # via # cryptography # pynacl + # snowflake-connector-python chardet==5.2.0 # via pysrt -charset-normalizer==3.4.4 +charset-normalizer==2.0.12 # via + # -c requirements/edx/../constraints.txt # requests # snowflake-connector-python -chem==2.0.0 +chem==1.3.0 # via -r requirements/edx/kernel.in -click==8.3.0 +click==8.1.8 # via # celery # click-didyoumean @@ -135,9 +129,10 @@ click==8.3.0 # code-annotations # edx-django-utils # nltk + # user-util click-didyoumean==0.3.1 # via celery -click-plugins==1.1.1.2 +click-plugins==1.1.1 # via celery click-repl==0.3.0 # via celery @@ -145,13 +140,12 @@ code-annotations==2.3.0 # via # edx-enterprise # 
edx-toggles -codejail-includes==2.0.0 +codejail-includes==1.0.0 # via -r requirements/edx/kernel.in crowdsourcehinter-xblock==0.8 # via -r requirements/edx/bundled.in -cryptography==45.0.7 +cryptography==44.0.2 # via - # -c requirements/constraints.txt # -r requirements/edx/kernel.in # django-fernet-fields-v2 # edx-enterprise @@ -161,6 +155,7 @@ cryptography==45.0.7 # pyjwt # pyopenssl # snowflake-connector-python + # social-auth-core cssutils==2.11.1 # via pynliner defusedxml==0.7.1 @@ -170,14 +165,12 @@ defusedxml==0.7.1 # ora2 # python3-openid # social-auth-core -django==5.2.7 +django==4.2.20 # via - # -c requirements/common_constraints.txt - # -c requirements/constraints.txt + # -c requirements/edx/../common_constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in - # casbin-django-orm-adapter # django-appconf - # django-autocomplete-light # django-celery-results # django-classy-tags # django-config-models @@ -194,7 +187,6 @@ django==5.2.7 # django-push-notifications # django-sekizai # django-ses - # django-simple-history # django-statici18n # django-storages # django-user-tasks @@ -230,12 +222,10 @@ django==5.2.7 # edxval # enmerkar # enmerkar-underscore - # enterprise-integrated-channels # event-tracking # help-tokens # jsonfield # lti-consumer-xblock - # openedx-authz # openedx-django-pyfs # openedx-django-wiki # openedx-events @@ -249,8 +239,6 @@ django==5.2.7 # xss-utils django-appconf==1.1.0 # via django-statici18n -django-autocomplete-light==3.12.1 - # via -r requirements/edx/kernel.in django-cache-memoize==0.2.1 # via edx-enterprise django-celery-results==2.6.0 @@ -262,9 +250,8 @@ django-config-models==2.9.0 # -r requirements/edx/kernel.in # edx-enterprise # edx-name-affirmation - # enterprise-integrated-channels # lti-consumer-xblock -django-cors-headers==4.9.0 +django-cors-headers==4.7.0 # via -r requirements/edx/kernel.in django-countries==7.6.1 # via @@ -280,10 +267,8 @@ django-crum==0.7.9 # edx-toggles # super-csv 
django-fernet-fields-v2==0.9 - # via - # edx-enterprise - # enterprise-integrated-channels -django-filter==25.2 + # via edx-enterprise +django-filter==25.1 # via # -r requirements/edx/kernel.in # edx-enterprise @@ -313,28 +298,24 @@ django-model-utils==5.0.0 # edx-submissions # edx-when # edxval - # enterprise-integrated-channels # ora2 # super-csv -django-mptt==0.18.0 +django-mptt==0.17.0 # via # -r requirements/edx/kernel.in # openedx-django-wiki -django-multi-email-field==0.8.0 +django-multi-email-field==0.7.0 # via edx-enterprise -django-mysql==4.19.0 +django-mysql==4.16.0 # via -r requirements/edx/kernel.in django-oauth-toolkit==1.7.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # edx-enterprise - # enterprise-integrated-channels django-object-actions==5.0.0 - # via - # edx-enterprise - # enterprise-integrated-channels -django-pipeline==4.1.0 + # via edx-enterprise +django-pipeline==4.0.0 # via -r requirements/edx/kernel.in django-push-notifications==3.2.1 # via edx-ace @@ -346,14 +327,14 @@ django-sekizai==4.1.0 # openedx-django-wiki django-ses==4.4.0 # via -r requirements/edx/bundled.in -django-simple-history==3.10.1 +django-simple-history==3.4.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # edx-enterprise # edx-name-affirmation # edx-organizations # edx-proctoring - # enterprise-integrated-channels # ora2 django-statici18n==2.6.0 # via @@ -362,13 +343,14 @@ django-statici18n==2.6.0 # xblock-drag-and-drop-v2 # xblock-poll # xblocks-contrib -django-storages==1.14.6 +django-storages==1.14.3 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # edxval -django-user-tasks==3.4.4 +django-user-tasks==3.3.0 # via -r requirements/edx/kernel.in -django-waffle==5.0.0 +django-waffle==4.2.0 # via # -r requirements/edx/kernel.in # edx-django-utils @@ -376,12 +358,14 @@ django-waffle==5.0.0 # edx-enterprise # edx-proctoring # edx-toggles 
-django-webpack-loader==3.2.1 +django-webpack-loader==0.7.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # edx-proctoring -djangorestframework==3.16.1 +djangorestframework==3.14.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # django-config-models # django-user-tasks @@ -396,14 +380,13 @@ djangorestframework==3.16.1 # edx-organizations # edx-proctoring # edx-submissions - # openedx-authz # openedx-forum # openedx-learning # ora2 # super-csv djangorestframework-xml==2.0.0 # via edx-enterprise -dnspython==2.8.0 +dnspython==2.7.0 # via pymongo done-xblock==2.5.0 # via -r requirements/edx/bundled.in @@ -411,20 +394,19 @@ drf-jwt==1.19.2 # via edx-drf-extensions drf-spectacular==0.28.0 # via -r requirements/edx/kernel.in -drf-yasg==1.21.11 +drf-yasg==1.21.10 # via # django-user-tasks # edx-api-doc-tools -edx-ace==1.15.0 +edx-ace==1.11.4 # via -r requirements/edx/kernel.in -edx-api-doc-tools==2.1.0 +edx-api-doc-tools==2.0.0 # via # -r requirements/edx/kernel.in # edx-name-affirmation - # openedx-authz -edx-auth-backends==4.6.2 +edx-auth-backends==4.5.0 # via -r requirements/edx/kernel.in -edx-bulk-grades==1.2.0 +edx-bulk-grades==1.1.0 # via # -r requirements/edx/kernel.in # staff-graded-xblock @@ -433,28 +415,27 @@ edx-ccx-keys==2.0.2 # -r requirements/edx/kernel.in # lti-consumer-xblock # openedx-events -edx-celeryutils==1.4.0 +edx-celeryutils==1.3.0 # via # -r requirements/edx/kernel.in # edx-name-affirmation # super-csv -edx-codejail==4.0.0 +edx-codejail==3.5.2 # via -r requirements/edx/kernel.in -edx-completion==4.9 +edx-completion==4.7.11 # via -r requirements/edx/kernel.in -edx-django-release-util==1.5.0 +edx-django-release-util==1.4.0 # via # -r requirements/edx/kernel.in # edx-submissions # edxval -edx-django-sites-extensions==5.1.0 +edx-django-sites-extensions==4.2.0 # via -r requirements/edx/kernel.in -edx-django-utils==8.0.1 +edx-django-utils==7.4.0 # via # -r requirements/edx/kernel.in # 
django-config-models # edx-ace - # edx-auth-backends # edx-drf-extensions # edx-enterprise # edx-event-bus-kafka @@ -463,9 +444,7 @@ edx-django-utils==8.0.1 # edx-rest-api-client # edx-toggles # edx-when - # enterprise-integrated-channels # event-tracking - # openedx-authz # openedx-events # ora2 # super-csv @@ -480,25 +459,24 @@ edx-drf-extensions==10.6.0 # edx-rbac # edx-when # edxval - # enterprise-integrated-channels - # openedx-authz # openedx-learning -edx-enterprise==6.5.1 +edx-enterprise==5.12.7 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in edx-event-bus-kafka==6.1.0 # via -r requirements/edx/kernel.in edx-event-bus-redis==0.6.1 # via -r requirements/edx/kernel.in -edx-i18n-tools==1.9.0 +edx-i18n-tools==1.5.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/bundled.in # ora2 # xblocks-contrib -edx-milestones==1.1.0 +edx-milestones==0.6.0 # via -r requirements/edx/kernel.in -edx-name-affirmation==3.0.2 +edx-name-affirmation==3.0.1 # via -r requirements/edx/kernel.in edx-opaque-keys[django]==3.0.0 # via @@ -512,45 +490,38 @@ edx-opaque-keys[django]==3.0.0 # edx-organizations # edx-proctoring # edx-when - # enterprise-integrated-channels # lti-consumer-xblock - # openedx-authz # openedx-events # openedx-filters # ora2 - # xblocks-contrib -edx-organizations==7.3.0 +edx-organizations==6.13.0 # via -r requirements/edx/kernel.in -edx-proctoring==5.2.0 - # via -r requirements/edx/kernel.in -edx-rbac==2.1.0 +edx-proctoring==5.1.2 # via - # edx-enterprise - # enterprise-integrated-channels + # -r requirements/edx/kernel.in + # edx-proctoring-proctortrack +edx-rbac==1.10.0 + # via edx-enterprise edx-rest-api-client==6.2.0 # via # -r requirements/edx/kernel.in # edx-enterprise # edx-proctoring - # enterprise-integrated-channels -edx-search==4.3.0 +edx-search==4.1.3 # via # -r requirements/edx/kernel.in # openedx-forum -edx-sga==0.26.0 +edx-sga==0.25.3 # via -r 
requirements/edx/bundled.in -edx-submissions==3.12.1 +edx-submissions==3.10.0 # via # -r requirements/edx/kernel.in # ora2 edx-tincan-py35==2.0.0 - # via - # edx-enterprise - # enterprise-integrated-channels -edx-toggles==5.4.1 + # via edx-enterprise +edx-toggles==5.3.0 # via # -r requirements/edx/kernel.in - # edx-auth-backends # edx-completion # edx-enterprise # edx-event-bus-kafka @@ -560,73 +531,81 @@ edx-toggles==5.4.1 # edxval # event-tracking # ora2 -edx-when==3.0.0 +edx-when==2.5.1 # via # -r requirements/edx/kernel.in # edx-proctoring -edxval==3.1.0 +edxval==2.10.0 # via -r requirements/edx/kernel.in elasticsearch==7.9.1 # via - # -c requirements/common_constraints.txt - # -c requirements/constraints.txt + # -c requirements/edx/../common_constraints.txt + # -c requirements/edx/../constraints.txt # edx-search # openedx-forum enmerkar==0.7.1 # via enmerkar-underscore -enmerkar-underscore==2.4.0 +enmerkar-underscore==2.3.1 # via -r requirements/edx/kernel.in -enterprise-integrated-channels==0.1.22 - # via -r requirements/edx/bundled.in -event-tracking==3.3.0 +event-tracking==3.0.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # edx-completion # edx-proctoring # edx-search -fastavro==1.12.1 +fastavro==1.10.0 # via openedx-events -filelock==3.20.0 +filelock==3.18.0 # via snowflake-connector-python -firebase-admin==7.1.0 +firebase-admin==6.7.0 # via edx-ace -frozenlist==1.8.0 +frozenlist==1.6.0 # via # aiohttp # aiosignal -fs==2.4.16 +fs==2.0.27 # via # -r requirements/edx/kernel.in # fs-s3fs # openedx-django-pyfs # xblock -fs-s3fs==1.1.1 +fs-s3fs==0.1.8 # via # -r requirements/edx/kernel.in # openedx-django-pyfs -geoip2==5.1.0 +future==1.0.0 + # via pyjwkest +geoip2==5.0.1 # via -r requirements/edx/kernel.in glob2==0.7 # via -r requirements/edx/kernel.in -google-api-core[grpc]==2.28.1 +google-api-core[grpc]==2.24.2 # via # firebase-admin + # google-api-python-client # google-cloud-core # google-cloud-firestore # 
google-cloud-storage -google-auth==2.42.0 +google-api-python-client==2.167.0 + # via firebase-admin +google-auth==2.39.0 # via # google-api-core + # google-api-python-client + # google-auth-httplib2 # google-cloud-core # google-cloud-firestore # google-cloud-storage -google-cloud-core==2.5.0 +google-auth-httplib2==0.2.0 + # via google-api-python-client +google-cloud-core==2.4.3 # via # google-cloud-firestore # google-cloud-storage -google-cloud-firestore==2.21.0 +google-cloud-firestore==2.20.2 # via firebase-admin -google-cloud-storage==3.4.1 +google-cloud-storage==3.1.0 # via firebase-admin google-crc32c==1.7.1 # via @@ -634,54 +613,42 @@ google-crc32c==1.7.1 # google-resumable-media google-resumable-media==2.7.2 # via google-cloud-storage -googleapis-common-protos==1.71.0 +googleapis-common-protos==1.70.0 # via # google-api-core # grpcio-status -grpcio==1.76.0 +grpcio==1.71.0 # via # google-api-core # grpcio-status -grpcio-status==1.76.0 +grpcio-status==1.71.0 # via google-api-core gunicorn==23.0.0 # via -r requirements/edx/kernel.in -h11==0.16.0 - # via httpcore -h2==4.3.0 - # via httpx -help-tokens==3.2.0 +help-tokens==3.1.0 # via -r requirements/edx/kernel.in -hpack==4.1.0 - # via h2 html5lib==1.1 # via # -r requirements/edx/kernel.in # ora2 -httpcore==1.0.9 - # via httpx -httpx[http2]==0.28.1 - # via firebase-admin -hyperframe==6.1.0 - # via h2 -icalendar==6.3.1 +httplib2==0.22.0 + # via + # google-api-python-client + # google-auth-httplib2 +icalendar==6.1.3 # via -r requirements/edx/kernel.in -idna==3.11 +idna==3.10 # via - # anyio - # httpx # optimizely-sdk # requests # snowflake-connector-python # yarl -importlib-metadata==8.7.0 +importlib-metadata==8.6.1 # via -r requirements/edx/kernel.in inflection==0.5.1 # via # drf-spectacular # drf-yasg -invoke==2.2.1 - # via paramiko ipaddress==1.0.23 # via -r requirements/edx/kernel.in isodate==0.7.2 @@ -692,31 +659,30 @@ jmespath==1.0.1 # via # boto3 # botocore -joblib==1.5.2 +joblib==1.4.2 # via nltk 
jsondiff==2.2.1 # via edx-enterprise -jsonfield==3.2.0 +jsonfield==3.1.0 # via # -r requirements/edx/kernel.in # edx-celeryutils # edx-enterprise # edx-proctoring # edx-submissions - # enterprise-integrated-channels # lti-consumer-xblock # ora2 -jsonschema==4.25.1 +jsonschema==4.23.0 # via # drf-spectacular # optimizely-sdk -jsonschema-specifications==2025.9.1 +jsonschema-specifications==2024.10.1 # via jsonschema jwcrypto==1.5.6 # via # django-oauth-toolkit # pylti1p3 -kombu==5.5.4 +kombu==5.5.3 # via celery laboratory==1.0.2 # via -r requirements/edx/kernel.in @@ -728,11 +694,10 @@ lazy==1.6 # xblock loremipsum==1.0.5 # via ora2 -lti-consumer-xblock==9.14.3 +lti-consumer-xblock==9.13.4 # via -r requirements/edx/kernel.in -lxml[html-clean]==5.3.2 +lxml[html-clean,html_clean]==5.3.2 # via - # -c requirements/constraints.txt # -r requirements/edx/kernel.in # edx-i18n-tools # edxval @@ -744,7 +709,7 @@ lxml[html-clean]==5.3.2 # python3-saml # xblock # xmlsec -lxml-html-clean==0.4.3 +lxml-html-clean==0.4.2 # via lxml mailsnake==1.6.4 # via -r requirements/edx/bundled.in @@ -755,22 +720,23 @@ mako==1.3.10 # lti-consumer-xblock # xblock # xblock-utils -markdown==3.9 +markdown==3.3.7 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # openedx-django-wiki # staff-graded-xblock # xblock-poll -markupsafe==3.0.3 +markupsafe==3.0.2 # via # chem # jinja2 # mako # openedx-calc # xblock -maxminddb==2.8.2 +maxminddb==2.6.3 # via geoip2 -meilisearch==0.37.1 +meilisearch==0.34.1 # via # -r requirements/edx/kernel.in # edx-search @@ -778,13 +744,13 @@ mongoengine==0.29.1 # via -r requirements/edx/kernel.in monotonic==1.6 # via analytics-python -more-itertools==10.8.0 +more-itertools==10.6.0 # via cssutils mpmath==1.3.0 # via sympy -msgpack==1.1.2 +msgpack==1.1.0 # via cachecontrol -multidict==6.7.0 +multidict==6.4.3 # via # aiohttp # yarl @@ -792,55 +758,50 @@ mysqlclient==2.2.7 # via # -r requirements/edx/kernel.in # openedx-forum -nh3==0.3.1 - # via 
- # -r requirements/edx/kernel.in - # xblocks-contrib -nltk==3.9.2 +newrelic==10.9.0 + # via edx-django-utils +nh3==0.2.21 + # via -r requirements/edx/kernel.in +nltk==3.9.1 # via chem nodeenv==1.9.1 # via -r requirements/edx/kernel.in numpy==1.26.4 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # chem # openedx-calc # scipy # shapely -oauthlib==3.3.1 +oauthlib==3.2.2 # via # -r requirements/edx/kernel.in # django-oauth-toolkit # lti-consumer-xblock # requests-oauthlib # social-auth-core - # xblocks-contrib olxcleaner==0.3.0 # via -r requirements/edx/kernel.in openai==0.28.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # edx-enterprise openedx-atlas==0.7.0 # via # -r requirements/edx/kernel.in - # enterprise-integrated-channels - # openedx-authz # openedx-forum -openedx-authz==0.20.0 - # via -r requirements/edx/kernel.in openedx-calc==4.0.2 # via -r requirements/edx/kernel.in -openedx-django-pyfs==3.8.0 +openedx-django-pyfs==3.7.0 # via # lti-consumer-xblock # xblock # xblocks-contrib -openedx-django-require==3.0.0 +openedx-django-require==2.1.0 # via -r requirements/edx/kernel.in -openedx-django-wiki==3.1.1 +openedx-django-wiki==2.1.0 # via -r requirements/edx/kernel.in -openedx-events==10.5.0 +openedx-events==10.2.0 # via # -r requirements/edx/kernel.in # edx-enterprise @@ -849,32 +810,33 @@ openedx-events==10.5.0 # edx-name-affirmation # event-tracking # ora2 -openedx-filters==2.1.0 +openedx-filters==2.0.1 # via # -r requirements/edx/kernel.in # lti-consumer-xblock # ora2 -openedx-forum==0.3.8 +openedx-forum==0.3.6 # via -r requirements/edx/kernel.in -openedx-learning==0.30.1 +openedx-learning==0.26.0 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in +openedx-mongodbproxy==0.2.2 + # via -r requirements/edx/kernel.in optimizely-sdk==5.2.0 # via -r requirements/edx/bundled.in -ora2==6.17.1 +ora2==6.16.1 # via -r 
requirements/edx/bundled.in packaging==25.0 # via # drf-yasg # gunicorn - # kombu # snowflake-connector-python -paramiko==4.0.0 +paramiko==3.5.1 # via edx-enterprise path==16.11.0 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # edx-i18n-tools # path-py @@ -883,23 +845,25 @@ path-py==12.5.0 # edx-enterprise # ora2 # staff-graded-xblock +pbr==6.1.1 + # via stevedore pgpy==0.6.0 # via edx-enterprise piexif==1.1.3 # via -r requirements/edx/kernel.in -pillow==12.0.0 +pillow==11.2.1 # via # -r requirements/edx/kernel.in # edx-enterprise # edx-organizations # edxval -platformdirs==4.5.0 +platformdirs==4.3.7 # via snowflake-connector-python polib==1.2.0 # via edx-i18n-tools -prompt-toolkit==3.0.52 +prompt-toolkit==3.0.51 # via click-repl -propcache==0.4.1 +propcache==0.3.1 # via # aiohttp # yarl @@ -907,14 +871,14 @@ proto-plus==1.26.1 # via # google-api-core # google-cloud-firestore -protobuf==6.33.0 +protobuf==5.29.4 # via # google-api-core # google-cloud-firestore # googleapis-common-protos # grpcio-status # proto-plus -psutil==7.1.2 +psutil==7.0.0 # via # -r requirements/edx/kernel.in # edx-django-utils @@ -925,23 +889,24 @@ pyasn1==0.6.1 # rsa pyasn1-modules==0.4.2 # via google-auth -pycasbin==2.4.0 - # via - # casbin-django-orm-adapter - # openedx-authz pycountry==24.6.1 # via -r requirements/edx/kernel.in -pycparser==2.23 +pycparser==2.22 # via cffi -pycryptodomex==3.23.0 +pycryptodomex==3.22.0 # via # -r requirements/edx/kernel.in # edx-proctoring # lti-consumer-xblock -pydantic==2.12.3 + # pyjwkest +pydantic==2.11.3 # via camel-converter -pydantic-core==2.41.4 +pydantic-core==2.33.1 # via pydantic +pyjwkest==1.4.2 + # via + # -r requirements/edx/kernel.in + # lti-consumer-xblock pyjwt[crypto]==2.10.1 # via # -r requirements/edx/kernel.in @@ -963,23 +928,25 @@ pymemcache==4.0.0 # via -r requirements/edx/kernel.in pymongo==4.4.0 # via - # -c requirements/constraints.txt + # -c 
requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # edx-opaque-keys # event-tracking # mongoengine # openedx-forum -pynacl==1.6.0 + # openedx-mongodbproxy +pynacl==1.5.0 # via # edx-django-utils # paramiko pynliner==0.8.0 # via -r requirements/edx/kernel.in -pyopenssl==25.3.0 +pyopenssl==25.0.0 # via snowflake-connector-python -pyparsing==3.2.5 +pyparsing==3.2.3 # via # chem + # httplib2 # openedx-calc pyrsistent==0.20.0 # via optimizely-sdk @@ -1004,7 +971,7 @@ python-ipware==3.0.0 # via django-ipware python-slugify==8.0.4 # via code-annotations -python-swiftclient==4.8.0 +python-swiftclient==4.7.0 # via ora2 python3-openid==3.2.0 ; python_version >= "3" # via @@ -1015,21 +982,22 @@ python3-saml==1.16.0 pytz==2025.2 # via # -r requirements/edx/kernel.in + # djangorestframework # drf-yasg # edx-completion # edx-enterprise # edx-proctoring # edx-submissions # edx-tincan-py35 - # enterprise-integrated-channels # event-tracking + # fs # olxcleaner # ora2 # snowflake-connector-python # xblock pyuca==1.2 # via -r requirements/edx/kernel.in -pyyaml==6.0.3 +pyyaml==6.0.2 # via # -r requirements/edx/kernel.in # code-annotations @@ -1041,19 +1009,19 @@ pyyaml==6.0.3 # xblock random2==1.0.2 # via -r requirements/edx/kernel.in -recommender-xblock==3.1.0 +recommender-xblock==3.0.0 # via -r requirements/edx/bundled.in -redis==7.0.1 +redis==5.2.1 # via # -r requirements/edx/kernel.in # walrus -referencing==0.37.0 +referencing==0.36.2 # via # jsonschema # jsonschema-specifications -regex==2025.10.23 +regex==2024.11.6 # via nltk -requests==2.32.5 +requests==2.32.3 # via # analytics-python # cachecontrol @@ -1062,7 +1030,6 @@ requests==2.32.5 # edx-drf-extensions # edx-enterprise # edx-rest-api-client - # enterprise-integrated-channels # geoip2 # google-api-core # google-cloud-storage @@ -1071,6 +1038,7 @@ requests==2.32.5 # openai # openedx-forum # optimizely-sdk + # pyjwkest # pylti1p3 # python-swiftclient # requests-oauthlib @@ -1083,7 +1051,7 @@ 
requests-oauthlib==2.0.0 # via # -r requirements/edx/kernel.in # social-auth-core -rpds-py==0.28.0 +rpds-py==0.24.0 # via # jsonschema # referencing @@ -1095,19 +1063,17 @@ rules==3.5 # edx-enterprise # edx-proctoring # openedx-learning -s3transfer==0.14.0 +s3transfer==0.11.5 # via boto3 sailthru-client==2.2.3 # via edx-ace -scipy==1.16.3 +scipy==1.15.2 # via chem semantic-version==2.10.0 # via edx-drf-extensions -shapely==2.1.2 +shapely==2.1.0 # via -r requirements/edx/kernel.in -simpleeval==1.0.3 - # via pycasbin -simplejson==3.20.2 +simplejson==3.20.1 # via # -r requirements/edx/kernel.in # sailthru-client @@ -1118,6 +1084,7 @@ six==1.17.0 # via # -r requirements/edx/kernel.in # analytics-python + # codejail-includes # crowdsourcehinter-xblock # edx-ace # edx-auth-backends @@ -1130,24 +1097,23 @@ six==1.17.0 # fs # fs-s3fs # html5lib + # pyjwkest # python-dateutil slumber==0.7.1 # via # -r requirements/edx/kernel.in # edx-bulk-grades # edx-enterprise - # enterprise-integrated-channels -sniffio==1.3.1 - # via anyio -snowflake-connector-python==4.0.0 +snowflake-connector-python==3.14.1 # via edx-enterprise social-auth-app-django==5.4.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # edx-auth-backends -social-auth-core==4.8.1 +social-auth-core==4.5.4 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/kernel.in # edx-auth-backends # social-auth-app-django @@ -1159,13 +1125,13 @@ sortedcontainers==2.4.0 # via # -r requirements/edx/kernel.in # snowflake-connector-python -soupsieve==2.8 +soupsieve==2.7 # via beautifulsoup4 sqlparse==0.5.3 # via django -staff-graded-xblock==3.1.0 +staff-graded-xblock==3.0.1 # via -r requirements/edx/bundled.in -stevedore==5.5.0 +stevedore==5.4.1 # via # -r requirements/edx/kernel.in # code-annotations @@ -1173,32 +1139,27 @@ stevedore==5.5.0 # edx-django-utils # edx-enterprise # edx-opaque-keys -super-csv==4.1.0 +super-csv==4.0.1 # via 
edx-bulk-grades -sympy==1.14.0 +sympy==1.13.3 # via openedx-calc -testfixtures==10.0.0 +testfixtures==8.3.0 # via edx-enterprise text-unidecode==1.3 # via python-slugify tinycss2==1.4.0 # via bleach -tomlkit==0.13.3 - # via - # openedx-learning - # snowflake-connector-python +tomlkit==0.13.2 + # via snowflake-connector-python tqdm==4.67.1 # via # nltk # openai -typing-extensions==4.15.0 +typing-extensions==4.13.2 # via - # aiosignal - # anyio # beautifulsoup4 # django-countries # edx-opaque-keys - # grpcio # jwcrypto # pydantic # pydantic-core @@ -1207,7 +1168,7 @@ typing-extensions==4.15.0 # referencing # snowflake-connector-python # typing-inspection -typing-inspection==0.4.2 +typing-inspection==0.4.0 # via pydantic tzdata==2025.2 # via @@ -1217,18 +1178,20 @@ unicodecsv==0.14.1 # via # -r requirements/edx/kernel.in # edx-enterprise - # enterprise-integrated-channels unicodeit==0.7.5 # via -r requirements/edx/kernel.in -uritemplate==4.2.0 +uritemplate==4.1.1 # via # drf-spectacular # drf-yasg -urllib3==2.5.0 + # google-api-python-client +urllib3==2.2.3 # via # botocore # elasticsearch # requests +user-util==1.1.0 + # via -r requirements/edx/kernel.in vine==5.1.0 # via # amqp @@ -1236,13 +1199,11 @@ vine==5.1.0 # kombu voluptuous==0.15.2 # via ora2 -walrus==0.9.5 +walrus==0.9.4 # via edx-event-bus-redis -wcmatch==10.1 - # via pycasbin -wcwidth==0.2.14 +wcwidth==0.2.13 # via prompt-toolkit -web-fragments==3.1.0 +web-fragments==3.0.0 # via # -r requirements/edx/kernel.in # crowdsourcehinter-xblock @@ -1261,7 +1222,7 @@ webob==1.8.9 # xblock wheel==0.45.1 # via django-pipeline -wrapt==2.0.0 +wrapt==1.17.2 # via -r requirements/edx/kernel.in xblock[django]==5.2.0 # via @@ -1279,27 +1240,27 @@ xblock[django]==5.2.0 # xblock-google-drive # xblock-utils # xblocks-contrib -xblock-drag-and-drop-v2==5.0.3 +xblock-drag-and-drop-v2==5.0.2 # via -r requirements/edx/bundled.in xblock-google-drive==0.8.1 # via -r requirements/edx/bundled.in -xblock-poll==1.15.1 
+xblock-poll==1.14.1 # via -r requirements/edx/bundled.in xblock-utils==4.0.0 # via # edx-sga # xblock-poll -xblocks-contrib==0.6.0 +xblocks-contrib==0.3.0 # via -r requirements/edx/bundled.in xmlsec==1.3.14 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # python3-saml -xss-utils==0.8.0 +xss-utils==0.7.1 # via -r requirements/edx/kernel.in -yarl==1.22.0 +yarl==1.20.0 # via aiohttp -zipp==3.23.0 +zipp==3.21.0 # via importlib-metadata # The following packages are considered to be unsafe in a requirements file: diff --git a/requirements/edx/development.txt b/requirements/edx/development.txt index 85ac66aed195..cea879b897d1 100644 --- a/requirements/edx/development.txt +++ b/requirements/edx/development.txt @@ -4,6 +4,10 @@ # # make upgrade # +-e git+https://github.com/anupdhabarde/edx-proctoring-proctortrack.git@31c6c9923a51c903ae83760ecbbac191363aa2a2#egg=edx_proctoring_proctortrack + # via + # -r requirements/edx/doc.txt + # -r requirements/edx/testing.txt accessible-pygments==0.0.5 # via # -r requirements/edx/doc.txt @@ -17,13 +21,13 @@ aiohappyeyeballs==2.6.1 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # aiohttp -aiohttp==3.13.2 +aiohttp==3.11.18 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # geoip2 # openai -aiosignal==1.4.0 +aiosignal==1.3.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -46,46 +50,42 @@ aniso8601==10.0.1 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-tincan-py35 -annotated-doc==0.0.3 - # via - # -r requirements/edx/testing.txt - # fastapi annotated-types==0.7.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # pydantic -anyio==4.11.0 +anyio==4.9.0 # via - # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # httpx + # httpcore # starlette appdirs==1.4.4 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # fs -asgiref==3.10.0 +asgiref==3.8.1 # via # -r 
requirements/edx/doc.txt # -r requirements/edx/testing.txt # django # django-cors-headers # django-countries + # django-stubs asn1crypto==1.5.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # snowflake-connector-python -astroid==3.3.11 +astroid==3.3.9 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # pylint # pylint-celery # sphinx-autoapi -attrs==25.4.0 +attrs==25.3.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -93,7 +93,6 @@ attrs==25.4.0 # edx-ace # jsonschema # lti-consumer-xblock - # openedx-authz # openedx-events # openedx-learning # referencing @@ -110,24 +109,24 @@ backoff==1.10.0 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # analytics-python -bcrypt==5.0.0 +bcrypt==4.3.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # paramiko -beautifulsoup4==4.14.2 +beautifulsoup4==4.13.4 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # openedx-forum # pydata-sphinx-theme # pynliner -billiard==4.2.2 +billiard==4.2.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # celery -bleach[css]==6.3.0 +bleach[css]==6.2.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -141,7 +140,7 @@ boto==2.49.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -boto3==1.40.62 +boto3==1.37.38 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -149,61 +148,50 @@ boto3==1.40.62 # fs-s3fs # ora2 # snowflake-connector-python -botocore==1.40.62 +botocore==1.37.38 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # boto3 # s3transfer # snowflake-connector-python -bracex==2.6 - # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt - # wcmatch bridgekeeper==0.9 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -build==1.3.0 +build==1.2.2.post1 # via - # -r requirements/pip-tools.txt + # -r 
requirements/edx/../pip-tools.txt # pip-tools -cachecontrol==0.14.3 +cachecontrol==0.14.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # firebase-admin -cachetools==6.2.1 +cachetools==5.5.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edxval # google-auth # tox -camel-converter[pydantic]==5.0.0 +camel-converter[pydantic]==4.0.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # meilisearch -casbin-django-orm-adapter==1.7.0 +celery==5.5.1 # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt - # openedx-authz -celery==5.5.3 - # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # django-celery-results # django-user-tasks # edx-celeryutils # edx-enterprise - # enterprise-integrated-channels # event-tracking # openedx-learning -certifi==2025.10.5 +certifi==2025.1.31 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -212,12 +200,13 @@ certifi==2025.10.5 # httpx # requests # snowflake-connector-python -cffi==2.0.0 +cffi==1.17.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # cryptography # pynacl + # snowflake-connector-python chardet==5.2.0 # via # -r requirements/edx/doc.txt @@ -225,23 +214,24 @@ chardet==5.2.0 # diff-cover # pysrt # tox -charset-normalizer==3.4.4 +charset-normalizer==2.0.12 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # requests # snowflake-connector-python -chem==2.0.0 +chem==1.3.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -click==8.3.0 +click==8.1.8 # via + # -r requirements/edx/../pip-tools.txt # -r requirements/edx/assets.txt # -r requirements/edx/development.in # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # -r requirements/pip-tools.txt # celery # click-didyoumean # click-log @@ -254,6 +244,7 @@ 
click==8.3.0 # nltk # pact-python # pip-tools + # user-util # uvicorn click-didyoumean==0.3.1 # via @@ -264,7 +255,7 @@ click-log==0.4.0 # via # -r requirements/edx/testing.txt # edx-lint -click-plugins==1.1.1.2 +click-plugins==1.1.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -281,7 +272,7 @@ code-annotations==2.3.0 # edx-enterprise # edx-lint # edx-toggles -codejail-includes==2.0.0 +codejail-includes==1.0.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -289,7 +280,7 @@ colorama==0.4.6 # via # -r requirements/edx/testing.txt # tox -coverage[toml]==7.11.0 +coverage[toml]==7.8.0 # via # -r requirements/edx/testing.txt # pytest-cov @@ -297,9 +288,8 @@ crowdsourcehinter-xblock==0.8 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -cryptography==45.0.7 +cryptography==44.0.2 # via - # -c requirements/constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # django-fernet-fields-v2 @@ -310,6 +300,7 @@ cryptography==45.0.7 # pyjwt # pyopenssl # snowflake-connector-python + # social-auth-core cssselect==1.3.0 # via # -r requirements/edx/testing.txt @@ -333,25 +324,23 @@ defusedxml==0.7.1 # ora2 # python3-openid # social-auth-core -diff-cover==9.7.1 +diff-cover==9.2.4 # via -r requirements/edx/testing.txt dill==0.4.0 # via # -r requirements/edx/testing.txt # pylint -distlib==0.4.0 +distlib==0.3.9 # via # -r requirements/edx/testing.txt # virtualenv -django==5.2.7 +django==4.2.20 # via - # -c requirements/common_constraints.txt - # -c requirements/constraints.txt + # -c requirements/edx/../common_constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # casbin-django-orm-adapter # django-appconf - # django-autocomplete-light # django-celery-results # django-classy-tags # django-config-models @@ -369,7 +358,6 @@ django==5.2.7 # django-push-notifications # django-sekizai # django-ses - # django-simple-history 
# django-statici18n # django-storages # django-stubs @@ -407,12 +395,10 @@ django==5.2.7 # edxval # enmerkar # enmerkar-underscore - # enterprise-integrated-channels # event-tracking # help-tokens # jsonfield # lti-consumer-xblock - # openedx-authz # openedx-django-pyfs # openedx-django-wiki # openedx-events @@ -429,10 +415,6 @@ django-appconf==1.1.0 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # django-statici18n -django-autocomplete-light==3.12.1 - # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt django-cache-memoize==0.2.1 # via # -r requirements/edx/doc.txt @@ -453,9 +435,8 @@ django-config-models==2.9.0 # -r requirements/edx/testing.txt # edx-enterprise # edx-name-affirmation - # enterprise-integrated-channels # lti-consumer-xblock -django-cors-headers==4.9.0 +django-cors-headers==4.7.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -474,17 +455,14 @@ django-crum==0.7.9 # edx-rbac # edx-toggles # super-csv -django-debug-toolbar==5.2.0 - # via - # -c requirements/constraints.txt - # -r requirements/edx/development.in +django-debug-toolbar==5.1.0 + # via -r requirements/edx/development.in django-fernet-fields-v2==0.9 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise - # enterprise-integrated-channels -django-filter==25.2 +django-filter==25.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -522,37 +500,34 @@ django-model-utils==5.0.0 # edx-submissions # edx-when # edxval - # enterprise-integrated-channels # ora2 # super-csv -django-mptt==0.18.0 +django-mptt==0.17.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # openedx-django-wiki -django-multi-email-field==0.8.0 +django-multi-email-field==0.7.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise -django-mysql==4.19.0 +django-mysql==4.16.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt 
django-oauth-toolkit==1.7.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise - # enterprise-integrated-channels django-object-actions==5.0.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise - # enterprise-integrated-channels -django-pipeline==4.1.0 +django-pipeline==4.0.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -574,15 +549,15 @@ django-ses==4.4.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -django-simple-history==3.10.1 +django-simple-history==3.4.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise # edx-name-affirmation # edx-organizations # edx-proctoring - # enterprise-integrated-channels # ora2 django-statici18n==2.6.0 # via @@ -592,23 +567,24 @@ django-statici18n==2.6.0 # xblock-drag-and-drop-v2 # xblock-poll # xblocks-contrib -django-storages==1.14.6 +django-storages==1.14.3 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edxval -django-stubs[compatible-mypy]==5.2.7 +django-stubs==5.1.3 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/development.in # djangorestframework-stubs -django-stubs-ext==5.2.7 +django-stubs-ext==5.1.3 # via django-stubs -django-user-tasks==3.4.4 +django-user-tasks==3.3.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -django-waffle==5.0.0 +django-waffle==4.2.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -617,13 +593,15 @@ django-waffle==5.0.0 # edx-enterprise # edx-proctoring # edx-toggles -django-webpack-loader==3.2.1 +django-webpack-loader==0.7.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # 
edx-proctoring -djangorestframework==3.16.1 +djangorestframework==3.14.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # django-config-models @@ -639,19 +617,18 @@ djangorestframework==3.16.1 # edx-organizations # edx-proctoring # edx-submissions - # openedx-authz # openedx-forum # openedx-learning # ora2 # super-csv -djangorestframework-stubs==3.16.5 +djangorestframework-stubs==3.15.3 # via -r requirements/edx/development.in djangorestframework-xml==2.0.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise -dnspython==2.8.0 +dnspython==2.7.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -675,27 +652,26 @@ drf-spectacular==0.28.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -drf-yasg==1.21.11 +drf-yasg==1.21.10 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # django-user-tasks # edx-api-doc-tools -edx-ace==1.15.0 +edx-ace==1.11.4 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -edx-api-doc-tools==2.1.0 +edx-api-doc-tools==2.0.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-name-affirmation - # openedx-authz -edx-auth-backends==4.6.2 +edx-auth-backends==4.5.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -edx-bulk-grades==1.2.0 +edx-bulk-grades==1.1.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -706,37 +682,36 @@ edx-ccx-keys==2.0.2 # -r requirements/edx/testing.txt # lti-consumer-xblock # openedx-events -edx-celeryutils==1.4.0 +edx-celeryutils==1.3.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-name-affirmation # super-csv -edx-codejail==4.0.0 +edx-codejail==3.5.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -edx-completion==4.9 +edx-completion==4.7.11 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt 
-edx-django-release-util==1.5.0 +edx-django-release-util==1.4.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-submissions # edxval -edx-django-sites-extensions==5.1.0 +edx-django-sites-extensions==4.2.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -edx-django-utils==8.0.1 +edx-django-utils==7.4.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # django-config-models # edx-ace - # edx-auth-backends # edx-drf-extensions # edx-enterprise # edx-event-bus-kafka @@ -745,9 +720,7 @@ edx-django-utils==8.0.1 # edx-rest-api-client # edx-toggles # edx-when - # enterprise-integrated-channels # event-tracking - # openedx-authz # openedx-events # ora2 # super-csv @@ -763,12 +736,10 @@ edx-drf-extensions==10.6.0 # edx-rbac # edx-when # edxval - # enterprise-integrated-channels - # openedx-authz # openedx-learning -edx-enterprise==6.5.1 +edx-enterprise==5.12.7 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt edx-event-bus-kafka==6.1.0 @@ -779,19 +750,20 @@ edx-event-bus-redis==0.6.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -edx-i18n-tools==1.9.0 +edx-i18n-tools==1.5.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # ora2 # xblocks-contrib edx-lint==5.6.0 # via -r requirements/edx/testing.txt -edx-milestones==1.1.0 +edx-milestones==0.6.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -edx-name-affirmation==3.0.2 +edx-name-affirmation==3.0.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -808,44 +780,40 @@ edx-opaque-keys[django]==3.0.0 # edx-organizations # edx-proctoring # edx-when - # enterprise-integrated-channels # lti-consumer-xblock - # openedx-authz # openedx-events # openedx-filters # ora2 - # xblocks-contrib -edx-organizations==7.3.0 
+edx-organizations==6.13.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -edx-proctoring==5.2.0 +edx-proctoring==5.1.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -edx-rbac==2.1.0 + # edx-proctoring-proctortrack +edx-rbac==1.10.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise - # enterprise-integrated-channels edx-rest-api-client==6.2.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise # edx-proctoring - # enterprise-integrated-channels -edx-search==4.3.0 +edx-search==4.1.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # openedx-forum -edx-sga==0.26.0 +edx-sga==0.25.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -edx-submissions==3.12.1 +edx-submissions==3.10.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -855,12 +823,10 @@ edx-tincan-py35==2.0.0 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise - # enterprise-integrated-channels -edx-toggles==5.4.1 +edx-toggles==5.3.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # edx-auth-backends # edx-completion # edx-enterprise # edx-event-bus-kafka @@ -870,19 +836,19 @@ edx-toggles==5.4.1 # edxval # event-tracking # ora2 -edx-when==3.0.0 +edx-when==2.5.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-proctoring -edxval==3.1.0 +edxval==2.10.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt elasticsearch==7.9.1 # via - # -c requirements/common_constraints.txt - # -c requirements/constraints.txt + # -c requirements/edx/../common_constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-search @@ -892,16 +858,13 @@ enmerkar==0.7.1 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # enmerkar-underscore -enmerkar-underscore==2.4.0 
+enmerkar-underscore==2.3.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -enterprise-integrated-channels==0.1.22 - # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt -event-tracking==3.3.0 +event-tracking==3.0.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-completion @@ -913,52 +876,57 @@ execnet==2.1.1 # pytest-xdist factory-boy==3.3.3 # via -r requirements/edx/testing.txt -faker==37.12.0 +faker==37.1.0 # via # -r requirements/edx/testing.txt # factory-boy -fastapi==0.120.2 +fastapi==0.115.12 # via # -r requirements/edx/testing.txt # pact-python -fastavro==1.12.1 +fastavro==1.10.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # openedx-events -filelock==3.20.0 +filelock==3.18.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # snowflake-connector-python # tox # virtualenv -firebase-admin==7.1.0 +firebase-admin==6.7.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-ace -freezegun==1.5.5 +freezegun==1.5.1 # via -r requirements/edx/testing.txt -frozenlist==1.8.0 +frozenlist==1.6.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # aiohttp # aiosignal -fs==2.4.16 +fs==2.0.27 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # fs-s3fs # openedx-django-pyfs # xblock -fs-s3fs==1.1.1 +fs-s3fs==0.1.8 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # openedx-django-pyfs -geoip2==5.1.0 +future==1.0.0 + # via + # -r requirements/edx/doc.txt + # -r requirements/edx/testing.txt + # pyjwkest +geoip2==5.0.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -966,40 +934,53 @@ gitdb==4.0.12 # via # -r requirements/edx/doc.txt # gitpython -gitpython==3.1.45 +gitpython==3.1.44 # via -r requirements/edx/doc.txt glob2==0.7 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt 
-google-api-core[grpc]==2.28.1 +google-api-core[grpc]==2.24.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # firebase-admin + # google-api-python-client # google-cloud-core # google-cloud-firestore # google-cloud-storage -google-auth==2.42.0 +google-api-python-client==2.167.0 + # via + # -r requirements/edx/doc.txt + # -r requirements/edx/testing.txt + # firebase-admin +google-auth==2.39.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # google-api-core + # google-api-python-client + # google-auth-httplib2 # google-cloud-core # google-cloud-firestore # google-cloud-storage -google-cloud-core==2.5.0 +google-auth-httplib2==0.2.0 + # via + # -r requirements/edx/doc.txt + # -r requirements/edx/testing.txt + # google-api-python-client +google-cloud-core==2.4.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # google-cloud-firestore # google-cloud-storage -google-cloud-firestore==2.21.0 +google-cloud-firestore==2.20.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # firebase-admin -google-cloud-storage==3.4.1 +google-cloud-storage==3.1.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1015,23 +996,23 @@ google-resumable-media==2.7.2 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # google-cloud-storage -googleapis-common-protos==1.71.0 +googleapis-common-protos==1.70.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # google-api-core # grpcio-status -grimp==3.13 +grimp==3.8 # via # -r requirements/edx/testing.txt # import-linter -grpcio==1.76.0 +grpcio==1.71.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # google-api-core # grpcio-status -grpcio-status==1.76.0 +grpcio-status==1.71.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1040,69 +1021,57 @@ gunicorn==23.0.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -h11==0.16.0 +h11==0.14.0 # via 
- # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # httpcore # uvicorn -h2==4.3.0 - # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt - # httpx -help-tokens==3.2.0 +help-tokens==3.1.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -hpack==4.1.0 - # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt - # h2 html5lib==1.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # ora2 -httpcore==1.0.9 +httpcore==0.16.3 # via - # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # httpx -httpretty==1.1.4 - # via -r requirements/edx/testing.txt -httpx[http2]==0.28.1 +httplib2==0.22.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # firebase-admin -hyperframe==6.1.0 + # google-api-python-client + # google-auth-httplib2 +httpretty==1.1.4 + # via -r requirements/edx/testing.txt +httpx==0.23.3 # via - # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # h2 -icalendar==6.3.1 + # pact-python +icalendar==6.1.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -idna==3.11 +idna==3.10 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # anyio - # httpx # optimizely-sdk # requests + # rfc3986 # snowflake-connector-python # yarl imagesize==1.4.1 # via # -r requirements/edx/doc.txt # sphinx -import-linter==2.5.2 +import-linter==2.3 # via -r requirements/edx/testing.txt -importlib-metadata==8.7.0 +importlib-metadata==8.6.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1112,15 +1081,10 @@ inflection==0.5.1 # -r requirements/edx/testing.txt # drf-spectacular # drf-yasg -iniconfig==2.3.0 +iniconfig==2.1.0 # via # -r requirements/edx/testing.txt # pytest -invoke==2.2.1 - # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt - # paramiko ipaddress==1.0.23 # via # -r requirements/edx/doc.txt @@ -1130,7 +1094,7 @@ isodate==0.7.2 # -r requirements/edx/doc.txt 
# -r requirements/edx/testing.txt # python3-saml -isort==6.1.0 +isort==6.0.1 # via # -r requirements/edx/testing.txt # pylint @@ -1148,7 +1112,7 @@ jmespath==1.0.1 # -r requirements/edx/testing.txt # boto3 # botocore -joblib==1.5.2 +joblib==1.4.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1158,7 +1122,7 @@ jsondiff==2.2.1 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise -jsonfield==3.2.0 +jsonfield==3.1.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1166,17 +1130,16 @@ jsonfield==3.2.0 # edx-enterprise # edx-proctoring # edx-submissions - # enterprise-integrated-channels # lti-consumer-xblock # ora2 -jsonschema==4.25.1 +jsonschema==4.23.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # drf-spectacular # optimizely-sdk # sphinxcontrib-openapi -jsonschema-specifications==2025.9.1 +jsonschema-specifications==2024.10.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1187,7 +1150,7 @@ jwcrypto==1.5.6 # -r requirements/edx/testing.txt # django-oauth-toolkit # pylti1p3 -kombu==5.5.4 +kombu==5.5.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1206,20 +1169,19 @@ lazy==1.6 # xblock libsass==0.10.0 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/assets.txt loremipsum==1.0.5 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # ora2 -lti-consumer-xblock==9.14.3 +lti-consumer-xblock==9.13.4 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt lxml[html-clean]==5.3.2 # via - # -c requirements/constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-i18n-tools @@ -1233,7 +1195,7 @@ lxml[html-clean]==5.3.2 # python3-saml # xblock # xmlsec -lxml-html-clean==0.4.3 +lxml-html-clean==0.4.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1250,14 +1212,15 @@ mako==1.3.10 # 
lti-consumer-xblock # xblock # xblock-utils -markdown==3.9 +markdown==3.3.7 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # openedx-django-wiki # staff-graded-xblock # xblock-poll -markupsafe==3.0.3 +markupsafe==3.0.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1266,7 +1229,7 @@ markupsafe==3.0.3 # mako # openedx-calc # xblock -maxminddb==2.8.2 +maxminddb==2.6.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1275,12 +1238,12 @@ mccabe==0.7.0 # via # -r requirements/edx/testing.txt # pylint -meilisearch==0.37.1 +meilisearch==0.34.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-search -mistune==3.1.4 +mistune==3.1.3 # via # -r requirements/edx/doc.txt # sphinx-mdinclude @@ -1295,7 +1258,7 @@ monotonic==1.6 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # analytics-python -more-itertools==10.8.0 +more-itertools==10.6.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1305,34 +1268,36 @@ mpmath==1.3.0 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # sympy -msgpack==1.1.2 +msgpack==1.1.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # cachecontrol -multidict==6.7.0 +multidict==6.4.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # aiohttp # yarl -mypy==1.18.2 - # via - # -r requirements/edx/development.in - # django-stubs -mypy-extensions==1.1.0 +mypy==1.15.0 + # via -r requirements/edx/development.in +mypy-extensions==1.0.0 # via mypy mysqlclient==2.2.7 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # openedx-forum -nh3==0.3.1 +newrelic==10.9.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # xblocks-contrib -nltk==3.9.2 + # edx-django-utils +nh3==0.2.21 + # via + # -r requirements/edx/doc.txt + # -r requirements/edx/testing.txt +nltk==3.9.1 # via # -r 
requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1344,14 +1309,14 @@ nodeenv==1.9.1 # -r requirements/edx/testing.txt numpy==1.26.4 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # chem # openedx-calc # scipy # shapely -oauthlib==3.3.1 +oauthlib==3.2.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1359,14 +1324,13 @@ oauthlib==3.3.1 # lti-consumer-xblock # requests-oauthlib # social-auth-core - # xblocks-contrib olxcleaner==0.3.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt openai==0.28.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise @@ -1374,33 +1338,27 @@ openedx-atlas==0.7.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # enterprise-integrated-channels - # openedx-authz # openedx-forum -openedx-authz==0.20.0 - # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt openedx-calc==4.0.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -openedx-django-pyfs==3.8.0 +openedx-django-pyfs==3.7.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # lti-consumer-xblock # xblock # xblocks-contrib -openedx-django-require==3.0.0 +openedx-django-require==2.1.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -openedx-django-wiki==3.1.1 +openedx-django-wiki==2.1.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -openedx-events==10.5.0 +openedx-events==10.2.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1410,56 +1368,57 @@ openedx-events==10.5.0 # edx-name-affirmation # event-tracking # ora2 -openedx-filters==2.1.0 +openedx-filters==2.0.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # lti-consumer-xblock # ora2 
-openedx-forum==0.3.8 +openedx-forum==0.3.6 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -openedx-learning==0.30.1 +openedx-learning==0.26.0 + # via + # -c requirements/edx/../constraints.txt + # -r requirements/edx/doc.txt + # -r requirements/edx/testing.txt +openedx-mongodbproxy==0.2.2 # via - # -c requirements/constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt optimizely-sdk==5.2.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -ora2==6.17.1 +ora2==6.16.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt packaging==25.0 # via + # -r requirements/edx/../pip-tools.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # -r requirements/pip-tools.txt # build # drf-yasg # gunicorn - # kombu # pydata-sphinx-theme # pyproject-api # pytest # snowflake-connector-python # sphinx # tox -pact-python==1.6.0 - # via - # -c requirements/constraints.txt - # -r requirements/edx/testing.txt -paramiko==4.0.0 +pact-python==2.0.1 + # via -r requirements/edx/testing.txt +paramiko==3.5.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise path==16.11.0 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-i18n-tools @@ -1471,8 +1430,11 @@ path-py==12.5.0 # edx-enterprise # ora2 # staff-graded-xblock -pathspec==0.12.1 - # via mypy +pbr==6.1.1 + # via + # -r requirements/edx/doc.txt + # -r requirements/edx/testing.txt + # stevedore pgpy==0.6.0 # via # -r requirements/edx/doc.txt @@ -1486,16 +1448,16 @@ piexif==1.1.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -pillow==12.0.0 +pillow==11.2.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise # edx-organizations # edxval -pip-tools==7.5.1 - # via -r requirements/pip-tools.txt -platformdirs==4.5.0 +pip-tools==7.4.1 + # via -r 
requirements/edx/../pip-tools.txt +platformdirs==4.3.7 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1503,24 +1465,23 @@ platformdirs==4.5.0 # snowflake-connector-python # tox # virtualenv -pluggy==1.6.0 +pluggy==1.5.0 # via # -r requirements/edx/testing.txt # diff-cover # pytest - # pytest-cov # tox polib==1.2.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-i18n-tools -prompt-toolkit==3.0.52 +prompt-toolkit==3.0.51 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # click-repl -propcache==0.4.1 +propcache==0.3.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1532,7 +1493,7 @@ proto-plus==1.26.1 # -r requirements/edx/testing.txt # google-api-core # google-cloud-firestore -protobuf==6.33.0 +protobuf==5.29.4 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1541,7 +1502,7 @@ protobuf==6.33.0 # googleapis-common-protos # grpcio-status # proto-plus -psutil==7.1.2 +psutil==7.0.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1562,38 +1523,33 @@ pyasn1-modules==0.4.2 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # google-auth -pycasbin==2.4.0 - # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt - # casbin-django-orm-adapter - # openedx-authz pycodestyle==2.8.0 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/testing.txt pycountry==24.6.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -pycparser==2.23 +pycparser==2.22 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # cffi -pycryptodomex==3.23.0 +pycryptodomex==3.22.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-proctoring # lti-consumer-xblock -pydantic==2.12.3 + # pyjwkest +pydantic==2.11.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # camel-converter # fastapi 
-pydantic-core==2.41.4 +pydantic-core==2.33.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1602,7 +1558,7 @@ pydata-sphinx-theme==0.15.4 # via # -r requirements/edx/doc.txt # sphinx-book-theme -pygments==2.19.2 +pygments==2.19.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1611,6 +1567,11 @@ pygments==2.19.2 # pydata-sphinx-theme # sphinx # sphinx-mdinclude +pyjwkest==1.4.2 + # via + # -r requirements/edx/doc.txt + # -r requirements/edx/testing.txt + # lti-consumer-xblock pyjwt[crypto]==2.10.1 # via # -r requirements/edx/doc.txt @@ -1630,7 +1591,7 @@ pylatexenc==2.10 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # olxcleaner -pylint==3.3.9 +pylint==3.3.6 # via # -r requirements/edx/testing.txt # edx-lint @@ -1646,7 +1607,7 @@ pylint-django==2.6.1 # via # -r requirements/edx/testing.txt # edx-lint -pylint-plugin-utils==0.9.0 +pylint-plugin-utils==0.8.2 # via # -r requirements/edx/testing.txt # pylint-celery @@ -1663,14 +1624,15 @@ pymemcache==4.0.0 # -r requirements/edx/testing.txt pymongo==4.4.0 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-opaque-keys # event-tracking # mongoengine # openedx-forum -pynacl==1.6.0 + # openedx-mongodbproxy +pynacl==1.5.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1680,24 +1642,25 @@ pynliner==0.8.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -pyopenssl==25.3.0 +pyopenssl==25.0.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # snowflake-connector-python -pyparsing==3.2.5 +pyparsing==3.2.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # chem + # httplib2 # openedx-calc -pyproject-api==1.10.0 +pyproject-api==1.9.0 # via # -r requirements/edx/testing.txt # tox pyproject-hooks==1.2.0 # via - # -r requirements/pip-tools.txt + # -r 
requirements/edx/../pip-tools.txt # build # pip-tools pyquery==2.0.1 @@ -1725,7 +1688,7 @@ pytest==8.2.0 # pytest-xdist pytest-attrib==0.1.3 # via -r requirements/edx/testing.txt -pytest-cov==7.0.0 +pytest-cov==6.1.1 # via -r requirements/edx/testing.txt pytest-django==4.11.1 # via -r requirements/edx/testing.txt @@ -1735,9 +1698,9 @@ pytest-metadata==3.1.1 # via # -r requirements/edx/testing.txt # pytest-json-report -pytest-randomly==4.0.1 +pytest-randomly==3.16.0 # via -r requirements/edx/testing.txt -pytest-xdist[psutil]==3.8.0 +pytest-xdist[psutil]==3.6.1 # via -r requirements/edx/testing.txt python-dateutil==2.9.0.post0 # via @@ -1764,7 +1727,7 @@ python-slugify==8.0.4 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # code-annotations -python-swiftclient==4.8.0 +python-swiftclient==4.7.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1782,14 +1745,15 @@ pytz==2025.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt + # djangorestframework # drf-yasg # edx-completion # edx-enterprise # edx-proctoring # edx-submissions # edx-tincan-py35 - # enterprise-integrated-channels # event-tracking + # fs # olxcleaner # ora2 # snowflake-connector-python @@ -1798,9 +1762,9 @@ pyuca==1.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -pywatchman==3.0.0 +pywatchman==2.0.0 # via -r requirements/edx/development.in -pyyaml==6.0.3 +pyyaml==6.0.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1817,27 +1781,27 @@ random2==1.0.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -recommender-xblock==3.1.0 +recommender-xblock==3.0.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -redis==7.0.1 +redis==5.2.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # walrus -referencing==0.37.0 +referencing==0.36.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # jsonschema # 
jsonschema-specifications -regex==2025.10.23 +regex==2024.11.6 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # nltk -requests==2.32.5 +requests==2.32.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1849,7 +1813,6 @@ requests==2.32.5 # edx-drf-extensions # edx-enterprise # edx-rest-api-client - # enterprise-integrated-channels # geoip2 # google-api-core # google-cloud-storage @@ -1859,6 +1822,7 @@ requests==2.32.5 # openedx-forum # optimizely-sdk # pact-python + # pyjwkest # pylti1p3 # python-swiftclient # requests-oauthlib @@ -1873,11 +1837,15 @@ requests-oauthlib==2.0.0 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # social-auth-core +rfc3986[idna2008]==1.5.0 + # via + # -r requirements/edx/testing.txt + # httpx roman-numerals-py==3.1.0 # via # -r requirements/edx/doc.txt # sphinx -rpds-py==0.28.0 +rpds-py==0.24.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1895,7 +1863,7 @@ rules==3.5 # edx-enterprise # edx-proctoring # openedx-learning -s3transfer==0.14.0 +s3transfer==0.11.5 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1905,7 +1873,7 @@ sailthru-client==2.2.3 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-ace -scipy==1.16.3 +scipy==1.15.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1915,16 +1883,11 @@ semantic-version==2.10.0 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-drf-extensions -shapely==2.1.2 - # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt -simpleeval==1.0.3 +shapely==2.1.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # pycasbin -simplejson==3.20.2 +simplejson==3.20.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -1932,7 +1895,7 @@ simplejson==3.20.2 # super-csv # xblock # xblock-utils -singledispatch==4.1.2 +singledispatch==4.1.1 # via -r requirements/edx/testing.txt 
six==1.17.0 # via @@ -1940,6 +1903,7 @@ six==1.17.0 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # analytics-python + # codejail-includes # crowdsourcehinter-xblock # edx-ace # edx-auth-backends @@ -1955,6 +1919,7 @@ six==1.17.0 # html5lib # libsass # pact-python + # pyjwkest # python-dateutil # sphinxcontrib-httpdomain slumber==0.7.1 @@ -1963,33 +1928,34 @@ slumber==0.7.1 # -r requirements/edx/testing.txt # edx-bulk-grades # edx-enterprise - # enterprise-integrated-channels smmap==5.0.2 # via # -r requirements/edx/doc.txt # gitdb sniffio==1.3.1 # via - # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # anyio -snowballstemmer==3.0.1 + # httpcore + # httpx +snowballstemmer==2.2.0 # via # -r requirements/edx/doc.txt # sphinx -snowflake-connector-python==4.0.0 +snowflake-connector-python==3.14.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise social-auth-app-django==5.4.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-auth-backends -social-auth-core==4.8.1 +social-auth-core==4.5.4 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-auth-backends @@ -2004,7 +1970,7 @@ sortedcontainers==2.4.0 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # snowflake-connector-python -soupsieve==2.8 +soupsieve==2.7 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -2021,7 +1987,7 @@ sphinx==8.2.3 # sphinxcontrib-httpdomain # sphinxcontrib-openapi # sphinxext-rediraffe -sphinx-autoapi==3.6.1 +sphinx-autoapi==3.6.0 # via -r requirements/edx/doc.txt sphinx-book-theme==1.1.4 # via -r requirements/edx/doc.txt @@ -2031,7 +1997,7 @@ sphinx-mdinclude==0.6.2 # via # -r requirements/edx/doc.txt # sphinxcontrib-openapi -sphinx-reredirects==1.0.0 +sphinx-reredirects==0.1.6 # via -r requirements/edx/doc.txt 
sphinxcontrib-applehelp==2.0.0 # via @@ -2063,7 +2029,7 @@ sphinxcontrib-serializinghtml==2.0.0 # via # -r requirements/edx/doc.txt # sphinx -sphinxext-rediraffe==0.3.0 +sphinxext-rediraffe==0.2.7 # via -r requirements/edx/doc.txt sqlparse==0.5.3 # via @@ -2071,15 +2037,15 @@ sqlparse==0.5.3 # -r requirements/edx/testing.txt # django # django-debug-toolbar -staff-graded-xblock==3.1.0 +staff-graded-xblock==3.0.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -starlette==0.49.1 +starlette==0.46.2 # via # -r requirements/edx/testing.txt # fastapi -stevedore==5.5.0 +stevedore==5.4.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -2088,17 +2054,17 @@ stevedore==5.5.0 # edx-django-utils # edx-enterprise # edx-opaque-keys -super-csv==4.1.0 +super-csv==4.0.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-bulk-grades -sympy==1.14.0 +sympy==1.13.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # openedx-calc -testfixtures==10.0.0 +testfixtures==8.3.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -2113,14 +2079,13 @@ tinycss2==1.4.0 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # bleach -tomlkit==0.13.3 +tomlkit==0.13.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # openedx-learning # pylint # snowflake-connector-python -tox==4.32.0 +tox==4.25.0 # via -r requirements/edx/testing.txt tqdm==4.67.1 # via @@ -2128,17 +2093,16 @@ tqdm==4.67.1 # -r requirements/edx/testing.txt # nltk # openai -types-pyyaml==6.0.12.20250915 +types-pyyaml==6.0.12.20250402 # via # django-stubs # djangorestframework-stubs -types-requests==2.32.4.20250913 +types-requests==2.32.0.20250328 # via djangorestframework-stubs -typing-extensions==4.15.0 +typing-extensions==4.13.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # aiosignal # anyio # beautifulsoup4 # django-countries @@ -2148,7 +2112,6 @@ 
typing-extensions==4.15.0 # edx-opaque-keys # fastapi # grimp - # grpcio # import-linter # jwcrypto # mypy @@ -2159,9 +2122,8 @@ typing-extensions==4.15.0 # pyopenssl # referencing # snowflake-connector-python - # starlette # typing-inspection -typing-inspection==0.4.2 +typing-inspection==0.4.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -2178,20 +2140,20 @@ unicodecsv==0.14.1 # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-enterprise - # enterprise-integrated-channels unicodeit==0.7.5 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt unidiff==0.7.5 # via -r requirements/edx/testing.txt -uritemplate==4.2.0 +uritemplate==4.1.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # drf-spectacular # drf-yasg -urllib3==2.5.0 + # google-api-python-client +urllib3==2.2.3 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -2200,7 +2162,11 @@ urllib3==2.5.0 # pact-python # requests # types-requests -uvicorn==0.38.0 +user-util==1.1.0 + # via + # -r requirements/edx/doc.txt + # -r requirements/edx/testing.txt +uvicorn==0.34.2 # via # -r requirements/edx/testing.txt # pact-python @@ -2211,7 +2177,7 @@ vine==5.1.0 # amqp # celery # kombu -virtualenv==20.35.4 +virtualenv==20.30.0 # via # -r requirements/edx/testing.txt # tox @@ -2222,24 +2188,19 @@ voluptuous==0.15.2 # ora2 vulture==2.14 # via -r requirements/edx/development.in -walrus==0.9.5 +walrus==0.9.4 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # edx-event-bus-redis watchdog==6.0.0 # via -r requirements/edx/development.in -wcmatch==10.1 - # via - # -r requirements/edx/doc.txt - # -r requirements/edx/testing.txt - # pycasbin -wcwidth==0.2.14 +wcwidth==0.2.13 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # prompt-toolkit -web-fragments==3.1.0 +web-fragments==3.0.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -2262,12 +2223,12 @@ 
webob==1.8.9 # xblock wheel==0.45.1 # via + # -r requirements/edx/../pip-tools.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt - # -r requirements/pip-tools.txt # django-pipeline # pip-tools -wrapt==2.0.0 +wrapt==1.17.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -2288,7 +2249,7 @@ xblock[django]==5.2.0 # xblock-google-drive # xblock-utils # xblocks-contrib -xblock-drag-and-drop-v2==5.0.3 +xblock-drag-and-drop-v2==5.0.2 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -2296,7 +2257,7 @@ xblock-google-drive==0.8.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -xblock-poll==1.15.1 +xblock-poll==1.14.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt @@ -2306,26 +2267,26 @@ xblock-utils==4.0.0 # -r requirements/edx/testing.txt # edx-sga # xblock-poll -xblocks-contrib==0.6.0 +xblocks-contrib==0.3.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt xmlsec==1.3.14 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # python3-saml -xss-utils==0.8.0 +xss-utils==0.7.1 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt -yarl==1.22.0 +yarl==1.20.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt # aiohttp -zipp==3.23.0 +zipp==3.21.0 # via # -r requirements/edx/doc.txt # -r requirements/edx/testing.txt diff --git a/requirements/edx/doc.txt b/requirements/edx/doc.txt index 68c115676fe9..c9929446bdd8 100644 --- a/requirements/edx/doc.txt +++ b/requirements/edx/doc.txt @@ -4,6 +4,8 @@ # # make upgrade # +-e git+https://github.com/anupdhabarde/edx-proctoring-proctortrack.git@31c6c9923a51c903ae83760ecbbac191363aa2a2#egg=edx_proctoring_proctortrack + # via -r requirements/edx/base.txt accessible-pygments==0.0.5 # via pydata-sphinx-theme acid-xblock==0.4.1 @@ -12,12 +14,12 @@ aiohappyeyeballs==2.6.1 # via # -r 
requirements/edx/base.txt # aiohttp -aiohttp==3.13.2 +aiohttp==3.11.18 # via # -r requirements/edx/base.txt # geoip2 # openai -aiosignal==1.4.0 +aiosignal==1.3.2 # via # -r requirements/edx/base.txt # aiohttp @@ -37,15 +39,11 @@ annotated-types==0.7.0 # via # -r requirements/edx/base.txt # pydantic -anyio==4.11.0 - # via - # -r requirements/edx/base.txt - # httpx appdirs==1.4.4 # via # -r requirements/edx/base.txt # fs -asgiref==3.10.0 +asgiref==3.8.1 # via # -r requirements/edx/base.txt # django @@ -55,16 +53,15 @@ asn1crypto==1.5.1 # via # -r requirements/edx/base.txt # snowflake-connector-python -astroid==3.3.11 +astroid==3.3.9 # via sphinx-autoapi -attrs==25.4.0 +attrs==25.3.0 # via # -r requirements/edx/base.txt # aiohttp # edx-ace # jsonschema # lti-consumer-xblock - # openedx-authz # openedx-events # openedx-learning # referencing @@ -79,21 +76,21 @@ backoff==1.10.0 # via # -r requirements/edx/base.txt # analytics-python -bcrypt==5.0.0 +bcrypt==4.3.0 # via # -r requirements/edx/base.txt # paramiko -beautifulsoup4==4.14.2 +beautifulsoup4==4.13.4 # via # -r requirements/edx/base.txt # openedx-forum # pydata-sphinx-theme # pynliner -billiard==4.2.2 +billiard==4.2.1 # via # -r requirements/edx/base.txt # celery -bleach[css]==6.3.0 +bleach[css]==6.2.0 # via # -r requirements/edx/base.txt # edx-enterprise @@ -104,78 +101,69 @@ bleach[css]==6.3.0 # xblock-poll boto==2.49.0 # via -r requirements/edx/base.txt -boto3==1.40.62 +boto3==1.37.38 # via # -r requirements/edx/base.txt # django-ses # fs-s3fs # ora2 # snowflake-connector-python -botocore==1.40.62 +botocore==1.37.38 # via # -r requirements/edx/base.txt # boto3 # s3transfer # snowflake-connector-python -bracex==2.6 - # via - # -r requirements/edx/base.txt - # wcmatch bridgekeeper==0.9 # via -r requirements/edx/base.txt -cachecontrol==0.14.3 +cachecontrol==0.14.2 # via # -r requirements/edx/base.txt # firebase-admin -cachetools==6.2.1 +cachetools==5.5.2 # via # -r requirements/edx/base.txt # edxval # google-auth 
-camel-converter[pydantic]==5.0.0 +camel-converter[pydantic]==4.0.1 # via # -r requirements/edx/base.txt # meilisearch -casbin-django-orm-adapter==1.7.0 - # via - # -r requirements/edx/base.txt - # openedx-authz -celery==5.5.3 +celery==5.5.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # django-celery-results # django-user-tasks # edx-celeryutils # edx-enterprise - # enterprise-integrated-channels # event-tracking # openedx-learning -certifi==2025.10.5 +certifi==2025.1.31 # via # -r requirements/edx/base.txt # elasticsearch - # httpcore - # httpx # requests # snowflake-connector-python -cffi==2.0.0 +cffi==1.17.1 # via # -r requirements/edx/base.txt # cryptography # pynacl + # snowflake-connector-python chardet==5.2.0 # via # -r requirements/edx/base.txt # pysrt -charset-normalizer==3.4.4 +charset-normalizer==2.0.12 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # requests # snowflake-connector-python -chem==2.0.0 +chem==1.3.0 # via -r requirements/edx/base.txt -click==8.3.0 +click==8.1.8 # via # -r requirements/edx/base.txt # celery @@ -185,11 +173,12 @@ click==8.3.0 # code-annotations # edx-django-utils # nltk + # user-util click-didyoumean==0.3.1 # via # -r requirements/edx/base.txt # celery -click-plugins==1.1.1.2 +click-plugins==1.1.1 # via # -r requirements/edx/base.txt # celery @@ -203,13 +192,12 @@ code-annotations==2.3.0 # -r requirements/edx/doc.in # edx-enterprise # edx-toggles -codejail-includes==2.0.0 +codejail-includes==1.0.0 # via -r requirements/edx/base.txt crowdsourcehinter-xblock==0.8 # via -r requirements/edx/base.txt -cryptography==45.0.7 +cryptography==44.0.2 # via - # -c requirements/constraints.txt # -r requirements/edx/base.txt # django-fernet-fields-v2 # edx-enterprise @@ -219,6 +207,7 @@ cryptography==45.0.7 # pyjwt # pyopenssl # snowflake-connector-python + # social-auth-core cssutils==2.11.1 # via # -r requirements/edx/base.txt @@ 
-232,14 +221,12 @@ defusedxml==0.7.1 # ora2 # python3-openid # social-auth-core -django==5.2.7 +django==4.2.20 # via - # -c requirements/common_constraints.txt - # -c requirements/constraints.txt + # -c requirements/edx/../common_constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt - # casbin-django-orm-adapter # django-appconf - # django-autocomplete-light # django-celery-results # django-classy-tags # django-config-models @@ -256,7 +243,6 @@ django==5.2.7 # django-push-notifications # django-sekizai # django-ses - # django-simple-history # django-statici18n # django-storages # django-user-tasks @@ -292,12 +278,10 @@ django==5.2.7 # edxval # enmerkar # enmerkar-underscore - # enterprise-integrated-channels # event-tracking # help-tokens # jsonfield # lti-consumer-xblock - # openedx-authz # openedx-django-pyfs # openedx-django-wiki # openedx-events @@ -313,8 +297,6 @@ django-appconf==1.1.0 # via # -r requirements/edx/base.txt # django-statici18n -django-autocomplete-light==3.12.1 - # via -r requirements/edx/base.txt django-cache-memoize==0.2.1 # via # -r requirements/edx/base.txt @@ -330,9 +312,8 @@ django-config-models==2.9.0 # -r requirements/edx/base.txt # edx-enterprise # edx-name-affirmation - # enterprise-integrated-channels # lti-consumer-xblock -django-cors-headers==4.9.0 +django-cors-headers==4.7.0 # via -r requirements/edx/base.txt django-countries==7.6.1 # via @@ -351,8 +332,7 @@ django-fernet-fields-v2==0.9 # via # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels -django-filter==25.2 +django-filter==25.1 # via # -r requirements/edx/base.txt # edx-enterprise @@ -384,31 +364,28 @@ django-model-utils==5.0.0 # edx-submissions # edx-when # edxval - # enterprise-integrated-channels # ora2 # super-csv -django-mptt==0.18.0 +django-mptt==0.17.0 # via # -r requirements/edx/base.txt # openedx-django-wiki -django-multi-email-field==0.8.0 +django-multi-email-field==0.7.0 # via # -r 
requirements/edx/base.txt # edx-enterprise -django-mysql==4.19.0 +django-mysql==4.16.0 # via -r requirements/edx/base.txt django-oauth-toolkit==1.7.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels django-object-actions==5.0.0 # via # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels -django-pipeline==4.1.0 +django-pipeline==4.0.0 # via -r requirements/edx/base.txt django-push-notifications==3.2.1 # via @@ -422,14 +399,14 @@ django-sekizai==4.1.0 # openedx-django-wiki django-ses==4.4.0 # via -r requirements/edx/base.txt -django-simple-history==3.10.1 +django-simple-history==3.4.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-enterprise # edx-name-affirmation # edx-organizations # edx-proctoring - # enterprise-integrated-channels # ora2 django-statici18n==2.6.0 # via @@ -438,13 +415,14 @@ django-statici18n==2.6.0 # xblock-drag-and-drop-v2 # xblock-poll # xblocks-contrib -django-storages==1.14.6 +django-storages==1.14.3 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edxval -django-user-tasks==3.4.4 +django-user-tasks==3.3.0 # via -r requirements/edx/base.txt -django-waffle==5.0.0 +django-waffle==4.2.0 # via # -r requirements/edx/base.txt # edx-django-utils @@ -452,12 +430,14 @@ django-waffle==5.0.0 # edx-enterprise # edx-proctoring # edx-toggles -django-webpack-loader==3.2.1 +django-webpack-loader==0.7.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-proctoring -djangorestframework==3.16.1 +djangorestframework==3.14.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # django-config-models # django-user-tasks @@ -472,7 +452,6 @@ djangorestframework==3.16.1 # edx-organizations # edx-proctoring # edx-submissions - # openedx-authz # openedx-forum # openedx-learning # ora2 @@ 
-481,7 +460,7 @@ djangorestframework-xml==2.0.0 # via # -r requirements/edx/base.txt # edx-enterprise -dnspython==2.8.0 +dnspython==2.7.0 # via # -r requirements/edx/base.txt # pymongo @@ -498,21 +477,20 @@ drf-jwt==1.19.2 # edx-drf-extensions drf-spectacular==0.28.0 # via -r requirements/edx/base.txt -drf-yasg==1.21.11 +drf-yasg==1.21.10 # via # -r requirements/edx/base.txt # django-user-tasks # edx-api-doc-tools -edx-ace==1.15.0 +edx-ace==1.11.4 # via -r requirements/edx/base.txt -edx-api-doc-tools==2.1.0 +edx-api-doc-tools==2.0.0 # via # -r requirements/edx/base.txt # edx-name-affirmation - # openedx-authz -edx-auth-backends==4.6.2 +edx-auth-backends==4.5.0 # via -r requirements/edx/base.txt -edx-bulk-grades==1.2.0 +edx-bulk-grades==1.1.0 # via # -r requirements/edx/base.txt # staff-graded-xblock @@ -521,28 +499,27 @@ edx-ccx-keys==2.0.2 # -r requirements/edx/base.txt # lti-consumer-xblock # openedx-events -edx-celeryutils==1.4.0 +edx-celeryutils==1.3.0 # via # -r requirements/edx/base.txt # edx-name-affirmation # super-csv -edx-codejail==4.0.0 +edx-codejail==3.5.2 # via -r requirements/edx/base.txt -edx-completion==4.9 +edx-completion==4.7.11 # via -r requirements/edx/base.txt -edx-django-release-util==1.5.0 +edx-django-release-util==1.4.0 # via # -r requirements/edx/base.txt # edx-submissions # edxval -edx-django-sites-extensions==5.1.0 +edx-django-sites-extensions==4.2.0 # via -r requirements/edx/base.txt -edx-django-utils==8.0.1 +edx-django-utils==7.4.0 # via # -r requirements/edx/base.txt # django-config-models # edx-ace - # edx-auth-backends # edx-drf-extensions # edx-enterprise # edx-event-bus-kafka @@ -551,9 +528,7 @@ edx-django-utils==8.0.1 # edx-rest-api-client # edx-toggles # edx-when - # enterprise-integrated-channels # event-tracking - # openedx-authz # openedx-events # ora2 # super-csv @@ -568,25 +543,24 @@ edx-drf-extensions==10.6.0 # edx-rbac # edx-when # edxval - # enterprise-integrated-channels - # openedx-authz # openedx-learning 
-edx-enterprise==6.5.1 +edx-enterprise==5.12.7 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt edx-event-bus-kafka==6.1.0 # via -r requirements/edx/base.txt edx-event-bus-redis==0.6.1 # via -r requirements/edx/base.txt -edx-i18n-tools==1.9.0 +edx-i18n-tools==1.5.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # ora2 # xblocks-contrib -edx-milestones==1.1.0 +edx-milestones==0.6.0 # via -r requirements/edx/base.txt -edx-name-affirmation==3.0.2 +edx-name-affirmation==3.0.1 # via -r requirements/edx/base.txt edx-opaque-keys[django]==3.0.0 # via @@ -600,35 +574,32 @@ edx-opaque-keys[django]==3.0.0 # edx-organizations # edx-proctoring # edx-when - # enterprise-integrated-channels # lti-consumer-xblock - # openedx-authz # openedx-events # openedx-filters # ora2 - # xblocks-contrib -edx-organizations==7.3.0 +edx-organizations==6.13.0 # via -r requirements/edx/base.txt -edx-proctoring==5.2.0 - # via -r requirements/edx/base.txt -edx-rbac==2.1.0 +edx-proctoring==5.1.2 + # via + # -r requirements/edx/base.txt + # edx-proctoring-proctortrack +edx-rbac==1.10.0 # via # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels edx-rest-api-client==6.2.0 # via # -r requirements/edx/base.txt # edx-enterprise # edx-proctoring - # enterprise-integrated-channels -edx-search==4.3.0 +edx-search==4.1.3 # via # -r requirements/edx/base.txt # openedx-forum -edx-sga==0.26.0 +edx-sga==0.25.3 # via -r requirements/edx/base.txt -edx-submissions==3.12.1 +edx-submissions==3.10.0 # via # -r requirements/edx/base.txt # ora2 @@ -636,11 +607,9 @@ edx-tincan-py35==2.0.0 # via # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels -edx-toggles==5.4.1 +edx-toggles==5.3.0 # via # -r requirements/edx/base.txt - # edx-auth-backends # edx-completion # edx-enterprise # edx-event-bus-kafka @@ -650,16 +619,16 @@ edx-toggles==5.4.1 # edxval # event-tracking # 
ora2 -edx-when==3.0.0 +edx-when==2.5.1 # via # -r requirements/edx/base.txt # edx-proctoring -edxval==3.1.0 +edxval==2.10.0 # via -r requirements/edx/base.txt elasticsearch==7.9.1 # via - # -c requirements/common_constraints.txt - # -c requirements/constraints.txt + # -c requirements/edx/../common_constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-search # openedx-forum @@ -667,75 +636,89 @@ enmerkar==0.7.1 # via # -r requirements/edx/base.txt # enmerkar-underscore -enmerkar-underscore==2.4.0 +enmerkar-underscore==2.3.1 # via -r requirements/edx/base.txt -enterprise-integrated-channels==0.1.22 - # via -r requirements/edx/base.txt -event-tracking==3.3.0 +event-tracking==3.0.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-completion # edx-proctoring # edx-search -fastavro==1.12.1 +fastavro==1.10.0 # via # -r requirements/edx/base.txt # openedx-events -filelock==3.20.0 +filelock==3.18.0 # via # -r requirements/edx/base.txt # snowflake-connector-python -firebase-admin==7.1.0 +firebase-admin==6.7.0 # via # -r requirements/edx/base.txt # edx-ace -frozenlist==1.8.0 +frozenlist==1.6.0 # via # -r requirements/edx/base.txt # aiohttp # aiosignal -fs==2.4.16 +fs==2.0.27 # via # -r requirements/edx/base.txt # fs-s3fs # openedx-django-pyfs # xblock -fs-s3fs==1.1.1 +fs-s3fs==0.1.8 # via # -r requirements/edx/base.txt # openedx-django-pyfs -geoip2==5.1.0 +future==1.0.0 + # via + # -r requirements/edx/base.txt + # pyjwkest +geoip2==5.0.1 # via -r requirements/edx/base.txt gitdb==4.0.12 # via gitpython -gitpython==3.1.45 +gitpython==3.1.44 # via -r requirements/edx/doc.in glob2==0.7 # via -r requirements/edx/base.txt -google-api-core[grpc]==2.28.1 +google-api-core[grpc]==2.24.2 # via # -r requirements/edx/base.txt # firebase-admin + # google-api-python-client # google-cloud-core # google-cloud-firestore # google-cloud-storage -google-auth==2.42.0 +google-api-python-client==2.167.0 + # via + # -r 
requirements/edx/base.txt + # firebase-admin +google-auth==2.39.0 # via # -r requirements/edx/base.txt # google-api-core + # google-api-python-client + # google-auth-httplib2 # google-cloud-core # google-cloud-firestore # google-cloud-storage -google-cloud-core==2.5.0 +google-auth-httplib2==0.2.0 + # via + # -r requirements/edx/base.txt + # google-api-python-client +google-cloud-core==2.4.3 # via # -r requirements/edx/base.txt # google-cloud-firestore # google-cloud-storage -google-cloud-firestore==2.21.0 +google-cloud-firestore==2.20.2 # via # -r requirements/edx/base.txt # firebase-admin -google-cloud-storage==3.4.1 +google-cloud-storage==3.1.0 # via # -r requirements/edx/base.txt # firebase-admin @@ -748,76 +731,51 @@ google-resumable-media==2.7.2 # via # -r requirements/edx/base.txt # google-cloud-storage -googleapis-common-protos==1.71.0 +googleapis-common-protos==1.70.0 # via # -r requirements/edx/base.txt # google-api-core # grpcio-status -grpcio==1.76.0 +grpcio==1.71.0 # via # -r requirements/edx/base.txt # google-api-core # grpcio-status -grpcio-status==1.76.0 +grpcio-status==1.71.0 # via # -r requirements/edx/base.txt # google-api-core gunicorn==23.0.0 # via -r requirements/edx/base.txt -h11==0.16.0 - # via - # -r requirements/edx/base.txt - # httpcore -h2==4.3.0 - # via - # -r requirements/edx/base.txt - # httpx -help-tokens==3.2.0 +help-tokens==3.1.0 # via -r requirements/edx/base.txt -hpack==4.1.0 - # via - # -r requirements/edx/base.txt - # h2 html5lib==1.1 # via # -r requirements/edx/base.txt # ora2 -httpcore==1.0.9 - # via - # -r requirements/edx/base.txt - # httpx -httpx[http2]==0.28.1 +httplib2==0.22.0 # via # -r requirements/edx/base.txt - # firebase-admin -hyperframe==6.1.0 - # via - # -r requirements/edx/base.txt - # h2 -icalendar==6.3.1 + # google-api-python-client + # google-auth-httplib2 +icalendar==6.1.3 # via -r requirements/edx/base.txt -idna==3.11 +idna==3.10 # via # -r requirements/edx/base.txt - # anyio - # httpx # optimizely-sdk # 
requests # snowflake-connector-python # yarl imagesize==1.4.1 # via sphinx -importlib-metadata==8.7.0 +importlib-metadata==8.6.1 # via -r requirements/edx/base.txt inflection==0.5.1 # via # -r requirements/edx/base.txt # drf-spectacular # drf-yasg -invoke==2.2.1 - # via - # -r requirements/edx/base.txt - # paramiko ipaddress==1.0.23 # via -r requirements/edx/base.txt isodate==0.7.2 @@ -835,7 +793,7 @@ jmespath==1.0.1 # -r requirements/edx/base.txt # boto3 # botocore -joblib==1.5.2 +joblib==1.4.2 # via # -r requirements/edx/base.txt # nltk @@ -843,23 +801,22 @@ jsondiff==2.2.1 # via # -r requirements/edx/base.txt # edx-enterprise -jsonfield==3.2.0 +jsonfield==3.1.0 # via # -r requirements/edx/base.txt # edx-celeryutils # edx-enterprise # edx-proctoring # edx-submissions - # enterprise-integrated-channels # lti-consumer-xblock # ora2 -jsonschema==4.25.1 +jsonschema==4.23.0 # via # -r requirements/edx/base.txt # drf-spectacular # optimizely-sdk # sphinxcontrib-openapi -jsonschema-specifications==2025.9.1 +jsonschema-specifications==2024.10.1 # via # -r requirements/edx/base.txt # jsonschema @@ -868,7 +825,7 @@ jwcrypto==1.5.6 # -r requirements/edx/base.txt # django-oauth-toolkit # pylti1p3 -kombu==5.5.4 +kombu==5.5.3 # via # -r requirements/edx/base.txt # celery @@ -885,11 +842,10 @@ loremipsum==1.0.5 # via # -r requirements/edx/base.txt # ora2 -lti-consumer-xblock==9.14.3 +lti-consumer-xblock==9.13.4 # via -r requirements/edx/base.txt lxml[html-clean]==5.3.2 # via - # -c requirements/constraints.txt # -r requirements/edx/base.txt # edx-i18n-tools # edxval @@ -901,7 +857,7 @@ lxml[html-clean]==5.3.2 # python3-saml # xblock # xmlsec -lxml-html-clean==0.4.3 +lxml-html-clean==0.4.2 # via # -r requirements/edx/base.txt # lxml @@ -914,13 +870,14 @@ mako==1.3.10 # lti-consumer-xblock # xblock # xblock-utils -markdown==3.9 +markdown==3.3.7 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # openedx-django-wiki # staff-graded-xblock # xblock-poll 
-markupsafe==3.0.3 +markupsafe==3.0.2 # via # -r requirements/edx/base.txt # chem @@ -928,15 +885,15 @@ markupsafe==3.0.3 # mako # openedx-calc # xblock -maxminddb==2.8.2 +maxminddb==2.6.3 # via # -r requirements/edx/base.txt # geoip2 -meilisearch==0.37.1 +meilisearch==0.34.1 # via # -r requirements/edx/base.txt # edx-search -mistune==3.1.4 +mistune==3.1.3 # via sphinx-mdinclude mongoengine==0.29.1 # via -r requirements/edx/base.txt @@ -944,7 +901,7 @@ monotonic==1.6 # via # -r requirements/edx/base.txt # analytics-python -more-itertools==10.8.0 +more-itertools==10.6.0 # via # -r requirements/edx/base.txt # cssutils @@ -952,11 +909,11 @@ mpmath==1.3.0 # via # -r requirements/edx/base.txt # sympy -msgpack==1.1.2 +msgpack==1.1.0 # via # -r requirements/edx/base.txt # cachecontrol -multidict==6.7.0 +multidict==6.4.3 # via # -r requirements/edx/base.txt # aiohttp @@ -965,11 +922,13 @@ mysqlclient==2.2.7 # via # -r requirements/edx/base.txt # openedx-forum -nh3==0.3.1 +newrelic==10.9.0 # via # -r requirements/edx/base.txt - # xblocks-contrib -nltk==3.9.2 + # edx-django-utils +nh3==0.2.21 + # via -r requirements/edx/base.txt +nltk==3.9.1 # via # -r requirements/edx/base.txt # chem @@ -977,48 +936,43 @@ nodeenv==1.9.1 # via -r requirements/edx/base.txt numpy==1.26.4 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # chem # openedx-calc # scipy # shapely -oauthlib==3.3.1 +oauthlib==3.2.2 # via # -r requirements/edx/base.txt # django-oauth-toolkit # lti-consumer-xblock # requests-oauthlib # social-auth-core - # xblocks-contrib olxcleaner==0.3.0 # via -r requirements/edx/base.txt openai==0.28.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-enterprise openedx-atlas==0.7.0 # via # -r requirements/edx/base.txt - # enterprise-integrated-channels - # openedx-authz # openedx-forum -openedx-authz==0.20.0 - # via -r requirements/edx/base.txt 
openedx-calc==4.0.2 # via -r requirements/edx/base.txt -openedx-django-pyfs==3.8.0 +openedx-django-pyfs==3.7.0 # via # -r requirements/edx/base.txt # lti-consumer-xblock # xblock # xblocks-contrib -openedx-django-require==3.0.0 +openedx-django-require==2.1.0 # via -r requirements/edx/base.txt -openedx-django-wiki==3.1.1 +openedx-django-wiki==2.1.0 # via -r requirements/edx/base.txt -openedx-events==10.5.0 +openedx-events==10.2.0 # via # -r requirements/edx/base.txt # edx-enterprise @@ -1027,37 +981,38 @@ openedx-events==10.5.0 # edx-name-affirmation # event-tracking # ora2 -openedx-filters==2.1.0 +openedx-filters==2.0.1 # via # -r requirements/edx/base.txt # lti-consumer-xblock # ora2 -openedx-forum==0.3.8 +openedx-forum==0.3.6 # via -r requirements/edx/base.txt -openedx-learning==0.30.1 +openedx-learning==0.26.0 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt +openedx-mongodbproxy==0.2.2 + # via -r requirements/edx/base.txt optimizely-sdk==5.2.0 # via -r requirements/edx/base.txt -ora2==6.17.1 +ora2==6.16.1 # via -r requirements/edx/base.txt packaging==25.0 # via # -r requirements/edx/base.txt # drf-yasg # gunicorn - # kombu # pydata-sphinx-theme # snowflake-connector-python # sphinx -paramiko==4.0.0 +paramiko==3.5.1 # via # -r requirements/edx/base.txt # edx-enterprise path==16.11.0 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-i18n-tools # path-py @@ -1067,6 +1022,10 @@ path-py==12.5.0 # edx-enterprise # ora2 # staff-graded-xblock +pbr==6.1.1 + # via + # -r requirements/edx/base.txt + # stevedore pgpy==0.6.0 # via # -r requirements/edx/base.txt @@ -1075,13 +1034,13 @@ picobox==4.0.0 # via sphinxcontrib-openapi piexif==1.1.3 # via -r requirements/edx/base.txt -pillow==12.0.0 +pillow==11.2.1 # via # -r requirements/edx/base.txt # edx-enterprise # edx-organizations # edxval -platformdirs==4.5.0 +platformdirs==4.3.7 # via # 
-r requirements/edx/base.txt # snowflake-connector-python @@ -1089,11 +1048,11 @@ polib==1.2.0 # via # -r requirements/edx/base.txt # edx-i18n-tools -prompt-toolkit==3.0.52 +prompt-toolkit==3.0.51 # via # -r requirements/edx/base.txt # click-repl -propcache==0.4.1 +propcache==0.3.1 # via # -r requirements/edx/base.txt # aiohttp @@ -1103,7 +1062,7 @@ proto-plus==1.26.1 # -r requirements/edx/base.txt # google-api-core # google-cloud-firestore -protobuf==6.33.0 +protobuf==5.29.4 # via # -r requirements/edx/base.txt # google-api-core @@ -1111,7 +1070,7 @@ protobuf==6.33.0 # googleapis-common-protos # grpcio-status # proto-plus -psutil==7.1.2 +psutil==7.0.0 # via # -r requirements/edx/base.txt # edx-django-utils @@ -1125,38 +1084,38 @@ pyasn1-modules==0.4.2 # via # -r requirements/edx/base.txt # google-auth -pycasbin==2.4.0 - # via - # -r requirements/edx/base.txt - # casbin-django-orm-adapter - # openedx-authz pycountry==24.6.1 # via -r requirements/edx/base.txt -pycparser==2.23 +pycparser==2.22 # via # -r requirements/edx/base.txt # cffi -pycryptodomex==3.23.0 +pycryptodomex==3.22.0 # via # -r requirements/edx/base.txt # edx-proctoring # lti-consumer-xblock -pydantic==2.12.3 + # pyjwkest +pydantic==2.11.3 # via # -r requirements/edx/base.txt # camel-converter -pydantic-core==2.41.4 +pydantic-core==2.33.1 # via # -r requirements/edx/base.txt # pydantic pydata-sphinx-theme==0.15.4 # via sphinx-book-theme -pygments==2.19.2 +pygments==2.19.1 # via # accessible-pygments # pydata-sphinx-theme # sphinx # sphinx-mdinclude +pyjwkest==1.4.2 + # via + # -r requirements/edx/base.txt + # lti-consumer-xblock pyjwt[crypto]==2.10.1 # via # -r requirements/edx/base.txt @@ -1180,27 +1139,29 @@ pymemcache==4.0.0 # via -r requirements/edx/base.txt pymongo==4.4.0 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-opaque-keys # event-tracking # mongoengine # openedx-forum -pynacl==1.6.0 + # openedx-mongodbproxy 
+pynacl==1.5.0 # via # -r requirements/edx/base.txt # edx-django-utils # paramiko pynliner==0.8.0 # via -r requirements/edx/base.txt -pyopenssl==25.3.0 +pyopenssl==25.0.0 # via # -r requirements/edx/base.txt # snowflake-connector-python -pyparsing==3.2.5 +pyparsing==3.2.3 # via # -r requirements/edx/base.txt # chem + # httplib2 # openedx-calc pyrsistent==0.20.0 # via @@ -1231,7 +1192,7 @@ python-slugify==8.0.4 # via # -r requirements/edx/base.txt # code-annotations -python-swiftclient==4.8.0 +python-swiftclient==4.7.0 # via # -r requirements/edx/base.txt # ora2 @@ -1244,21 +1205,22 @@ python3-saml==1.16.0 pytz==2025.2 # via # -r requirements/edx/base.txt + # djangorestframework # drf-yasg # edx-completion # edx-enterprise # edx-proctoring # edx-submissions # edx-tincan-py35 - # enterprise-integrated-channels # event-tracking + # fs # olxcleaner # ora2 # snowflake-connector-python # xblock pyuca==1.2 # via -r requirements/edx/base.txt -pyyaml==6.0.3 +pyyaml==6.0.2 # via # -r requirements/edx/base.txt # code-annotations @@ -1272,22 +1234,22 @@ pyyaml==6.0.3 # xblock random2==1.0.2 # via -r requirements/edx/base.txt -recommender-xblock==3.1.0 +recommender-xblock==3.0.0 # via -r requirements/edx/base.txt -redis==7.0.1 +redis==5.2.1 # via # -r requirements/edx/base.txt # walrus -referencing==0.37.0 +referencing==0.36.2 # via # -r requirements/edx/base.txt # jsonschema # jsonschema-specifications -regex==2025.10.23 +regex==2024.11.6 # via # -r requirements/edx/base.txt # nltk -requests==2.32.5 +requests==2.32.3 # via # -r requirements/edx/base.txt # analytics-python @@ -1297,7 +1259,6 @@ requests==2.32.5 # edx-drf-extensions # edx-enterprise # edx-rest-api-client - # enterprise-integrated-channels # geoip2 # google-api-core # google-cloud-storage @@ -1306,6 +1267,7 @@ requests==2.32.5 # openai # openedx-forum # optimizely-sdk + # pyjwkest # pylti1p3 # python-swiftclient # requests-oauthlib @@ -1321,7 +1283,7 @@ requests-oauthlib==2.0.0 # social-auth-core 
roman-numerals-py==3.1.0 # via sphinx -rpds-py==0.28.0 +rpds-py==0.24.0 # via # -r requirements/edx/base.txt # jsonschema @@ -1336,7 +1298,7 @@ rules==3.5 # edx-enterprise # edx-proctoring # openedx-learning -s3transfer==0.14.0 +s3transfer==0.11.5 # via # -r requirements/edx/base.txt # boto3 @@ -1344,7 +1306,7 @@ sailthru-client==2.2.3 # via # -r requirements/edx/base.txt # edx-ace -scipy==1.16.3 +scipy==1.15.2 # via # -r requirements/edx/base.txt # chem @@ -1352,13 +1314,9 @@ semantic-version==2.10.0 # via # -r requirements/edx/base.txt # edx-drf-extensions -shapely==2.1.2 +shapely==2.1.0 # via -r requirements/edx/base.txt -simpleeval==1.0.3 - # via - # -r requirements/edx/base.txt - # pycasbin -simplejson==3.20.2 +simplejson==3.20.1 # via # -r requirements/edx/base.txt # sailthru-client @@ -1369,6 +1327,7 @@ six==1.17.0 # via # -r requirements/edx/base.txt # analytics-python + # codejail-includes # crowdsourcehinter-xblock # edx-ace # edx-auth-backends @@ -1381,6 +1340,7 @@ six==1.17.0 # fs # fs-s3fs # html5lib + # pyjwkest # python-dateutil # sphinxcontrib-httpdomain slumber==0.7.1 @@ -1388,26 +1348,22 @@ slumber==0.7.1 # -r requirements/edx/base.txt # edx-bulk-grades # edx-enterprise - # enterprise-integrated-channels smmap==5.0.2 # via gitdb -sniffio==1.3.1 - # via - # -r requirements/edx/base.txt - # anyio -snowballstemmer==3.0.1 +snowballstemmer==2.2.0 # via sphinx -snowflake-connector-python==4.0.0 +snowflake-connector-python==3.14.1 # via # -r requirements/edx/base.txt # edx-enterprise social-auth-app-django==5.4.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-auth-backends -social-auth-core==4.8.1 +social-auth-core==4.5.4 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-auth-backends # social-auth-app-django @@ -1419,7 +1375,7 @@ sortedcontainers==2.4.0 # via # -r requirements/edx/base.txt # snowflake-connector-python -soupsieve==2.8 
+soupsieve==2.7 # via # -r requirements/edx/base.txt # beautifulsoup4 @@ -1435,7 +1391,7 @@ sphinx==8.2.3 # sphinxcontrib-httpdomain # sphinxcontrib-openapi # sphinxext-rediraffe -sphinx-autoapi==3.6.1 +sphinx-autoapi==3.6.0 # via -r requirements/edx/doc.in sphinx-book-theme==1.1.4 # via -r requirements/edx/doc.in @@ -1443,7 +1399,7 @@ sphinx-design==0.6.1 # via -r requirements/edx/doc.in sphinx-mdinclude==0.6.2 # via sphinxcontrib-openapi -sphinx-reredirects==1.0.0 +sphinx-reredirects==0.1.6 # via -r requirements/edx/doc.in sphinxcontrib-applehelp==2.0.0 # via sphinx @@ -1461,15 +1417,15 @@ sphinxcontrib-qthelp==2.0.0 # via sphinx sphinxcontrib-serializinghtml==2.0.0 # via sphinx -sphinxext-rediraffe==0.3.0 +sphinxext-rediraffe==0.2.7 # via -r requirements/edx/doc.in sqlparse==0.5.3 # via # -r requirements/edx/base.txt # django -staff-graded-xblock==3.1.0 +staff-graded-xblock==3.0.1 # via -r requirements/edx/base.txt -stevedore==5.5.0 +stevedore==5.4.1 # via # -r requirements/edx/base.txt # code-annotations @@ -1477,15 +1433,15 @@ stevedore==5.5.0 # edx-django-utils # edx-enterprise # edx-opaque-keys -super-csv==4.1.0 +super-csv==4.0.1 # via # -r requirements/edx/base.txt # edx-bulk-grades -sympy==1.14.0 +sympy==1.13.3 # via # -r requirements/edx/base.txt # openedx-calc -testfixtures==10.0.0 +testfixtures==8.3.0 # via # -r requirements/edx/base.txt # edx-enterprise @@ -1497,25 +1453,21 @@ tinycss2==1.4.0 # via # -r requirements/edx/base.txt # bleach -tomlkit==0.13.3 +tomlkit==0.13.2 # via # -r requirements/edx/base.txt - # openedx-learning # snowflake-connector-python tqdm==4.67.1 # via # -r requirements/edx/base.txt # nltk # openai -typing-extensions==4.15.0 +typing-extensions==4.13.2 # via # -r requirements/edx/base.txt - # aiosignal - # anyio # beautifulsoup4 # django-countries # edx-opaque-keys - # grpcio # jwcrypto # pydantic # pydantic-core @@ -1525,7 +1477,7 @@ typing-extensions==4.15.0 # referencing # snowflake-connector-python # typing-inspection 
-typing-inspection==0.4.2 +typing-inspection==0.4.0 # via # -r requirements/edx/base.txt # pydantic @@ -1538,20 +1490,22 @@ unicodecsv==0.14.1 # via # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels unicodeit==0.7.5 # via -r requirements/edx/base.txt -uritemplate==4.2.0 +uritemplate==4.1.1 # via # -r requirements/edx/base.txt # drf-spectacular # drf-yasg -urllib3==2.5.0 + # google-api-python-client +urllib3==2.2.3 # via # -r requirements/edx/base.txt # botocore # elasticsearch # requests +user-util==1.1.0 + # via -r requirements/edx/base.txt vine==5.1.0 # via # -r requirements/edx/base.txt @@ -1562,19 +1516,15 @@ voluptuous==0.15.2 # via # -r requirements/edx/base.txt # ora2 -walrus==0.9.5 +walrus==0.9.4 # via # -r requirements/edx/base.txt # edx-event-bus-redis -wcmatch==10.1 - # via - # -r requirements/edx/base.txt - # pycasbin -wcwidth==0.2.14 +wcwidth==0.2.13 # via # -r requirements/edx/base.txt # prompt-toolkit -web-fragments==3.1.0 +web-fragments==3.0.0 # via # -r requirements/edx/base.txt # crowdsourcehinter-xblock @@ -1596,7 +1546,7 @@ wheel==0.45.1 # via # -r requirements/edx/base.txt # django-pipeline -wrapt==2.0.0 +wrapt==1.17.2 # via -r requirements/edx/base.txt xblock[django]==5.2.0 # via @@ -1614,31 +1564,31 @@ xblock[django]==5.2.0 # xblock-google-drive # xblock-utils # xblocks-contrib -xblock-drag-and-drop-v2==5.0.3 +xblock-drag-and-drop-v2==5.0.2 # via -r requirements/edx/base.txt xblock-google-drive==0.8.1 # via -r requirements/edx/base.txt -xblock-poll==1.15.1 +xblock-poll==1.14.1 # via -r requirements/edx/base.txt xblock-utils==4.0.0 # via # -r requirements/edx/base.txt # edx-sga # xblock-poll -xblocks-contrib==0.6.0 +xblocks-contrib==0.3.0 # via -r requirements/edx/base.txt xmlsec==1.3.14 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # python3-saml -xss-utils==0.8.0 +xss-utils==0.7.1 # via -r requirements/edx/base.txt -yarl==1.22.0 
+yarl==1.20.0 # via # -r requirements/edx/base.txt # aiohttp -zipp==3.23.0 +zipp==3.21.0 # via # -r requirements/edx/base.txt # importlib-metadata diff --git a/requirements/edx/testing.txt b/requirements/edx/testing.txt index f58ef16f9e31..6eb1248b07ee 100644 --- a/requirements/edx/testing.txt +++ b/requirements/edx/testing.txt @@ -4,18 +4,20 @@ # # make upgrade # +-e git+https://github.com/anupdhabarde/edx-proctoring-proctortrack.git@31c6c9923a51c903ae83760ecbbac191363aa2a2#egg=edx_proctoring_proctortrack + # via -r requirements/edx/base.txt acid-xblock==0.4.1 # via -r requirements/edx/base.txt aiohappyeyeballs==2.6.1 # via # -r requirements/edx/base.txt # aiohttp -aiohttp==3.13.2 +aiohttp==3.11.18 # via # -r requirements/edx/base.txt # geoip2 # openai -aiosignal==1.4.0 +aiosignal==1.3.2 # via # -r requirements/edx/base.txt # aiohttp @@ -29,22 +31,19 @@ aniso8601==10.0.1 # via # -r requirements/edx/base.txt # edx-tincan-py35 -annotated-doc==0.0.3 - # via fastapi annotated-types==0.7.0 # via # -r requirements/edx/base.txt # pydantic -anyio==4.11.0 +anyio==4.9.0 # via - # -r requirements/edx/base.txt - # httpx + # httpcore # starlette appdirs==1.4.4 # via # -r requirements/edx/base.txt # fs -asgiref==3.10.0 +asgiref==3.8.1 # via # -r requirements/edx/base.txt # django @@ -54,18 +53,17 @@ asn1crypto==1.5.1 # via # -r requirements/edx/base.txt # snowflake-connector-python -astroid==3.3.11 +astroid==3.3.9 # via # pylint # pylint-celery -attrs==25.4.0 +attrs==25.3.0 # via # -r requirements/edx/base.txt # aiohttp # edx-ace # jsonschema # lti-consumer-xblock - # openedx-authz # openedx-events # openedx-learning # referencing @@ -78,21 +76,21 @@ backoff==1.10.0 # via # -r requirements/edx/base.txt # analytics-python -bcrypt==5.0.0 +bcrypt==4.3.0 # via # -r requirements/edx/base.txt # paramiko -beautifulsoup4==4.14.2 +beautifulsoup4==4.13.4 # via # -r requirements/edx/base.txt # -r requirements/edx/testing.in # openedx-forum # pynliner -billiard==4.2.2 +billiard==4.2.1 # 
via # -r requirements/edx/base.txt # celery -bleach[css]==6.3.0 +bleach[css]==6.2.0 # via # -r requirements/edx/base.txt # edx-enterprise @@ -103,55 +101,46 @@ bleach[css]==6.3.0 # xblock-poll boto==2.49.0 # via -r requirements/edx/base.txt -boto3==1.40.62 +boto3==1.37.38 # via # -r requirements/edx/base.txt # django-ses # fs-s3fs # ora2 # snowflake-connector-python -botocore==1.40.62 +botocore==1.37.38 # via # -r requirements/edx/base.txt # boto3 # s3transfer # snowflake-connector-python -bracex==2.6 - # via - # -r requirements/edx/base.txt - # wcmatch bridgekeeper==0.9 # via -r requirements/edx/base.txt -cachecontrol==0.14.3 +cachecontrol==0.14.2 # via # -r requirements/edx/base.txt # firebase-admin -cachetools==6.2.1 +cachetools==5.5.2 # via # -r requirements/edx/base.txt # edxval # google-auth # tox -camel-converter[pydantic]==5.0.0 +camel-converter[pydantic]==4.0.1 # via # -r requirements/edx/base.txt # meilisearch -casbin-django-orm-adapter==1.7.0 +celery==5.5.1 # via - # -r requirements/edx/base.txt - # openedx-authz -celery==5.5.3 - # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # django-celery-results # django-user-tasks # edx-celeryutils # edx-enterprise - # enterprise-integrated-channels # event-tracking # openedx-learning -certifi==2025.10.5 +certifi==2025.1.31 # via # -r requirements/edx/base.txt # elasticsearch @@ -159,11 +148,12 @@ certifi==2025.10.5 # httpx # requests # snowflake-connector-python -cffi==2.0.0 +cffi==1.17.1 # via # -r requirements/edx/base.txt # cryptography # pynacl + # snowflake-connector-python chardet==5.2.0 # via # -r requirements/edx/base.txt @@ -171,14 +161,15 @@ chardet==5.2.0 # diff-cover # pysrt # tox -charset-normalizer==3.4.4 +charset-normalizer==2.0.12 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # requests # snowflake-connector-python -chem==2.0.0 +chem==1.3.0 # via -r requirements/edx/base.txt -click==8.3.0 
+click==8.1.8 # via # -r requirements/edx/base.txt # celery @@ -192,6 +183,7 @@ click==8.3.0 # import-linter # nltk # pact-python + # user-util # uvicorn click-didyoumean==0.3.1 # via @@ -199,7 +191,7 @@ click-didyoumean==0.3.1 # celery click-log==0.4.0 # via edx-lint -click-plugins==1.1.1.2 +click-plugins==1.1.1 # via # -r requirements/edx/base.txt # celery @@ -214,19 +206,18 @@ code-annotations==2.3.0 # edx-enterprise # edx-lint # edx-toggles -codejail-includes==2.0.0 +codejail-includes==1.0.0 # via -r requirements/edx/base.txt colorama==0.4.6 # via tox -coverage[toml]==7.11.0 +coverage[toml]==7.8.0 # via # -r requirements/edx/coverage.txt # pytest-cov crowdsourcehinter-xblock==0.8 # via -r requirements/edx/base.txt -cryptography==45.0.7 +cryptography==44.0.2 # via - # -c requirements/constraints.txt # -r requirements/edx/base.txt # django-fernet-fields-v2 # edx-enterprise @@ -236,6 +227,7 @@ cryptography==45.0.7 # pyjwt # pyopenssl # snowflake-connector-python + # social-auth-core cssselect==1.3.0 # via # -r requirements/edx/testing.in @@ -253,20 +245,18 @@ defusedxml==0.7.1 # ora2 # python3-openid # social-auth-core -diff-cover==9.7.1 +diff-cover==9.2.4 # via -r requirements/edx/coverage.txt dill==0.4.0 # via pylint -distlib==0.4.0 +distlib==0.3.9 # via virtualenv -django==5.2.7 +django==4.2.20 # via - # -c requirements/common_constraints.txt - # -c requirements/constraints.txt + # -c requirements/edx/../common_constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt - # casbin-django-orm-adapter # django-appconf - # django-autocomplete-light # django-celery-results # django-classy-tags # django-config-models @@ -283,7 +273,6 @@ django==5.2.7 # django-push-notifications # django-sekizai # django-ses - # django-simple-history # django-statici18n # django-storages # django-user-tasks @@ -319,12 +308,10 @@ django==5.2.7 # edxval # enmerkar # enmerkar-underscore - # enterprise-integrated-channels # event-tracking # help-tokens # 
jsonfield # lti-consumer-xblock - # openedx-authz # openedx-django-pyfs # openedx-django-wiki # openedx-events @@ -340,8 +327,6 @@ django-appconf==1.1.0 # via # -r requirements/edx/base.txt # django-statici18n -django-autocomplete-light==3.12.1 - # via -r requirements/edx/base.txt django-cache-memoize==0.2.1 # via # -r requirements/edx/base.txt @@ -357,9 +342,8 @@ django-config-models==2.9.0 # -r requirements/edx/base.txt # edx-enterprise # edx-name-affirmation - # enterprise-integrated-channels # lti-consumer-xblock -django-cors-headers==4.9.0 +django-cors-headers==4.7.0 # via -r requirements/edx/base.txt django-countries==7.6.1 # via @@ -378,8 +362,7 @@ django-fernet-fields-v2==0.9 # via # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels -django-filter==25.2 +django-filter==25.1 # via # -r requirements/edx/base.txt # edx-enterprise @@ -411,31 +394,28 @@ django-model-utils==5.0.0 # edx-submissions # edx-when # edxval - # enterprise-integrated-channels # ora2 # super-csv -django-mptt==0.18.0 +django-mptt==0.17.0 # via # -r requirements/edx/base.txt # openedx-django-wiki -django-multi-email-field==0.8.0 +django-multi-email-field==0.7.0 # via # -r requirements/edx/base.txt # edx-enterprise -django-mysql==4.19.0 +django-mysql==4.16.0 # via -r requirements/edx/base.txt django-oauth-toolkit==1.7.1 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels django-object-actions==5.0.0 # via # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels -django-pipeline==4.1.0 +django-pipeline==4.0.0 # via -r requirements/edx/base.txt django-push-notifications==3.2.1 # via @@ -449,14 +429,14 @@ django-sekizai==4.1.0 # openedx-django-wiki django-ses==4.4.0 # via -r requirements/edx/base.txt -django-simple-history==3.10.1 +django-simple-history==3.4.0 # via + # -c requirements/edx/../constraints.txt # -r 
requirements/edx/base.txt # edx-enterprise # edx-name-affirmation # edx-organizations # edx-proctoring - # enterprise-integrated-channels # ora2 django-statici18n==2.6.0 # via @@ -465,13 +445,14 @@ django-statici18n==2.6.0 # xblock-drag-and-drop-v2 # xblock-poll # xblocks-contrib -django-storages==1.14.6 +django-storages==1.14.3 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edxval -django-user-tasks==3.4.4 +django-user-tasks==3.3.0 # via -r requirements/edx/base.txt -django-waffle==5.0.0 +django-waffle==4.2.0 # via # -r requirements/edx/base.txt # edx-django-utils @@ -479,12 +460,14 @@ django-waffle==5.0.0 # edx-enterprise # edx-proctoring # edx-toggles -django-webpack-loader==3.2.1 +django-webpack-loader==0.7.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-proctoring -djangorestframework==3.16.1 +djangorestframework==3.14.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # django-config-models # django-user-tasks @@ -499,7 +482,6 @@ djangorestframework==3.16.1 # edx-organizations # edx-proctoring # edx-submissions - # openedx-authz # openedx-forum # openedx-learning # ora2 @@ -508,7 +490,7 @@ djangorestframework-xml==2.0.0 # via # -r requirements/edx/base.txt # edx-enterprise -dnspython==2.8.0 +dnspython==2.7.0 # via # -r requirements/edx/base.txt # pymongo @@ -520,21 +502,20 @@ drf-jwt==1.19.2 # edx-drf-extensions drf-spectacular==0.28.0 # via -r requirements/edx/base.txt -drf-yasg==1.21.11 +drf-yasg==1.21.10 # via # -r requirements/edx/base.txt # django-user-tasks # edx-api-doc-tools -edx-ace==1.15.0 +edx-ace==1.11.4 # via -r requirements/edx/base.txt -edx-api-doc-tools==2.1.0 +edx-api-doc-tools==2.0.0 # via # -r requirements/edx/base.txt # edx-name-affirmation - # openedx-authz -edx-auth-backends==4.6.2 +edx-auth-backends==4.5.0 # via -r requirements/edx/base.txt -edx-bulk-grades==1.2.0 +edx-bulk-grades==1.1.0 # via # -r requirements/edx/base.txt # 
staff-graded-xblock @@ -543,28 +524,27 @@ edx-ccx-keys==2.0.2 # -r requirements/edx/base.txt # lti-consumer-xblock # openedx-events -edx-celeryutils==1.4.0 +edx-celeryutils==1.3.0 # via # -r requirements/edx/base.txt # edx-name-affirmation # super-csv -edx-codejail==4.0.0 +edx-codejail==3.5.2 # via -r requirements/edx/base.txt -edx-completion==4.9 +edx-completion==4.7.11 # via -r requirements/edx/base.txt -edx-django-release-util==1.5.0 +edx-django-release-util==1.4.0 # via # -r requirements/edx/base.txt # edx-submissions # edxval -edx-django-sites-extensions==5.1.0 +edx-django-sites-extensions==4.2.0 # via -r requirements/edx/base.txt -edx-django-utils==8.0.1 +edx-django-utils==7.4.0 # via # -r requirements/edx/base.txt # django-config-models # edx-ace - # edx-auth-backends # edx-drf-extensions # edx-enterprise # edx-event-bus-kafka @@ -573,9 +553,7 @@ edx-django-utils==8.0.1 # edx-rest-api-client # edx-toggles # edx-when - # enterprise-integrated-channels # event-tracking - # openedx-authz # openedx-events # ora2 # super-csv @@ -590,27 +568,26 @@ edx-drf-extensions==10.6.0 # edx-rbac # edx-when # edxval - # enterprise-integrated-channels - # openedx-authz # openedx-learning -edx-enterprise==6.5.1 +edx-enterprise==5.12.7 # via - # -c requirements/constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt edx-event-bus-kafka==6.1.0 # via -r requirements/edx/base.txt edx-event-bus-redis==0.6.1 # via -r requirements/edx/base.txt -edx-i18n-tools==1.9.0 +edx-i18n-tools==1.5.0 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # ora2 # xblocks-contrib edx-lint==5.6.0 # via -r requirements/edx/testing.in -edx-milestones==1.1.0 +edx-milestones==0.6.0 # via -r requirements/edx/base.txt -edx-name-affirmation==3.0.2 +edx-name-affirmation==3.0.1 # via -r requirements/edx/base.txt edx-opaque-keys[django]==3.0.0 # via @@ -624,35 +601,32 @@ edx-opaque-keys[django]==3.0.0 # edx-organizations # edx-proctoring # edx-when - 
# enterprise-integrated-channels # lti-consumer-xblock - # openedx-authz # openedx-events # openedx-filters # ora2 - # xblocks-contrib -edx-organizations==7.3.0 - # via -r requirements/edx/base.txt -edx-proctoring==5.2.0 +edx-organizations==6.13.0 # via -r requirements/edx/base.txt -edx-rbac==2.1.0 +edx-proctoring==5.1.2 + # via + # -r requirements/edx/base.txt + # edx-proctoring-proctortrack +edx-rbac==1.10.0 # via # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels edx-rest-api-client==6.2.0 # via # -r requirements/edx/base.txt # edx-enterprise # edx-proctoring - # enterprise-integrated-channels -edx-search==4.3.0 +edx-search==4.1.3 # via # -r requirements/edx/base.txt # openedx-forum -edx-sga==0.26.0 +edx-sga==0.25.3 # via -r requirements/edx/base.txt -edx-submissions==3.12.1 +edx-submissions==3.10.0 # via # -r requirements/edx/base.txt # ora2 @@ -660,11 +634,9 @@ edx-tincan-py35==2.0.0 # via # -r requirements/edx/base.txt # edx-enterprise - # enterprise-integrated-channels -edx-toggles==5.4.1 +edx-toggles==5.3.0 # via # -r requirements/edx/base.txt - # edx-auth-backends # edx-completion # edx-enterprise # edx-event-bus-kafka @@ -674,16 +646,16 @@ edx-toggles==5.4.1 # edxval # event-tracking # ora2 -edx-when==3.0.0 +edx-when==2.5.1 # via # -r requirements/edx/base.txt # edx-proctoring -edxval==3.1.0 +edxval==2.10.0 # via -r requirements/edx/base.txt elasticsearch==7.9.1 # via - # -c requirements/common_constraints.txt - # -c requirements/constraints.txt + # -c requirements/edx/../common_constraints.txt + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-search # openedx-forum @@ -691,12 +663,11 @@ enmerkar==0.7.1 # via # -r requirements/edx/base.txt # enmerkar-underscore -enmerkar-underscore==2.4.0 - # via -r requirements/edx/base.txt -enterprise-integrated-channels==0.1.22 +enmerkar-underscore==2.3.1 # via -r requirements/edx/base.txt -event-tracking==3.3.0 +event-tracking==3.0.0 # via + # -c 
requirements/edx/../constraints.txt # -r requirements/edx/base.txt # edx-completion # edx-proctoring @@ -705,69 +676,84 @@ execnet==2.1.1 # via pytest-xdist factory-boy==3.3.3 # via -r requirements/edx/testing.in -faker==37.12.0 +faker==37.1.0 # via factory-boy -fastapi==0.120.2 +fastapi==0.115.12 # via pact-python -fastavro==1.12.1 +fastavro==1.10.0 # via # -r requirements/edx/base.txt # openedx-events -filelock==3.20.0 +filelock==3.18.0 # via # -r requirements/edx/base.txt # snowflake-connector-python # tox # virtualenv -firebase-admin==7.1.0 +firebase-admin==6.7.0 # via # -r requirements/edx/base.txt # edx-ace -freezegun==1.5.5 +freezegun==1.5.1 # via -r requirements/edx/testing.in -frozenlist==1.8.0 +frozenlist==1.6.0 # via # -r requirements/edx/base.txt # aiohttp # aiosignal -fs==2.4.16 +fs==2.0.27 # via # -r requirements/edx/base.txt # fs-s3fs # openedx-django-pyfs # xblock -fs-s3fs==1.1.1 +fs-s3fs==0.1.8 # via # -r requirements/edx/base.txt # openedx-django-pyfs -geoip2==5.1.0 +future==1.0.0 + # via + # -r requirements/edx/base.txt + # pyjwkest +geoip2==5.0.1 # via -r requirements/edx/base.txt glob2==0.7 # via -r requirements/edx/base.txt -google-api-core[grpc]==2.28.1 +google-api-core[grpc]==2.24.2 # via # -r requirements/edx/base.txt # firebase-admin + # google-api-python-client # google-cloud-core # google-cloud-firestore # google-cloud-storage -google-auth==2.42.0 +google-api-python-client==2.167.0 + # via + # -r requirements/edx/base.txt + # firebase-admin +google-auth==2.39.0 # via # -r requirements/edx/base.txt # google-api-core + # google-api-python-client + # google-auth-httplib2 # google-cloud-core # google-cloud-firestore # google-cloud-storage -google-cloud-core==2.5.0 +google-auth-httplib2==0.2.0 + # via + # -r requirements/edx/base.txt + # google-api-python-client +google-cloud-core==2.4.3 # via # -r requirements/edx/base.txt # google-cloud-firestore # google-cloud-storage -google-cloud-firestore==2.21.0 +google-cloud-firestore==2.20.2 # via # 
-r requirements/edx/base.txt # firebase-admin -google-cloud-storage==3.4.1 +google-cloud-storage==3.1.0 # via # -r requirements/edx/base.txt # firebase-admin @@ -780,90 +766,74 @@ google-resumable-media==2.7.2 # via # -r requirements/edx/base.txt # google-cloud-storage -googleapis-common-protos==1.71.0 +googleapis-common-protos==1.70.0 # via # -r requirements/edx/base.txt # google-api-core # grpcio-status -grimp==3.13 +grimp==3.8 # via import-linter -grpcio==1.76.0 +grpcio==1.71.0 # via # -r requirements/edx/base.txt # google-api-core # grpcio-status -grpcio-status==1.76.0 +grpcio-status==1.71.0 # via # -r requirements/edx/base.txt # google-api-core gunicorn==23.0.0 # via -r requirements/edx/base.txt -h11==0.16.0 +h11==0.14.0 # via - # -r requirements/edx/base.txt # httpcore # uvicorn -h2==4.3.0 - # via - # -r requirements/edx/base.txt - # httpx -help-tokens==3.2.0 +help-tokens==3.1.0 # via -r requirements/edx/base.txt -hpack==4.1.0 - # via - # -r requirements/edx/base.txt - # h2 html5lib==1.1 # via # -r requirements/edx/base.txt # ora2 -httpcore==1.0.9 +httpcore==0.16.3 + # via httpx +httplib2==0.22.0 # via # -r requirements/edx/base.txt - # httpx + # google-api-python-client + # google-auth-httplib2 httpretty==1.1.4 # via -r requirements/edx/testing.in -httpx[http2]==0.28.1 - # via - # -r requirements/edx/base.txt - # firebase-admin -hyperframe==6.1.0 - # via - # -r requirements/edx/base.txt - # h2 -icalendar==6.3.1 +httpx==0.23.3 + # via pact-python +icalendar==6.1.3 # via -r requirements/edx/base.txt -idna==3.11 +idna==3.10 # via # -r requirements/edx/base.txt # anyio - # httpx # optimizely-sdk # requests + # rfc3986 # snowflake-connector-python # yarl -import-linter==2.5.2 +import-linter==2.3 # via -r requirements/edx/testing.in -importlib-metadata==8.7.0 +importlib-metadata==8.6.1 # via -r requirements/edx/base.txt inflection==0.5.1 # via # -r requirements/edx/base.txt # drf-spectacular # drf-yasg -iniconfig==2.3.0 +iniconfig==2.1.0 # via pytest 
-invoke==2.2.1 - # via - # -r requirements/edx/base.txt - # paramiko ipaddress==1.0.23 # via -r requirements/edx/base.txt isodate==0.7.2 # via # -r requirements/edx/base.txt # python3-saml -isort==6.1.0 +isort==6.0.1 # via # -r requirements/edx/testing.in # pylint @@ -878,7 +848,7 @@ jmespath==1.0.1 # -r requirements/edx/base.txt # boto3 # botocore -joblib==1.5.2 +joblib==1.4.2 # via # -r requirements/edx/base.txt # nltk @@ -886,22 +856,21 @@ jsondiff==2.2.1 # via # -r requirements/edx/base.txt # edx-enterprise -jsonfield==3.2.0 +jsonfield==3.1.0 # via # -r requirements/edx/base.txt # edx-celeryutils # edx-enterprise # edx-proctoring # edx-submissions - # enterprise-integrated-channels # lti-consumer-xblock # ora2 -jsonschema==4.25.1 +jsonschema==4.23.0 # via # -r requirements/edx/base.txt # drf-spectacular # optimizely-sdk -jsonschema-specifications==2025.9.1 +jsonschema-specifications==2024.10.1 # via # -r requirements/edx/base.txt # jsonschema @@ -910,7 +879,7 @@ jwcrypto==1.5.6 # -r requirements/edx/base.txt # django-oauth-toolkit # pylti1p3 -kombu==5.5.4 +kombu==5.5.3 # via # -r requirements/edx/base.txt # celery @@ -927,11 +896,10 @@ loremipsum==1.0.5 # via # -r requirements/edx/base.txt # ora2 -lti-consumer-xblock==9.14.3 +lti-consumer-xblock==9.13.4 # via -r requirements/edx/base.txt lxml[html-clean]==5.3.2 # via - # -c requirements/constraints.txt # -r requirements/edx/base.txt # edx-i18n-tools # edxval @@ -944,7 +912,7 @@ lxml[html-clean]==5.3.2 # python3-saml # xblock # xmlsec -lxml-html-clean==0.4.3 +lxml-html-clean==0.4.2 # via # -r requirements/edx/base.txt # lxml @@ -957,13 +925,14 @@ mako==1.3.10 # lti-consumer-xblock # xblock # xblock-utils -markdown==3.9 +markdown==3.3.7 # via + # -c requirements/edx/../constraints.txt # -r requirements/edx/base.txt # openedx-django-wiki # staff-graded-xblock # xblock-poll -markupsafe==3.0.3 +markupsafe==3.0.2 # via # -r requirements/edx/base.txt # -r requirements/edx/coverage.txt @@ -972,13 +941,13 @@ 
markupsafe==3.0.3
     #   mako
     #   openedx-calc
     #   xblock
-maxminddb==2.8.2
+maxminddb==2.6.3
     # via
     #   -r requirements/edx/base.txt
     #   geoip2
 mccabe==0.7.0
     # via pylint
-meilisearch==0.37.1
+meilisearch==0.34.1
     # via
     #   -r requirements/edx/base.txt
     #   edx-search
@@ -990,7 +959,7 @@ monotonic==1.6
     # via
     #   -r requirements/edx/base.txt
     #   analytics-python
-more-itertools==10.8.0
+more-itertools==10.6.0
     # via
     #   -r requirements/edx/base.txt
     #   cssutils
@@ -998,11 +967,11 @@ mpmath==1.3.0
     # via
     #   -r requirements/edx/base.txt
     #   sympy
-msgpack==1.1.2
+msgpack==1.1.0
     # via
     #   -r requirements/edx/base.txt
     #   cachecontrol
-multidict==6.7.0
+multidict==6.4.3
     # via
     #   -r requirements/edx/base.txt
     #   aiohttp
@@ -1011,11 +980,13 @@ mysqlclient==2.2.7
     # via
     #   -r requirements/edx/base.txt
     #   openedx-forum
-nh3==0.3.1
+newrelic==10.9.0
     # via
     #   -r requirements/edx/base.txt
-    #   xblocks-contrib
-nltk==3.9.2
+    #   edx-django-utils
+nh3==0.2.21
+    # via -r requirements/edx/base.txt
+nltk==3.9.1
     # via
     #   -r requirements/edx/base.txt
     #   chem
@@ -1023,48 +994,43 @@ nodeenv==1.9.1
     # via -r requirements/edx/base.txt
 numpy==1.26.4
     # via
-    #   -c requirements/constraints.txt
+    #   -c requirements/edx/../constraints.txt
     #   -r requirements/edx/base.txt
     #   chem
     #   openedx-calc
     #   scipy
     #   shapely
-oauthlib==3.3.1
+oauthlib==3.2.2
     # via
     #   -r requirements/edx/base.txt
     #   django-oauth-toolkit
     #   lti-consumer-xblock
     #   requests-oauthlib
     #   social-auth-core
-    #   xblocks-contrib
 olxcleaner==0.3.0
     # via -r requirements/edx/base.txt
 openai==0.28.1
     # via
-    #   -c requirements/constraints.txt
+    #   -c requirements/edx/../constraints.txt
     #   -r requirements/edx/base.txt
     #   edx-enterprise
 openedx-atlas==0.7.0
     # via
     #   -r requirements/edx/base.txt
-    #   enterprise-integrated-channels
-    #   openedx-authz
     #   openedx-forum
-openedx-authz==0.20.0
-    # via -r requirements/edx/base.txt
 openedx-calc==4.0.2
     # via -r requirements/edx/base.txt
-openedx-django-pyfs==3.8.0
+openedx-django-pyfs==3.7.0
     # via
     #   -r requirements/edx/base.txt
     #   lti-consumer-xblock
     #   xblock
     #   xblocks-contrib
-openedx-django-require==3.0.0
+openedx-django-require==2.1.0
     # via -r requirements/edx/base.txt
-openedx-django-wiki==3.1.1
+openedx-django-wiki==2.1.0
     # via -r requirements/edx/base.txt
-openedx-events==10.5.0
+openedx-events==10.2.0
     # via
     #   -r requirements/edx/base.txt
     #   edx-enterprise
@@ -1073,42 +1039,41 @@ openedx-events==10.5.0
     #   edx-name-affirmation
     #   event-tracking
     #   ora2
-openedx-filters==2.1.0
+openedx-filters==2.0.1
     # via
     #   -r requirements/edx/base.txt
     #   lti-consumer-xblock
     #   ora2
-openedx-forum==0.3.8
+openedx-forum==0.3.6
     # via -r requirements/edx/base.txt
-openedx-learning==0.30.1
+openedx-learning==0.26.0
     # via
-    #   -c requirements/constraints.txt
+    #   -c requirements/edx/../constraints.txt
     #   -r requirements/edx/base.txt
+openedx-mongodbproxy==0.2.2
+    # via -r requirements/edx/base.txt
 optimizely-sdk==5.2.0
     # via -r requirements/edx/base.txt
-ora2==6.17.1
+ora2==6.16.1
     # via -r requirements/edx/base.txt
 packaging==25.0
     # via
     #   -r requirements/edx/base.txt
     #   drf-yasg
     #   gunicorn
-    #   kombu
     #   pyproject-api
     #   pytest
     #   snowflake-connector-python
     #   tox
-pact-python==1.6.0
-    # via
-    #   -c requirements/constraints.txt
-    #   -r requirements/edx/testing.in
-paramiko==4.0.0
+pact-python==2.0.1
+    # via -r requirements/edx/testing.in
+paramiko==3.5.1
     # via
     #   -r requirements/edx/base.txt
     #   edx-enterprise
 path==16.11.0
     # via
-    #   -c requirements/constraints.txt
+    #   -c requirements/edx/../constraints.txt
     #   -r requirements/edx/base.txt
     #   edx-i18n-tools
     #   path-py
@@ -1118,42 +1083,45 @@ path-py==12.5.0
     #   edx-enterprise
     #   ora2
     #   staff-graded-xblock
+pbr==6.1.1
+    # via
+    #   -r requirements/edx/base.txt
+    #   stevedore
 pgpy==0.6.0
     # via
     #   -r requirements/edx/base.txt
     #   edx-enterprise
 piexif==1.1.3
     # via -r requirements/edx/base.txt
-pillow==12.0.0
+pillow==11.2.1
     # via
     #   -r requirements/edx/base.txt
     #   edx-enterprise
     #   edx-organizations
     #   edxval
-platformdirs==4.5.0
+platformdirs==4.3.7
     # via
     #   -r requirements/edx/base.txt
     #   pylint
     #   snowflake-connector-python
     #   tox
     #   virtualenv
-pluggy==1.6.0
+pluggy==1.5.0
     # via
     #   -r requirements/edx/coverage.txt
     #   diff-cover
     #   pytest
-    #   pytest-cov
     #   tox
 polib==1.2.0
     # via
     #   -r requirements/edx/base.txt
     #   -r requirements/edx/testing.in
     #   edx-i18n-tools
-prompt-toolkit==3.0.52
+prompt-toolkit==3.0.51
     # via
     #   -r requirements/edx/base.txt
     #   click-repl
-propcache==0.4.1
+propcache==0.3.1
     # via
     #   -r requirements/edx/base.txt
     #   aiohttp
@@ -1163,7 +1131,7 @@ proto-plus==1.26.1
     #   -r requirements/edx/base.txt
     #   google-api-core
     #   google-cloud-firestore
-protobuf==6.33.0
+protobuf==5.29.4
     # via
     #   -r requirements/edx/base.txt
     #   google-api-core
@@ -1171,7 +1139,7 @@ protobuf==6.33.0
     #   googleapis-common-protos
     #   grpcio-status
     #   proto-plus
-psutil==7.1.2
+psutil==7.0.0
     # via
     #   -r requirements/edx/base.txt
     #   edx-django-utils
@@ -1189,39 +1157,39 @@ pyasn1-modules==0.4.2
     # via
     #   -r requirements/edx/base.txt
     #   google-auth
-pycasbin==2.4.0
-    # via
-    #   -r requirements/edx/base.txt
-    #   casbin-django-orm-adapter
-    #   openedx-authz
 pycodestyle==2.8.0
     # via
-    #   -c requirements/constraints.txt
+    #   -c requirements/edx/../constraints.txt
     #   -r requirements/edx/testing.in
 pycountry==24.6.1
     # via -r requirements/edx/base.txt
-pycparser==2.23
+pycparser==2.22
     # via
     #   -r requirements/edx/base.txt
     #   cffi
-pycryptodomex==3.23.0
+pycryptodomex==3.22.0
     # via
     #   -r requirements/edx/base.txt
     #   edx-proctoring
     #   lti-consumer-xblock
-pydantic==2.12.3
+    #   pyjwkest
+pydantic==2.11.3
     # via
     #   -r requirements/edx/base.txt
     #   camel-converter
     #   fastapi
-pydantic-core==2.41.4
+pydantic-core==2.33.1
     # via
     #   -r requirements/edx/base.txt
     #   pydantic
-pygments==2.19.2
+pygments==2.19.1
     # via
     #   -r requirements/edx/coverage.txt
     #   diff-cover
+pyjwkest==1.4.2
+    # via
+    #   -r requirements/edx/base.txt
+    #   lti-consumer-xblock
 pyjwt[crypto]==2.10.1
     # via
     #   -r requirements/edx/base.txt
@@ -1239,7 +1207,7 @@ pylatexenc==2.10
     # via
     #   -r requirements/edx/base.txt
     #   olxcleaner
-pylint==3.3.9
+pylint==3.3.6
     # via
     #   edx-lint
     #   pylint-celery
@@ -1250,7 +1218,7 @@ pylint-celery==0.3
     # via edx-lint
 pylint-django==2.6.1
     # via edx-lint
-pylint-plugin-utils==0.9.0
+pylint-plugin-utils==0.8.2
     # via
     #   pylint-celery
     #   pylint-django
@@ -1262,29 +1230,31 @@ pymemcache==4.0.0
     # via -r requirements/edx/base.txt
 pymongo==4.4.0
     # via
-    #   -c requirements/constraints.txt
+    #   -c requirements/edx/../constraints.txt
     #   -r requirements/edx/base.txt
     #   edx-opaque-keys
     #   event-tracking
     #   mongoengine
     #   openedx-forum
-pynacl==1.6.0
+    #   openedx-mongodbproxy
+pynacl==1.5.0
     # via
     #   -r requirements/edx/base.txt
     #   edx-django-utils
     #   paramiko
 pynliner==0.8.0
     # via -r requirements/edx/base.txt
-pyopenssl==25.3.0
+pyopenssl==25.0.0
     # via
     #   -r requirements/edx/base.txt
     #   snowflake-connector-python
-pyparsing==3.2.5
+pyparsing==3.2.3
     # via
     #   -r requirements/edx/base.txt
     #   chem
+    #   httplib2
     #   openedx-calc
-pyproject-api==1.10.0
+pyproject-api==1.9.0
     # via tox
 pyquery==2.0.1
     # via -r requirements/edx/testing.in
@@ -1309,7 +1279,7 @@ pytest==8.2.0
     #   pytest-xdist
 pytest-attrib==0.1.3
     # via -r requirements/edx/testing.in
-pytest-cov==7.0.0
+pytest-cov==6.1.1
     # via -r requirements/edx/testing.in
 pytest-django==4.11.1
     # via -r requirements/edx/testing.in
@@ -1319,9 +1289,9 @@ pytest-metadata==3.1.1
     # via
     #   -r requirements/edx/testing.in
     #   pytest-json-report
-pytest-randomly==4.0.1
+pytest-randomly==3.16.0
     # via -r requirements/edx/testing.in
-pytest-xdist[psutil]==3.8.0
+pytest-xdist[psutil]==3.6.1
     # via -r requirements/edx/testing.in
 python-dateutil==2.9.0.post0
     # via
@@ -1345,7 +1315,7 @@ python-slugify==8.0.4
     # via
     #   -r requirements/edx/base.txt
     #   code-annotations
-python-swiftclient==4.8.0
+python-swiftclient==4.7.0
     # via
     #   -r requirements/edx/base.txt
     #   ora2
@@ -1358,21 +1328,22 @@ python3-saml==1.16.0
 pytz==2025.2
     # via
     #   -r requirements/edx/base.txt
+    #   djangorestframework
     #   drf-yasg
     #   edx-completion
     #   edx-enterprise
     #   edx-proctoring
     #   edx-submissions
     #   edx-tincan-py35
-    #   enterprise-integrated-channels
     #   event-tracking
+    #   fs
     #   olxcleaner
     #   ora2
     #   snowflake-connector-python
     #   xblock
 pyuca==1.2
     # via -r requirements/edx/base.txt
-pyyaml==6.0.3
+pyyaml==6.0.2
     # via
     #   -r requirements/edx/base.txt
     #   code-annotations
@@ -1384,22 +1355,22 @@ pyyaml==6.0.3
     #   xblock
 random2==1.0.2
     # via -r requirements/edx/base.txt
-recommender-xblock==3.1.0
+recommender-xblock==3.0.0
     # via -r requirements/edx/base.txt
-redis==7.0.1
+redis==5.2.1
     # via
     #   -r requirements/edx/base.txt
     #   walrus
-referencing==0.37.0
+referencing==0.36.2
     # via
     #   -r requirements/edx/base.txt
     #   jsonschema
     #   jsonschema-specifications
-regex==2025.10.23
+regex==2024.11.6
     # via
     #   -r requirements/edx/base.txt
     #   nltk
-requests==2.32.5
+requests==2.32.3
     # via
     #   -r requirements/edx/base.txt
     #   analytics-python
@@ -1409,7 +1380,6 @@ requests==2.32.5
     #   edx-drf-extensions
     #   edx-enterprise
     #   edx-rest-api-client
-    #   enterprise-integrated-channels
     #   geoip2
     #   google-api-core
     #   google-cloud-storage
@@ -1419,6 +1389,7 @@ requests==2.32.5
     #   openedx-forum
     #   optimizely-sdk
     #   pact-python
+    #   pyjwkest
     #   pylti1p3
     #   python-swiftclient
     #   requests-oauthlib
@@ -1431,7 +1402,9 @@ requests-oauthlib==2.0.0
     # via
     #   -r requirements/edx/base.txt
     #   social-auth-core
-rpds-py==0.28.0
+rfc3986[idna2008]==1.5.0
+    # via httpx
+rpds-py==0.24.0
     # via
     #   -r requirements/edx/base.txt
     #   jsonschema
@@ -1446,7 +1419,7 @@ rules==3.5
     #   edx-enterprise
     #   edx-proctoring
     #   openedx-learning
-s3transfer==0.14.0
+s3transfer==0.11.5
     # via
     #   -r requirements/edx/base.txt
     #   boto3
@@ -1454,7 +1427,7 @@ sailthru-client==2.2.3
     # via
     #   -r requirements/edx/base.txt
     #   edx-ace
-scipy==1.16.3
+scipy==1.15.2
     # via
     #   -r requirements/edx/base.txt
     #   chem
@@ -1462,25 +1435,22 @@ semantic-version==2.10.0
     # via
     #   -r requirements/edx/base.txt
     #   edx-drf-extensions
-shapely==2.1.2
+shapely==2.1.0
     # via -r requirements/edx/base.txt
-simpleeval==1.0.3
-    # via
-    #   -r requirements/edx/base.txt
-    #   pycasbin
-simplejson==3.20.2
+simplejson==3.20.1
     # via
     #   -r requirements/edx/base.txt
     #   sailthru-client
     #   super-csv
     #   xblock
     #   xblock-utils
-singledispatch==4.1.2
+singledispatch==4.1.1
     # via -r requirements/edx/testing.in
 six==1.17.0
     # via
     #   -r requirements/edx/base.txt
     #   analytics-python
+    #   codejail-includes
     #   crowdsourcehinter-xblock
     #   edx-ace
     #   edx-auth-backends
@@ -1495,28 +1465,30 @@ six==1.17.0
     #   fs-s3fs
     #   html5lib
     #   pact-python
+    #   pyjwkest
     #   python-dateutil
 slumber==0.7.1
     # via
     #   -r requirements/edx/base.txt
     #   edx-bulk-grades
     #   edx-enterprise
-    #   enterprise-integrated-channels
 sniffio==1.3.1
     # via
-    #   -r requirements/edx/base.txt
     #   anyio
-snowflake-connector-python==4.0.0
+    #   httpcore
+    #   httpx
+snowflake-connector-python==3.14.1
     # via
     #   -r requirements/edx/base.txt
     #   edx-enterprise
 social-auth-app-django==5.4.1
     # via
-    #   -c requirements/constraints.txt
+    #   -c requirements/edx/../constraints.txt
     #   -r requirements/edx/base.txt
     #   edx-auth-backends
-social-auth-core==4.8.1
+social-auth-core==4.5.4
     # via
+    #   -c requirements/edx/../constraints.txt
     #   -r requirements/edx/base.txt
     #   edx-auth-backends
     #   social-auth-app-django
@@ -1528,7 +1500,7 @@ sortedcontainers==2.4.0
     # via
     #   -r requirements/edx/base.txt
     #   snowflake-connector-python
-soupsieve==2.8
+soupsieve==2.7
     # via
     #   -r requirements/edx/base.txt
     #   beautifulsoup4
@@ -1536,11 +1508,11 @@ sqlparse==0.5.3
     # via
     #   -r requirements/edx/base.txt
     #   django
-staff-graded-xblock==3.1.0
+staff-graded-xblock==3.0.1
     # via -r requirements/edx/base.txt
-starlette==0.49.1
+starlette==0.46.2
     # via fastapi
-stevedore==5.5.0
+stevedore==5.4.1
     # via
     #   -r requirements/edx/base.txt
     #   code-annotations
@@ -1548,15 +1520,15 @@ stevedore==5.5.0
     #   edx-django-utils
     #   edx-enterprise
     #   edx-opaque-keys
-super-csv==4.1.0
+super-csv==4.0.1
     # via
     #   -r requirements/edx/base.txt
     #   edx-bulk-grades
-sympy==1.14.0
+sympy==1.13.3
     # via
     #   -r requirements/edx/base.txt
     #   openedx-calc
-testfixtures==10.0.0
+testfixtures==8.3.0
     # via
     #   -r requirements/edx/base.txt
     #   -r requirements/edx/testing.in
@@ -1569,30 +1541,27 @@ tinycss2==1.4.0
     # via
     #   -r requirements/edx/base.txt
     #   bleach
-tomlkit==0.13.3
+tomlkit==0.13.2
     # via
     #   -r requirements/edx/base.txt
-    #   openedx-learning
     #   pylint
     #   snowflake-connector-python
-tox==4.32.0
+tox==4.25.0
     # via -r requirements/edx/testing.in
 tqdm==4.67.1
     # via
     #   -r requirements/edx/base.txt
     #   nltk
     #   openai
-typing-extensions==4.15.0
+typing-extensions==4.13.2
     # via
     #   -r requirements/edx/base.txt
-    #   aiosignal
     #   anyio
     #   beautifulsoup4
     #   django-countries
     #   edx-opaque-keys
     #   fastapi
     #   grimp
-    #   grpcio
     #   import-linter
     #   jwcrypto
     #   pydantic
@@ -1601,9 +1570,8 @@ typing-extensions==4.15.0
     #   pyopenssl
     #   referencing
     #   snowflake-connector-python
-    #   starlette
     #   typing-inspection
-typing-inspection==0.4.2
+typing-inspection==0.4.0
     # via
     #   -r requirements/edx/base.txt
     #   pydantic
@@ -1617,24 +1585,26 @@ unicodecsv==0.14.1
     # via
     #   -r requirements/edx/base.txt
     #   edx-enterprise
-    #   enterprise-integrated-channels
 unicodeit==0.7.5
     # via -r requirements/edx/base.txt
 unidiff==0.7.5
     # via -r requirements/edx/testing.in
-uritemplate==4.2.0
+uritemplate==4.1.1
     # via
     #   -r requirements/edx/base.txt
     #   drf-spectacular
     #   drf-yasg
-urllib3==2.5.0
+    #   google-api-python-client
+urllib3==2.2.3
     # via
     #   -r requirements/edx/base.txt
     #   botocore
     #   elasticsearch
     #   pact-python
     #   requests
-uvicorn==0.38.0
+user-util==1.1.0
+    # via -r requirements/edx/base.txt
+uvicorn==0.34.2
     # via pact-python
 vine==5.1.0
     # via
@@ -1642,25 +1612,21 @@ vine==5.1.0
     #   amqp
     #   celery
     #   kombu
-virtualenv==20.35.4
+virtualenv==20.30.0
     # via tox
 voluptuous==0.15.2
     # via
     #   -r requirements/edx/base.txt
     #   ora2
-walrus==0.9.5
+walrus==0.9.4
     # via
     #   -r requirements/edx/base.txt
     #   edx-event-bus-redis
-wcmatch==10.1
-    # via
-    #   -r requirements/edx/base.txt
-    #   pycasbin
-wcwidth==0.2.14
+wcwidth==0.2.13
     # via
     #   -r requirements/edx/base.txt
     #   prompt-toolkit
-web-fragments==3.1.0
+web-fragments==3.0.0
     # via
     #   -r requirements/edx/base.txt
     #   crowdsourcehinter-xblock
@@ -1682,7 +1648,7 @@ wheel==0.45.1
     # via
     #   -r requirements/edx/base.txt
     #   django-pipeline
-wrapt==2.0.0
+wrapt==1.17.2
     # via -r requirements/edx/base.txt
 xblock[django]==5.2.0
     # via
@@ -1700,31 +1666,31 @@ xblock[django]==5.2.0
     #   xblock-google-drive
     #   xblock-utils
     #   xblocks-contrib
-xblock-drag-and-drop-v2==5.0.3
+xblock-drag-and-drop-v2==5.0.2
     # via -r requirements/edx/base.txt
 xblock-google-drive==0.8.1
     # via -r requirements/edx/base.txt
-xblock-poll==1.15.1
+xblock-poll==1.14.1
     # via -r requirements/edx/base.txt
 xblock-utils==4.0.0
     # via
     #   -r requirements/edx/base.txt
     #   edx-sga
     #   xblock-poll
-xblocks-contrib==0.6.0
+xblocks-contrib==0.3.0
     # via -r requirements/edx/base.txt
 xmlsec==1.3.14
     # via
-    #   -c requirements/constraints.txt
+    #   -c requirements/edx/../constraints.txt
     #   -r requirements/edx/base.txt
     #   python3-saml
-xss-utils==0.8.0
+xss-utils==0.7.1
     # via -r requirements/edx/base.txt
-yarl==1.22.0
+yarl==1.20.0
     # via
     #   -r requirements/edx/base.txt
     #   aiohttp
-zipp==3.23.0
+zipp==3.21.0
     # via
     #   -r requirements/edx/base.txt
     #   importlib-metadata
diff --git a/scripts/user_retirement/requirements/base.txt b/scripts/user_retirement/requirements/base.txt
index b726b8a49f40..e4fb71e54dc5 100644
--- a/scripts/user_retirement/requirements/base.txt
+++ b/scripts/user_retirement/requirements/base.txt
@@ -4,94 +4,96 @@
 #
 #    make upgrade
 #
-asgiref==3.10.0
+asgiref==3.8.1
     # via django
-attrs==25.4.0
+attrs==25.3.0
     # via zeep
 backoff==2.2.1
     # via -r scripts/user_retirement/requirements/base.in
-boto3==1.40.62
+boto3==1.37.38
     # via -r scripts/user_retirement/requirements/base.in
-botocore==1.40.62
+botocore==1.37.38
     # via
     #   boto3
     #   s3transfer
-cachetools==6.2.1
+cachetools==5.5.2
     # via google-auth
-certifi==2025.10.5
+certifi==2025.1.31
     # via requests
-cffi==2.0.0
+cffi==1.17.1
     # via
     #   cryptography
     #   pynacl
-charset-normalizer==3.4.4
-    # via requests
-click==8.3.0
+charset-normalizer==2.0.12
+    # via
+    #   -c scripts/user_retirement/requirements/../../../requirements/constraints.txt
+    #   requests
+click==8.1.8
     # via
     #   -r scripts/user_retirement/requirements/base.in
     #   edx-django-utils
-cryptography==45.0.7
+cryptography==44.0.2
+    # via pyjwt
+django==4.2.20
     # via
-    #   -c requirements/constraints.txt
-    #   pyjwt
-django==5.2.7
-    # via
-    #   -c requirements/common_constraints.txt
-    #   -c requirements/constraints.txt
+    #   -c scripts/user_retirement/requirements/../../../requirements/common_constraints.txt
+    #   -c scripts/user_retirement/requirements/../../../requirements/constraints.txt
     #   django-crum
     #   django-waffle
     #   edx-django-utils
 django-crum==0.7.9
     # via edx-django-utils
-django-waffle==5.0.0
+django-waffle==4.2.0
     # via edx-django-utils
-edx-django-utils==8.0.1
+edx-django-utils==7.4.0
     # via edx-rest-api-client
 edx-rest-api-client==6.2.0
     # via -r scripts/user_retirement/requirements/base.in
-google-api-core==2.28.1
+google-api-core==2.24.2
     # via google-api-python-client
-google-api-python-client==2.185.0
+google-api-python-client==2.167.0
     # via -r scripts/user_retirement/requirements/base.in
-google-auth==2.42.0
+google-auth==2.39.0
     # via
     #   google-api-core
     #   google-api-python-client
     #   google-auth-httplib2
 google-auth-httplib2==0.2.0
     # via google-api-python-client
-googleapis-common-protos==1.71.0
+googleapis-common-protos==1.70.0
     # via google-api-core
-httplib2==0.31.0
+httplib2==0.22.0
     # via
     #   google-api-python-client
     #   google-auth-httplib2
-idna==3.11
+idna==3.10
     # via requests
 isodate==0.7.2
     # via zeep
-jenkinsapi==0.3.16
+jenkinsapi==0.3.14
     # via -r scripts/user_retirement/requirements/base.in
 jmespath==1.0.1
     # via
     #   boto3
     #   botocore
 lxml==5.3.2
-    # via
-    #   -c requirements/constraints.txt
-    #   zeep
-more-itertools==10.8.0
+    # via zeep
+more-itertools==10.6.0
     # via simple-salesforce
-platformdirs==4.5.0
+newrelic==10.9.0
+    # via edx-django-utils
+pbr==6.1.1
+    # via stevedore
+platformdirs==4.3.7
     # via zeep
 proto-plus==1.26.1
     # via google-api-core
-protobuf==6.33.0
+protobuf==6.30.2
     # via
     #   google-api-core
     #   googleapis-common-protos
     #   proto-plus
-psutil==7.1.2
+psutil==7.0.0
     # via edx-django-utils
 pyasn1==0.6.1
     # via
@@ -99,15 +101,15 @@ pyasn1==0.6.1
     #   rsa
 pyasn1-modules==0.4.2
     # via google-auth
-pycparser==2.23
+pycparser==2.22
     # via cffi
 pyjwt[crypto]==2.10.1
     # via
     #   edx-rest-api-client
     #   simple-salesforce
-pynacl==1.6.0
+pynacl==1.5.0
     # via edx-django-utils
-pyparsing==3.2.5
+pyparsing==3.2.3
     # via httplib2
 python-dateutil==2.9.0.post0
     # via botocore
@@ -115,9 +117,9 @@ pytz==2025.2
     # via
     #   jenkinsapi
     #   zeep
-pyyaml==6.0.3
+pyyaml==6.0.2
     # via -r scripts/user_retirement/requirements/base.in
-requests==2.32.5
+requests==2.32.3
     # via
     #   -r scripts/user_retirement/requirements/base.in
     #   edx-rest-api-client
@@ -127,33 +129,39 @@ requests==2.32.5
     #   requests-toolbelt
     #   simple-salesforce
     #   zeep
-requests-file==3.0.1
+requests-file==2.1.0
     # via zeep
 requests-toolbelt==1.0.0
     # via zeep
 rsa==4.9.1
     # via google-auth
-s3transfer==0.14.0
+s3transfer==0.11.5
     # via boto3
-simple-salesforce==1.12.9
+simple-salesforce==1.12.6
     # via -r scripts/user_retirement/requirements/base.in
-simplejson==3.20.2
+simplejson==3.20.1
     # via -r scripts/user_retirement/requirements/base.in
 six==1.17.0
-    # via python-dateutil
+    # via
+    #   jenkinsapi
+    #   python-dateutil
 sqlparse==0.5.3
     # via django
-stevedore==5.5.0
+stevedore==5.4.1
     # via edx-django-utils
-typing-extensions==4.15.0
+typing-extensions==4.13.2
     # via simple-salesforce
 unicodecsv==0.14.1
     # via -r scripts/user_retirement/requirements/base.in
-uritemplate==4.2.0
+uritemplate==4.1.1
     # via google-api-python-client
-urllib3==2.5.0
+urllib3==1.26.20
     # via
+    #   -r scripts/user_retirement/requirements/base.in
     #   botocore
     #   requests
-zeep==4.3.2
+zeep==4.3.1
     # via simple-salesforce
+
+# The following packages are considered to be unsafe in a requirements file:
+# setuptools
diff --git a/scripts/xblock/requirements.txt b/scripts/xblock/requirements.txt
index 533cd92824e4..631204c34cf5 100644
--- a/scripts/xblock/requirements.txt
+++ b/scripts/xblock/requirements.txt
@@ -4,13 +4,15 @@
 #
 #    make upgrade
 #
-certifi==2025.10.5
+certifi==2025.1.31
     # via requests
-charset-normalizer==3.4.4
+charset-normalizer==2.0.12
+    # via
+    #   -c scripts/xblock/../../requirements/constraints.txt
+    #   requests
+idna==3.10
     # via requests
-idna==3.11
-    # via requests
-requests==2.32.5
+requests==2.32.3
     # via -r scripts/xblock/requirements.in
-urllib3==2.5.0
+urllib3==2.2.3
     # via requests
diff --git a/xmodule/capa/safe_exec/tests/test_safe_exec.py b/xmodule/capa/safe_exec/tests/test_safe_exec.py
index d7679b66aa5e..d09f8c9d9ba7 100644
--- a/xmodule/capa/safe_exec/tests/test_safe_exec.py
+++ b/xmodule/capa/safe_exec/tests/test_safe_exec.py
@@ -24,10 +24,8 @@
 from xmodule.capa.safe_exec import safe_exec, update_hash
 from xmodule.capa.safe_exec.remote_exec import is_codejail_in_darklaunch, is_codejail_rest_service_enabled
 from xmodule.capa.safe_exec.safe_exec import emsg_normalizers, normalize_error_message
-from xmodule.capa.tests.test_util import use_unsafe_codejail


-@use_unsafe_codejail()
 class TestSafeExec(unittest.TestCase):  # lint-amnesty, pylint: disable=missing-class-docstring
     def test_set_values(self):
         g = {}
@@ -532,7 +530,6 @@ def set(self, key, value):
         self.cache[key] = value


-@use_unsafe_codejail()
 class TestSafeExecCaching(unittest.TestCase):
     """Test that caching works on safe_exec."""

@@ -657,7 +654,6 @@ def test_deep_ordering(self):
         assert h1 == h2


-@use_unsafe_codejail()
 class TestRealProblems(unittest.TestCase):  # lint-amnesty, pylint: disable=missing-class-docstring
     def test_802x(self):
         code = textwrap.dedent("""\
diff --git a/xmodule/html_block.py b/xmodule/html_block.py
index 41ad3f39ab8b..ad7ac78e370a 100644
--- a/xmodule/html_block.py
+++ b/xmodule/html_block.py
@@ -127,14 +127,20 @@ def get_html(self):
         """
         Returns html required for rendering the block.
""" if self.data: data = self.data - user_id = ( + user = ( self.runtime.service(self, 'user') .get_current_user() - .opt_attrs.get(ATTR_KEY_DEPRECATED_ANONYMOUS_USER_ID) + ) + user_id = user.opt_attrs.get(ATTR_KEY_DEPRECATED_ANONYMOUS_USER_ID) if user_id: data = data.replace("%%USER_ID%%", user_id) data = data.replace("%%COURSE_ID%%", str(self.scope_ids.usage_id.context_key)) + + # EDLYCUSTOM: Replace %%USER_EMAIL%% with user's email in HTML xblock + email = user.emails[0] + data = data.replace("%%USER_EMAIL%%", email) + return data return self.data diff --git a/xmodule/item_bank_block.py b/xmodule/item_bank_block.py index d09aabcf4b05..b53617e2c8e4 100644 --- a/xmodule/item_bank_block.py +++ b/xmodule/item_bank_block.py @@ -7,7 +7,6 @@ import logging import random from copy import copy - from django.conf import settings from django.utils.functional import classproperty from lxml import etree @@ -25,13 +24,13 @@ from xmodule.studio_editable import StudioEditableBlock from xmodule.util.builtin_assets import add_webpack_js_to_fragment from xmodule.validation import StudioValidation, StudioValidationMessage +from xmodule.xml_block import XmlMixin from xmodule.x_module import ( - STUDENT_VIEW, ResourceTemplates, XModuleMixin, shim_xmodule_js, + STUDENT_VIEW, ) -from xmodule.xml_block import XmlMixin _ = lambda text: text @@ -269,6 +268,20 @@ def selected_children(self): return self.selected + def format_block_keys_for_analytics(self, block_keys: list[tuple[str, str]]) -> list[dict]: + """ + Given a list of (block_type, block_id) pairs, prepare the JSON-ready metadata needed for analytics logging. + + This is [ + {"usage_key": x, "original_usage_key": y, "original_usage_version": z, "descendants": [...]} + ] + where the main list contains all top-level blocks, and descendants contains a *flat* list of all + descendants of the top level blocks, if any. + + Must be implemented in child class. 
+ """ + raise NotImplementedError + @XBlock.handler def reset_selected_children(self, _, __): """ @@ -419,40 +432,6 @@ def definition_to_xml(self, resource_fs): xml_object.set(field_name, str(field.read_from(self))) return xml_object - def author_view(self, context): - """ - Renders the Studio views. - Normal studio view: If block is properly configured, displays library status summary - Studio container view: displays a preview of all possible children. - """ - fragment = Fragment() - root_xblock = context.get('root_xblock') - is_root = root_xblock and root_xblock.usage_key == self.usage_key - if is_root and self.children: - # User has clicked the "View" link. Show a preview of all possible children: - context['can_edit_visibility'] = False - context['can_move'] = False - context['can_collapse'] = True - self.render_children(context, fragment, can_reorder=False, can_add=False) - else: - # We're just on the regular unit page, or we're on the "view" page but no children exist yet. - # Show a summary message and instructions. - summary_html = loader.render_django_template('templates/item_bank/author_view.html', { - # Due to template interpolation limitations, we have to pass some HTML for the link here: - "view_link": f'', - "blocks": [ - {"display_name": display_name_with_default(child)} - for child in self.get_children() - ], - "block_count": len(self.children), - "max_count": self.max_count, - }) - fragment.add_content(summary_html) - # Whether on the main author view or the detailed children view, show a button to add more from the library: - add_html = loader.render_django_template('templates/item_bank/author_view_add.html', {}) - fragment.add_content(add_html) - return fragment - @classmethod def get_selected_event_prefix(cls) -> str: """ @@ -463,6 +442,21 @@ def get_selected_event_prefix(cls) -> str: """ raise NotImplementedError + +class ItemBankBlock(ItemBankMixin, XBlock): + """ + An XBlock which shows a random subset of its children to each learner. 
+ + Unlike LegacyLibraryContentBlock, this block does not need to worry about synchronization, capa_type filtering, etc. + That is all implemented using `upstream` links on each individual child. + """ + display_name = String( + display_name=_("Display Name"), + help=_("The display name for this component."), + default="Problem Bank", + scope=Scope.settings, + ) + def validate(self): """ Validates the state of this ItemBankBlock Instance. @@ -498,6 +492,40 @@ def validate(self): ) return validation + def author_view(self, context): + """ + Renders the Studio views. + Normal studio view: If block is properly configured, displays library status summary + Studio container view: displays a preview of all possible children. + """ + fragment = Fragment() + root_xblock = context.get('root_xblock') + is_root = root_xblock and root_xblock.usage_key == self.usage_key + if is_root and self.children: + # User has clicked the "View" link. Show a preview of all possible children: + context['can_edit_visibility'] = False + context['can_move'] = False + context['can_collapse'] = True + self.render_children(context, fragment, can_reorder=False, can_add=False) + else: + # We're just on the regular unit page, or we're on the "view" page but no children exist yet. + # Show a summary message and instructions. 
+ summary_html = loader.render_django_template('templates/item_bank/author_view.html', { + # Due to template interpolation limitations, we have to pass some HTML for the link here: + "view_link": f'', + "blocks": [ + {"display_name": display_name_with_default(child)} + for child in self.get_children() + ], + "block_count": len(self.children), + "max_count": self.max_count, + }) + fragment.add_content(summary_html) + # Whether on the main author view or the detailed children view, show a button to add more from the library: + add_html = loader.render_django_template('templates/item_bank/author_view_add.html', {}) + fragment.add_content(add_html) + return fragment + def format_block_keys_for_analytics(self, block_keys: list[tuple[str, str]]) -> list[dict]: """ Implement format_block_keys_for_analytics using the `upstream` link system. @@ -513,21 +541,6 @@ def format_block_keys_for_analytics(self, block_keys: list[tuple[str, str]]) -> } for block_key in block_keys ] - -class ItemBankBlock(ItemBankMixin, XBlock): - """ - An XBlock which shows a random subset of its children to each learner. - - Unlike LegacyLibraryContentBlock, this block does not need to worry about synchronization, capa_type filtering, etc. - That is all implemented using `upstream` links on each individual child. 
- """ - display_name = String( - display_name=_("Display Name"), - help=_("The display name for this component."), - default="Problem Bank", - scope=Scope.settings, - ) - @classmethod def get_selected_event_prefix(cls) -> str: """ diff --git a/xmodule/util/sandboxing.py b/xmodule/util/sandboxing.py index 01e897adeca2..a8883ba3e93d 100644 --- a/xmodule/util/sandboxing.py +++ b/xmodule/util/sandboxing.py @@ -3,7 +3,6 @@ import re from django.conf import settings -from opaque_keys.edx.keys import CourseKey, LearningContextKey DEFAULT_PYTHON_LIB_FILENAME = 'python_lib.zip' @@ -12,6 +11,11 @@ def course_code_library_asset_name(): """ Return the asset name to use for course code libraries, defaulting to python_lib.zip. """ + # .. setting_name: PYTHON_LIB_FILENAME + # .. setting_default: python_lib.zip + # .. setting_description: Name of the course file to make available to code in + # custom Python-graded problems. By default, this file will not be downloadable + # by learners. return getattr(settings, 'PYTHON_LIB_FILENAME', DEFAULT_PYTHON_LIB_FILENAME) @@ -40,14 +44,10 @@ def can_execute_unsafe_code(course_id): return False -def get_python_lib_zip(contentstore, context_key: LearningContextKey): +def get_python_lib_zip(contentstore, course_id): """Return the bytes of the course code library file, if it exists.""" - if not isinstance(context_key, CourseKey): - # Although Content Libraries V2 does support python-evaluated capa problems, - # it doesn't yet support supplementary python zip files. - return None python_lib_filename = course_code_library_asset_name() - asset_key = context_key.make_asset_key("asset", python_lib_filename) + asset_key = course_id.make_asset_key("asset", python_lib_filename) zip_lib = contentstore().find(asset_key, throw_on_not_found=False) if zip_lib is not None: return zip_lib.data