Merge branch 'master' into sameeramin/upgrade-edx-enterprise-1103544
@@ -583,7 +583,7 @@ class SetIDVerificationStatusTestCase(TestCase):
         """
         Verification signal is sent upon approval.
         """
-        with mock.patch('openedx.core.djangoapps.signals.signals.LEARNER_NOW_VERIFIED.send_robust') as mock_signal:
+        with mock.patch('openedx_events.learning.signals.IDV_ATTEMPT_APPROVED.send_event') as mock_signal:
             # Begin the pipeline.
             pipeline.set_id_verification_status(
                 auth_entry=pipeline.AUTH_ENTRY_LOGIN,
402 docs/decisions/0020-upstream-downstream.rst Normal file
@@ -0,0 +1,402 @@
4. Upstream and downstream content
##################################

Status
******

Accepted.

Implementation in progress as of 2024-09-03.

Context
*******

We are replacing the existing Legacy ("V1") Content Libraries system, based on
ModuleStore, with a Relaunched ("V2") Content Libraries system, based on
Learning Core. V1 and V2 libraries will coexist for at least one release to
allow for migration; eventually, V1 libraries will be removed entirely.

Content from V1 libraries can only be included into courses using the
LibraryContentBlock (called "Randomized Content Module" in Studio), which works
like this:

* Course authors add a LibraryContentBlock to a Unit and configure it with a
  library key and a count of N library blocks to select (or `-1` for "all
  blocks").

* For each block in the chosen library, its *content definition* is copied into
  the course as a child of the LibraryContentBlock, whereas its *settings* are
  copied into a special "default" settings dictionary in the course's structure
  document--this distinction will matter later. The usage key of each copied
  block is derived from a hash of the original library block's usage key plus
  the LibraryContentBlock's own usage key--this will also matter later.

* The course author is free to override the content and settings of the
  course-local copies of each library block.

* When any update is made to the library, the course author is prompted to
  update the LibraryContentBlock. This involves re-copying the library blocks'
  content definitions and default settings, which clobbers any overrides the
  author has made to content, but preserves any overrides they have made to
  settings. Furthermore, any blocks that were added to the library are newly
  copied into the course, and any blocks that were removed from the library
  are deleted from the course. For all blocks, usage keys are recalculated
  using the same hash derivation described above; for existing blocks, it is
  important that this recalculation yields the same usage key so that student
  state is not lost.

* Over in the LMS, when a learner loads a LibraryContentBlock, they are shown
  a list of N randomly-picked blocks from the library. Subsequent visits show
  them the same list, *unless* children were added, children were removed, or
  N changed. In those cases, the LibraryContentBlock tries to make the
  smallest possible adjustment to their personal list of blocks while
  respecting N and the updated list of children.
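The hash-based key derivation above can be sketched roughly as follows. This is
a simplified illustration, not the actual edx-platform implementation: the real
code works with opaque-key objects, and the exact hash inputs and digest length
here are assumptions.

```python
import hashlib

def derive_child_block_id(library_block_usage_key: str, container_usage_key: str) -> str:
    """
    Derive a stable block ID for the course-local copy of a library block.

    Hashing the library block's usage key together with the containing
    LibraryContentBlock's usage key means:

    * the same library block copied into the same container always gets the
      same ID, so student state survives re-copies on library update, but
    * the same library block copied under two different containers gets two
      different IDs, avoiding collisions within a course.
    """
    digest = hashlib.sha1(
        (library_block_usage_key + container_usage_key).encode("utf-8")
    ).hexdigest()
    return digest[:20]

# Stable: re-deriving for the same (block, container) pair yields the same ID.
a = derive_child_block_id("lb:myorg:mylib:problem:p1", "block-v1:org+c+run+type@library_content+block@x")
b = derive_child_block_id("lb:myorg:mylib:problem:p1", "block-v1:org+c+run+type@library_content+block@x")
assert a == b

# Distinct containers yield distinct IDs for the same library block.
c = derive_child_block_id("lb:myorg:mylib:problem:p1", "block-v1:org+c+run+type@library_content+block@y")
assert a != c
```

This stability requirement is exactly why duplication (which must re-derive
keys under a new container) becomes awkward, as described below.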

This system has several issues:

#. **Missing defaults after import:** When a course with a LibraryContentBlock
   is imported into an Open edX instance *without* the referenced library, the
   blocks' *content* will remain intact, as will course-local *settings
   overrides*. However, any *default settings* defined in the library will be
   missing. This can result in content that is completely broken, especially
   since critical fields like video URLs and LTI URLs are considered
   "settings". For a detailed scenario, see `LibraryContentBlock Curveball 1`_.

#. **Strange behavior when duplicating content:** Typically, when a
   block is duplicated or copy-pasted, the new block's usage key and its
   children's usage keys are randomly generated. However, recall that when a
   LibraryContentBlock is updated, its children's usage keys are rederived
   using a hash function. That would cause the children's usage keys to change,
   thus destroying any student state. So, we must work around this with a hack:
   upon duplicating or pasting a LibraryContentBlock, we immediately update the
   LibraryContentBlock, thus discarding the problematic randomly-generated keys
   in favor of hash-derived keys. This works, but:

   * it involves weird code hacks,
   * it unexpectedly discards any content overrides the course author made to
     the copied LibraryContentBlock's children,
   * it unexpectedly uses the latest version of library content, regardless of
     which version the copied LibraryContentBlock was using, and
   * it fails if the library does not exist on the Open edX instance, which
     can happen if the course was imported from another instance.

#. **Conflation of reference and randomization:** The LibraryContentBlock does
   two things: it connects courses to library content, and it shows users a
   random subset of content. There is no reason that those two features need to
   be coupled together. A course author may want to randomize course-defined
   content, or they may want to randomize content from multiple different
   libraries. Or, they may want to use content from libraries without
   randomizing it at all. While it is feasible to support all these things in a
   single XBlock, trying to do so led to a `very complicated XBlock concept`_
   which was difficult to explain to product managers and other engineers.

#. **Unpredictable preservation of overrides:** Recall that *content
   definitions* and *settings* are handled differently. This distinction is
   defined in the code: every authorable XBlock field is declared with either
   `Scope.content` or `Scope.settings`. In theory, XBlock developers would use
   the content scope for fields that are core to the meaning of a piece of
   content, and they would only use the settings scope for fields that would be
   reasonable to configure in a local copy of the piece of content. In
   practice, though, XBlock developers almost always use `Scope.settings`. The
   result is that customizations to blocks *almost always* survive
   through library updates, except when they don't. Course authors have no way
   to know (or even guess) when their customizations will and won't
   survive updates.

#. **General pain and suffering:** The relationship between courses and V1
   libraries is confusing to content authors, site admins, and developers
   alike. The behaviors above toe the line between "quirks" and "known bugs",
   and they are not all documented. Past attempts to improve the system have
   `triggered series of bugs`_, some of which led to permanent loss of learner
   state. In other cases, past Content Libraries improvement efforts have
   slowed or completely stalled out in code review due to the overwhelming
   amount of context and edge cases that must be understood to safely make any
   changes.
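The copy semantics behind issues 1 and 4 can be modeled with plain
dictionaries. This is a toy model with hypothetical field names
(``question_xml``, ``max_attempts``), not the real ModuleStore data layout:

```python
def apply_library_update(library_block: dict, course_copy: dict) -> dict:
    """
    Toy model of a V1 library update.

    * `content` fields are re-copied wholesale, clobbering any course-local
      content overrides.
    * `settings` fields only refresh the *defaults* layer; explicit
      course-local settings overrides sit on top and survive the update.
    """
    return {
        "content": dict(library_block["content"]),             # clobbered
        "settings_defaults": dict(library_block["settings"]),  # refreshed
        "settings_overrides": dict(course_copy["settings_overrides"]),  # preserved
    }

def effective_settings(course_copy: dict) -> dict:
    """Course-local overrides win over library-provided defaults."""
    return {**course_copy["settings_defaults"], **course_copy["settings_overrides"]}

library_v2 = {
    "content": {"question_xml": "<p>new text</p>"},
    "settings": {"display_name": "New title", "max_attempts": 3},
}
course_copy = {
    "content": {"question_xml": "<p>author-edited text</p>"},     # author override
    "settings_defaults": {"display_name": "Old title", "max_attempts": 2},
    "settings_overrides": {"max_attempts": 5},                    # author override
}

updated = apply_library_update(library_v2, course_copy)
# The content override was silently clobbered...
assert updated["content"]["question_xml"] == "<p>new text</p>"
# ...while the settings override survived on top of the refreshed defaults.
assert effective_settings(updated) == {"display_name": "New title", "max_attempts": 5}
```

The model also shows issue 1: if the "defaults" layer is missing after an
import (because the library is absent), only the overrides layer remains, and
any field the author never overrode has no value at all.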

.. _LibraryContentBlock Curveball 1: https://openedx.atlassian.net/wiki/spaces/COMM/pages/3966795804/Fun+with+LibraryContentBlock+export+import+and+duplication#Curveball-1%3A-Import%2FExport
.. _LibraryContentBlock Curveball 2: https://openedx.atlassian.net/wiki/spaces/COMM/pages/3966795804/Fun+with+LibraryContentBlock+export+import+and+duplication#Curveball-2:-Duplication
.. _very complicated XBlock concept: https://github.com/openedx/edx-platform/blob/master/xmodule/docs/decisions/0003-library-content-block-schema.rst
.. _triggered series of bugs: https://openedx.atlassian.net/wiki/spaces/COMM/pages/3858661405/Bugs+from+Content+Libraries+V1

We are keen to use the Library Relaunch project to address all of these
problems. So, V2 libraries will interop with courses using a completely
different data model.

Decision
********

We will create a framework where a *downstream* piece of content (e.g., a
course block) can be *linked* to an *upstream* piece of content (e.g., a
library block) with the following properties:

* **Portable:** Links can refer to certain content on the current Open edX
  instance, and in the future they may be able to refer to content on other
  Open edX instances or sites. Links will never include information that is
  internal to a particular Open edX instance, such as foreign keys.

* **Flat:** The *link* is not a wrapper (like the LibraryContentBlock),
  but simply a piece of metadata directly on the downstream content which
  points to the upstream content. We will no longer rely on precarious
  hash-derived usage keys to establish a connection to upstream blocks;
  like any other block, an upstream-linked block can be granted whatever block
  ID the authoring environment assigns it, whether random or
  human-readable.

* **Forwards-compatible:** If downstream content is created in a course on
  an Open edX site that supports upstreams and downstreams (e.g., a Teak
  instance), and then it is exported and imported into a site that doesn't
  (e.g., a Quince instance), the downstream content will simply act like
  regular course content.

* **Independent:** Upstream content and downstream content exist separately
  from one another:

  * Modifying upstream content does not affect any downstream content (unless
    a sync happens; more on that later).
  * Deleting upstream content does not impact its downstream content. By
    corollary, pieces of downstream content can completely and correctly
    render on Open edX instances that are missing their linked upstream
    content.
  * (Preserving a positive feature of the V1 LibraryContentBlock) The link
    persists through export-import and copy-paste, regardless of whether the
    upstream content actually exists. A "broken" link to upstream content is
    seamlessly "repaired" if the upstream content becomes available again.

* **Customizable:** On an OLX level, authors can still override the value
  of any field for a piece of downstream content. However, we will empower
  Studio to be more prescriptive about what authors *can* override versus what
  they *should* override:

  * We define a set of *customizable* fields, with platform-level defaults
    like display_name and max_attempts, plus the ability for external
    XBlocks to opt their own fields into customizability.
  * Studio may use this list to provide an interface for customizing
    downstream blocks, separate from the usual "Edit" interface that would
    permit them to make unsafe overrides.
  * Furthermore, downstream content will record which fields the user has
    customized...

    * even if the customization is to simply clear the value of the fields...
    * and even if the customization is made redundant in a future version of
      the upstream content. For example, if max_attempts is customized from 3
      to 5 in the downstream content, but the next version of the upstream
      content also changes max_attempts to 5, the downstream would still
      consider max_attempts to be customized. If the following version of the
      upstream content again changed max_attempts to 6, the downstream would
      keep max_attempts at 5.

  * Finally, the downstream content will locally save the upstream value of
    customizable fields, allowing the author to *revert* back to them
    regardless of whether the upstream content is actually available.

* **Synchronizable, without surprises:** Downstream content can be *synced*
  with updates that have been made to its linked upstream. This means that the
  latest available upstream content field values will entirely replace all of
  the downstream field values, *except* those which were customized, as
  described in the previous item.
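The sync rule can be sketched as a simple merge in which the author's
customized fields always win. This is a toy model with hypothetical field
names, walking through the max_attempts scenario from the previous item:

```python
def sync_from_upstream(upstream_fields: dict, downstream_fields: dict,
                       customized: set) -> dict:
    """
    Toy model of a sync: upstream values replace downstream values for
    every field EXCEPT those the author explicitly customized.
    """
    return {
        name: (downstream_fields[name] if name in customized else upstream_value)
        for name, upstream_value in upstream_fields.items()
    }

# The author customized max_attempts from 3 to 5; it stays customized.
downstream = {"display_name": "Intro problem", "max_attempts": 5}
customized = {"max_attempts"}

# Upstream v2 happens to also set max_attempts to 5, and renames the block.
downstream = sync_from_upstream(
    {"display_name": "Intro problem (revised)", "max_attempts": 5},
    downstream, customized)
assert downstream == {"display_name": "Intro problem (revised)", "max_attempts": 5}

# Upstream v3 bumps max_attempts to 6; the customization still wins.
downstream = sync_from_upstream(
    {"display_name": "Intro problem (revised)", "max_attempts": 6},
    downstream, customized)
assert downstream == {"display_name": "Intro problem (revised)", "max_attempts": 5}
```

Because the customized set, not a value comparison, drives the merge, the
outcome is predictable regardless of how upstream values drift over versions.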

* **Concrete, but flexible:** The internal implementation of
  upstream-downstream syncing will assume that:

  * upstream content belongs to a V2 content library,
  * downstream content belongs to a course on the same instance, and
  * the link is the stringified usage key of the upstream library content.

  This will allow us to keep the implementation straightforward. However, we
  will *not* expose these assumptions in the Python APIs, the HTTP APIs, or in
  the persisted fields, allowing us in the future to generalize to other
  upstreams (such as externally-hosted libraries) and other downstreams (such
  as a standalone enrollable sequence without a course).

  If any of these assumptions are violated, we will raise an exception or log
  a warning, as appropriate. In particular, if these assumptions are violated
  at the OLX level via a course import, then we will probably show a warning
  at import time and refuse to sync from the unsupported upstream; however, we
  will *not* fail the entire import or mangle the value of the upstream link,
  since we want to remain forwards-compatible with potential future forms of
  syncing. As a concrete example: if a course block has *another course
  block's usage key* as an upstream, then we will faithfully keep that value
  through the import and export process, but we will not prompt the user to
  sync updates for that block.
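The warn-but-preserve import behavior might look roughly like this. The helper
names and the string-prefix check are hypothetical stand-ins; real code would
parse keys with the opaque-keys library rather than inspect string prefixes:

```python
import logging
from typing import Optional

log = logging.getLogger(__name__)

def is_v2_library_usage_key(key: str) -> bool:
    # Hypothetical stand-in: real code would parse the key with opaque-keys
    # and check for a V2 library usage locator type.
    return key.startswith("lb:")

def handle_imported_upstream(upstream: Optional[str]) -> Optional[str]:
    """
    Keep the imported link value verbatim no matter what; only *syncing* is
    gated on the link being a supported (V2 library) usage key.
    """
    if upstream is None:
        return None
    if not is_v2_library_usage_key(upstream):
        # Warn and skip syncing, but do NOT drop or rewrite the value:
        # a future release (or another instance) may support this upstream.
        log.warning("Unsupported upstream %r; link preserved, sync disabled.", upstream)
    return upstream

# A supported V2 library key passes through and would be eligible for sync.
assert handle_imported_upstream("lb:myorg:mylib:problem:p1") == "lb:myorg:mylib:problem:p1"
# A course-block upstream is preserved verbatim, even though it cannot sync.
assert handle_imported_upstream(
    "block-v1:org+course+run+type@problem+block@p1"
) == "block-v1:org+course+run+type@problem+block@p1"
```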

* **Decoupled:** Upstream-downstream linking is not tied up with any other
  courseware feature; in particular, it is unrelated to content randomization.
  Randomized library content will be supported, but it will be a *synthesis*
  of two features: (1) a RandomizationBlock that randomly selects a subset of
  its children, where (2) some or all of those children are linked to upstream
  blocks.

Consequences
************

To support the Libraries Relaunch in Sumac:

* For every XBlock in CMS, we will use XBlock fields to persist the upstream
  link, its versions, its customizable fields, and its set of downstream
  overrides.

* We will avoid exposing these fields to LMS code.

* We will define an initial set of customizable fields for Problem, Text, and
  Video blocks.

* We will define method(s) for syncing updates on the XBlock runtime so that
  they are available in the SplitModuleStore's XBlock runtime
  (CachingDescriptorSystem).

* Either in the initial implementation or in a later one, it may make sense to
  declare abstract versions of the syncing method(s) higher up in the XBlock
  Runtime inheritance hierarchy.

* We will expose a CMS HTTP API for syncing updates to blocks from their
  upstreams.

* We will avoid exposing this API from the LMS.

For reference, here are some excerpts of a potential implementation. This may
change through development and code review.

.. code-block:: python

    ###########################################################################
    # cms/lib/xblock/upstream_sync.py
    ###########################################################################

    class UpstreamSyncMixin(XBlockMixin):
        """
        Allows an XBlock in the CMS to be associated & synced with an upstream.

        Mixed into CMS's XBLOCK_MIXINS, but not LMS's.
        """

        # Metadata related to upstream synchronization
        upstream = String(
            help=("""
                The usage key of a block (generally within a content library)
                which serves as a source of upstream updates for this block,
                or None if there is no such upstream. Please note: it is valid
                for this field to hold a usage key for an upstream block
                that does not exist (or does not *yet* exist) on this instance,
                particularly if this downstream block was imported from a
                different instance.
            """),
            default=None, scope=Scope.settings, hidden=True, enforce_type=True,
        )
        upstream_version = Integer(
            help=("""
                Record of the upstream block's version number at the time this
                block was created from it. If upstream_version is smaller
                than the upstream block's latest version, then the user will be
                able to sync updates into this downstream block.
            """),
            default=None, scope=Scope.settings, hidden=True, enforce_type=True,
        )
        downstream_customized = Set(
            help=("""
                Names of the fields which have values set on the upstream
                block yet have been explicitly overridden on this downstream
                block. Unless explicitly cleared by the user, these
                customizations will persist even when updates are synced from
                the upstream.
            """),
            default=[], scope=Scope.settings, hidden=True, enforce_type=True,
        )

        # Store upstream defaults for customizable fields.
        upstream_display_name = String(...)
        upstream_max_attempts = Integer(...)
        ...  # We will probably want to pre-define several more of these.

        @classmethod
        def get_upstream_field_names(cls) -> dict[str, str]:
            """
            Mapping from each customizable field to the field which stores its
            upstream default.

            XBlocks outside of edx-platform can override this in order to set
            up their own customizable fields.
            """
            return {
                "display_name": "upstream_display_name",
                "max_attempts": "upstream_max_attempts",
            }

        def save(self, *args, **kwargs):
            """
            Update `downstream_customized` when a customizable field is modified.

            Uses `get_upstream_field_names` keys as the list of fields that are
            customizable.
            """
            ...


    @dataclass(frozen=True)
    class UpstreamInfo:
        """
        Metadata about a block's relationship with an upstream.
        """
        usage_key: UsageKey
        current_version: int
        latest_version: int | None
        sync_url: str
        error: str | None

        @property
        def sync_available(self) -> bool:
            """
            Should the user be prompted to sync this block with upstream?
            """
            return (
                self.latest_version is not None
                and self.current_version < self.latest_version
                and not self.error
            )


    ###########################################################################
    # xmodule/modulestore/split_mongo/caching_descriptor_system.py
    ###########################################################################

    class CachingDescriptorSystem(...):

        def validate_upstream_key(self, usage_key: UsageKey | str) -> UsageKey:
            """
            Raise an error if the provided key is not a valid upstream reference.

            Instead of explicitly checking whether a key is a LibraryLocatorV2,
            callers should validate using this function, and use an `except`
            clause to handle the case where the key is not a valid upstream.

            Raises: InvalidKeyError, UnsupportedUpstreamKeyType
            """
            ...

        def sync_from_upstream(self, *, downstream_key: UsageKey, apply_updates: bool) -> None:
            """
            Python API for loading updates from the upstream block.

            Can choose whether or not to actually apply those updates:

            apply_updates=False: Think "git fetch".
                Use case: course import.
            apply_updates=True: Think "git pull".
                Use case: sync_updates handler.

            Raises: InvalidKeyError, UnsupportedUpstreamKeyType, XBlockNotFoundError
            """
            ...

        def get_upstream_info(self, downstream_key: UsageKey) -> UpstreamInfo | None:
            """
            Python API for upstream metadata, or None.

            Raises: InvalidKeyError, XBlockNotFoundError
            """
            ...

Finally, here is what the OLX for a library-sourced Problem XBlock in a course
might look like:

.. code-block:: xml

    <problem
        display_name="A title that has been customized in the course"
        max_attempts="2"
        upstream="lb:myorg:mylib:problem:p1"
        upstream_version="12"
        downstream_customized='["display_name", "max_attempts"]'
        upstream_display_name="The title that was defined in the library block"
        upstream_max_attempts="3"
    >
        <!-- problem content would go here -->
    </problem>

@@ -31,7 +31,7 @@ workspace {
         }

         grades_app -> signal_handlers "Emits COURSE_GRADE_NOW_PASSED signal"
-        verify_student_app -> signal_handlers "Emits LEARNER_NOW_VERIFIED signal"
+        verify_student_app -> signal_handlers "Emits IDV_ATTEMPT_APPROVED signal"
         student_app -> signal_handlers "Emits ENROLLMENT_TRACK_UPDATED signal"
         allowlist -> signal_handlers "Emits APPEND_CERTIFICATE_ALLOWLIST signal"
         signal_handlers -> generation_handler "Invokes generate_allowlist_certificate()"
@@ -32,9 +32,8 @@ from openedx.core.djangoapps.content.course_overviews.signals import COURSE_PACI
 from openedx.core.djangoapps.signals.signals import (
     COURSE_GRADE_NOW_FAILED,
     COURSE_GRADE_NOW_PASSED,
-    LEARNER_NOW_VERIFIED
 )
-from openedx_events.learning.signals import EXAM_ATTEMPT_REJECTED
+from openedx_events.learning.signals import EXAM_ATTEMPT_REJECTED, IDV_ATTEMPT_APPROVED

 User = get_user_model()
@@ -118,14 +117,17 @@ def _listen_for_failing_grade(sender, user, course_id, grade, **kwargs):  # pyli
     log.info(f'Certificate marked not passing for {user.id} : {course_id} via failing grade')


-@receiver(LEARNER_NOW_VERIFIED, dispatch_uid="learner_track_changed")
-def _listen_for_id_verification_status_changed(sender, user, **kwargs):  # pylint: disable=unused-argument
+@receiver(IDV_ATTEMPT_APPROVED, dispatch_uid="learner_track_changed")
+def _listen_for_id_verification_status_changed(sender, signal, **kwargs):  # pylint: disable=unused-argument
     """
     Listen for a signal indicating that the user's id verification status has changed.
     """
     if not auto_certificate_generation_enabled():
         return

+    event_data = kwargs.get('idv_attempt')
+    user = User.objects.get(id=event_data.user.id)
+
     user_enrollments = CourseEnrollment.enrollments_for_user(user=user)
     expected_verification_status = IDVerificationService.user_status(user)
     expected_verification_status = expected_verification_status['status']
@@ -13,22 +13,20 @@ from opaque_keys.edx.keys import CourseKey, UsageKey
 from openedx_events.data import EventsMetadata
 from openedx_events.learning.data import ExamAttemptData, UserData, UserPersonalData
 from openedx_events.learning.signals import EXAM_ATTEMPT_REJECTED
-from xmodule.modulestore.tests.django_utils import ModuleStoreTestCase
-from xmodule.modulestore.tests.factories import CourseFactory
+from openedx_events.tests.utils import OpenEdxEventsTestMixin

 from common.djangoapps.student.tests.factories import CourseEnrollmentFactory, UserFactory
 from lms.djangoapps.certificates.api import has_self_generated_certificates_enabled
 from lms.djangoapps.certificates.config import AUTO_CERTIFICATE_GENERATION
 from lms.djangoapps.certificates.data import CertificateStatuses
-from lms.djangoapps.certificates.models import (
-    CertificateGenerationConfiguration,
-    GeneratedCertificate
-)
+from lms.djangoapps.certificates.models import CertificateGenerationConfiguration, GeneratedCertificate
 from lms.djangoapps.certificates.signals import handle_exam_attempt_rejected_event
 from lms.djangoapps.certificates.tests.factories import CertificateAllowlistFactory, GeneratedCertificateFactory
 from lms.djangoapps.grades.course_grade_factory import CourseGradeFactory
 from lms.djangoapps.grades.tests.utils import mock_passing_grade
 from lms.djangoapps.verify_student.models import SoftwareSecurePhotoVerification
+from xmodule.modulestore.tests.django_utils import ModuleStoreTestCase
+from xmodule.modulestore.tests.factories import CourseFactory


 class SelfGeneratedCertsSignalTest(ModuleStoreTestCase):
@@ -302,10 +300,17 @@ class FailingGradeCertsTest(ModuleStoreTestCase):
         assert cert.status == CertificateStatuses.downloadable


-class LearnerIdVerificationTest(ModuleStoreTestCase):
+class LearnerIdVerificationTest(ModuleStoreTestCase, OpenEdxEventsTestMixin):
     """
     Tests for certificate generation task firing on learner id verification
     """
+    ENABLED_OPENEDX_EVENTS = ['org.openedx.learning.idv_attempt.approved.v1']
+
+    @classmethod
+    def setUpClass(cls):
+        super().setUpClass()
+        cls.start_events_isolation()
+
     def setUp(self):
         super().setUp()
         self.course_one = CourseFactory.create(self_paced=True)
@@ -4704,15 +4704,19 @@ class TestOauthInstructorAPILevelsAccess(SharedModuleStoreTestCase, LoginEnrollm
     Test endpoints using Oauth2 authentication.
     """

-    @classmethod
-    def setUpClass(cls):
-        super().setUpClass()
-        cls.course = CourseFactory.create(
-            entrance_exam_id='i4x://{}/{}/chapter/Entrance_exam'.format('test_org', 'test_course')
-        )
-
     def setUp(self):
         super().setUp()
+        self.course = CourseFactory.create(
+            org='test_org',
+            course='test_course',
+            run='test_run',
+            entrance_exam_id='i4x://{}/{}/chapter/Entrance_exam'.format('test_org', 'test_course')
+        )
+        self.problem_location = msk_from_problem_urlname(
+            self.course.id,
+            'robot-some-problem-urlname'
+        )
+        self.problem_urlname = str(self.problem_location)

         self.other_user = UserFactory()
         dot_application = ApplicationFactory(user=self.other_user, authorization_grant_type='password')
@@ -4744,7 +4748,14 @@ class TestOauthInstructorAPILevelsAccess(SharedModuleStoreTestCase, LoginEnrollm
                 "send-to": ["myself"],
                 "subject": "This is subject",
                 "message": "message"
-            }, 'data_researcher')
+            }, 'data_researcher'),
+            ('list_instructor_tasks',
+             {
+                 'problem_location_str': self.problem_urlname,
+                 'unique_student_identifier': self.other_user.email
+             },
+             'data_researcher'),
+            ('list_instructor_tasks', {}, 'data_researcher')
         ]

         self.fake_jwt = ('wyJUxMiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJjaGFuZ2UtbWUiLCJleHAiOjE3MjU4OTA2NzIsImdyY
@@ -108,7 +108,7 @@ from lms.djangoapps.instructor_task.data import InstructorTaskTypes
 from lms.djangoapps.instructor_task.models import ReportStore
 from lms.djangoapps.instructor.views.serializer import (
     AccessSerializer, BlockDueDateSerializer, RoleNameSerializer, ShowStudentExtensionSerializer, UserSerializer,
-    SendEmailSerializer, StudentAttemptsSerializer
+    SendEmailSerializer, StudentAttemptsSerializer, ListInstructorTaskInputSerializer
 )
 from openedx.core.djangoapps.content.course_overviews.models import CourseOverview
 from openedx.core.djangoapps.course_groups.cohorts import add_user_to_cohort, is_course_cohorted
@@ -2373,9 +2373,8 @@ class InstructorTasks(DeveloperErrorViewMixin, APIView):
         return _list_instructor_tasks(request=request, course_id=course_id)


-@require_POST
-@ensure_csrf_cookie
-def list_instructor_tasks(request, course_id):
+@method_decorator(cache_control(no_cache=True, no_store=True, must_revalidate=True), name='dispatch')
+class ListInstructorTasks(APIView):
     """
     List instructor tasks.

@@ -2385,21 +2384,44 @@ def list_instructor_tasks(request, course_id):
     - `problem_location_str` and `unique_student_identifier` lists task
       history for problem AND student (intersection)
     """
-    return _list_instructor_tasks(request=request, course_id=course_id)
+    permission_classes = (IsAuthenticated, permissions.InstructorPermission)
+    permission_name = permissions.SHOW_TASKS
+    serializer_class = ListInstructorTaskInputSerializer
+
+    @method_decorator(ensure_csrf_cookie)
+    def post(self, request, course_id):
+        """
+        List instructor tasks.
+        """
+        serializer = self.serializer_class(data=request.data)
+        serializer.is_valid(raise_exception=True)
+
+        return _list_instructor_tasks(
+            request=request, course_id=course_id, serialize_data=serializer.validated_data
+        )


 @cache_control(no_cache=True, no_store=True, must_revalidate=True)
 @require_course_permission(permissions.SHOW_TASKS)
-def _list_instructor_tasks(request, course_id):
+def _list_instructor_tasks(request, course_id, serialize_data=None):
     """
     List instructor tasks.

     Internal function with common code for both DRF and traditional views.
     """
-    # This method is also used by other APIs with the GET method.
-    # The query_params attribute is utilized for GET requests,
-    # where parameters are passed as query strings.
-
     course_id = CourseKey.from_string(course_id)
-    params = getattr(request, 'query_params', request.POST)
-    problem_location_str = strip_if_string(params.get('problem_location_str', False))
-    student = params.get('unique_student_identifier', None)
+    if serialize_data is not None:
+        problem_location_str = strip_if_string(serialize_data.get('problem_location_str', False))
+        student = serialize_data.get('unique_student_identifier', None)
+    else:
+        # This method is also used by other APIs with the GET method.
+        # The query_params attribute is utilized for GET requests,
+        # where parameters are passed as query strings.
+        params = getattr(request, 'query_params', request.POST)
+        problem_location_str = strip_if_string(params.get('problem_location_str', False))
+        student = params.get('unique_student_identifier', None)

     if student is not None:
         student = get_student_from_identifier(student)
@@ -44,7 +44,7 @@ urlpatterns = [
          name='list_entrance_exam_instructor_tasks'),
     path('mark_student_can_skip_entrance_exam', api.mark_student_can_skip_entrance_exam,
          name='mark_student_can_skip_entrance_exam'),
-    path('list_instructor_tasks', api.list_instructor_tasks, name='list_instructor_tasks'),
+    path('list_instructor_tasks', api.ListInstructorTasks.as_view(), name='list_instructor_tasks'),
     path('list_background_email_tasks', api.list_background_email_tasks, name='list_background_email_tasks'),
     path('list_email_content', api.ListEmailContent.as_view(), name='list_email_content'),
     path('list_forum_members', api.list_forum_members, name='list_forum_members'),
@@ -61,6 +61,43 @@ class AccessSerializer(serializers.Serializer):
        return user


class ListInstructorTaskInputSerializer(serializers.Serializer):  # pylint: disable=abstract-method
    """
    Serializer for handling the input data for the problem response report generation API.

    Attributes:
        unique_student_identifier (str): The email or username of the student.
            This field is optional, but if provided, the `problem_location_str`
            must also be provided.
        problem_location_str (str): The string representing the location of the problem within the course.
            This field is optional, unless `unique_student_identifier` is provided.
    """
    unique_student_identifier = serializers.CharField(
        max_length=255,
        help_text="Email or username of student",
        required=False
    )
    problem_location_str = serializers.CharField(
        help_text="Problem location",
        required=False
    )

    def validate(self, data):
        """
        Validate the data to ensure that if unique_student_identifier is provided,
        problem_location_str must also be provided.
        """
        unique_student_identifier = data.get('unique_student_identifier')
        problem_location_str = data.get('problem_location_str')

        if unique_student_identifier and not problem_location_str:
            raise serializers.ValidationError(
                "unique_student_identifier must accompany problem_location_str"
            )

        return data


class ShowStudentExtensionSerializer(serializers.Serializer):
    """
    Serializer for validating and processing the student identifier.
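The cross-field check in `validate` above can be exercised without a Django setup; a minimal stand-alone sketch of the same rule (the `validate_task_filters` helper here is hypothetical, not part of the PR):

```python
def validate_task_filters(data):
    # Mirrors ListInstructorTaskInputSerializer.validate: a student
    # identifier is only accepted when a problem location accompanies it.
    unique_student_identifier = data.get('unique_student_identifier')
    problem_location_str = data.get('problem_location_str')

    if unique_student_identifier and not problem_location_str:
        # DRF would raise serializers.ValidationError here; ValueError
        # stands in for it in this dependency-free sketch.
        raise ValueError(
            "unique_student_identifier must accompany problem_location_str"
        )
    return data
```

Note that the reverse combination (a `problem_location_str` without a student) passes, matching the docstring's "optional, unless" wording.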
@@ -13,6 +13,12 @@ from typing import Optional
from lms.djangoapps.verify_student.emails import send_verification_approved_email
from lms.djangoapps.verify_student.exceptions import VerificationAttemptInvalidStatus
from lms.djangoapps.verify_student.models import VerificationAttempt
from lms.djangoapps.verify_student.signals.signals import (
    emit_idv_attempt_approved_event,
    emit_idv_attempt_created_event,
    emit_idv_attempt_denied_event,
    emit_idv_attempt_pending_event,
)
from lms.djangoapps.verify_student.statuses import VerificationAttemptStatus
from lms.djangoapps.verify_student.tasks import send_verification_status_email

@@ -70,6 +76,14 @@ def create_verification_attempt(user: User, name: str, status: str, expiration_d
        expiration_datetime=expiration_datetime,
    )

    emit_idv_attempt_created_event(
        attempt_id=verification_attempt.id,
        user=user,
        status=status,
        name=name,
        expiration_date=expiration_datetime,
    )

    return verification_attempt.id


@@ -77,7 +91,7 @@ def update_verification_attempt(
    attempt_id: int,
    name: Optional[str] = None,
    status: Optional[str] = None,
    expiration_datetime: Optional[datetime] = None
    expiration_datetime: Optional[datetime] = None,
):
    """
    Update a verification attempt.
@@ -125,3 +139,29 @@ def update_verification_attempt(
        attempt.expiration_datetime = expiration_datetime

    attempt.save()

    user = attempt.user
    if status == VerificationAttemptStatus.PENDING:
        emit_idv_attempt_pending_event(
            attempt_id=attempt_id,
            user=user,
            status=status,
            name=name,
            expiration_date=expiration_datetime,
        )
    elif status == VerificationAttemptStatus.APPROVED:
        emit_idv_attempt_approved_event(
            attempt_id=attempt_id,
            user=user,
            status=status,
            name=name,
            expiration_date=expiration_datetime,
        )
    elif status == VerificationAttemptStatus.DENIED:
        emit_idv_attempt_denied_event(
            attempt_id=attempt_id,
            user=user,
            status=status,
            name=name,
            expiration_date=expiration_datetime,
        )

@@ -17,5 +17,5 @@ class VerifyStudentConfig(AppConfig):
        """
        Connect signal handlers.
        """
        from lms.djangoapps.verify_student import signals  # pylint: disable=unused-import
        from lms.djangoapps.verify_student.signals import signals  # pylint: disable=unused-import
        from lms.djangoapps.verify_student import tasks  # pylint: disable=unused-import
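The if/elif chain in `update_verification_attempt` maps a status string to one of the `emit_idv_attempt_*_event` helpers. The same dispatch can be written table-driven; a sketch with hypothetical stand-in emitters (the real helpers send Open edX events, these just record calls):

```python
# Recorded calls stand in for actual event emission in this sketch.
events = []

def _emit(event_name, **kwargs):
    events.append((event_name, kwargs))

# Keys mirror VerificationAttemptStatus values; the lambdas stand in for
# emit_idv_attempt_pending_event / _approved_event / _denied_event.
EMITTERS = {
    'pending': lambda **kw: _emit('IDV_ATTEMPT_PENDING', **kw),
    'approved': lambda **kw: _emit('IDV_ATTEMPT_APPROVED', **kw),
    'denied': lambda **kw: _emit('IDV_ATTEMPT_DENIED', **kw),
}

def emit_for_status(status, **kwargs):
    # Unknown statuses (e.g. 'created' on update) emit nothing,
    # matching the fall-through of the if/elif chain.
    emitter = EMITTERS.get(status)
    if emitter is not None:
        emitter(status=status, **kwargs)
```

Either form behaves the same; the table version just avoids repeating the keyword arguments three times.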
@@ -54,7 +54,7 @@ class TestBackfillSSOVerificationsCommand(TestCase):
        #self.assertNumQueries(100)

    def test_signal_called(self):
        with patch('openedx.core.djangoapps.signals.signals.LEARNER_NOW_VERIFIED.send_robust') as mock_signal:
        with patch('openedx_events.learning.signals.IDV_ATTEMPT_APPROVED.send_event') as mock_signal:
            call_command('backfill_sso_verifications_for_old_account_links', '--provider-slug', self.provider.provider_id)  # lint-amnesty, pylint: disable=line-too-long
            assert mock_signal.call_count == 1


@@ -121,7 +121,7 @@ class TestRetryFailedPhotoVerificationsBetweenDates(MockS3Boto3Mixin, TestVerifi
        for _ in range(num_attempts):
            self.create_upload_and_submit_attempt_for_user()

    @patch('lms.djangoapps.verify_student.signals.idv_update_signal.send')
    @patch('lms.djangoapps.verify_student.signals.signals.idv_update_signal.send')
    def test_resubmit_in_date_range(self, send_idv_update_mock):
        call_command('retry_failed_photo_verifications',
                     status="submitted",

@@ -38,7 +38,7 @@ class TestTriggerSoftwareSecurePhotoVerificationsPostSaveSignal(MockS3Boto3Mixin
        for _ in range(num_attempts):
            self.create_and_submit_attempt_for_user()

    @patch('lms.djangoapps.verify_student.signals.idv_update_signal.send')
    @patch('lms.djangoapps.verify_student.signals.signals.idv_update_signal.send')
    def test_command(self, send_idv_update_mock):
        call_command('trigger_softwaresecurephotoverifications_post_save_signal', start_date_time='2021-10-31 06:00:00')
@@ -42,8 +42,9 @@ from lms.djangoapps.verify_student.ssencrypt import (
    rsa_decrypt,
    rsa_encrypt
)
from openedx.core.djangoapps.signals.signals import LEARNER_NOW_VERIFIED
from openedx.core.storage import get_storage
from openedx_events.learning.signals import IDV_ATTEMPT_APPROVED
from openedx_events.learning.data import UserData, VerificationAttemptData

from .utils import auto_verify_for_testing_enabled, earliest_allowed_verification_date, submit_request_to_ss

@@ -248,13 +249,23 @@ class SSOVerification(IDVerificationAttempt):
            user_id=self.user, reviewer=approved_by
        ))

        # Emit signal to find and generate eligible certificates
        LEARNER_NOW_VERIFIED.send_robust(
            sender=SSOVerification,
            user=self.user
        # Emit event to find and generate eligible certificates
        verification_data = VerificationAttemptData(
            attempt_id=self.id,
            user=UserData(
                pii=None,
                id=self.user.id,
                is_active=self.user.is_active,
            ),
            status=self.status,
            name=self.name,
            expiration_date=self.expiration_datetime,
        )
        IDV_ATTEMPT_APPROVED.send_event(
            idv_attempt=verification_data,
        )

        message = 'LEARNER_NOW_VERIFIED signal fired for {user} from SSOVerification'
        message = 'IDV_ATTEMPT_APPROVED signal fired for {user} from SSOVerification'
        log.info(message.format(user=self.user.username))


@@ -451,13 +462,24 @@ class PhotoVerification(IDVerificationAttempt):
            days=settings.VERIFY_STUDENT["DAYS_GOOD_FOR"]
        )
        self.save()
        # Emit signal to find and generate eligible certificates
        LEARNER_NOW_VERIFIED.send_robust(
            sender=PhotoVerification,
            user=self.user

        # Emit event to find and generate eligible certificates
        verification_data = VerificationAttemptData(
            attempt_id=self.id,
            user=UserData(
                pii=None,
                id=self.user.id,
                is_active=self.user.is_active,
            ),
            status=self.status,
            name=self.name,
            expiration_date=self.expiration_datetime,
        )
        IDV_ATTEMPT_APPROVED.send_event(
            idv_attempt=verification_data,
        )

        message = 'LEARNER_NOW_VERIFIED signal fired for {user} from PhotoVerification'
        message = 'IDV_ATTEMPT_APPROVED signal fired for {user} from PhotoVerification'
        log.info(message.format(user=self.user.username))

    @status_before_must_be("ready", "must_retry")
0
lms/djangoapps/verify_student/signals/__init__.py
Normal file
@@ -5,23 +5,23 @@ import logging

from django.core.exceptions import ObjectDoesNotExist
from django.db.models.signals import post_save
from django.dispatch import Signal
from django.dispatch.dispatcher import receiver
from xmodule.modulestore.django import SignalHandler, modulestore

from common.djangoapps.student.models_api import get_name, get_pending_name_change
from lms.djangoapps.verify_student.apps import VerifyStudentConfig  # pylint: disable=unused-import
from lms.djangoapps.verify_student.signals.signals import idv_update_signal
from openedx.core.djangoapps.user_api.accounts.signals import USER_RETIRE_LMS_CRITICAL, USER_RETIRE_LMS_MISC

from .models import SoftwareSecurePhotoVerification, VerificationDeadline, VerificationAttempt
from lms.djangoapps.verify_student.models import (
    SoftwareSecurePhotoVerification,
    VerificationDeadline,
    VerificationAttempt
)

log = logging.getLogger(__name__)


# Signal for emitting IDV submission and review updates
# providing_args = ["attempt_id", "user_id", "status", "full_name", "profile_name"]
idv_update_signal = Signal()


@receiver(SignalHandler.course_published)
def _listen_for_course_publish(sender, course_key, **kwargs):  # pylint: disable=unused-argument
    """
109
lms/djangoapps/verify_student/signals/signals.py
Normal file
@@ -0,0 +1,109 @@
"""
Signal definitions and functions to send those signals for the verify_student application.
"""

from django.dispatch import Signal

from openedx_events.learning.data import UserData, UserPersonalData, VerificationAttemptData
from openedx_events.learning.signals import (
    IDV_ATTEMPT_CREATED,
    IDV_ATTEMPT_PENDING,
    IDV_ATTEMPT_APPROVED,
    IDV_ATTEMPT_DENIED,
)

# Signal for emitting IDV submission and review updates
# providing_args = ["attempt_id", "user_id", "status", "full_name", "profile_name"]
idv_update_signal = Signal()


def _create_user_data(user):
    """
    Helper function to create a UserData object.
    """
    user_data = UserData(
        id=user.id,
        is_active=user.is_active,
        pii=UserPersonalData(
            username=user.username,
            email=user.email,
            name=user.get_full_name()
        )
    )

    return user_data


def emit_idv_attempt_created_event(attempt_id, user, status, name, expiration_date):
    """
    Emit the IDV_ATTEMPT_CREATED Open edX event.
    """
    user_data = _create_user_data(user)

    # .. event_implemented_name: IDV_ATTEMPT_CREATED
    IDV_ATTEMPT_CREATED.send_event(
        idv_attempt=VerificationAttemptData(
            attempt_id=attempt_id,
            user=user_data,
            status=status,
            name=name,
            expiration_date=expiration_date,
        )
    )
    return user_data


def emit_idv_attempt_pending_event(attempt_id, user, status, name, expiration_date):
    """
    Emit the IDV_ATTEMPT_PENDING Open edX event.
    """
    user_data = _create_user_data(user)

    # .. event_implemented_name: IDV_ATTEMPT_PENDING
    IDV_ATTEMPT_PENDING.send_event(
        idv_attempt=VerificationAttemptData(
            attempt_id=attempt_id,
            user=user_data,
            status=status,
            name=name,
            expiration_date=expiration_date,
        )
    )
    return user_data


def emit_idv_attempt_approved_event(attempt_id, user, status, name, expiration_date):
    """
    Emit the IDV_ATTEMPT_APPROVED Open edX event.
    """
    user_data = _create_user_data(user)

    # .. event_implemented_name: IDV_ATTEMPT_APPROVED
    IDV_ATTEMPT_APPROVED.send_event(
        idv_attempt=VerificationAttemptData(
            attempt_id=attempt_id,
            user=user_data,
            status=status,
            name=name,
            expiration_date=expiration_date,
        )
    )
    return user_data


def emit_idv_attempt_denied_event(attempt_id, user, status, name, expiration_date):
    """
    Emit the IDV_ATTEMPT_DENIED Open edX event.
    """
    user_data = _create_user_data(user)

    # .. event_implemented_name: IDV_ATTEMPT_DENIED
    IDV_ATTEMPT_DENIED.send_event(
        idv_attempt=VerificationAttemptData(
            attempt_id=attempt_id,
            user=user_data,
            status=status,
            name=name,
            expiration_date=expiration_date,
        )
    )
@@ -69,7 +69,8 @@ class CreateVerificationAttempt(TestCase):
        )
        self.attempt.save()

    def test_create_verification_attempt(self):
    @patch('lms.djangoapps.verify_student.api.emit_idv_attempt_created_event')
    def test_create_verification_attempt(self, mock_created_event):
        expected_id = 2
        self.assertEqual(
            create_verification_attempt(
@@ -86,6 +87,13 @@ class CreateVerificationAttempt(TestCase):
        self.assertEqual(verification_attempt.name, 'Tester McTest')
        self.assertEqual(verification_attempt.status, VerificationAttemptStatus.CREATED)
        self.assertEqual(verification_attempt.expiration_datetime, datetime(2024, 12, 31, tzinfo=timezone.utc))
        mock_created_event.assert_called_with(
            attempt_id=verification_attempt.id,
            user=self.user,
            status=VerificationAttemptStatus.CREATED,
            name='Tester McTest',
            expiration_date=datetime(2024, 12, 31, tzinfo=timezone.utc),
        )

    def test_create_verification_attempt_no_expiration_datetime(self):
        expected_id = 2
@@ -129,7 +137,18 @@ class UpdateVerificationAttempt(TestCase):
        ('Tester McTest3', VerificationAttemptStatus.DENIED, datetime(2026, 12, 31, tzinfo=timezone.utc)),
    )
    @ddt.unpack
    def test_update_verification_attempt(self, name, status, expiration_datetime):
    @patch('lms.djangoapps.verify_student.api.emit_idv_attempt_pending_event')
    @patch('lms.djangoapps.verify_student.api.emit_idv_attempt_approved_event')
    @patch('lms.djangoapps.verify_student.api.emit_idv_attempt_denied_event')
    def test_update_verification_attempt(
        self,
        name,
        status,
        expiration_datetime,
        mock_denied_event,
        mock_approved_event,
        mock_pending_event,
    ):
        update_verification_attempt(
            attempt_id=self.attempt.id,
            name=name,
@@ -145,6 +164,31 @@ class UpdateVerificationAttempt(TestCase):
        self.assertEqual(verification_attempt.status, status)
        self.assertEqual(verification_attempt.expiration_datetime, expiration_datetime)

        if status == VerificationAttemptStatus.PENDING:
            mock_pending_event.assert_called_with(
                attempt_id=verification_attempt.id,
                user=self.user,
                status=status,
                name=name,
                expiration_date=expiration_datetime,
            )
        elif status == VerificationAttemptStatus.APPROVED:
            mock_approved_event.assert_called_with(
                attempt_id=verification_attempt.id,
                user=self.user,
                status=status,
                name=name,
                expiration_date=expiration_datetime,
            )
        elif status == VerificationAttemptStatus.DENIED:
            mock_denied_event.assert_called_with(
                attempt_id=verification_attempt.id,
                user=self.user,
                status=status,
                name=name,
                expiration_date=expiration_datetime,
            )

    def test_update_verification_attempt_none_values(self):
        update_verification_attempt(
            attempt_id=self.attempt.id,
@@ -166,6 +210,7 @@ class UpdateVerificationAttempt(TestCase):
            VerificationAttempt.DoesNotExist,
            update_verification_attempt,
            attempt_id=999999,
            name=None,
            status=VerificationAttemptStatus.APPROVED,
        )
@@ -15,7 +15,7 @@ from lms.djangoapps.verify_student.models import (
    VerificationDeadline,
    VerificationAttempt
)
from lms.djangoapps.verify_student.signals import (
from lms.djangoapps.verify_student.signals.handlers import (
    _listen_for_course_publish,
    _listen_for_lms_retire,
    _listen_for_lms_retire_verification_attempts
@@ -29,9 +29,9 @@ from xmodule.modulestore.tests.django_utils import ModuleStoreTestCase  # lint-a
from xmodule.modulestore.tests.factories import CourseFactory  # lint-amnesty, pylint: disable=wrong-import-order


class VerificationDeadlineSignalTest(ModuleStoreTestCase):
class VerificationDeadlineHandlerTest(ModuleStoreTestCase):
    """
    Tests for the VerificationDeadline signal
    Tests for the VerificationDeadline handler
    """

    def setUp(self):
@@ -41,13 +41,13 @@ class VerificationDeadlineSignalTest(ModuleStoreTestCase):
        VerificationDeadline.objects.all().delete()

    def test_no_deadline(self):
        """ Verify the signal sets deadline to course end when no deadline exists."""
        """ Verify the handler sets deadline to course end when no deadline exists."""
        _listen_for_course_publish('store', self.course.id)

        assert VerificationDeadline.deadline_for_course(self.course.id) == self.course.end

    def test_deadline(self):
        """ Verify deadline is set to course end date by signal when changed. """
        """ Verify deadline is set to course end date by handler when changed. """
        deadline = now() - timedelta(days=7)
        VerificationDeadline.set_deadline(self.course.id, deadline)

@@ -55,7 +55,7 @@ class VerificationDeadlineSignalTest(ModuleStoreTestCase):
        assert VerificationDeadline.deadline_for_course(self.course.id) == self.course.end

    def test_deadline_explicit(self):
        """ Verify deadline is unchanged by signal when explicitly set. """
        """ Verify deadline is unchanged by handler when explicitly set. """
        deadline = now() - timedelta(days=7)
        VerificationDeadline.set_deadline(self.course.id, deadline, is_explicit=True)

@@ -66,9 +66,9 @@ class VerificationDeadlineSignalTest(ModuleStoreTestCase):
        assert actual_deadline == deadline


class RetirementSignalTest(ModuleStoreTestCase):
class RetirementHandlerTest(ModuleStoreTestCase):
    """
    Tests for the VerificationDeadline signal
    Tests for the VerificationDeadline handler
    """

    def _create_entry(self):
@@ -119,8 +119,8 @@ class RetirementSignalTest(ModuleStoreTestCase):

class PostSavePhotoVerificationTest(ModuleStoreTestCase):
    """
    Tests for the post_save signal on the SoftwareSecurePhotoVerification model.
    This receiver should emit another signal that contains limited data about
    Tests for the post_save handler on the SoftwareSecurePhotoVerification model.
    This receiver should emit another signal that contains limited data about
    the verification attempt that was updated.
    """

@@ -132,7 +132,7 @@ class PostSavePhotoVerificationTest(ModuleStoreTestCase):
        self.photo_id_image_url = 'https://test.photo'
        self.photo_id_key = 'test+key'

    @patch('lms.djangoapps.verify_student.signals.idv_update_signal.send')
    @patch('lms.djangoapps.verify_student.signals.signals.idv_update_signal.send')
    def test_post_save_signal(self, mock_signal):
        # create new softwaresecureverification
        attempt = SoftwareSecurePhotoVerification.objects.create(
@@ -165,7 +165,7 @@ class PostSavePhotoVerificationTest(ModuleStoreTestCase):
            full_name=attempt.user.profile.name
        )

    @patch('lms.djangoapps.verify_student.signals.idv_update_signal.send')
    @patch('lms.djangoapps.verify_student.signals.signals.idv_update_signal.send')
    def test_post_save_signal_pending_name(self, mock_signal):
        pending_name_change = do_name_change_request(self.user, 'Pending Name', 'test')[0]

@@ -187,7 +187,7 @@ class PostSavePhotoVerificationTest(ModuleStoreTestCase):
        )


class RetirementSignalVerificationAttemptsTest(ModuleStoreTestCase):
class RetirementHandlerVerificationAttemptsTest(ModuleStoreTestCase):
    """
    Tests for the LMS User Retirement signal for Verification Attempts
    """
@@ -50,6 +50,12 @@ such that the value can be defined later than this assignment (file load order).

    $activeSection = null;

    var usesProctoringLegacyView = function() {
        // If the element #proctoring-mfe-view is present, then the page uses the new MFE
        // and the legacy views should not be initialized.
        return !document.getElementById('proctoring-mfe-view');
    };

    SafeWaiter = (function() {
        function safeWaiter() {
            this.after_handlers = [];
@@ -200,7 +206,7 @@ such that the value can be defined later than this assignment (file load order).
        }
    ];
    // eslint-disable-next-line no-void
    if (edx.instructor_dashboard.proctoring !== void 0) {
    if (usesProctoringLegacyView() && edx.instructor_dashboard.proctoring !== void 0) {
        sectionsToInitialize = sectionsToInitialize.concat([
            {
                constructor: edx.instructor_dashboard.proctoring.ProctoredExamAllowanceView,
@@ -18,7 +18,7 @@ from meilisearch import Client as MeilisearchClient
from meilisearch.errors import MeilisearchError
from meilisearch.models.task import TaskInfo
from opaque_keys.edx.keys import UsageKey
from opaque_keys.edx.locator import LibraryLocatorV2
from opaque_keys.edx.locator import LibraryLocatorV2, LibraryCollectionLocator
from openedx_learning.api import authoring as authoring_api
from common.djangoapps.student.roles import GlobalStaff
from rest_framework.request import Request
@@ -36,6 +36,7 @@ from .documents import (
    searchable_doc_for_library_block,
    searchable_doc_collections,
    searchable_doc_tags,
    searchable_doc_tags_for_collection,
)

log = logging.getLogger(__name__)
@@ -395,13 +396,12 @@ def rebuild_index(status_cb: Callable[[str], None] | None = None) -> None:
        return docs

    ############## Collections ##############
    def index_collection_batch(batch, num_done) -> int:
    def index_collection_batch(batch, num_done, library_key) -> int:
        docs = []
        for collection in batch:
            try:
                doc = searchable_doc_for_collection(collection)
                # Uncomment below line once collections are tagged.
                # doc.update(searchable_doc_tags(collection.id))
                doc.update(searchable_doc_tags_for_collection(library_key, collection))
                docs.append(doc)
            except Exception as err:  # pylint: disable=broad-except
                status_cb(f"Error indexing collection {collection}: {err}")
@@ -428,7 +428,11 @@ def rebuild_index(status_cb: Callable[[str], None] | None = None) -> None:
        status_cb(f"{num_collections_done + 1}/{num_collections}. Now indexing collections in library {lib_key}")
        paginator = Paginator(collections, 100)
        for p in paginator.page_range:
            num_collections_done = index_collection_batch(paginator.page(p).object_list, num_collections_done)
            num_collections_done = index_collection_batch(
                paginator.page(p).object_list,
                num_collections_done,
                lib_key,
            )
        status_cb(f"{num_collections_done}/{num_collections} collections indexed for library {lib_key}")

        num_contexts_done += 1
@@ -604,6 +608,17 @@ def upsert_block_collections_index_docs(usage_key: UsageKey):
    _update_index_docs([doc])


def upsert_collection_tags_index_docs(collection_usage_key: LibraryCollectionLocator):
    """
    Updates the tags data in documents for the given library collection
    """
    collection = lib_api.get_library_collection_from_usage_key(collection_usage_key)

    doc = {Fields.id: collection.id}
    doc.update(searchable_doc_tags_for_collection(collection_usage_key.library_key, collection))
    _update_index_docs([doc])


def _get_user_orgs(request: Request) -> list[str]:
    """
    Get the org.short_names for the organizations that the requesting user has OrgStaffRole or OrgInstructorRole.
@@ -10,6 +10,7 @@ from django.utils.text import slugify
from django.core.exceptions import ObjectDoesNotExist
from opaque_keys.edx.keys import LearningContextKey, UsageKey
from openedx_learning.api import authoring as authoring_api
from opaque_keys.edx.locator import LibraryLocatorV2

from openedx.core.djangoapps.content.search.models import SearchAccess
from openedx.core.djangoapps.content_libraries import api as lib_api
@@ -339,6 +340,28 @@ def searchable_doc_collections(usage_key: UsageKey) -> dict:
    return doc


def searchable_doc_tags_for_collection(
    library_key: LibraryLocatorV2,
    collection,
) -> dict:
    """
    Generate a dictionary document suitable for ingestion into a search engine
    like Meilisearch or Elasticsearch, with the tags data for the given library collection.
    """
    doc = {
        Fields.id: collection.id,
    }

    collection_usage_key = lib_api.get_library_collection_usage_key(
        library_key,
        collection.key,
    )

    doc.update(_tags_for_content_object(collection_usage_key))

    return doc


def searchable_doc_for_course_block(block) -> dict:
    """
    Generate a dictionary document suitable for ingestion into a search engine
@@ -382,6 +405,7 @@ def searchable_doc_for_collection(collection) -> dict:
        doc.update({
            Fields.context_key: str(context_key),
            Fields.org: org,
            Fields.usage_key: str(lib_api.get_library_collection_usage_key(context_key, collection.key)),
        })
    except LearningPackage.contentlibrary.RelatedObjectDoesNotExist:
        log.warning(f"Related library not found for {collection}")
@@ -8,6 +8,7 @@ from django.db.models.signals import post_delete
from django.dispatch import receiver
from opaque_keys import InvalidKeyError
from opaque_keys.edx.keys import UsageKey
from opaque_keys.edx.locator import LibraryCollectionLocator
from openedx_events.content_authoring.data import (
    ContentLibraryData,
    ContentObjectChangedData,
@@ -32,7 +33,12 @@ from openedx_events.content_authoring.signals import (
from openedx.core.djangoapps.content.course_overviews.models import CourseOverview
from openedx.core.djangoapps.content.search.models import SearchAccess

from .api import only_if_meilisearch_enabled, upsert_block_collections_index_docs, upsert_block_tags_index_docs
from .api import (
    only_if_meilisearch_enabled,
    upsert_block_collections_index_docs,
    upsert_block_tags_index_docs,
    upsert_collection_tags_index_docs,
)
from .tasks import (
    delete_library_block_index_doc,
    delete_xblock_index_doc,
@@ -118,7 +124,9 @@ def library_block_updated_handler(**kwargs) -> None:
        log.error("Received null or incorrect data for event")
        return

    upsert_library_block_index_doc.delay(str(library_block_data.usage_key))
    # Update content library index synchronously to make sure that search index is updated before
    # the frontend invalidates/refetches results. This is only a single document update so is very fast.
    upsert_library_block_index_doc.apply(args=[str(library_block_data.usage_key)])


@receiver(LIBRARY_BLOCK_DELETED)
@@ -132,7 +140,9 @@ def library_block_deleted(**kwargs) -> None:
        log.error("Received null or incorrect data for event")
        return

    delete_library_block_index_doc.delay(str(library_block_data.usage_key))
    # Update content library index synchronously to make sure that search index is updated before
    # the frontend invalidates/refetches results. This is only a single document update so is very fast.
    delete_library_block_index_doc.apply(args=[str(library_block_data.usage_key)])


@receiver(CONTENT_LIBRARY_UPDATED)
@@ -191,12 +201,19 @@ def content_object_associations_changed_handler(**kwargs) -> None:
        # Check if the id is a valid course or library block key
        usage_key = UsageKey.from_string(str(content_object.object_id))
    except InvalidKeyError:
        log.error("Received invalid content object id")
        return
        try:
            # Check if the id is a valid library collection key
            usage_key = LibraryCollectionLocator.from_string(str(content_object.object_id))
        except InvalidKeyError:
            log.error("Received invalid content object id")
            return

    # This event's changes may contain both "tags" and "collections", but this will happen rarely, if ever.
    # So we allow a potential double "upsert" here.
    if not content_object.changes or "tags" in content_object.changes:
        upsert_block_tags_index_docs(usage_key)
        if isinstance(usage_key, LibraryCollectionLocator):
            upsert_collection_tags_index_docs(usage_key)
        else:
            upsert_block_tags_index_docs(usage_key)
    if not content_object.changes or "collections" in content_object.changes:
        upsert_block_collections_index_docs(usage_key)
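The handler above tries to parse the object id first as a block usage key and, on failure, falls back to a collection locator before dispatching to the matching upsert. A dependency-free sketch of that parse-with-fallback shape (the prefixes and `parse_object_id` helper here are simplified stand-ins for the opaque-keys grammar, not the real parsers):

```python
class InvalidKeyError(ValueError):
    """Stand-in for opaque_keys.InvalidKeyError in this sketch."""

def parse_object_id(object_id):
    # First interpretation wins: treat the id as a block usage key,
    # then fall back to a library collection locator, mirroring the
    # handler's nested try/except ladder.
    if object_id.startswith('block-v1:'):
        return ('block', object_id)
    if object_id.startswith('lib-collection:'):
        return ('collection', object_id)
    raise InvalidKeyError(object_id)
```

The returned kind then plays the role of the `isinstance(usage_key, LibraryCollectionLocator)` check when deciding which index-upsert path to take.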
@@ -185,9 +185,11 @@ class TestSearchApi(ModuleStoreTestCase):
            created_by=None,
            description="my collection description"
        )
        self.collection_usage_key = "lib-collection:org1:lib:MYCOL"
        self.collection_dict = {
            "id": self.collection.id,
            "block_id": self.collection.key,
            "usage_key": self.collection_usage_key,
            "type": "collection",
            "display_name": "my_collection",
            "description": "my collection description",
@@ -221,6 +223,8 @@ class TestSearchApi(ModuleStoreTestCase):
        doc_problem2 = copy.deepcopy(self.doc_problem2)
        doc_problem2["tags"] = {}
        doc_problem2["collections"] = {}
        doc_collection = copy.deepcopy(self.collection_dict)
        doc_collection["tags"] = {}

        api.rebuild_index()
        assert mock_meilisearch.return_value.index.return_value.add_documents.call_count == 3
@@ -228,7 +232,7 @@ class TestSearchApi(ModuleStoreTestCase):
            [
                call([doc_sequential, doc_vertical]),
                call([doc_problem1, doc_problem2]),
                call([self.collection_dict]),
                call([doc_collection]),
            ],
            any_order=True,
        )
@@ -459,6 +463,7 @@ class TestSearchApi(ModuleStoreTestCase):
        doc_collection1_created = {
            "id": collection1.id,
            "block_id": collection1.key,
            "usage_key": f"lib-collection:org1:lib:{collection1.key}",
            "type": "collection",
            "display_name": "Collection 1",
            "description": "First Collection",
@@ -473,6 +478,7 @@ class TestSearchApi(ModuleStoreTestCase):
        doc_collection2_created = {
            "id": collection2.id,
            "block_id": collection2.key,
            "usage_key": f"lib-collection:org1:lib:{collection2.key}",
            "type": "collection",
            "display_name": "Collection 2",
            "description": "Second Collection",
@@ -487,6 +493,7 @@ class TestSearchApi(ModuleStoreTestCase):
        doc_collection2_updated = {
            "id": collection2.id,
            "block_id": collection2.key,
            "usage_key": f"lib-collection:org1:lib:{collection2.key}",
            "type": "collection",
            "display_name": "Collection 2",
            "description": "Second Collection",
@@ -501,6 +508,7 @@ class TestSearchApi(ModuleStoreTestCase):
        doc_collection1_updated = {
            "id": collection1.id,
            "block_id": collection1.key,
            "usage_key": f"lib-collection:org1:lib:{collection1.key}",
            "type": "collection",
            "display_name": "Collection 1",
            "description": "First Collection",
@@ -576,3 +584,34 @@ class TestSearchApi(ModuleStoreTestCase):
        mock_meilisearch.return_value.index.return_value.delete_documents.assert_called_once_with(
            filter=delete_filter
        )

    @override_settings(MEILISEARCH_ENABLED=True)
    def test_index_tags_in_collections(self, mock_meilisearch):
        # Tag collection
        tagging_api.tag_object(self.collection_usage_key, self.taxonomyA, ["one", "two"])
        tagging_api.tag_object(self.collection_usage_key, self.taxonomyB, ["three", "four"])

        # Build expected docs with tags at each stage
        doc_collection_with_tags1 = {
            "id": self.collection.id,
            "tags": {
'taxonomy': ['A'],
|
||||
'level0': ['A > one', 'A > two']
|
||||
}
|
||||
}
|
||||
doc_collection_with_tags2 = {
|
||||
"id": self.collection.id,
|
||||
"tags": {
|
||||
'taxonomy': ['A', 'B'],
|
||||
'level0': ['A > one', 'A > two', 'B > four', 'B > three']
|
||||
}
|
||||
}
|
||||
|
||||
assert mock_meilisearch.return_value.index.return_value.update_documents.call_count == 2
|
||||
mock_meilisearch.return_value.index.return_value.update_documents.assert_has_calls(
|
||||
[
|
||||
call([doc_collection_with_tags1]),
|
||||
call([doc_collection_with_tags2]),
|
||||
],
|
||||
any_order=True,
|
||||
)
|
||||
|
||||
@@ -19,6 +19,7 @@ try:
from ..documents import (
searchable_doc_for_course_block,
searchable_doc_tags,
searchable_doc_tags_for_collection,
searchable_doc_collections,
searchable_doc_for_collection,
searchable_doc_for_library_block,
@@ -27,6 +28,7 @@ try:
except RuntimeError:
searchable_doc_for_course_block = lambda x: x
searchable_doc_tags = lambda x: x
searchable_doc_tags_for_collection = lambda x: x
searchable_doc_for_collection = lambda x: x
searchable_doc_for_library_block = lambda x: x
SearchAccess = {}
@@ -76,6 +78,7 @@ class StudioDocumentsTest(SharedModuleStoreTestCase):
created_by=None,
description="my toy collection description"
)
cls.collection_usage_key = "lib-collection:edX:2012_Fall:TOY_COLLECTION"
cls.library_block = library_api.create_library_block(
cls.library.key,
"html",
@@ -109,6 +112,7 @@ class StudioDocumentsTest(SharedModuleStoreTestCase):
tagging_api.tag_object(str(cls.html_block_key), cls.subject_tags, tags=["Chinese", "Jump Links"])
tagging_api.tag_object(str(cls.html_block_key), cls.difficulty_tags, tags=["Normal"])
tagging_api.tag_object(str(cls.library_block.usage_key), cls.difficulty_tags, tags=["Normal"])
tagging_api.tag_object(cls.collection_usage_key, cls.difficulty_tags, tags=["Normal"])

@property
def toy_course_access_id(self):
@@ -296,9 +300,12 @@ class StudioDocumentsTest(SharedModuleStoreTestCase):

def test_collection_with_library(self):
doc = searchable_doc_for_collection(self.collection)
doc.update(searchable_doc_tags_for_collection(self.library.key, self.collection))

assert doc == {
"id": self.collection.id,
"block_id": self.collection.key,
"usage_key": self.collection_usage_key,
"type": "collection",
"org": "edX",
"display_name": "Toy Collection",
@@ -309,6 +316,10 @@ class StudioDocumentsTest(SharedModuleStoreTestCase):
"breadcrumbs": [{"display_name": "some content_library"}],
"created": 1680674828.0,
"modified": 1680674828.0,
'tags': {
'taxonomy': ['Difficulty'],
'level0': ['Difficulty > Normal']
}
}

def test_collection_with_no_library(self):

@@ -73,7 +73,8 @@ from opaque_keys.edx.keys import BlockTypeKey, UsageKey, UsageKeyV2
from opaque_keys.edx.locator import (
LibraryLocatorV2,
LibraryUsageLocatorV2,
LibraryLocator as LibraryLocatorV1
LibraryLocator as LibraryLocatorV1,
LibraryCollectionLocator,
)
from opaque_keys import InvalidKeyError
from openedx_events.content_authoring.data import (
@@ -218,8 +219,12 @@ class LibraryXBlockMetadata:
modified = attr.ib(type=datetime)
display_name = attr.ib("")
last_published = attr.ib(default=None, type=datetime)
last_draft_created = attr.ib(default=None, type=datetime)
last_draft_created_by = attr.ib("")
published_by = attr.ib("")
has_unpublished_changes = attr.ib(False)
tags_count = attr.ib(0)
created = attr.ib(default=None, type=datetime)

@classmethod
def from_component(cls, library_key, component):
@@ -228,6 +233,14 @@ class LibraryXBlockMetadata:
"""
last_publish_log = component.versioning.last_publish_log

published_by = None
if last_publish_log and last_publish_log.published_by:
published_by = last_publish_log.published_by.username

draft = component.versioning.draft
last_draft_created = draft.created if draft else None
last_draft_created_by = draft.publishable_entity_version.created_by if draft else None

return cls(
usage_key=LibraryUsageLocatorV2(
library_key,
@@ -238,7 +251,10 @@ class LibraryXBlockMetadata:
created=component.created,
modified=component.versioning.draft.created,
last_published=None if last_publish_log is None else last_publish_log.published_at,
has_unpublished_changes=component.versioning.has_unpublished_changes
published_by=published_by,
last_draft_created=last_draft_created,
last_draft_created_by=last_draft_created_by,
has_unpublished_changes=component.versioning.has_unpublished_changes,
)
@@ -1247,6 +1263,43 @@ def update_library_collection_components(
return collection


def get_library_collection_usage_key(
library_key: LibraryLocatorV2,
collection_key: str,
# As an optimization, callers may pass in a pre-fetched ContentLibrary instance
content_library: ContentLibrary | None = None,
) -> LibraryCollectionLocator:
"""
Returns the LibraryCollectionLocator associated to a collection
"""
if not content_library:
content_library = ContentLibrary.objects.get_by_key(library_key) # type: ignore[attr-defined]
assert content_library
assert content_library.learning_package_id
assert content_library.library_key == library_key

return LibraryCollectionLocator(library_key, collection_key)


def get_library_collection_from_usage_key(
collection_usage_key: LibraryCollectionLocator,
) -> Collection:
"""
Return a Collection using the LibraryCollectionLocator
"""

library_key = collection_usage_key.library_key
collection_key = collection_usage_key.collection_id
content_library = ContentLibrary.objects.get_by_key(library_key) # type: ignore[attr-defined]
try:
return authoring_api.get_collection(
content_library.learning_package_id,
collection_key,
)
except Collection.DoesNotExist as exc:
raise ContentLibraryCollectionNotFound from exc


# V1/V2 Compatibility Helpers
# (Should be removed as part of
# https://github.com/openedx/edx-platform/issues/32457)

@@ -148,7 +148,12 @@ class LibraryXBlockMetadataSerializer(serializers.Serializer):

block_type = serializers.CharField(source="usage_key.block_type")
display_name = serializers.CharField(read_only=True)
last_published = serializers.DateTimeField(format=DATETIME_FORMAT, read_only=True)
published_by = serializers.CharField(read_only=True)
last_draft_created = serializers.DateTimeField(format=DATETIME_FORMAT, read_only=True)
last_draft_created_by = serializers.CharField(read_only=True)
has_unpublished_changes = serializers.BooleanField(read_only=True)
created = serializers.DateTimeField(format=DATETIME_FORMAT, read_only=True)

# When creating a new XBlock in a library, the slug becomes the ID part of
# the definition key and usage key:
@@ -5,9 +5,11 @@ from unittest.mock import Mock, patch
from unittest import skip

import ddt
from datetime import datetime, timezone
from uuid import uuid4
from django.contrib.auth.models import Group
from django.test.client import Client
from freezegun import freeze_time
from organizations.models import Organization
from rest_framework.test import APITestCase

@@ -270,12 +272,18 @@ class ContentLibrariesTestCase(ContentLibrariesRestApiTest, OpenEdxEventsTestMix
assert self._get_library_blocks(lib_id)['results'] == []

# Add a 'problem' XBlock to the library:
block_data = self._add_block_to_library(lib_id, "problem", "ࠒröblæm1")
create_date = datetime(2024, 6, 6, 6, 6, 6, tzinfo=timezone.utc)
with freeze_time(create_date):
block_data = self._add_block_to_library(lib_id, "problem", "ࠒröblæm1")
self.assertDictContainsEntries(block_data, {
"id": "lb:CL-TEST:téstlꜟط:problem:ࠒröblæm1",
"display_name": "Blank Problem",
"block_type": "problem",
"has_unpublished_changes": True,
"last_published": None,
"published_by": None,
"last_draft_created": create_date.isoformat().replace('+00:00', 'Z'),
"last_draft_created_by": "Bob",
})
block_id = block_data["id"]
# Confirm that the result contains a definition key, but don't check its value,
@@ -287,10 +295,14 @@ class ContentLibrariesTestCase(ContentLibrariesRestApiTest, OpenEdxEventsTestMix
assert self._get_library(lib_id)['has_unpublished_changes'] is True

# Publish the changes:
self._commit_library_changes(lib_id)
publish_date = datetime(2024, 7, 7, 7, 7, 7, tzinfo=timezone.utc)
with freeze_time(publish_date):
self._commit_library_changes(lib_id)
assert self._get_library(lib_id)['has_unpublished_changes'] is False
# And now the block information should also show that block has no unpublished changes:
block_data["has_unpublished_changes"] = False
block_data["last_published"] = publish_date.isoformat().replace('+00:00', 'Z')
block_data["published_by"] = "Bob"
self.assertDictContainsEntries(self._get_library_block(block_id), block_data)
assert self._get_library_blocks(lib_id)['results'] == [block_data]

@@ -311,13 +323,16 @@ class ContentLibrariesTestCase(ContentLibrariesRestApiTest, OpenEdxEventsTestMix
</multiplechoiceresponse>
</problem>
""".strip()
self._set_library_block_olx(block_id, new_olx)
update_date = datetime(2024, 8, 8, 8, 8, 8, tzinfo=timezone.utc)
with freeze_time(update_date):
self._set_library_block_olx(block_id, new_olx)
# now reading it back, we should get that exact OLX (no change to whitespace etc.):
assert self._get_library_block_olx(block_id) == new_olx
# And the display name and "unpublished changes" status of the block should be updated:
self.assertDictContainsEntries(self._get_library_block(block_id), {
"display_name": "New Multi Choice Question",
"has_unpublished_changes": True,
"last_draft_created": update_date.isoformat().replace('+00:00', 'Z')
})

# Now view the XBlock's student_view (including draft changes):
@@ -358,12 +373,18 @@ class ContentLibrariesTestCase(ContentLibrariesRestApiTest, OpenEdxEventsTestMix
assert self._get_library_blocks(lib_id)['results'] == []

# Add a 'html' XBlock to the library:
block_data = self._add_block_to_library(lib_id, "html", "html1")
create_date = datetime(2024, 6, 6, 6, 6, 6, tzinfo=timezone.utc)
with freeze_time(create_date):
block_data = self._add_block_to_library(lib_id, "html", "html1")
self.assertDictContainsEntries(block_data, {
"id": "lb:CL-TEST:testlib2:html:html1",
"display_name": "Text",
"block_type": "html",
"has_unpublished_changes": True,
"last_published": None,
"published_by": None,
"last_draft_created": create_date.isoformat().replace('+00:00', 'Z'),
"last_draft_created_by": "Bob",
})
block_id = block_data["id"]

@@ -372,10 +393,14 @@ class ContentLibrariesTestCase(ContentLibrariesRestApiTest, OpenEdxEventsTestMix
assert self._get_library(lib_id)['has_unpublished_changes'] is True

# Publish the changes:
self._commit_library_changes(lib_id)
publish_date = datetime(2024, 7, 7, 7, 7, 7, tzinfo=timezone.utc)
with freeze_time(publish_date):
self._commit_library_changes(lib_id)
assert self._get_library(lib_id)['has_unpublished_changes'] is False
# And now the block information should also show that block has no unpublished changes:
block_data["has_unpublished_changes"] = False
block_data["last_published"] = publish_date.isoformat().replace('+00:00', 'Z')
block_data["published_by"] = "Bob"
self.assertDictContainsEntries(self._get_library_block(block_id), block_data)
assert self._get_library_blocks(lib_id)['results'] == [block_data]

@@ -383,13 +408,17 @@ class ContentLibrariesTestCase(ContentLibrariesRestApiTest, OpenEdxEventsTestMix
orig_olx = self._get_library_block_olx(block_id)
assert '<html' in orig_olx
new_olx = "<html><b>Hello world!</b></html>"
self._set_library_block_olx(block_id, new_olx)

update_date = datetime(2024, 8, 8, 8, 8, 8, tzinfo=timezone.utc)
with freeze_time(update_date):
self._set_library_block_olx(block_id, new_olx)
# now reading it back, we should get that exact OLX (no change to whitespace etc.):
assert self._get_library_block_olx(block_id) == new_olx
# And the display name and "unpublished changes" status of the block should be updated:
self.assertDictContainsEntries(self._get_library_block(block_id), {
"display_name": "Text",
"has_unpublished_changes": True,
"last_draft_created": update_date.isoformat().replace('+00:00', 'Z')
})

# Now view the XBlock's studio view (including draft changes):
@@ -1019,6 +1048,7 @@ class ContentLibrariesTestCase(ContentLibrariesRestApiTest, OpenEdxEventsTestMix
# the the block in the clipboard
self.assertDictContainsEntries(self._get_library_block(paste_data["id"]), {
**block_data,
"last_draft_created_by": None,
"id": f"lb:CL-TEST:test_lib_paste_clipboard:problem:{pasted_block_id}",
})
@@ -4,14 +4,15 @@ Test the objecttag_export_helpers module
import time
from unittest.mock import patch

from openedx_tagging.core.tagging.models import ObjectTag
from organizations.models import Organization

from openedx.core.djangoapps.content_libraries import api as library_api
from xmodule.modulestore.tests.django_utils import TEST_DATA_SPLIT_MODULESTORE, ModuleStoreTestCase
from xmodule.modulestore.tests.factories import BlockFactory, CourseFactory

from .. import api
from ..helpers.objecttag_export_helpers import TaggedContent, build_object_tree_with_objecttags, iterate_with_level
from openedx_tagging.core.tagging.models import ObjectTag
from organizations.models import Organization


class TestGetAllObjectTagsMixin:
@@ -207,6 +207,26 @@ COURSE_NOTIFICATION_TYPES = {
'filters': [FILTER_AUDIT_EXPIRED_USERS_WITH_NO_ROLE],
'visible_to': [CourseStaffRole.ROLE, CourseInstructorRole.ROLE]
},
'ora_grade_assigned': {
'notification_app': 'grading',
'name': 'ora_grade_assigned',
'is_core': False,
'info': '',
'web': False,
'email': False,
'push': False,
'email_cadence': EmailCadence.DAILY,
'non_editable': [],
'content_template': _('<{p}>You have received {points_earned} out of {points_possible} on your assessment: '
'<{strong}>{ora_name}</{strong}></{p}>'),
'content_context': {
'ora_name': 'Name of ORA in course',
'points_earned': 'Points earned',
'points_possible': 'Points possible',
},
'email_template': '',
'filters': [FILTER_AUDIT_EXPIRED_USERS_WITH_NO_ROLE],
},
}

COURSE_NOTIFICATION_APPS = {

@@ -28,3 +28,13 @@ ENABLE_NOTIFICATIONS = CourseWaffleFlag(f'{WAFFLE_NAMESPACE}.enable_notification
# .. toggle_warning: When the flag is ON, Email Notifications feature is enabled.
# .. toggle_tickets: INF-1259
ENABLE_EMAIL_NOTIFICATIONS = WaffleFlag(f'{WAFFLE_NAMESPACE}.enable_email_notifications', __name__)

# .. toggle_name: notifications.enable_ora_grade_notifications
# .. toggle_implementation: CourseWaffleFlag
# .. toggle_default: False
# .. toggle_description: Waffle flag to enable ORA grade notifications
# .. toggle_use_cases: temporary, open_edx
# .. toggle_creation_date: 2024-09-10
# .. toggle_target_removal_date: 2024-10-10
# .. toggle_tickets: INF-1304
ENABLE_ORA_GRADE_NOTIFICATION = CourseWaffleFlag(f"{WAFFLE_NAMESPACE}.enable_ora_grade_notifications", __name__)
@@ -21,7 +21,7 @@ from openedx.core.djangoapps.notifications.audience_filters import (
ForumRoleAudienceFilter,
TeamAudienceFilter
)
from openedx.core.djangoapps.notifications.config.waffle import ENABLE_NOTIFICATIONS
from openedx.core.djangoapps.notifications.config.waffle import ENABLE_NOTIFICATIONS, ENABLE_ORA_GRADE_NOTIFICATION
from openedx.core.djangoapps.notifications.models import CourseNotificationPreference

log = logging.getLogger(__name__)
@@ -72,6 +72,12 @@ def generate_user_notifications(signal, sender, notification_data, metadata, **k
"""
Watches for USER_NOTIFICATION_REQUESTED signal and calls send_web_notifications task
"""
if (
notification_data.notification_type == 'ora_grade_assigned'
and not ENABLE_ORA_GRADE_NOTIFICATION.is_enabled(notification_data.course_key)
):
return

from openedx.core.djangoapps.notifications.tasks import send_notifications
notification_data = notification_data.__dict__
notification_data['course_key'] = str(notification_data['course_key'])

@@ -23,7 +23,7 @@ NOTIFICATION_CHANNELS = ['web', 'push', 'email']
ADDITIONAL_NOTIFICATION_CHANNEL_SETTINGS = ['email_cadence']

# Update this version when there is a change to any course specific notification type or app.
COURSE_NOTIFICATION_CONFIG_VERSION = 11
COURSE_NOTIFICATION_CONFIG_VERSION = 12


def get_course_notification_preference_config():
@@ -313,7 +313,14 @@ class UserNotificationPreferenceAPITest(ModuleStoreTestCase):
'push': True,
'email_cadence': 'Daily',
'info': 'Notifications for submission grading.'
}
},
'ora_grade_assigned': {
'web': False,
'email': False,
'push': False,
'email_cadence': 'Daily',
'info': ''
},
},
'non_editable': {}
}

@@ -36,9 +36,5 @@ COURSE_GRADE_NOW_PASSED = Signal()
# ]
COURSE_GRADE_NOW_FAILED = Signal()

# Signal that indicates that a user has become verified for certificate purposes
# providing_args=['user']
LEARNER_NOW_VERIFIED = Signal()

# providing_args=['user']
USER_ACCOUNT_ACTIVATED = Signal() # Signal indicating email verification
@@ -93,7 +93,7 @@ libsass==0.10.0
click==8.1.6

# pinning this version to avoid updates while the library is being developed
openedx-learning==0.11.4
openedx-learning==0.11.5

# Open AI version 1.0.0 dropped support for openai.ChatCompletion which is currently in use in enterprise.
openai<=0.28.1
@@ -142,7 +142,3 @@ django-storages<1.14.4
# We are pinning this until after all the smaller migrations get handled and then we can migrate this all at once.
# Ticket to unpin: https://github.com/edx/edx-arch-experiments/issues/760
social-auth-app-django<=5.4.1

# Temporary pin as to prevent a new version of edx-name-affirmation from being merged before we modify it to work
# properly along with work in this PR: https://github.com/openedx/edx-platform/pull/35468
edx-name-affirmation==2.4.0

@@ -482,10 +482,8 @@ edx-i18n-tools==1.5.0
# ora2
edx-milestones==0.6.0
# via -r requirements/edx/kernel.in
edx-name-affirmation==2.4.0
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/kernel.in
edx-name-affirmation==2.4.1
# via -r requirements/edx/kernel.in
edx-opaque-keys[django]==2.11.0
# via
# -r requirements/edx/kernel.in
@@ -826,7 +824,7 @@ openedx-filters==1.9.0
# -r requirements/edx/kernel.in
# lti-consumer-xblock
# ora2
openedx-learning==0.11.4
openedx-learning==0.11.5
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/kernel.in
@@ -836,7 +834,7 @@ optimizely-sdk==4.1.1
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/bundled.in
ora2==6.11.2
ora2==6.12.0
# via -r requirements/edx/bundled.in
packaging==24.1
# via

@@ -766,9 +766,8 @@ edx-milestones==0.6.0
# via
# -r requirements/edx/doc.txt
# -r requirements/edx/testing.txt
edx-name-affirmation==2.4.0
edx-name-affirmation==2.4.1
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/doc.txt
# -r requirements/edx/testing.txt
edx-opaque-keys[django]==2.11.0
@@ -1374,7 +1373,7 @@ openedx-filters==1.9.0
# -r requirements/edx/testing.txt
# lti-consumer-xblock
# ora2
openedx-learning==0.11.4
openedx-learning==0.11.5
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/doc.txt
@@ -1388,7 +1387,7 @@ optimizely-sdk==4.1.1
# -c requirements/edx/../constraints.txt
# -r requirements/edx/doc.txt
# -r requirements/edx/testing.txt
ora2==6.11.2
ora2==6.12.0
# via
# -r requirements/edx/doc.txt
# -r requirements/edx/testing.txt
@@ -562,10 +562,8 @@ edx-i18n-tools==1.5.0
# ora2
edx-milestones==0.6.0
# via -r requirements/edx/base.txt
edx-name-affirmation==2.4.0
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/base.txt
edx-name-affirmation==2.4.1
# via -r requirements/edx/base.txt
edx-opaque-keys[django]==2.11.0
# via
# -r requirements/edx/base.txt
@@ -985,7 +983,7 @@ openedx-filters==1.9.0
# -r requirements/edx/base.txt
# lti-consumer-xblock
# ora2
openedx-learning==0.11.4
openedx-learning==0.11.5
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/base.txt
@@ -995,7 +993,7 @@ optimizely-sdk==4.1.1
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/base.txt
ora2==6.11.2
ora2==6.12.0
# via -r requirements/edx/base.txt
packaging==24.1
# via

@@ -588,10 +588,8 @@ edx-lint==5.3.7
# via -r requirements/edx/testing.in
edx-milestones==0.6.0
# via -r requirements/edx/base.txt
edx-name-affirmation==2.4.0
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/base.txt
edx-name-affirmation==2.4.1
# via -r requirements/edx/base.txt
edx-opaque-keys[django]==2.11.0
# via
# -r requirements/edx/base.txt
@@ -1036,7 +1034,7 @@ openedx-filters==1.9.0
# -r requirements/edx/base.txt
# lti-consumer-xblock
# ora2
openedx-learning==0.11.4
openedx-learning==0.11.5
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/base.txt
@@ -1046,7 +1044,7 @@ optimizely-sdk==4.1.1
# via
# -c requirements/edx/../constraints.txt
# -r requirements/edx/base.txt
ora2==6.11.2
ora2==6.12.0
# via -r requirements/edx/base.txt
packaging==24.1
# via
@@ -5,9 +5,9 @@ Evolving the library_content block schema
Status
******

**Provisional**
**Replaced** by the `Upstream-Downstream ADR`_.

Subject to change due to implementation learnings and stakeholder feedback.
.. _Upstream-Downstream ADR: https://docs/decisions/0020-upstream-block.rst

Context
*******