edx-platform/doc/testing.md

Testing

Overview

We maintain three kinds of tests: unit tests, integration tests, and acceptance tests.

Unit Tests

  • Each test case should be concise: setup, execute, check, and teardown. If you find yourself writing tests with many steps, consider refactoring the unit under test into smaller units, and then testing those individually.

  • As a rule of thumb, your unit tests should cover every code branch.

  • Mock or patch external dependencies. We use the voidspace mock library.

  • We unit test Python code (using unittest) and Javascript code (using Jasmine).
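As a sketch of these guidelines, a unit test with the setup/execute/check phases and a mocked external dependency might look like the following. The `convert` function and all names here are invented for illustration, not actual edx-platform code, and `unittest.mock` (the standard-library descendant of voidspace mock) stands in for the mock library:

```python
# Illustrative unit test: setup, execute, check, with the external
# dependency mocked out. Names are hypothetical, not edx-platform code.
import unittest
from unittest import mock


def convert(amount, fetch_rate):
    """Unit under test: convert an amount using an external rate service."""
    return amount * fetch_rate("USD")


class ConvertTest(unittest.TestCase):
    def test_convert_uses_rate(self):
        # Setup: mock the external service instead of hitting the network.
        rate = mock.Mock(return_value=2.0)
        # Execute
        result = convert(3, rate)
        # Check: the result, and that the dependency was used as expected.
        self.assertEqual(result, 6.0)
        rate.assert_called_once_with("USD")
```

Run a file like this with `python -m unittest`; teardown is implicit here because the mock is local to the test method.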

Integration Tests

  • Test several units at the same time. Note that you can still mock or patch dependencies that are not under test! For example, you might test that LoncapaProblem, NumericalResponse, and CorrectMap in the capa package work together, while still mocking out template rendering.

  • Use integration tests to ensure that units are hooked up correctly. You do not need to test every possible input--that's what unit tests are for. Instead, focus on testing the "happy path" to verify that the components work together correctly.

  • Many of our tests use the Django test client to simulate HTTP requests to the server.
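To make the mock-only-what-is-not-under-test idea concrete, here is a hedged sketch of an integration-style test: two toy units exercised together, with only the template renderer patched out. The functions are invented for illustration and are not capa code:

```python
# Sketch of an integration-style test: exercise two units together while
# patching the one dependency (template rendering) that is not under test.
# All names here are hypothetical, not edx-platform code.
import unittest
from unittest import mock


def grade(answer, key):
    """First toy unit: grade an answer against a key."""
    return "correct" if answer == key else "incorrect"


def render_feedback(answer, key, render):
    """Second toy unit: grade, then hand the result to a template renderer."""
    status = grade(answer, key)
    return render("feedback.html", {"status": status})


class FeedbackIntegrationTest(unittest.TestCase):
    def test_happy_path(self):
        # Patch only the renderer; grade() and render_feedback() run for real.
        render = mock.Mock(return_value="<p>correct</p>")
        html = render_feedback("42", "42", render)
        self.assertEqual(html, "<p>correct</p>")
        render.assert_called_once_with("feedback.html", {"status": "correct"})
```

Note the test only follows the happy path; the edge cases of `grade` belong in its own unit tests.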

UI Acceptance Tests

  • Use these to test that major program features are working correctly.

  • We use lettuce to write BDD-style tests. Most of these tests simulate user interactions through the browser using splinter.
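For flavor, a BDD-style scenario of the kind lettuce consumes might look like the following. The feature and step text are invented for illustration, not an actual edx-platform feature file:

```gherkin
Feature: Answer a problem
    In order to check my understanding
    As a student
    I want to submit an answer and see feedback

    Scenario: Submitting a correct answer
        Given I am viewing a numerical response problem
        When I enter the correct answer and press "Check"
        Then I should see a "correct" indicator
```

Each `Given`/`When`/`Then` line is matched to a Python step definition, which typically drives the browser through splinter.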

Overall, you want to write the tests that maximize coverage while minimizing maintenance. In practice, this usually means investing heavily in unit tests, which tend to be the most robust to changes in the code base.

Test Pyramid

[Figure: the test pyramid, showing the relative number of unit tests, integration tests, and acceptance tests.] Most of our tests are unit tests or integration tests.

Test Locations

  • Python unit and integration tests: Located in subpackages called tests. For example, the tests for the capa package are located in common/lib/capa/capa/tests.

  • Javascript unit tests: Located in spec folders. For example, common/lib/xmodule/xmodule/js/spec and {cms,lms}/static/coffee/spec
    For consistency, you should use the same directory structure for implementation and test. For example, the test for src/views/module.coffee should be written in spec/views/module_spec.coffee.

  • UI acceptance tests:

    • Set up and helper methods: common/djangoapps/terrain
    • Tests: located in features subpackage within a Django app. For example: lms/djangoapps/courseware/features

Factories

Many tests delegate set-up to a "factory" class. For example, there are factories for creating courses, problems, and users. This keeps set-up logic out of individual tests.

Factories are often implemented using the FactoryBoy library.

In general, factories should be located close to the code they use. For example, the factory for creating problem XML definitions is located in common/lib/capa/capa/tests/response_xml_factory.py because the capa package handles problem XML.
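As an illustration of the pattern (hand-rolled here rather than using FactoryBoy, with invented names), a factory supplies sensible defaults and lets each test override only the fields it cares about:

```python
# Hand-rolled sketch of the factory pattern (the real tests use FactoryBoy;
# the User model and field names here are hypothetical).
import itertools

_counter = itertools.count(1)


class User:
    def __init__(self, username, email):
        self.username = username
        self.email = email


def user_factory(**overrides):
    """Build a User with sensible defaults; tests override only what matters."""
    n = next(_counter)
    username = overrides.get("username", "user%d" % n)
    email = overrides.get("email", "%s@example.com" % username)
    return User(username=username, email=email)
```

For example, `user_factory(username="alice")` yields a User whose email defaults to `alice@example.com`, so a test about usernames never has to mention emails at all.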

Running Tests

Before running tests, ensure that you have all the dependencies. You can install dependencies using:

pip install -r requirements.txt

Running Python Unit tests

We use nose through the django-nose plugin to run the test suite.

You can run tests using rake commands. For example,

rake test

runs all the tests. It also runs collectstatic, which prepares the static files used by the site (for example, compiling Coffeescript to Javascript).

You can also run the tests without collectstatic, which tends to be faster:

rake fasttest_lms

or

rake fasttest_cms

xmodule can be tested independently, with this:

rake test_common/lib/xmodule

To run a single django test class:

django-admin.py test --settings=lms.envs.test --pythonpath=. lms/djangoapps/courseware/tests/tests.py:TestViewAuth

To run a single django test:

django-admin.py test --settings=lms.envs.test --pythonpath=. lms/djangoapps/courseware/tests/tests.py:TestViewAuth.test_dark_launch

To run a single nose test file:

nosetests common/lib/xmodule/xmodule/tests/test_stringify.py

To run a single nose test:

nosetests common/lib/xmodule/xmodule/tests/test_stringify.py:test_stringify

Very handy: if you uncomment the pdb=1 line in setup.cfg, nose will drop you into pdb on error. This lets you move up and down the stack and inspect variable values. Check out the pdb documentation for details.
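Assuming the standard nose configuration layout, the relevant setup.cfg fragment looks roughly like this once uncommented:

```ini
[nosetests]
# Drop into the pdb debugger on errors:
pdb=1
```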

Running Javascript Unit Tests

These commands start a development server with Jasmine testing enabled, and launch your default browser pointing to those tests:

rake browse_jasmine_{lms,cms}

To run the tests headless, you must install phantomjs, then run:

rake phantomjs_jasmine_{lms,cms}

If the phantomjs binary is not on your PATH, set the PHANTOMJS_PATH environment variable to point to it:

PHANTOMJS_PATH=/path/to/phantomjs rake phantomjs_jasmine_{lms,cms}

Once you have run the rake command, your browser should open to http://localhost/_jasmine/, which displays the test results.

Troubleshooting: If you get an error message while running the rake task, try running bundle install to install the required ruby gems.

Running Acceptance Tests

We use Lettuce for acceptance testing. Most of our tests use Splinter to simulate UI browser interactions. Splinter, in turn, uses Selenium to control the Chrome browser.

Prerequisite: You must have ChromeDriver installed to run the tests in Chrome. The tests are confirmed to run with Chrome (not Chromium) version 26.0.0.1410.63 with ChromeDriver version r195636.

To run all the acceptance tests:

rake test_acceptance_lms
rake test_acceptance_cms

To test only a specific feature:

rake test_acceptance_lms[lms/djangoapps/courseware/features/problems.feature]

To start the debugger on failure, add the --pdb option:

rake test_acceptance_lms["lms/djangoapps/courseware/features/problems.feature --pdb"]

To run tests faster by not collecting static files, you can use rake fasttest_acceptance_lms and rake fasttest_acceptance_cms.

Troubleshooting: If you get an error message that says something about harvest not being a command, you probably are missing a requirement. Try running:

pip install -r requirements.txt

Note: The acceptance tests cannot currently run in parallel.

Viewing Test Coverage

We currently collect test coverage information for Python unit/integration tests.

To view test coverage:

  1. Run the test suite:

     rake test
    
  2. Generate reports:

     rake coverage:html
    
  3. HTML reports are located in the reports folder.

Testing using queue servers

When testing problems that use a queue server on AWS (e.g. sandbox-xqueue.edx.org), you'll need to run your server on your public IP, like so:

django-admin.py runserver --settings=lms.envs.dev --pythonpath=. 0.0.0.0:8000

When you connect to the LMS, you need to use the public IP. Use ifconfig to figure out the address, and connect to, e.g., http://18.3.4.5:8000/