+
+
@@ -218,7 +218,7 @@ describe 'MarkdownEditingDescriptor', ->
One of the main elements that goes into a good multiple choice question is the existence of good distractors. That is, each of the alternate responses presented to the student should be the result of a plausible mistake that a student might make.
- What Apple device competed with the portable CD player?
+ >>What Apple device competed with the portable CD player?<<
( ) The iPad
( ) Napster
(x) The iPod
@@ -230,31 +230,26 @@ describe 'MarkdownEditingDescriptor', ->
The release of the iPod allowed consumers to carry their entire music library with them in a format that did not rely on fragile and energy-intensive spinning disks.
[Explanation]
""")
- expect(data).toEqual("""
-
A multiple choice problem presents radio buttons for student input. Students can only select a single option presented. Multiple Choice questions have been the subject of many areas of research due to the early invention and adoption of bubble sheets.
-
-
One of the main elements that goes into a good multiple choice question is the existence of good distractors. That is, each of the alternate responses presented to the student should be the result of a plausible mistake that a student might make.
-
-
What Apple device competed with the portable CD player?
-
-
- The iPad
- Napster
- The iPod
- The vegetable peeler
- Android
- The Beatles
-
-
-
-
-
-
Explanation
-
-
The release of the iPod allowed consumers to carry their entire music library with them in a format that did not rely on fragile and energy-intensive spinning disks.
-
-
-
+ expect(data).toXMLEqual("""
+
+
A multiple choice problem presents radio buttons for student input. Students can only select a single option presented. Multiple Choice questions have been the subject of many areas of research due to the early invention and adoption of bubble sheets.
+
One of the main elements that goes into a good multiple choice question is the existence of good distractors. That is, each of the alternate responses presented to the student should be the result of a plausible mistake that a student might make.
+
+
+ The iPad
+ Napster
+ The iPod
+ The vegetable peeler
+ Android
+ The Beatles
+
+
+
+
Explanation
+
The release of the iPod allowed consumers to carry their entire music library with them in a format that did not rely on fragile and energy-intensive spinning disks.
+
+
+ """)
it 'converts multiple choice shuffle to xml', ->
data = MarkdownEditingDescriptor.markdownToXml("""A multiple choice problem presents radio buttons for student input. Students can only select a single option presented. Multiple Choice questions have been the subject of many areas of research due to the early invention and adoption of bubble sheets.
@@ -273,31 +268,27 @@ describe 'MarkdownEditingDescriptor', ->
The release of the iPod allowed consumers to carry their entire music library with them in a format that did not rely on fragile and energy-intensive spinning disks.
[Explanation]
""")
- expect(data).toEqual("""
-
A multiple choice problem presents radio buttons for student input. Students can only select a single option presented. Multiple Choice questions have been the subject of many areas of research due to the early invention and adoption of bubble sheets.
-
-
One of the main elements that goes into a good multiple choice question is the existence of good distractors. That is, each of the alternate responses presented to the student should be the result of a plausible mistake that a student might make.
-
-
What Apple device competed with the portable CD player?
-
-
- The iPad
- Napster
- The iPod
- The vegetable peeler
- Android
- The Beatles
-
-
-
-
-
-
Explanation
-
-
The release of the iPod allowed consumers to carry their entire music library with them in a format that did not rely on fragile and energy-intensive spinning disks.
-
-
-
+ expect(data).toXMLEqual("""
+
+
+
A multiple choice problem presents radio buttons for student input. Students can only select a single option presented. Multiple Choice questions have been the subject of many areas of research due to the early invention and adoption of bubble sheets.
+
One of the main elements that goes into a good multiple choice question is the existence of good distractors. That is, each of the alternate responses presented to the student should be the result of a plausible mistake that a student might make.
+
What Apple device competed with the portable CD player?
+
+ The iPad
+ Napster
+ The iPod
+ The vegetable peeler
+ Android
+ The Beatles
+
+
+
+
Explanation
+
The release of the iPod allowed consumers to carry their entire music library with them in a format that did not rely on fragile and energy-intensive spinning disks.
+
+
+ """)
it 'converts a series of multiplechoice to xml', ->
@@ -367,25 +358,20 @@ describe 'MarkdownEditingDescriptor', ->
Multiple Choice also allows students to select from a variety of pre-written responses, although the format makes it easier for students to read very long response options. Optionresponse also differs slightly because students are more likely to think of an answer and then search for it rather than relying purely on recognition to answer the question.
[Explanation]
""")
- expect(data).toEqual("""
-
OptionResponse gives a limited set of options for students to respond with, and presents those options in a format that encourages them to search for a specific answer rather than being immediately presented with options from which to recognize the correct answer.
-
-
The answer options and the identification of the correct answer is defined in the optioninput tag.
-
-
Translation between Option Response and __________ is extremely straightforward:
-
-
-
-
-
-
-
-
Explanation
-
-
Multiple Choice also allows students to select from a variety of pre-written responses, although the format makes it easier for students to read very long response options. Optionresponse also differs slightly because students are more likely to think of an answer and then search for it rather than relying purely on recognition to answer the question.
-
-
-
+ expect(data).toXMLEqual("""
+
+
+
OptionResponse gives a limited set of options for students to respond with, and presents those options in a format that encourages them to search for a specific answer rather than being immediately presented with options from which to recognize the correct answer.
+
The answer options and the identification of the correct answer is defined in the optioninput tag.
+
Translation between Option Response and __________ is extremely straightforward:
+
+
+
+
Explanation
+
Multiple Choice also allows students to select from a variety of pre-written responses, although the format makes it easier for students to read very long response options. Optionresponse also differs slightly because students are more likely to think of an answer and then search for it rather than relying purely on recognition to answer the question.
+
+
+ """)
it 'converts StringResponse to xml', ->
data = MarkdownEditingDescriptor.markdownToXml("""A string response problem accepts a line of text input from the student, and evaluates the input for correctness based on an expected answer within each input box.
@@ -399,24 +385,20 @@ describe 'MarkdownEditingDescriptor', ->
Lansing is the capital of Michigan, although it is not Michigan's largest city, or even the seat of the county in which it resides.
[Explanation]
""")
- expect(data).toEqual("""
-
A string response problem accepts a line of text input from the student, and evaluates the input for correctness based on an expected answer within each input box.
-
-
The answer is correct if it matches every character of the expected answer. This can be a problem with international spelling, dates, or anything where the format of the answer is not clear.
-
-
Which US state has Lansing as its capital?
-
-
-
-
-
-
-
Explanation
-
-
Lansing is the capital of Michigan, although it is not Michigan's largest city, or even the seat of the county in which it resides.
-
-
-
+ expect(data).toXMLEqual("""
+
+
+
A string response problem accepts a line of text input from the student, and evaluates the input for correctness based on an expected answer within each input box.
+
The answer is correct if it matches every character of the expected answer. This can be a problem with international spelling, dates, or anything where the format of the answer is not clear.
+
Which US state has Lansing as its capital?
+
+
+
+
Explanation
+
Lansing is the capital of Michigan, although it is not Michigan's largest city, or even the seat of the county in which it resides.
+
+
+ """)
it 'converts StringResponse with regular expression to xml', ->
data = MarkdownEditingDescriptor.markdownToXml("""Who led the civil rights movement in the United States of America?
@@ -426,20 +408,18 @@ describe 'MarkdownEditingDescriptor', ->
Test Explanation.
[Explanation]
""")
- expect(data).toEqual("""
-
Who led the civil rights movement in the United States of America?
-
-
-
-
-
-
-
Explanation
-
-
Test Explanation.
-
-
-
+ expect(data).toXMLEqual("""
+
+
+
Who led the civil rights movement in the United States of America?
+
+
+
+
Explanation
+
Test Explanation.
+
+
+ """)
it 'converts StringResponse with multiple answers to xml', ->
data = MarkdownEditingDescriptor.markdownToXml("""Who led the civil rights movement in the United States of America?
@@ -452,23 +432,21 @@ describe 'MarkdownEditingDescriptor', ->
Test Explanation.
[Explanation]
""")
- expect(data).toEqual("""
-
Who led the civil rights movement in the United States of America?
-
-
-
-
-
-
-
-
-
-
Explanation
-
-
Test Explanation.
-
-
-
+ expect(data).toXMLEqual("""
+
+
+
Who led the civil rights movement in the United States of America?
+
+
+
+
+
+
+
Explanation
+
Test Explanation.
+
+
+ """)
it 'converts StringResponse with multiple answers and regular expressions to xml', ->
data = MarkdownEditingDescriptor.markdownToXml("""Write a number from 1 to 4.
@@ -481,23 +459,21 @@ describe 'MarkdownEditingDescriptor', ->
Test Explanation.
[Explanation]
""")
- expect(data).toEqual("""
-
Write a number from 1 to 4.
-
-
-
-
-
-
-
-
-
-
Explanation
-
-
Test Explanation.
-
-
-
+ expect(data).toXMLEqual("""
+
+
+
Write a number from 1 to 4.
+
+
+
+
+
+
+
Explanation
+
Test Explanation.
+
+
+ """)
# test labels
it 'converts markdown labels to label attributes', ->
@@ -508,21 +484,19 @@ describe 'MarkdownEditingDescriptor', ->
Test Explanation.
[Explanation]
""")
- expect(data).toEqual("""
-
Who led the civil rights movement in the United States of America?
+
+
+ """)
it 'handles multiple questions with labels', ->
data = MarkdownEditingDescriptor.markdownToXml("""
France is a country in Europe.
@@ -538,7 +512,7 @@ describe 'MarkdownEditingDescriptor', ->
(x) Berlin
( ) Donut
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
@@ -597,66 +571,22 @@ describe 'MarkdownEditingDescriptor', ->
""")
- it 'tests malformed labels', ->
- data = MarkdownEditingDescriptor.markdownToXml("""
- France is a country in Europe.
- >>What is the capital of France?<
- = Paris
-
- blah>>What is the capital of <
-
France is a country in Europe.
-
-
>>What is the capital of France?<
-
-
-
-
-
blahWhat is the capital of Germany?
-
-
- Bonn
- Hamburg
- Berlin
- Donut
-
-
-
-
- """)
it 'adds labels to formulae', ->
data = MarkdownEditingDescriptor.markdownToXml("""
>>Enter the numerical value of Pi:<<
= 3.14159 +- .02
""")
- expect(data).toEqual("""
-
Enter the numerical value of Pi:
-
-
-
-
+ expect(data).toXMLEqual("""
+
+
+
+
+
- """)
- it 'escapes entities in labels', ->
- data = MarkdownEditingDescriptor.markdownToXml("""
- >>What is the "capital" of France & the 'best' > place < to live"?<<
- = Paris
- """)
- expect(data).toEqual("""
-
What is the "capital" of France & the 'best' > place < to live"?
-
-
-
+ """)
-
- """)
# test oddities
it 'converts headers and oddities to xml', ->
data = MarkdownEditingDescriptor.markdownToXml("""Not a header
@@ -780,4 +710,178 @@ describe 'MarkdownEditingDescriptor', ->
Code should be nicely monospaced.
""")
- # failure tests
+
+ it 'can separate responsetypes based on ---', ->
+ data = MarkdownEditingDescriptor.markdownToXml("""
+ Multiple choice problems allow learners to select only one option. Learners can see all the options along with the problem text.
+
+ >>Which of the following countries has the largest population?<<
+ ( ) Brazil {{ timely feedback -- explain why an almost correct answer is wrong }}
+ ( ) Germany
+ (x) Indonesia
+ ( ) Russia
+
+ [explanation]
+ According to September 2014 estimates:
+ The population of Indonesia is approximately 250 million.
+ The population of Brazil is approximately 200 million.
+ The population of Russia is approximately 146 million.
+ The population of Germany is approximately 81 million.
+ [explanation]
+
+ ---
+
+ Checkbox problems allow learners to select multiple options. Learners can see all the options along with the problem text.
+
+ >>The following languages are in the Indo-European family:<<
+ [x] Urdu
+ [ ] Finnish
+ [x] Marathi
+ [x] French
+ [ ] Hungarian
+
+ Note: Make sure you select all of the correct options—there may be more than one!
+
+ [explanation]
+ Urdu, Marathi, and French are all Indo-European languages, while Finnish and Hungarian are in the Uralic family.
+ [explanation]
+
+ """)
+ expect(data).toXMLEqual("""
+
+
+
Multiple choice problems allow learners to select only one option. Learners can see all the options along with the problem text.
+
+
+ Brazil timely feedback -- explain why an almost correct answer is wrong
+
+ Germany
+ Indonesia
+ Russia
+
+
+
+
Explanation
+
According to September 2014 estimates:
+
The population of Indonesia is approximately 250 million.
+
The population of Brazil is approximately 200 million.
+
The population of Russia is approximately 146 million.
+
The population of Germany is approximately 81 million.
+
+
+
+
+
+
Checkbox problems allow learners to select multiple options. Learners can see all the options along with the problem text.
+
+
+ Urdu
+ Finnish
+ Marathi
+ French
+ Hungarian
+
+
Note: Make sure you select all of the correct options—there may be more than one!
+
+
+
Explanation
+
Urdu, Marathi, and French are all Indo-European languages, while Finnish and Hungarian are in the Uralic family.
+
+
+
+
+ """)
+
+ it 'can separate other things based on ---', ->
+ data = MarkdownEditingDescriptor.markdownToXml("""
+ Multiple choice problems allow learners to select only one option. Learners can see all the options along with the problem text.
+
+ ---
+
+ >>Which of the following countries has the largest population?<<
+ ( ) Brazil {{ timely feedback -- explain why an almost correct answer is wrong }}
+ ( ) Germany
+ (x) Indonesia
+ ( ) Russia
+
+ [explanation]
+ According to September 2014 estimates:
+ The population of Indonesia is approximately 250 million.
+ The population of Brazil is approximately 200 million.
+ The population of Russia is approximately 146 million.
+ The population of Germany is approximately 81 million.
+ [explanation]
+ """)
+ expect(data).toXMLEqual("""
+
+
Multiple choice problems allow learners to select only one option. Learners can see all the options along with the problem text.
+
+
+
+
+ Brazil timely feedback -- explain why an almost correct answer is wrong
+
+ Germany
+ Indonesia
+ Russia
+
+
+
+
Explanation
+
According to September 2014 estimates:
+
The population of Indonesia is approximately 250 million.
+
The population of Brazil is approximately 200 million.
+
The population of Russia is approximately 146 million.
+
The population of Germany is approximately 81 million.
+
+
+
+
+ """)
+
+ it 'can do separation if spaces are present around ---', ->
+ data = MarkdownEditingDescriptor.markdownToXml("""
+ >>The following languages are in the Indo-European family:<<
+ [x] Urdu
+ [ ] Finnish
+ [x] Marathi
+ [x] French
+ [ ] Hungarian
+
+ ---
+
+ >>Which of the following countries has the largest population?<<
+ ( ) Brazil {{ timely feedback -- explain why an almost correct answer is wrong }}
+ ( ) Germany
+ (x) Indonesia
+ ( ) Russia
+ """)
+ expect(data).toXMLEqual("""
+
+
+
+
+ Urdu
+ Finnish
+ Marathi
+ French
+ Hungarian
+
+
+
+
+
+
+
+
+ Brazil timely feedback -- explain why an almost correct answer is wrong
+
+ Germany
+ Indonesia
+ Russia
+
+
+
+
+
+ """)
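
The `toXMLEqual` matcher that replaces `toEqual` throughout these specs presumably compares XML while ignoring formatting-only whitespace differences, so the expected strings can be pretty-printed. A minimal sketch of that idea (hypothetical helper names; not the actual matcher implementation from the test suite):

```javascript
// Hypothetical sketch: compare two XML strings while ignoring
// indentation and blank lines, in the spirit of a toXMLEqual matcher.
function normalizeXMLString(str) {
    return str
        .split('\n')
        .map(function (line) { return line.trim(); })          // drop indentation
        .filter(function (line) { return line.length > 0; })   // drop blank lines
        .join('\n');
}

function xmlEqual(actual, expected) {
    return normalizeXMLString(actual) === normalizeXMLString(expected);
}
```

With a matcher like this, the specs no longer depend on the exact whitespace the serializer emits between elements.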
diff --git a/common/lib/xmodule/xmodule/js/spec/problem/edit_spec_hint.coffee b/common/lib/xmodule/xmodule/js/spec/problem/edit_spec_hint.coffee
index 94e5fb671c..296591e445 100644
--- a/common/lib/xmodule/xmodule/js/spec/problem/edit_spec_hint.coffee
+++ b/common/lib/xmodule/xmodule/js/spec/problem/edit_spec_hint.coffee
@@ -14,11 +14,11 @@ describe 'Markdown to xml extended hint dropdown', ->
Clowns have funny _________ to make people laugh.
-
+
[[
dogs {{ NOPE::Not dogs, not cats, not toads }}
(FACES) {{ With lots of makeup, doncha know?}}
-
+
money {{ Clowns don't have any money, of course }}
donkeys {{don't be an ass.}}
-no hint-
@@ -46,7 +46,7 @@ describe 'Markdown to xml extended hint dropdown', ->
-
+
""")
@@ -64,14 +64,17 @@ describe 'Markdown to xml extended hint dropdown', ->
|| 1) one ||
|| 2) two ||
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
Translation between Dropdown and ________ is straightforward.
-
-
-
-
+
Translation between Dropdown and ________ is straightforward.
+
+
+
+
@@ -91,11 +94,11 @@ describe 'Markdown to xml extended hint dropdown', ->
|| 0) zero ||
|| 1) one ||
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
A Question ________ is answered.
-
+
A Question ________ is answered.
+
@@ -112,19 +115,20 @@ describe 'Markdown to xml extended hint dropdown', ->
bb
cc {{ hint2 }} ]]
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
-
-
-
-
+
+
+
+
-
-
+
+
""")
@@ -161,20 +166,20 @@ describe 'Markdown to xml extended hint checkbox', ->
it 'produces xml', ->
data = MarkdownEditingDescriptor.markdownToXml("""
>>Select all the fruits from the list<<
-
+
[x] Apple {{ selected: You're right that apple is a fruit. }, {unselected: Remember that apple is also a fruit.}}
[ ] Mushroom {{U: You're right that mushrooms aren't fruit}, { selected: Mushroom is a fungus, not a fruit.}}
[x] Grape {{ selected: You're right that grape is a fruit }, {unselected: Remember that grape is also a fruit.}}
[ ] Mustang
[ ] Camero {{S:I don't know what a Camero is but it isn't a fruit.},{U:What is a camero anyway?}}
-
+
{{ ((A*B)) You're right that apple is a fruit, but there's one you're missing. Also, mushroom is not a fruit.}}
{{ ((B*C)) You're right that grape is a fruit, but there's one you're missing. Also, mushroom is not a fruit. }}
>>Select all the vegetables from the list<<
-
+
[ ] Banana {{ selected: No, sorry, a banana is a fruit. }, {unselected: poor banana.}}
[ ] Ice Cream
[ ] Mushroom {{U: You're right that mushrooms aren't vegetables.}, { selected: Mushroom is a fungus, not a vegetable.}}
@@ -184,7 +189,7 @@ describe 'Markdown to xml extended hint checkbox', ->
{{ ((A*B)) Making a banana split? }}
{{ ((B*D)) That will make a horrible dessert: a brussel sprout split? }}
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
Select all the fruits from the list
@@ -262,7 +267,7 @@ describe 'Markdown to xml extended hint checkbox', ->
|| Hint two. ||
|| Hint three. ||
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
Select all the fruits from the list
@@ -317,7 +322,7 @@ describe 'Markdown to xml extended hint multiple choice', ->
it 'produces xml', ->
data = MarkdownEditingDescriptor.markdownToXml("""
>>Select the fruit from the list<<
-
+
() Mushroom {{ Mushroom is a fungus, not a fruit.}}
() Potato
(x) Apple {{ OUTSTANDING::Apple is indeed a fruit.}}
@@ -328,7 +333,7 @@ describe 'Markdown to xml extended hint multiple choice', ->
(x) Potato {{ Potato is a root vegetable. }}
() Apple {{ OOPS::Apple is a fruit.}}
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
Select the fruit from the list
@@ -338,7 +343,7 @@ describe 'Markdown to xml extended hint multiple choice', ->
Apple Apple is indeed a fruit.
-
+
Select the vegetables from the list
@@ -347,8 +352,8 @@ describe 'Markdown to xml extended hint multiple choice', ->
Apple Apple is a fruit.
-
-
+
+
""")
@@ -359,24 +364,24 @@ describe 'Markdown to xml extended hint multiple choice', ->
() Mushroom {{ Mushroom is a fungus, not a fruit.}}
() Potato
(x) Apple {{ OUTSTANDING::Apple is indeed a fruit.}}
-
-
+
+
|| 0) spaces on previous line. ||
|| 1) roses are red. ||
>>Select the vegetables from the list<<
() Mushroom {{ Mushroom is a fungus, not a vegetable.}}
-
+
(x) Potato {{ Potato is a root vegetable. }}
() Apple {{ OOPS::Apple is a fruit.}}
-
-
+
+
|| 2) where are the lions? ||
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
Select the fruit from the list
@@ -412,14 +417,15 @@ describe 'Markdown to xml extended hint text input', ->
= France {{ BRAVO::Viva la France! }}
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
In which country would you find the city of Paris?
-
- Viva la France!
+
+
+ Viva la France!
+
""")
@@ -429,15 +435,17 @@ describe 'Markdown to xml extended hint text input', ->
or= USA {{ meh::hint2 }}
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
Where Paris?
-
- hint1
- hint2
+
+
+ hint1
+ hint2
+
-
+
+
""")
@@ -447,15 +455,16 @@ describe 'Markdown to xml extended hint text input', ->
not= warm {{feedback2}}
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
Revenge is a dish best served
-
- khaaaaaan!
+
+
+ khaaaaaan!feedback2
+
""")
@@ -464,14 +473,15 @@ describe 'Markdown to xml extended hint text input', ->
s= 2 {{feedback1}}
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
q
-
- feedback1
+
+
+ feedback1
-
+
+
""")
@@ -483,16 +493,18 @@ describe 'Markdown to xml extended hint text input', ->
or= ccc
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
q
-
- feedback1
+
+
+ feedback1
+ feedback2
-
+
+
""")
@@ -503,16 +515,18 @@ describe 'Markdown to xml extended hint text input', ->
or= ccc
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
q
-
- feedback1
- feedback2
-
+
+
+ feedback1
+ feedback2
+
+
+
""")
@@ -523,7 +537,7 @@ describe 'Markdown to xml extended hint text input', ->
s= ccc
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
q
@@ -549,7 +563,7 @@ describe 'Markdown to xml extended hint text input', ->
paragraph 2
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
paragraph
q
@@ -574,7 +588,7 @@ describe 'Markdown to xml extended hint text input', ->
paragraph 2
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
paragraph
q
@@ -591,7 +605,7 @@ describe 'Markdown to xml extended hint text input', ->
= ccc {{feedback2}}
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
q
@@ -614,11 +628,11 @@ describe 'Markdown to xml extended hint text input', ->
|| Paris is the capital of one of those countries. ||
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
Where Paris?
-
- hint1
+
+
+ hint1
@@ -640,10 +654,10 @@ describe 'Markdown to xml extended hint numeric input', ->
>>Enter the number of fingers on a human hand<<
= 5
-
+
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
Enter the numerical value of Pi:
@@ -681,7 +695,7 @@ describe 'Markdown to xml extended hint numeric input', ->
|| hintB ||
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
text1
@@ -707,10 +721,10 @@ describe 'Markdown to xml extended hint with multiline hints', ->
data = MarkdownEditingDescriptor.markdownToXml("""
>>Checkboxes<<
- [x] A {{
+ [x] A {{
selected: aaa },
{unselected:bbb}}
- [ ] B {{U: c}, {
+ [ ] B {{U: c}, {
selected: d.}}
{{ ((A*B)) A*B hint}}
@@ -725,7 +739,7 @@ describe 'Markdown to xml extended hint with multiline hints', ->
hello
hint
}}
-
+
>>multiple choice<<
(x) AA{{hint1}}
() BB {{
@@ -733,11 +747,11 @@ describe 'Markdown to xml extended hint with multiline hints', ->
}}
( ) CC {{ hint3
}}
-
+
>>dropdown<<
[[
- W1 {{
- no }}
+ W1 {{
+ no }}
W2 {{
nope}}
(C1) {{ yes
@@ -749,7 +763,7 @@ describe 'Markdown to xml extended hint with multiline hints', ->
|| ccc ||
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
Checkboxes
@@ -816,23 +830,23 @@ describe 'Markdown to xml extended hint with tricky syntax cases', ->
|| Ø ||
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
á and Ø
-
- Ø Ø
+
+
+ Ø Ø
+ BB
-
-
+
Ø
""")
-
+
it 'produces xml with quote-type characters', ->
data = MarkdownEditingDescriptor.markdownToXml("""
>>"quotes" aren't `fun`<<
@@ -840,13 +854,15 @@ describe 'Markdown to xml extended hint with tricky syntax cases', ->
(x) "isn't" {{ "hello" }}
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
"quotes" aren't `fun`
-
- "hello" isn't
- "isn't" "hello"
+
+
+ "hello" isn't
+
+ "isn't" "hello"
+
@@ -862,18 +878,20 @@ describe 'Markdown to xml extended hint with tricky syntax cases', ->
(x) b
that (y)
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
q1
-
this (x)
-
- a (hint)
+
+
this (x)
+
+ a (hint)
+ b
+
that (y)
-
that (y)
+
""")
@@ -886,18 +904,19 @@ describe 'Markdown to xml extended hint with tricky syntax cases', ->
[x] b {{ this hint passes through }}
that []
""")
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
q1
-
this [x]
-
+
+
this [x]
+ a [square]b {{ this hint passes through }}
+
that []
-
that []
+
""")
@@ -907,7 +926,7 @@ describe 'Markdown to xml extended hint with tricky syntax cases', ->
markdown = """
>>q22<<
- [[
+ [[
(x) {{ hintx
these
span
@@ -919,18 +938,20 @@ describe 'Markdown to xml extended hint with tricky syntax cases', ->
"""
markdown = markdown.replace(/\n/g, '\r\n') # make DOS line endings
data = MarkdownEditingDescriptor.markdownToXml(markdown)
- expect(data).toEqual("""
+ expect(data).toXMLEqual("""
-
q22
-
-
-
-
+
+
+
+
+
-
-
+
+
""")
-
diff --git a/common/lib/xmodule/xmodule/js/src/problem/edit.coffee b/common/lib/xmodule/xmodule/js/src/problem/edit.coffee
index deec7d7282..a7aeeb1706 100644
--- a/common/lib/xmodule/xmodule/js/src/problem/edit.coffee
+++ b/common/lib/xmodule/xmodule/js/src/problem/edit.coffee
@@ -192,11 +192,15 @@ class @MarkdownEditingDescriptor extends XModule.Descriptor
else
return template
-
@markdownToXml: (markdown)->
+ # it will contain <demandhint>...</demandhint> tags
+ demandHintTags = [];
toXml = `function (markdown) {
var xml = markdown,
i, splits, scriptFlag;
+ var responseTypes = [
+ 'optionresponse', 'multiplechoiceresponse', 'stringresponse', 'numericalresponse', 'choiceresponse'
+ ];
// fix DOS \r\n line endings to look like \n
xml = xml.replace(/\r\n/g, '\n');
@@ -212,6 +216,7 @@ class @MarkdownEditingDescriptor extends XModule.Descriptor
for (i = 0; i < options.length; i += 1) {
var inner = /\s*\|\|(.*?)\|\|/.exec(options[i]);
if (inner) {
+ //safe-lint: disable=javascript-concat-html
demandhints += '  <hint>' + inner[1].trim() + '</hint>\n';
}
}
@@ -524,7 +529,8 @@ class @MarkdownEditingDescriptor extends XModule.Descriptor
.replace(/>/g, '>')
.replace(/"/g, '"')
.replace(/'/g, ''');
- line = line.replace(/>>|<</g, ''); // strip the >>..<< tag
+ line = line.replace(/>>(.*?)<</g, "<label>$1</label>");
} else if (line.match(/<\w+response/) && didinput && curlabel == prevlabel) {
// reset label to prevent gobbling up previous one (if multiple questions)
curlabel = '';
@@ -570,12 +576,71 @@ class @MarkdownEditingDescriptor extends XModule.Descriptor
// if we've come across demand hints, wrap in <demandhint> at the end
if (demandhints) {
- demandhints = '\n<demandhint>\n' + demandhints + '</demandhint>';
+ demandHintTags.push(demandhints);
}
- // make all elements descendants of a single problem element
- xml = '<problem>\n' + xml + demandhints + '\n</problem>';
+ // make selector to search responsetypes in xml
+ var responseTypesSelector = responseTypes.join(', ');
+ // make temporary xml
+ // safe-lint: disable=javascript-concat-html
+ var $xml = $($.parseXML('<prob>' + xml + '</prob>'));
+ responseType = $xml.find(responseTypesSelector);
+
+ // convert if there is only one responsetype
+ if (responseType.length === 1) {
+ var inputtype = responseType[0].firstElementChild
+ // used to decide whether an element should be placed before or after an inputtype
+ var beforeInputtype = true;
+
+ _.each($xml.find('prob').children(), function(child, index){
+ // we don't want to add the responsetype again into new xml
+ if (responseType[0].nodeName === child.nodeName) {
+ beforeInputtype = false;
+ return;
+ }
+
+ // replace <p> tag for question title with <label>
+
+
+
+
Example Hidden Explanation
+
You can provide additional information that only appears at certain times by including a "showhide" flag.
This is a hidden explanation. It can contain equations, such as [mathjaxinline]\alpha = \frac{2}{\sqrt {1+\gamma }}[/mathjaxinline].
+
This is additional text after the hidden explanation.
+
+
+
+
+
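
The separator behavior exercised by the "can separate ... based on ---" specs boils down to splitting the problem markdown on standalone `---` lines (with optional surrounding spaces) before converting each segment. A minimal sketch of that split step (a hypothetical simplification; the actual `edit.coffee` logic shown above instead parses the generated XML and regroups responsetypes):

```javascript
// Hypothetical simplification: split problem markdown into segments on
// standalone "---" separator lines, tolerating spaces around the dashes.
function splitOnSeparator(markdown) {
    return markdown
        .split(/\n\s*---\s*\n/)
        .map(function (segment) { return segment.trim(); })
        .filter(function (segment) { return segment.length > 0; });
}
```

Each resulting segment can then be converted independently and wrapped in a single `<problem>` element.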
diff --git a/common/lib/xmodule/xmodule/templates/problem/multiplechoice.yaml b/common/lib/xmodule/xmodule/templates/problem/multiplechoice.yaml
index bc9aa26e35..8b3cf4269d 100644
--- a/common/lib/xmodule/xmodule/templates/problem/multiplechoice.yaml
+++ b/common/lib/xmodule/xmodule/templates/problem/multiplechoice.yaml
@@ -22,32 +22,30 @@ metadata:
The population of Germany is approximately 81 million.
[explanation]
-data: |
-
-
Multiple choice problems allow learners to select only one option.
- Learners can see all the options along with the problem text.
-
When you add the problem, be sure to select Settings
- to specify a Display Name and other values that apply.
-
You can use the following example problem as a model.
-
Which of the following countries has the largest population?
-
-
- Brazil
- timely feedback -- explain why an almost correct answer is wrong
-
- Germany
- Indonesia
- Russia
-
-
-
-
-
Explanation
-
According to September 2014 estimates:
-
The population of Indonesia is approximately 250 million.
-
The population of Brazil is approximately 200 million.
-
The population of Russia is approximately 146 million.
-
The population of Germany is approximately 81 million.
-
-
-
\ No newline at end of file
+data: |
+
+
+
Multiple choice problems allow learners to select only one option. Learners can see all the options along with the problem text.
+
When you add the problem, be sure to select Settings to specify a Display Name and other values that apply.
+
You can use the following example problem as a model.
+ Which of the following countries has the largest population?
+
+ Brazil
+ timely feedback -- explain why an almost correct answer is wrong
+
+ Germany
+ Indonesia
+ Russia
+
+
+
+
Explanation
+
According to September 2014 estimates:
+
The population of Indonesia is approximately 250 million.
+
The population of Brazil is approximately 200 million.
+
The population of Russia is approximately 146 million.
+
The population of Germany is approximately 81 million.
+
+
+
+
diff --git a/common/lib/xmodule/xmodule/templates/problem/multiplechoice_hint.yaml b/common/lib/xmodule/xmodule/templates/problem/multiplechoice_hint.yaml
index 32c0fbb1c7..a05c0552d2 100644
--- a/common/lib/xmodule/xmodule/templates/problem/multiplechoice_hint.yaml
+++ b/common/lib/xmodule/xmodule/templates/problem/multiplechoice_hint.yaml
@@ -21,26 +21,32 @@ metadata:
||A fruit contains seeds of the plant.||
hinted: true
-data: |
-
-
-
You can provide feedback for each option in a multiple choice problem.
+data: |
+
+
+
You can provide feedback for each option in a multiple choice problem.
+
You can also add hints for learners.
+
Be sure to select Settings to specify a Display Name and other values that apply.
+
Use the following example problem as a model.
+ Which of the following is a vegetable?
+
+ apple
+ An apple is the fertilized ovary that comes from an apple tree and contains seeds, meaning it is a fruit.
+
+ pumpkin
+ A pumpkin is the fertilized ovary of a squash plant and contains seeds, meaning it is a fruit.
+
+ potato
+ A potato is an edible part of a plant in tuber form and is a vegetable.
+
+ tomato
+ Many people mistakenly think a tomato is a vegetable. However, because a tomato is the fertilized ovary of a tomato plant and contains seeds, it is a fruit.
+
+
+
-
You can also add hints for learners.
-
-
Use the following example problem as a model.
-
-
Which of the following is a vegetable?
-
-
- apple An apple is the fertilized ovary that comes from an apple tree and contains seeds, meaning it is a fruit.
- pumpkin A pumpkin is the fertilized ovary of a squash plant and contains seeds, meaning it is a fruit.
- potato A potato is an edible part of a plant in tuber form and is a vegetable.
- tomato Many people mistakenly think a tomato is a vegetable. However, because a tomato is the fertilized ovary of a tomato plant and contains seeds, it is a fruit.
-
-
-
- A fruit is the fertilized ovary from a flower.
- A fruit contains seeds of the plant.
-
-
+
+ A fruit is the fertilized ovary from a flower.
+ A fruit contains seeds of the plant.
+
+
diff --git a/common/lib/xmodule/xmodule/templates/problem/numericalresponse.yaml b/common/lib/xmodule/xmodule/templates/problem/numericalresponse.yaml
index 324de1fe2e..72bd66e7e2 100644
--- a/common/lib/xmodule/xmodule/templates/problem/numericalresponse.yaml
+++ b/common/lib/xmodule/xmodule/templates/problem/numericalresponse.yaml
@@ -15,44 +15,46 @@ metadata:
= 9.3*10^7
or= 9.296*10^7
+ [explanation]
+ The sun is 93,000,000, or 9.3*10^7, miles away from Earth.
+ [explanation]
+
+ ---
+
>>The square of what number is -100?<<
= 10*i
[explanation]
- The sun is 93,000,000, or 9.3*10^7, miles away from Earth.
-100 is the square of 10 times the imaginary number, i.
[explanation]
-data: |
-
+data: |
+
+
+
In a numerical input problem, learners enter numbers or a specific and relatively simple mathematical expression. Learners enter the response in plain text, and the system then converts the text to a symbolic expression that learners can see below the response field.
+
The system can handle several types of characters, including basic operators, fractions, exponents, and common constants such as i. You can refer learners to
+ Entering Mathematical and Scientific Expressions in the EdX Learner's Guide
+ for information about how to enter text into the field.
+
When you add the problem, be sure to select Settings to specify a Display Name and other values that apply.
+
You can use the following example problems as models.
+ How many miles away from Earth is the sun? Use scientific notation to answer.
+
+
+
+
Explanation
+
The sun is 93,000,000, or 9.3*10^7, miles away from Earth.
+
+
+
-
In a numerical input problem, learners enter numbers or a specific and
- relatively simple mathematical expression. Learners enter the response in
- plain text, and the system then converts the text to a symbolic expression
- that learners can see below the response field.
-
-
The system can handle several types of characters, including basic
- operators, fractions, exponents, and common constants such as i. You can
- refer learners to
- Entering Mathematical and Scientific Expressions in the EdX Learner's Guide for information about how to enter text into the field.
-
When you add the problem, be sure to select Settings
- to specify a Display Name and other values that apply.
-
-
You can use the following example problems as models.
-
How many miles away from Earth is the sun? Use scientific notation to answer.
-
-
-
-
-
The square of what number is -100?
-
-
-
-
-
-
Explanation
-
The sun is 93,000,000, or 9.3*10^7, miles away from Earth.
-
-100 is the square of 10 times the imaginary number, i.
-
-
-
+
+ The square of what number is -100?
+
+
+
+
Explanation
+
-100 is the square of 10 times the imaginary number, i.
+
+
+
+
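In the numerical input markdown above, `=` introduces the primary accepted answer and `or=` an additional one, and values may use caret exponent notation such as `9.3*10^7`. A minimal sketch of normalizing that notation, assuming this simplified `a*10^b` grammar rather than capa's real expression parser:

```python
def to_float(expr):
    """Convert a simple 'a*10^b' style answer string to a float (illustrative only)."""
    if '*10^' in expr:
        mantissa, exponent = expr.split('*10^')
        return float(mantissa) * 10 ** int(exponent)
    return float(expr)

print(to_float('9.3*10^7'))
print(to_float('146'))
```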
diff --git a/common/lib/xmodule/xmodule/templates/problem/numericalresponse_hint.yaml b/common/lib/xmodule/xmodule/templates/problem/numericalresponse_hint.yaml
index b87732922c..fff5cd281f 100644
--- a/common/lib/xmodule/xmodule/templates/problem/numericalresponse_hint.yaml
+++ b/common/lib/xmodule/xmodule/templates/problem/numericalresponse_hint.yaml
@@ -2,7 +2,7 @@
metadata:
display_name: Numerical Input with Hints and Feedback
markdown: |
-
+
You can provide feedback for correct answers in numerical input problems. You cannot provide feedback for incorrect answers.
Use feedback for the correct answer to reinforce the process for arriving at the numerical value.
@@ -25,30 +25,27 @@ metadata:
[explanation]
hinted: true
-data: |
-
-
-
You can provide feedback for correct answers in numerical input problems. You cannot provide feedback for incorrect answers.
+data: |
+
+
+
You can provide feedback for correct answers in numerical input problems. You cannot provide feedback for incorrect answers.
+
Use feedback for the correct answer to reinforce the process for arriving at the numerical value.
+
You can also add hints for learners.
+
Be sure to select Settings to specify a Display Name and other values that apply.
+
Use the following example problem as a model.
+ What is the arithmetic mean for the following set of numbers? (1, 5, 6, 3, 5)
+
+ The mean for this set of numbers is 20 / 5, which equals 4.
+
+
+
Explanation
+
The mean is calculated by summing the set of numbers and dividing by n. In this case: (1 + 5 + 6 + 3 + 5) / 5 = 20 / 5 = 4.
+
+
+
-
Use feedback for the correct answer to reinforce the process for arriving at the numerical value.
-
-
Use the following example problem as a model.
-
-
What is the arithmetic mean for the following set of numbers? (1, 5, 6, 3, 5)
-
-
- The mean for this set of numbers is 20 / 5, which equals 4.
-
-
-
-
-
Explanation
-
The mean is calculated by summing the set of numbers and dividing by n. In this case: (1 + 5 + 6 + 3 + 5) / 5 = 20 / 5 = 4.
-
-
-
-
- The mean is calculated by summing the set of numbers and dividing by n.
- n is the count of items in the set.
-
-
\ No newline at end of file
+
+ The mean is calculated by summing the set of numbers and dividing by n.
+ n is the count of items in the set.
+
+
diff --git a/common/lib/xmodule/xmodule/templates/problem/optionresponse.yaml b/common/lib/xmodule/xmodule/templates/problem/optionresponse.yaml
index 50ec67b528..60d40cd132 100644
--- a/common/lib/xmodule/xmodule/templates/problem/optionresponse.yaml
+++ b/common/lib/xmodule/xmodule/templates/problem/optionresponse.yaml
@@ -8,29 +8,26 @@ metadata:
You can use the following example problem as a model.
- >>Which of the following countries celebrates its independence on August 15?<<
+ >>Which of the following countries celebrates its independence on August 15?<<
- [[(India), Spain, China, Bermuda]]
-
- [explanation]
- India became an independent nation on August 15, 1947.
- [explanation]
-data: |
-
-
Dropdown problems allow learners to select only one option from a list of options.
-
When you add the problem, be sure to select Settings
- to specify a Display Name and other values that apply.
-
You can use the following example problem as a model.
-
Which of the following countries celebrates its independence on August 15?
-
-
-
-
-
-
-
Explanation
-
India became an independent nation on August 15, 1947.
-
-
-
+ [[(India), Spain, China, Bermuda]]
+ [explanation]
+ India became an independent nation on August 15, 1947.
+ [explanation]
+data: |
+
+
+
Dropdown problems allow learners to select only one option from a list of options.
+
When you add the problem, be sure to select Settings to specify a Display Name and other values that apply.
+
You can use the following example problem as a model.
+ Which of the following countries celebrates its independence on August 15?
+
+
+
+
Explanation
+
India became an independent nation on August 15, 1947.
+
+
+
+
diff --git a/common/lib/xmodule/xmodule/templates/problem/optionresponse_hint.yaml b/common/lib/xmodule/xmodule/templates/problem/optionresponse_hint.yaml
index 139745f69f..80169409d8 100644
--- a/common/lib/xmodule/xmodule/templates/problem/optionresponse_hint.yaml
+++ b/common/lib/xmodule/xmodule/templates/problem/optionresponse_hint.yaml
@@ -24,28 +24,32 @@ metadata:
||A fruit contains seeds of the plant.||
hinted: true
-data: |
-
+data: |
+
+
+
You can provide feedback for each available option in a dropdown problem.
+
You can also add hints for learners.
+
Be sure to select Settings to specify a Display Name and other values that apply.
+
Use the following example problem as a model.
+ A/an ________ is a vegetable.
+
+
+
+
+
+
+
-
You can provide feedback for each available option in a dropdown problem.
-
-
You can also add hints for learners.
-
-
Use the following example problem as a model.
-
-
A/an ________ is a vegetable.
-
-
-
-
-
-
-
-
-
-
-
- A fruit is the fertilized ovary from a flower.
- A fruit contains seeds of the plant.
-
-
+
+ A fruit is the fertilized ovary from a flower.
+ A fruit contains seeds of the plant.
+
+
diff --git a/common/lib/xmodule/xmodule/templates/problem/problem_with_hint.yaml b/common/lib/xmodule/xmodule/templates/problem/problem_with_hint.yaml
index e8981840b9..1f04d45f73 100644
--- a/common/lib/xmodule/xmodule/templates/problem/problem_with_hint.yaml
+++ b/common/lib/xmodule/xmodule/templates/problem/problem_with_hint.yaml
@@ -4,13 +4,12 @@ metadata:
markdown: !!null
data: |
-
-
-
Problem With Adaptive Hint
-
-
- This problem demonstrates a question with hints, based on using the hintfn method.
-
-
- What is the best programming language that exists today? You may enter your answer in upper or lower case, with or without quotes.
-
-
-
-
-
-
+ What is the best programming language that exists today? You may enter your answer in upper or lower case, with or without quotes.
+
+
+
+
diff --git a/common/lib/xmodule/xmodule/templates/problem/problem_with_hint_in_latex.yaml b/common/lib/xmodule/xmodule/templates/problem/problem_with_hint_in_latex.yaml
index b4f5bda678..0c3efad132 100644
--- a/common/lib/xmodule/xmodule/templates/problem/problem_with_hint_in_latex.yaml
+++ b/common/lib/xmodule/xmodule/templates/problem/problem_with_hint_in_latex.yaml
@@ -49,12 +49,11 @@ metadata:
markdown: !!null
data: |
-
-
-
Problem With Adaptive Hint
-
-
- This problem demonstrates a question with hints, based on using the hintfn method.
+
+
Problem With Adaptive Hint
+
This problem demonstrates a question with hints, based on using the hintfn method.
+
+
-
- What is the best programming language that exists today? You may enter your answer in upper or lower case, with or without quotes.
-
-
-
-
-
-
+ What is the best programming language that exists today? You may enter your answer in upper or lower case, with or without quotes.
+
+
+
+
diff --git a/common/lib/xmodule/xmodule/templates/problem/string_response.yaml b/common/lib/xmodule/xmodule/templates/problem/string_response.yaml
index 223cf22d5b..0c7667a405 100644
--- a/common/lib/xmodule/xmodule/templates/problem/string_response.yaml
+++ b/common/lib/xmodule/xmodule/templates/problem/string_response.yaml
@@ -3,7 +3,7 @@ metadata:
display_name: Text Input
markdown: |
In text input problems, also known as "fill-in-the-blank" problems, learners enter text into a response field. The text can include letters and characters such as punctuation marks. The text that the learner enters must match your specified answer text exactly. You can specify more than one correct answer. Learners must enter a response that matches one of the correct answers exactly.
-
+
When you add the problem, be sure to select Settings to specify a Display Name and other values that apply.
You can use the following example problem as a model.
@@ -18,26 +18,21 @@ metadata:
Nanjing Higher Normal Institute first admitted female students in 1920.
[explanation]
-data: |
-
-
In text input problems, also known as "fill-in-the-blank" problems,
- learners enter text into a response field. The text that the learner enters
- must match your specified answer text exactly. You can specify more than
- one correct answer. Learners must enter a response that matches one of the
- correct answers exactly.
-
When you add the problem, be sure to select Settings
- to specify a Display Name and other values that apply.
-
You can use the following example problem as a model.
-
What was the first post-secondary school in China to allow both male and female students?
-
- National Central University
- Nanjing University
-
-
-
-
-
Explanation
-
Nanjing Higher Normal Institute first admitted female students in 1920.
-
-
-
+data: |
+
+
+
In text input problems, also known as "fill-in-the-blank" problems, learners enter text into a response field. The text that the learner enters must match your specified answer text exactly. You can specify more than one correct answer. Learners must enter a response that matches one of the correct answers exactly.
+
When you add the problem, be sure to select Settings to specify a Display Name and other values that apply.
+
You can use the following example problem as a model.
+ What was the first post-secondary school in China to allow both male and female students?
+ National Central University
+ Nanjing University
+
+
+
+
Explanation
+
Nanjing Higher Normal Institute first admitted female students in 1920.
+
+
+
+
diff --git a/common/lib/xmodule/xmodule/templates/problem/string_response_hint.yaml b/common/lib/xmodule/xmodule/templates/problem/string_response_hint.yaml
index bbc6d5006f..415458ffc2 100644
--- a/common/lib/xmodule/xmodule/templates/problem/string_response_hint.yaml
+++ b/common/lib/xmodule/xmodule/templates/problem/string_response_hint.yaml
@@ -1,7 +1,7 @@
---
metadata:
display_name: Text Input with Hints and Feedback
- markdown: |
+ markdown: |
You can provide feedback for the correct answer in text input problems, as well as for specific incorrect answers.
@@ -24,31 +24,21 @@ metadata:
||Consider all 50 states, not just the continental United States.||
hinted: true
-data: |
-
-
-
You can provide feedback for the correct answer in text input problems, as well as for specific incorrect answers.
-
-
Use feedback on expected incorrect answers to address common misconceptions and to provide guidance on how to arrive at the correct answer.
-
-
Use the following example problem as a model.
-
-
Which U.S. state has the largest land area?
-
-
-
- Alaska is 576,400 square miles, more than double the land area of the second largest state, Texas.
-
- While many people think Texas is the largest state, it is actually the second largest, with 261,797 square miles.
-
- California is the third largest state, with 155,959 square miles.
-
-
-
-
-
- Consider the square miles, not population.
- Consider all 50 states, not just the continental United States.
-
-
-
+data: |
+
+
+
You can provide feedback for the correct answer in text input problems, as well as for specific incorrect answers.
+
Use feedback on expected incorrect answers to address common misconceptions and to provide guidance on how to arrive at the correct answer.
+
Be sure to select Settings to specify a Display Name and other values that apply.
+
Use the following example problem as a model.
+ Which U.S. state has the largest land area?
+ Alaska is 576,400 square miles, more than double the land area of the second largest state, Texas.
+ While many people think Texas is the largest state, it is actually the second largest, with 261,797 square miles.
+ California is the third largest state, with 155,959 square miles.
+
+
+
+ Consider the square miles, not population.
+ Consider all 50 states, not just the continental United States.
+
+
diff --git a/common/static/common/js/spec_helpers/jasmine-extensions.js b/common/static/common/js/spec_helpers/jasmine-extensions.js
index 5d5a8407ce..3afc2ed7d5 100644
--- a/common/static/common/js/spec_helpers/jasmine-extensions.js
+++ b/common/static/common/js/spec_helpers/jasmine-extensions.js
@@ -86,6 +86,15 @@
};
}
};
+ },
+ toXMLEqual: function() {
+ return {
+ compare: function(actual, expected) {
+ return {
+ pass: actual.replace(/\s+/g, '') === expected.replace(/\s+/g, '')
+ };
+ }
+ };
}
});
});
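The new `toXMLEqual` matcher passes when the two XML strings are identical after every whitespace character is removed. The same idea in Python, with the tradeoff spelled out: stripping all whitespace also collapses spaces inside text nodes, so `<p>a b</p>` and `<p>ab</p>` would compare equal.

```python
import re

def xml_equal(actual, expected):
    """Whitespace-insensitive string comparison, mirroring the Jasmine matcher."""
    strip = lambda s: re.sub(r'\s+', '', s)
    return strip(actual) == strip(expected)

print(xml_equal('<p>\n  hi\n</p>', '<p>hi</p>'))   # True
```

This makes the specs robust to the indentation changes the patch introduces in the generated XML, which is exactly why the specs switch from `toEqual` to `toXMLEqual`.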
diff --git a/lms/templates/problem.html b/lms/templates/problem.html
index 55e1e77c78..1b912ad821 100644
--- a/lms/templates/problem.html
+++ b/lms/templates/problem.html
@@ -12,9 +12,7 @@ from openedx.core.djangolib.markup import HTML
-
- ${ HTML(problem['html']) }
-
+ ${ HTML(problem['html']) }
% if demand_hint_possible:
From 3556f2a3ea9cdec97ffed966c77b1cb9b9808c0d Mon Sep 17 00:00:00 2001
From: Ehtesham
Date: Tue, 21 Jun 2016 16:26:00 +0500
Subject: [PATCH 2/8] make CAPA problems (MultipleChoice and Checkboxes)
accessible
---
common/lib/capa/capa/capa_problem.py | 19 +++-
common/lib/capa/capa/inputtypes.py | 5 +-
common/lib/capa/capa/responsetypes.py | 14 ++-
.../lib/capa/capa/templates/choicegroup.html | 87 +++++++++++--------
.../lib/capa/capa/tests/test_html_render.py | 32 ++++++-
.../capa/capa/tests/test_input_templates.py | 25 +++---
.../xmodule/js/src/capa/display.coffee | 9 +-
common/test/acceptance/pages/lms/problem.py | 22 +++++
.../tests/lms/test_certificate_web_view.py | 6 +-
.../acceptance/tests/lms/test_lms_problems.py | 2 +-
.../tests/lms/test_problem_types.py | 70 +++++++--------
11 files changed, 183 insertions(+), 108 deletions(-)
diff --git a/common/lib/capa/capa/capa_problem.py b/common/lib/capa/capa/capa_problem.py
index 4a719a5711..15dd3f538a 100644
--- a/common/lib/capa/capa/capa_problem.py
+++ b/common/lib/capa/capa/capa_problem.py
@@ -176,7 +176,7 @@ class LoncapaProblem(object):
# transformations. This also creates the dict (self.responders) of Response
# instances for each question in the problem. The dict has keys = xml subtree of
# Response, values = Response instance
- self._preprocess_problem(self.tree)
+ self.problem_data = self._preprocess_problem(self.tree)
if not self.student_answers: # True when student_answers is an empty dict
self.set_initial_display()
@@ -752,7 +752,10 @@ class LoncapaProblem(object):
if problemtree.tag in inputtypes.registry.registered_tags():
# If this is an inputtype subtree, let it render itself.
- status = "unsubmitted"
+ response_id = self.problem_id + '_' + problemtree.get('response_id')
+ response_data = self.problem_data[response_id]
+
+ status = 'unsubmitted'
msg = ''
hint = ''
hintmode = None
@@ -766,7 +769,7 @@ class LoncapaProblem(object):
hintmode = self.correct_map.get_hintmode(pid)
answervariable = self.correct_map.get_property(pid, 'answervariable')
- value = ""
+ value = ''
if self.student_answers and problemid in self.student_answers:
value = self.student_answers[problemid]
@@ -780,6 +783,7 @@ class LoncapaProblem(object):
'id': input_id,
'input_state': self.input_state[input_id],
'answervariable': answervariable,
+ 'response_data': response_data,
'feedback': {
'message': msg,
'hint': hint,
@@ -836,6 +840,7 @@ class LoncapaProblem(object):
Obtain all responder answers and save as self.responder_answers dict (key = response)
"""
response_id = 1
+ problem_data = {}
self.responders = {}
for response in tree.xpath('//' + "|//".join(responsetypes.registry.registered_tags())):
response_id_str = self.problem_id + "_" + str(response_id)
@@ -857,6 +862,12 @@ class LoncapaProblem(object):
entry.attrib['id'] = "%s_%i_%i" % (self.problem_id, response_id, answer_id)
answer_id = answer_id + 1
+ # Find the label and save it for html transformation step
+ responsetype_label = response.find('label')
+ problem_data[self.problem_id + '_' + str(response_id)] = {
+ 'label': responsetype_label.text if responsetype_label is not None else ''
+ }
+
# instantiate capa Response
responsetype_cls = responsetypes.registry.get_class_for_tag(response.tag)
responder = responsetype_cls(response, inputfields, self.context, self.capa_system, self.capa_module)
@@ -881,3 +892,5 @@ class LoncapaProblem(object):
for solution in tree.findall('.//solution'):
solution.attrib['id'] = "%s_solution_%i" % (self.problem_id, solution_id)
solution_id += 1
+
+ return problem_data
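The `_preprocess_problem` change above collects per-response display data into a dict keyed by `<problem_id>_<response_index>`, taking the text of an optional `<label>` child of each responsetype. A standalone sketch of that keying logic, using the stdlib ElementTree instead of the lxml tree capa actually walks, and hardcoding one response tag where capa iterates all registered responsetypes:

```python
import xml.etree.ElementTree as ET

def build_problem_data(xml_string, problem_id):
    """Collect response labels keyed by '<problem_id>_<index>' (sketch)."""
    tree = ET.fromstring(xml_string)
    problem_data = {}
    for index, response in enumerate(tree.iter('multiplechoiceresponse'), start=1):
        label = response.find('label')
        problem_data['%s_%d' % (problem_id, index)] = {
            'label': label.text if label is not None else ''
        }
    return problem_data

xml = ('<problem><multiplechoiceresponse><label>Which is a fruit?</label>'
       '</multiplechoiceresponse></problem>')
print(build_problem_data(xml, 'i4x_demo'))
```

The returned dict is what the rendering step later looks up by `response_id` to pass `response_data` into each inputtype's template context.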
diff --git a/common/lib/capa/capa/inputtypes.py b/common/lib/capa/capa/inputtypes.py
index f5d079ac64..53ff76c4d5 100644
--- a/common/lib/capa/capa/inputtypes.py
+++ b/common/lib/capa/capa/inputtypes.py
@@ -224,7 +224,8 @@ class InputTypeBase(object):
self.hint = feedback.get('hint', '')
self.hintmode = feedback.get('hintmode', None)
self.input_state = state.get('input_state', {})
- self.answervariable = state.get("answervariable", None)
+ self.answervariable = state.get('answervariable', None)
+ self.response_data = state.get('response_data', None)
# put hint above msg if it should be displayed
if self.hintmode == 'always':
@@ -316,8 +317,10 @@ class InputTypeBase(object):
'value': self.value,
'status': Status(self.status, self.capa_system.i18n.ugettext),
'msg': self.msg,
+ 'response_data': self.response_data,
'STATIC_URL': self.capa_system.STATIC_URL,
}
+
context.update(
(a, v) for (a, v) in self.loaded_attributes.iteritems() if a in self.to_render
)
diff --git a/common/lib/capa/capa/responsetypes.py b/common/lib/capa/capa/responsetypes.py
index 921148b1f7..15112853cb 100644
--- a/common/lib/capa/capa/responsetypes.py
+++ b/common/lib/capa/capa/responsetypes.py
@@ -250,8 +250,18 @@ class LoncapaResponse(object):
- renderer : procedure which produces HTML given an ElementTree
- response_msg: a message displayed at the end of the Response
"""
- # render ourself as a + our content
- tree = etree.Element('span')
+ _ = self.capa_system.i18n.ugettext
+
+ # get responsetype index to make responsetype label
+ response_index = self.xml.attrib['id'].split('_')[-1]
+ # Translators: index here could be 1,2,3 and so on
+ response_label = _(u'Question {index}').format(index=response_index)
+
+ # wrap the content inside a section
+ tree = etree.Element('section')
+ tree.set('class', 'wrapper-problem-response')
+ tree.set('tabindex', '-1')
+ tree.set('aria-label', response_label)
# problem author can make this span display:inline
if self.xml.get('inline', ''):
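The responsetypes change wraps each rendered response in a `<section>` carrying a `wrapper-problem-response` class, `tabindex="-1"` (so scripts can move focus to it), and a translated "Question {index}" aria-label. A sketch of the wrapper construction with the stdlib ElementTree (capa uses lxml.etree, whose Element API is compatible for this usage):

```python
import xml.etree.ElementTree as ET

def make_response_wrapper(response_index):
    """Build the accessible wrapper element for one response (sketch)."""
    tree = ET.Element('section')
    tree.set('class', 'wrapper-problem-response')
    tree.set('tabindex', '-1')
    tree.set('aria-label', u'Question {index}'.format(index=response_index))
    return tree

el = make_response_wrapper('2')
print(ET.tostring(el).decode())
```

Switching the wrapper from a bare `<span>` to a labelled `<section>` is what lets screen readers announce each question as a distinct landmark, which is the accessibility goal named in the commit subject.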
diff --git a/common/lib/capa/capa/templates/choicegroup.html b/common/lib/capa/capa/templates/choicegroup.html
index 55314db731..f3ce336b54 100644
--- a/common/lib/capa/capa/templates/choicegroup.html
+++ b/common/lib/capa/capa/templates/choicegroup.html
@@ -1,43 +1,56 @@
+<%
+ def is_radio_input(choice_id):
+ return input_type == 'radio' and ((isinstance(value, basestring) and (choice_id == value)) or (
+ not isinstance(value, basestring) and choice_id in value
+ ))
+%>
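The new Mako helper above decides whether a choice should render as a checked radio input: for a string-valued answer the choice id must equal the value, otherwise (a list of answers) the id must be a member of the collection. The same predicate in plain Python, using `str` where the Python 2 template uses `basestring`:

```python
def is_radio_input(input_type, value, choice_id):
    """Mirror of the choicegroup.html template helper (illustrative)."""
    if input_type != 'radio':
        return False
    if isinstance(value, str):
        # Single stored answer: checked only on an exact id match.
        return choice_id == value
    # Collection of stored answers: checked when the id is a member.
    return choice_id in value

print(is_radio_input('radio', 'choice_1', 'choice_1'))                # True
print(is_radio_input('radio', ['choice_0', 'choice_2'], 'choice_2'))  # True
print(is_radio_input('checkbox', 'choice_1', 'choice_1'))             # False
```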