Analysis of a survey
given to Math 4 Students (Algebra)
at the Conclusion of the Course, Fall 2001
Computing and Information Services
University of Missouri – Rolla
15 November 2001
At the conclusion of the Math 4 class taught during the Fall 2001 semester, we issued a survey to all of the students. With this survey, we wanted to determine, from a student's viewpoint, the effectiveness of the new BrainTrax Algebra Brain. Our measure of effectiveness rests on the premise that students who know more about algebra will be more successful in the course.
This survey was administered to 220 students. Of those, 92 students notified us that they did not use the system at all. These students were instructed not to complete the survey.
Figure 1a: Math 4 Fall 2001 Usage (entire class)
The following pages hold charts that reflect the responses of the remaining 128 students, both with and without regard to their visit frequency. We will also draw conclusions in this report based on student usage.
Figure 1b: Math 4 Fall 2001 Usage (participants)
Each of the following pages contains the analysis of a single question, or premise, we submitted to the survey participants.
“I Learned Something New from the Algebra Brain.”
In this question, we sought student opinion about the Brain's effectiveness in teaching concepts not presented by a human instructor.
Figure 2a: Learned Something New
Discarding the “no opinion” votes as neutral, students agreed at better than a 2 to 1 ratio (59/23) that this system did in fact teach them “new” things about algebra. We were not surprised to observe that, in general, the more frequently the student visited the site, the more strongly they agreed with this statement. The following chart illustrates this point:
Figure 2b: Learned Something New
For those students with fewer than 5 visits, for whatever reason, a majority (18/12) disagreed with our premise that they learned something new. However, students falling into the 5 to 20 visit category agreed at better than an 8 to 1 ratio (33/4) that they did in fact learn something new at the site. An even stronger ratio of agreement (11/1) developed out of the 20 to 50 visit category.
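The agreement ratios quoted above follow directly from the vote counts; a minimal check, using the counts already stated in this section:

```python
# Agreement ratios, discarding "no opinion" votes as neutral.
# Counts (agree, disagree) are taken from the survey figures above.
def ratio(agree, disagree):
    return agree / disagree

print(round(ratio(59, 23), 2))  # overall: 2.57, better than 2 to 1
print(round(ratio(33, 4), 2))   # 5-20 visits: 8.25, better than 8 to 1
print(round(ratio(11, 1), 2))   # 20-50 visits: 11.0
```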
Clearly the students who used this system did learn from it. While there are likely many reasons that the students did not use the system in uniformly large volume, those who did found their learning experience with regard to algebra enhanced.
“The Algebra Brain Clarified Something I Didn’t Understand in Class”
In this question, we wanted to know if our method of delivery was more effective than the instructor in clarifying a topic, once it was introduced by a human.
Figure 3a: The Algebra Brain Clarified Something Not Clear in Lecture
Discarding the “no opinion” votes as neutral, students agreed at better than a 3 to 1 ratio (65/17) that this system clarified concepts covered in class but not fully understood by the student. The usage trend continues here, with students having a higher visit count trending toward stronger agreement with our premise.
Figure 3b: The Algebra Brain Clarified Something Not Clear in Lecture
A noticeable change in the pattern occurs here in the lowest-volume category, where even students with very few visits show a majority in agreement with our premise (18/13). As the usage grows, so does the majority, with the 5 to 20 visit group agreeing at better than an 8 to 1 ratio (34/4), and the 20 to 50 visit group continuing with the strongest ratio (11/1).
“The Brain Interface Made It Easier to Find What I Wanted.”
Next, we were interested in the effect of the Brain navigation tool itself. While we have observed students use this tool effectively, we were interested in what the students themselves thought of it.
Figure 4a: Brain Interface Helped
Again discarding the no-opinion votes, students reported at well over a 2 to 1 ratio (63/25) that this tool did make it easier for them to find the information they needed inside the site. The usage trend again continues here, with students having a higher visit count trending toward stronger agreement with our premise.
Figure 4b: Brain Interface Helped
The previously noted change in the pattern continues here; students with very few visits still maintain a majority who agree with our premise (21/13). As the usage grows to the next level, so does the majority, with the 5 to 20 visit group agreeing at nearly a 4 to 1 ratio (32/9), and the 20 to 50 visit group maintaining a strong majority opinion, at just under a 3 to 1 ratio (8/3).
“The Web Pages Were Easily Understood.”
Regarding the content, we were interested in how effective our writing and illustrations were in conveying the information to the student, so we asked about the understandability of the web pages. Notice that there is more to this question than meets the eye: students attending UMR span a wide range of proficiency in both English and basic algebra, and it is our intent that the web pages meet the needs of the entire range of student capabilities in these two areas.
Figure 5a: Pages Easily Understood
Again discarding the no-opinion votes, students reported at better than a 4 to 1 ratio (76/16) that our algebra content was effective in communicating at the student’s level of understanding.
Figure 5b: Pages Easily Understood
For this issue, even the infrequent visitors agreed strongly with our premise, giving a 4 to 1 ratio (28/7) in the agree column. As the usage grows, so does the majority, with the 5 to 20 visit group agreeing at nearly a 5 to 1 ratio (38/8), and the 20 to 50 visit group maintaining a strong majority opinion of agreement (9/1).
“The Web Pages Increased My Comprehension of Algebra.”
Next, we wanted to know how effective that communication was, so we asked the students, once again, whether the web pages increased their comprehension of algebra. Note that we asked this question three different ways to help ensure that we received valid responses about the site's effectiveness.
Figure 6a: Pages Increased Comprehension of Algebra
Staying with our pattern of discarding no-opinion votes, students this time reported at better than a 4 to 1 ratio (69/15) that our algebra content was effective in communicating at the student’s level of understanding.
Figure 6b: Pages Increased Comprehension of Algebra
For this issue, agreement among the infrequent visitors dropped sharply, to just under a 2 to 1 ratio (19/11) in the agree column. However, in keeping with earlier observations, as the usage grows, so does the majority: the 5 to 20 visit group agreed at over a 9 to 1 ratio (38/4), and the 20 to 50 visit group joined the 50 to 100 visit group with no disagreement at all.
"I Would Like to Have More of My Classes Use Brain Technology to Support Them."
Finally, we asked students whether or not they wanted BrainTrax systems to support other classes. Student opinions were not quite as strong on this issue.
Figure 7a: BrainTrax for Other Classes?
Again discarding the no-opinion votes, students agreed at nearly a 7 to 1 ratio (68/10) that this type of system works for them and that they would like to see more of it in place.
In other words, we interpret this to mean that 68 of the 220 students enrolled in Math 4, or over 30% of the entire class (including those who did not use the system), see value in this system for helping them to learn. That figure rises to an even more compelling 53% when considering only the students who used the system.
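The two percentages above can be reproduced from the enrollment, participant, and agreement counts already stated in this report:

```python
enrolled = 220      # all Math 4 students surveyed
participants = 128  # students who used the system at least once
agree = 68          # agreed they would like BrainTrax in other classes

print(f"{agree / enrolled:.1%}")      # share of the entire class: 30.9%
print(f"{agree / participants:.1%}")  # share of participants only: 53.1%
```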
Following the trend demonstrated earlier, those who use the system more tend to see it as more valuable; of course, the causality may also run the other way, in that they may use it more because they find it more valuable.
Figure 7b: BrainTrax for Other Classes?
There was a large "no-opinion" group for this premise, giving rise to several questions that will require further study.
Individual BrainTrax Attributes
We were also interested in determining which components of the BrainTrax Algebra system the students found useful, so we asked them to evaluate each of the primary components with a rating between 1 and 5, with 1 being "not useful at all" and 5 being "most useful".
Figure 8: Usefulness of Brain Components
One observation about this chart is in order: the Interactive Example and Testing System (IETS) is valued much more highly by the students than anything else on the chart.
As we had expected, the "terms" capability, where a term's definition pops up when the student places the mouse pointer over it, was also a popular item.
It is my somewhat biased opinion that these figures indicate that the BrainTrax system does, in fact, perform in the manner we intended. It can be enhanced to perform better, and these enhancements will be addressed as they are identified, provided continued funding for this effort is available.
Comments regarding this report, including the conclusions drawn, are welcome.
Copyright © 1999 - 2003 University of Missouri – Rolla
All rights reserved.