This page last changed on May 20, 2008 by sfentress.

Thoughts about UDL Reports Available for Teachers and Researchers via the Portal

Andy, 5/15/08
Ed and I brainstormed about UDL reports that would be useful to researchers and teachers.  The following ideas might be discussed at a UDL staff meeting.

Science content-related reports

First, a vocabulary problem is arising that needs to be fixed.  Sometimes units (like friction34, i.e. friction blue) are being referred to as activities.  This will cause confusion.  In the screenshot below (from the portal), the columns to the right of students' names are Activity 1, Activity 2, and Activity 3 - but a mouseover shows that the columns, in fact, represent UDL grades 5-6 electricity, friction, and plants units.  The labels in the top row should read Unit 1, Unit 2, and Unit 3---or, if it's feasible, electricity 56, friction 56, plants 56.
In addition to knowing which units students have been using, teachers and researchers should have a way to find out which activities each student has done within a unit.  Perhaps clicking the check mark next to a student's name for that unit could open a separate screen showing this detail, with 10 columns next to each student's name, like this:

            Pre-  Intro  Story  Activ 1  Activ 2  Activ 3  Activ 4  Math  Wrapup  Post-
  Student 1  ✓     ✓
  Student 2

(There might be more than one way to use the portal to find this type of information.  For example, the portal currently includes a screen for an individual student showing which units he or she has used; clicking a check mark on that screen might lead to a detailed view of the student's engagement in the 10 activities for that unit, per above.)
For units as well as for activities the software needs to decide whether or not a check mark should show up in a given cell.  A good decision rule might be whether the student has saved any data / input / response for that unit or activity. If so, show a check mark. If not, do not show a check mark. (Later, we could come up with a more sophisticated scheme, if we decide we need one.)  If a student simply opens the activity to look at it, that would not merit a check mark.
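A minimal sketch of this decision rule (the function and its input are hypothetical; the portal's actual data model may differ):

```python
def check_mark(saved_items):
    """Show a check mark only if the student has saved at least one
    piece of data / input / response for the unit or activity.
    Simply opening the activity to look at it does not count."""
    return "✓" if saved_items else ""
```

A more sophisticated rule (e.g., requiring a minimum amount of saved work) could replace this later without changing how the report reads the result.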

Through the portal, it is currently feasible for a teacher or researcher to see a particular student's saved work on a selected unit (e.g., friction 34).  That's useful.  A variety of ways to look at saved work might be useful to teachers and researchers, including:

  1. One student's work on a whole unit (as stated above and already implemented);
  2. The work of one student at a time on a particular item / activity, ideally with an easy way to move from one student to the next in the same class (e.g., the teacher or researcher might want to review item 6 in activity 3, one student at a time; or activity 3 as a whole, one student at a time);
  3. A summary view of one item, e.g. the constructed responses of all students answering a particular question, like item 6 in activity 3, with all students' responses displayed on a single (long) page;
  4. A score for a single student (such as how many multiple-choice items he or she got right on the pre-test or post-test, or both);
  5. A score for a class (such as the average pre- or post-test score for the class, or both).

In several of these cases (items 2 and 4 above, particularly), it will be useful to develop a UI for teachers and researchers to quickly move from one student to another within a class without having to navigate back and forth among several screens.  

UDL-related reports

Presumably all or most of the UDL-related reports would be for researchers, although our vision is that teachers need to be able to assign features to students (as shown, for example, in some of the original PowerPoint slides showing Setup by Class and by Student---see last page).

We have discussed the need for data collection through the portal that will allow us to do research associating students with their special needs (e.g., English language learner; identified as Special Ed with an IEP; poor reading skills).  Next year, a teacher should be required to enter this type of data for all students in his or her class or classes.  We need the information in order to associate behaviors (e.g., clicking "help" buttons) and outcomes (e.g., post-test scores) with students' learning needs.  (We also need a way to handle this information that protects students' confidentiality, and we need to brainstorm some more about that.  Perhaps these data are only provided to researchers in files that replace students' names with ID numbers.)
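One simple way to produce such a confidentiality-protecting file, sketched here with hypothetical record fields: replace each student's name with a stable ID number before the export reaches researchers, keeping the name-to-ID table private.

```python
def anonymize(records):
    """Replace student names with sequential ID numbers, keeping a
    private lookup table so the same student always maps to the
    same ID.  Each record is a dict with a "name" key plus whatever
    needs data the teacher entered (e.g., ELL status, IEP)."""
    name_to_id = {}
    anonymized = []
    for rec in records:
        name = rec["name"]
        if name not in name_to_id:
            name_to_id[name] = len(name_to_id) + 1
        out = dict(rec)          # copy, so the original is untouched
        del out["name"]
        out["student_id"] = name_to_id[name]
        anonymized.append(out)
    return anonymized, name_to_id
```

Researchers would receive only the anonymized records; the lookup table would stay with whoever administers the portal.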

My understanding is that the MAC project produced hundreds of thousands or millions of bits of information, making it a challenge to analyze those data.  Our work on UDL will benefit from developing an analysis plan ahead of time: what data do we especially need to collect, why, and how do we expect to use them?


For example, if we offer a choice of language, and if teachers get to designate which students use a language other than English, we would want to know which teachers took advantage of this feature, for which students, and on which units.  We would analyze: do teachers tend to do this for all students in a class whose first language is not English, or only some?  We would want to include a survey or interview question for teachers who have used UDL units asking whether this feature was useful.

Will students be allowed to turn the non-English language on and off at will?  That would raise more complex data collection and analysis issues; e.g., it would be far more difficult to associate a score, or time spent, or satisfaction on a unit with the language used by the student. Our research questions might be simple: how often do teachers use this feature, for which students, why, and how useful do they believe it is to those students?  (A complex research question, surely beyond the scope of this project, would be to test whether students who study a UDL science unit in their native language have better outcomes than those who study it in English if that is not their native language.  As usual, random assignment of limited-English proficient students to a condition would strengthen credibility of the findings.)


If teachers assign scaffolding levels, we would want similar information in some type of report: which teachers take advantage of the feature, for which students, on which units?  Later, we might ask them how and why they made those decisions, and how useful they believe the feature was to students.  If students can also select different scaffolding levels, we would want to know on which items they do that, and how (e.g., how many times they change to a different level).

Coaches and Technical Help

For coaches and technical help, we would want to know who uses them, at what point in which unit (i.e., on which page), and be able to aggregate easily (X% of this class, Y% of that one, used a coach for such-and-such unit, and most often they used it for this or that page).  We would want to gather information from teachers and students (via surveys and/or interviews or focus groups) about how useful they found the coaches and technical help.
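The aggregation described above could be as simple as the following sketch (event format and names are assumptions, not the portal's actual schema):

```python
from collections import Counter

def coach_usage_summary(events, class_roster):
    """Summarize coach-use events for one class on one unit:
    the percent of the class that used a coach at least once,
    and the pages where it was used, most frequent first.
    Each event is a (student, page) pair."""
    users = {student for student, _ in events}
    pct = 100.0 * len(users) / len(class_roster)
    by_page = Counter(page for _, page in events)
    return pct, by_page.most_common()
```

The same summary run per unit would give the "X% of this class, Y% of that one" comparison directly.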

Technical help may be multi-dimensional, and we want as much detail as feasible.  E.g., a student might be able to click on a smart graph to get different types of help. 

Time on Task

We should keep this very simple, perhaps collecting nothing more than the total cumulative time a student spends on a unit, from the time they start the pre-test to when they complete the post-test.  Their time using a unit may well accumulate over several school days.
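Even this simple measure has to sum across sessions. A sketch, assuming the portal can log session start and end times on a common clock (hypothetical representation; here, minutes):

```python
def total_time_on_task(sessions, pretest_start, posttest_end):
    """Sum a student's session time on a unit, counting only time
    between starting the pre-test and completing the post-test.
    Sessions may span several school days; each session is a
    (start, end) pair in minutes on a common clock."""
    total = 0
    for start, end in sessions:
        # clip each session to the pre-test-to-post-test window
        start = max(start, pretest_start)
        end = min(end, posttest_end)
        if end > start:
            total += end - start
    return total
```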

Using the Wrap-up

We might be able to insert some research questions in each unit's wrap-up, such as asking whether the unit was too easy, too hard, or just right; or whether the technology worked.

How do we know a feature was useful?

It will be a challenge to learn which features are useful, not just how often they are used. Asking teachers and students is one feasible approach; analyzing which students used what features and correlating that usage with their needs is another.

Appendix: Old UDL PowerPoint Slides Showing Setup (samples)

Document generated by Confluence on Jan 27, 2014 16:49