Course Group Reports


Note: The CGRs reporting on each course group's activities as we go through the conversion to semesters are available elsewhere.

The most recent report for each course group may be accessed by following the corresponding link. Complete details of the course group mechanism appear below the list of reports.

  1. Software Spine (Sept. '07) (Oct. '04; 1999-'00)
  2. Programming Languages (Dec. '08/Jan. '09) (Feb. '05; Dec. '99)
  3. Computer Graphics (Oct. '08) (May '05; previous report)
  4. Software Engineering (Spring/Summer '09) (Au '05; previous report)
  5. Theoretical Foundations (Wi '09) (2006 report; previous report)
  6. AI (May '09) (Oct. '05; previous report)
  7. Networking (May '08) (Apr. '05; March '02)
  8. Database (Feb. '07) (previous report, May '03)
  9. Operating Systems (June '05) (pdf)
  10. Computer Architecture (Spring '09) (previous report)

1. Background and Details

This section was written in the early years of the CGR mechanism. Some important changes have been/are being made (in 2008/'09) in our assessment and evaluation mechanisms, and these have translated into important changes in the content and structure of individual CGRs. These revisions are described below.

Many of the courses in our undergraduate and graduate programs are, of course, closely related to each other. Historically, coordinators of strongly related courses have interacted informally and on an as-needed basis, apprising each other of changes in one course that may have an impact on a related course. Following extensive discussion in the Curriculum Committee and in the Undergraduate Studies Committee, we established the mechanism of Course Group Reports (CGRs) both to support these efforts and to document them.

All regular CSE courses are organized into groups of related courses. Coordinators of courses in each group are expected to interact regularly with each other and with faculty (including part-time faculty) who regularly teach the courses in question, to keep track of any problems that might arise, to identify any changes that might be appropriate to make in the courses, and so on. Ideas for any substantial changes in a course will, naturally, have to be brought to the Curriculum Committee for action in the usual manner. In addition, the coordinators of each group will present a status report on the group of courses to the Curriculum Committee on a regular basis, perhaps once every two or three years. For each course in the group, the report should address such questions as:

These reports will provide a record of and rationale for changes that may take place in the various courses. The reports will also make it easy for new members of faculty, as well as new students, and others interested in our programs, to get a feel for why the courses are the way they are. Further details of the CGR mechanism are available.

The current set of course groups is as follows:

  1. Software Spine: CSE 221, 222, 321.
  2. Software Engineering: CSE 560, 757, 758, 601.
  3. Computer Architecture: CSE 360, 621, 675, 676, 721, 775, 778; EE 261, 206, 567.
  4. Theory: Math 366, 566; CSE 541, 625, 680, 725, 727, 780; Stat 427, 428.
  5. Databases: CSE 616, 670, 671, 770, 772.
  6. Programming Languages: CSE 459, 655, 755, 756.
  7. Operating Systems: CSE 660, 662, 741, 760, 762, 763.
  8. Computer Networks: CSE 677, 678, 679, 752, 777.
  9. Computer Graphics: CSE 581, 681, 781, 782, 784.
  10. Artificial Intelligence: CSE 612, 630, 730, 731, 732, 739, 779.

2. Revisions (Dec. 2008/Jan. 2009)

a. Changes in accreditation requirements:

Accreditation criteria continue to evolve. In particular, the CAC Criteria include a new set of outcomes that our program will have to meet. The Undergrad Studies Committee has proposed revising our current outcomes to be consistent with the new CAC requirements (while continuing to meet the EAC Criterion 3 requirements). The proposed new outcomes are available here; these are expected to be approved in the near future.

The revised CAC Criteria also require (as part of Criterion 5) that each required CS course include a set of "expected performance criteria (PCs)". These seem to be essentially the same as the "intended learning outcomes (LOs)" already included in our course syllabi: each LO in any of our course syllabi is a specific item of knowledge or skill at a specific expected level of accomplishment (mastery, familiarity, or exposure), which is what a PC also seems to be. Thus we already meet this requirement.

We should also note that ABET/EAC seems to have decided that an outcome can be assessed only if a set of PCs is defined corresponding to the outcome; the claim seems to be that it is the PCs, not the outcomes, that can be assessed. Thus what we need to do is map each of our (proposed new) outcomes to a set of PCs/LOs. The most sensible way to do this would be to use the LOs specified in the syllabi of a set of required courses; this will allow us to easily make the case that our (required) curriculum is designed to achieve our outcomes. In those cases where the mapping doesn't seem to work well with the existing LOs, we should revise the LOs of the particular courses in appropriate ways.

A preliminary look at the current LOs for various courses suggests that such revision is especially likely to be required for the capstone design courses. These are the main courses (in the major) that contribute to the soft outcomes (team-working, communication skills, etc.), so we would expect to map those outcomes to the LOs of these courses. But, somewhat surprisingly, the existing LOs of some of the capstone courses do not mention any skill related to these outcomes; these will definitely have to be revised. And it won't be enough to just list things such as "ability to communicate", since that is an outcome, not a "performance criterion". So we need to come up with suitable PCs/LOs for each of the soft outcomes. One possibility would be to use the dimensions in the rubrics used to evaluate particular activities in the capstone courses, along with some suitable variation of the mastery/familiarity/exposure scale, to formulate the LOs. This will have to be done soon.

Once we do this, summaries of these rubric-based assessments in the various capstone courses can serve as the main assessment mechanism for these LOs and the soft outcomes. In turn, this means that we must evaluate these summaries regularly and collect and document them appropriately.
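As a bookkeeping aid, the proposed outcome-to-LO mapping could be kept in a simple machine-readable form so that outcomes not yet covered by any LO (as noted above for some capstone courses) are easy to spot. The Python sketch below is one hypothetical way to do that; the outcome names, course name, and LO text are entirely invented for illustration, and this is not part of any existing process.

```python
# Hypothetical sketch: record the outcome-to-PCs/LOs mapping and flag
# outcomes that no course LO currently covers. All names are invented.

outcome_to_los = {
    "communication": [("Capstone-A", "present design decisions to stakeholders")],
    "team-working": [],  # an as-yet-uncovered "soft" outcome
}

def uncovered_outcomes(mapping):
    """Return, sorted, the outcomes that map to no PCs/LOs."""
    return sorted(outcome for outcome, los in mapping.items() if not los)

print(uncovered_outcomes(outcome_to_los))  # → ['team-working']
```

Keeping the mapping in one place like this would also make it straightforward to regenerate the outcome coverage tables when LOs are revised.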

We should also make some changes in our assessment mechanisms. In particular, changes should be made in POCAT, the recently instituted "exit test" for BS-CSE majors. Currently, the questions on the POCAT are based on various required courses but they are not specifically designed to help assess student performance in the various LOs of those courses. Given the issues mentioned in the paragraphs above, it would make sense to change this. Corresponding to each LO in the syllabus of each required course (or at least the higher-level required courses), it would be useful to create one or more questions more or less directly related to that LO; if we are able to design questions that correspond to more than one LO, that is fine as long as performance in the question helps assess achievement of each corresponding outcome. Of course, not all of these questions can be included in any particular POCAT since we don't want to substantially increase the number of questions in individual POCATs. Instead, we could create a bank of questions that includes at least one question corresponding to each LO in each required course; and for each offering of POCAT, we would choose an appropriate set of questions from the bank. This should ensure, over five or six offerings of the POCAT, that we have included questions corresponding to each LO of each required, higher-level course. And by looking at student performance in the POCAT on the question(s) related to any given LO of any given (required, higher-level) course, we will be able to assess how well students have met that outcome and arrive at any needed changes either in the course(s) or in the statement of the LO.
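The question-bank idea above could be prototyped along the following lines. This Python sketch (with invented question names and LO tags) splits a small bank into bounded per-offering question sets and checks that, across the offerings, every LO is covered at least once; an actual POCAT bank would of course be curated by hand.

```python
# Hypothetical sketch: rotate bank questions across POCAT offerings so that
# every LO is eventually assessed. Question and LO labels are invented.

bank = {                       # question -> LOs it helps assess
    "Q1": {"CSE660:LO1"},
    "Q2": {"CSE660:LO2", "CSE675:LO1"},
    "Q3": {"CSE675:LO2"},
    "Q4": {"CSE541:LO1"},
}

def plan_offerings(bank, per_offering):
    """Split the bank into offerings of at most per_offering questions;
    together the offerings cover every LO tagged in the bank."""
    questions = sorted(bank)
    return [questions[i:i + per_offering]
            for i in range(0, len(questions), per_offering)]

offerings = plan_offerings(bank, per_offering=2)
covered = set().union(*(bank[q] for off in offerings for q in off))
assert covered == set().union(*bank.values())  # all LOs appear somewhere
print(offerings)  # → [['Q1', 'Q2'], ['Q3', 'Q4']]
```

With a bank tagged this way, it is also easy to go in the other direction: given student performance on a question, look up which LOs (and hence which outcomes) that performance helps assess.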

b. Changes in CGR content:

To account for these changes, the CGR is being revised as well. The main change is that the CGR will include, for each course that is required (or is a popular elective) in the BS-CSE program, an analysis of how well the course, based on recent offerings, achieves its intended LOs and possible changes based on this analysis. Given the (proposed) mapping from BS-CSE program outcomes to the LOs of, especially, required courses, this must be the key part of the analysis in the CGR. This should help with meeting the requirements related to evaluating the achievement of program outcomes and using the results to improve the program. Part of this analysis should also be based on student performance in recent POCATs, in questions related to particular LOs of particular required courses. Currently (as of Au '08), the questions on the POCAT are not designed specifically to help assess student performance with respect to specific LOs of the various courses. As a result, this part of the analysis in CGRs completed during this period will be very limited. As an alternative, as part of this section of this CGR, it may be useful to consider some possible questions, related to specific LOs, that might be used in future POCATs.

CGRs of groups that include capstone design courses have a particularly important role to play. They should include discussion and analysis of the summary results of the rubrics-based assessments of various activities in recent offerings of the course. Given the proposed inclusion, in the syllabi of these courses, of LOs based on the dimensions in these rubrics, and the proposed mapping of soft outcomes to these LOs, this will help meet the CAC/EAC Criteria requirements with respect to assessment and evaluation of the soft outcomes. Once all CGRs adopt this format, in particular the discussion of the level of achievement of the various LOs based on performance in recent POCATs and the discussion of the summary rubric results from the relevant capstone design course indicating the level of achievement of the LOs related to the soft outcomes, the tie-in between our main assessment processes (POCAT and the capstone course rubrics) and our evaluation processes (the CGRs) will be clear.

An alternative would be to have another CGR, a Professional Skills CGR, that contains this part of the analysis, based on the rubrics results from the various capstone design courses (as well as CSE 601). If we take this approach, the analysis of each capstone design course will be split into two parts: one dealing with the technical LOs of the course, which will belong with the CGR for the subject group that the course belongs to, and one dealing with the soft/professional outcomes of the course; CSE 601 will belong to the Professional Skills CGR. The advantage of this approach is that the discussion of the various soft outcomes will be in one report rather than being split among several. Such an approach might also make it easier for the various capstone courses to learn from each other's experiences in this area.

c. Changes in CGR format:
(For an example of a CGR using this new format, see the PL report dated Dec. 2008/Jan. 2009.)

The recommended format for CGRs is as follows. The initial part of the report should consist of a table that lists the courses in the group (course number, title, and number of credit hours) and specifies, for each course, whether it is a required or an elective course in the BS-CSE program. Section 1, Summary, should be a brief, high-level summary of the group, including mention of any recent major changes (such as development of key new courses) in the group.

Section 2, Detailed analysis, is the main portion of the CGR. Section 2.1 should provide a summary description of each course in the group: what it is intended to contribute to the group (and to the undergraduate/graduate program as a whole) and how it relates to other courses in the group (and, possibly, to other groups). The level of detail provided in the summary for the different courses does not have to be uniform; for example, new courses might be discussed in greater detail than long-existing ones. In any case, for each course, the summary should discuss not just the main topics and ideas presented in the course but also the activities, such as programming projects, homeworks, etc., that students in the course typically engage in.

Section 2.2, Evaluation of courses, is the heart of the CGR. Note first that the official syllabus (on the CSE web site) for each of our courses lists a set of intended learning outcomes (LOs). Each LO specifies an item of knowledge and/or skill at one of three levels of performance: mastery, familiarity, or exposure. Mastery means the student should be able to apply the knowledge or skill even in a new context, and even when not specifically instructed to do so; familiarity means the student will be able to apply it in a new context when instructed to do so; and exposure means the student will have heard the term and/or seen it used, but may not be able to discuss or use it effectively without further instruction.

Each course must be evaluated against its LOs: specifically, for each outcome, whether the course includes suitable activities corresponding to that outcome and whether students, based on their performance in these activities, seem to be achieving the outcome at the intended level. In the case of required, high-level courses, a key part of this evaluation must be based on student performance, in recent offerings of the POCAT, on questions related to the various LOs. In addition, the CGR should include ideas (probably omitting the details, since the CGRs are public) for possible new questions for inclusion in the POCAT question bank. This is especially important for any LOs that are at the mastery level.

Where there is a mismatch, especially a substantial one, between the level of achievement (mastery, familiarity, exposure) specified in the official syllabus for a given LO and student performance with respect to that LO, the CGR should propose suitable corrective actions. These may include changes in the course content, changes in the programming projects or other assigned activities, changes in the learning outcomes (including their levels), or some combination thereof.

In the case of capstone design courses, the evaluation must include discussion of the LOs related to the professional program outcomes of the BS-CSE program, such as team-work and communication skills. This discussion should be based on the summary results of the various rubrics used in the course in evaluating these LOs. Alternatively, if, as suggested above, we introduce a separate Professional Skills CGR intended to contain this evaluation for all the capstone design courses (as well as CSE 601), this part of Section 2.2 in the (technical subject group) CGR need only contain a brief summary of that evaluation.

Section 2.3, Relation to BS-CSE program outcomes and evaluation, should be organized as follows. First, list the BS-CSE program outcomes (POs). Next, for each course in the group that is required in the BS-CSE program, specify, in the form of a table, the relation between the various LOs of the course and the various POs of the program. In specifying the contribution of a particular LO to a particular PO, use "XXX" to signify strong contribution, "XX" to signify moderate contribution, and "X" to signify minimal contribution; if the LO makes no contribution to the PO, the corresponding cell in the table should be empty. The table should be followed by a brief summary, based on the discussion in Section 2.2, of how well recent offerings of the course achieved each LO, as indicated by the course content, by student performance in the course, and by any POCAT results related to the various LOs. A brief summary of the proposed changes in the course (content, activities, LOs, or POCAT questions) should conclude the discussion for this course.
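For illustration only, an LO-to-PO table of this kind could be generated from a small data structure, as in the Python sketch below; the LO/PO labels and contribution strengths are invented, and the XXX/XX/X marks follow the convention described above.

```python
# Hypothetical sketch: render a Section 2.3-style LO-to-PO contribution
# table. Strength 3/2/1 maps to XXX/XX/X; 0 leaves the cell empty.

MARKS = {3: "XXX", 2: "XX", 1: "X", 0: ""}

contributions = {            # LO -> {PO: strength}, invented labels
    "LO1": {"PO-a": 3, "PO-b": 0},
    "LO2": {"PO-a": 1, "PO-b": 2},
}

def render_table(contrib):
    """Return the table as plain text, one row per LO."""
    pos = sorted({po for row in contrib.values() for po in row})
    lines = ["LO   | " + " | ".join(f"{po:>4}" for po in pos)]
    for lo in sorted(contrib):
        cells = (f"{MARKS[contrib[lo].get(po, 0)]:>4}" for po in pos)
        lines.append(f"{lo:<4} | " + " | ".join(cells))
    return "\n".join(lines)

print(render_table(contributions))
```

Generating the table from data rather than formatting it by hand would keep the XXX/XX/X marks consistent across the CGRs as LOs and POs are revised.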

Sections 2.4 and 2.5, major changes since previous report and planned changes, should summarize the major changes in the group and the planned changes as detailed earlier in the CGR. Section 3, Conclusions, should offer brief, overall comments about the group and its contribution to achieving the BS-CSE POs. A final table should list, for each course in the group, the coordinator for the course and names of recent instructors. The names of the people involved in preparing the report and the date of the report complete the CGR.



Comments on any of the individual reports or on the mechanism as a whole may be sent to Bruce Weide, Chair of the Curriculum Committee, or to Neelam Soundarajan, Chair of the Undergraduate Studies Committee.