
The Tragedy of Assessyphus

by Michael Huber, Michael Phillips & Alex Heidenberg - October 09, 2006

In Greek mythology, Sisyphus is best known for being punished in the Underworld by being forced to roll a large stone with his hands and head in an effort to heave it over the top of a steep hill; but regardless of how much he pushes, as Sisyphus approaches the top of the hill, the stone rebounds backwards again and again. In academic mythology, Assessyphus is the faculty member who sometimes feels punished for being the department's point of contact in assessment. Many times, Assessyphus feels as if he is pushing a heavy rock up a hill, and just when he gets it close to the top, it rolls back down. The assessment process has become a burden for Assessyphus. An assessment cycle with a sample rubric is presented, with suggestions for establishing a successful plan to measure student learning.


In Greek mythology, Sisyphus is best known for being punished in the Underworld by being forced to roll a large stone with his hands and head in an effort to heave it over the top of a steep hill; but regardless of how much he pushes, as Sisyphus approaches the top of the hill, the stone rebounds backwards again and again.  Sisyphus was destined to push his rock for eternity, never reaching the top, never finishing his task.  Zeus punished Sisyphus by giving him a burden that he cannot overcome.

In academic mythology, Assessyphus is the faculty member who sometimes feels punished for being the department's point of contact in assessment. Many times, Assessyphus feels as if he is pushing a heavy rock up a hill, and just when he gets it close to the top, it rolls back down. The assessment process has become a burden for Assessyphus. In fact, Assessyphus also bears an eternal burden. However, this burden may be only a perception. Bernard Madison (2002) wrote:

Thrust onto the US higher education scene in the final two decades of the twentieth century, assessment continues to suffer mightily from misunderstanding, much of it because of the burden of its name. The other weighty contributor to the misunderstanding is assessment's cadre of early promoters: administrators, governing boards, accrediting agencies, and legislatures. Most college faculty believed that assessment was, as the name implied, some kind of comprehensive evaluation… So the lines were drawn and assessment has struggled against these misunderstandings to gain both respectability and usefulness in U.S. higher education.

If Assessyphus can convince himself that the burden lies only in the name, he might be more successful. Often, because of the name itself, the assessment process takes on negative connotations and the program doesn't get the attention it deserves. The assessment program becomes a burdensome requirement that is often delegated to the junior faculty member.

The assessment cycle

In its simplest form, assessment is the process of gathering and acting on information about student learning.  Assessment of our program creates the knowledge base that is needed to influence change.  A typical Assessment Cycle is shown in Figure 1.


Figure 1.  Typical Assessment Cycle

At the base of Assessyphus's hill lies the first step of assessment: articulating the goals and objectives. We as educators need to articulate the learning goals of the curriculum and a set of objectives that should lead to goal accomplishment. What are these goals, and where do they come from? Does Assessyphus look to Zeus and Mount Olympus (the university administration or the accrediting body, for example) for guidance, or does he struggle on his own to develop the goals? Assessment of our goals allows for a culture that seeks sustained improvement. What is the rock that Assessyphus is pushing up the hill? Many problems occur because Assessyphus tries to push too big a rock. All too often, a group begins an assessment program by wanting to assess everything, instead of a more manageable piece of the program. Assessyphus has a greater chance of success if his rock is not too big. Greece was not built in a day…

Perhaps Assessyphus should solicit assistance from other members of his department. He is not the only one with experience in determining measures for assessing objectives. As mentioned earlier, if the rock being pushed is too big, the task may seem Herculean. Getting the rest of the faculty to buy into the department's assessment plan can be a daunting task. The work is endless and the rewards are minimal, akin to Assessyphus fighting a better-armed enemy, or fighting indifference itself. A collective effort from the entire faculty will offer a better picture of overall student learning. Other faculty will also provide valuable feedback to the process. The stakeholders in a department assessment plan should work together to agree on goals and objectives before assessment occurs. However, if the department does not reach consensus and clarify its goals for student learning, Assessyphus may bear the burden of an unnecessarily complicated effort. As the faculty member responsible for assessment, Assessyphus might chair the department's assessment meetings, but he should not bear the burden of developing goals and objectives alone.

As can be imagined, one of the hardest tasks hindering Assessyphus from starting to push his rock is outlining goals and objectives. Is the goal an improved program for majors, or is the objective a measurable improvement in students' appropriate use of technology to aid in problem solving? Is Assessyphus seeking to improve proficiency in the depth and breadth of coursework? Is the goal effective communication of student skills, both verbally and in writing? Perhaps the goal is to develop appropriate attitudes in students so that they will learn to be competent and confident problem solvers, becoming independent questioners and lifelong learners. There are many different reasons to push the rock to the top of the hill, and each reason requires its own strategy and assessment techniques. One thing to keep in mind is that all objectives and goals must be written in measurable terms. Failure to articulate measures will result in a rock that cannot be moved, or one that is moved without being noticed. For example, conclusions cannot be drawn unless the data gathered from assessing objectives are measurable. It is difficult to measure a gut feeling.

Once the goals and objectives are established, Assessyphus turns to designing strategies to assess accomplishment of the goals and objectives. These strategies are then part of a learning model centered on student experiences. Often, while he pushes the rock, Assessyphus must define a learning model as the conditions in which students learn. These conditions include the structure, process, and content of student experiences. These experiences take place in the classroom, out of the classroom, in groups, individually, and so on. Part of the strategy for Assessyphus is knowing how fast to push the rock; he needs to pace himself. Some student experiences take time to measure; others can be analyzed in a shorter time. For example, fundamental skills knowledge can be assessed in a quiz or oral exercise, whereas developing habits of the mind might take several semesters or more. Should Assessyphus ask his students to evaluate their personal accomplishment of the goals and then give evidence of that accomplishment? Student self-assessment can provide a unique perspective on whether or not anticipated learning is taking place. However, caution should be exercised; while student perspectives can be valuable for instructor development, student feedback can at times be unreliable.

In addition to the learning model, leveraging technology has become a key ingredient in designing assessment strategies. The use of technology can be a goal to be assessed, or it can be a vehicle for Assessyphus to measure student attainment of objectives. Technology has rendered many procedural skills obsolete. Other procedural skills are so important and fundamental to learning, both in mathematics courses and in other disciplines, that students should be capable of executing them without assistance from technology, such as computer algebra systems, programmable calculators, or word processors. If, indeed, what you test (or more appropriately, assess) is what you get, then if you never assess skills without technology, students will neglect those skills. If you only use the same types of examinations with only the subject matter changing, students seek to become proficient at taking that type of examination, not at mastering the material. What are the skills that are truly fundamental? Lynn Steen (1992) shares one of the many challenges that mathematics educators face:

The key issue for mathematics education is not whether to teach fundamentals but which fundamentals to teach and how to teach them. Changes in the practice of mathematics do alter the priorities among the many topics that are important for numeracy. Change in society, in technology, in schools, among others, will have a great impact on what will be possible in school mathematics in the next century… To develop effective new mathematics curriculum, one must attempt to foresee the mathematical needs of tomorrow's students. It is the present and future practices of mathematics (at work, in science, in research) that should shape education in mathematics. To prepare effective mathematics curriculum for the future, we must look to patterns in the mathematics of today to project, as best we can, just what is and what is not truly fundamental (p. 2).

An effective assessment program comprises many different assessment tools that together provide a comprehensive picture of student learning. Each assessment provides a different perspective. All of the individual perspectives can be combined to provide Assessyphus with an accurate assessment of the program. Assessyphus must continually think about the movement of the rock. Each movement of the rock, whether forward or backward, provides feedback about the attainment of program goals.

How does Assessyphus push the rock up each hill? Does he leverage the use of portfolios, group projects, oral presentations, graded homework, standardized tests, or other methods of assessment? Are there existing benchmarks against which to compare measured data? The methods used for measuring goals will depend on the discipline: the task is to determine the techniques best suited to gauging the nature of the learning involved, and then to determine whether the learning goals and objectives have been achieved. Math-, science-, and engineering-based goals might be measured differently than humanities-based goals. Both groups might want to assess how students have become critical and independent thinkers, but there are vastly different approaches to measuring outcomes. When the rock falls back down the hill, does Assessyphus push it up in the same manner each time? The design of experiences, their implementation, and the assessment of both the process and the results are components of an inseparable system. Assessment outcomes impact curricular decisions at the course and program level. Similarly, the outcomes help reshape the learning environment, which in turn leads to new outcomes.

An assessment program is like the master plan, or strategy, for moving the rock up the hill. Along the way, feedback is required to determine where you are and whether the rock is being pushed in the right direction. So, too, must feedback be incorporated into the assessment cycle. To be successful, it may help Assessyphus to develop a rubric. An example of a rubric that employs multiple assessment techniques to assess the department's program goals is shown in Figure 2. While this matrix may seem complicated at first, it does offer a grid for planning the assessment of learning.


Figure 2.  Multiple Assessment Technique Rubric

Against each of these goals, Assessyphus must make qualitative ratings as a strength (+), mixed success (0), or weakness (-). Numerical responses can be used when possible, but one should be careful not to discount reflection and subjective assessments, especially student feedback from attitude surveys or faculty discussions. Analysis of the data is just as important as its collection. The results of data analysis allow Assessyphus to determine several things. First, are the students learning? Did we capture with the assessment tool what we had hoped? If not, what do we do differently next time? Are we giving too many quizzes? Should we incorporate more group work? As suggested before, can the students give explicit examples as evidence of their learning? Departments often use a capstone course or a senior portfolio to try to capture a direct assessment of learning. If certain questions on the final exam relate to specific objectives, did the students achieve those objectives? Perhaps incorporating a survey of graduating seniors can provide insight; asking critical questions about specific courses can complement the direct gathering of data from items in the rubric. The last part of the assessment cycle is using the feedback and the results of the data analysis to improve the system. Assessyphus knows that the cycle keeps going; he wants to use the analysis of the data to make the course or program better and more conducive to student learning. Then Assessyphus starts to push the rock again.
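A rubric grid of this kind can be tallied very simply. The sketch below is only an illustration, not the rubric of Figure 2: the goals, assessment tools, and ratings are invented, and the thresholds for labeling a goal a strength or weakness are arbitrary choices that a department would set for itself.

```python
# Hypothetical rubric grid: program goals rated against assessment tools.
# Qualitative ratings: "+" = strength, "0" = mixed success, "-" = weakness.
# All goals, tools, and ratings here are invented for illustration.
rubric = {
    "Problem solving":       {"quizzes": "+", "group project": "0", "portfolio": "+"},
    "Use of technology":     {"quizzes": "0", "group project": "+", "portfolio": "0"},
    "Written communication": {"quizzes": "-", "group project": "0", "portfolio": "+"},
}

# Map the qualitative symbols onto numbers so ratings can be averaged.
SCORES = {"+": 1, "0": 0, "-": -1}

def summarize(rubric):
    """Average the qualitative ratings for each goal across all tools."""
    summary = {}
    for goal, ratings in rubric.items():
        values = [SCORES[r] for r in ratings.values()]
        summary[goal] = sum(values) / len(values)
    return summary

for goal, score in summarize(rubric).items():
    # Arbitrary cutoffs for turning an average back into a qualitative label.
    label = "strength" if score > 0.5 else "weakness" if score < -0.5 else "mixed"
    print(f"{goal}: {score:+.2f} ({label})")
```

Such a tally is only a starting point for the faculty discussion; as noted above, the numerical summary should never displace the reflection and subjective judgment behind each rating.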

Be proactive

Assessment is a leadership responsibility. Baseball legend Yogi Berra (2002) is credited with providing the following words of wisdom, which can be applied effectively to any assessment initiative: "If you don't know where you are going, you'll probably end up someplace else." Assessyphus knows he is pushing his rock, but sometimes he feels as if he is slipping (or spinning his wheels, so to speak). He believes he is pushing the rock in the right direction, but he must be careful not to veer off the path. He sometimes feels as if the eternal burden of assessment is overwhelming. By acknowledging that the burden exists, Assessyphus can come to value it and gain ownership of it. He will keep pushing the rock, but will it be a valued effort? What if Zeus (the university president) is looking over the shoulder of Assessyphus? Be proactive! Start the assessment plan before Zeus throws a lightning bolt. By choosing his strategies and developing a rubric, Assessyphus can avoid being directed to conduct a study that carries no meaning. A proactive study will also satisfy the regional assessment body (the accreditation organization) when the visit from Mount Olympus demands evidence of an ongoing assessment program.

Prometheus stole fire from Zeus and gave it to the mortals, to provide light for their caves. The gift of divine fire unleashed a flood of inventiveness, enlightenment, and even productivity in mortal men. As we incorporate assessment into our learning models, we, too, can unleash inventiveness, enlightenment, and productivity in our courses and programs. Assessyphus can carry out his eternal burden with a sense of duty and satisfaction.

The authors wish to acknowledge the assistance and encouragement of Dr. Anita Gandolfo, Director of the Center for Teaching Excellence, United States Military Academy.


References
Berra, Y. (2002). What time is it? You mean now? New York: Simon & Schuster.

Madison, B. (2002). Assessment: The burden of a name. Available online at: www.maa.org/saum/articles/assessmenttheburdenofaname.html (accessed 31 January 2006).

Steen, L. (Ed.). (1992). Heeding the call for change: Suggestions for curricular action. Washington, DC: Mathematical Association of America.

Cite This Article as: Teachers College Record, Date Published: October 09, 2006
https://www.tcrecord.org ID Number: 12777, Date Accessed: 10/24/2021 3:42:18 PM

About the Author
  • Michael Huber
    Muhlenberg College
    MICHAEL HUBER received his Ph.D. from the Naval Postgraduate School in Applied Mathematics and is currently an Associate Professor in the Department of Mathematical Sciences at Muhlenberg College. He recently retired from the U.S. Army, where he was an Associate Professor at the United States Military Academy. His research interests are in applied mathematics, assessment, and mathematics education.
  • Michael Phillips
    United States Military Academy
    MICHAEL PHILLIPS received his Ph.D. from Clemson University in Industrial Engineering and is currently a Professor and Head in the Department of Mathematical Sciences at the United States Military Academy. His research interests are in applied statistics, modeling, and math education.
  • Alex Heidenberg
    United States Military Academy
    ALEX HEIDENBERG received his Ph.D. from Georgia State University in Research, Measurement, and Statistics and is currently an Associate Professor in the Department of Mathematical Sciences at the United States Military Academy. He is currently the Director of the Core Mathematics Program and works with curriculum design, assessment, and teaching with technology.