Rick Wormeli and standards-based assessment and grading

Last night I attended Rick Wormeli’s lecture in Cumberland, RI. He was invited to speak and train at Cumberland High School during the day and talk with the parents in the evening. He brought his expertise in standards-based assessment and grading. When I first heard about the lecture I thought, “Ok, I will go and listen to a Common Core advocate because I need to understand the opposition.” I expected to be annoyed throughout the lecture. My expectations could not have been more wrong.

When Wormeli says “standards-based” he is not talking about any particular instructional program. What he means is that the student will be assessed and graded only on evidence of mastery of the standards. It is not based on student behaviors, neat notebooks, hygiene, or other subjective judgments. To do so is “grade falsification” and that is unethical. (He raises the unethical issue often.)

With a standard, everyone involved has a clear picture of what needs to be mastered and how the evidence of mastery contributes to the grade. How the student achieves mastery does not contribute to the grade. The path, as it were, can be different for different children. And it is the teacher’s professional role to find that path with the student. Thomas Guskey in Educational Leadership asks of teachers, “Is my purpose to select talent or develop it?” Clearly it is to develop it.

He advocates for rework. “When,” he asks, “is a professional evaluated on the average of his or her work over time?” When you pass your driving test on the second try you have passed the driving test. You have shown evidence of mastery. You are not told that based on the average of the two tests you will need to take and pass a third test so as to raise your average. What matters most is the current level of achievement and not the past levels. The same is true for students. If the student shows evidence of mastery of the standard then the student has passed. Formative tests and quizzes are to aid the student and teacher in evaluating progress towards mastery. They are not graded. They are diagnostics. It is only the summative tests that are. And these can be retaken/reworked.

He is quite formal with how he consents to rework. He wants to know when and how the student expects to spend their time to improve their understanding of the standard. This then becomes a contract between him, the student, and the parents. In Mr Wormeli’s class rework is not an opportunity to skip the preparation for the original test or assignment.

Wormeli states that the standards should be at least school-wide if not wider. Without that there is no way to fairly assess students. If Ms World’s History 101 class uses a different standard than Mr Universe’s History 101, then how can either teacher know that the student is prepared for Ms Congeniality’s History 202 class? All that the teachers know for sure is that the student was passed along. And that is all the administration knows. And that is all the parents know. The student, you can be sure, knows the truth.

Wormeli is a dynamic speaker. He has vast teaching experience at elementary and secondary schools. He seems to have countless experiences and examples to draw upon. He is funny, playful, and, when needed, skewering. He is also very well rehearsed.

He covered lots of other ground and, had time and the audience’s mental and physical stamina allowed, he and the audience would have remained in the auditorium well into the night. I do plan to follow up on his lecture by reading his writings and viewing the videos he has done.

Ongoing attacks on servers around the world

"The Norse Map is a Wargames-style visualization of ongoing attacks on servers around the world. Though it shows honeypots rather than actual private or government targets, the result is a live snapshot of trends in computer mischief."

Access protocols and availability contracts

I read reference lists and tables of contents, skimmed the referred-to Data Packages and Research Object Bundle 1.0 specifications, and quickly came away with the firm opinion that the days of defining a static data collection specification have passed. Clearly, when the data for a research experiment can be petabytes in size, a static data set is unlikely to be passed around its community of interest. What is needed is a common access protocol for data.

Now there are many hundreds of such protocols in application today. HTTP is an access protocol, and one with content negotiation too. The W3C has a boatload of specifications within the SOAP space that are relevant. The semantic web is another deep source. Even the now long-in-the-tooth Object Management Group's CORBA specifications cover the same ground.

The point, of course, is that we really don't need another greenfield specification, and especially not one for static data dumps. Instead we need to agree on access protocols and availability contracts. Unfortunately, I have nothing more to add right now as it is not really my problem.

Remove formatting in Skype messages

Our development team uses Skype for messaging, voice conferencing, and screen sharing. It has proven itself a reliable tool year in and year out. Unfortunately, for a software development team it is sometimes unhelpful, especially when it mistakenly interprets data or code as emoji or formatting instructions. I assume that Skype developers use Skype for their communication too and so suffer the same frustrations with message interpretation as our team does.

Skype's Preferences dialog does allow disabling emoji but not formatting (at least on the Mac). To remove Skype formatting in messages, where, for example, tildes mean strike-through and asterisks mean bold, do the following:

  1. Quit Skype, 
  2. Edit the file $HOME/Library/Application Support/Skype/shared.xml, 
  3. Replace <EnableWiki>1</EnableWiki> with <EnableWiki>0</EnableWiki>,
  4. Save the file, and
  5. Restart Skype.
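Steps 2 through 4 can also be done from the command line. A sketch, assuming the same file path as above (still quit Skype first and restart it after):

```shell
# Flip EnableWiki from 1 to 0 in Skype's shared.xml (Mac path).
# sed's in-place flag differs between BSD and GNU sed, so write to a
# temporary file and move it into place instead.
f="$HOME/Library/Application Support/Skype/shared.xml"
sed 's|<EnableWiki>1</EnableWiki>|<EnableWiki>0</EnableWiki>|' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
```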

Update: Blogger's editor seems to have similar issues!

True is true and false is false.

If you send me code examples containing either of the following you are not going to be hired, ever.
test ? true : false
if ( test ) { return true; } return false;
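The fix, of course, is to return the boolean expression itself. A minimal sketch (the class and method names are mine):

```java
public class BooleanReturn {
    // The comparison already evaluates to true or false; return it
    // directly instead of routing it through a ternary or an if/else.
    public static boolean isPositive(int n) {
        return n > 0;
    }
}
```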

Letter to SC about Blended Learning

My letter to the South Kingstown School Committee about my grave concerns with the administration's objective for Blended Learning in our schools. It is mostly a reworking of an earlier blog posting.

"programs you depend on are written by dicks and idiots"

Programming Sucks is a good programming rant. The takeaway is "programs you depend on are written by dicks and idiots." Say it loud, brother.

Blended Learning

Yesterday I had the opportunity to talk with the High School’s new Principal Robert Mezzanotte and the STEM Coordinator Simone Palmer. The purpose of the meeting was to provide fuller answers to my 10 questions to the Superintendent of a few weeks ago. The meeting went as one would expect. And, as in many meetings that include people, it was the side comments made, and the comments missing, that were significant.

The Principal said that the successful introduction of “Blended Learning“ was his first priority. Until a week or so ago I had not heard of Blended Learning. Or if I had, I had assumed the term was the new educational term for computers in the classroom. It is not. Blended Learning is an approach to achieving an end goal of personalized student education through the use of technology for instruction. Let me try to explain.

High Schools use the same method of instruction today that I had when I attended. A teacher presents the same instruction to all students sitting in formation facing her. In the Blended Learning community this is called Traditional Instruction learning.

Before I move on, keep this diagram handy.

From Traditional Instruction the instruction and learning will move to Macro Differentiated learning. Here the teacher teaches to groups of students in the classroom. Each group is composed of students at the same “content level,” that is, a group of kids that “get it,” a group of kids that “don’t get it,” and groups of kids that “sorta get it.” The teacher makes the decision as to the groupings. The teacher will rotate between the groups over the duration of the class period. The teacher continues to drive the instruction (short lectures to small groups) and the instructional materials are not expected or, rather, not required to be online. (The online part is an efficiency that becomes important later, so keep watch.)

From Macro Differentiation the instruction and learning will move to Micro Differentiated learning. Here the teacher works with more and smaller groups of students. Some groups contain just one student — the smartest and the dumbest. When a teacher is not attending to a group, that group will be busy learning from instruction delivered online and assessed continuously online. Progress is a calculation without allowance for character or circumstances.

From Micro Differentiation the instruction and learning will move to Individual Mastery learning. Here all students have individualized online instruction, tutoring, and assessment. The teachers are available for coaching as are their peers — that is, student to student coaching. Each lesson is a step in a chain without variance. Only the speed at which the student moves along it differs. To be fair, each link in the chain may contain different content richness for the kids that “get it” and the ones that “don’t get it.”

From Individual Mastery we come to Blended Learning’s final destination of Fully Personalized learning. Here the student is master and commander of their own education, directed by their own interests. A classroom is now a selection from an online catalog of available syllabi. Assessment is automatic and continuous. Peer to peer coaching and evaluation is routine. Learning has become “teacher-proof”. If you still have the diagram open, the information visual for Fully Personalized has a teacher sitting outside the student's learning. There is no direct connection between the two.

I am quite unsure where the teachers are in the Fully Personalized form of learning. The obvious place is an educational utopia where instruction is undifferentiated, produced for an ideal student, and hermetically delivered without an atom harmed. I might be overthinking this.
I don’t want Blended Learning. How about you?

"Specializing in what I have not done before."

Paul Kahn, one of the two founders of Dynamic Diagrams, once described his design agency as "specializing in what they have not done before." As I work to understand how secondary education is using technology in the classroom in the light of Federal, State, and local obligations and expectations I am constantly reminded of Paul's maxim and of the need to avoid being little more than a dilettante.

A visual hack for organizing your notes

I am always advocating for keeping and carrying only one notebook for all projects. Maintaining a single notebook does have its own problems and so I watch for interesting solutions to them. One problem is how to quickly find project-specific notes. This solution is a simple means to visually organize the notebook's content.

From Japan, a Brilliant Notebook Hack for Organizing Your Notes: A simple trick to easily index and find the information you need

I am reminded that it is time to revisit my Keeping Your Academic Shit Together document.

Small things matter

A method we have in our common library is a check that a string is null, empty, or contains only whitespace. I suspect everyone has this method somewhere in their code. Ours was written a decade ago and we have not had need to replace it -- it works and has no obvious downside to normal operations.

However, every now and then I am reminded of something I learned at Tazz Networks when one of the teams was reviewing performance bottlenecks. The code that lowercased Unicode strings was contributing well over 1% of the total running time. The only reason for this high overhead was that at no time did anyone -- senior or junior -- state what the standard letter case was for hash table keys. And so, for example, the key butter was found in the code as "Butter", "butter", "BUTTER", and, my favorite, k.upper() only to be lowered soon after! None of these uses came from outside of the system (that is, user data) and so the development organization had complete control over their representation.

So what about the empty string check? The code is

return s == null || s.trim().length() == 0;

The use of trim always bothered me because it allocates a new string containing the original string without the leading and trailing whitespace. I would have been less bothered, perhaps, if we used this check infrequently, but some of the core code needs to check for empty because the development organization never set the standard for an empty string. If all empty strings could be guaranteed to be of zero length or, better yet, a sentinel value, then the check would be

return s == EMPTY_STRING

which is going to be very fast.

So, for now, the empty check has to look at the content of the string. I replaced the original check code with

if (s != null) {
    for (int l = s.length(), i = 0; i < l; i++) {
        if (!Character.isWhitespace(s.charAt(i))) {
            return false;
        }
    }
}
return true;

In the case where the candidate string is the empty string or a string without leading whitespace this implementation is about 18% faster (100M iterations). When the candidate string contains just one leading whitespace it is 481% faster. The candidate string needs 34 leading whitespace characters before the two methods have similar run times.
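A rough sketch of how such a comparison can be timed (class and method names are mine; the absolute numbers will vary with machine and JIT behavior, so treat the output as illustrative only):

```java
public class EmptyCheckBench {
    // The original trim-based check: allocates a new string per call.
    static boolean trimEmpty(String s) {
        return s == null || s.trim().length() == 0;
    }

    // The replacement: scans characters, no allocation.
    static boolean scanEmpty(String s) {
        if (s != null) {
            for (int l = s.length(), i = 0; i < l; i++) {
                if (!Character.isWhitespace(s.charAt(i))) {
                    return false;
                }
            }
        }
        return true;
    }

    public static void main(String[] args) {
        String candidate = " butter"; // one leading whitespace character
        int n = 100_000_000;
        boolean sink = false; // keep the JIT from discarding the calls
        long t0 = System.nanoTime();
        for (int i = 0; i < n; i++) sink ^= trimEmpty(candidate);
        long t1 = System.nanoTime();
        for (int i = 0; i < n; i++) sink ^= scanEmpty(candidate);
        long t2 = System.nanoTime();
        System.out.printf("trim: %d ms, scan: %d ms (sink %b)%n",
            (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, sink);
    }
}
```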

I am going to keep the new code. Small things matter. Especially, when frequently used.

Update revision date with Google App Script

I was bitching about what Google Documents is missing for technical documents. Which, in all honesty, is not much, especially in light of the revision and collaboration tools it does have -- and had on its first day.

Since I like to have revision dates and numbers in my documents, how hard could it be to script their automatic update? Not hard at all. The following script will add an "Update Revision" menu item to a "Scripts" menu and, when invoked, it will replace all occurrences of "Revision YYYY-MM-DD HH:MM" in headers, footers, and the body with the current timestamp.

function onOpen() {
  DocumentApp.getUi()
      .createMenu('Scripts')
      .addItem('Update Revision', 'updateRevisionTimestamp')
      .addToUi();
}

function updateRevisionTimestamp() {
  var doc = DocumentApp.getActiveDocument();
  var date = Utilities.formatDate(new Date(), Session.getScriptTimeZone(), "yyyy-MM-dd HH:mm");
  var revisionPattern = "Revision [0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}";
  var revisionReplacement = "Revision " + date;
  var body = doc.getBody();
  if ( body ) {
    body.replaceText(revisionPattern, revisionReplacement);
  }
  var footer = doc.getFooter();
  if ( footer ) {
    footer.replaceText(revisionPattern, revisionReplacement);
  }
  var header = doc.getHeader();
  if ( header ) {
    header.replaceText(revisionPattern, revisionReplacement);
  }
  return doc;
}

Jolokia is remote JMX with JSON over HTTP.

We use JMX a lot for low-level, dynamic management of our systems. Even the most basic MBean implementation -- one that uses the "MBean" interface suffix and one concrete class -- is far better than nothing. With our commitment to using JMX we have, over the years, developed a set of Java annotations and machinery to publish POJOs as MBeans.

We continue to use JConsole, over Java VisualVM, because of its utter simplicity. The tool does have some issues, but we know them well and they do not detract from the need to manage something quickly. The one area where this tool, and the other tools that I am aware of, falls down is in making the same change on several deployments.

For this we fall back to using the bash shell and Jolokia. Jolokia is remote JMX with JSON over HTTP. It is a servlet that is part of our standard Tomcat deployment. We have been using it for many years and it has not failed us yet. So, if I need to change some state on all our machines I run

    curl "http://$h/jmx/write/object-name/attribute-name/new-value" &

where $h is each host in turn and object-name, etc., are replaced as expected. (Curl is normally called with the --silent and --fail command line options.)
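Spelled out a little, the loop over hosts looks like this. A sketch: the host list and the object/attribute names are placeholders, and echo stands in for the real curl call so the loop can be read as a dry run:

```shell
# Bulk write across all hosts. This is a dry run: echo prints each URL.
# Replace echo with  curl --silent --fail "$url" &  to actually do it.
hosts="app01 app02 app03"
for h in $hosts; do
    url="http://$h/jmx/write/object-name/attribute-name/new-value"
    echo "$url"
done
```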

Mistake-Proofing & software poka-yokes

Chase and Stewart's Mistake-Proofing is a useful read about the theory and practice of avoiding mistakes in manufacturing and service. It is a short book and so just skims the surface of the field, but it gave me enough information to better understand some problems my software development organization has with its "manufacturing."

The problem we have is that we don't so much have a data pipeline with state change at the joints, i.e. fixed points in the flow, but more a leaky data pipe where the leaks are the state changes. (We inherited much of this code from a systems integrator hired before my time.) Changing the whole flow is not possible right now, so how can we mistake-proof the leaks?

Mistake-Proofing advocates poka-yokes as a means of avoiding mistakes. The term is from Japan, where all the quality-focused manufacturing ideas have come from since Deming arrived after WWII! It means to avoid (yokeru) mistakes (poka). We see them all the time all around us. Some are actual devices -- the car engine won't start unless the clutch is depressed -- and some are conventions -- hot water is on the left. In software, a common poka-yoke is method argument checking. For example,

public void m( String v ) {
  if ( v == null ) throw new IllegalArgumentException("v must not be null");
  // ...
}
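Since Java 7 the JDK ships this particular poka-yoke as java.util.Objects.requireNonNull. A sketch (the class is mine; note that requireNonNull throws NullPointerException rather than IllegalArgumentException):

```java
import java.util.Objects;

public class Guarded {
    private final String v;

    // requireNonNull both checks and returns its argument, which makes
    // it convenient in constructors and field assignments.
    public Guarded(String v) {
        this.v = Objects.requireNonNull(v, "v must not be null");
    }

    public String value() {
        return v;
    }
}
```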

Other, more complicated poka-yokes are based on the satisfaction of accumulated conditions. For example,

if ( ! ( x && y && z ) ) throw new IllegalStateException("...");

So, Mistake-Proofing has given me a renewed inducement to improve our system and a nomenclature to use.

Note: I find negating a single conjunction of conditions, ( ! ( x && y && z ) ), more readable than a disjunction of negated conditions, ( ! x || ! y || ! z ).  Code readability is itself mistake-proofing.

Developing post of my education about schools

This is a developing post of my education about schools in the USA. It will be idiosyncratic. I tend to pick up anything and read it. In time it will become an annotated bibliography to a thesis that will not be needed.
Tough P. How Children Succeed, Grit, Curiosity, and the Hidden Power of Character. Mariner Books; 2013.
Good book with supporting evidence for putting as much emphasis on building character as inculcating academics in secondary education. Last chapters feel like filler. Worth reading.
Berger R. An Ethic of Excellence, Building a Culture of Craftsmanship with Students. Heinemann Educational Books; 2003.
Good book that articulates how a child needs to see and reach for excellence and from that milestone build a successful school life. Worth reading.
Christensen C. Is K–12 blended learning disruptive? An introduction to the theory of hybrids; 2013
A white paper that brings Christensen's Innovator's Dilemma to education. Unsettling vocabulary of business in the context of students & teachers. I have much to disagree with here. Overall, there is no evidence given of how hybrid (or fully disruptive) learning models are successful (vis a vis educating the student and strengthening the teacher) nor how they are cost effective. Much hand waving. Worth reading, if only for ammunition.
Laptop multitasking hinders classroom learning for both users and nearby peers
Useful data point that learning is hindered more for the observer of the laptop than for its user: 17% more.
If I am going to be successful in lobbying for a high school education deserving of our children I will need to learn a whole new kind of politeness and patience, and expect progress and not perfection. Perhaps this was not the year to give up booze.


MacDown

Since I often write blog postings with embedded code I get frustrated with Blogger's editors. Neither the "Compose" nor the "HTML" variants work right when the code contains anything that looks like an HTML element or entity. My last posting led me to look for a Markdown editor and I discovered MacDown. I have used it once, to date, and it worked flawlessly!

Traversing, aka picking for scraping part 2

Extending the idea in the Picking for scraping posting is the traverse. A traverse safely walks a (typically) hierarchical data structure where any of the intermediary traversal steps may encounter a missing node. For example, in a hypothetical application most data instances have a value at the end of a traversal from A to B to C to D; however, this data instance is missing the B node. Traversing from A to B is safe, but traversing from B to C will result in a null pointer exception (NPE).

The common way to avoid NPEs while traversing is to check each node for null before traversing it. And so you see code that looks like a ladder, e.g.

value = data
if ( value != null ) value = value.A
if ( value != null ) value = value.B
if ( value != null ) value = value.C
if ( value != null ) value = value.D

While this works it is very cumbersome and prone to copy-paste errors due to small variations found on some ladder rungs.

Some languages offer syntax support for traversing. This is usually called a safe navigation (or null-conditional) operator, for example ?. in Groovy and Kotlin. Java, the language I mostly work in, does not have this operator and so a different solution is needed.

The solution I use allows for safe traversal using data format specific classes. That is, XML has its own traverser, Json has its own traverser, etc. A fully general mechanism might be possible, but the technique is simple and so really does not warrant one. Here is an example of safely traversing some Json data

value = new Traverse(data).at("A").at("B").at("C").at("D").value()

Or, let's have B be an array and we want to traverse the first element

value = new Traverse(data).at("A").at("B").at(0).at("C").at("D").value()

The dictionary at() method is

public Traverse at( String key ) {
    if ( value != null ) {
        if ( value instanceof Map ) {
            value = ((Map)value).get(key);
            // value might be null here, indicating an unsuccessful,
            // but safe from NPE, traversal
        }
        else {
            // an unsuccessful traversal always sets value to null
            value = null;
        }
    }
    return this;
}

While the array at() method is

public Traverse at( int index ) {
    if ( value != null ) {
        if ( value instanceof List && index >= 0 && index < ((List)value).size() ) {
            value = ((List)value).get(index);
            // value might be null here, indicating an unsuccessful,
            // but safe from NPE, traversal
        }
        else {
            // an unsuccessful traversal, including an out-of-range
            // index, always sets value to null
            value = null;
        }
    }
    return this;
}

As with the Builder pattern, the Traverse pattern always returns the traverse instance.

A common aspect is the check for type and this can be refactored out.

public Traverse at( int index ) {
    List l = cast(List.class);
    if ( l != null ) {
        // an out-of-range index is an unsuccessful, but safe, traversal
        value = ( index >= 0 && index < l.size() ) ? l.get(index) : null;
    }
    return this;
}

protected <T> T cast( Class<T> expectedClass ) {
    if ( value != null && ! expectedClass.isAssignableFrom(value.getClass()) ) {
        value = null;
    }
    return (T) value;
}

Expanded, working versions of XML traversal and Json traversal are at [TODO].

Oh, how do you get the traversed-to value?

public <T> T value() {
    return (T) value;
}
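Putting the fragments together, here is a minimal self-contained sketch of the whole class (assembled by me from the pieces above, keeping their raw Map/List types), with a main that demonstrates the missing-B case being safe:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal Traverse sketch assembled from the fragments above.
public class Traverse {
    private Object value;

    public Traverse(Object data) {
        this.value = data;
    }

    public Traverse at(String key) {
        Map m = cast(Map.class);
        if (m != null) {
            value = m.get(key); // null here is an unsuccessful, but safe, traversal
        }
        return this;
    }

    public Traverse at(int index) {
        List l = cast(List.class);
        if (l != null) {
            // an out-of-range index is also an unsuccessful, but safe, traversal
            value = (index >= 0 && index < l.size()) ? l.get(index) : null;
        }
        return this;
    }

    @SuppressWarnings("unchecked")
    public <T> T value() {
        return (T) value;
    }

    @SuppressWarnings("unchecked")
    private <T> T cast(Class<T> expectedClass) {
        if (value != null && !expectedClass.isAssignableFrom(value.getClass())) {
            value = null;
        }
        return (T) value;
    }

    public static void main(String[] args) {
        Map<String, Object> a = new HashMap<>();
        a.put("C", Collections.singletonMap("D", 42));
        Map<String, Object> data = Collections.singletonMap("A", a);

        // B is missing: the traversal is safe and yields null
        System.out.println((Object) new Traverse(data).at("A").at("B").at("C").value()); // prints null
        // An existing path yields its value
        System.out.println((Object) new Traverse(data).at("A").at("C").at("D").value()); // prints 42
    }
}
```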


My boss was a member and, later, chairman of the Lenox, MA school committee for over 10 years. I am glad for his explanations of the alien world known locally as the South Kingstown School District.

Re: 10 questions for the Superintendent and School Committee Chairwoman

I received a response to my letter with some questions about the 1:1 initiative. Note that the formatting is incorrect in some areas: I have asked for a clearer document.

I still need time to consider it. On the surface it says the right things and says them politely. (Very much Pauline Lisi's voice.) I am told that for more detail I need to meet with two other administrators. It has not been my experience that a meeting is more conducive to detail than a prepared document. The response is a step forward, but not the stride I was hoping for.