9/20/12

On why 30 Rock makes you feel smart

30 Rock makes you feel smart because, in order to find it funny, you must enjoy thinking through several layers of meaning (and thus, by definition, "meta-layers").

For example, when Jack Donaghy exclaims: "Good god, Devon's gay.  He's even more powerful than I thought!" we laugh because it's smart of us to laugh.  I laugh because of how an absurd situation is overcome through wit.  The absurd situation here is the discovery that someone you know very well is gay.  Of course, many would consider it natural to exclaim "Good god" after finding out such a piece of information.  For these people the exclamation is typically warranted by the sense that being gay is somehow wrong, odd, taboo, a piece of juicy gossip, etc.  It is precisely this demographic for whom the joke is not intended.  The people who find it funny are those who would never actually exclaim "Good god" upon finding out someone is gay.  This demographic smacks its forehead when someone treats coming out like it's something to be gossiped about or derided.  Those of us who find the joke funny enjoy the full line because it makes light of such a backward reaction (the "Good god!" part) by mocking it with the exact opposite of its typical motive.  "He's even more powerful than I thought" is beautifully witty because it subtly insults a certain portion of the population by portraying its perfect ideological nemesis (i.e. one who believes that being gay is a big deal because it is somehow advantageous, cool, powerful, etc.) as equally absurd for basically the same reason: it is absurd to think of one's sexual orientation as making one "better" or "worse" off.

Looking at Screens

My life is dominated by screens.  I look at my phone and laptop in bed.  I look at my phone and laptop and TV in my living room.  I go to the office and look at my phone and laptop and another small-TV-sized monitor.  There is rarely a waking moment when I do not have immediate access to a screen, usually multiple screens.

At times, this is concerning.  Philosophers like Husserl and Deleuze have noted the constitutive power of processes of passive or affective synthesis.  The idea here is that a great deal of my identity as an organism (Deleuze), as a subject (Deleuze and Husserl), and as a person (Husserl) is constituted by affective relations to my environment.  In an image: my habits, expectations, motivated chains of thought, comforts, and so on are built up over time as I snowball about my world, accumulating ever more of it as it shapes and guides me.

The worry: a fundamental change in form of life: life is now spent looking at things on screens instead of just at things.

Of course, screens themselves count as "things"--I'm not making that rigorous of a metaphysical distinction--but they sure are very different from most things.  Screens open up new worlds.  The world presents us with things, but so do screens, even though screens are among the things we are presented with in the world.

Counter to the worry: why the nostalgia for things?  What do we actually do with screens?  I spend an increasing amount of my screen time looking at or communicating with other people.

I'm not sure what screens are making me into, but I do know that I am always interested in higher resolution.  "Clear and distinct..." as Descartes says...

8/21/12

Breaking Bad and Doxastic Voluntarism

While working my way through the acclaimed television series Breaking Bad, I was struck by an episode from Season 3.

I have been a devotee of The Wire ever since I plowed through all five seasons back in 2008...and then again in 2010.  The Wire is a masterpiece.  I think it is best summarized as being about "the effect of institutions on individuals."

But what is Breaking Bad "about"?  I had enjoyed the first two seasons, but this was no Wire.  In this particular episode of season 3, it struck me: BB is about belief.  Of course, BB is about many things. But I think its overarching metaphor of "breaking bad" qua transformation or rebirth can be analyzed in terms of what we choose to believe, or rather whether we can choose to believe things in a non-self-deluding way.  To use fancy philosopher terms, BB is about doxastic voluntarism.

In this particular episode, Walt's wife Skyler finally abandons a crucial strand in her web of beliefs to acknowledge what she knew to be true all along.  Skyler has always known that Walt had two cell phones.  Walt's abduction and subsequent (epically) elaborate alibi–that he was in a "fugue state"–give Skyler two doxastic options: (a) remain skeptical of Walt's alibi in favor of her hard perceptual evidence that Walt is indeed "up to something", or (b) assent to (adopt the belief that?) Walt's alibi is honest testimony.  Skyler acts as if she believes Walt for several episodes, which is perhaps sufficient for (b).  When Skyler finally confronts Walt, her words echo Tolstoy's when Karenin lets himself go on believing that there is nothing going on between Anna and Vronsky:
"What he knew was so dreadful that now he was ready to believe anything"
Here I simply rehearse Ermanno Bencivenga's argument in his (1999) "Knowledge Versus Belief."  Belief is a more sophisticated achievement than knowledge.  Knowledge is basic, and belief is "a way of countering knowledge, of disturbing it and possibly deactivating it, not of subsuming it in a comprehensive embrace."  Belief is deontic to the core: it involves a choice, a taking of responsibility for a certain manner of representing the world and one's place within it.  Skyler had too much at stake not to accept Walt's alibi as true.  The lie was too big, too major a strand in a coherent web far preferable to that other coherent, but incomplete, web.

Just as Karenin tells himself an improbable story about the meaning of Anna's mannerisms and gestures, Skyler projects improbable scenarios regarding Walt's strange disappearances, his second cell phone, his "fugue state."  Perhaps most interesting is the social nature of Skyler's delusion.  Not only is she wrapped up in a commitment to representing reality a certain way, she is wrapped up in Walt's commitment to representing reality a certain way.  From a third-person standpoint this is not surprising; of course we are more likely to adopt commitments that mirror and support the commitments of those closest to us.  Our stability depends on it.

8/20/12

Thinking Together (cont'd)

[excerpts from an interview with Rolf-Dieter Heuer, head of CERN]
(my emphases)

The European: Should scientists be more vocal in the public sphere?
Heuer: If they have right things to say, yes. When we make important decisions, we should be able to rely on sound premises and statements. Just babbling isn’t so good. I also believe that science can really provide an example for cooperation that works. “Diversity” is such a nice word, such a nice mix of different ideas and characters. We can show that diversity enriches us.

The European: In the middle of the Eurozone crisis, you head a European institution that has worked rather seamlessly for almost 60 years. Is basic research possible on a strictly national level?
Heuer: Our research would be impossible without a collaborative element. The infrastructure necessary for basic research has gotten so big – at CERN but also e.g. in the case of electron lasers. You need many brains and hands to build it – that’s impossible on a national level. CERN was founded in 1954, when a bunch of scientists and politicians got together right after World War II and said: “we will only be successful together.” The decades since then have confirmed that approach. I think that we will even have to go further in the future and cooperate globally.

The European: The re-nationalization of Europe is the wrong way?
Heuer: I think it goes in the wrong direction. We need something that we can work on collaboratively. And we also need projects that we can pursue individually – often, those are smaller projects. Take food as an example: it’s good to introduce certain standards, but we should not give up regional cuisines. We need to strike a balance between international large-scale projects and smaller, national or bilateral projects. You won’t get very far if you only pursue lighthouse projects.

Commentary:
I emphasized 4 claims from this excerpt:

1.  Science can really provide an example for cooperation that works.
  •  This seems obviously true.  
2.   Our research would be impossible without a collaborative element.
  • This also seems obviously true, so long as "our" is indexed to a group of scientists working on a large scale project, requiring large scale equipment/infrastructure/man-hours, etc.
3&4.   (i) We need something that we can work on collaboratively.  And (ii) we also need projects that we can pursue individually – often, those are smaller projects.
  • I put three and four together because I wish to evaluate them in tandem.  (i) is a normative claim typical of the idea that the pursuit of scientific understanding of the world will naturally promote harmony and understanding.  (ii) also seems trivially true so long as "individual project" just means finding some purpose for your life through a collection of goal-oriented activities.  I currently have a watch-all-the-Breaking-Bad-episodes project, in addition to my get-a-phd-in-philosophy project.  The interesting thing to do here is to consider the four possible permutations of (i) and (ii) (see the little sketch below).  Heuer claims both are true, and this sounds like a reasonable, balanced position, Aristotelian virtue and all that–the mean between extremes.  If one claimed (i) as true and (ii) as false, one would have a radically communal view of human life.  On this view, belief is collective all the way down.  One could claim that (i) is false and (ii) is true and have a sort of radical individualism–perhaps a romantic-Nietzschean-existentialist view whereby we only find meaning in life by overcoming the herd, grasping one's ownmost finitude, and making a Kierkegaardian leap of faith.  Rejecting both (i) and (ii) seems almost like Pyrrhonian skepticism–a cautious abstention from putting too much stock in any one surefire view of human value or teleology.
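For the pedantically inclined, the four permutations can be laid out in a tiny sketch (Python only because it's handy; the labels are my rough glosses on each combination, not Heuer's):

# The four permutations of the two claims:
#   (i)  we need things we can work on collaboratively
#   (ii) we need projects we can pursue individually
# Labels are my own rough glosses, not Heuer's.
positions = {
    (True, True): "Heuer's balanced view: both collaborative and individual projects",
    (True, False): "radical communalism: belief and value are collective all the way down",
    (False, True): "radical individualism: romantic-Nietzschean-existentialist self-overcoming",
    (False, False): "something like Pyrrhonian skepticism about any view of human teleology",
}

for (claim_i, claim_ii), gloss in positions.items():
    print(f"(i)={claim_i!s:<5} (ii)={claim_ii!s:<5} -> {gloss}")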
I agree with Heuer's position, but with a stronger leaning toward the importance of the communal aspect of life.  My position would be more like: we need things that we can work on collaboratively, and we probably aren't doing enough of these.  We also need projects that we can pursue individually, and we are probably too caught up in most of these.

8/6/12

On the nature of sport: climbing in focus

When I claim that climbing is not a "pure sport," one would be greatly mistaken to think that I mean this in any dismissive sense, i.e. as saying something like "climbing is merely an activity, not a true sport."  I say this because climbing is more than a sport.  If anything, it's more like war.  I'm not trying to say that climbing is war, either.  I just mean you could die doing both.  Both necessarily involve the risk of death and/or grave injury.  No matter how cautious you are being as either a climber or a soldier, you can always just die for some reason beyond your control.  I grant that one can die playing a sport, but hey, let's face it, it doesn't happen that often.  It's a difference in kind, not degree.  Both climbing and war evoke a primal survival instinct.  Perhaps some athletes can conjure this kind of adrenaline in intense situations, but it sure is easier when your life actually is on the line.  This makes climbing into a somewhat paradoxical activity: the more you put yourself at risk, the better you will perform.  You simply must.  You'll die otherwise.  Hard to feel like that on a tennis court.  But of course the more you put yourself at risk, the higher the chance you'll die, which rather dampens the appeal of climbing harder.  People who improve as climbers–who "climb harder"–have the strange ability to be dumb enough to try riskier and riskier moves, but smart enough to know, deep down, their bodily capabilities.

Thinking Together

Why care about cognitive empathy?  It is not intrinsically truth-directed.  The state I am talking about as "occurrent grasping," elicited when thinking together, is non-factive apparent knowing.  The end of thinking together is "cognitive alignment," evidenced by agents coming to share a language, to function within the same discourse.  Cognitive alignment is not intrinsically truth-directed.  However, it is a desirable goal because it entails the conservation of cognitive resources.  Conserving cognitive resources is obviously desirable.  Were we to question ourselves at every turn, fretting over the paradigm we are operating within, questioning our methods, throwing our terms into doubt at every opportunity, always questioning the validity of the research program rather than simply pursuing it, we wouldn't get anywhere.  However, we must be mindful that the preservation of cognitive resources is not the ultimate value; truth is.  Thus, the virtue of cognitive alignment and the preservation of cognitive resources is tempered by the virtue of critique.

The more ambitious claim: cognitive alignment gets us something more than just preservation of cognitive resources.  Thinking together is valuable for reasons besides efficiency.  Still working on this...

7/19/12

Some cool/weird things I learned about the Olympics

[source: Esquire magazine, August 2012]

1901: Sumner Paine's shooting medal keeps him out of prison: Accused of trying to shoot his wife's lover, Paine was released when police learned of his medal and realized he'd missed intentionally.

1912: George S. Patton, future general of the U.S. Army, competes in the first modern pentathlon.

1936: American sprinter Helen Stephens wins gold in the women's hundred-meter, only to be accused of being male.  A subsequent examination confirms her to be female.  Her rival, silver medalist Stella Walsh of Poland, will reportedly be found to have testes when she is autopsied in 1980.  A few years after the Games, Dora Ratjen of Germany, who finished fourth in the women's high jump, will admit to being a man, start living as one, and change her name to Heinrich.

1948: At age seventeen, American Bob Mathias wins the decathlon–after only four months of training[!!!]

1948, 1952: Karoly Takacs, a world pistol-shooting champion from Hungary whose dominant right hand was shattered by a grenade in 1938, teaches himself to shoot with his left, then earns gold medals in the next two Olympics.

1960: When his team-issued shoes don't feel right, Ethiopia's Abebe Bikila runs the marathon barefoot, and wins.

1980: The U.S. leads a boycott of the Moscow games, leaving only eighty nations to compete.  The women's field hockey competition is reduced to two teams, the USSR and Zimbabwe, which cobbled together a team in less than a week.  Zimbabwe wins.

1988: Canadian sailor Lawrence Lemieux abandons his second-place position in a race to rescue an injured competitor.

1988: When live doves are released during the opening ceremonies, many are burned alive by the lighting of the Olympic cauldron.


7/16/12

On needing a replacement for paper

I do not have any specific nostalgia for printed books.  But I do not think that e-readers or iPads have come far enough to replace books...yet, at least.  Allow me to explain.  There are certain obstacles to replacing printed material with digital that are merely difficulties.  "Difficulties" include figuring out a way to get all that stuff we printed on paper for so long translated into a digital format, so we can ditch the filing cabinets and the stacks of useless documentation that we can't quite simply abandon.  However, there are certain obstacles to replacing printed material that are more than mere difficulties; they are real puzzles.  A "puzzle" is any problem associated with the transition from printed to digital material that involves a loss of functionality.

The various e-readers and iPads on the market have come a long way on these puzzles.  One particular area of success is an effective response to the typical complaint: "But I need to write on/highlight my books, so I need hard copies."  Well, as someone who often voiced that complaint and used it to justify a lot of printing, I will be the first to say that the ability to highlight pdfs and make annotations has really come along nicely, and I now find myself printing a lot less and making my annotations directly in the digital format.  Puzzle solved.

However, one puzzle remains, which I call "the puzzle of tactile understanding".  Different books look, but more importantly, feel different in one's hands.  When you pick up a thick volume that you have been steadily working through, your thumb "knows" just about where to insert itself and open the book.  For someone like myself who goes back and re-reads certain sections of text over and over again, looking for a specific spot in a book is a matter of flipping the pages until the thickness feels about right, and then leafing a few pages forward or backward to the precise spot.  On an e-reader or pdf on a laptop, the only thing analogous to this is the scroll bar.  But the scroll bar does not provide a tactile sense of book-location–it's only visual.  And when a book is several hundred, or even thousand, pages long, the visual difference on the scroll bar between pg. 456 and pg. 672 is not that noticeable.  "I know I highlighted something around here," I often think to myself as I quickly thumb the pages of my worked-over copy of Being and Time; but holding down the scroll key while a pdf breezes down my screen just doesn't seem to allow for the quick references I can extract from printed books.

Of course, even if this is a puzzle and not a mere difficulty, it doesn't mean it's important or will stem the tide of electronic books.  In fact, I'm almost certain it won't.  We will learn to scan pages visually and we will come up with smart scroll bars.  E-books already have the leg-up when it comes to precise searching (printed books don't have a search bar).  But I do think that this is indeed a puzzle and not a mere difficulty, in that the tactile understanding provided by the thickness of printed books will soon be a thing of the past.

3/1/12

The (dark) age of ideas?

Attending TED, Davos, the Aspen Ideas fest, or one of their uglier cousins is the newest status symbol.  Sure, there are other exclusive events and high-priced restaurants...but going to TED means you care about ideas.

In a recent article for New York Magazine, Benjamin Wallace hits the nail on the head when it comes to summing up TED:
"TED Talks, curated clips of the eighteen-minute lectures that are gathered on ted.com, have become today’s Cliffs Notes to sounding smart."
Even zippier:
"eighteen-minute nerd-bomb disquisitions"
And then, to put it all in context and give it the even-better-than-the-pseudo-intellectuals-at-TED-real-philosopher-intellectual stamp of approval:
"The atheist Daniel Dennett suggested that TED could “replace” religion, observing that it “already, largely wittingly I think, adopted a lot of the key design features of good religions,” including giving away content."
So what shall we make of the TED phenomenon?  It certainly counts as a phenomenon, after all.  As the article describes in great detail, TED's rise to fortune and fame has been meteoric, spawning dozens of imitators.  Wallace is on to something in his article.  TED is definitely playing a distinct cultural role.  But how will it be remembered?  I could go into a lengthy characterization, but to my delight, one of my heroes foresaw all of this.

In the opening of his greatest work, The Glass Bead Game, Hermann Hesse, speaking as the omniscient historian-narrator in a distant future, describes our current age as the "Age of Feuilleton" (feuilleton="a part of a newspaper or magazine devoted to fiction, criticism, or light literature"–I had to look it up too).  Forgive the lengthy quote, but trust me it's worth it.  Hesse is simply dead-on:
For there was also a good deal of lecturing, and we must briefly discuss this somewhat more dignified variant of the feature article. Both specialists and intellectual privateers supplied the middle-class citizens of the age (who were still deeply attached to the notion of culture, although it had long since been robbed of its former meaning) with large numbers of lectures. Such talks were not only in the nature of festival orations for special occasions; there was a frantic trade in them, and they were given in almost incomprehensible quantities. In those days the citizen of a medium-sized town or his wife could at least once a week (in big cities pretty much every night) attend lectures offering theoretical instruction on some subject or other: on works of art, poets, scholars, researchers, world tours. The members of the audience at these lectures remained purely passive, and although some relationship between audience and content, some previous knowledge, preparation, and receptivity were tacitly assumed in most cases nothing of the sort was present. There were entertaining, impassioned, or witty lectures on Goethe, say, in which he would be depicted descending from a post chaise wearing a blue frock-coat to seduce some Strassburg or Wetzlar girl; or on Arabic culture; in all of them a number of fashionable phrases were shaken up like dice in a cup and everyone was delighted if he dimly recognized one or two catchwords. People heard lectures on writers whose works they had never read and never meant to, sometimes accompanied by pictures projected on a screen. At these lectures, as in the feature articles in the newspapers, they struggled through a deluge of isolated cultural facts and fragments of knowledge robbed of all meaning. To put it briefly, they were already on the verge of that dreadful devaluation of the Word which produced, at first in secret and within the narrowest circles, that ascetically heroic counter-movement which soon afterward began to flow visibly and powerfully, and ushered in the new self-discipline and dignity of the human intellect.
So, the bad news is that our love (mine included) of TED and its kin is a symptom of our being "on the verge" of a "dreadful devaluation of the Word."  Sheesh, that stinks.  The good news?  Perhaps it's all necessary for ushering in a "new self-discipline and dignity of the human intellect."  And the dialectic rolls on...

More likely, we'll start seeing articles on how TED talks can and should replace lower division course work.  Adjuncts are starting to organize after all, and organization means less "flexibility" which greatly depreciates "leveraging" potential, and we all know what that does for efficiency and productivity...ah, but I digress.


2/27/12

Decoding the Jargon: understanding the corporate model of the university

The Atlantic recently published an article online titled "5 Ways to Make College Much More Affordable for All Americans" written by a team of consultants at McKinsey & Company.  The article addresses a well known problem, a problem of constant debate in the burgeoning field of scholarship (and rhetoric) concerning contemporary higher education.  You know the one:
"America has a deep problem when it comes to the cost of higher education."  We are all worried about "skyrocketing student debt, ever-rising tuition, strained state budgets."
OK, familiar enough territory...wait, what?
"Our research at McKinsey shows that, when it comes to cost per degree, the top quartile of institutions is 38 percent more productive than the average of their peers."
So...you looked at all the stats on colleges and universities granting degrees, made a big list, and arranged them from least to most expensive per degree.  How else would you be able to designate a sub-set of those institutions "the top quartile"?  The top quartile of what?  Oh, right, the top quarter of your list of least expensive schools.   OK, so this subset we have now is "38 percent more productive."  What is the "productivity" of a degree-granting institution again?  I suppose that would have to mean something like degrees granted per dollar spent, which is just the inverse of cost per degree.  But wait a minute, didn't we already arrange the list using the criterion of "least expensive per degree"?  So wouldn't building that criterion into the formula for calculating productivity (the desideratum, after all) automatically make the least expensive schools the most "productive" in the first place?  If the desire is productivity, and the productivity of a school is just degrees out over dollars in, then of course it becomes a race to get as many students through as possible for the least amount of money.
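To make the circularity concrete, here's a minimal sketch with numbers I invented (mine, not McKinsey's data): if "productivity" just means degrees per dollar, then sorting schools by cost per degree and sorting them by "productivity" give you the exact same list, by construction.

# A toy illustration with hypothetical schools and made-up figures (not McKinsey's data):
# "productivity" defined as degrees per dollar is just the inverse of cost per degree,
# so ranking by cheapness and ranking by "productivity" necessarily coincide.

schools = {   # name: (degrees granted per year, total annual spending in dollars)
    "State U":  (3000, 90_000_000),
    "Tech":     (2000, 100_000_000),
    "Liberal":  (1500, 120_000_000),
    "Regional": (1000, 110_000_000),
}

cost_per_degree = {name: spend / deg for name, (deg, spend) in schools.items()}
productivity = {name: deg / spend for name, (deg, spend) in schools.items()}

cheapest_first = sorted(schools, key=lambda n: cost_per_degree[n])
most_productive_first = sorted(schools, key=lambda n: productivity[n], reverse=True)

# Same list both ways: the "top quartile" is picked out by the very criterion
# it is then praised for.
print(cheapest_first == most_productive_first)   # True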

Suppose we grant all of that.

OK, McKinsey, let's hear about some of your "ideas that could help make college more affordable while maintaining or improving quality."  That sounds great to me!
"The surest way to boost college affordability is to make certain that students complete their degrees and do it quickly."  
Back up a second.  Boosting affordability means speeding up time to completion?  Well, I guess that makes sense.  If I went to college to major in engineering, college would be more affordable if I only went for 2 or 3 years and just took engineering courses or courses necessary for understanding future engineering courses.  It makes some sense when you think about it.  If some students change their majors half-way through school, they often end up staying for an extra semester or even a fifth year of college.  That definitely means more money being spent on/by that student.

So let's evaluate the emerging picture of the more productive university: students would select a major, be urged and reminded to do nothing but that major, and perhaps even be locked into that major by some sort of pricing structure that ensures that people get on a track and run through it quickly.  If an engineering student found himself having some time to take a philosophy or literature class, and formally tried to enroll in such a class, the registrar would remind him (perhaps by automated email response) that if he indeed finds himself having extra time, this entails that he use that extra time to run all the faster along his intended course.  Perhaps his attempt would even be viewed as a tacit admission, as proof that he is not performing up to the standard of the contract he signed when he "decided to become an engineer."

Well, McKinsey, perhaps you really haven't thought through what your first suggestion amounts to, but you do at least raise a good point:
"Only 60 percent of American undergraduates seeking a bachelors degree complete their studies within six years, according to the National Center for Education Statistics."  
Indeed, that is cause for concern.  I agree that we should look at ways to make sure people finish their degrees on time.  But, oh, what's that?
"On average, students who do complete their degrees are paying for more courses and credits than required for their diploma or certificate. That's a waste of resources for both students and the governments that subsidize their tuition -- a waste that makes higher-ed more expensive for everybody"
Ah, yes, so you did think your suggestion through!  My dystopian thought experiment above is not dystopian at all for you!

What the hell, I'll play along.

I will join you in condemning those awful brats who have the nerve to take classes that aren't essential for their degrees (for, even if they finish in 4 years, if they could have done so in 3 then they should have in the name of productivity).  And if anyone had the nerve to change degrees and thus take longer, shame on them for being so fickle!  Let's not forget to chastise smokers, the obese, and the soda drinkers for driving up the cost of healthcare too.

The next suggestion contains a grain of truth as well, but is much more problematic than McKinsey seems to think.  They point out that one way to increase productivity is to:
"revamp antiquated transfer agreements that wrongly deny students credit for work done elsewhere."  
Of course, this sounds nice and I would like to support this, but if you ask any experienced teacher about handling transfer credit you will most likely hear some variant of, "Well, that depends."

I am intimately familiar with this issue.  I teach at a large research university with relatively bright students who worked hard in high school.  I also teach for a for-profit regional institution that is a "university" in name only.  I am not denigrating this institution, but rather pointing out that it has an open admission policy.  They take anyone who can pay (yes, that often means with federal student loans).  The difference in writing quality between the two is, to say the least, pronounced.  Thus, were a student to transfer from "elsewhere," I would not automatically assume that the student should be granted credit for my university's, say, lower-division writing requirements.  Of course, I am very happy when I meet a student at the university who has transferred in from community college and the whole thing has worked like it should.  Ideally, the community colleges and the for-profit institutions (the "elsewhere") would enact grading criteria that ensure strict alignment with the expectations of the university (the "there").  A compromise position would be to grant transfers the credits upon passing a suitable test designed by the members of the faculty who design the requirements of the major in question.  Thus, transfer credits remain possible but, like all credits, must be earned rather than granted on testimony.

So yes, making the whole transfer-credit swamp more efficient is a good idea.  But you can't simply rewrite the rules in the registrar's office and assume that you are maintaining any sort of standard for your own university (after all, we are looking after the value of a brand here).

Another way to speed up the process is to
"give students support and tools that help them plan the most efficient path to graduation." 
Another phrase as lovely as it is vague.  What sort of "support" and "tools" are we talking about here?  Well, according to the grammar of the sentence, we give them support and tools for "planning."  We help them plan.  We guide their planning.  But we already do that.  So these must be better, productivity-minded planning tools and support.  They most likely would feature "enhanced planning techniques" that border on persuasion, prodding, and perhaps admonition.  After all, you had better "remind" your borrower of his responsibilities, of his promise.

Finally, the authors begin to mention the heart of the problem:
"Improving today's low completion rates will also require serious strides in K-12 schooling, because a student who isn't ready to do college work in the first place will have a harder time getting a degree."  
But this incredibly sobering mountain of a problem is assumed solvable, so we skip immediately to the pay-off, stated in terms befitting a boardroom discussion over increasing the speed of the machinery to expedite the production of what-the-fuck-ever:
"If all undergraduates were college ready when they entered school, for the same cost, the country would add 300,000 additional degrees above and beyond the roughly 2 million it graduates today." 
What are we even talking about again? 40% of the students who go to college don't make it through within six years.  Naturally, if they were "college ready" they would graduate (and on time).  I can tell you first hand, if all the students in my real and virtual classrooms were "college ready" (if "college ready" means being able to get a C in the course), I would indeed love that and they would surely graduate on time.  But you already stated the problem above: many of them are not college ready; hence the need for those "serious strides" in K-12 education.

This is the ultimate buck passing.

Thus far, this report amounts to the statement: "If we could figure out a way to get all students college ready, we could make college more efficient and thus more affordable."  Yes, McKinsey.  You are right.  Why didn't we think of that yet?  We just need to make everyone smart and capable enough to get through college in under four years.  I'll leave this one alone (for now).

The next suggestion is to
"try new ways of teaching."  
This is often code for "get an adjunct to teach an online course or a hybrid online-classroom course."  What McKinsey has in mind is not too far off this stereotype.  They throw a lot of jargon at us about how these "new ways" improve "learning outcomes" and "overall retention" while "deploying technology" to increase the variety of modes of "student-teacher interaction."  Sounds nice.  So, you basically mean, decrease classroom time and get more done over the internet?  You guessed it:
"For example, Rio Salado College and Western Governors University rely on self-paced online instruction for the introduction to basic course material, use flexible adjunct faculty and student mentors, and  are able to deliver instruction at least 50 percent more efficiently than peers. Traditional brick and mortar institutions can pursue a similar process to increase quality and decrease instructional costs."
 Sounds great! Do you accept credit default swaps?

If we examine this example we find the following suggestion: get students to complete automated online tutorials in order to qualify for the next level of classes, in which either an adjunct, a graduate student, or a "student mentor" (another undergraduate?) grades you based on some pre-established "scaffold" of assignments and a rubric with nice-sounding "student learning outcomes" like "development," "organization," and "reflection."  I have developed an intuitive sense of the distinction between a rubric score of "5 out of 5" on the reflectiveness scale ("Variety of elements captures reader’s attention") and a "4 out of 5" on said scale ("Variety of elements enhances interest").  Grading has become the practice of a subtle phenomenology of attitude recognition.  Is my attention currently captured? or merely enhanced?

*Sigh*  Whatever.  Fine.  Let's use the self-paced what's-it-called and the student mentors, and the 50 percent thing sounds good too.

The next suggestion is to
"recognize that learning happens outside the classroom."  
Once again, the experienced educator will not deny this outright, but will seek to understand what, exactly, this means.  McK provides a heartening rationale, which one would be hard-pressed to disagree with:
"Today there are medics and mechanics who acquired skills on the battlefield, but can't land a job back home as a paramedic or mechanic because they don't have a diploma or certificate that proves what they know. We need to develop ways for colleges to recognize the academic value of such prior work."
Of course I agree that if someone learned how to be a medic or mechanic in the army then he should not need a college degree to become a medic or mechanic.  Or, if the relevant employers of medics and mechanics require proof of a college level degree in the field, then colleges should simply grant them the required degree.  But of course now we are back to the question of accepting transfer credits.  You don't give someone a paramedic certificate without testing her.  And you don't grant someone a license as a certified mechanic without testing her in some way.  I don't really see a problem here at all to be honest.  If someone is qualified because he learned in the army, then he should be able to pass a test to prove his qualification and be on his way.  If the issue is one of affordability regarding the cost of taking some test, then the degree granting institution should work out some deal so that the veterans don't have to pay to prove their merit.

Now it's just getting tiring.  And it's getting ridiculously vague and jargon-y:
"Introducing leaner processes and shared services is one promising way to shrink the cost of management functions, student services, academic support services, and plant operations. Organizational redesign and smarter purchasing practices along the lines of what top performing private corporations now routinely do can also help."
OK fine, whatever.  If there are some more efficient ways to run the power grid at the school or lower costs on maintenance, then yes, let's go for it.  But what university isn't already looking for that stuff anyway?  I'm definitely on board with shrinking the cost of management functions.  Although the good people at McKinsey should be acutely aware of how broad and nebulous the term "management" has become.  What is a management function?  Beats me.  Shrink away.

Oh and of course let's not forget to tip our hats to the "top performing private corporations" and acknowledge the brilliance of their "smarter purchasing practices."  I suppose this means that universities should leverage their size to drive down the cost of textbooks, meal-plans, and dry-erase markers if nothing else.  That last sentence: "Organizational redesign and smarter purchasing practices along the lines of what top performing private corporations now routinely do can also help" essentially translates to "Hiring firms like McKinsey to make lists like this for you can also help."

And finally, to cap it all off, the government ought to "encourage" all degree-granting institutions by "reminding" them to adopt everything McK just told us, using financial aid as a "carrot and stick."  "Encouraging" amounts to "incentivizing" in this vernacular, which would be the carrot part.  Whereas the stick is probably just some form of straight-up financial coercion.  (Your institution will be turned over to a team of experts from the educational equivalent of the IMF.)

Wait, what's that you heard?  The educators themselves aren't really on board with any of these suggestions?  Meh, fuck 'em:
"While the education community is understandably wary about the details, state and federal governments should continue to carefully consider and pursue incentives that reward programs or institutions that embrace the changes described above."
Taken in isolation, none of these suggestions from McKinsey is downright awful or patently absurd, but taken collectively they amount to a re-imagined university that is simply not a university at all.  It is some sort of quasi-government-funded loan company that keeps overhead on a product as low as possible in the name of affordability.  This reminds me of those stories I hear about how McDonald's primarily makes money by collecting rent on the properties it owns, and that the food is not the focus of the business.  According to Wikipedia, at least, McDonald's organizes the supply of food and materials to restaurants through approved third-party logistics operators.  Wikipedia also tells me that one in eight workers in the USA has worked at McDonald's.  Man, they are really cranking out graduates!  What's their secret?  We'd better consult the experts in order to manage this problem.  We require the proper tools and support to increase efficiency and productivity...

1/28/12

On Knowing Where You Are

How do you answer the question, Where am I?  This post concerns this question in a broad sense, yet in a sense grounded in the concrete perceptual encounter with the world.  Our sense of where we are in space is a primary form of how our world is disclosed to us.  As Thomas Mann writes (and as I have cited him previously): 
Space, like time, engenders forgetfulness; but it does so by setting us bodily free from our surroundings and giving us back our primitive, unattached state. Yes, it can even, in the twinkling of an eye, make something like a vagabond of the pedant and Philistine. Time, we say, is Lethe; but change of air is a similar draught, and, if it works less thoroughly, does so more quickly.
Mann is admirable on this front in that he maintains important distinctions between space and time while showing their ultimate connection in a more general form, which we can call space-time for lack of a better term; lifeworld would probably do as well, but that term is typically invoked with historical and cultural meanings in mind.

Not surprisingly, my musings on this topic were set off by recent travels.  While visiting New York City, I found myself wondering where I was on several occasions.  And I don't mean I was pondering my place in the universe or some existential inquiry; no, I literally needed to know where I was, specifically whenever I got off the subway and emerged back on the surface streets.

One can figure out where he is in a few different ways.  Some of these ways are more "direct" in that one can instantly grasp his spatial orientation by perceiving landmarks, or, say, the position of the sun in the sky.  Simply by regarding one's visual horizon, one can immediately know which way is north, and thus which way all of the other cardinal directions point, and thus, most importantly, which way one must walk if the bar one is headed to is uptown.  This was my consistent mode of spatial orientation while living in New Mexico.  When living in a town with uniformly low buildings and distinct mountain ranges in all directions, figuring out where you are and which direction to go is as easy as looking up.

Alternatively, one can figure out where he is in a more "indirect" fashion.  This requires reading signs and making inferences.  If I come out of the subway and need to walk east two avenues and then south for three blocks, then my typical strategy (amateur city-dweller that I am) is to read off the coordinates from the nearest corner, then pick a direction and walk until I see the next set of street signs, thus enabling me to infer what direction I just walked in and work out how to correct my course.  I felt alienated doing this.  I quickly learned that I could orient myself in the "direct" manner by identifying specific buildings on the horizon (e.g., the Freedom Tower is downtown, so if I can see it, I know which way downtown is, and thus I know where I am).  But this doesn't always work in NYC, where you often can't see further than whatever random set of unremarkable buildings swallows up your horizon.

So, I like to know where I am by being able to simply look around.  But perhaps New Yorkers don't need my crude guess-and-check system to orient themselves when they emerge from the subway.  Perhaps after years of practice they have a little internalized map, like the GPS on your phone, with a red dot indicating where they are and the map adjusting accordingly.  I wouldn't be surprised if this ability were innate in all humans, and just a matter of whether it is cultivated by specific cultures.  It is now well known that Australian Aboriginal speakers of Guugu Yimithirr do not rely on "egocentric" direction words at all (left, right, forward, backward, etc.), but rather always refer to the cardinal directions.  To quote a recent NYTimes article:
Whenever we would use the egocentric system, the Guugu Yimithirr rely on cardinal directions. If they want you to move over on the car seat to make room, they’ll say “move a bit to the east.” To tell you where exactly they left something in your house, they’ll say, “I left it on the southern edge of the western table.” Or they would warn you to “look out for that big ant just north of your foot.” Even when shown a film on television, they gave descriptions of it based on the orientation of the screen. If the television was facing north, and a man on the screen was approaching, they said that he was “coming northward.” 
The article proceeds to explain how the tribe's linguistic practices ingrain a sense of direction, to the point where speakers of this language simply "feel" directions, in the way someone with perfect pitch senses each note without having to calculate intervals.  This goes beyond the "direct" mode of spatial orientation I described above; however, I would argue that it begins in that manner and becomes habituated to the point where the process is unconscious.

And this brings me back to Thomas Mann.  A change of setting is indeed one of the fastest ways to forget.  A new setting challenges you to orient yourself, to figure out where you are; whereas our home-spaces "feel like home" precisely because one knows where he is at all times.  In a new place we are set back to our "primitive, unattached state," which can be exhilarating, yet equally alienating.