Tuesday, December 10, 2013

How not to use peer assessment in a MOOC

Earlier this week, after much thought, I unenrolled from what until that point had been an excellent course on Global Public Health.

I'd viewed all the lectures, taken all the quizzes, done all the readings, and all that was left was the final assignment.

Students were given a reasonable amount of time in which to write a ~1500 word paper on one of three topics related to the course. One topic was somewhat rigidly defined, one was wide open, and one was in between.

I'd settled on the second option - write an essay on an infectious disease, focusing on the sorts of issues brought up by the course: international public goods, global health initiatives and so on. Fair enough. I was already familiar with leprosy from research I'd done years ago, and was ready to write a report on that illness.

It was on deciding on my topic that I realized I'd be drawing mostly on my existing knowledge and Google searches, not what I'd learned in the course, for my essay. That prompted me to take a closer look at the requirements.

Here are the ones that stood out as troubling or inappropriate:

a) The essay was worth 80% of the course mark. Placing such a heavy weight on the essay cheapened the rest of the course material. It is entirely possible to ignore the course, pick an infectious disease one is familiar with (or which has been extensively written about), pander to the specific points on the posted rubric and walk away with a statement of accomplishment. Conversely, it is also possible to pay careful attention to the lectures and readings, submit all the quizzes on time, and fail the course due to insufficient attention to the rubric, troubles with the English language, and so on.

b) The essay would be peer-assessed. This in itself is not a problem. I have been privileged to take other Coursera courses where peer-assessed essays are an invaluable and worthwhile component of the course. This was not one of them. (I'll post my thoughts on what I believe is needed for peer assessment to work in a later blog entry.)

c) This was the only essay required in the course. Students have no chance to learn how to grade, and there is no guarantee that a random subscriber to a free online course will have the knowledge, experience or language skills necessary to gauge the worth of a university-level academic essay on a nuanced, complex scientific and political subject.

d) The assessment rubric required students to establish that the essay writer had encyclopedic knowledge of a subject (failing this, a minimum mark should be given for the relevant rubric entry). This is a fundamental, fatal flaw in the grading rubric, especially since the course was billed as an introduction to the subject. This asks non-experts to know what it is that they do not know. How can a beginner in a subject, acquainted with only a modicum of information, judge whether a peer has not only more knowledge, but all-encompassing encyclopedic (the word used in the rubric) knowledge?

For all of the above reasons, a statement of accomplishment for this course provides at best a very mixed signal of whether a student was engaged with or understood the material. Though confident I could in short order write an essay that would earn acceptably high marks, I did not find the task intellectually honest. There was too much of a disconnect between learning from the course and scoring well on the final assignment, and so I chose, with some regret, to un-enroll at the eleventh hour.

Peer assessment CAN work, and work brilliantly - but this is not the way to do it.

Tuesday, December 3, 2013

The other side of the coin: a beautiful video and thoughtful design post about Bitcoin

Some weeks ago I wrote a long post detailing some of my current thoughts about Bitcoin.

This morning, the excellent design blog Fast Company posted a fascinating article on a beautiful video about Bitcoin.

I'm still uncomfortable about equating scarce tradable goods with currency or even currency candidates, but the post provides a lot of food for thought.

If nothing else, the video is worth a view or ten simply as a work of art. It's a breathtakingly gorgeous animation.

Thursday, November 28, 2013

The Eyres of London: an enthralling look at crime in the 13th century

I'd like to share with you 13th century court records with all the sizzle of a modern scandal-sheet.

During the 13th century, London's population was roughly 50,000. This was quite a respectable city size for the time, but did not allow for the elaborate machinery of justice we take for granted today. Instead, complaints were filed with the local sheriff or bailiff and prosecution was put on hold until the next time the circuit judge came around. These traveling circuit courts were named 'Eyres' (yes, just like Jane Eyre - her last name is no coincidence).

Most records of 13th-century Eyre trials and judgments are presumed lost, but two have survived in a remarkably complete state. These are the Eyres of London for 1244 and 1276. They were special in that the king himself (or his direct representative) sat in on the judgments. Being therefore the most perfect or official of trial records, multiple copies of the Eyre transcripts were made and distributed as exemplars to judges and other agents of law enforcement. They may be thought of as a bundle of sample cases and precedents with a royal seal of approval.

Copying was not the most reliable of arts at the time, and different surviving copies of the Eyre transcripts vary in details. Fortunately, in the 1970s a British scholar, Martin Weinbaum, took it upon himself to examine all the variations and from them (hopefully) reconstruct the original. His editions of the Eyres were published by the London Record Society, and a hypertext version has recently been made available free of charge at the British History Online web site.

Not only did this scholar scrupulously edit the texts, but he also provided modern English translations alongside the original Latin. It's been over a decade since I last used my Latin lessons, but his translation seems to me to be accurate and conservative.

The London Eyre of 1244
The London Eyre of 1276

I must admit I found these quite by chance. As a student of economic history, my first thought when being asked to find a primary source that could inspire a piece of fiction was to go for probate records. These inventories of items left at a person's death are used frequently by economic historians, and are often heart-breaking or stirring. Entries such as, 'Wooden chest, worm-eaten, containing sheets of fine French cloth, never used. One reserved as burial shroud.' are not uncommon. (Rather reminiscent of the famous six-word novel, 'For sale: baby shoes, never worn.', isn't it?)

Finding well-catalogued, free probate records in English - or in a pinch, grammatically correct Latin - proved easier said than done. A hyperlink chase across a dozen web sites eventually led me to British History Online, where a mis-click took me to a transcript of one of the Eyre sessions.

I didn't leave the computer for a half hour after that. I was hooked.

London, as I mentioned above, had a rather small population. Because of that, personal cases that in another age would never make it to the ears of the head of state were presented to the king. Legal language had not yet calcified into its modern passive voice and restrained, careful neutrality. The court scribe recorded events quite conversationally, and as a result, the stories almost write themselves. All that's needed is putting meat and skin on the skeleton provided, and furnishing entertaining and plausible answers to the questions raised.

Consider the following entry, from one of the first sessions of the 1244 Eyre:

They say that on the feast of St. Ethelburga [11 Oct. 1226] Emma, daughter of Walter of Coggeshall appealed Gregory, son of master Gregory the Physician, of violently raping and deflowering her, and Richard, son of Thomas the Imagemaker of aiding and abetting him. Gregory and Richard come, but Emma does not, and she found pledges to prosecute her appeal, viz. Richard the Baker and John of Kennington, baker. Therefore they are in mercy and Emma is to be taken into custody. Afterwards the mayor and citizens were asked whether they were of opinion that peace had been made between the parties, and they said upon their oath and in the faith in which they are bound to the king that they had agreed together. Asked further if they believe that Gregory is guilty of the deed, they say that he is not guilty. They say also that he who was appealed for aiding and abetting has not made peace and is not guilty. Therefore he is quit. And Gregory is to be taken into custody. He made fine in half a mark, because he is poor, with Simon fitz Mary and John de Coudres as his sureties.

First of all, note the date: the event took place in 1226, but was not heard until the Eyre of 1244. This is what I referred to above: cases were kept in cold storage until a judge errant arrived. It was common for the parties involved to die before the case made it to court. This led to entries such as this one, from the first case in the above link: "Stephen has died, therefore nothing from him."

The accused was the son of a physician. This raises interesting questions - did Gregory's father know of his son's activities? Did the accused use his father's position to gain access to Emma? Even if he did, apparently he still needed the help of Richard, son of an image-maker. I haven't looked it up, but I suspect this refers to religious images, such as figures of saints in a cathedral. Could Richard have used his father's professional contacts to lead Gregory to Emma when she was alone in prayer within an isolated chapel?

The accused and his accomplice made it to court. Emma, though she was still alive, did not. What was it that made her stay away from the proceedings? Did she fear for her life? She did appoint two bakers as her representatives. At the time, most homes took their dough to the neighbourhood baker, to be baked in a communal oven (for a fee). Suppose Emma kept mostly to herself after the assault. There is no husband mentioned, or children, or change of name - spinsterhood seems probable. If that were the case, she'd seldom leave home and meet other people... save for daily errands, such as fetching water from the pump and baking bread. While waiting for the bread she would have had time to socialize with and get to know the bakers in a safe setting, where she was surrounded by other women. That would explain why these were the men she trusted to represent her at court. Their profession is doubly interesting when we consider that bakers at the time were thought to be proverbially dishonest, adulterating flour with any sand or sawdust at hand. The modern equivalent would be to send two used car salesmen to vouch for her.

The court didn't think the bakers were an appropriate substitute. Gregory and Richard were granted mercy (a kind of conditional bail), and officers went off to arrest Emma - and presumably drag her back to court by any means necessary.

We don't hear any details of testimony by the principals in the case, if indeed there was any. The court was satisfied with conducting something halfway between an opinion poll and a jury trial. The answer by the makeshift jury, consisting of the mayor of London and vague 'citizens', seems a bit contradictory. Gregory had 'made peace' with the court, which essentially meant he agreed to stop hostilities. He was found not guilty, but nonetheless taken into custody and charged half a mark, which he could not afford. Two others, presumably his friends, had to promise to pay it on his behalf. This odd juxtaposition of a 'not guilty' verdict combined with custody and a fine suggests that there was a plea bargain: I suspect Gregory agreed to 'make peace' and pay his fine as long as he came out of it without being officially labeled a rapist. While I'd have to research the issue, it's reasonable to think that there were social and religious costs to such a label quite apart from the law's punishment. Meanwhile, Richard the image-maker's son wanted nothing to do with the case. He refused to make his peace, but he wasn't found guilty, either. I can almost see him storming angrily out of the square, pushing people aside while muttering curses under his breath.

This is just one of many, many colorful stories in these wonderful source-books. In the next few weeks, once I've had a chance to read them more carefully, it is my intention to write up a few of them as short historical fiction. Care to join me?

Monday, November 25, 2013

An experiment in personalized learning

(Note to those reading this as part of their Blended Learning assignment: the introduction is meant to provide context for a general audience and may safely be skipped.)


Many of us - certainly those reading this blog post - live in a world of information overload. Facts are no longer as dear as they once were. An internet connection provides instant, searchable access to a great part of human knowledge.

The ability to search the internet selectively quickly transforms into the need to do so. A Google search for 'sloths' returns 898,000 hits. That is far too much information to take in, and a selection criterion is needed - a rule whereby some relevant sites will be accessed, and others discarded. Perhaps we're only interested in videos of baby sloths, or academic articles on their digestive systems.

Selection implies context, and context implies personalization. In one context, sloths may be seen in terms of their position in the hierarchy of cuteness. In another, they may be viewed as nothing but host environments for intestinal bacteria.

This context will have been chosen to enrich a particular world-view - an illustrator of children's books in our former example, perhaps, or a microbiologist in the latter. This is personalization.

The particular circumstances of a seeker of knowledge inform the context in which they place facts about the world around them, and this context shapes the selection criteria used to access this information in a way that makes sense to them.

The role of the teacher is changing. Not overnight, and not completely, but it is. For many years the archetypal teacher has been the 'sage on the stage' (as phrased by a course I reference below). An experienced adult with appropriate qualifications hands down information from the lecturer's podium to a student audience. Lectures proceed with the calculated regularity of a locomotive. Students must follow along at the prescribed pace or be left behind. There are extra readings, assignments, quizzes and exams to extend learning beyond the lecture hall, but these are almost always of a one-size-fits-all nature. Even the best-intentioned teachers and lecturers seldom have the time to provide their students with customized exercises tailored to their characteristics.

There is some value to this paradigm. When information was costly and difficult to obtain, it made sense for a specialist to collect it on behalf of the students, then feed it to them in a partially digested form. This is similar to how parent birds feed worms to their hatchlings, or how a bee collects pollen and returns it to the hive.

As information becomes abundant and easy to access, the importance of this role diminishes. Anyone with access to a web browser can retrieve information - too much of it, in fact. The expert's role then shifts from the 'sage on the stage' to the 'guide on the side'. The value added from a real live (or pre-recorded) instructor is in giving a guided tour of the neighbourhoods everyone is free to walk through, imbuing them with structure, sense and meaning.

The same technology driving this shift - web search, social networking, interactive applications and so on - also makes it possible for learning to be customized to a student's characteristics and potential to a remarkable degree.

In what follows, I will take you on a tour of a small first step into this new 'blended learning'. The experiment was a small one, done cautiously, which makes it admirably suited as an introduction to the possibilities unlocked by this new way of thinking about learning.


A preliminary disclosure: I've taken a number of free online courses through Coursera, both for professional development and out of curiosity. One of the most rewarding has been Blended Learning: Personalizing Education for Students.

As their final assignment, teachers taking the course were asked to implement blended learning in their classrooms, then film a short video about their experience. Those without a classroom or video recorder (such as yours truly, for now) were asked to write a blog post about one of the student videos, reporting on the experiment as thoroughly as if it had been their own.

After viewing a number of student submissions, I decided to share with you Grace Dorrington's video. Her narration is exceptionally clear, and the changes made in her school were basic enough to serve as a good introduction to blended learning for someone previously unacquainted with the subject.

The main event

Before we begin, you may want to watch the video I'll be writing about. It's short, insightful and the narrator has a lovely accent.

All done? Great. Let's begin.

I'll follow the Economist style guide in calling people what they wish to be called, and refer to the video's author by her YouTube handle, graced05.

First things first

The school in question started out as a fairly traditional one. Teachers lectured at a blackboard, students listened and took notes. While there was an information technology component to teaching, the online learning software was not used to its full potential. It was essentially a virtual bookshelf - the cheap plywood kind - used to store documents.

There wasn't anything wrong in particular with the traditional style used at the school, but there wasn't much right with it, either. Students muddled through as they have around the world for decades, learning their lessons but not being terribly engaged or excited.

Teaching followed the one-size-fits-all paradigm, which worked about as well as Walmart stocking only medium-sized clothing. The same lessons that some pupils found too loose, others would find suffocating.

Two things stood out as candidates for improvement: student engagement with the material, and somehow allowing education to be customized to student needs, abilities and background.

Risk and Reward

The major change implemented by graced05's school was sensible, simple and effective. They replaced their aging learning software with Desire2Learn, one of the new generation of learning management systems.

By going with a well-known third-party application, the school reduced the risk involved in switching software platforms when compared to, say, a custom application.

Desire2Learn is well-reviewed. I do not mean that it has uniformly positive ratings - opinions of this platform are notoriously mixed - but that there are a lot of reviews from teachers around the globe spelling out exactly what they liked and what they did not like about the program.

The accumulated experience of thousands of vocal reviewers allows adopters of Desire2Learn to know exactly what they're getting into. If the negative reviews all mention aspects of the platform that are not important to the school in question, then the system will be superior in this case to what the raw average review score (3.2/5 on one site) suggests.

Before implementing the change, graced05's school made sure to upgrade their technology infrastructure, further reducing the possibility of something going wrong.

What changed?

In bullet point form, here are a few of the changes made possible by the new software:

* Contextualized resources - Instead of having course web sites be virtual Sanford and Son junk piles, teachers have been placing resources in their appropriate context.

The economics web page at 1:31 in the video, although fairly rudimentary, nonetheless positions links, pictures and a video in a context that makes their relationship to the topic under discussion clear. This small change tests the waters of the 'guide on the side' paradigm mentioned above.

* Conditional release of course information - Desire2Learn allows for content to be gated by skill. In the traditional model, students are assumed to progress through the material at the same pace (barring repeating a year). Mastery-based learning, of which this is a trivial instance, only allows students to progress to the next level once they have demonstrated sufficient skills in previous levels.

Students who show facility with the material will be able to zip along quickly, where in a traditional class they may have been bored through having to wait for others to catch up. Similarly, students who may have had trouble advancing at the one-size-fits-all speed can now take more time to learn material they have difficulty with.

This is very similar to the way in which traditional side-scrolling video games (famously, Super Mario Brothers) gated content. In general, a player could not advance to level 2-3 until they had successfully completed level 2-2. Appropriately enough for our analogy, players who put extra work into research or exploration of the levels could be rewarded with the discovery of secret 'warp zones' that allowed them to bypass a significant amount of content.

Based only on anecdotal evidence, I suspect that familiarity with the video game 'beat the level' trope has led students to greet this style of gated content with acceptance rather than frustration.
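For readers who think more naturally in code, the gating logic described above can be sketched in a few lines of Python. This is purely my own illustration - the unit names and the 80% mastery threshold are invented, and Desire2Learn's actual implementation is of course far more elaborate:

```python
# A minimal sketch of mastery-based content gating.
# Unit names and the 80% threshold are made up for illustration.

MASTERY_THRESHOLD = 0.8

# Each unit lists the unit that must be mastered before it unlocks.
prerequisites = {
    "unit-2": "unit-1",
    "unit-3": "unit-2",
}

def unlocked(unit, scores):
    """Return True if the student may access `unit`.

    `scores` maps unit names to the student's best quiz score (0.0 to 1.0).
    """
    prereq = prerequisites.get(unit)
    if prereq is None:  # no prerequisite: always open
        return True
    return scores.get(prereq, 0.0) >= MASTERY_THRESHOLD

scores = {"unit-1": 0.9, "unit-2": 0.65}
print(unlocked("unit-2", scores))  # True: unit-1 was mastered
print(unlocked("unit-3", scores))  # False: unit-2 not yet mastered
```

Notice that the pace falls out of the data: a strong student's score table fills up quickly and the gates swing open one after another, while a struggling student simply retakes quizzes until the threshold is met.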

* Customized (if canned) feedback on quizzes - Until quite recently, I looked down on multiple choice tests and thought them inappropriate in almost any serious course. Taking a few Coursera courses has changed my mind. Modern learning management systems allow the creation of multiple choice quizzes in such a way that they make a valuable complement to long-answer tests.

In a new-style test, the multiple choice answers are not arbitrarily chosen to look convincingly like the real answer (e.g. (a) 0.12 (b) 0.11 (c) 0.13). Instead, best-practice test designers consider all the common mistakes that a student could make when answering a question, and turn each mistaken solution into an answer option.

When a student chooses that answer on the test, chances are they made the mistake in question, and appropriate, customized feedback can be given.

Consider a question that asks you to calculate (5 x 4) + 2. A student uncertain about how to use brackets may answer (b) 30. An example of possible feedback is 'Remember that you have to solve the operations in brackets first.'

Although this way of writing multiple choice tests requires more effort from the instructor, the result is a personalized learning experience that provides useful, valuable feedback. For an example in the video, see the screenshot at 2:05.
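The idea is simple enough to sketch in code. Here is a toy Python version built around the (5 x 4) + 2 question from above - the answer options and feedback strings are my own inventions, chosen so each wrong option encodes one anticipated mistake:

```python
# A toy sketch of distractor-based quiz feedback.
# Each wrong option corresponds to a specific, anticipated student mistake.

question = "Calculate (5 x 4) + 2."

options = {
    "a": (22, "Correct!"),
    "b": (30, "Remember that you have to solve the operations in "
              "brackets first: 5 x 4 = 20, then 20 + 2 = 22."),  # did 5 x (4 + 2)
    "c": (11, "It looks like you added all three numbers. Multiply "
              "5 x 4 first, then add 2."),  # did 5 + 4 + 2
}

def give_feedback(choice):
    """Return the feedback string attached to the chosen option."""
    value, feedback = options[choice]
    return feedback

print(give_feedback("b"))
```

The extra work happens entirely at authoring time: for every distractor, the designer has to reverse-engineer the misunderstanding that produces it. Once that's done, the 'personalized' feedback costs nothing to deliver.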


In this experiment, a fairly small change in structure - implementing an updated learning system - has led to a sea change in how learning, students and teachers are viewed.

In the videographer's own words (2:55), "Though they're still a vital part of ... education, teachers are no longer the only gateway to knowledge."

The adult teachers have had to readjust in their roles, shifting focus from feeding information and seeing it is properly digested to taking the students on a tour of the orchard. The students still end up full and satisfied, but now they climb the trees that most engage them, and pick the fruit that's most comfortably within their reach.

It will take time for things to settle down to a new steady state. Teachers will have to keep thinking as 'guides on the side' about how best to meet the needs of individual students with the available resources and techniques. The rudimentary state of the course sites featured in the video suggests that despite a promising start, there's a long way to go.

Students, meanwhile, will have to understand that they are now in large part responsible for the pace at which their education progresses. For all that this can be liberating and exciting, it can also be a bit scary. Not only that, but interstudent social relationships are bound to change. Where once students in a class were certain to share identical course histories (assignments, readings and so on) now there is no such guarantee.

Pupils will still make friends at school, of course, but the nature of those friendships will be changed at a basic, subtle level. This interesting adjustment will probably happen fairly quickly, if it hasn't already. Teachers have years of training and experience to make them set in their ways. Students join the classroom with a beginner's mind.

That's it for our guided tour of a blended learning experiment. See y'all Thursday, when we may or may not explore the seedy side of macroeconomic subsidies and strategic investment in utilities.

Thursday, November 21, 2013

PSA: Bitcoin is neither a currency, nor harmless

Disclaimer: my views on currency are not representative of those of most economists. For the sake of clarity, I write as if my opinions were authoritative, but I encourage the reader to question them.

Bitcoin, often described as a digital currency, is swiftly gaining popularity and media coverage as a possible alternative to the fiat currencies issued by national governments.

A Bitcoin is produced by having a computer perform an intentionally difficult mathematical calculation. When the calculation is complete, a Bitcoin is created. The difficulty is meant to keep Bitcoins scarce, and thereby endow them with value.

A fiat currency, such as the Canadian dollar, is usually endowed with value by law that guarantees its acceptance for the settlement of debts within a country. Typically, the government has a monopoly on the production of currency. While some small countries outsource the production of actual coins and bills, the basic point stands.

Bitcoin works differently: anyone with time and a computer capable of performing the calculations can 'mine' them.
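To make the 'intentionally difficult calculation' concrete, here is a toy proof-of-work sketch in Python. The real Bitcoin protocol differs in many details (the data hashed, the difficulty adjustment, the reward mechanics), but this captures why mining burns raw computing time:

```python
# A toy illustration of proof-of-work mining: keep hashing the data with
# an incrementing nonce until the digest starts with enough zeros.
# "example block" and difficulty 4 are arbitrary choices for illustration.

import hashlib

def mine(block_data, difficulty):
    """Find a nonce whose SHA-256 digest starts with `difficulty` zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("example block", 4)
# On average this takes tens of thousands of hash attempts; each extra
# zero of difficulty multiplies the expected work by sixteen.
print(nonce, digest)
```

The only product of all that hashing is a number that is hard to find but trivial to verify - which is exactly the author's complaint below: the difficulty is the point, and the computation itself produces nothing of independent use.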

It is no secret that from a historical standpoint, governments are almost guaranteed to abuse their power to create currency.

Temptations include printing money to pay bills or sweeten a re-election, limiting the money supply for reputation reasons and using the money supply as a tool to perpetuate an elite's political power and/or otherwise unsustainable economic policies.

Misuse of currency can lead to real devastation - see Zimbabwe's recent history.

I am sympathetic to the desire to get away from all this and switch to a currency that governments can't (easily) tamper with.

Unfortunately, Bitcoin isn't the answer. It isn't even a currency, and the economic misconceptions it is based on can have perverse and harmful, if unintended, consequences.

When we use a national currency, there is an expectation that we will be able to exchange it for almost anything we should need or want - even foreign goods. This is often true even if the currency is very poorly managed - we'll just need more of it.

Why is this possible?

Few things are as certain as taxes. Governments typically require that at least a portion of taxes be paid in the national currency. Every tax-paying entity in the country therefore has a demand for the country's currency, because they'd like to avoid being jailed for tax evasion.

A fiat currency is therefore backed by the entire tax-paying productive capacity of the issuing country. If the world at large is fond of maple syrup, then the Canadian dollar will have value because Canadian producers of maple syrup will exchange at least some of their crop for the currency.

This is why a fiat currency doesn't need to be backed by gold, or silver, or anything else. If the country produces gold, then it's already backed by gold (though not at a stable guaranteed rate). It's also backed by falafel, painkillers, cleaning services, computers and anything else sold or created by a tax-paying business.

The value of fiat currency doesn't depend on the government requiring it for taxes - that's just one example of many that I chose because it's easy to explain in a few sentences. I leave it as an exercise to the reader to find others (a quick web search will yield a few).

Once you have at least a few entities accepting the currency no matter what, acceptance tends to snowball. Why does your friendly neighbourhood drug dealer accept Canadian dollars if she doesn't pay taxes? Because she can walk over to a convenience store and exchange them for candy.

(Note: lots of Canadian drug dealers and producers actually DO pay taxes - Canadian tax forms allow for paying taxes on income from undisclosed sources. No one wants to be the next Al Capone - a criminal mastermind finally brought down on tax charges.)

If the Canadian government made a mistake and accidentally quintupled the supply of Canadian dollars overnight, they would still have value. You could still exchange them for maple syrup, you'd just need a lot more of them (once prices changed).

Now, to Bitcoin. Why does Bitcoin have value?

Its creator, as interviews show clearly, thought currency had value primarily because of scarcity. Rare things have value. He figured that if he designed a way for Bitcoins to always be scarce, that would guarantee their value.

Fair enough. That doesn't make it a currency, though, that just makes it a scarce good.

The toenail clippings of registered nurses are also scarce (and actually collected for a few research applications), but that doesn't make them generally valued, or something you'd take to the fast food joint to trade for poutine.

National currencies have value largely because we can trade them for stuff we want. This, in turn, is because there's always a currency accepter of last resort - the issuing government. If worse comes to worst, the Canadian government itself will always accept Canadian dollars in exchange for, say, a parking ticket.

Who's the accepter of last resort for Bitcoin?

No one, as far as I'm aware. And if I'm wrong, then whoever it is that stands up and promises to trade Bitcoins for real stuff (like pizza or software) no matter what, forever, is being a bit silly: they run a very high risk of being left holding the bag if there's ever a crisis of confidence in the market.

While a number of businesses - not all of them illegal - currently accept Bitcoin, there's nothing in place to keep things that way. (Bonus link: a university recently started accepting Bitcoin tuition payments.)

That's why Bitcoin is not a currency. There's no expectation of being able to trade it for almost anything you might need or want.

Store credit coupons such as Canadian Tire money are closer to being currency, because the store will almost always trade them for real stuff.

They're still not currency, though - you can't easily trade a hardware store’s store credit for rice or soap.

One of the things that makes a national currency special is that if a nation's citizens are alive, then by definition everything necessary for life is available in the country. Most or all of these necessities are typically available in exchange for the national currency, in both legal and black markets. (There are exceptions: during periods of hyperinflation, producers and merchants may insist on payment in more stable, ‘hard’ currency.)

Suppose that these problems went away, and tomorrow everyone started accepting Bitcoin.

Bitcoin would still be a bad idea, simply because of the way it's produced.

To produce a Bitcoin, you need to intentionally waste a scarce resource: computing capacity. The mathematical problems solved are difficult, but they're also intentionally useless.

Bitcoin is created through the virtual equivalent of digging holes in the desert and filling them up again. While the activity itself is completely useless, it wastes labour, shovels, water, transportation and so on.

Earlier this month, a technology company sold $3 million (US) of Bitcoin mining equipment in four days.

That's 3 million dollars - more than you'll probably ever earn in your life - being spent on machines devoted to solving useless problems. Three million dollars that could have been exchanged for just about anything on the planet, including malaria research, legal aid for women at risk, and so on.

As an economist, I'm trained to think about the efficient allocation of scarce resources among unlimited needs and wants. From this point of view, the method for creating Bitcoins is obscene. I struggle to find any perspective from which it is morally justifiable - maybe one of my readers can help me with this.

As if the waste of computing resources isn't bad enough, the mining method also creates perverse incentives. In a recent example, mobile game developers are pondering hijacking their players' computing capacities to mine Bitcoins for them.

Even if this is agreed to by the players, you're still slowing down useful machines in order to solve useless problems.

The situation isn't hopeless. There's at least one way to make Bitcoin more palatable: turn it into a reward for equally difficult, but useful computation.

Lots of basic research - gene sequencing, parsing astrophysical data, brute-force molecular pharmacology and so on - requires a ton of computing power. Some researchers already farm out their calculations to volunteers; the most famous example is probably the SETI@Home program.

If Bitcoins were awarded for crunching THESE numbers, then the currency could actually benefit society.

It would also partially solve the 'accepter of last resort' problem. An academic organization using Bitcoin miners for gene sequencing, for example, could accept Bitcoin payments for access to academic articles and databases that usually require payment in hard currency.

(Aside: it is true that by providing an incentive to solve mathematical problems that are currently difficult, Bitcoin mining encourages innovation. That is also why the technique could be usefully harnessed for the improvement of basic science by being applied to the solving of difficult problems that are also useful and relevant.)

Thus ends my PSA. In its current state, Bitcoin is not a currency, and is harmful. With any luck, this'll change, but for now, don't buy into the hype.

Friday, January 20, 2012

To help the poor, help everyone

We expect modern governments to help us when we're down. For Canadians, this takes the form of (un)employment insurance, pensions for the elderly and disabled, universal health insurance and more. This social safety net works because the country at large pitches in through taxes, while only those in trouble - historically spoken of as 'the deserving' - get the payouts.

One big question is, who are the deserving? At first blush, it may seem obvious that we have to narrow eligibility for benefits to those who are really down on their luck. Consider old age pensions. Surely it's silly to give government pensions to the rich as well as the poor. The rich, being wealthy, don't need the typically small allowance and will probably just blow it on caviar and designer hubcaps. The poor, on the other hand, actually need the money to buy basic necessities, like bread, beer and heating fuel. Besides, if we give money to the rich, there's that much less to give to the poor.

Thoughts along these lines have led to the means testing of a host of benefits. Before the government hands out any cash, they check to make sure that you're miserable enough to qualify.

This all seems very reasonable, unless you've had some training in economics. Economists are trained to look beneath the surface of choices involving scarce resources, and the more you look at means testing using econo-vision, the more rotten it appears. It turns out that in many situations (including old age pensions) giving money to everyone trumps giving money to the deserving poor.

Below are the top five reasons why means testing is often a bad idea.

5. Stigma

Consider the following scenario. You've worked for twenty years at a bowling alley, but in 2008 the global recession comes along and you're laid off. People living through a world-wide financial crisis and credit crunch are worried about the future and not willing to spend much of their money on luxuries such as bowling. Or restaurant food. Or manicures. Pretty much everyone is firing their workers and hiring less, so you have to settle for working two entry-level part-time jobs in order to make ends meet. You used to live comfortably, but now you're stretching a package of Kraft Dinner for three meals and think washing laundry with soap is something that happens to other people. The baby's getting skinny, the roof is leaking and the car has all its battle scars from recent bumps and crashes. Time to ask the government for help, right?

Only if you're willing to endure the soul-crushing erosion of your sense of self-worth. Researchers recently confirmed what a bunch of us already suspected: many poor people will avoid going on the dole because they don't want to officially be tagged as paupers, and don't want government bean-counters to sneer at them while scrutinizing their income and expenses. The first might seem odd - if you're poor, you're poor, and there shouldn't be any problem with being correctly classified. There shouldn't be, but there is. It's the difference between showing up at the grocery store cashier station with an almost-empty cart and paying for it in cash, and showing up with a full cart, but paying for it with food stamps (or these days, a food ATM card). In the first case, you're poor, but proud - you may not be able to buy much with your income, but what you decide to do with it is entirely up to you. In the second case, your method of payment shows everyone staring at you in the grocery store that the government believes you can't be trusted with real money, and need to be stopped from spending your allowance on lottery tickets, booze and porn.

Something similar happens even when you're actually receiving a cheque that can be turned into cash. Cashing that welfare cheque can feel like advertising that you've failed at life, and aren't able to make it on your own without a government bailout. Interestingly, this effect is in play even when we're looking at big banks instead of poor people. When the US government insisted on handing out bailout money to banks, many of them tried to refuse the offered money. They were worried it would make them look bad in the eyes of their depositors and shareholders. Even the banks that were at risk of going bankrupt in a day or two didn't want to take the bailout money unless more prosperous banks took it, too. They didn't want to stand out as the ones that couldn't make it without a crutch from Uncle Sam.

Finally, there's the second form of stigma pointed out in the research paper. The thing about means testing is that in order for you to receive a benefit, someone (a government employee) has to go through the details of your life to make sure you're poor enough to qualify. This process can be humiliating and demeaning. "Oh, I see you have a pet fish, Mr. Smith. You told me you didn't have enough money to feed yourself, but you have enough to feed a pet?" "Do your children actually NEED a new sweater every year?" "Why does your cupboard have name brand cereals instead of the no name ones? Why do you have something as decadent as a cupboard at all?"

All these problems go away when the benefit is universal and everyone is getting it. There's no more shame in cashing that cheque because the millionaire up the road got one, too. If the family with the SUV is using food stamps to pay for a few frivolous desserts, it doesn't look bad when you use yours to buy the majority of your groceries. Also, if everyone is getting a cheque just for being alive and a resident, there's no longer any need for government agents to perform an audit almost as personal and pleasant as a colonoscopy.

Speaking of those government agents...

4. Means testing costs money

Gathering detailed information about people isn't cheap. Neither is processing that information once you have it. Means-testing benefits like pensions is a continuing process - every month (or however long it is between payments), the government has to make sure that benefit recipients haven't become wealthy enough to be disqualified.

Governments around the world are in financial trouble right now. Even setting aside debt-fueled meltdowns such as those in Greece and Portugal, their costs are up. No one's buying stuff, which means no one is hiring workers to make stuff, and on top of that banks are getting very shy about lending money to anyone. All this means that claims on (un)employment insurance and welfare are rising, at the very same time that the government's own income (tax revenue) is falling. Politicians are scrambling for ways to make national ends meet. One of the very first suggestions to come up is usually tighter means testing. The idea is that if you disqualify anyone who is able to afford two meals a day and give money only to the very poorest, you'll have more money to spend on bank bailouts and election advertising.

There are many problems with that notion, but here's one of the biggest: going down that route will massively increase the costs of administering the benefit. Remember, these countries are going through bad times so there are more people than usual applying for government benefits. This means that the country will need more civil servants to process the applications, which is costly. Tightening means-testing will require more investigation and monitoring of individuals, which also raises costs.

In the end, just giving the benefit to everybody who is breathing and a resident may be cheaper than hiring an army of men in suits to check monthly whether applicants are sufficiently close to starvation to merit a break. This is especially true if the benefits in question are tiny, as in fact the most heavily means-tested benefits tend to be.
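The arithmetic behind that claim is easy to sketch. Here's a back-of-the-envelope comparison in Python; every number below is invented for illustration, since real administrative costs vary hugely by country and program:

```python
def program_cost(recipients, annual_benefit, admin_cost_per_recipient):
    """Total yearly cost: payouts plus the administration needed per case."""
    return recipients * (annual_benefit + admin_cost_per_recipient)

population = 1_000_000        # everyone gets a cheque, no questions asked
poor = 400_000                # those who would pass a strict means test
tiny_benefit = 40 * 12        # $40 a month

universal = program_cost(population, tiny_benefit, 2)   # just mail a cheque
targeted = program_cost(poor, tiny_benefit, 100 * 12)   # monthly audits per case

# universal = 482,000,000; targeted = 672,000,000
```

With a tiny benefit and expensive per-case monitoring, paying everyone comes out cheaper. With a large benefit or cheap verification, the comparison can flip - which is exactly why tightly means-testing the smallest benefits is the worst case.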

Even worse - in a heavily means-tested world, where you may be disqualified for government aid because a civil servant noticed you've graveled your previously-dirt driveway, the approval process itself becomes a tempting target for abuse. That brings us to our next point:

3. Means-testing encourages fraud and corruption

Sadly, this one is self-explanatory - or should be. If an applicant's success in receiving government money depends on what a civil servant writes on her clipboard, there's temptation on both sides for the applicant to bribe the civil servant in exchange for approval. This is especially true if the means test is difficult or humiliating to qualify under. The bribe can potentially be quite high as a percentage of the benefit, since as far as the applicant is concerned, getting any money at all is better than not qualifying.

The more complicated the qualifications are, and the more people are applying, the easier it is to get away with fraud. Hey! What a coincidence - those are exactly the two conditions that we see being fulfilled in recession-ravaged countries. Given that governments usually implement means-testing as a way of saving money, there's not enough cash to hire more auditors to check on the civil servants and double-check their paperwork. The only way to detect fraud of this sort is if someone slips up in an exceptionally foolish way. Maybe a civil servant whose job it is to process applications for pensions suddenly starts wearing a Rolex watch and is caught speeding on the highway in a Porsche with a very expensive escort at his side. Or perhaps a welfare recipient shows up to the bank to cash the cheque in a designer suit, with diamond rings on every finger and a smile with enough gold teeth to plate a teacup. Unlike these show-offs, any fraudsters who play it safe and keep their heads down can remain undetected indefinitely.

Getting your citizens to treat bribery and fraud as a part of everyday life is a bad thing. Unfortunately, the harsher means testing is, the more it encourages this. No one wants to have to get rid of their decadent glass windows and working refrigerator in order to qualify for $40 a month, but if you can slip a tenner to an interviewer to have them confirm that you live in a shack made of cardboard salvaged from other shacks, why not?

Giving the benefit to everyone eliminates the middleman, and with him the temptation or possibility of fraud. (Well, okay, you can still pretend that your deceased friend is alive and continue to cash their welfare cheques.) It's not a good idea to make fraud part of the national culture, so governments may want to avoid creating a situation tailor-made for it.

2. Complexity means less take-up

The more tightly a government means-tests, the more questions it has to ask applicants and the more forms that must be filled out. In some cases, getting a benefit may involve visits to different government agencies, or having to fill out complicated questionnaires and provide evidence of destitution every month.

At some point, potential applicants may decide it's just not worth the hassle. Or they may be confused by a long list of requirements and decide not to bother since they probably don't qualify, even if they do.

Assuming the government is not actively searching for those who qualify for benefits (that'd be costly), means-testing can lead to those who need and qualify for the benefit not receiving it. By trying to exclude the well-off from the benefit recipients, you may end up excluding the poor, as well. This is doubly the case in countries where the poor have a low rate of literacy, and therefore have trouble wading through piles of government documents. A related problem is that the very poor may not have the required supporting documents proving their identity and socioeconomic status. In this case, only the wealthiest of the poor, those who are already plugged into the system to a large degree, will be able to qualify for the benefit. The neediest will be excluded.

All of this can be avoided by having the benefit be universal. Proof of life and proof of residence are all that is needed. These are easily understandable and easily obtained.

At last we come to the 800-pound gorilla in the room...

1. Means testing is a tax that rewards poverty and discourages work

Suppose you've managed to jump through all the hurdles and are the proud recipient of a basic income grant, employment insurance, a child credit and government-paid medical insurance. All of this is given to you by your society's safety net because a year or two ago you fell into hard times. You lost your job, had to pay for an expensive operation, and ended up selling your house and moving into a trailer.

You've been happy to rely on government money while you piece your life back together, but now that you're in a better situation you figure it's time to get back to a career and re-join working society.

You've been out of the job market for a while, so it's unlikely employers will hire you for a rewarding full-time job right away. You need to start small, maybe working part-time at a retailer to build up much-needed work experience. So, you look in the classifieds, find a few promising ads, and are about to apply for them... when you realize that taking the jobs would make you poorer than you are now.

Why? Because all your benefits are means-tested. When you take on a part-time job and start earning a very small income, your basic income grant will be reduced dollar for dollar of your earnings. You'll be working for free. All of a sudden, nights spent frying potato chips and spreading mustard on hamburger buns don't look as attractive as an evening in front of the television - and that's not all. Since you're employed, employment insurance will also vanish. Having a reasonable if tiny income also makes you too wealthy to receive government health insurance or the child credit, so now you have to pay for operations (part of what landed you in poverty in the first place) and your child's clothes and school supplies on your own. You'd better hope you stay healthy despite your poverty-induced high fat, low nutrient diet, and that your child stops growing until clothes go on sale at the consignment store.

All in all, it's better to stay unemployed, since trying to bring yourself out of poverty will only make you poorer. Means testing is a tax on work, and a high one.

My example is extreme and crafted to exaggerate a point, but a very readable paper agrees with the basic message. The authors calculate that in real-world Canada, means testing works out to an effective 45% tax rate on each additional dollar earned for families with children that earn $20,000 to $30,000 a year. This tax rate isn't reached by families without children until they have an income of $70,000 a year. This is about twice the average Canadian income. The reason for the difference? Child benefits are clawed back from the moment a family starts earning an income that keeps them from starving. Every dollar earned means a fraction of a dollar lost in benefits. When an extra dollar earned means 45 cents lost in benefits, sacrificing a family's quality time to go from part-time work to full-time work or developing a higher-paying skilled career doesn't look very attractive. Means testing discourages the poor from taking on the jobs that could eventually take them out of poverty. The short-term cost is just too high.
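The clawback logic is simple enough to compute directly. Here's a minimal sketch in Python using the 45% rate quoted above; the benefit amount and income levels are invented for illustration:

```python
def net_income(earnings, benefit=10_000.0, clawback_rate=0.45):
    """The benefit shrinks by clawback_rate for every dollar earned,
    until it is exhausted."""
    remaining_benefit = max(0.0, benefit - clawback_rate * earnings)
    return earnings + remaining_benefit

# Earning an extra $1,000 while in the clawback range nets only $550:
gain = net_income(21_000) - net_income(20_000)
```

And that's with a single benefit. Once several means-tested programs claw back simultaneously, the combined effective rate can approach or exceed 100% - the "working for free" scenario in the story above.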

"Stay poor, and we'll help out - if you manage to make it through all the bureaucratic hurdles. Try to get out of poverty, and we'll pull the rug out from under you." That's the basic message of means-testing. Targeting benefits to the poorest makes work very expensive. If benefits are universal, we don't see this disincentive to participating in the job market. If flipping burgers at McDonald's doesn't cost you your child grant, and you're able to keep all or most of what you make, you're more likely to take that fast food job, and maybe use it as a springboard for a more valuable career later on.

Means testing is one of those obvious notions that is actually a pretty crummy idea when you look at it closely. Unfortunately, its common sense appeal makes it very popular. Nearly every government in the world uses it as a linchpin of its social welfare program, and in the current global recession many nations are looking to tighten their welfare requirements. Instead, they should be loosening them.

Sunday, January 15, 2012

Grain prices in 2012

The price of grain has risen dramatically over the last few years. This matters a lot because the world's poor subsist largely on grain: any increase in price means fewer calories per day, malnutrition and starvation. Emerging economies, where standards of living are rising quickly, are also hard hit. As a population becomes more prosperous, it demands more meat in its diet. All those cows and chickens eat grain - a lot of it - and so a rise in the price of grain translates into a rise in the price of meat.

With that in mind, at first I was happy to read that, according to the Food and Agriculture Organization of the United Nations (FAO), food prices started falling in the latter half of 2011. Sure, the food price index for 2011 was still the highest since the FAO started measuring it in 1990, but at least there seemed to be hope for the future.

Reading past the headline tempered my enthusiasm. As an economist, I should have known that the price of something will generally fall for one of two reasons (or a combination of the two): lower demand, or higher supply. If no one wants something, the seller of the good will have to lower its price if she plans to get rid of it. If the market is flooded with more of a good than people wish to buy, its price will fall as sellers scramble to be the lucky one that gets a sale.

Both of these have happened in the global grain market. Worldwide financial troubles, particularly in Europe, have lowered demand for grain. Workers in Greece, say, are worried not only about their own jobs and incomes, but about their government's ability to continue to provide basic services. Frightened about the future, they've cut down on spending. Not just spending on luxuries and treats, but spending on basic food (grains and meat). That's scary.

Meanwhile, a bunch of farmers around the world looked at record grain prices from 2008 to 2010 and decided they wanted a piece of the action. Fields that might have grown other crops were converted to grow rice, wheat and (to a lesser extent) corn, and now there's too much of the stuff. Thankfully, grain can be stored, but that does nothing for farmers who need to sell their harvest NOW in order to have the money to feed their family and continue farming.

This happened dramatically in the US states of Kansas, Oklahoma and Texas. There, three things combined to make farmers plant more wheat than they ever had before. First, a drought that had been making it difficult to grow anything in the region eased. Second, the drought had led to many failed crops, and now those fields were ripe for planting with winter wheat. Finally, the high price of wheat made planting it seem a winning proposition. The problem is, of course, that it's not just one farmer thinking this way - they all did, and since they couldn't coordinate with each other, too much wheat was planted. The price of the grain fell so far that at least one farmer believes he can make more money selling his wheat crop as hay than as a grain.

Speaking of farmers coordinating with each other... the Canadian Wheat Board is set to lose its monopoly of western Canadian wheat in August of 2012. Until then, if you farm wheat in western Canada, you must sell it to the Wheat Board, which uses its clout (20% of the world's export market for wheat) to negotiate high prices. The Board isn't being entirely dismantled, but participation will be optional from August onward. Some farmers will choose to negotiate their own deals with wheat buyers. Since individual farmers have less bargaining power than the monolithic Wheat Board, and since the Board itself will have less wheat to bargain with, this should lead to lower prices for wheat worldwide. It may also lead to less planting of wheat in Canada, since the reward for doing so may fall.

Now for rice. The big players in the rice export business are India, Thailand and Vietnam. Right now, India is beating its competitors on price. This is despite an increase in domestic Indian demand for rice after the New Year's holidays, and an increase in the price of Indian rice to anyone buying it in US dollars, due to a newly strong rupee. In Vietnam's case, the price of its rice is higher than India's due to a government-instituted price floor on rice. In an attempt to help rice farmers, rice cannot be sold for less than the floor price. Unfortunately, this floor price is higher than the price of Indian rice, which makes Vietnamese rice relatively unattractive for importers. (This is just one of many cases where price-fixing worsens the problem it was intended to solve.)

India's strong performance on the world rice market is a little surprising, given how recently it came to the party. For four years, India banned the export of rice. The ban was only lifted last year, and so far India has exported 1.2 million tonnes of rice. An enthusiastic take-up of rice exporting is set to lead to a bumper crop this year, which will add to India's already large stockpiles of rice.

That up there is an important thing to remember, by the way: even if you are a grain exporter, you should not forget to keep stockpiles of the grain at home, just in case. South Africa has learned this lesson the hard way. Last year, it had a huge surplus of maize. Much more was harvested than South Africans could possibly eat in a year. As a result, most of it was exported. And by 'most of it', I mean almost all of it. Grain silos were left nearly empty. This left South Africa unprepared for this year's disappointing crop. (It's not in the article, but it may be that the huge surplus lowered maize prices to the point where farmers switched to other crops, leading to a shortage this year.) Not enough maize was grown to feed the country (or its chickens - poultry producers are upset at the rise in the price of feed), and South Africa found itself having to import maize at 800 rand a ton. It had exported its maize at 600 rand a ton. I wouldn't be surprised if it turned out that it was buying its own grain back...
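The per-ton arithmetic on that round trip is brutal. The article doesn't give the tonnages involved, so here's just the unit calculation:

```python
export_price = 600  # rand per ton received for the surplus maize
import_price = 800  # rand per ton paid to buy maize back

loss_per_ton = import_price - export_price  # 200 rand lost per re-imported ton
markup = loss_per_ton / export_price        # a one-third premium on every ton
```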

So there you have it. Grain prices in 2012 are likely to go down, both due to excess supply from farmers that wanted in on the action of 2011's high prices, and from depressingly low demand in the European Union and elsewhere.