Thursday, March 29, 2012

There Are No More Frontiers (And That's A Little Scary)

Alaska's official state nickname is one of only two that don't follow the formula "The X State". It is called "The Last Frontier". (Not to be confused with Space, which is not a U.S. state, but is "the final frontier".) (The other is New Mexico, "Land of Enchantment".) (OK, done with parentheses for the rest of the piece.) Unfortunately, the name no longer fits. The parts of Alaska that are going to be inhabited are already inhabited. There will be no Klondike v2.0. For that matter, much of the mountainous part of the rest of the country, though expanding in population, is no longer pushing tentacles of human settlement into the wilderness. The United States has reached something of an equilibrium.

"B'ars" are significantly less afraid of three-year-olds, though, so that's a bonus.
Image via Wikipedia.

That obviously bothers some people, like Peter Thiel, who started his career by creating PayPal and is continuing it by creating controversy. His new, big, strange project is an artificial island in international waters, not regulated by US law (see the recent seasteading news). Thiel believes that minimal government will create a paradise, and that this is the only viable way to make it happen. In other words: "It's 2012, where's my moon colony? Oh well, an artificial island works too."

The history of humanity seems to cycle between two phases: one where a young nation devises a new, better form of governance, and another where said nation does all kinds of crazy crap that tests the integrity of that form, causing some people to leave, colonize a new place, and start over at phase #1. We're neck-deep in phase #2, it seems. In the United States, this means you live in a place that is governed by entities who have a vested interest in keeping ideological conflict at a simmer—motivating the party bases—without allowing a full rolling boil. As long as the Fifth Party System remains in place, there will be no peace.

The BBC posted predictions for 2112 the other day, including one that surprised me quite a bit: That California would eventually secede from the Union. This struck me as two things: 1) unlikely and 2) unfortunately so. Now, I'm not one to argue for the secession of any part of the US, but it seems like we'd all be better off if there were some way for different ideological communities to engage in the level of self-governance they seek. California seems a very unlikely target for that for a number of reasons, but it's sort of sad that there's no way to vent the pressure of the ideological battle for dominance.

There's a saying regarding internet services: "If you're not paying for it, you're not the customer; you're the product." It seems clear that the same holds true for modern politics. If you're not receiving a subsidy, kickback, or special project, you're not a constituent. You're a flyer.

Handbills don't explore new frontiers.

Thursday, March 8, 2012

Storytelling Is the Most Important Human Activity, Ever

Since the dawn of humanity as we now recognize it, progress has been part of life. If you were a Mesopotamian farmer tucked away in the deep recesses of the BCE years, progress may have been so slow as to present you with an illusion of constancy. If you're my son or daughter, it will likely be so quick that it becomes background noise, like the motion of a plane on an intercontinental flight. We've tamed a good chunk of the elements, cured a load of diseases, and can reliably deliver either three pizzas or a month's worth of internet connection to many locations, at about the same price for either.

We got here by telling stories. It may appear that we got here by doing science to things, refining our beliefs and moral codes, and creating political change, but those are all just flavors of storytelling. It's what humans do, and it's why we are so prolific. In fact, it's pretty much all humans do.

CC license, by Scholastic, Inc.

Politics is the easy example: both sweeping changes and small-scale elections depend upon narratives to function. The U.S. Declaration of Independence contains a number of narratives that shaped the Revolution, but it was in turn built upon the narratives of a number of colonists who felt they were getting a raw deal. (Those narratives, or a version thereof, are still taught in US History courses from K to 12.) Elections depend upon narratives, too, and this much should be obvious. Candidates tell stories about what they will do, and about what the city, county, state, or nation should do. And they get elected based on how well the narratives they tell, and the narratives told about them, thrive.

Science, however, is also all about the narrative. The scientific method requires experimentation precisely because it wishes to produce useful narratives: ones that describe things which appear constant and accurate to a standardized human perception of the world. We've long idealized experimentation as the heart and soul of science, but it is the narrative constructed around the experiment—the story—that makes science available and useful to the whole of humanity. Newton did extensive experimentation on the effects of gravity, but it is the apocryphal narrative of his apple that sticks in the collective memory.

Religion, too, of course. If you're a skeptic, you believe that religion is a set of narratives created by man to explain the inexplicable. If you're a believer, it's that God delivered narratives to man in order to help us imagine the unimaginable. Either way, the story is foremost.

And so on. Mathematics consists of narratives about abstract concepts, presented in a symbolic language. Art often uses a literal narrative to communicate a figurative narrative, and identifying that figurative narrative is part of (if not most of) the joy of artistic endeavor.

Our economy, and therefore the quality of life we enjoy, depends almost entirely upon the narratives we tell: the ones we tell ourselves, the ones we tell the rest of the world, and the ones told and believed by every other human being on the planet. Belief, or conviction, the state of holding a narrative as truth, is the most important force in our world. Affecting beliefs is a huge human responsibility, as is adopting them.

In our modern world, ideas spread very quickly. Narratives shared through the Internet can be adopted and propagated within hours, with the potential to drastically change the human universe literally overnight.

It is very important that we exercise wisdom in sharing narratives of any sort, and rigorously examine the narratives we have chosen to accept.

Friday, December 2, 2011

Never Name Your Blogpost "A Modest Proposal"

I recently had a conversation with a friend about homeschooling. She used to be a public school teacher, but she believes the current system of public education is irredeemably broken. (She is right.) As the discussion continued, she sheepishly mentioned that she had already found a system that she would like to use for her son (who is currently 15 months old, so it's a way off yet). She beamed as she described it: "It's really focused on the classics, and they focus a lot on reading actual historical books. And there's no technology. They don't believe in technology." I was nonplussed*—how could that possibly be a benefit?

courtesy Hartlepool Cultural Services (CC BY-NC 2.0)

I.

But I guess education and technology have always had a funny relationship. The push to get computers into schools seems to have led to a lot of fraud, waste, and abuse. My wife, who for a year taught chemistry recitations at Arizona State University, told me about a typical tech-hole called "The Thunder Room" (I know, sounds like a...nah, forget it). Six digital whiteboards, a Mac laptop (now a tablet!) for every student, and a powerful presentation desktop for the teacher. The rumor is that it cost a quarter of a million dollars; mostly it ends in frustrating glitches, ruined laptops, and probably overtime for some IT guys. So, yeah, tech can be a wasteful distraction.

But then there's Salman Khan. The Khan Academy's bajillions of math and science (and other) videos teach essential topics for free, and many of these lessons have associated exercises. The whole thing's gamified, too, so some people are liable to learn calculus just to "level up". It's something that would be completely impossible pre-internet. (The idea of free, anywhere, anytime math/sci ed, that is. People have been learning calculus since it was invented, of course.) Khan seems to justify tech in education.

On the fringes of technologically-enabled learning, there's Wikipedia, which seems to defy categorization—it's a Rorschach test for people's opinions of modernity. Detractors point to the rampant factual errors, stubby articles, and blatant self-promotion and vandalism. Supporters point out that it is the largest single repository of human knowledge in history, and that it's completely free of charge and free-as-in-speech. Either way, it's a fixture of modern first-world society, and it's probably not going anywhere soon, even if it's not a valid source for term paper footnotes.

II.

Which brings me to the point of education: what is it? Is there something about the nature of education that would make a non-technological education preferable?

To my mind, there are three functions of a modern education. Note that these are not "purposes", so to speak. Talking about the purposes of education is a philosophically scary blind alley. The functions are: allowing a person to participate in society, creating opportunities for students to make money, and teaching a person information relevant to their self-awareness.

Allowing people to take part in society involves loading them up with culturally important information. Things like "O Captain! My Captain!", the War of 1812, the finer points of good writing, the use of basic technologies, the rules of football (whichever type is applicable), et cetera. Of course, understanding the Allegory of the Cave isn't particularly relevant to someone who's going to be placekicking for the Chargers for the next 12 years, nor are the finer points of a 40-yard field goal essential to most Princeton philosophers. The information we give here is just that, info, and coverage is spotty by definition.

Opportunities for wealth creation usually come in the form of post-secondary degrees, although social networking and training in schools can also lead to non-collegiate methods of money making (the garage band members who met and learned chords in music classes, the entrepreneurs who started selling during lunch period, etc.). The function of a homeschool in this case would be something more like that of a traditional high school—to teach students how to successfully navigate the collegiate universe with the end of a degree in mind.

Information relevant to one's self-awareness. That's a tricky phrase. Perhaps "teaching students how to think" would be more germane, if less accurate. You don't teach someone how to think; you teach them that there are many different ways to think, that some are better than others, and that there isn't one that is perfect or best. The result of this is self-awareness, sapience, and consciousness, granting the person the ability to willfully change the world for the better.

Finding a way to perform all three functions is daunting; the desire for homeschool programs to focus on classics and fundamentals is understandable. In a world drowning in seas of data, a simple, clearly-defined canon of work untainted by the messiness of instant communication is very attractive. Beyond that, American schools have been falling behind, or so nearly everyone is telling us. They used to be the best in the world, so returning to the Golden Age seems like a good idea.

III.

Before we jump on the paleo-educational bandwagon, though, let's consider what all that "messiness of instant communication" is all about. It is believed that the world will have produced 1.8 zettabytes of data in 2011. (A zettabyte is a trillion gigs.) That is almost an order of magnitude greater than the amount created in 2006, which was at that point 3 million times more data than is contained in all the books ever written. And out of that, a few important pieces can be gleaned—the signal-to-noise ratio is infinitesimal.
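As a back-of-the-envelope check, here's a quick sketch in Python. The 1.8 ZB figure for 2011 is the one cited above; the roughly 161 exabytes for 2006 is my assumption, pulled from the same family of IDC "Digital Universe" estimates:

```python
# Back-of-the-envelope check on the data-growth figures above.
# The 2011 number (1.8 ZB) is the estimate cited in the text; the
# 2006 number (~161 exabytes) is an assumption on my part, taken
# from the same series of IDC "Digital Universe" reports.

EXABYTE = 10**18             # bytes
ZETTABYTE = 10**21           # bytes

data_2011 = 1.8 * ZETTABYTE
data_2006 = 161 * EXABYTE    # assumed

print(f"a zettabyte is {ZETTABYTE // 10**9:,} gigabytes")   # a trillion gigs, as claimed
print(f"2011 vs. 2006: about {data_2011 / data_2006:.0f}x growth")  # ~11x
```

About elevenfold growth in five years: "almost an order of magnitude" indeed.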

All three of the functions of education I described depend upon a student's ability to navigate the world around her. No one has ever taken part in a culture, made money, or learned to be self-aware without other people. This is not because humans are social animals; it is because the human world is a social world. We are not particularly strong or particularly fast, but we're smart, and when we get together we can build a lot of things that make life a lot less difficult and painful for ourselves. The world we now inhabit is a new world, the world of crushingly large amounts of information. The successfully educated can navigate it.

(It is important to remember that I am not advocating relinquishing any of your educational duties. This is not "it takes a village to raise a child". The model is to take what you need from the resources that are available—"raid the village to raise your child".)

Unfortunately, most college grads cannot distinguish fact from opinion, nor can they search for information successfully. It is as if humanity has become aquatic, and schools have not taught students to swim.

IV.

The modest proposal is this: fill young children with as much knowledge—arm them with as many tools—as possible (language, math skills, etc.), and when they're older, present them with the problems they will actually have to solve in their lives, and allow them access to means by which they can find solutions. For math and science, this will probably mean pointing them to Khan (and to Wikipedia). For information literacy, unfortunately, the tools have not yet been built. I'm sure the world would love it if they were, and you may be able to join with others to do so.

Those tools would require students to find information, read it, and analyze whether or not they trust the source. The best way to do this, again, would be to ask the student a question and let them answer it using any resource they can. Closed-book tests on simple facts are memory exams, not useful assessments of learning.

*Footnote: While I personally think technology is essential for an education, I completely respect the decisions of my friend and anyone else who decides to avoid it in schooling their children. Frankly, if you're at least halfway competent and love your child, you'll almost certainly beat the crap out of public school.

Thursday, October 27, 2011

Announcing Dispatches from Next Year

Reverse Disclaimer: I'm not a Stephenson fanboy. In fact, I've only started to read two of his books, which seemed really good, but they ended up on the wrong end of work and school scheduling problems and had to return to the libraries from whence they came far too early.

Famed science fiction novelist Neal Stephenson wrote a huge, huge article for World Policy in which he faults modern science fiction for not being idealistic enough. His basic argument is this: modern state actors refuse to take scientific risks on the order of those of the Space Race, in part due to cynical, negative-effects-based sci-fi. His basis for this is simple: back in the day, we wrote epic science fiction, with space, physical or cyber, stretched out before us as a new world. That's much less the case now.

"The imperative to develop new technologies and implement them on a heroic scale no longer seems like the childish preoccupation of a few nerds with slide rules," Stephenson says. "It’s the only way for the human race to escape from its current predicaments. Too bad we’ve forgotten how to do it."

It's in that vein that I'm announcing my new sub-blog (yeah, I definitely have too many irons in the fire). It's called Dispatches from Next Year, and the concept is this: each Dispatch will be a commentary on the very near future, given the technologies, social movements, and politics of today. Each Dispatch will have a grain of the open possibility of tomorrow—fantastic, alien, and reachable.

Friday, September 9, 2011

On Whiskey and Scorpions

Practical fallacies and the virtue of practical thinking.

There’s a ton of content out there on logical fallacies—the concept of logical fallacy has definitely entered into the network of “pop sci” content outlets (Boing Boing, Wired, Radiolab, Malcolm Gladwell, TED, etc). The idea being, I think, that if you can rid humanity of logically improper thinking, you can get right to the business of landing on Mars, ending poverty and disease, and/or finally cancelling “The Bachelorette”.

I applaud this effort, even though I was recently caught using the phrase “begging the question” terribly wrong. Stamping out the Lake Wobegon Effect or “If-By-Whiskey” is absolutely a worthwhile endeavor, but I do have one concern (without being a concern troll): logic only gets you halfway to the door.

This guy is doing his part to stamp out "If-By-Whiskey"
Here’s an example: there’s an infamous game theory puzzle called the Two Envelopes Problem. In it, a participant is given two identical envelopes, one containing X amount of money, and the other containing twice that. This hypothetical participant takes one envelope, but before opening it, she’s given an opportunity to exchange it for the other. If she switches, she’s given the same opportunity again, and so on.

The trick here is that the math says you’re always better off switching envelopes (much as, in the Monty Hall Problem, you’re better off switching doors): if your envelope holds A, the other supposedly holds 2A or A/2 with equal probability, for an expected 1.25A. Unfortunately, the math doesn’t take into consideration that you don’t have all eternity, dang it, and $X is better than $0 (for positive values of X, of course). “Wait, wait,” you’re saying, “this is game theory! If the math doesn’t reveal the most logical course of action, there’s something wrong with the math, not with logical thinking on the whole!”

And you’d be right. But, as of right now, the math isn’t there. It hasn’t been corrected fully, so it can’t be used. So you’re stuck shuffling envelopes indefinitely until you make an illogical, but practical, decision (though I would switch envelopes at least twice, in case the weight of the money is a giveaway).
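To see why the always-switch argument can’t be right, here’s a minimal simulation sketch (my own construction, not from any particular game-theory text). It assumes the simplest version of the setup, where the pair of amounts is fixed before you choose:

```python
import random

def play(switch: bool, x: float = 100.0) -> float:
    """One round: the envelopes hold x and 2x; pick one at random,
    then either keep it or switch to the other."""
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)
    chosen, other = envelopes
    return other if switch else chosen

trials = 100_000
stay = sum(play(switch=False) for _ in range(trials)) / trials
swap = sum(play(switch=True) for _ in range(trials)) / trials

print(f"average if you stay:   {stay:.2f}")   # ~150.00, i.e. 1.5 * x
print(f"average if you switch: {swap:.2f}")   # ~150.00 as well
```

Both strategies average the same payout (1.5X), which means the naive 1.25X expected-value argument fails somewhere. Unfortunately, there are a lot of people in the world still shuffling envelopes. I see three big ways this happens, and I'll talk about one right now: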

Forgetting Why. The TED-friendly marketing dude Simon Sinek has a whole talk on “Starting with Why”, and the tl;dr version is this: “People won’t buy your product, they’ll buy your vision.” And that’s awesome, but the “why” I’m talking about is more like the “why” in “Why do scorpions have poisonous stingers?” or “Why does my car have bags of air that inflate when I crash?” and less like “Why does Apple make cool gadgets?” Forgetting Why is forgetting that scorpions can't choose not to have stingers.

A classic example of Forgetting Why stems from Malcolm Gladwell’s recent appearance on the Radiolab episode “Games”. Jad and Robert were talking about a study that showed that 4 out of 5 people root for the underdog in any given, unbiased scenario. Gladwell, apparently, did not, and briefly mentioned his sadness when the expected winner fails in an athletic contest. In doing so, he illustrates just what I mean by “forgetting why”.

You see, the “purpose” of a tournament isn’t to identify the strongest team. If that goal were more important than all others, we could simply have a committee do some math to the field of teams after a long season and we’d have a verifiable winner without all the madness. But the reason we have an NCAA basketball tournament is because it’s entertaining, and entertainment has a market value. The reason March Madness exists is not to tell us who's the best—it's to make lots and lots of money.

Conflating the perceived “purpose” of a thing with its function is a problem, because it can have the logical conclusion of a stung child telling the scorpion to get rid of that stupid stinger. We've all seen people who are obsessed with a cause that they cannot possibly effectively champion. Many of them are Forgetting Why, assuming that someone, somewhere, is making life unfair by his or her choice, and if he or she would only choose otherwise, a massive problem would be solved.

In other words, you can’t assume that products of evolutionary systems control their own existence. Like scorpions, or basketball tournaments, or cultures, or economies. Next up: Forgetting How.

Thursday, September 1, 2011

Beat the Filter Bubble: News from Outside the Spotlight

Camila’s War

In a nation thousands of miles away, one that has, along with its neighbors, a history of despotic rulers, brutal coups, and ground-shaking political conflicts, a youth-led uprising is taking place against perceived injustices. The protests have been going on for months now, and are beginning to spread to neighboring countries. The leader of these rebels against authority has given the federal government an ultimatum: it has one chance to address their cause, at a meeting this Saturday.

That nation isn’t Libya, Syria, or Bahrain—it’s Chile. The charismatic leader is 23-year-old Camila Vallejo, president of the student government of the University of Chile, and the issue is nationalizing education. Her nemesis is Chilean president Sebastián Piñera, notable for being a right-of-center leader in a Latin America that is increasingly swinging to the left. Ms. Vallejo’s group demands nothing less than education [through post-secondary] that is “free, public, and of [high] quality”.


Camila Vallejo in German newspaper Die Zeit. CC-BY: Germán Póo-Camaño


And their protests prove they are serious. Over the last two months, students have converged on public squares in the capital, supported the Chilean labor movement’s strike, and visited neighboring Brazil to spark protests there and meet with Brazilian president Dilma Rousseff. (Interestingly, Brazil does have public universities, but the protesters there are demanding that the government double its investment in education.)

The group’s seriousness is matched only by its savvy. Ms. Vallejo, like so many revolutionaries these days, has a strong online presence. Her Twitter feed updates at least a few times every day, and her posts have recently been accompanied by a black square where her profile picture should be—a show of solidarity with the loved ones of 16-year-old Manuel Gutiérrez, who was killed during the protests that accompanied the labor strike. Vallejo has over 200,000 followers.

Along with the support, of course, comes the resistance. Piñera, for his part, has repeatedly denounced the nationalization of education, indicating that he will never support it, which should no doubt make Saturday’s meeting very interesting. And Vallejo has informed authorities that her life has been threatened via Twitter, including one tweet that said, roughly translated, “We’re going to kill you like the bitch you are.”

Not everyone who opposes the student movement is quite that severe. La Tercera, a newspaper with wide circulation, published an opinion piece arguing that the privatization of universities in Chile serves a purpose: to facilitate the explosive growth of university attendance in the country, which currently sits atop the Latin American list for the percentage of college-aged students engaged in studies. The free market, says the editorial, is the only engine that could have supported that growth.

Free-market solutions have been the hallmark of Piñera’s administration, and while they seemed popular at the time of his election, Mr. Piñera’s popularity has waned severely since he took office in March of last year. Popular or not, though, he is still in charge, which means something different in Chile, which has been largely stable and democratic since the strangely peaceful removal of military dictator Augusto Pinochet in 1990, than it would in Syria or Egypt.

The Arab Spring may be an unfortunate backdrop for the educational protests, as Chile’s situation does not, in fact, involve widespread oppression, religious infighting, or oil interests. A better comparison might be the anti-corruption movement in India, led by Anna Hazare, a man who has been called (admittedly in hyperbole) “a second Gandhi”. Hazare’s followers protested non-violently, and when their leader was jailed, they watched with rapt attention as he began a hunger fast.

It may seem that these tactics of protest, normally reserved for the brutally oppressed, are unlikely to be useful for a movement like the one in Chile, over something as distant from life and liberty as free college tuition. It should be noted, then, that after a few days of Hazare’s hunger fast, India’s Parliament acquiesced, and has been working on meeting his demand: simply to close the loopholes in an already existing anti-corruption bill.


Saturday, July 23, 2011

Google+ and the Economy of Circles

I recently had an enlightening conversation with a brilliant friend who will remain nameless until he posts his own version of events. This friend had come across, somewhere on the internet, a page containing a list of “Twitter accounts that will follow you back, guaranteed”. As an experiment, he followed everyone on the list, and, as per the guarantee, they followed him back. Interestingly enough, he’s still racking up tons of followers beyond those from the original list. Having more followers means you will get more followers.

Thinking about the ramifications of following what amount to dummy accounts in order to increase his follow rate, this friend pointed out that there’s no reason for him to unfollow the accounts he followed. Many of them don’t post, and those that do can be put into a list, and that list can be silenced. There’s no regulation in the Twitter TOS that says he can’t follow certain accounts (that would be counterproductive), and there’s no social stigma associated with it, as no one is likely to double-check his follow list for bots. To my friend, this seemed to be a fatal flaw in Twitter’s system—it can be easily gamed without consequence. Fortunately, Twitter is not a game. It’s a free-market economy.

In fact, Twitter is a very free economy. Because there are only the barest of regulations, and because developers can and do build tools that automate and simplify actions in Twitter, users are left with rational self-interest as their only guide to navigating the economy, that is, their audience. Some, like my friend, do just that, escaping the trappings of social convention to further their clout by whatever means are available. But because of this forced freedom, Twitter is a world of bubbles (as in “housing bubble”), where individual importance shifts from day to day, and where there seem to be more “experts” than regular Joes.

Compare this to Facebook—an economy with heavier restrictions, where businesses register differently than celebrities, who register differently than regular people. Where events aren’t just blipped out to the masses, they’re fully scheduled and organized. Where there’s such a thing as “like” and where it takes mutual consent to form a connection. Facebook, by its nature, abhors the expert, and embraces the commoner. Spamming is not only encouraged, but subsidized, and there’s no reason to aspire to any benchmark of social performance, unless you’re a company.

I count myself among what I imagine to be a majority of people, content with the shortcomings of both methods. We doubted the need for another major social media platform, especially after having been jilted by Diaspora. Yet there was the Goog, rolling out what at first blush was poised to be Wave 2.0. Thankfully, it wasn’t—it was an exceptionally simple design that approached the social media economy as a free market, much like Twitter. And, in fact, despite the brilliance that is the Circle system of separating one’s friends, there’s nothing yet baked into G+ that will keep it from becoming Twitter. There’s also nothing that can prevent it from becoming like Facebook. At the same time.

That’s the brilliance of Google+: giving the user the ability to change the rules. Because one can change one’s content stream like the channel on a television, users can make the service into anything they want: a Twitter clone, a Facebook clone (minus some largely unnecessary features), a mix of the two, or something else entirely. The killer app is being able to choose your audience, and thus, your economy.

A lot has been made of G+’s ability to succeed—whether it will be a functional social network on the order of Facebook and Twitter. The numerical evidence seems to indicate that’s a big yes. But whether it will actually succeed—that is, yield an improvement for direct social media in general—will depend on whether we, the users, learn to make it bridge the gaps in existing social media services. For that to happen, we need to experiment, and we need to share.

There’s a lot of upfront work that goes into building a social network, and in Facebook and Twitter, that setup seemed to be straightforward and repetitive. For Google+ users, the setup phase is ongoing, as we attempt to discover how best to format our circles and use them in a way that makes sense. Experimentation will yield functional formats on an individual level, but then each of us needs to keep having usability discussions with any friends willing to share, ensuring that disruptive ideas have a chance to spread through the network. I recommend a circle called “Meta” or “Google+” for that.

That’s really the unexpected joy of Google+: millions of people, working together to make something awesome.