Friday, December 2, 2011

Never Name Your Blogpost "A Modest Proposal"

I recently had a conversation with a friend about homeschool. She used to be a public school teacher, but she believes the current system of public education is irredeemably broken. (She is right.) As the discussion continued, she sheepishly mentioned that she had already found a system that she would like to use for her son (who is currently 15 months old, so it's a way off yet). She beamed as she described it: "It's really focused on the classics, and they focus a lot on reading actual historical books. And there's no technology. They don't believe in technology." I was nonplussed*—how could that possibly be a benefit?

[Image courtesy Hartlepool Cultural Services (CC BY-NC 2.0)]

I.

But I guess education and technology have always had a funny relationship. The push to get computers into schools seems to have led to a lot of fraud, waste, and abuse. My wife, who for a year taught chemistry recitations at Arizona State University, told me about a typical tech-hole called "The Thunder Room" (I know, sounds like a...nah, forget it). Six digital whiteboards, a Mac laptop (now a tablet!) for every student, and a powerful presentation desktop for the teacher. The rumor is that it cost a quarter of a million dollars, and it mostly produces frustrating glitches, ruined laptops, and probably overtime for some IT guys. So, yeah, tech can be a wasteful distraction.

But then there's Salman Khan. The Khan Academy's bajillions of math and science (and other) videos teach essential topics for free, and many of these lessons have associated exercises. The whole thing's gamified, too, so some people are liable to learn calculus just to "level up". It's something that would be completely impossible pre-internet. (The idea of free, anywhere, anytime math/sci ed, that is. People have been learning calculus since it was invented, of course.) Khan seems to justify tech in education.

On the fringes of technologically-enabled learning, there's Wikipedia, which seems to defy categorization: it's a Rorschach test for people's opinions of modernity. Detractors point to the rampant factual errors, stubby articles, and blatant self-promotion and vandalism. Supporters point out that it is the largest single repository of human knowledge in history, and that it's completely free of charge and free-as-in-speech. Either way, it's a fixture of modern first-world society, and it's probably not going anywhere soon, even if it's not a valid source for term paper footnotes.

II.

Which brings me to the point of education: what is it? Is there something about the nature of education that would make a non-technological education preferable?

To my mind, there are three functions of a modern education. Note that these are not "purposes", so to speak. Talking about the purposes of education is a philosophically scary blind alley. The functions are: allowing a person to participate in society, creating opportunities for students to make money, and teaching a person information relevant to their self-awareness.

Allowing people to take part in society involves loading them up with culturally important information. Things like "O Captain, My Captain", the War of 1812, the finer points of good writing, the use of basic technologies, the rules of football (whichever type is applicable), et cetera. Of course, understanding The Allegory of the Cave isn't particularly relevant to someone who's going to be placekicking for the Chargers for the next 12 years, nor are the finer points of a 40-yard field goal essential to most Princeton philosophers. The information we give here is just that, info, and coverage is spotty by definition.

Opportunities for wealth creation usually come in the form of post-secondary degrees, although social networking and training in schools can also lead to non-collegiate methods of money making (the garage band members that met and learned chords in music classes, the entrepreneurs that started selling during lunch period, etc). The function of a homeschool in this case would be something more like a traditional high school—to teach students how to successfully navigate the collegiate universe with the end of a degree in mind.

Information relevant to one's self-awareness. That's a tricky phrase. Perhaps "teaching students how to think" would be more germane, if less accurate. You don't teach someone how to think; you teach them that there are many different ways to think, and that some are better than others, and that there isn't one that is perfect or best. The result of this is self-awareness, sapience, and consciousness, lending to this person the ability to willfully change the world for the better.

Finding a way to perform all three functions is daunting; the desire for homeschool programs to focus on classics and fundamentals is understandable. In a world drowning in seas of data, a simple, clearly defined canon of work untainted by the messiness of instant communication is very attractive. Beyond that, American schools have been falling behind, or so nearly everyone is telling us. They used to be the best in the world, so returning to the Golden Age seems like a good idea.

III.

Before we jump on the paleo-educational bandwagon, though, let's consider what all that "messiness of instant communication" is all about. It is believed that the world will have produced 1.8 zettabytes of data in 2011. (A zettabyte is a trillion gigs.) That is almost an order of magnitude more than the amount created in 2006, which was itself 3 million times more data than is contained in all the books ever written. And out of all of that, only a few important pieces can be gleaned; the signal-to-noise ratio is infinitesimal.

All three of the functions of education I described depend upon a student's ability to navigate the world around her. No one has ever taken part in a culture, made money, or learned to be self-aware without other people. This is not because humans are social animals, it is because the human world is a social world. We are not particularly strong or particularly fast, but we're smart and when we get together we can build a lot of things that make life a lot less difficult and painful for ourselves. The world we now inhabit has become a new world, the world of crushingly large amounts of information. The successfully educated can navigate it.

(It is important to remember that I am not advocating relinquishing any of your educational duties. This is not "it takes a village to raise a child". The model is to take what you need from the resources that are available—"raid the village to raise your child".)

Unfortunately, most college grads cannot distinguish fact from opinion, nor can they search for information successfully. It is as if humanity has become aquatic, and schools have not taught students to swim.

IV.

The modest proposal is this: fill young children with as much knowledge, and arm them with as many tools, as possible (language, math skills, etc.), and when they're older, present them with the problems they will actually have to solve in their lives, and give them access to the means by which they can find solutions. For math and science, this will probably mean pointing them to Khan (and to Wikipedia). For information literacy, unfortunately, the tools have not yet been built. I'm sure the world would love it if they were, and you may be able to join with others to build them.

Those tools would require students to find information, read it, and analyze whether or not they trust the source. The best way to do this, again, would be to ask the student a question and let them answer it using any resource they can find. Closed-book tests on simple facts are memory exams, not useful assessments of learning.

*Footnote: While I personally think technology is essential for an education, I completely respect the decisions of my friend and anyone else who decides to avoid it in schooling their children. Frankly, if you're at least halfway competent and love your child, you'll almost certainly beat the crap out of public school.

Thursday, October 27, 2011

Announcing Dispatches from Next Year

Reverse Disclaimer: I'm not a Stephenson fanboy. In fact, I've only started to read two of his books; both seemed really good, but both ended up on the wrong side of work and school scheduling conflicts and had to return to the libraries whence they came far too early.

Famed science fiction novelist Neal Stephenson wrote a huge, huge article for World Policy in which he faults modern science fiction for not being idealistic enough. His basic argument is this: modern state actors refuse to take scientific risks on the order of those of the Space Race, in part because of cynical, negative-effects-based sci-fi. His evidence is simple: back in the day, we wrote epic science fiction, with space, physical or cyber-, stretched out before us as a new world. That's much less the case now.

"The imperative to develop new technologies and implement them on a heroic scale no longer seems like the childish preoccupation of a few nerds with slide rules," Stephenson says. "It’s the only way for the human race to escape from its current predicaments. Too bad we’ve forgotten how to do it."

It's in that vein that I'm announcing my new sub-blog (yeah, I'm definitely running with too many irons in the fire). It's called Dispatches from Next Year, and the concept is this: Each Dispatch will be a commentary on the very near future, given the technologies, social movements, and politics of today. Each Dispatch will have a grain of the open possibility of tomorrow—fantastic, alien, and reachable.

Friday, September 9, 2011

On Whiskey and Scorpions

Practical fallacies and the virtue of practical thinking.

There’s a ton of content out there on logical fallacies—the concept of logical fallacy has definitely entered into the network of “pop sci” content outlets (Boing Boing, Wired, Radiolab, Malcolm Gladwell, TED, etc). The idea being, I think, that if you can rid humanity of logically improper thinking, you can get right to the business of landing on Mars, ending poverty and disease, and/or finally cancelling “The Bachelorette”.

 I applaud this effort, even though I was recently caught using the phrase “begging the question” terribly wrong. Stamping out the Lake Wobegon Effect or “If-By-Whiskey” is absolutely a worthwhile endeavor, but I do have one concern (without being a concern troll): logic only gets you halfway to the door.

[Image caption: This guy is doing his part to stamp out "If-By-Whiskey".]
Here’s an example: there’s an infamous game theory puzzle called the Two Envelopes Problem. In it, a participant is given two identical envelopes, one containing X amount of money and the other containing twice that. This hypothetical participant takes one envelope, but before opening it, she’s given an opportunity to exchange the envelopes. Once she has the new envelope, she’s given the opportunity to exchange again, and so on.
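For anyone who hasn’t seen the puzzle before, the case for switching is the standard naive expected-value argument, sketched here with Y standing for whatever amount happens to be in the envelope you currently hold (nothing about this is rigorous, which is rather the point):

$$E[\text{switch}] = \tfrac{1}{2}(2Y) + \tfrac{1}{2}\cdot\tfrac{Y}{2} = \tfrac{5}{4}Y > Y$$

Since the calculation comes out ahead no matter which envelope you hold, it appears to recommend switching every single time.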

The trick here is that the math says you’re always better off to switch envelopes (much like in the Monty Hall Problem, you’re better off to switch doors). Unfortunately, the math doesn’t take into consideration that you don’t have all eternity, dang it, and $X is better than $0 (for positive values of X, of course). “Wait, wait,” you’re saying, “this is game theory! If the math doesn’t reveal the most logical course of action, there’s something wrong with the math, not with logical thinking on the whole!”

And you’d be right. But, as of right now, the math isn’t there. It hasn’t been fully corrected, so it can’t be used. So you’re stuck there shuffling envelopes indefinitely until you settle on an illogical, but practical, choice (though I would switch envelopes at least twice, in case the weight of the money is a giveaway). Unfortunately, there are a lot of people in the world still shuffling envelopes. I see three big ways this happens, and I'll talk about one right now:

Forgetting Why. The TED-friendly marketing dude Simon Sinek has a whole talk on “Starting with Why”, and the tl;dr version is this: “People won’t buy your product, they’ll buy your vision.” And that’s awesome, but the “why” I’m talking about is more like the “why” in “Why do scorpions have poisonous stingers?” or “Why does my car have bags of air that inflate when I crash?” and less like “Why does Apple make cool gadgets?” Forgetting Why is forgetting that scorpions can't choose not to have stingers.

A classic example of Forgetting Why stems from Malcolm Gladwell’s recent appearance on the Radiolab episode “Games”. Jad and Robert were talking about a study that showed that 4 out of 5 people root for the underdog in any given, unbiased scenario. Gladwell, apparently, did not, and briefly mentioned his sadness when the expected winner fails in an athletic contest. In doing so, he illustrates just what I mean by “forgetting why”.

You see, the “purpose” of a tournament isn’t to identify the strongest team. If that goal were more important than all others, we could simply have a committee do some math to the field of teams after a long season and we’d have a verifiable winner without all the madness. But the reason we have an NCAA basketball tournament is because it’s entertaining, and entertainment has a market value. The reason March Madness exists is not to tell us who's the best—it's to make lots and lots of money.

Conflating the perceived “purpose” of a thing with its function is a problem, because it can have the logical conclusion of a stung child telling the scorpion to get rid of that stupid stinger. We've all seen people who are obsessed with a cause that they cannot possibly effectively champion. Many of them are Forgetting Why, assuming that someone, somewhere, is making life unfair by his or her choice, and if he or she would only choose otherwise, a massive problem would be solved.

In other words, you can’t assume that the products of evolutionary systems control their own existence. Like scorpions, or basketball tournaments, or cultures, or economies. Next up: Forgetting How.

Thursday, September 1, 2011

Beat the Filter Bubble: News from Outside the Spotlight

Camila’s War

In a nation thousands of miles away, one that has, along with its neighbors, a history of despotic rulers, brutal coups, and ground-shaking political conflicts, a youth-led uprising is taking place against perceived injustices. The protests have been going on for months now and are beginning to spread to neighboring countries. The leader of these rebels against authority has given the federal government an ultimatum: it has one chance to address their cause, at a meeting this Saturday.

That nation isn’t Libya, Syria, or Bahrain—it’s Chile. The charismatic leader is 23-year-old Camila Vallejo, president of the student government of the University of Chile, and the issue is nationalizing education. Her nemesis is Chilean president Sebastián Piñera, notable for being a right-of-center leader in a Latin America that is increasingly swinging to the left. Ms. Vallejo’s group demands nothing less than education [through post-secondary] that is “free, public, and of [high] quality”.


[Image: Camila Vallejo in German newspaper Die Zeit. CC-BY: Germán Póo-Camaño]


And their protests prove they are serious. Over the last two months, students have converged on public squares in the capital, supported the Chilean labor movement’s strike, and visited neighboring Brazil to spark protests there and meet with Brazilian president Dilma Rousseff. (Interestingly, Brazil does have public universities, but the protesters there are demanding that the government double its investment in education.)

The group’s seriousness is matched only by its savvy. Ms. Vallejo, like so many revolutionaries these days, has a strong online presence. Her Twitter feed updates at least a few times every day; her recent posts have been accompanied by a black square where a profile picture should be, a show of solidarity with the loved ones of 16-year-old Manuel Gutiérrez, who was killed during the protests that accompanied the labor strike. Vallejo has over 200,000 followers.

Along with the support, of course, comes the resistance. Piñera for his part has repeatedly denounced the nationalization of education, indicating that he will never support it, which should no doubt make Saturday’s meeting very interesting. And Vallejo has informed authorities that her life has been threatened via Twitter, including one tweet that said, roughly translated, “We’re going to kill you like the bitch you are.”

Not everyone who opposes the student movement is quite that severe. La Tercera, a newspaper with wide circulation, published an opinion piece arguing that the privatization of universities in Chile serves a purpose: to facilitate the explosive growth of university attendance in the country, which currently tops the Latin American list for the percentage of college-aged students engaged in studies. The free market, says the editorial, is the only engine that could have supported that growth.

Free-market solutions have been the hallmark of Piñera’s administration, and while they seemed popular at the time of his election, Mr. Piñera’s popularity has waned severely since he took office in March of last year. Popular or not, though, he is still in charge, which means something different in Chile, which has been largely stable and democratic since the strangely peaceful removal of military dictator Augusto Pinochet in 1990, than it would in Syria or Egypt.

The Arab Spring, in fact, may be an unfortunate backdrop for the educational protests, as Chile’s situation does not involve widespread oppression, religious infighting, or oil interests. A better comparison might be the anti-corruption movement in India, led by Anna Hazare, a man who has been called (admittedly in hyperbole) “a second Gandhi”. Hazare’s followers protested non-violently, and when their leader was jailed, they watched with rapt attention as he began a hunger fast.

It may seem that these tactics of protest, normally reserved for the brutally oppressed, are unlikely to be useful for a movement like the one in Chile, over something as distant from life and liberty as free college tuition. It should be noted, then, that after a few days of Hazare’s hunger fast, India’s Parliament acquiesced, and has been working on meeting his demand: simply to close the loopholes in an already existing anti-corruption bill.


Saturday, July 23, 2011

Google+ and the Economy of Circles

I recently had an enlightening conversation with a brilliant friend who will remain nameless until he posts his own version of events. This friend had come across, somewhere on the internet, a page containing a list of “Twitter accounts that will follow you back, guaranteed”. As an experiment, he followed everyone on the list, and, as per the guarantee, they followed him back. Interestingly, though, he’s still racking up tons of followers, beyond those from the original list. Having more followers means you will get more followers.
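That last sentence is the whole mechanism, and it’s easy to see in a toy simulation. The sketch below is not a model of Twitter’s actual follower mechanics; it’s just the generic rich-get-richer (preferential attachment) dynamic with made-up numbers, in which one account starts with a block of follow-back followers and every new follow lands on an account with probability proportional to its current count:

    import random

    def simulate(num_users=1000, new_follows=20000, boosted_user=0, head_start=500):
        """Toy preferential-attachment sketch: each new follow lands on a user
        with probability proportional to (1 + current follower count)."""
        followers = [0] * num_users
        followers[boosted_user] = head_start  # the follow-back-list head start
        users = list(range(num_users))
        for _ in range(new_follows):
            weights = [1 + f for f in followers]  # +1 so nobody has zero chance
            chosen = random.choices(users, weights=weights, k=1)[0]
            followers[chosen] += 1
        return followers

    followers = simulate()
    print("boosted account:", followers[0], "followers")
    print("median account: ", sorted(followers)[len(followers) // 2], "followers")

With those invented parameters, the boosted account typically ends up with thousands of followers while the median account picks up only a handful; the head start compounds on itself.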

Thinking about the ramifications of following what amount to dummy accounts in order to increase his follow rate, this friend pointed out that there’s no reason for him to unfollow the accounts he followed. Many of them don’t post, and those that do can be put into a list, and that list can be silenced. There’s no regulation in the Twitter TOS that says he can’t follow certain accounts (that would be counterproductive), and there’s no social stigma associated with it, as no one is likely to double-check his follow list for bots. To my friend, this seemed to be a fatal flaw in Twitter’s system—it can be easily gamed without consequence. Fortunately, Twitter is not a game. It’s a free-market economy.

In fact, Twitter is a very free economy. Because there are only the barest of regulations and developers can and do develop tools that automate and simplify actions in Twitter, users are left to rational self-interest as their only guide to navigating the economy, that is, their audience. Some, like my friend, do this, escaping from the trappings of social convention to further their clout by whatever means available. But because of this forced freedom, Twitter is a world of bubbles (as in “housing bubble”), where individual importance shifts from day to day, and where there seem to be more “experts” than regular Joes.

Compare this to Facebook—an economy with heavier restrictions, where businesses register differently than celebrities, who register differently than regular people. Where events aren’t just blipped out to the masses, they’re fully scheduled and organized. Where there’s such a thing as “like” and where it takes mutual consent to form a connection. Facebook, by its nature, abhors the expert, and embraces the commoner. Spamming is not only encouraged, but subsidized, and there’s no reason to aspire to any benchmark of social performance, unless you’re a company.

I count myself among what I imagine to be a majority of people, content with the shortcomings of both methods. We doubted the need for another major social media platform, especially after having been jilted by Diaspora. Yet there was the Goog, rolling out what at first blush was poised to be Wave 2.0. Thankfully, it wasn’t—it was an exceptionally simple design that approached the social media economy as a free market, much like Twitter. And, in fact, despite the brilliance that is the Circle system of separating one’s friends, there’s nothing yet baked into G+ that will keep it from becoming Twitter. There’s also nothing that can prevent it from becoming like Facebook. At the same time.

That’s the brilliance of Google+: giving the user the ability to change the rules. Because one can change one’s content stream like the channel on a television, users can make the service into anything they want: a Twitter clone, a Facebook clone (minus some largely unnecessary features), a mix of the two, or something else entirely. The killer app is being able to choose your audience, and thus, your economy.

A lot has been made of G+’s ability to succeed, in the sense of becoming a functional social network on the order of Facebook and Twitter. The numerical evidence seems to indicate that the answer is a big yes. But whether it will actually succeed, in the sense of yielding an improvement for social media in general, will depend on whether we, the users, learn to make it bridge the gaps in existing social media services. For that to happen, we need to experiment, and we need to share.

There’s a lot of upfront work that goes into building a social network, and on Facebook and Twitter that setup seemed straightforward and repetitive. For Google+ users, the setup phase is ongoing, as we attempt to discover how best to organize our circles and use them in a way that makes sense. Experimentation will yield functional formats on an individual level, but then each of us needs to keep having usability discussions with any friends willing to share, ensuring that disruptive ideas have a chance to spread through the network. I recommend a circle called “Meta” or “Google+” for that.

That’s really the unexpected joy of Google+: millions of people, working together to make something awesome.

Monday, June 27, 2011

Writing About Technical Subjects (Pt. I)

So, you're a company intent on empowering people with hardware, software, and/or other new tech. Or you're a non-profit with Big Ideas about digital rights, tech literacy, or another 21st Century cause. Or you're a copywriter, learning to ply your craft in the content economy. Telling compelling stories about complex topics is exciting, but challenging. This fact is your greatest friend and most terrifying enemy:

We now live in a world where the "simple facts" are not so simple.

Take Facebook. The second most-frequently visited place on the web is one of a handful of websites that has spawned its own verb. (When was the last time you said you were "facebooking"?) Behind the scenes, though, it's a complex messaging, image-sharing, networking beast of a site-slash-app, not to mention the developer interface for third-party apps.

The features aren't what Facebook splashes on the sign-up page, though. It says,

Facebook helps you connect and share with the people in your life.

There's a simple principle in effect here: you don't sell features, you sell motivations. I've long appreciated market researcher and TED Talker Simon Sinek's verbiage:

Start With "Why"

That's hard and fast for me: successful pitchmen and copywriters have always done this, even if it wasn't worded exactly as above. That doesn't change in the world of software. In fact, in a world of rapid development and widespread innovation, there's even less place for people selling boxes of features.

Recently, Twitter posted a guide to help journalists and other newsy sorts use their service. They didn't have to—heaven knows people have already been using Twitter to do news since the dawn of tweets—but they did it as a help to their userbase. #TfN may not be copy, but it follows another guideline I love:

Solve a Problem

When possible, I present services as solutions to problems that people face. (As an aside, I think the use of "solution" as a buzzwordy substitute for "service", "application", or "product" dilutes the semantic power of that word.) Some companies don't use much text to show that they solve a problem—apps like Turntable.fm solve a problem (or perhaps "grants a wish") in an obvious way, and sell themselves on word of mouth alone. For everyone else, there's copy.

Trouble is, you have to explain a complicated problem like "T1 connections used to be state of the art, but nowadays you get more bandwidth if you sign up with a Wireless ISP" to someone who may not understand "T1", "bandwidth", or "Wireless ISP". If you're a Wireless ISP, that's going to cause problems.

The writer, then, has to make the problem and its solution simple enough to be clear to the reader. "Telling your story" has become a popular phrase in the marketing industry, and one of the tricks of the trade is just that: making copy into a story, with a conflict, a resolution, a hero, and—if necessary—a villain.

Tell a Story

Of course, you don't want to take storytelling to the extreme, either, or your reader will think you don't respect them. Leave some complex concepts in the narrative, and explain what must be explained.

Don't Condescend

Here's a bit of copy I wrote for a client that I think exemplifies these ideas:

For years, network service was fastest when it was delivered by physical cables and circuits, the “T1” being the most popular with businesses. The T1 was a workhorse, but, as content on the web has become more complex, it’s clear that T1 just can’t keep up. Fortunately, wireless technology has kept pace with increasing bandwidth needs—OneAxis.net wireless can offer speeds comparable to those of cable internet[...]

So, there you have it, an overview of copywriting on complex topics. Next up: News writing.

Friday, June 10, 2011

Intersection: Trumor + The Filter Bubble

First in a series of pieces that examine the hidden links between apparently unrelated ideas.

Scientists at the Laboratory for Information and Decision Systems at MIT have been working on a way to determine how far an idea will propagate via Twitter. The system, named Trumor, measures the reach and influence of individual users on Twitter, much like the commercially-available service, Klout. Trumor, however, creates a list of "superstars" in each topic (e.g. soccer, automobiles, copyright law). These users are very likely to have their comments propagated through the network.
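The details of Trumor's scoring aren't spelled out above, so take the following as a purely hypothetical illustration, with invented data, of what a per-topic "superstar" list looks like in the simplest possible case: influence scored as nothing more than the total retweets an author attracts within a topic. Trumor's actual model is certainly more sophisticated than this.

    from collections import defaultdict

    # Invented sample data: (author, topic, times_retweeted). None of this
    # reflects Trumor's actual inputs or scoring; it's only an illustration.
    tweets = [
        ("alice", "soccer", 120), ("alice", "soccer", 80),
        ("bob", "soccer", 15), ("bob", "copyright law", 300),
        ("carol", "copyright law", 45), ("carol", "automobiles", 10),
        ("dave", "automobiles", 220),
    ]

    def superstars_by_topic(tweets, top_n=2):
        """Rank authors within each topic by the total retweets they attracted."""
        reach = defaultdict(lambda: defaultdict(int))  # topic -> author -> retweets
        for author, topic, retweets in tweets:
            reach[topic][author] += retweets
        return {
            topic: sorted(authors.items(), key=lambda kv: kv[1], reverse=True)[:top_n]
            for topic, authors in reach.items()
        }

    for topic, ranking in superstars_by_topic(tweets).items():
        print(topic, "->", ranking)

The point is only structural: whoever sits at the top of a list like this is the person whose comments are most likely to propagate, which is exactly what makes the list valuable to someone else.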

The Filter Bubble is the title of a book and the name of an idea developed by online activist Eli Pariser. The basic premise is this: online information is increasingly being organized automatically by individual preference, and this algorithmic curation may not be to our benefit. As an example, Pariser compares the results pages for Google searches on "Egypt" that two of his friends ran during the revolution earlier this year. One friend received mostly news; the other received travel agencies and basic encyclopedia information, but nothing on the protests. As human beings rarely like their ideas to be challenged, the filter bubble tends toward increasing divisions between groups of people, based on ideology.

The Intersection: As people self-organize by ideology, they will likely either choose to restrict the number of information sources they rely on, or the filter bubble will do it for them. As the bubble closes in, information from outside will leak in less and less frequently, making online thought leaders ever more powerful influences on those who follow them. (Incidentally, the implications of the word "follow" in the Twitter context will become creepier.)

As this occurs, the knowledge of who influences whom (will MIT retain this information? will it be made public?) will likely fall into the hands of corporations, political parties, hackers, and activist groups. The game will become "Who influences the influencers?" as groups attempt to convince members of the superstar list they're interested in to carry their content. This structure of top-down, siloed syndication would be very vulnerable to subtle subversion and account hacking.

In sum: As the world outside our senses becomes more important, our reliance on others' perspectives increases. As our filter bubbles close in, our worldviews become less robust and more vulnerable to manipulation via the social media superstars we rely on. Reality-hacking and reality-selling are imminent, but not in a cool cyberpunk way.