This is interesting (and depressing): Why women leave tech: It’s the culture, not because ‘math is hard’.
HyperText 2015 (bold mine):
The ACM Conference on Hypertext and Social Media (HT) is a premium venue for high quality peer-reviewed research on theory, systems and applications for hypertext and social media. It is concerned with all aspects of modern hypertext research, including social media, adaptation and personalisation, user modeling, linked data and semantic web, dynamic and computed hypertext, and its application in digital humanities.
HT2015 will focus on the role of hypertext and hyperlink theory on the web and beyond, as a foundation for approaches and practices in the wider community.
Submission Instructions for HyperText 2015:
All submissions should be formatted according to the official ACM SIG proceedings template and submitted in PDF format
So much lack of self-awareness.
Your daily reminder that politicians don’t understand technology or the modern world. In Parliament yesterday, Andrew George MP (Lib Dem, St Ives) said: “It’s run from a call centre in Newport 200 miles away, and also it uses logarithms which actually involve them asking a patient in my constituency, ‘Um, are you conscious?’.”
Hansard corrected it from “logarithm” to “algorithm”. It may just be an instance of “mis-speaking”, but I’m genuinely worried that the people who run our country mostly don’t know the difference between a logarithm and an algorithm. And worse, they probably don’t even care why not knowing that is a problem in a society based so heavily on science and technology. Scary.
Gross ignorance of science and technology would also explain David Cameron’s suggestion to ban messaging services that use encryption, and why such a suggestion would prompt security experts to say that he is “living in cloud cuckoo land”.
Today, I listened to an activist talking about the Global Summit to End Sexual Violence in Conflict which is taking place this week in London. An impassioned plea for political solutions to a global problem—the use of rape and sexual violence against both men and women as a weapon in war and conflicts. Nobody can object to that, surely.
What I heard in the discussion about this conference is the same as what I hear when a wide variety of political campaigns are discussed: to make an effective change, we need to understand the cultural, social, religious and political contexts of the places where this takes place. This is not moral relativism: it’s not to excuse rape or sexual violence. But to formulate an adequate response in terms of policy, one must understand the politics, the society, the culture, the religion, and work in a sustained and committed way with local activists and civic society. Otherwise, you’ll go in and enforce some ham-handed solution that’ll smell of imperialist meddling, of the White Saviour coming to save the impoverished natives.
To change a society, you need to understand it, or your efforts won’t connect with the people in that society. You’ll just end up sounding like a big, clueless phony. That kind of political engagement is hard work.
At the same time as activists increasingly realise this, we see it being applied less and less in the technology industry.
Running alongside the Ending Sexual Violence in Conflict event is a hackathon. As hack days/hackathons go, this has a laudable goal. I don’t think anyone thinks more sexual violence in conflict is desirable.
But the use of hack days to try and solve social problems itself seems like a bad hack. I hope I’m wrong: it’d be great if tools get developed at the EndSVCHack event that serve the important social goal of the activists trying to fight against rape and sexual violence.
I just don’t buy it though. If you sat me down and asked me to build tools to support those trying to help the victims of sexual violence in conflict zones, there are a lot of issues one would face. Okay, first of all: linguistic. I speak English and I know enough French that I can get by in a restaurant. I had a quick Google to find out where the chief problem zones are with sexual violence in conflict.
The International Campaign to Stop Rape and Gender Violence in Conflict lists four countries with significant issues—Burma, Colombia, the Democratic Republic of the Congo, and Kenya. I know very little about the context of what is going on in any of those countries, and I have a funny feeling most programmers living and working in London probably don’t know much about these countries beyond what they can glean from Wikipedia.
If the sort of activism and political campaigning that needs doing has to be smart, culturally aware and so on, hackers are going to fail to appreciate that context in a two-day process.
Next problem: institutional. Let’s say something gets built during those two days that is actually suitable for use by governments and/or NGOs that are trying to reduce sexual violence in conflict zones. How is that going to be used by the organisations working in the field? How is it going to be maintained? Who is going to train people working in the country? Plenty of hacks get built at hack days and then disappear. The hackers have jobs and lives they need to get back to. They may be able to crank out an app prototype, but the time to polish it, release it, maintain it and adapt it to the needs of the different societies in which it is meant to run—well, unless there’s some plan there, most of the hacks won’t be there a month later.
I’ve written about this before with regards to FloodHack: I’m not opposed to these kinds of thing, but I’m just very sceptical that they will have any results. If you wished to produce hacks to support NGOs trying to eradicate sexual violence in conflict zones, a hack day might not be the best way of doing it. Imagine instead if we had a fund which NGOs could apply to in order to get a couple of programmers for a few months. If you’ve got them there for a few months, then the programmers can actually understand the context of the problems they are solving—maybe go out into the society where the issues are. When they build things, they can do so knowing that there’s some institutional context—an NGO, a government etc.—that will maintain what they build.
The trend to think “oh, big social problem, let’s run a hack day!” seems to be a clear example of what Morozov calls “solutionism”. Apps don’t solve all social problems. Technical efforts to help solve difficult, very culturally-specific social problems seem a poor fit for the hack day format.
But I wish them luck and I hope my scepticism doesn’t discourage people from trying.
An Ajax loading wheel is not a “user experience”. It is a waste of the only life you have animated in miniature.
What is an opinion? What is the point in having opinions? I would suggest that your answer to these questions is very much dependent on who you are and what you value in society.
But let me answer just for myself. The value in opinion is dependent on whether your opinion is informed, whether you are a reasonable person in command of the relevant facts, whether you are aware of your susceptibility to erroneous thinking processes and capable of overriding them. The opinion you come to would ideally be structured. That is, it would have some kind of factual premises, some set of reasonable procedures you use in order to reach certain conclusions, and the areas where you have had to settle for subjective feelings or emotions spelled out in a way that people can see how you reached your opinion. You expect that the holder of an opinion can justify that opinion in some fashion by appealing to the premises, the reasoning procedures and so on.
Perhaps my opinion on the subject of opinions is based on my personal and educational background—undergraduate and postgraduate degrees in philosophy. But I’m not dogmatic about this: I don’t think all opinions need to follow from some kind of pure reason or hew to the truth conditions of logical positivism. We can say intelligent and informed things about the subjective realm, about art and music and our emotions and personal experiences. Even with those, we can aspire towards understanding, to providing reasons and arguments, even if those reasons are subjective or presuppose some view not shared by others.
If that is close to your understanding of the nature of opinion, let me congratulate you on being a member of a proud philosophical tradition stretching back to the ancient Greeks. I have bad news for you and for your intellectual ancestors—Socrates, Hobbes, Descartes, Hypatia, Hume, Darwin, Russell—or whoever you pick for your hand from the grand deck of intellectual Top Trumps cards. All of you are out of touch with the modern world of business, advertising and consumerism. And if you are out of touch with those, then by extension, you are out of touch with their offspring: technology, media and the intersection of those things—social media.
It is with this background that I tested out State, a relatively new social media/technology startup based in London that is seeking to build a “global opinion network”, where the user can “have [his or her] opinion counted and see where [they] stand relative to others”. State has $14 million in funding, according to TechCrunch, and intriguingly has professional bullshit peddler Deepak Chopra on their board of advisors according to GigaOm.
I have been giving it a try: I mean, it should be a perfect fit—I have opinions. More than that, I’m a loudmouthed grumbly person who likes sharing my opinions with only minimal solicitation. Sounds like my sort of service. I joined partly because a former colleague of mine had just started there and encouraged me to give it a try.
State is indeed a very interesting service, not so much because I think it will be either popular or important. I don’t think it will be either of those two things. But as a perfect encapsulation of exactly what the future holds for social media and society, you couldn’t do much better.
When one joins State, one is encouraged to find topics and to state one’s opinion on them in the form of a number of single word ‘opinions’, of the following form:
You get the drift.
In fairness to State, you can then attach a comment to your statement, to qualify or expand on it. But the primary index of one’s opinions is this single word expressive grunt: awesome, amazeballs, fab, OMG, fail, omnom. The designers of State looked at Twitter and decided it was not short-form enough and so have stripped from it any content besides the hashtags.
Of course, my predictions of what will become popular are very fallible and though I personally do not see State having much success, it could end up being the next big thing. If it does, it won’t be long until people from the worlds of marketing, business and media swarm on to it, demand some kind of API to extract opinions from the State platform and have them displayed in executive summary form on a Big Data-powered dashboard platform. Engineers will scurry around so that senior figures in consumer-facing industries who have a stake in public opinion will be able to see an algorithmic summary of what exactly the interconnected plebiscite thinks of their brands, their celebrity representatives, their preened political spokesmen, all helpfully quantified into a stock ticker-style ‘metric’.
The helpful grunts from social media will be put through “sentiment analysis” and the opinions of the consumer will lead to a happier, better world where marketers can slice us, dice us, mix together our opinions with our demographic data, quantify whether our preferences satisfy key performance indicators and lots of other important measures.
Opinions in this new world of social media aren’t opinions: they are signalling grunts for marketers. Are you doing better than your competitors? Count up the positive grunts and the negative grunts, calculate the balance of grunts and see if you are getting more grunts than the other guys. The consumer has so much choice on where exactly to post their grunt: on Twitter (with a hashtag, perhaps), on Facebook (by liking posts and pages), on Google+ (if that still exists) and finally now on State. As a system of grunt aggregation, State is impeccable.
Where it falls down is on that boring rational philosophical stuff I started with. In the epistemology of State and many similar social media sites, opinions don’t have supporting reasons. They don’t derive from any confrontation with evidence or experience. They don’t allow for refutation or reformulation or revision. You can refute an argument; you can’t refute a grunt. Ambivalence leads to confusion: thinking a politician is a vicious, dastardly shitbag but admiring his Machiavellian success doesn’t translate easily into a simple aye or nay vote for that person.
It’s quite telling that for an opinion platform, I am actually unable to express my opinion of State on their own platform but have to resort to constructing paragraphs of prose and posting them on my own website. But then, I’d like to think my opinion on this topic has reached the point where it is no longer a grunt but some kind of at least vaguely sophisticated take on the place of a piece of technology in society.
Of course, in the consumerist zeitgeist, complex thought is rather embarrassing. If a consultant tells you something is impossible or unethical or complicated, you just sack them and hire a bullshit-peddling yes-man to tell you that everything will be fine, and that pigs will fly so long as they practise positive thinking. Why bother weighing up a complex interlocking argument when you can grunt an opinion about a hashtagged blipvert or whatever it is some advertising creative has come up with this week?
If you want to grunt about things, I highly recommend State. As grunt publication and aggregation platforms go, it is exquisite—wonderfully designed, superbly executed, beautifully illustrated and rather addictive. If you want to express something more like an opinion and less like a grunt, you might want to read a writing manual and start a blog, as well as prepare for being ignored by the decision-makers in society because they’ve collectively decided that grunting is more important than well-considered opinions.
Yesterday, I posted about sexism in a job ad. But one of the things that concerned me about the ad beyond the sexism is the absurd levels that companies now go to with “culture fit”.
From that ad:
He’ll like comics, will code for fun, probably wear band t-shirts
I’ve seen other job adverts like this that ask for a completely superficial level of cultural sameness.
And it is utter bullshit. What damn difference does it make whether I like comic books or wear band t-shirts? I happen to not do either of those things. Unlike seemingly everyone in hackerdom, I don’t actually like beer. I don’t read comic books (the genre just doesn’t do anything for me). The thought of spending any time in some shithole Shoreditch bar listening to grimy indie rock is utterly unappealing to me. (Similarly, stuffing five pound notes into a young lady’s g-string doesn’t do much for me either.)
Why should this nonsense matter? Why does what I do in my spare time affect my job? If I went into a job interview and they asked whether I go to church or whether I’m single or married or in a long term relationship, that’d be highly inappropriate. Society—and employment law—rightly tells employers that race/ethnicity, religion, sexual orientation, gender, marital status and age are not criteria they can legitimately use to decide on whether to employ someone. But apparently, music and fashion and what types of alcoholic beverage I consume are now things that employers can use to inform their decision making as to whether I would be good slouched in front of Eclipse or TextMate all day banging out code.
Think about it in terms of your superiors at work. I don’t give a fuck whether my manager spends his or her weekends seeing the latest hipster grunge act or dressed to the nines for a night at the opera house. It’s far, far more important that they are competent, capable, sincere, have experience and can enable me to be a better person in my work than the sort of music they listen to, or whether they are into microbrews or read comic books.
What this actually represents is easy to work out: an attempt to hire people just like them—from the same class background, racial/ethnic background and age range, using culture as a proxy. A crude, cynical way of doing discrimination by the backdoor. You don’t want a woman on your team? She doesn’t have cultural fit. Don’t want a middle-aged parent on your team (because, gasp, you might have to pay them well, not require them to do Red Bull-fuelled all-nighters, and let them have time off for little Timmy’s piano recital)? Cultural fit issues. Don’t want black queer men around calling out your racist homophobic brogrammers? Treat ‘em like shit because they don’t “fit” your “culture” (even if your workplace culture is the sort they study in Petri dishes rather than put up in the British Museum).
I’m pretty sure that writing this post means there are now certain cultures I won’t fit in. Oh well.
How we used to read the news, back in the era of the Web:
How we read news in the era of fucking stupid pointless iPhone apps.
In the “web vs. apps” war, I think you can infer which side I’m on. I wouldn’t download a BBC app or an NPR app for my computer. Why would I want one on my phone? Do I buy a separate radio to listen to different stations? No. The functionality is the same, the only thing that differs is the content. Apps ought to provide some actual functionality, not just blobs of content wrapped up in binary files.
The release of the iPhone 5 seems to have set off more Internet debate about smartphones. I’m completely uninterested.
I have a smartphone: a Samsung Galaxy S2. It makes phone calls. It has some neat applications. The Gmail app on Android is superb. But beyond that… it’s a phone.
People debate smartphones as fetish objects. On Facebook, Robert Scoble said that
holding the new iPhone sold me. There’s nothing wrong with a few fetish objects or things that look nice. I don’t spend a lot of time just holding my phone. I spend time either using my phone or having my phone in my pocket. I just can’t understand spending hours waxing rhapsodic or worrying about this stuff. I mean, someone gave me a very nice bottle of eau de toilette a while back, and I enjoy both the design of the bottle and wearing it but not to the point where I’m going to go and argue about the bottle designs of different brands of fragrance and express disappointment if a particular perfume manufacturer fails to innovate sufficiently.
People are overthinking this shit. They are phones. Unlike the Windows v. Mac v. Linux fights of yesteryear, it isn’t like anyone actually uses these things for anything important.
It’s a recurring theme in the argument about journalism: that journalists don’t know what they are talking about. With the magical powers of science, I want to see if that’s true. Below is a series of questions I have come up with to test whether it’s actually true or not. And by science, I mean a hastily constructed pop quiz.
Here’s the deal. If you are a technology journalist, please answer truthfully. I know all journalists are truthful and honest—I’ve been watching the Leveson Inquiry. See how many you can get right.
If you get fewer right than you think you should, consider whether you should be writing about technology.
Above is a sample of some code written in a programming language that was introduced in the last decade.
Please identify the name of the programming language and the broader family of programming languages that it is in.
If you can, please identify the name of the creator of the language.
Programming languages fall into two types: dynamically typed and statically typed. Please identify whether this is a dynamic or a static language.
Above is a sample of some code written in a different programming language.
Please identify the name of the programming language.
If you wished to produce a game to sell on the App Store for iPhones/iPads, which of the two languages you have identified above would be more suitable to build such a game in given the constraints placed on developers in the iOS ecosystem?
Above is a sample of a string that identifies something. Please identify what it is for.
WEP, WPA and WPA2 are types of what?
FAT32, ext4 and HFS+ are types of what?
You have probably heard of NoSQL. Please choose the odd one out: Redis, Riak, CouchDB, MariaDB, MongoDB, eXist.
I was going to ask a complicated question, but it’s now past 2am and the question I was going to write involved me reading assembly code, and I took the executive decision that I couldn’t be arsed.
Anyway, all of the questions above are on topics that have been covered or mentioned at least once on either TechCrunch or ZDNet. Even if you have no plans to write about those topics, it’s something you presumably need to vaguely understand in order to be able to read the writings of other tech journalists.
Before you say “ah, but writing about technology doesn’t mean you have to be a geek”, let me ask you this: would you read what a music journalist has to say if they can’t identify the bands that are discussed in their own newspapers and on the websites they write for? What about a motoring journalist who had no idea what an axle was? A politics journalist who couldn’t tell you what a party whip does? A journalist covering the financial markets who has no idea who sets the interest rates? A wine reviewer who doesn’t know whether chianti is red or white? A science journalist who doesn’t understand the difference between an element and a compound? A religion writer who was a bit shaky on the difference between Protestantism and Catholicism? If not those, why do we accept journalists writing about technology who couldn’t tell you what a compiler is?
I’ve finally got USB tethering working between Mac and Android. I followed these instructions from AskDifferent (the StackExchange site for Apple and Mac related questions). You have to install a piece of software called EasyTether on your phone, and then carefully follow the instructions in the app which include installing drivers on your computer. It takes about 10 minutes.
But if you do that… it actually works. I’ve set up 3G connections before on Linux, for instance, which have required me to write AT strings and so on. (Which, you know, why? It’s 2012, for fuck’s sake.)
So why bother, given that any decent Android phone has a portable hotspot mode which basically lets your phone rebroadcast the 3G signal as a wifi hotspot?
Two reasons come to mind.
Firstly, battery usage. You don’t need the wifi running in either your phone or on your computer. Less battery usage is obviously good.
But the far bigger reason is that you actually get a better connection. One thing I’ve noticed with both MiFi dongles and with the Android portable hotspot is that when you dip in and out of a mobile signal area, it’s very slow to reconnect. You spend a lot of time in TCP/IP-free limbo. This never used to be the case with GPRS: you’d get very quick reconnection, obviously at an unacceptably low speed.1
I’m writing this on the train home, and I’m getting service in areas that I wouldn’t when using portable hotspot. Portable internet that isn’t infuriating is good. That I have about an extra hour of battery life on my laptop is a nice bonus.
I’d like to reiterate a fundamental point: speed is one of the least important aspects of broadband connections. Reliability, latency, usage caps and so on are far more important than speed, depending on the application you are using. For pottering about on the web, downloading a few MP3s, what’s the damn point of having super-duper-ultra fast broadband? You can give me fifty megs a second, but if I can’t afford to use more than a gigabyte of data a month, it’s basically a toy. ↩
After an evening of cynicism last night, reading a bloody awful article by a pompous twit, and travelling on bloody slow trains, and then logging on to Twitter and seeing a bunch of bloody fools debating things they are completely ignorant of without even a modicum of philosophical charity, I found something which restored my trust in the human race: psd’s talk at Ignite London. It combines giving naughty link-breaking, data-sunsetting corporate types a spank for misbehaviour with an admiration for I Spy books. I had I Spy books as a kid; mine were products of the late 80s/early 90s and had the Michelin Man, though not in nearly as intrusively corporate a way as Paul’s slides of current-day I Spy suggest. Do forgive me: I’m going to do one of those free-associative, meditative riffing sessions that you can do on blogs.
The sort of things Paul talks about underlie a lot of the things I get excited about on the web: having technology as a way for people to establish an educational, interactional feeling with the world around them, to hack the world, to hack their context, to have the web of linked data as another layer on top of the world. The ‘web of things’ idea pushes that too far in the direction of designed objects (or spimes or blogjects or whatever the current buzzword is), and the way we talk about data and datasets and APIs makes it all too tied to services provided by big organisations. There’s definitely some co-opting of hackerdom going on here that I can’t quite put my finger on, and I don’t like it. But that’s another rant.
I’ve been hearing about ‘gamification’ for a while and it irritates me a lot. Gamification gets all the design blogs a-tweeting and is a lovely refrain used at TED and so on, but to me it all looks like “the aesthetic stage” from Kierkegaard applied to technology. That is, turning things into games and novelties in order to mask the underlying valuelessness of these tasks. Where does that get you? A manic switching between refrains. To use a technological analogy, this week it is Flickr, next week it is TwitPic, the week after it is Instagram. No commitment, just frantic switching based on fad and fashion. Our lives are then driven by the desire to avoid boredom. But one eventually runs out of novelties. The fight against boredom becomes harder and harder and harder until eventually you have to give up the fight. There’s a personal cost to living life as one long game of boredom-avoidance, but there’s also a social cost. You live life only for yourself, to avoid your boredom, and do nothing for anybody else. Technology becomes just a way for you to get pleasure rather than a way for you to contribute to something bigger than yourself.
In Kierkegaard’s Either/Or, the alternative to this aesthetic life was typified by marriage. You can’t gamify marriage, right? You commit yourself for life. You don’t get a Foursquare badge if you remember your anniversary. The alternative to aestheticism and boredom is an ethical commitment. (And, for Kierkegaard anyway, ultimately a religious commitment.1) And I think the same holds true for the web: you can gamify everything, make everything into Foursquare. Or you can do something deeper and build intentional, self-directed communities of people who want to try and do something meaningful. Gamification means you get a goofy badge on your Foursquare profile when you check into however many karaoke bars. A script fires off on a server somewhere and a bit changes in a database, you get a quick dopamine hit because an ironic badge appears on your iPhone. Congratulations, your life is now complete. There’s got to be more to life and technology than this. If I had to come up with a name for this alternative to gamification that I’m grasping for, it would be something like ‘meaning-making’.
Gamification turns everything into a novelty and a game (duh). Meaning-making turns the trivial into something you make a commitment to for the long haul; it turns the things we do on the web into a much more significant and meaningful part of our lives.
In as much as technology can help promote this kind of meaning-making, that’s the sort of technology I’m interested in. If I’m on my deathbed, will I regret the fact that I haven’t collected all the badges on Foursquare? Will I pine for more exciting and delightful user experiences? That’s the ultimate test. You want a design challenge? Design things people won’t regret doing when they are on their deathbed and design things people will wish they did more of when they are on their deathbed. Design things that one’s relatives will look back on in fifty years and express sympathy for. Again, when you are dead, will your kids give a shit about your Foursquare badges?
A long time ago, I read a story online about a young guy who got killed in a road accident. I think he was on a bike and got hit by a car while driving home from work. He was a PHP programmer and ran an open source CMS project. There was a huge outpouring of grief and support from people who knew the guy online, from other people who contributed to the project. A few people clubbed together to help pay for two of the developers to fly up to Canada to visit his family and attend the funeral. They met the guy’s mother and she asked them to explain what it is that he was involved in. They explained, and in the report they e-mailed back to the project, they said that the family eventually understood what was going on, and it brought them great comfort to know that the project that their son had started had produced something that was being used by individuals and businesses all over the world. This is open source: it wasn’t paid for. He was working at a local garage, hacking on this project in between pumping petrol. But there was meaning there. A community of people who got together and collaborated on something. It wasn’t perfect, but it was meaningful for him and for other people online. That’s pretty awesome. And it’s far more interesting to me to enable more people to do things like this than it is to, I dunno, gamify brands with social media or whatever.
This is why I’m sceptical about gamification: there’s enough fucking pointless distractions in life already, we don’t need more of them, however beautiful the user experiences are. But what we do need more of is people making a commitment to doing something meaningful and building a shared pool of common value.
And while we may not be able to build technologies that are equivalent in terms of meaning-making as, say, the importance of family or friendship or some important political commitment like fighting for justice, we should at least bloody well try. Technology may not give us another Nelson Mandela, but I’m sure with all the combined talent I see at hack days and BarCamps and so on, we can do something far more meaningful than Google Maps hacks and designing delightful user experiences in order to sell more blue jeans or whatever the current equivalent of blue jeans is (smartphone apps?).
The sort of projects I try to get involved in have at least seeds of the sort of meaning-making I care about.
Take something like Open Plaques, where there are plenty of people who spend their weekends travelling the towns and cities in this country finding blue memorial plaques, photographing them and publishing those photos with a CC license and listing them in a collaborative database. No, you don’t get badges. You don’t get stickers and we don’t pop up a goofy icon on your Facebook wall when you’ve done twenty of them. But you do get the satisfaction of joining with a community of people who are directed towards a shared meaningful goal. You can take away this lovely, accurate database of free information, free data, free knowledge, whatever you want to call it. All beautifully illustrated by volunteers. No gamification or fancy user experience design will replicate the feeling of being part of a welcoming community who are driven by the desire to build something useful and meaningful without a profit motive.
The same is true with things like Wikipedia and Wikimedia Commons. Ten, fifteen years ago, if you were carrying around a camera in your backpack, it was probably to take tourist snaps or drunken photos on hen nights. Today, you are carrying around a device which lets you document the world publicly and collaboratively. A while back I heard Jimmy Wales discussing what makes Wikipedia work and he said he rejected the term ‘crowdsourcing’ because the people who write Wikipedia aren’t a ‘crowd’ of people whose role is to be a source of material for Wikipedia: they are all individual people with families and friends and aspirations and ideas, and writing for Wikipedia was a part of that. As Wales put it: they aren’t a crowd, they are just lots of really sweet people.
What could potentially lead us into more meaning-making rather than experience-seeking is the cognitive surplus that Clay Shirky refers to. The possibilities present in getting people to stop watching TV and to start doing something meaningful are far more exciting to me than any amount of gamification or user experience masturbation, but I suspect that’s because I’m not a designer. I can see how designers would get very excited about gamification because it means they get to design radically new stuff. They get to crack open the workplace, rip out horrible management systems and replace them with video games. Again, not interested. The majority of things which they think need to be gamified either shouldn’t be, because they would lose something important in the process, or they are so dumb to start with that they need to be destroyed, not gamified. The answer to stupid management shit at big companies isn’t to turn it into a game, it’s to stop it altogether and replace the management structure with something significantly less pathological.
Similarly, I listen to all these people talking about social media. Initially it sounded pretty interesting: there was this democratic process waiting in the wings that was going to swoop in and make the world more transparent and democratic and give us the odd free handjob too. Now, five years down the line, all we seem to be talking about is brands and how they can leverage social media and all that. Not at all interested. I couldn’t give a shit what the Internet is going to do to L’Oreal or Snickers or Sony or Kleenex or The Gap. They aren’t people. They don’t seek meaning, they seek to sell more blue jeans or whatever. I give far more of a shit what the Internet is doing for the gay kid in Iran or the geeky kid in rural Nebraska or a homeless guy blogging from the local library than what it is doing for some advertising agency douchebag on Madison Avenue.
One important tool in the box of meaning-making is consensus-based decision making and collaboration. There’s a reason it has been difficult for projects like Ubuntu to improve the user experience of Linux. There’s a reason why editing Wikipedia requires you to know a rather strange wiki syntax (and a whole load of strange social conventions and policies - you know, when you post something and someone reverts it with the message “WP:V WP:NPOV WP:N WP:SPS!”, that’s a sort of magic code for “you don’t understand Wikipedia yet!” See WP:WTF…). The reason is that those things, however sucky they are, are the result of communities coming together and building consensus through collaboration. The result may be suboptimal, but that’s just the way it is.
Without any gamification, there are thousands of people across the world who have stepped up to do something that has some meaning: build an operating system that they can give away for free. Write an encyclopedia they can give away for free. All the gamification and fancy user experience design in the world won’t find you people who are willing to take up a second job’s worth of work to get involved in meaningful community projects. On Wikipedia, I see people who stay up for hours and hours reverting vandalism and helping complete strangers with no thought of remuneration.
It may seem corny, and it’s certainly not nearly as big an ethical commitment as the sort Kierkegaard envisioned, but this kind of commitment is something I think we should strive towards, and help others strive towards too. And I think it is completely at odds with gamification, which basically seeks to turn us all into cogs in some kind of bizarre Skinner-style experiment. We hit the button not because we are getting something meaningful out of it, but because we get the occasional brain tickle of a badge or get to climb up the leaderboard or we get seventeen ‘likes’ or RTs or whatever. Gamification seems to be about turning these sometimes useful participation techniques into an end in themselves.
Plenty of the things which make meaning-making projects great are things any good user experience designer would immediately pick up and grumble about and want to design away. Again, contributing to the Linux kernel is hard work. Wikipedia has that weird-ass syntax and all those wacky policy abbreviations. Said UX designer will really moan about these and come up with elaborate schemes to get rid of them. And said communities of meaning will listen politely. And carry on regardless. Grandma will still have a difficult time editing Wikipedia.
When I listen to user experience designers, I can definitely sympathise with what they are trying to do: the world is broken in some fundamental ways, and it is certainly a good thing there are people out there trying to fix that. But some of them go way too far and think that something like “delight” or that “eyes lighting up” moment is the most important thing. If that were all technology is about, we could do it a lot more easily by just hooking people up to some kind of dopamine machine: technology could give us all our very own Nozickian experience machine and let us live the rest of our lives tripped out on pleasure drugs. I read an article a while back that reduced business management to basically working out how to give employees dopamine hits. Never mind their desire for self-actualization, never mind doing something meaningful. Never mind that the vast majority of people opt for reality, warts and all, over Nozick’s experience machine: the real world has meaning.
The failure of meaning-making communities to value user experience will seem pretty bloody annoying, if only to designers. There are downsides to this. It sucks that grandma can’t edit Wikipedia. It sucks that Linux still has a learning curve. Meaning-making requires commitment. It can be hard work. It won’t be a super-duper, beautiful, delightful user experience. It’ll have rough edges. But that’s real life.
A meaningful life is not a beautiful user experience. A meaningful life is lived by persons, not users. And the positive side of that is that persons are engaged, meaning-seeking, real human beings, rather than users seeking delightful experiences.
That’s the choice we need to make: are technologists and designers here to enable people to do meaningful things in their lives in community with their fellow human beings or are they here as an elaborate dopamine delivery system, basically drug dealers for users? If it is the latter, I’m really not interested. We should embrace the former: because although it is rough and ready, there’s something much more noble about helping our fellow humans do something meaningful than simply seeing them as characters in a video game.
This is one thing I disagree with Kierkegaard on very strongly. But not for any high-falutin’ existentialist reasons. I just don’t believe in God, and more importantly, I don’t believe in the possibility of the teleological suspension of the ethical, which makes the step to the religious stage of existence rather harder! I’m not even sure I’m in the ethical. It could all be a trick of my mind, to make me feel like I’m some kind of super-refined aesthete. Or it could be rank hypocrisy. But one important thing to note here is that the aesthetic, ethical and religious stages or spheres of existence are, for Kierkegaard, internal states. The analogies he uses don’t necessarily map onto the spheres. So, you don’t have to be the dandy-about-town, seducing women and checking into Foursquare afterwards, to be in the aesthetic. If you are married, that doesn’t mean you are in the ethical stage. Nor does being overtly religious or, rather, pious, mean you are in the religious stage. Indeed, the whole point of Kierkegaard’s final writings, translated into English as the Attack Upon Christendom, is that Danish Lutheranism was outwardly religious but not inwardly religious in any true sense. ↩
Everyone is going on about how they are making “HTML 5 sites” and going on and on about how HTML 5 is giving them a hard-on or something equally exciting.
So, I’ll show you how you join this amazing club.
Open up your text editor and find some HTML file or template.
Look for the bit right at the top. It is called a DOCTYPE. It’ll look something like this:1
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML+RDFa 1.0//EN" "http://www.w3.org/MarkUp/DTD/xhtml-rdfa-1.dtd">
Now, delete all that and replace it with:
<!DOCTYPE html>
Save the file and push it out onto the web.
Congratulations, you are now using HTML 5.
Give yourself a big pat on the back. Listen to some cutting edge spacey techno or something. ‘Cos you are living in the future, man. Your techno-halo is so bright, I need to put on two pairs of shades indoors.
You are now officially signed up in the fight against SilverFlash and a minion in Steve Jobs’ campaign for the open web or something. (Because embrace-and-extend is so much nicer when it comes from Apple and Google than when it comes from Microsoft and Adobe.)
You can also go to your boss and justify a huge champagne and coke-fuelled party with hookers and everything because you are now fully buzzword compliant. You can get venture capitalists and TechCrunch and other people who wouldn’t know a DTD from an STD2 to give your huge, manly testicles a thorough tonguebath – sadly, only rhetorically – because you are smart and hip enough to be using HTML 5. Pow! Bam! Shazam! You are like a cross between Nathan Barley and Rambo!
Or, you know, you could actually learn what HTML 5 is. Let me give you a clue: it is quite a lot like HTML 4. That’s part of the philosophy of the damn thing: it is continuous with what you are already doing rather than a radical shift! It is that cliché: evolution not revolution. It’s like the difference between OS X Leopard and Snow Leopard.
Once you realise this important truth, you can drop the buzzwords, and just quietly educate yourself on some of the quite nifty new things you get to do on the web, get your rather excitable colleagues to calm down before they faint in pre-orgasmic excitement, and maybe try and nudge the community at large into realising that HTML 5 is a few new bits and bobs being added to HTML, not some hybrid of Jesus and Vannevar Bush riding down on a pterodactyl/unicorn hybrid giving out ultratight Fleshlights to anyone who slings angle-brackets so they can prepare for the giant fight between HTML 5, evil browser plugins and mobile app stores.3
You can adopt HTML 5 really quite slowly: if your site sucks now, making it “HTML 5” won’t make it not suck. Even better, don’t start with HTML 5 at all. Start with CSS 3: the nature of CSS means it is much easier to fiddle with a stylesheet and add a few things like media queries and so on.
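As a sketch of how small that first step can be, a media query really is just a block bolted onto an existing stylesheet (the selector and breakpoint here are made up for illustration):

```css
/* Existing rule: the sidebar floats alongside the main content. */
.sidebar {
  float: right;
  width: 30%;
}

/* CSS 3 media query: on narrow screens, drop the float and
   let the sidebar flow below the content instead. */
@media screen and (max-width: 480px) {
  .sidebar {
    float: none;
    width: 100%;
  }
}
```

Browsers that don’t understand @media blocks simply skip them, so the existing layout keeps working everywhere - which is exactly the evolution-not-revolution point.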
Be patient and don’t rush into this. Include only technologies that improve your site and the experience of using it. Not because some fucking bullshit web design blog you found on Reddit is jabbering on about how it is the most awesomest thing ever invented since someone discovered you could have sex while eating sliced bread or some other crap like that. It’s not. It’s an evolutionary step from existing HTML on the web that gives you a few shiny new things that might make life easier.
Now calm down. I’ve just washed my clothes and I don’t want you jizzing all over them when you discover the joys of the section element.
Yours will be much more boring. It won’t have cool shit like RDFa in it because you suck. ↩
To be fair, DTDs and STDs share a scary resemblance in lots of ways. You can prevent the transmission of DTDs by adopting RELAX NG for all your XML schema validation needs. ↩
Again, the whole native vs. web thing is fucking stupid. The only reason it is happening is because people seem to think that everything needs to be an app. You know, if the thing is more like a web page, you put it on the web. If it is more like a desktop application, you put it in an app. Content? Web. Functionality? App. This also resolves all the stupid nonsense about app store approvals. Why have we reached a situation where people are putting content in an app? You know, people are downloading blobs of compiled Objective-C object code that contain satirical political cartoons. Then they are complaining when Apple ban the ‘app’. What the fuck is that all about? Put that shit on the web. Apple can do what they want to apps, but why let them tell you what you can put in your content? Let them approve functionality, not content.
There was a time many moons ago when you had to download a Windows application – actually, you had to have a Windows application sent to you on a CD-ROM – in order to order groceries from Tesco. This is the app world we live in today, and it is totally idiotic. Apps are things like Vim or Firefox or Unreal Tournament 2004 or iTunes or The Gimp or Final Cut Pro. If you wouldn’t download a Windows or Mac app to read Wired Magazine, why are you downloading a damn iOS app?
What is so stupid about this is that while Apple and Android and whatnot train everyone up into using app stores, the reaction of plenty of people in the open source community is: don’t worry, the web will do it. (Or worse: we’ll make an open web app store!) But it’s bullshit. The web is a pretty poor application platform. I mean, it is okay in a pinch, but I’m not betting on a decent Ajax version of Vim, Half-Life 2 or Adobe Illustrator any day soon. And why would I want to use Google Docs when I’ve got thirty years of hard work by Donald Knuth and Leslie Lamport sitting there ready to churn out absolutely awesome pixel-perfect print documents from my damn command line? Plain text, Vim and Git (or Emacs and Mercurial or some other combination thereof) will beat the socks off whatever cloud vapour is out there.
Here’s an idea that came to me after reading about all the different teams and infrastructure on the Ubuntu Wiki:
Read My Docs
A web service to connect people willing to proof-read and provide constructive feedback on open source software documentation. Projects could post announcements of newly written or radically revised documentation including manuals, tutorials, guides, man pages, READMEs etc. Each announcement would have a comment page, and links back to issue trackers and/or version control systems so that contributors could submit back either comments, issues straight into the issue tracker or even diffs/git pull requests/hg patch queue requests etc. (this is made easier by the use of distributed version control systems).
Each announcement would be marked up to specify the intended audience of the documentation: end user, developer, expert, absolute newbie etc. In addition, if announcements were specific to particular language or technology communities, these could be noted using tags so that one could drill down and find those things.
Language communities where packages are released through a central repository such as RubyGems, PyPI, Haskell’s Hackage etc. could have newly released packages automatically announced to the site so that developers with a specific interest could keep an eye on the quality of documentation and improve it.
A community could perhaps build around such a site and they could build up good practices that could be documented, leading to a positive spiral of better and better documentation. Some kind of intelligent game mechanic could potentially be applied so that instead of people rushing around cities checking into venues on Foursquare, they would get goofy badges and points and mayorships and leaderboards and so on for doing something useful like writing better documentation.
The end result? A small army of documentation fairies who would improve open source documentation across a wide range of projects, languages, communities and Linux distributions without having to join any of those communities. And hopefully a fun way for people who aren’t programmers to ease the documentation burden from the people who’d much rather be writing code.
I’ve never understood people who deem skills obsolete long before they actually are. A while back, someone started a wiki listing obsolete skills. Some skills truly are obsolete. But most of the time, just as with technology, when someone says a particular skill is obsolete, there’s usually a pretty good chance that you will still need to do it. Programming in FORTH may be less commercially useful these days than programming in Java or Python, but the programmer who knows FORTH has a valuable skill even if he doesn’t find himself using it very often. (I’ve got an RPN calculator app installed on my iPod touch…)
Being able to use a typewriter is one of those things. Everyone tells me that typewriters became obsolete in the 1970s. Strange. I was born in the mid-80s and still remember that in 1995, I was using a manual typewriter for a school report even though we had a PC. And the skills I learned using a manual typewriter – namely, the ability to touch-type – are pretty useful now I’m chucking around Ruby code in Vim or academic citations in LyX.1
I’m used to people telling me that typewriters are obsolete technology, and skills like being able to change dot-matrix printer ribbons or operating a rotary phone are now obsolete. But I never expected handwriting would ever end up in the same category.
Most schools still include conventional handwriting instruction in their primary-grade curriculum, but today that amounts to just over an hour a week, according to Zaner-Bloser Inc., one of the nation’s largest handwriting-curriculum publishers. Even at institutions that make it a strong priority, such as the private Brearley School in New York City, “some parents say, ‘I can’t believe you are wasting a minute on this,’” says Linda Boldt, the school’s head of learning skills.
Parents are finding it strange that schools are spending time teaching children how to actually write? What the fuck is that all about?
I’ve got an iPad. I’ve had a Palm Pilot in the past. I’ve got a laptop. And I still write by hand a hell of a lot. Why? Because it is fast. And it is especially quick if you want to jot down something other than linear text. If you want to draw a diagram, doing it in a notepad is a hell of a lot less painful than doing it on almost any computing device I’ve ever used.
This is partly my objection to most smartphones: when I use a laptop or desktop computer, my fingers can keep up with my brain. I can just about do similarly when I’m writing longhand. If I’m using an iPhone/iPod touch or a Blackberry or whatever, it takes a bloody long time to take notes. I guess if you can’t touch type on a full-size keyboard, being forced to use a smudgy little iPhone keyboard isn’t that big of a step down.
So, imagine, we don’t teach kids how to write by hand. How do they answer exams in school and in university? For many subjects, undergraduate and postgraduate, you still have to do pen-and-paper exams.
What if you want to become a reporter? If you are in Afghanistan reporting on the war, you may not have the chance to take an iPad with you.
What if you simply want to leave a note on the fridge reminding your family members or flatmates to buy some milk when they next go to the shops? Oh yeah, I’m just going to log in to my computer, tap it out in a word processor and send it to the damn laser printer, hoping that some idiot hasn’t left it without paper or toner? I like computers more than most, but even I’d just grab a post-it note in that situation. I like pen and paper for the same reason I prefer using Vim over Microsoft Office: it is simple, powerful and failure-resistant.
It is the old tale of the reluctant geeks again: while everyone else breathlessly adopts new technology very quickly, those of us who know the technologies in unhealthy amounts of detail are a fair bit more conservative about changing systems that work. And, well, handwriting actually works pretty well. The pundits have gotten equally breathless about e-books, but I expect I’ll still be renewing my library card to borrow physical books for the next decade or three.
As with e-books, there may come a time when technology has far outpaced the need for writing by hand. 2010 is not that time. The fact that we have to resort to neuroscience to justify teaching basic writing skills is absolutely pathetic.2
Bully for Walt Mossberg. The tech media really is getting rather egocentric...
I just read an article in The Guardian about "the future of reading" and it cited all the usual sources: Nicholas Carr and lots of neuroscience studies and blah blah. That is all fine and good, but there is something that really got up my nose about the article: the idea that there is some kind of technological bandage for people not reading as much or for as long as they "ought" to be.
The idea that the ADHD crowd with their seven hundred browser tabs open will suddenly all go out and buy Kindles or iPads and the problem will just be solved like that is a joke. I have both an iPad (yes, I gave in) and an e-ink based reader (like a Kindle but without Amazon's DRM layer). I still have a stack of unread books on my shelves and desk that would probably reach up to at least my thigh, and a few thousand pages of downloaded papers and articles, manuals, documentation and miscellanea across Papers, Stanza, GoodReader, iBooks and on the micro SD card for my e-reader.
New technology doesn't add more hours to the day. Nor does it change the working patterns of the world around us. It doesn't make a 300 page tome any less lengthy (although it does let me store more books than I could ever read on a card the size of one of my smaller fingernails). It doesn't make the email I get any less idiotic. It doesn't make Twitter or Reddit or Google Reader any less seductive. Oh yeah, all three of those are on the damn iPad, along with a stash of games and music and videos. What exactly is going to entice me to tap GoodReader rather than reading Twitter or playing Sniper Strike again?
And these technologies aren't going to just magically turn people into closer and more careful critical readers. I can see it on Oprah: before getting my Kindle, I was an unliterary fool who just about managed to get through an article in the gossip rags about the latest Big Brother winner, and now I'm the next D. F. Strauss. No, we are kidding ourselves.
Neither the Internet nor any particular technology will save our souls or our attention spans. Nor will they finance the news industry or save the music industry from impending failure. Anyone who is betting on the iPad solving all of society's woes (or indeed saving their own behinds) is just deluding themselves. Technobandages may exist for some things, but there isn't a special technobandage that will fix any perceived shortfall in literary reading.
And, you know what, it isn't all bad news. When the U.S. National Endowment for the Arts released the 'Reading At Risk' report back in 2004, it was widely reported. I was going to cite it here but - you know what? It is out of date! I didn't see anywhere near as much press reaction when the NEA released the latest version of the report, 'Reading on the Rise', in 2009. It reported that the number of adults in the U.S. who read literature increased by seven percent between 2002 and 2008 (climbing back up towards the 1992 level), and that the largest increase by age group was among 18-24 year olds - you know, the group that always gets branded rather negatively as the 'Facebook generation', despite the fact that they were probably up in their bedrooms reading large quantities of written material online while their parents were downstairs in the living room watching some crap TV programme. In fact, the only group to have dropped in the survey - though not by a statistically significant amount - is the 45-54 age group.
Interestingly, the report also notes that "most online readers also report reading books". The Internet isn't crowding out book reading: "For adults who read online articles, essays, or blogs, the book-reading rate is 77 percent". This is no surprise: television has fucked up far more people's lives than the Internet ever could. Worrying about the Internet rather than television is a little bit like worrying about the trace amounts of carcinogens in your coffee when you smoke forty a day.
Was this rise in American literary reading all because of the Kindle or the iPad? Well, it can't be because of the iPad because it wasn't even out when the surveys were being done. The Kindle and other similar e-ink devices? Maybe. But I think it is much more likely that other factors than technology have been the cause. I'm sure us techno-types would like to think that we've brought about some kind of literary renaissance, but - you never know - people may have actually made a positive change all by themselves without having to use the Kindle as some kind of literary nicotine patch.
Okay, first read Keep developers out of politics, please and then read Keep stereotypes of software developers out of politics, please.
I'd suggest, given the arc of the thread thus far, that it would be far more useful to keep the existing political class out of politics. That means lawyers, that means money-men and that sure as fuck means bankers. Software developers are ill-suited to politics? Perhaps I have too much self-interest in perpetuating this stereotype, but software developers are, by and large, the group of easily identifiable people I know least likely to fall into the 'greedy asshole' category.
And greedy assholes are the problem. To understand the problem with politics, don't look so much at ideology, look at greed. You want to understand the bailouts? Greed. Want to understand the lack of a decent regulatory framework around the financial sector that led to worldwide economic collapse? Greed. The response to Deepwater Horizon? Greed. You go to Westminster or Washington and you'll find fucktons of greedy assholes being fed bullshit ideological lies sweetened with bullshit pay-offs by other greedy assholes. And, if those people were in the IT industry, they'd be the suits. Or, these days, they'd be the assholes in suits who don't wear ties to look cool and trendy.
However crazy Richard Stallman gets, I'd much rather have him serving in US politics than some fucking Gartner analyst. He may be nutty and do embarrassing and impractical things, but the Richard Stallmans of this world are far more genuine and non-asshole-ish - even when they are being assholes! At least they are being assholes for a good cause, rather than just being assholes to further their own greedy self-interest. Richard Stallman is being an asshole so that you can have a free and open source copy of Emacs. The fucking legislature are being assholes so that they can get a handjob from some big business lobbyist in a vague and unfulfilled promise to bring jobs.
And, to be honest, anyone who has done web stuff knows about governance. If you are building social systems, you know what huge differences very small changes can bring. Write the copy slightly differently and you get a huge reduction in trolls. Think Slashdot's karma system. Think of the process of trying hard to build niche community sites that are filled with good content rather than assholes. While the social media people and their corporate overlords are happy to pull a Wordpress or phpBB thing off the shelf and stick it up, we know that sometimes you have to put some effort in if you want to get the reward out. Again, think Hacker News or Stack Overflow. The economics and political science crowd have discovered this kind of thing recently with all the behavioural economics "nudge" stuff.
Given Douglas Adams' very simple rule that the people who are best qualified to rule are the people least desiring of power, I think software developers are a pretty good fit. We've got a public relations man in No. 10 at the moment. Yes, David Cameron is formerly of the PR industry. Compare: software developers are paid to tell you the truth. And we try to not hesitate in that task. If you ask me what I think of a particular database or programming language, I'm not going to beat around the bush. David Cameron is a PR man. A man whose former profession is nothing more than the task of lying for money.
There is nothing personal or partisan in this about Cameron: how can I believe anything he says when his former profession is lying for money? How could I believe Blair with the omnipresent Alastair Campbell whispering in his ear? Our leaders are either turning into celebs - Schwarzenegger - or being surrounded by such a huge layer of PR bullshit as to insulate them from reality, and to free them from the petty demands of truth and reality. This is no grand or original observation: it is simply cold, hard fact. And what are the results? Crap. Government seems to think that issuing press releases is making policy and giving press conferences is implementing policy - while the actual policy itself is being written behind the scenes by lobbyists.
The governments of the Western world have brought us the Digital Economy Bill, the Digital Millennium Copyright Act, and crazy fucking libel laws that allow assholes to persecute our fellow geek brothers and sisters for attempting to bring scientific understanding to these idiots (the Singh v. British Chiropractic Association case). Our governments are ripping up funding for basic academic research and replacing it with bullshit like the Research Excellence Framework.
And rather than stopping to realise the craziness of all this, they seek to impose it on everyone through international bodies: through the IMF, through the WTO, WIPO, the UN, through bullshit trade agreements like ACTA.
Our governments have feathered their own nests in creating voting systems that are totally unfair, outdated and better suited to a time when only the lord of the manor could vote, not the servants or peasants around him. They've promised change alright - for decades. Do we believe them? Occasionally. We are then promptly disappointed.
Now we have governments who, to fix the problems caused by their asshole friends in the banking industry, are taking it out on the worst off in society.
My question is just this: where are you going to find more concern about this? At some hoity-toity analysts-and-VCs conference like Web 2.0 or LeWeb, or at a hackers conference? Just what have our industry's money-men done about this? Sweet fuck-all, it seems. Silicon Valley, and its offshoots, is filled with a rampant and totally fucking stupid technolibertarianism. They're all off in Ayn Rand la-la land, believing that if we can just privatise the roads, that'll sort everything out. Yeah, assholes. Yes, yes, you can go on about boring things like the Internet being created by the US government under DARPA, the public funding given to CERN, how much of all this is public infrastructure, and the key role that universities have been playing in fostering startup cultures (MIT, Stanford, Cambridge etc.). You'll just get back a load of Ron Paul propaganda and exhortations to read Atlas Shrugged.
But, come back down to earth. Software engineers seem to be much less assholish than that. There's a reason why we try and keep things like BarCamp free and low-cost. Because we're not all rich assholes. The Web 2.0 Summit costs four thousand dollars. BarCamps are free. The BarCamp crowd mostly aren't rich assholes. They're people who are just trying to do useful and fun things with the skills they've got.
What makes software development slightly different is that it is a relatively creative act - not the only creative act (some people, for some reason, seem to think that if we say building software is a creative act, we are saying that it is the only creative act; I do not know where this stereotype comes from, but people have made a lot of hay out of it), but a creative act nonetheless. Those who are involved in it get to spend time solving relatively interesting problems and are reasonably well paid for it. Admittedly, you have to spend all day in front of a computer. But you aren't spending most of that time answering e-mails from assholes.
I can't imagine why anyone in their right mind would want to exclude software developers from politics. They are genuinely some of the least assholeish people I know. Oh, wait, I do know. Certain existing powers would much rather have our government made up of easily-bribed, industry-lobbyist-fed assholes who will preserve their comfy status quo and bend over backwards the next time Goldman Sachs wants to run our economies into the ground and walk away with a giant fucking bailout.
Software developers: we may not be pretty or be dynamic and exciting speakers, but we try hard not to be assholes. That is a huge asset in a world run by assholes.
I hate to bang on about this, but it is important: while doing my Master's degree, I never had a single PowerPoint presentation. Lecturers made do with distinctly analogue technology: paper handouts, blackboards/whiteboards, vocal cords and books. Before that, I only had one lecturer who used PowerPoint. I am no great fan of PowerPoint. And I'm sceptical of most e-learning projects. I've written before on Learning Objects, and most e-learning projects seem to be driven by trying to push what is currently popular, fashionable or on the bleeding edge into the classroom, regardless of whether it will actually benefit learners. Most e-learning is technology for technology's sake. Digital whiteboards? Fine, except they are harder to read for people with vision problems. I remember once having to play a quiz using remote controls - very much like a gameshow. Except half the remote controls didn't work. It would have been a lot easier just to give out a scrap of paper, write the answers on it, then swap papers with the person next to you and mark each other's answers. Most technology in the classroom falls into two clear categories: pointless or really pointless. With all the people pushing technology as a magic fix for all that ails education, plenty of us geeks are a bit more reticent about it. For me, it is pretty much a necessary precondition for any useful learning on non-technical subjects to set my computer aside and read a damn book on the subject. In fact, it is often sufficient for technical stuff too - a few hours and a good O'Reilly manual is sometimes more than enough.
Which puts this story in The Argus, Brighton's local paper, that I saw yesterday into perspective. Davison Church of England High School for Girls, a secondary school in Worthing, is going to require pupils in year 9 (that is, 13-14 year olds) to purchase an Apple iPad as part of its new e-learning project, which will begin in September. This e-learning project is so well-tested that the school is requiring parents to spend three hundred pounds on a gadget that hasn't even been released yet.
The announcement and sale of early release Apple products has long been said to bring with it a "reality distortion field" - why, Apple's critics charge, would people be willing to purchase the iPod Shuffle, a device that until recently didn't even tell you what song was playing, without the presence of some kind of fanatical craziness reminiscent of the religious enthusiasms of revival-tent preachers? Apple customers are - the critics say - more like Scientologists or Objectivists or those goofy people who take Dan Brown novels or the rantings of Glenn Beck a bit too seriously - it is just a computer after all. As an Apple customer who has no end of problems with his hardware, and who would much rather be in the free world of Linux and GNU but is held back by the pragmatics of the world, I disagree with the idea that all Apple customers are driven by such brazen religiosity. Many of us have a slight aesthetic preference towards shiny objects but manage to keep it under control when faced with more practical concerns of everyday life - like not pissing money away recklessly on anything vaguely shiny and magical-sounding.
Surely, though, an e-learning project needs some thought, some testing. Before you require parents to go and spend a few hundred pounds on a gadget that will form an essential part of the curriculum, it might be useful to actually get your hands on one and test it out - see whether or not it does the job you intend for it. It is sad to see the reality distortion field extends beyond the fawning media, the obsessive blogosphere, the cheering and hollering convention centres and out to the humble C-of-E secondary schools in Worthing. This is very sad, but it also points to a lurking travesty: the fact that schools are absolutely failing to teach their charges that technology is a tool to empower them, to liberate them and to have fun. It is not about delivering a stream of learning objects with all the soul of a Chicken McNugget. Technology isn't just some gadget you use to coerce Facebook-addicted pupils into giving a shit about GCSE Geography or, worse, some fakakta Business Studies 'diploma' - basically what we used to call a GNVQ or a VCE rebadged for about the seventeenth time this decade to try and persuade people that there is actually parity between academic and vocational qualifications, even though there isn't. Schools love technology right up to the point where it turns their pupils into little hackers - because little hackers don't fit with the control implicit in the school ethos. They are the ones who moan about wanting to install Linux on everything and point out your inadequacy. They will generally take absolutely no shit and follow a course of enlightened absenteeism - if they aren't learning something useful, they may just stop turning up.
Schools should be teaching children to tinker, to hack, to throw code together hastily and make it do something cool. This is easier than ever these days thanks to virtualization. What happened to the idealism that gave us the BBC Micro? All swallowed up and replaced with proprietary, locked-in entertainment devices - and expensive ones at that. Imagine if you gave every kid a netbook instead and taught them Python. If you can't afford a netbook, give every kid a Xen instance. That will teach them more than all the e-learning initiatives you can brainstorm. If you want to do technology in the classroom, get rid of your bullshit ICT classes. Stop telling yourself that teaching people how to use Microsoft Word is teaching them an important life skill. Teaching kids to hack and tinker would give them a useful life skill, unlike an expensive, untested, "cutting edge" e-learning project that teaches them even more dependency on hardware and software vendors.
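To make that concrete: the sort of thing I mean is the five-minute hack a kid with a netbook and a Python prompt could throw together and immediately play with. This is just an illustrative sketch of my own, not anything from a curriculum - a toy Caesar cipher for passing "secret" notes:

```python
# A five-minute toy: a Caesar-cipher secret-message encoder.
# The kind of thing a kid can hack together, break, and improve.

def encode(message, shift=3):
    """Shift each letter through the alphabet, wrapping around at 'z'."""
    result = []
    for ch in message:
        if ch.isalpha():
            base = ord('a') if ch.islower() else ord('A')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation alone
    return "".join(result)

def decode(message, shift=3):
    """Undo encode() by shifting back the other way."""
    return encode(message, -shift)

if __name__ == "__main__":
    secret = encode("meet me after school")
    print(secret)          # the scrambled message
    print(decode(secret))  # back to the original
```

Twenty lines, no vendor lock-in, and it immediately invites the next questions: why is it so easy to crack? What would a better cipher look like? That is the tinkering instinct an iPad full of learning objects never provokes.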
I just wanted to repost my violent agreement with Jeff Atwood's post about netbooks. It responds to a post saying that netbooks are just poor smartphones. I disagree. Smartphones are just bad netbooks.
A smartphone has a keyboard: but it is a lame keyboard that you can barely use to write an e-mail, let alone a novel or a few hundred lines of C++.
A smartphone has the ability to make voice calls: netbooks have Skype, which is basically a legacy emulator for the phone layer. If you want to communicate, use e-mail or IM. They are like voice phones but significantly less annoying.
A smartphone can run any software you like: so long as it has been pre-approved by Apple or some shady cabal of profit-addicted mobile providers. A netbook lacks this significant limitation. Debian, Ubuntu, Fedora, Suse, Mandriva, OS X (Hackintosh), Windows - pick your poison. They all taste pretty good compared to selling your soul to the phone companies. A previous generation of hackers did everything they could to fight against the phone companies - Google 'phreaking' if you don't believe me - now what do we do? Let them choose what software we use.
You can read mail on your smartphone: but I get to use mutt on my netbook. This sucks significantly less than your mail client.
Your smartphone is multiuser in the sense that multiple people can use it. My netbook is multiuser in the sense that I can set up an account on it for family members to use, and they can't go poking through my e-mail. (Apparently, the iPad is 'multi-user' in that you can play drag-and-drop chess on it like you can with the Microsoft Surface. Great. Can you let other people use it without them being able to read your mail?)
Call me when your smartphone runs bash, vim and git. Call me when I can write my dissertation on your iPhone. In fact, don't call me. Write me out an e-mail on that teeny-weeny keyboard and I'll send it back and make fun of all the times your iPhone has turned 'reading' (as in books) into 'Reading' (as in Berkshire town, home of the annual rock festival and the UK home for all the big tech companies like Intel, Microsoft, Oracle et al.).
Netbooks are everything that makes computing awesome in a smaller package. I want to take one everywhere: someone needs to start making holsters for these babies.