Every status update since the dawn of Thomas


Tuesday, 16 November 2010

Yet Another Christmas - 5 Ghosts of Christmas, Past and Present

TV ad: “It’s that time of year when the Christmas decorations come out”

Un-named TV watching friend: “No it’s not, chap. Put them away.”

Yeah, I know, predictable. The Christmas-themed post – well, let’s get it out the way early. It may surprise regular readers that I do not consider myself a Scrooge: I, in theory, like Christmas – or at least I used to. Not so terribly long ago, in my mid-twenties, I was, in fact, known for my love of the festival (or at least resistance to others’ Yuletide whinging and cynicism) – to the extent that for a few years at the shop I worked in I became, by default, the guy who voluntarily got in, and put up, the ‘decs’. But I can’t deny a tendency over the years towards increasing apathy, the unresponsive grunt or the weary sigh as the season kicks in – oh, that shit again. Another f***ing Christmas. This now happens automatically, and then I catch myself doing it and feel a bit sad – how did it get this way? How did it come to this? Well, it’s not so odd for a childless singleton of my age – it’d be far odder for someone in my ‘demographic’ to be getting all excited about it. But still, my feelings about the seasonal season are complex, and there’s material to be mined here; so indulge me, if you will, in a slightly unwelcome stirring up of my Christmas past...

Ghost 1: The Loss of Faith

The first blow was, of course, when Father Christmas stopped existing. Christmas was a time of crisis-inducing excitement when I was a kid – it was by far the highlight of the year, and cast a feverish glow over the whole of Autumn and Winter. It wasn’t just the toy payload, although that should be mentioned – no adult gift will ever be as exciting as a Lego pirate ship. The only time I came near to this level of gift satisfaction as an adult is when I bought myself a guitar and amp-simulator/effects unit (with contribution from my parents as their gift) and waited until Christmas day to “open”/play with it. You’ve got to have something to play with... next year I think I might buy myself a Lego pirate ship.

No, it wasn’t just the toy payload, it was the magic. It was a time when the supernatural, the arcanely fantastic, actually, in reality, made itself tangible and touched your life – and was accepted by everyone as really, actually doing so. Supernatural goings on and fantasy beings from fantasy worlds were mainly consigned to books and TV, and I knew they were made up. Sure, there were also second-hand, supposedly true supernatural tales told by others – but at Christmas the magic rained down with a vengeance and happened to you, unambiguously, undeniably. And that was crisis-inducingly exciting – for a day-dreamy, imaginative (read: weird and arty) kid like me, it just grabbed my mind and took it far, far away.

So the year that my brother told me he had heard Father Christmas come into the room with our stockings, and then go and take a piss down the hall, was a little confusing. Would this magic being, un-encumbered by time, really be bound by such mundane, earthly functions? It was a little unglamorous, but I saw no reason why not. My brother (older than me) was more sceptical – he was suffering a Crisis of Faith (from which he never recovered – he was evidently Told shortly after this when he raised the issue with the folks).

I, however, held out for another year. The topic was much debated amongst my peers in class. Some claimed to have been Told by older siblings – whisper it – there is no Santa Claus. Some had tales of finding the contents of their Christmas stockings in their parents’ wardrobe. The evidence was mounting, but I was in firm denial. I was one of the most vocal amongst the embattled Faithful. I constructed all kinds of ‘logical’ arguments to pooh-pooh the doubters, to try to make out that it was simply impossible that my parents could do it all, that it surely didn’t make sense. When would they buy all that stuff? Where did they store it – I ain’t seen no gifts in no wardrobe, fool! (I clearly didn’t have much grasp of what adults were capable of).

So, when, after Christmas, my Mum sat me down with my brother and they Told me, I felt more than a little embarrassed. I clearly remember thinking at that point: Just goes to show that, if you want to believe something enough, you will find arguments for it in the face of all evidence. I became aware of the power of self-delusion, in myself and others, which I like to think is a pretty damn profound thought process for someone of that tender age. The prospects for me ever being uncritically religious after that were pretty slim (though my early-mid teens did see some ill-defined wishy-washy religious dabbling). The repercussions for ‘magic’ in the world in general were also not lost on me – the world suddenly looked significantly duller, greyer and more uncertain.

Ghost 2: Cheese Mind

In my sixth-form years I got into the habit of making pointless analytical comment on aspects of culture that people largely took for granted. Which I still do, chronically. In fact, I studied it at two universities and made a living out of teaching it for four years. It made me feel clever, and anyway, I couldn’t turn it off. I still can’t turn it off, which is sometimes a problem, and yes, it still does make me feel clever. Or alienated from my own society, one of the two.

Anyway, this was the point at which I started noticing that the paraphernalia of Christmas, all the motifs and symbols, were largely trotted out by people without any conscious thought whatsoever. All the guff – tinsel, bells, holly, trees, logs, santa hats, reindeer, stars, Merry Christmas and a Happy New Year – it was all on auto-pilot. Sure, people are dimly aware of where this cultural bric-a-brac comes from – well, some of it, anyway – but no one really cares. Walking down the street with a friend I remarked that the glittery gold banners saying “Merry Christmas” in multiple neighbours’ windows may as well say “Cheese Mind” for all the thought that had gone into the greeting.

I had a point – the point is that the words “Merry Christmas” are not really words any more, just a sound, a polite seasonal code-utterance, a petrified shorthand signifier, which could be anything at all if enough people agreed to use it. Hence “Cheese Mind”. Why must Christmas always be Merry? What stops us using any other perfectly apt adjective? The fact of the greeting’s unchanging form shows that no real, personal consideration has gone into it at all. Ok, of course repetitive utterances like “Morning”, “Thankyou” or “Goodnight” are standardised for practical use – but actually, it is the same with most Christmas symbolism.

All of the common motifs you see around at Christmas will have their roots in some kind of folk tradition or practice, but no one feels they have to be clued up on this before they use them. It’s Christmas innit. Put a tree in your lounge. Put bells in your pop song. Something to do with Jesus in a manger and shepherds and that. This is most salient every time a Hollywood film tries to meddle with Christmas folklore – every new generation takes on the previous one’s traditions and symbology in an utterly superficial fashion, and then proceeds to ‘update’ them for ‘modern times’. The result is that you end up with a massively surreal mishmash of nonsensical, ill-fitting elements that they then have to try to force some kind of sense into. I mean, have you seen Santa Claus the Movie? The Santa Clause? Jingle All The Way? I mean – as an adult?

But this doesn’t just happen in films – it happens in society in general. When we look at other cultures’ rituals and traditions they often look utterly baffling, alien, and relentlessly oddball, and you wonder how they ever came up with that – well, Christmas is a case in point. Just think about it for a second – how would you explain it, in all its glorious absurd detail, to an alien (“So, tell me again, hu-man, where exactly does the flying fat man with the red and white arctic wear and the little people come into the equation? Hu-man?”).

The symbols used, see, over time and generations, become totally divorced from their origins, and just become groundless signifiers, spat out every year like festive Tourette’s. The map remains, but the territory has eroded away. As a yoof, “Cheese Mind” was simply a funny observation; but today, ill-informed and unthinking reflex touting of the thousand Christmas clichés you are bombarded with every year actually, actively rankles me – not a lot, just a kind of low-level abrasive irritant. For f***’s sake, if you’re going to spray me with trite Christmas cliché, at least put some thought into it. At least get it right. It’s the same with Easter and it’s the same with vampire films – don’t get me started on vampire films.

Ghost 3: Don’t Look Back

But I still loved Christmas, in general. For a while, in my early twenties, I found new ‘magic’ in it by genning-up on the history and folklore of Christmas every Christmas time. Today’s traditions are an impossibly complex meshing of multiple strands – whilst nominally a Christian festival, with the Nativity, St. Nicholas and Good-Will to All Men, there is a heady, intoxicating cocktail of pagan Mid-Winter traditions that infuse the whole thing. Despite the St. Nicholas story, the figure of Santa Claus is clearly heavily derived from pagan folklore, a kind of wild-man-in-furs Green-Man/Dionysus nature spirit. All the nature stuff – gifts under trees, the holly and the ivy – is all obviously north-European, and at some point Christmas just merges with chilly north-European folklore in a fascinating way. Dark, icy north-European folklore, full of snow and forests and goblins. It’s all very Wicker Man – it’s Moominesque, but gorier, and I love it.

This worked for a while, but I still couldn’t help noticing a recurring malaise – pretty much every year began to seem like an anti-climax. You would catch yourself thinking that you just weren’t feeling as Christmassy as usual – as you should be – for the time of year. You would get frustrated at not being able to conjure more Christmas spirit with the usual devices that used to fill you with it. And then you would feel guilty about it. Why am I not Christmassy? Everyone else is. What a party pooper I am. The whole thing was just a little bit disappointing and – yes, frustrating; not because you were being a Scrooge, but because you were trying not to be – to recapture something just beyond your reach, that you expected to be on-tap.

Of course, it’s age, but we can be more specific: It’s not so much that I was too serious and responsible and adult for Christmas now, it’s that things had simply changed too much – a friend of mine hit the nail on the head when he pointed out that it’s about connection with childhood. His girlfriend still feels Christmas acutely, but she currently lives with her parents in the same house that she did as a child, and they largely still adhere to the same routine.

For me those early childhood Christmases were so perfectly formed that they set the template rigidly – being a young family they were always spent at home with just us – my mother, father and brother, totally safe, contained, predictable, a hermetically sealed idyll. But the older you get, the more circumstances change, and every small upheaval and alteration takes you further away from recapturing the feelings you had in that idyll – never mind the more massive upheavals that come later as an adult. By your mid-late twenties there is very little left of that connection at all – but that connection is what you need to kick-start the Christmassy feeling first learnt in the childhood idyll, so, if you’re a soft nostalgic type like me, you scrabble after what strands you can find, but it’s frustrating and never quite does it. Time to give up – accept it, Christmas doesn’t mean the same thing now. Appreciate it for what it is today – don’t look back.

Ghost 4: Bad Christmases

So maybe my enthusiasm for the festive season was on the decline – but was it terminal? Who knows? Because what really ingrained today’s state of seasonal apathy was the one-two punch of 2007 and 2008: a right pair of Bad Christmases.

I won’t go into the gruesome detail, but 2007 saw a tragedy in the extended family that indirectly precipitated an all-out family feud, complete with ensuing logistical crises. It started, like the detonation of an unexpected H-bomb, at table during Christmas dinner, on Christmas Day afternoon.

By Christmas 2008 that state-of-affairs had stabilised, but there was more threat of tragedy (unrelated) in my immediate family, that I’m infinitely thankful eventually turned out ok – but at that time extreme worry and heaviness hung over us all. I also became aware, as Christmas drew nearer, that the last year had actually, in some kind of irrational Pavlovian response, made me apprehensive rather than excited about it. In a way, I was right to be – something, as it turned out, was brewing. There was about to be a thunderous fall-out amongst my close friends, that I was unwittingly the catalyst for. For reasons I won’t go into here, Christmas Eve had been upsetting and set up an explosion of resentment and animosity amongst friends on New Year’s Eve that saw me start 2009 in a crippled ball of jangled nerves. Like a kicked puppy that’s accidentally pooed on the carpet, I was, all at once, a little guilty at having pooed, very worried for whoever had stood in it, but also mortified and uncomprehending at the screaming it had set off - and the severity of the kicking I’d just taken.

Neither 2007 nor 2008 was a silly short-lived tantrum – they were complicated, deep-running, circumstance-changing divisions – whose repercussions are, to some extent, still felt today. There is nothing quite like the weight of mortality, combined with the hunted-animal feeling of dancing over yawning rifts that are opening up between your nearest and dearest, to make the tinsel sag and the carols ring hollow in the ears. Such immediate pressures and strains tend to force one out of Christmas mode – it all looks a little unimportant, and it’s hard to pay the festivities much attention when the harshness of the real world comes a-knocking. If I was underwhelmed by Christmas before, now I wasn’t ‘whelmed’ at all.

So in 2009 I decided to expect nothing. I wouldn’t feel like I had to feel Christmassy, and I’d make no apology for shrugging my shoulders at it. As it turned out I was rewarded with a thoroughly pleasant, warm, happy, low-key Christmas – nothing massively exciting, just... nice. There was even an impromptu staff gathering and guitar-led sing-along at work, the first time anything of the sort had been done (Dunkirk spirit – work was a mess at the time). And New Year at the local was a very sociable blast. All in all a satisfying Yuletide – so again, this year, I will expect nothing, force nothing. A lesson learnt.

Ghost 5: Not Christmas’ Marketing Demographic

“Christmas is about the children” people would say, and I would shake my head at it as another example of the serious, sensible platitudes thrown out by responsible middle-aged types who had lost all warmth, joy and humanity from their serious, sensible souls. Terminally cynical, I’d think – Christmas is about community, celebration of life, a time to relax and indulge and be grateful for what you’ve got and to the people around you. It’s not just a glorified kids party, a bit of glitter and fairytale and toys for the spoilt offspring – it’s a cosy, communal, hallowed tradition, it’s for everyone.

But they were right, at least in some respects. Because (as stated in Ghost 3), the further away my own childhood gets, the less it means, and I’d be a fool not to notice that the people my own age to whom Christmas really matters – even though they may gripe about it – are the people with children. To them it is a living tradition, vital and necessary once more. What has gone wrong here is that the cycle for me should have been renewed – at my age I should be settling down and producing squawking pink little people in order to take charge and do that Christmas shit with them. But I haven’t, so my connection with Christmassy-ness continues to dwindle into near insignificance. I’m far too old for my parents to be lavishing soft Christmassy nonsense on me – it’d just be silly and embarrassing – but I have no one myself to lavish soft Christmassy nonsense on either. So there is no soft Christmassy nonsense anymore – except for the clichéd impersonal stuff I’m unwelcomely bombarded with by the world in general.

If in any doubt about this diagnosis, I only need to look at any of the abundant Christmas propaganda. All of it – the ads, the TV, the movies, the events – all are aimed at families (young families), or very occasionally young couples (about to have families). The season is thus renowned for making lonely people feel their loneliness all the more acutely, and it’s no wonder. But even ignoring the loneliness factor (I can’t credibly strike that pose since it would be grossly unfair to my friends and family, who are just lovely and always around), it’s more simple alienation from Christmas. One day it just hits you: Christmas is not for me anymore. I am not Christmas’ marketing demographic.

What a sad man you are, Thomas

But I want to end on a positive note. I am still, generally, pro-Christmas – I will not become a full-blown griping Scrooge about it, because there is still something of the nostalgic, dreamy idealist in me: Yes, the unconscious, knee-jerk, ritual habits may annoy me as the same old nonsense is trotted out ad-nauseum; and the bandwagon-jumping marketing machine may make me grind my teeth – as it kicks in as soon as you get back off your summer holiday, desperately trying to squeeze some life into the usual tired clichés in an attempt to grab your attention (and your money).

But I will always have a soft spot for the atmosphere, the cosiness, the myth and folklore and flights of fancy – and the sheer, silly, glorious eldritch irrationality of it all. One over-riding goodness of the festive season is how it forces people to change their day-in-day-out patterns for a while, even if it’s in a ritualised way. Christmas is something going on, something large scale and communal, something happening – that everyone in the country (and many other countries), regardless of background or religious persuasion, cannot avoid being involved in, in some capacity. The Christmas-New Year period gears everything up to a feverish pitch of activity and then abruptly grinds everything briefly to a halt, before we are allowed to carry on again as normal – and I think this is a good thing. It calls an end to the year, marks the passing of time, brings things to a head, sweeps in a fresh phase – it changes things. It’s important to have these periodical events and shake-ups, and to mark them, lest life just plough on un-noticed, relentlessly uniform. So hooray for that. Oh, and the food’s good.

Gift suggestions for Thomas: Lego Pirate Ship

Wednesday, 20 October 2010

The Ties That Bind

Today I bought 5 ties, and I don’t know how I feel about that. I have a vague but deep-rooted suspicion of ties: Few garments pong as pungently of human social politics. More than any other element of men’s apparel, the tie is a symbolic statement of professionalism, respect and, yes, social conformity.

This suspicion dates back to student days. Considering ourselves quietly radical and bohemian (as students are wont to do), we would note the worryingly Freudian unconscious symbolism of that noxious bit of cloth (we were psychology students, after all) – a knotted noose around your neck, a kind of dog-lead that your masters hold you by, tethering you to work, in the grip of authority, at the beck and call of society - it’s even called a tie, we would mutter darkly – and gaze upon each other with wild surmise. But, for all our clever anti-establishment musings, we all knew, really, that we too would don one without hesitation at the first whiff of a job interview. Yeah, we’d say then, with a dismissive pragmatism, it’s just clothes isn’t it? Doesn’t really matter. At least it’s a step up from the servile, style-crime horrors of the full-blown, identity-stripping supermarket uniform.

I haven’t worn a tie to work for 6 years, and have only ever had two jobs requiring me to do so, both of which were short lived (temporary positions, I should stress – I didn’t get sacked for tie-protest or anything). Most of my working life I have not had to wear one, a state of affairs I’ve been more than happy with. One can scrub up reasonably smart without, thanks, if one wants to feel well-groomed and professionally stylish. I had consigned the accessory to “special occasion only” status, a piece of archaic formal dress-up kit that really doesn’t need to make an appearance outside of funerals, job interviews and black-tie functions. I viewed my tiny selection like the Queen’s Royal Carriages – things to wheel out once in a blue moon for a bit of glittery pomp and pageantry, but frankly a bit showy. I mean really, who needs that faff when getting dressed, on a daily basis?

I don’t want to be the hoon who states the obvious, but I will anyway: A tie is, aside from “style”, functionally pointless – it may have evolved from a useful garment but it is useful no more. It’s like the human appendix, but with a snazzy pattern on it. No one’s shirt is going to fall open embarrassingly without it. No one thinks “oh, it’s a bit cold out, I’d better grab a tie”. No one thinks “hmm, this collar just isn’t comfortable... I know, I’ll put on a tie with it, that’ll feel better”. A tie’s function is all psychological – it is all about the image it projects, and the effect that has on the viewer, as well as the wearer. As fashion it can make all kinds of statements (not always favourable), but the most common are, of course, respectability and power. This latter element is what we failed to recognise as right-on student beatnik wannabes – we were concerned with the lot of the lowly office drone, oppressed by his shirt-and-tie uniform. We did not consider the tie as a symbol of dominance, power, seriousness and success – the politician, the executive, the banker, the academic, the chairman of the board. This is not tie-as-dog-lead, this is tie-as-diamond-studded-man-jewellery. Like a uniform, a tie can be a social symbol of either authority or servility – but unlike a uniform it doesn’t necessarily tell you explicitly which it is. You need extra cues and context to tell that – is it part of a well groomed package, or a token effort? Is it worn with an expensive suit or a cheap nylon shirt? How does the wearer talk and hold themselves? Ultimately it’s a conformity thing rather than an obedience thing – whatever the person’s power or lack of it, they are playing the game – when faced with a tie you are faced with (at least a veneer of) establishment.

Whatever, my coolness of feeling towards ties is more than just a hangover from idealistic youth; today, more serious than this, is the simple matter of preferences... I’ve never particularly liked the look, never found the style alluring, particularly on myself, and never felt very comfortable with any clothing that restricts at the neck. And if you think this is a result of my early ideological objections, it’s not, necessarily – I feel exactly the same way about polo-neck sweaters (despite their beatnik associations). Or even T-shirts with small neck-holes. I am strictly an open-necked clothing motherf***er. It’s a style and taste thing, like my equally arbitrary preference for earth tones and dark hues. I don’t know where these things come from, except that they haven’t ever changed as far as I can remember, back into earliest childhood.

You understand though: The philosopher Wittgenstein never wore a tie, and this is a big deal: “He did not dine with the faculty, as the faculty in all its grandeur always dines in academic gowns, black shoes, and neck tie. Wittgenstein was forever tieless and wore a suede jacket that opened and closed with that marvelous invention: the zipper; and his shoes were brown.” (that’s from here). As far as “no-bullshit” heroes go, Wittgenstein is a humdinger, one of the few philosophers for me who consistently cuts through the nonsense and clearly and forcefully brings out the subtleties of how stuff really works, with a genius originality and attention to detail. And that guy didn’t wear a tie. His photos look surprisingly modern because of this – before I knew much about him I always presumed he was active in the ‘50s, ‘60s, or even ‘70s, because of this – but he died in 1951.

I know that argument is a fallacy – Wittgenstein’s genius and his failure to tie a half-Windsor in the morning are not necessarily related. On a similar but contrary note, you could say “but he was a successful Cambridge-educated genius, so he could get away with it.” Did I hear you say that? Tut. Shocking. That’s the same argument as “Jesus could work on a Sunday because he was Jesus” – as if he should have special treatment because he was special - it’s circular logic and completely misses the point that he was making a point with his behaviour.

Let’s put it another way – to paraphrase Socrates (kind of... well, not really, actually): Did Wittgenstein not have to wear a tie because he was a genius, or was he a genius because he didn’t wear a tie? Um. Well, ok, I’m not overly sure that one’s formal-wear habits fall within the Venn-diagram of genius (unless you’re in the business of being a fashion Deity), but one can’t help but hmm on whether the two things were related. Wittgenstein’s “no” to the neck-rag was not down to a militantly anti-establishment stance, or hipster politics; neither was it an affectation of cool, liberal, touchy-feely casualness, in the style of Tony Blair or David Cameron (the guy’s demeanour was pretty intense and severe). Much more likely, it was simply down to the fact that he was a relentlessly idiosyncratic man and absolutely stubborn in doing things his own way (as well as being continental – he was Viennese – and you know what those over-familiar continental types are like, eh? Eh?) He was also devoted to clawing away at the crusty structures of accumulated traditional-thinking bullshit, quite the modern thinker – he had no time for the endless study of, and deferment to, history. In short, the “I’ll pass on the ties, thanks” was a curious supplementary effect of the very same personality and outlook that made him such an original thinker.

If Wittgenstein demurred on the neck-apparel, that’s a good enough example for me to follow suit. Not out of idol worship so much as sympathy. His example, to me, is telling. In Wittgenstein’s day, for a Cambridge academic not to wear a tie was extremely unusual, and doubtless controversial. Not so these days – as a teacher one may be a respectable figure, but one is on the lower end of the scale of professional vocations (as opposed to, say, a doctor or a solicitor) and only around the fringes of academia; so a tie for me, then, was very much optional rather than a necessity – very much a style choice. But, once again entering the private sector workplace, it is not – it is a specified requirement. I’m going to have to bow to convention and get used to it – but, strangely, “style” may be the sugar coating to that bitter pill.

Or, more precisely, vanity. “I’m sure I can pick out some pretty cool ties.” I thought, as I sauntered into the men’s section to buy the things. “I’m sure I can rock ‘tie-wearer’s chic’ as well as the best of them.” How would I know what was up to date? Well, who knows? Go for what’ll go with your shirts and rely on the store not to sell hideously dated-looking items, I guess. If in doubt, go for classic style, classic style doesn’t date. Trust your instincts, Luke, use the force. The horror. I used to be as anti-fashion as I was anti-ties, and here I was vainly fussing over tie-trends. What happens, you see, is that you realise, as you get older, that you can actually look quite suave and dapper (in a low-key, endearingly rumpled way, you understand) in this get-up. It doesn’t look stiff, awkward or pretentious on you like it might have when you were 21. It looks ok. Hot damn it, let’s not hold back – it looks SEXY!!! Yes, sexy. Well maybe not sexy – I’ve still to be convinced that ties are quite that. But everyone likes to look good, no matter how no-nonsense and down-to-earth they think they are – if you can feel comfortable in it and make it work for you, then, hell, wear it.

My fashion aversion hadn’t deserted me altogether – I took one look at the skinny ties and instantly dismissed them with an unspoken “f*** off”. I feel cruel towards the actual garments for mentally projecting such outright prejudice towards them now; but there’s just something a little too self-consciously hip and affected about those pieces of apparel that I just can’t get over. However, I am pleased with my tie selection, have some quite alluring shirt combinations, and I’m actually looking forward to wearing them. At the same time I feel faintly like I may have betrayed my past and my principles... but I guess it’s just clothes isn’t it? Doesn’t really matter. It’s not like I’m Stalin or Nick Clegg or anything. And you know that as soon as I discover situations where I can legitimately get away with not wearing them, I will revert to type and not bother – and in the meantime, I may be tied-up, but the top button is resolutely undone. I’m a strictly open-necked clothing motherf***er, you understand.

Thursday, 23 September 2010

Type A and Type B

Student: I want to be Prime Minister!

Me (horrified): Why the hell would you want that? It’d be horrible...

I know, you’re appalled. Not the inspiring and motivational tone a teacher should take, that... but, man alive, really now, what a thankless job.

Some people’s main goal in life is success and achievement. And then there’s people like me: I’m pretty happy if everything is... yeah, kind of ok. If everything is just fine, thanks. If everything is ticking along nicely, then perfect. But don’t get me wrong, I’m not talking about stagnant apathy here – I’m talking about a state of happy, warm contentment, and that, as it turns out, is actually incredibly hard to achieve, and seemingly impossible to maintain; especially with all these “success and achievement” types messing stuff up with their obsessive narrow-focus drive and ambition, prodding me from my rest...

What this is, is a fundamental mismatch of priorities – what do you want out of life? Let’s call these different approaches to life Type A (achievement driven) and Type B (contentment driven): These approaches are pretty fundamental, core personality traits and possibly even rooted in your biology; certainly in your upbringing. Ok, it’s a sweeping generalisation – very few people are all one and none of the other, and if you’re like me you can have a fairly Type A approach to some things but a Type B approach to the rest, or flit between the two depending on what situation, mood – or mode – you are currently in. But it’s a kind of scale we can adopt, if you’ll humour me. Look out for it in the work place and you will find at least a couple of salient examples in your colleagues... and if they have to work closely together, this can cause friction and resentments. Type A and Type B people drive each other up the wall. Type A’s plans and schemes are constantly being undermined by Type B’s infuriatingly slow and unfocussed dithering – how can they be so laid-back when there’s stuff to be done? How can they be so accepting of compromise, and even failure, when with a bit more oomph and action we could be achieving this? Type Bs on the other hand are constantly being hassled, prodded and pushed by Type As, who are always messing up their contentment and peace of mind. Why are they so focussed on this stuff? Why is this stuff so important to them? Can’t they be content? Isn’t there more to life? Are they never going to be happy? Why can’t they just leave me alone?! Stop f***ing railroading me with your agenda! Ahem. Sorry, got a bit carried away, that was very un-Type B of me. The point is: Type As and Type Bs infuriate each other, because they simply can’t empathise. 
They’re unlikely to get along smoothly if they have to work on a project together – but actually, in fairness, they probably need each other: Type As to organise, drive and focus Type Bs, and Type Bs to persuade Type As to take a moment and see the bigger picture.

I have stolen this terminology from Meyer Friedman and “R.H.” Rosenman’s research into personality and stress in the ‘50s, ‘60s and ‘70s, where they identified what they called Type A as ambitious, competitive, aggressive, status-driven, time-conscious, controlling etc. – basically, Gordon “f***ing” Ramsay; and Type B as... well, basically, not Type A – patient, forgiving, non-competitive and “generally lacking an over-riding sense of urgency”. Which makes Type B people sound like Jesus rather than the slackers they probably are, but there you go. People who scored highly on Type A personality tests (and at interviews designed to provoke characteristic responses) were found to be massively more likely to suffer and die from coronary heart disease or similar stress-related conditions. Friedman and Rosenman concluded that, when it comes to stress, Type A = BAD, Type B = GOOD. Which, as someone who scores robustly (though not extremely) Type B on their personality questionnaire, seems at first glance to be cause for smugness – all is right with the world. However, as always, there are big problems with this glib assumption – never take psychology research at face value.

First of all, cause and effect is not clear – it could be the high stress that these people are under as part of their working life that is causing both the Type A behaviour and the heart disease; i.e. it is situational, and not a core personality trait driving the stress and therefore the CHD. Furthermore, since the “types” cover a whole range of characteristics, it is a bit rich to suggest everyone falls into one or the other, even with a “mixed Type AB” category – for example it is possible to be ambitious, but not aggressive; or forgiving, but still time-conscious; and it is just as likely that it’s just one characteristic such as hostility/aggression, and not a whole sweeping personality “type”, that leads to poor handling of stress. So much for the common criticisms; but I’m also slightly suspicious about Friedman and Rosenman’s conclusive assertions – that, for example, Type Bs are in reality just as successful as Type As, so Type B is by far the best way to be. It should be clear – by Type B they do not mean lazy, useless and apathetic idlers, they simply mean calm, positive-thinking, well-adjusted, patient and co-operative, multi-functional types who know how to get that work/life balance right – but these people are extremely rare, and lazy apathetic slackers will surely score highly as Type B also.

Even more damning is a look at rival personality/stress models, such as Suzanne C. Kobasa’s “Hardiness scale” – which suggests some personality traits make you more resistant to stress (or “Hardier”). This is based around the “3 C’s”: Commitment – which means you get actively involved with things and see projects through to the end; Challenge – which means you see problems as a positive challenge rather than an insurmountable obstacle; and Control – which means you see yourself as in control of your fate and the situations you find yourself in. If you lack these, you won’t deal so well with stress – you will tend to get stressed more easily. According to Friedman and Rosenman I am Type B, so I deal well with stress. According to Kobasa I am not particularly “Hardy”, so I don’t. I don’t score very highly on the “3 C’s” – I’m reasonable on Commitment, not brilliant, but not terrible. I am traditionally a stand-back-and-observe type, but less so now I’m older and more confident... I’m reasonable on Challenge, despite my sneery cynic pose and tendency to displays of depressive histrionics – I’m open-minded, and I will generally be up for sorting stuff out and giving things a go in a reasonably positive fashion... It’s on Control that I fall down badly, not because I always feel out of control in situations, or unable to handle myself – I don’t – but because, as an avowed realist, I am quite convinced (from both intelligent reasoning and undeniable, slightly bitter experience) that, actually, a hell of a lot of stuff that life throws at you is out of your control. You can only make the best of what you’ve been given, and greater forces than you rage about you constantly, puking up unpredictable and immovable curveballs that you must constantly dodge or climb over. Ok, the more savvy and proactive you are the more you can affect outcomes and transform situations, I understand that, I really do – but plenty of things simply will not be predicted or will not be changed.
We are not Gods. There is no “karma” and you don’t make your own “luck” at all – luck is a fickle f***er. Going about insisting you are always in control seems to me pure delusion – a happy, motivating, stress-busting delusion, maybe, but a delusion that sticks its fingers in its ears and goes la-la-la in the face of the facts, nonetheless.

Whatever, Kobasa’s model seems to contradict Friedman and Rosenman. The “3 C’s” are traits much more likely to be shown by driven, Type A “doers” than laid-back, go-with-the-flow Bs. So is this Type A and B stuff bollocks? Well, I suppose we can say it’s an over-simplification, though intuitively there does seem to be something in it (at the risk of damning it with faint praise and consigning it to pop-psychology). Certainly some people clearly are those types (Gordon “f***ing” Ramsay), and the characteristics listed would seem to tend to clump together. Incidentally, other types have been added – Type C, who tend to bottle stress up and not deal with it, and may appear fine on the surface but are all worried and knotted up inside. This allegedly leads to a high incidence of cancer in these types, though this is far from proven; and Type D, who typically respond to stress by getting depressed (well, alright, if we’re talking purely about stress response, that is quite clearly me – my characteristic urge when it all gets too much is to simply gnash my teeth in woe, hide away, and sleep. Stress-leads-to-anger-leads-to-“isn’t-everything-shit”-despondency-leads-to-shut-down). But how these later additions fit exactly with Type A and B is unclear – these types are stand-alone, developed by others, and not necessarily on the same scale as Friedman and Rosenman’s Types A and B.

If we are going to stick with exploring Friedman and Rosenman’s model, on a personal note, I’m still sus about the idea that a Type B personality always leads to better experience of stress. While, yes, I have never had a highly-strung “stressy” personality, once I got a pressured and high-responsibility job, I suddenly realised that, even so, I didn’t have such amazing tools to cope with it: I’m aware that my Type B tendencies are actually at the root of what causes me stress – my slowness, procrastination, disorganisation, lack of assertiveness, terrible time-consciousness, easy boredom with relentless work, incredible bodily resistance to switching gears (getting active in the first place or turning off once I am), my insistence on spending huge swathes of time on unproductive and unrelated interests (case-in-point – this blog), my inability to concentrate and focus on one simple dull or repetitive task without getting side-tracked or going off into thoughtful reverie... all of these things are pretty Type B and constantly mess me up when it comes to doing the job and getting things done: I need to keep a tight rein on them at times – but if the rein is too tight I end up getting angry and depressed. In fact, while I can wind myself into quite an impressive and zingy all-cylinders-firing energetic work-machine (no, really!) when I’m on form, it always takes a while to crank up and I simply cannot maintain a Type A level of efficiency for a protracted period without burning out quickly – I can’t think of anything more life-draining and dispiriting than living a Type A life.

No wonder Type A types tend to see themselves as superior. In terms of energy, drive and focus they clearly are. As someone who dresses distinctly to the B side of the scale, I have plenty of examples of what I would consider under-achievement littered throughout my history. Not because I lacked the ability or know-how, but because I simply didn’t put myself forward or apply myself enough; not out of laziness, but because I just didn’t realise the importance or feel the urgency of it at the time. You live and learn, I guess, but motivation for the likes of me cannot be forced – it’s there or it’s not. If it’s not, and yet you still have to proceed, your life becomes maddeningly dull and empty, and at worst stressful, exhausting and hellish. Obviously a degree of success and achievement and recognition is essential to contentment, but it is not everything. For genuine so-called Type Bs there is a natural ceiling of activity where things just cut off: When the effort and stress out-weighs your likelihood of tangible reward, and starts impinging on your happy, healthy mood, Type Bs just – stop. It’s like a natural in-built safety mechanism, to protect against the possibility of crisis or breakdown. But it means they will never achieve that sheer hyper-organised, bee-like activity; the perfection and glorious heights of people who will single-mindedly drive themselves into obsession pursuing a particular goal, when nothing less will do. For a true Type B, a little less will do, as long as it’s ok. No giving 120% for these lads. 90 at best. Sheesh, ok, maybe 95 if it’s really, really important, but jeez, a fella’s got to have a little in reserve.

Which brings us on to the real problem, which is this (and in this I think Friedman and Rosenman genuinely had something): In the modern workplace it is Type A behaviour that is rewarded. Everywhere I’ve ever worked wants you wound up to hyper-efficiency all the time. It’s so obvious it barely needs saying. I got so fed up with the word “outstanding” as a teacher that it came to induce an unconscious Pavlovian sneer – “good” wasn’t good enough, and “satisfactory”... well, hell, that just wasn’t satisfactory. Words like “outstanding”, “excellence” and “achievement” are used so liberally, and drilled in so relentlessly, like motivational Tourette’s, in the modern workplace that they have become virtually meaningless to my ears – in the way that repeating the word “socks” or “tree” to yourself 100 times divorces the sound from any sense. Try it. In most workplaces (private or public sector) these days everything is target driven – and if you meet those targets, the targets are just put up again, in some kind of Sisyphean ritual of existential futility. Companies and institutions have their own Type A ambitions, and none are ever content to simply rest where they are – but then, in business, as in life, I suppose no position is stable – if you are not going up then you are on a dangerous plateau of stagnation and it could be the beginning of a downward turn and, ultimately, The End – you cannot afford to get complacent. Still, many a business or institution has precipitated its own demise (or at least done more damage than good) by over-extending itself – because it could not be content with its current position, it couldn’t leave things be: It wasn’t broke but it still tried to fix it, and lost the culture that made things work so well in the first place. Toyota. I’ve seen it happen first-hand (not with Toyota). It seems intuitively true that those at the top will tend to be Type A – they must at least be ambitious, competitive, single-minded.
Not always, maybe: “Type A” is a sweeping generalisation involving a suspiciously large batch of characteristics, after all; and Type Bs may be called upon to fill such roles if they just happen to be good enough. Certainly those calm, positive, well-adjusted, patient and co-operative types do exist in such positions (I’ve seen that first-hand too – a life-affirming privilege) – but still, all experience points to non-Type As in senior positions being the exception rather than the rule. And therefore Type Bs will always be at their mercy. I suppose it has to be this way really – if society was run by Type Bs one suspects the result would be a bit shoddy, and certainly progress would be very slow and limited; but I don’t think it would collapse altogether – it would simply be a humbler, messier, more basic and perhaps more uncertain world. But a happy one.

Though I guess a Type A would say: Why the hell would you want that? It’d be horrible...

Wednesday, 15 September 2010

Wedding Ceremony "Reading" v1.3 (final)

This was written in place of an actual reading for my brother's actual wedding ceremony... pretty much verbatim what I actually said with my actual mouth, in the actual church, on the actual day. Terrified that it wasn't quite appropriate, or what they were after, but in the end, thankfully, it went down well. Juj fo' yo'selfs:

When L and K asked me to do a reading, they suggested that, because I’m a philosophy teacher, I should find some 'philosophical wisdom' on love and marriage. That seemed like a good idea; but it turned out to be a tall order.

Plato said: “At the touch of love everyone becomes a poet.” Which is nice, but obviously not quite true, because dusty old philosophers don’t seem to talk about love at all unless they’re logically dissecting it or sneering at it with all their existential angst – y’know, not really appropriate for a wedding speech.

The best I could find was Thomas Merton, who was a Trappist monk (and, yes, almost has my name). He said: "The question of love is one that cannot be evaded. Whether or not you claim to be interested in it, from the moment you are alive you are bound to be concerned with love, because love is not just something that happens to you: It is a certain special way of being alive. Love is, in fact, an intensification of life, a completeness, a fullness, a wholeness of life."

Erich Fromm, the psychologist and philosopher said: "Love is the only sane and satisfactory answer to the problem of human existence."

So far, so good. But then I turned to my old favourite, Nietzsche, and only found this: “A pair of powerful spectacles has sometimes sufficed to cure a person in love.” Mmn. Not quite what I was after, thanks Nietzsche.

However – I remembered reading in Alain de Botton’s Consolations of Philosophy about Nietzsche’s life… and specifically one of his excruciating chat-up lines, which seemed fitting on the subject of marriage:

“(Nietzsche’s) search for a wife was… sorrowful, the problem partly caused by (his) appearance – his extraordinarily large walrus moustache – and his shyness, which bred the gauche, stiff manner of a retired colonel. In the spring of 1876, on a trip to Geneva, Nietzsche fell in love with a twenty-three year old, green-eyed blonde, Mathilde Trampdach.”

He talked about Longfellow’s poem ‘Excelsior’ with her, went for a walk with her and offered to play the piano for her – and the next thing she knew he’d asked her to marry him, with these words – and here’s the good bit, this is what he said:

“’Do you not think that together each of us will be better and more free than either of us could be alone – and so excelsior?’ asked the playful colonel. ‘Will you dare to come with me… on all the paths of living and thinking?’ Mathilde didn’t dare.”

Old Nietzsche’s approach may have been all wrong, but I think the sentiment is really sweet. L and K have dared to go on “all the paths of living and thinking” together, and may well feel “better and more free” than either would alone.

But, for me, what’s really amazing is that they show that, sometimes, things can turn out right and you can find the right person and live happily – which for a cynical philosophy bod like myself is awe-inspiring – and a real, proper beacon of hope.

Tuesday, 31 August 2010

Punk.

Punk. Yeah, right on, punk. Am I, in fact, the only person in the multiverse who gets just that faint, niggling hint of annoyance and suspicion every time I hear that word nowadays? Who picks up the undeniable whiff of pretentious airs and graces when that word is applied to anything outside the late ‘70s? Who boggles at the audacity of those who would casually apply this word to themselves and what they do?
Why? Why would I feel like that? Well I suppose it’s to do with the perceived status of the word. Punk is about the only musical genre whose credibility is simply beyond question. Words like jazz, metal, techno, hip-hop, country, rock, pop, reggae, d’n’b, “electronica”, indie, house – even funk, for funk’s sake – are perfectly capable of being sneered out as a derogatory pooh-pooh, and happily suffixed with words like “nonsense”, “rubbish” and “bollox”. But you just don’t do that to punk. Well, you can, but people will look at you like you were a) born in the 1800s and probably listen exclusively to Yes and/or Dire Straits, or b) a pre-teen girl who idolises Justin Biebles or Jedwank or One True Voice or whatever fresh-faced-clean-living-non-threatening-pop-fodder is spewing out of the corporate mangle at this precise moment. Which is strange, because it’s not like punk has any more consensus on whether the music was actually good or not than in any other genre. Plenty of people don’t like punk. But nobody under 60 – nobody – questions its authenticity. Everyone knows that punk is visceral, angry and nihilistic – punk doesn’t care if you like it or not, in fact it practically wants you to hate it, which is why it’s so cool. It has none of the pomp, pantomime and geekery of metal, none of the self-satisfied corporate cock-suckery that soured hip-hop’s reputation, and seems to have somehow side-stepped the culture-based incomprehension that causes people to refuse to even try to understand, say, reggae, or most electronic dance-based music. Most genres of music are associated with types of people: Your typical fan, who is eminently open to ridicule. Think of a genre and you can always think of a particular kind of worst-case pathetic specimen who likes to listen to it, even if you know this is an unfair stereotype – the stereotype exists. Yuppies. Chavs. Hippies. Red-necks. Teeny-boppers. 18-30 clubber dregs. Jazz-pseuds. Emos and goths (ahh, bless).
It’s guilt-by-association – you get enough silly or dull or pretentious or, in whatever way, reprehensible people listening to a type of music and that’s it – its credibility, in the minds of non-fans, is done for. Tainted.
So how has punk escaped this? Well, your typical punk fan is – well, a punk. And only a punk, nothing else. And punks are scary. Not that some hip-hop gangstas and muscles-n-hate metallers can’t be scary – but all punks are scary; it doesn’t matter if they’re weedy or moronic or 14 years old, because they’re all livid, unhinged, self-destructive nihilists who are as likely to nut your face, gob in your chips and knife you in the kidneys as say “Hail-and-well-met, good fellow”. They “want to destroy passer-by”. That’s what Mr. J. Rotten said. Poor old passer-by. Except that this isn’t true, is it? It’s utter, utter, B.S. – very, very few punks were ever really, fully, 24-hours-a-day like this, by most accounts. Most of them wanted people to think they were. But, like goths, it was an affected pose, a lifestyle fashion choice, like any other pop-culture fad. Punks were just people – hyperactive, speed-fuelled, stroppy, sneery, yoof people, maybe, but still just people all the same. It wasn’t the end of civilisation after all. But somehow the stereotype persists that punks really are just like that. Where other genres’ stereotypes have grown familiar and cosy or broadened to incorporate multiple kinds of everyday folks, the punk stereotype has retained its initial shock value and specificity, somehow. Why? Almost certainly because it was so short-lived. We never had time to become familiar and cosy with the punk, or with the kind of general public who listened to punk as a simple matter of personal taste. Whilst the impact of punk had massive pop-culture reverberations, pure punk was a flash in the pan – it was all over, really, by about 1980. Pure punk was non-sustainable. You can be a life-long hippy. But you can’t be a truly anti-everything anarchistic punk and still hold down a day job. Really you need to be dead by your mid-twenties, or you’re just not punk enough. If you survive, you had better be in a state approaching Shane MacGowan.
Musically it was also a cul-de-sac, a dead-end. Whilst it may have been a searing blaze of fresh, raw, jettison-the-bullshit energy, what most people seem to forget is that punk was massively conservative and intolerant. It sneered and gobbed at anyone who used anything other than 3 chords on anything other than guitar, bass and drums. Anyone over 25 was too old. And God forbid you wheel out a synthesizer. When Joy Division, always slightly apart from the pure punk crowd and more accurately described as “post-punk”, did this, Ian Curtis’s girlfriend – the one who wasn’t his wife – allegedly accused them of “sounding like Genesis” (Genesis were the prog-rock enemy to the angry punk-influenced yoof at the time). Needless to say, that’s f***ing absurd; but it amply demonstrates attitudes to variety and experimentation in those punk-drenched times. The Damned got away with early keyboard usage, but only because Captain Sensible made a point of playing three reedy notes shambolically and knocking the stand over at the end of the gig. But their second album was produced by Pink Floyd’s (prog/psychedelic) Nick Mason, and that was met with howls of derision and seen as a shocking faux pas that nearly finished them off – despite the fact that any resulting changes to their sound were utterly minimal – a negligible, shoulder-shrugging smidgen more polished and melodic, which would have inevitably been the case whoever produced it. The likes of The Stranglers, despite being in the midst of the punk scene when they started out, playing all the same venues with all the same bands, and clearly sharing much in common in their sound and attitude, were never properly accepted as punk – they had a synthesizer, three of them were too old, two of them could play too well and two of them had facial hair. No chance.
This despite the fact that Joe Strummer reportedly split the 101ers (who often shared the bill with the early Stranglers) after telling Hugh Cornwell “My band is shit, Hugh – I want a band like yours!” – and the next week joined The Clash. This conservatism was also a lie – many punk icons from John Lydon (formerly Rotten) to Rat Scabies later admitted to liking all kinds of diverse music that they would never have publicly admitted to at the time. In fact, as it turned out, the first wave of UK punk acts did have a broader appreciation of music and more catholic influences, which became clearer as they developed: The Clash’s London Calling experiments; the Buzzcocks’ ’60s bubblegum pop and repetitive, angular kraut-rock touches; the inherent psychedelia in The Damned and Siouxsie and the Banshees’ later goth stylings. Rather it was the music press, the fans and the second-wave copyists who did most to petrify punk into rigid standardisation. Pure punk was such a narrow and inflexible genre that it literally had, to quote Mr. J. Rotten again, “No future” (clever that, see). Those that survived had to veer away from the narrow template. Those that didn’t, imploded. Punk’s lasting legacy was via post-punk – where punk mutated into something else and other genres were infused with some – but only some – elements of its energy, aesthetic and ethos. That stuff had legs, because it was open to endless variation and diversification. Pure punk wasn’t – it was simply a cleansing fire, a stubble-burning sweep to make way for fresh new growth.
So: Let’s return to where I started – why does use of the word “punk” when referring to things nowadays annoy me? Well, first of all, “punk”, in the pure sense, is as time-locked as ‘50s rock ‘n’ roll – and as such any modern act that claims to be pure punk, without adding any modern spin, should reasonably expect to be viewed in the same vein as Showaddywaddy – novelty retro throw-backs. It seems a bit of a conceit to claim to be full-on punk unless you a) were there in ’76-’79 or b) accept that you are simply describing your sound as mimicking that of original punk. But more annoying than this is the way the term seems to be bandied around willy-nilly in a way that can seemingly mean whatever you want it to. Sadly, perfectly good acts such as Mogwai, The Prodigy and Sepultura have at some point claimed that they were “punk”. Needless to say, they sound nothing like each other, and none of them would even have come within Walking Distance (that’s a Buzzcocks song. Clever, see?) of being recognised as punks back in ’77. They’d have been laughed out of Bromley. There isn’t even really any identifiable thread in common between them, because it depends which element of punk influence you are talking about: Are you referring to a certain visual style aesthetic? To a lo-fi, amateur do-it-yourself ethos (as if punk was the only genre ever to use this)? To an anti-establishment politicised outlook? To a generally rebellious, bratty, obnoxious demeanour? To an iconoclastic hatred of the “old”? To a genre-based style-template for making records? Or to a commitment to stripped-down simplicity (no solos)?
Referring to such elements as “punk” is fine if it’s done knowingly and lightly, but when it’s said in utter po-faced seriousness I just want to tell the speaker to f*** off. I have no problem with a term like “punk-influenced”, but when people insist something vaguely punk-influenced IS punk... oh, right. Spare me, huh? Having pink hair does not make you a “punk” any more than having a synthesizer makes you Genesis. The media do this all the time when talking about the likes of P!nk or Avril Lavigne (who went on record to tell them to stop calling her punk because she didn’t know what that was supposed to mean – good for her!). This is chronic enough, but when people deliberately call themselves “punk”, that’s when my nose really wrinkles with disdain – because not only is it pungent bullshit, a bogus claim; it’s also a really obvious conceited grab at credibility, as if you can somehow summon all of original pure punk’s livid anarcho-political anti-establishment rage, unhinged youthful energy and unimpeachable authenticity simply because you play a cheap instrument badly or have a tongue piercing. Piss off. Punk.

Punk. - appendix 1: reservations about the mythos from someone who Wasn't There.

There are all kinds of unshakable myths surrounding punk that get trotted out again and again as canon. Having grown up with this accepted dogma, it tends not to occur to one to actually analyse these myths – and it causes a surprised double-take when one does, and realises these things aren’t necessarily 100% correct. First of all, that “Music had all got pompous, stagnant, tired and boring”: Certainly, there were a fair number of extravagant rock behemoths, snoozesome trad-rock noodlers, tired hippy hangovers and comfortable, polished MOR blandophiles who were brought down a peg or two by punk’s immediate, straight-up energy and Heath-Robinson approach. Sure, there was plenty of slick, shiny mainstream commercial pap – but then there always is. There may not have been a lot in the top ten album charts of those years immediately before punk that would appeal to da kidz on da street; but, in retrospect, the sheer volume of now-iconic names producing seminal albums full of vibrant, innovative music in the first half of the ‘70s makes the charts of the last couple of decades look shamefully paltry and inconsequential. A lot of it may have quite understandably looked distant, dull and alienating to your average stroppy teen living in squalor on an inner-city council estate; but I’m (the) damned if I’m going to adopt the perspective of a stroppy teen as my considered musical-historian-style overview. And yet people do. Whoever they are. Punk’s gravitational claim on credibility is so powerful that everyone just accepts that this was, apparently, the case – and blithely writes off the majority of the musical output of the time without really thinking specifically about what or who it is that they’re writing off – or whether they properly agree with that. The “serious” music papers may well have been full of prog, and, ok, that’s hard for a lot of people to stomach.
Not everyone’s cup of pretentious herbal tea, but it’s not quite fair, or even accurate, to call the more bonkers, out-there, experimental stuff “dull” and “stagnant”, and only the more high-profile were into expensive theatrics. Glam and art-rock were also rife, and an admitted influence on punk, particularly the likes of Bowie and early Roxy Music on the Bromley contingent. Where was this desolate dark-age of music, this terminally tedious and infuriating desert of artistic authenticity? Was 1975 really that bad? Maybe you had to be there.
One thing that is certain is that punk changed the music industry – in many ways for the better, but not in all... In reality, for all the overblown corporate guff it crippled, it also killed off swathes of interesting, artistically credible acts with its conservatism – what they were doing simply went out of fashion. And make no mistake, punk was always about fashion, and image in particular – which brings us on to the next myth – “Punk was back-to-basics, no frills, unpretentious”: Back-to-basics, yes. No frills, yes. Unpretentious, y... noooooo. Punk was staggeringly self-conscious and affected, in the way that only surly teens can be. Every time I see footage from ’76-’79 there is always a slightly uncomfortable moment when all the accrued mythos falls away, just for an instant, and I simply see a gaggle of silly puffed-up spotty yooves gurning away like naggy toddlers and trying that little bit too hard to look nonchalantly wise-assed and intimidating. Grr, look how obnoxious we are. And then I remember that these are icons, worshipped by thousands, and shudder at how age has jaded me. Now that’s nihilism. But punk, as I said, was always about image. It marked a new high in style-over-substance – as long as you had the look and the attitude, the music didn’t really matter – so long as it was fast, simple and nasty. Not that punk didn’t have plenty of good tunes – but it was the image and attitude that drove it, that was the break-through, and if we’re talking about impact on the music industry, that’s one of the less savoury aspects of punk’s legacy – it ratcheted up the importance of look and attitude for all the po-faced cooler-than-thou underground hipsters. That’s an irony – while the prog crowd may have had a fondness for big theatrics on the stage, their personal image was not really an issue – they were, by and large, thoroughly average-looking unassuming blokes with poor personal grooming.
Meanwhile, the punks were, in their own way, every bit as image-obsessed as the glam crowd – but often more militant, because their image was about “authenticity”. In this respect Sid Vicious really is representative of the whole thing – only in the band because he looked good, had buckets of ‘tude, and would provide good theatrics – yes, theatrics – on stage and in interviews – never mind that he contributed nothing musically. John Lydon would eventually grow weary and stifled by his own image (and Malcolm McLaren’s insistent control of it) and form Public Image Ltd., whose first single Public Image drips with palpable disdain for the image-obsessed. Punk was eating itself – which pretty much marked the beginning of the end for purist punk.
The likes of Malcolm McLaren always wanted punk to be about fashion, of course, and he will tell you again and again how he created UK punk by importing such fashion from the New York punk scene. Myth #3 – “UK Punk started in New York”: Now, in a certain sense I can accept this – there clearly was a big influence from the New York CBGB’s scene that filtered across to UK punk, particularly in many of the fashion touches, and McLaren provides a direct connection here. Also in the simple, stripped-down and sped-up wilful angular ugliness of the sound. But what is infuriating is that when people talk about New York punk and UK punk they seem to just take it as the same thing – and, it’s just... not. The Ramones had long hair, for crying out loud. New York punk such as Patti Smith and the New York Dolls was for arty underground bohemian hipsters, anti-establishment in a grimily subversive way that owed a debt to the likes of the Velvet Underground and The Stooges. UK punk was very much an every-man (or every-yoof) thing, a disenfranchised, working-class youth-movement of sorts – much more political, vocal and angry, from The Sex Pistols’ violent nihilism to The Clash’s outspoken left-wing causes. It’s this side of it that really took hold in the UK, and that’s the side of it that fashion hipsters like McLaren never fully intended or foresaw – and yet it’s impossible to understand UK punk without it.
I’m being over-antagonistic, maybe. There’s clearly a lot of foundation in what I’m calling “myths” here; my reservation is just that such things are often over-stated. It annoys me that you can’t question this sacred canon and expect to retain your street-cred, when in fact there is plenty there to question – the reality is never as simple and unambiguous as the mythos. Maybe you had to be there... and well, y’know, I wasn’t.

Wednesday, 4 August 2010

I’m Claire F***ing Rayner (The Following Analysis is Inadequate and a Complete Waste of Time).

There surely can’t be a more tragic example of “male autism” than trying to put ~@lOvE@~ into quantifiable bar charts. There surely can’t be a more telling sign that one has too much time on one’s soft, work-shy hands than spending a good 45 minutes playing around with bar charts of ~@LoVe@~ for no apparent reason. Hey! Wait a minute! I resent that. When I did these I certainly didn’t have too much time on my hands, sir! No, sir, I was procrastinating from going to bed after the rest of the night had been consumed by a tedious WORK vortex. WORK done, I was “unwinding”. This is how I unwind (there may be something wrong with me). I was procrastinating about going to bed because going to bed meant “eyes-close-eyes-open-more-WORK” (a freaked-out dream to put you ill-at-ease for the start of the day if you’re lucky). I did this, even though pissing about with ~@lOVe@~ bar charts was eating into my healthy sleep time and therefore making tomorrow’s WORK that tiny bit wearier and harsher to deal with (there may be something wrong with me). Today I do have too much time on my hands.

So, here are some bar charts on ~@LovE@~ that I did.

Non-Starter #1 (High Physical Attraction/Low Rapport and Stuff In Common):

Finding the sexiest, prettiest vision you have ever dreamed of with your shiny eyes only means that you will try harder to establish Rapport and find Stuff In Common. If, after a brave effort, none are found then it’s a Non-Starter. Lack of the other two is actually a massive turn-off – it just ain’t true that blokes are all about looks and nothing else, and frankly I resent your assertion to the contrary. I know you were thinking it. Sexist.

You will probably try to delude yourself that you do have a Rapport and you do have Stuff In Common, despite massive evidence to the contrary, such is your weak-willed male neediness. But if they just don’t “get” your conversation and have nothing in common with you, then they are obviously either a terrible, terrible person or a boring moron. This is a good thing to tell yourself when encountering jaw-dropping creatures that are simply out of your league.

Friends – Not Close (High Rapport/Low Physical Attraction and Stuff In Common):

Your conversation crackles with both zip and zing; all double-act quick-stuff and rare banter-gold. But without anything much in common and little Physical Attraction you simply do not have that extra motivation to really properly get to know each other. You are “fun acquaintances”. Not, on its own, the basis for Truuuuuuue Loooooooooooove.

Although... to give good Rapport you have to have something in common – a way of speaking, a sense of humour, certain thought processes... aw shit, this is ill-thought-out. I’ve messed it up. Balls. Ok, let’s pretend I didn’t say that and just run with the idea that surface Rapport is not the same as deeper Stuff In Common.

Non-Starter #2 (High Stuff In Common/Low Physical Attraction and Rapport):

Stuff In Common alone is bloodless, rubbish – clearly no interest there, ‘cos there’s no zing! No way in, and no motivating attraction. Who cares what other people say about how “you two would really get on”... ain’t happening. You may well be soul-mates under the surface, but if there is no Physical Attraction or Rapport you will never find this out.

You could be outwardly very different, but have a lot in common under the surface; or outwardly very similar but... don’t. Though that means at a deeper level still, there is stuff that you fundamentally don’t have in common... Ah, this is bollocks. I’ve messed it up again. Stuff In Common is a rubbish criterion. Should’ve stuck with Compatibility. We’re talking about people’s personalities here, and I haven’t come prepared. Could split personality into all manner of measurements – Two?! F***ing two?! What was I thinking of? A cocking wagon load of bouncing balls.

A Crying Shame (High Rapport and Stuff In Common/Low Physical Attraction):

This makes you feel like a real shallow ass-hole. And you’re right, you are a real shallow ass-hole for this. Aw, but getting romantically involved with someone you just don’t physically fancy is just a bad idea – it will come back to haunt you and end in more pain, cruelty and unintentional emotional violence somewhere down the line, because something in you will simply not be satisfied, and that’s no firm basis for a passionate loving relationship. There is wisdom here. Or maybe you’re just trying to justify being a shallow ass-hole.

Oh! But as you get to know someone it’s possible to discover joyful details that you didn’t notice at first glance, and if the other two columns are high enough it can have a halo effect onto the person’s physical form: You find things about them that you previously wouldn’t have looked twice at becoming attractive; even things that were off-putting at first can become cute. Yeah, maybe. Don’t kid yourself you’re in any kind of control. If you definitely, simply don’t “fancy” someone, nothing is likely to change this. And that really is A Crying Shame – and the least understandable or explainable reason for not getting together with someone. Shake your fist at the Gods for your shitty luck, like Charlton Heston whenever he passes a half-buried Statue of Liberty.

Work At It, Dammit (High Physical Attraction and Stuff In Common/Low Rapport):

Sheeeeeeit – give it a chance. Only Rapport is missing – cool your funk. Yeah! The more you get to know each other, the easier it gets – it’s like learning a language, but easier. I’ve got plenty of friends who I couldn’t really talk to or figure out at first; but now we shoot the breeze and slap our thighs and finish each other’s sentences like a pair of hearty old sea-dogs. You should always remember this before passing over the opportunity to talk to the cool but awkward girl in preference for her “fun” friend, you shallow ass-hole.

Danger (High Physical Attraction and Rapport/Low Stuff In Common):

Oh my. This one is so, so, so hard to resist that I needed to use three so’s. Grabs you by the head and the balls simultaneously in this two-pronged attack. You will convince yourself this is a viable enterprise even though you know full well that deep down there is no substance here, things will get tired very quickly, and you will end up being kind of embarrassed that you fell for that shit after the fireworks have fallen back to earth. If prolonged it may even lead to illusions of deep and meaningful connection, but it’s all waffle and no action-slacks. Not a thimble of compatibility beyond the charming well-oiled chat routine, and nothing to hold you together when the rough hits the smooth. If you actually try to make something of it you will realise to your horror that, after the initial high-times, you are left steaming out the top of your bonce with sheer irritation at the very mention of your partner’s activities and interests, since these are now a crushingly tedious and frustrating part of your life, a distancing wedge between you and your partner that you resent with the whole of your howling broken soul. She’s f***ing boring. And annoying. And kind of embarrassing to be with in public.

There is always the outside possibility that this could in theory lead to a fascinating, turbulent relationship – all passion and arguments, stormy emotions and emotional make-ups. You’re always misunderstanding and annoying each other, but you can “get on” so easily that Making Up Is Not Hard To Do. Oh yeah, high romance. Except that viewed from the outside it just looks kind of pathetic and embarrassing. Sheesh.
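
Come to think of it, the whole taxonomy above boils down to a lookup table on three High/Low switches. Here's a throwaway Python sketch of exactly that (the 0–10 scales, the threshold, and the function names are my own inventions for illustration – this is not science, it's procrastination):

```python
# Map (Physical Attraction, Rapport, Stuff In Common) -> the verdicts above.
# Labels are ASCII-fied versions of the section headings.
VERDICTS = {
    ("high", "low", "low"): "Non-Starter #1",
    ("low", "high", "low"): "Friends - Not Close",
    ("low", "low", "high"): "Non-Starter #2",
    ("low", "high", "high"): "A Crying Shame",
    ("high", "low", "high"): "Work At It, Dammit",
    ("high", "high", "low"): "Danger",
    ("high", "high", "high"): "Truuuuuuue Loooooooooooove",
    ("low", "low", "low"): "Who are you again?",
}

def verdict(attraction, rapport, stuff_in_common, threshold=5):
    """Score each 0-10 bar as high or low, then look up the verdict."""
    key = tuple("high" if bar > threshold else "low"
                for bar in (attraction, rapport, stuff_in_common))
    return VERDICTS[key]

print(verdict(9, 9, 1))  # "Danger"
print(verdict(2, 8, 2))  # "Friends - Not Close"
```

Note the table has eight rows, not six – the two combinations the post never discusses (all High, all Low) presumably need no analysis.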

That Was a Load of Guff, Thomas.

Yes. Well, I hope you enjoyed that exercise in over-analytical, navel-gazing, nit-picking guff; but this is all analysis after the fact. Even if we accept the catastrophic over-simplification that is my criteria, you can’t use these criteria to judge in advance. If you did you’d be some uber-choosy robo-fascist of romance, and, frankly, a dick, and if you listened to me with my tin-pot, Mickey-Mouse, Eddie-the-Eagle-Edwards romantic history, you’d be some kind of drooling pantaloon. I’m painfully single. I think the above demonstrates why.

Monday, 26 July 2010

On The Complexities of Modern Atheistic Views (Part ~2)

Abstract: All writers have an agenda..... Absolutist Humanism is a particular type of Atheism that has faith in human progress towards Absolute Knowledge..... This is not strictly scientific..... I have tasted lots of pies but am not a connoisseur..... The universe is absurd and nonsensical in that the evidence and calculations necessarily lead us into paradox..... Mathematical logic screws itself with infinity..... Computers that can’t compute things always explode..... Quantum mechanics is mind-meltingly freaked out, man..... Certain particles are a bit like Boy George..... Interpretations in quantum mechanics are counter-intuitive and crude tools only..... Nobody knows how it can be like that..... The big bang has an oblivion problem..... Physics is an opera, dark matter is the deus ex machina..... Shit must make sense so we WILL find those pesky particles..... The history of the universe is possibly a cloud of possibility waiting to be knocked into shape by our possible observations (possibly)..... I bring the hammer down!..... Concepts such as space, time, quantity and cause and effect must be in place before we can make sense of any experience..... Calculations and empirical evidence tell us that these concepts are insufficient..... Wittgenstein had a “refreshing CV”..... Handy Andies are not advised for use in attempting to absorb oceans..... I cannot explain Gödel’s Incompleteness Theorem very well..... To get Absolute Knowledge you would have to transcend the basic concepts that enable us to think and live at all, which, if not impossible, is at the very least a mystical idea analogous to Zen enlightenment, not simply a matter of reducing everything to basic old skool “graspable” logic – now that IS impossible..... Absolutists are f***ing idiots..... I sorely need an editor.....

Why I Don’t Buy Humanism

“To sense that behind anything that can be experienced there is a something that our mind cannot grasp and whose beauty and sublimity reaches us only indirectly and as a feeble reflection, this is religiousness. In this sense I am religious,” says Einstein.

Dawkins adds: “In this sense I too am religious, with the reservation that ‘cannot grasp’ does not have to mean ‘forever ungraspable’.”

That’s my emphasis added, there, readers. Why? Because that line was the moment, only halfway through Chapter 1 of The God Delusion, where something clicked into place and I looked upon the pages with a wild surmise. Aha! Right-ho. That’s what we’re dealing with. It’s the point at which Dawkins’s underlying value system, and the convictions therein, ride up above the belt-line like a builder’s arse. From here on in one can predict all forthcoming twists and turns, since they must all issue from, and be tethered in line with, those founding convictions.

I mean: All writers, all arguments, no matter how rational and reasonable and logical, have an agenda. A gut instinct, a value set, a lifestyle-choice perspective that they intend to impress upon the reader. Do not be fooled by appearances of objectivity. Do not be fooled by any attempts at a dry, dispassionate tone. If you want to get under the skin of any author that aims to convince, the first thing to ask is: “What do they want the world to be like?” Never mind the details of the surface argument for now – dare to be sleazy and underhand, look for evidence of what the author blindly believes, presumes, craves and aspires to – and when you lower the dress of that surface argument again, it will all look a little less immaculate; and its cut and shape will make a lot more sense. Yes, it is kind of ad hominem, uncharitable, and pseudo-psychology – but if you do it properly it works. Nietzsche taught me this. Try it with me, with this piece of writing, see what you get.

In fairness to Dawkins he does not try to hide his personal convictions at any point. Dawkins is very up-front about his agenda. But that portion of the quote (my emphasis added) told me all I needed to know about where Dawkins is coming from and what he takes on faith (I said “faith”, tee-hee): ‘Cannot grasp’ does not have to mean ‘forever ungraspable’. Dawkins is an Absolutist. That is, he believes (or at least his statement strongly suggests he believes) in the linear progress of humanity towards absolute knowledge. That one day, through the true light of the scientific method, we will know everything, be able to explain everything, grasp everything – there is literally nothing, in any sphere, that will confound our intellect and be alien to us. We will be Gods! Never mind that to me this sounds like a longing towards the death of consciousness – what any Absolutist is ultimately aiming for is a state of rest when we finally have no more to disturb our thoughts, nothing more to think about, everything ordered and in its right place – it also gestures towards a whole range of connotations and associated ideas that no Atheist really, actually, has to buy into: In philosophy this is identifiable as a broadly Hegelian perspective – Hegel viewed the whole of history as progress towards what he called “The Absolute”. Also, the idea that humans should replace religion with a faith in humanity and the progress of humanity – this view is identifiable as what is called “Humanist” (or at least one of the many meanings of the term. This is what I will mean when I use the term "Humanist" here), which has been the default position of many scientists and some scientific schools of thought in the 20th Century. Finally the idea that human reason is paramount in nature – that the whole of creation (I said “creation”, tee hee) exists to produce us – we are the pinnacle of achievement in the universe.
Whilst these beliefs are understandable positions to take, they are beliefs, attitudes, a faith, a value system – Absolutist Humanism is by no means a scientifically verifiable hypothesis or a logically self-evident argument.

If you really want to rile an Absolutist Humanist, point out the faint but definite whiff of Christian world-view that still clings, unnoticed (they are so used to the smell that they cannot smell it themselves), to their garments... the conviction that man is special and above the beasts, and is custodian of nature, which exists for him to do with as he will... the idea that there is an objective, eternal “God’s truth” that exists outside of ourselves and our perspectives... the narrative view of history: Pagans saw nature as cyclical, not linear and progressive – that’s a Christian “fall-from-grace-and-return-to-God-and-Heaven” thing. The Enlightenment values that the modern scientific view grew out of were, of course, originally Christian and, even if the relationship soon became fraught and they went their separate ways, some faint echo of Christian trappings still remains in the Humanist DNA.

Ok, so human progress is undeniable – in technology and medicine particularly our knowledge and control has exploded in its complexity. One simply cannot argue with the methods that have made this possible, one simply needs to look at the results, the sheer explanatory, predictive power of science and its astonishing success in application. Though it hasn’t all been linear – rather the history of scientific knowledge is a massively inefficient scrappy mess of false-starts, dead-ends, dormant periods, scattershot fragments discovered, lost, discovered, lost, re-discovered, re-interpreted etc. – it has only been properly linear since the Enlightenment, after which the learning curve appeared to reach escape velocity. However, it’s clear that to maintain linearity we must rely on ample communication and relative cultural stability. It’s no sure thing that such a period of growth and advancement will continue forever. By many accounts it almost certainly can’t, not at the rate seen in the last couple of hundred years, and it’s no more than a wild optimistic guess, a gee-whizz fantasy (based on analogy with past conquests), that it will result in absolute knowledge and control of everything.

Furthermore it’s perfectly possible that there is no significance at all to human progress. In fact I’m not even sure what I mean by “significance” here. Significant to whom? Not “God”, surely. To ourselves, yes, of course, but that’s solipsistic and hence objectively meaningless. How can we objectively measure the significance of our own progress? By how much impact we have on the universe? In the way that earthworms are an extremely “significant” force in my lawn? How does human beings’ “significance” rate against the “significance” of the gravitational pull of dark matter, then? What scale are we using? Is our “significance” in comparison to other animals (good luck getting other animals, i.e. anyone – anything – other than ourselves, to recognise this fact)? In comparison to E.T. the Extra Terrestrial’s “people”? Objectively we can’t say we (our activities, our degree of complexity, our power, control and influence) are “significant” or “not significant” in any way that ultimately means much at all. So much for Humanism: as a driving value system it is essentially an admission of solipsistic navel-gazing. But why not, you say? I’m me so I believe in me and my progress. Anything else would be inauthentic and anti-life. Ok. Still doesn’t stop it being solipsistic navel-gazing.

Why Things Are Much Trickier Than Your Common Man-On-The-Street Atheist Understands

Perhaps I don’t really, properly, understand the myriad pies of knowledge that I’ve stuck my fingers into over the years. As a philosopher you get to taste a lot of pies from many different bakeries. Or “spheres of pie-making”. I mean, I know a little about a lot of things, but it’s all a bit slapdash and “jack of all trades” – I’m not a mathematician, a physicist, a historian, a theologian, a neuro-scientist – and as such can’t claim a deep and subtle understanding of those cats’ whole scene. I’m not a connoisseur of their pastry-based produce. But I am familiar with it. I’ve grappled with the conclusions and connotations of what I can glean from such areas, and there is something to be said for a distanced overview – you see things differently and put things together in a way that a specialist might not. And the evidence would appear to me to be overwhelming: The universe is absurd and nonsensical.

I mean that: Yes, on a day-to-day basis, on a mundane level, everything works rationally and logically, sure. And within most microcosms of study things seem to make sense and fit together ok. Well, the majority of the time. But at the fringes, when you start to really push the bounds of our knowledge and understanding, and ask bigger, awkward questions, it all falls apart. Of course it does, you may say, because it’s at the bounds of our knowledge... that’s why they’re “bounds”. But it’s more than this. The problem is that it necessarily falls apart – not through lack of data or calculation errors, but because, logically, it has to. That’s where your evidence and calculations ultimately, necessarily, lead you – into paradoxical absurdity. In philosophy this is so common as to induce a yawn – philosophy focuses on precisely these things. You take a subject, you ask fundamental questions about it, push your answers to their logical conclusion and find yourself faced with some intractable paradox or another that tells you that nothing is quite as it seems, all your common everyday concepts are ill-defined and no one actually knows anything about anything. But in science, mathematics and logic that’s not really supposed to happen. But it does.

Some examples (if you want to know more, or how massively wrong my interpretation probably is, buy a book and read it yourself. Lazy):

Mathematical Infinity: The concept of infinity in mathematics essentially has the power to collapse logic. This stems from the fact that infinity is not simply a big number – it’s not a “real number” at all, even though it can be treated like one – it is qualitatively different to numbers in the number system (i.e. it is a difference of quality, not simply quantity). Once you start doing equations with infinity some very odd things start to happen. You can, of course, add one to infinity and it is still infinity. Or you can subtract one from infinity and it is still infinity. You could add infinity to infinity and it would still be infinity. Though presumably if you subtracted infinity from infinity it would be 0... or not. That’s what’s called an “undefined operation”, which pretty much means it’s impossible to get a definite answer. If this were a ’60s sci-fi your computer would print out “zzz...zz..does not compute...zzz” and explode. 0, however, is the flip side of infinity and closely related to it. The idea of a black hole (a concept thrown up by the General Theory of Relativity) demonstrates this relationship – finite mass in 0 space = infinite density. Divide any number by 0 and it equals infinity (or summat, anyway... that’s another “undefined operation” and, as any schoolboy will tell you, if you try this on a calculator it will flash up “zzz...zz..does not compute...zzz” and explode). So far, so what? Well it means that you need to be very, very careful in using both infinity and 0 in mathematics – things can become so slapdash and flexible and “undefined” that you can end up proving that 2+2=5 or that anything, for that matter, = anything. And infinity is not just a problem if treated as a real number – there are similar paradoxical issues relating to infinite series of numbers or infinite sets or infinitesimal fractions and so on. There are types of infinity. Check out the work of Cantor.
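
Incidentally, you can watch a modern computer make exactly this distinction without exploding: IEEE 754 floating-point arithmetic has an `inf` that behaves like the "not a real number" described above, and the "undefined operations" come back as `nan` ("not a number") rather than a puff of smoke. A minimal Python sketch:

```python
import math

inf = math.inf

# Adding or subtracting any finite number leaves infinity unchanged...
assert inf + 1 == inf
assert inf - 1 == inf
# ...and so does adding infinity to itself.
assert inf + inf == inf

# The "undefined operations" don't explode; they yield NaN ("not a number"),
# a value so undefined it isn't even equal to itself.
assert math.isnan(inf - inf)
assert math.isnan(inf / inf)

# Plain division by zero, though, raises an error rather than returning inf.
try:
    1 / 0
except ZeroDivisionError:
    print("zzz...zz..does not compute...zzz")
```

(Floating-point `inf` is an engineering convention, of course, not a resolution of the mathematical paradoxes; it just shows that "treat it like a number until you hit an undefined operation" is a workable policy.)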
In philosophy the apparent paradoxes of this pesky beast go back as far as philosophy itself – from the likes of Zeno’s paradox to the basic problem that any measurement against infinite scale (e.g. infinite space or infinite time) renders that measurement meaningless – if you cut a 10-metre length of rope into 10, each part is a metre. If you cut an infinite rope into 10, or 100, or 10,000,000,000,000 then what is it? The point is, if we don’t know how wide “space” is, there is actually no way to know how wide your smallest unit of measurement “really” is either...

Infinity has proved so problematic and counter-intuitive that it is tempting to blow a massive raspberry at the idea, throw your rattle out of the pram and insist it doesn’t really exist. But, aside from the fact that infinity is a very powerful tool in both mathematics and physics, and essential to our progress and current understanding of things, to deny it is to have to accept the equally illogical and counter-intuitive position that there is a “biggest number” after which counting stops. But what if you add one to that number? Um...

Infinity is both a limit and limitless. Logically it must exist and yet it can’t exist, because it collapses the structure of logic. Sense and logic are necessarily finite because they need to have structure, boundaries, to work and be measurable – to make sense. Infinity is a chaotic concept because it includes everything – it is therefore boundless and unstructured. Chaos. Oblivion. A paradoxical question mark that hovers around the edges of everything – at the edges, the structured, measurable, finite world we know would appear to flow out (at least theoretically) into senseless infinite white noise – or black oblivion. Infinity, 0. Two sides of the same thing.

Quantum Mechanics: At the quantum level (the mechanics of things smaller than an atom), whilst “shit” might all make good, sound sense in terms of predictions and equations, at a meaningful, humanly graspable level, “shit” is frankly mind-meltingly freaked out, man. In quantum physics it’s not simply abstract theoretical weirdness either; not just odd scratch-head quirks thrown up by our calculations that we can kind of shrug our shoulders at and move on from when we turn to the practical applications – the weirdness is there in the empirical observations – the results of physical experiments involving how particles behave are bizarre. The behaviour of sub-atomic particles is jarringly, unreasonably non-conformist with the rest of physics, like Sid Vicious at a cheese and wine function.

Get this: You want to know whether light is a particle (i.e. light is made of lots of “small bits of stuff” that fire from a light source and hit your eyes) or a wave (i.e. light is like sound – an energy vibration, a motion that ripples through “small bits of stuff” until it gets to the small bits of stuff at your eyes and affects them). So you do an experiment that will tell you. You set up a wall with two tiny parallel slits cut in it, and you fire a laser (a directed beam of light) at it. There is a photo-sensitive screen behind it to see what kind of pattern will be projected after the light has gone through the two slits. Now: If light is made of particles it will be like firing a bunch of tennis balls at the two small slits – those that make it through the two slits will hit the screen behind in two clustered clumps of particles. If light is a wave, however, as the wave hits each slit it will fan out in ripples like a stone dropped in a pond – and because there are two slits, the two rippled waves will interfere with each other, giving you a nice rippled effect of peaks and troughs on the screen behind. So: clustered clumps of particles = light is a particle; rippled peaks and troughs = light is a wave. Simple. Yes?

So what happens? Um... both. You get rippled peaks and troughs... made up of particles. This makes no sense. It’s like asking if your coffee is “fresh ground coffee” or “freeze-dried ‘instant’ coffee” – and being told it is both. At the same time.

It gets worse – a lot worse. You’ve established that light is a particle (even if it is also apparently a wave), so you try the experiment again, slowed down, firing one light particle at a time to see what is going on. Maybe the particles are somehow affecting each other, interfering with each other, to give a wave-like effect. If you just fire one at a time, this should not happen and you will just get the clustered clumps you were expecting. But – even one particle at a time, you still get the wave-like peaks and troughs pattern. So what? Well it means this: for this to happen, each particle must be going through both slits at once in order to interfere with itself (insert own innuendo here). This means one particle can do two things, be in two places, at the same time – and act like two things affecting each other, even though it is one thing. Ughm. So: you set up a sensor on the slits to see which one it is going through. The sensor doesn’t physically touch or affect the particle in any way, it just observes. What do you expect to see? Will you see it go through both slits at once? No, you just see it go through the one. So have we been getting freaked out for no reason? Um, not quite, because now, as soon as you start to observe it, the particle stops acting weird and just acts like a particle – on the screen you now do have the clustered clumps you expected and no wave pattern effect. It really is like, when the particle “knows” it’s being looked at, it stops acting funny so as not to give up its secrets – aha, you won’t catch me out like that! As soon as you take the sensor away, as soon as you stop looking, it does its Boy George role-ambiguity thing again.
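
For the curious, the difference between the two predictions can be sketched numerically. Below is a toy Python model with entirely made-up units (the wavelength, slit separation, screen distance and clump spread are arbitrary choices of mine, picked only to make the patterns visible): in the wave picture the two amplitudes add before you square them, and it's the resulting cross term that makes the ripples; in the tennis-ball picture the two intensities just add, and there are no ripples.

```python
import math

WAVELENGTH = 1.0     # arbitrary illustrative units
SLIT_SEP = 5.0       # distance between the two slits
SCREEN_DIST = 100.0  # distance from the slits to the screen

def wave_intensity(x):
    """Wave picture: amplitudes from each slit add, THEN you square.
    |e^(ik*r1) + e^(ik*r2)|^2 = 2 + 2*cos(k*(r1 - r2)) -> ripples."""
    r1 = math.hypot(SCREEN_DIST, x - SLIT_SEP / 2)  # path from slit 1
    r2 = math.hypot(SCREEN_DIST, x + SLIT_SEP / 2)  # path from slit 2
    phase = 2 * math.pi * (r1 - r2) / WAVELENGTH
    return 2 + 2 * math.cos(phase)

def clump_intensity(x):
    """Particle picture: intensities from each slit add, no cross term.
    Each slit is modelled as a Gaussian spread of 'tennis balls'."""
    spread = 10.0
    return (math.exp(-((x - SLIT_SEP / 2) ** 2) / spread)
            + math.exp(-((x + SLIT_SEP / 2) ** 2) / spread))

# The wave pattern has genuine dark fringes (intensity near zero) between
# bright peaks; the clump pattern never goes dark between the slits.
screen = [wave_intensity(x / 10) for x in range(-200, 201)]
assert max(screen) > 3.9   # bright fringe: up to 4x a single source
assert min(screen) < 0.01  # dark fringe: almost nothing arrives
assert clump_intensity(0.0) > 1.0  # midpoint stays bright for clumps
```

The dark fringes are the giveaway: in the clump picture there is nowhere between the slits where adding a second source makes the screen darker, whereas in the wave picture opening a second slit can cancel light out entirely. That cancellation, one particle at a time, is the "interfering with itself" part.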

The quantum world is very predictable, in that it acts consistently in a way you can predict once you know the new rules – so in that sense it is hard, verifiable science. However, in humanly graspable terms no one has any idea how to explain the how or why of what is going on. Heisenberg’s Uncertainty Principle states that a particle cannot have both an exact position and an exact velocity; the standard picture goes further, describing particles as smeared out in a kind of cloud of possibility, where they are everywhere they could be at once... a suspended state of “superposition” that is nevertheless very real (real enough to cause a physical wave effect on the screen) – that is, until they are observed, at which point they collapse into just one definite point in physical reality. This collapse-upon-observation picture is known as the “Copenhagen Interpretation”. Our observation of these particles essentially forces them to commit to a solid position. The “Many Worlds Interpretation” suggests that at every point of decision, where something like a particle could go one way or the other, the universe splits off into two forks – in one universe it goes one way, in the other it goes the other.

Neither of these ideas fit with old-skool mechanical physics or intuitive logic. In a universe where everything is mechanical and made out of matter, the state and position of everything is determined by hard cause and effect – the toss of a coin is not really random: When I toss a coin the angle of my thumb, the force exerted, air resistance and the angle at which it lands cause it to fall one way or the other – there is a direct line of cause and effect from the flipping of the coin to the side it lands on. If you know everything about the starting conditions and apply the laws of physics you can, in theory, work out where everything is, what state it is in and where it is going at any point in time. Hence in theory you can trace how everything in the universe will play out from the big bang onwards. There is no randomness, no true “decisions” where something could actually go one way or the other – everything is predetermined, just an unbroken cause and effect chain playing itself out (and this includes us and our “decisions”, since we are simply part of the physical, mechanical universe too).

But quantum physics suggests particles appear in one position or another at random when observed; or, if we get rid of the randomness idea and try to preserve cause-and-effect determinism with “many (determined) worlds”, it still suggests that there are true “decision” points where a particle could genuinely go one way or another – and this does not tie up with traditional mechanical logic at all, but that’s the conclusion we’re left with. We just have to swallow our rational pride and accept it – them’s the breaks. There are multiple “interpretations” in quantum mechanics, but none of these give a comprehensive account, none of them are without inconsistencies or problems, and none of them are scientifically provable in the sense that we are sure that’s what is going on – we can only observe the end results.
No, these “interpretations” are simply crude tools we have to use to try to get our human heads around the irreconcilable, paradoxical data that the experiments give us. But that data has been verified again and again – if it appears nonsensical it’s not the experiments that are at fault, it is our limited common-sense rationality. Richard Feynman, one of the star names in quantum mechanics, famously started one of his lectures like this:

“I think I can safely say that nobody understands quantum mechanics... I am going to tell you what nature behaves like. If you will simply admit that maybe she does behave like this, you will find her a delightful, entrancing thing. Do not keep saying to yourself, if you can possibly avoid it, ‘But how can it be like that?’ because you will go ‘down the drain’ into a blind alley from which nobody has yet escaped. Nobody knows how it can be like that.”

For this reason many physicists simply refuse to speculate on how the hell you explain what happens in the quantum world – that’s metaphysics, philosophy, not hard practical physics.

Big Bang Theory: We seem to have accepted that, given gravity, the universe must have started with a big bang rather than having just existed forever. It’s the theory the evidence and calculations seem to fit best, though it still throws up the counter-intuitive idea of time beginning, of space having a boundary. What was before the big bang? Well, the question doesn’t really make sense – though we are now faced with another problematic oblivion (zero) rather than the oblivion of eternity (infinity) that we had before. Before that we had The Creator, which could be seen as an explanatory fudge to stop us having to deal with the problem of eternity. Though that problem is still there in God Himself – God is infinite and eternal. Who created him? Well, either he has always existed, or he popped, confusingly, into existence out of nowhere, or we have an infinite regress of Gods. Same problem, different game.

As for the evidence for the big bang, there are still problems with that – the embarrassing little issue of there not being enough gravity in the universe for everything to be where it’s supposed to be at the moment, given that the big bang happened... leading to the massive speculation of “dark matter”, which we can’t see and don’t know for sure even exists, to compensate – really? “Dark matter” has always sounded suspiciously like a quick-fix deus ex machina in the opera of physics to me. But then, I’m no physicist. Another (related) problem, though, is the failure so far of “big” physics (the laws of motion, gravity, general relativity, big bang ’n’ all) to tie up with quantum mechanics. There’s a massive leap of faith going on that we will discover particles (using, for example, the Large Hadron Collider) that will enable us to reconcile the two into a unified theory of everything. Well, we’d better hope so, or we’re looking at a massive overhaul of everything... or worse, acceptance that “shit” just doesn’t make any sense. But apparently shit must make sense, so that’s OK, we will find those pesky particles. Right?

To properly explain how the big bang happened, such a “theory of everything” – one that incorporates both General Relativity (“big” physics) and quantum mechanical ideas like Heisenberg’s Uncertainty Principle – would be needed. This is what physicists are working on now. Stephen Hawking suggests using Feynman’s “sum over histories” approach (essentially a way to calculate the position and behaviour of a particle by combining the contributions of all of that particle’s possible paths) to work out the paths of particles thrown out by the big bang – and thus be able to accurately calculate the history of the expansion of the universe. Now, I don’t properly understand how this works and won’t pretend to, but together with the “Copenhagen Interpretation” there is apparently the suggestion that, by observing and measuring the universe, we are collapsing the positions of the particles that make it up into just one firm, solid, actual reality. And what that appears to mean is that the universe “out there”, its history even, is not the solid thing we think it is at all, but is being knocked into shape by our observation of it. I have no idea whether to believe this or not, or whether it’s really a valid interpretation, but it demonstrates just how much more bizarre and counter-intuitive, how much less sure and straightforward, the territory that we are led into by firm empirical evidence and logical calculation can be. We passed the stage where we can reasonably expect the application of Enlightenment rationality to deliver an easily understandable, neatly structured clockwork universe a long time ago.
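Without pretending this is the real calculation, the flavour of “sum over histories” can be caricatured numerically (a toy sketch with invented numbers: each “path” is reduced to a single deviation from the classical path, and the action is simply assumed to grow with the square of that deviation). When you add up the complex phase contributions of all the paths, those far from the classical, least-action path oscillate rapidly in phase and largely cancel each other out, so the total is dominated by near-classical paths – which is roughly how a definite-looking history emerges from a sum over every possibility.

```python
import cmath

def path_phase(deviation, stiffness=50.0):
    """Phase contribution exp(iS) of a toy 'path', with the action S
    assumed to grow with squared deviation from the classical path."""
    return cmath.exp(1j * stiffness * deviation ** 2)

step = 0.002
deviations = [n * step for n in range(-1000, 1001)]  # "paths" from -2 to 2

total = sum(path_phase(d) for d in deviations)
near_classical = sum(path_phase(d) for d in deviations if abs(d) <= 0.5)
far = total - near_classical

# Far-from-classical paths largely cancel among themselves, so the
# near-classical paths carry almost all of the summed amplitude.
print(abs(far) < abs(near_classical))  # True
```

This “stationary phase” cancellation is the standard intuition for why summing over wildly non-classical histories still reproduces something close to classical behaviour.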

Despite my snide remarks above, progress is being made with a “theory of everything” and it is very possible that we might, indeed, one day be able to combine quantum mechanics with General Relativity to develop a proper account of how the universe was self-caused. But even if that does happen, that’s the description of the mechanics – we are still left with an utterly strange, completely “other” universe that doesn’t “explain” what it is or what we are – because there is no frame of reference. The more we know about how things work the stranger it all seems – it does not come close to denoting meaning or purpose (nothing could be truly sufficient for this), or solving those intractable paradoxes about “how can it actually be this way? What sense can be made of this?” We just have to accept that that is, apparently, how it is. We are faced with the absurdity of existence, and a world we may be able to understand in a narrow, mechanical sense (in the sense that we can trace its history and predict its future) but cannot fully grasp – in the sense that we cannot get our heads around it to reconcile it with our lives and experience in a meaningful way.

Why The World Will Never Make Sense

To return to the Dawkins quote: “‘Cannot grasp’ does not have to mean ‘forever ungraspable’”. The key phrase I take issue with here is “graspable”, since I’m pretty convinced that there are indeed things that are fundamentally “ungraspable” about our existence. So, if we can theoretically describe the mechanics of the universe fully, what, exactly, is “forever ungraspable”?

Well: mathematical infinity screws up the solid architecture of our numerical counting system. Structure becomes non-structure with no apparent cut-off point; it blurs boundaries between numerical values, it warps and stretches the rational relations between things in a way that can make quantities and measurements impossible to pin down, and it undermines the fundamental idea that 1 is always 1, 2 is always 2, 3 is always 3 and so on. In quantum mechanics, we have the idea of an uncaused event. Random events, or true points of “decision”, are necessarily uncaused events – they necessarily have no explanation, no reason why one thing went one way and not another – we can talk about laws of probability all we want, but that is simply a description of trends after the fact, for predictive purposes, not an explanation in terms of an origin or genesis of that behaviour, of how circumstances could come to be that way. It messes with the fundamental concept of cause and effect that we need to make sense of the universe. Just try to make sense of a mechanic telling you that the reason your car isn’t starting is that there is no reason; it’s just not doing that any more. Nothing is broken, there’s nothing to be fixed, it just happened. You could not accept this; you’d have to presume they simply couldn’t find the fault. Likewise, the idea of one thing existing in multiple places at once messes with our fundamental need for one thing to be one thing, not two or three or multiple things – to identify what is what, for things to have a definite, numerically stable identity in space and time. With big bang theory we come up against limitations in our concepts of space and time – we have to modify what we intuitively mean by these concepts in ways that are paradoxical – that space can have a boundary, that time can have a beginning – and to truly “grasp” what this means we would need to be able to think about what is beyond that boundary – i.e. think outside of space and time.
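The infinity point can be made concrete with a classic illustration (only a sketch, using a finite window on an infinite pattern): pairing every natural number with its double matches the naturals one-to-one with just the even numbers. An infinite set turns out to be “the same size” as a mere part of itself – exactly the kind of thing that breaks our finite intuition that the whole must be bigger than the part.

```python
# Pair each natural number n with the even number 2n. Nothing is ever
# left over on either side, however far you extend the pattern: the
# evens, a proper part of the naturals, are "just as many" as all of them.
window = 10  # a finite window on an infinite correspondence
pairing = {n: 2 * n for n in range(window)}

print(pairing[0], pairing[1], pairing[4])  # 0 2 8
# Every natural in the window gets exactly one even, and every even
# in range is hit exactly once -- a one-to-one correspondence.
print(sorted(pairing.values()) == [2 * n for n in range(window)])  # True
```

With finite collections, throwing half the members away always leaves you with fewer; with infinite ones it need not – which is one concrete way the concept of quantity starts to warp.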

But here is the problem. Concepts such as space, time, quantity and cause and effect are utterly fundamental, basic concepts. The German philosopher Kant (himself a key Enlightenment thinker, a big champion of rationality, and, incidentally – some would say inappropriately – a Christian) called concepts such as these a priori “schemata” or “categories” (we don’t need to get caught up in Kant’s impenetrable jargon here), meaning broadly that they are concepts that must be in place before we can make any sense at all of anything we experience. We cannot think outside of these concepts – they are essential to our being able to order our experience and make judgements on it. For example, just try to think of something outside of space and time, without using any spatial or temporal terms, metaphors or visualisations. In the 20th century, that genius oddball philosopher of language, logic and the structure of thought, Wittgenstein (check out his bonkers career biography for a refreshing CV), pointed out that such things as spatial and temporal terms are inextricably embedded in our language, our grammar – we cannot escape using them – and our language dictates how we think just as much as how we think dictates our language. So trying to conceive – in concrete, visualisable, understandable terms – of what is beyond or outside of our universe (spatial metaphor) or before the big bang (temporal metaphor)... well, we may as well try and mop up the Atlantic Ocean with a pack of Handy Andies. Without such concepts in place no conscious thought, let alone science or rationality, would be possible in the first place – and yet science and rationality are now telling us they are insufficient and we must go beyond them.

In pushing the boundaries of our knowledge and understanding we find ourselves pushing the boundaries of our fundamental concepts. We can deal with these things in a narrow, mechanical way from a narrow, mechanical perspective, so long as we don’t think too much about how shabby our grasp of certain elements is – then we find new possibilities for predictive power and description opening up, because we have been forced to cut loose from our fundamental concepts and admit they are flawed – OK, let’s just accept our basic concept is not quite precisely accurate and run with what we find. But those flaws in our concepts are brushed under the carpet, contained and left behind without being resolved. So when we try to translate our descriptions of mechanics into “graspable” terms, they reappear and we still simply can’t “make sense” of what this all means. We can’t say life, the universe and everything definitely means this or this or this; nor can we properly say that it is meaningless (to say it is meaningless suggests we understand it and have pinned it down enough to conclude such a thing). We can’t validly say anything much about the universe and existence beyond the narrow, mechanical, descriptive perspective at all – let alone relate it to mythical, narrative, religious or pseudo-religious accounts, because that involves re-engaging with our old concepts and relational meanings that we have had to contain and leave behind to get where we were going. This is fundamentally “other” to our everyday way of living, thinking and constructing meaning. The universe will never “make sense” in this way.

This should not be a surprise. As the study of infinity shows, logic is necessarily finite and structured. Any logical system is necessarily ultimately a contained, closed system. But in order to truly grasp the totality of that contained, closed system, you have to view it from outside. And if you can view it from outside, there must be something, somewhere, beyond that contained, closed system to go to. And if you’re going to report back on what you find outside and relate it to the inside, then you have to expand the sphere of that contained, closed system to include what is on the outside. And then you’ve just got another, bigger, contained, closed system that you have to go outside of again to fully grasp in its totality. Does that make sense? Read it again. I don’t pretend that this will make things any clearer, but I should mention that this idea is analogous to something called Gödel’s “Incompleteness Theorem”. Kurt Gödel, a mathematician and logician, demonstrated (in a mathematical proof) that any consistent system of formal logic rich enough to express basic arithmetic is necessarily incomplete. You can always develop propositions (statements) in such a system that are undecidable – impossible to show as true or false, impossible to solve – by sticking to the rules of that system and the tools available within it. You may have a proposition that you know is true by going outside of that system (i.e. you can prove it using another system), but you simply can’t prove it using the rules of the system in question itself. This means that any such logical system can throw up more true propositions than it can actually prove by its own rules. No such system is complete; you cannot grasp the totality of it from within – it has to be transcended to be fully “grasped”.

The problem, it seems obvious, is that our fundamental concepts – such as space, time, cause and effect, number – are insufficient – too narrow and limited, slightly misaligned, flawed, incomplete, inaccurate – to properly explain what is going on. What we are finding is that we are having to stretch these ideas out of all shape, to breaking point, to accommodate our new explanations of things – the way both logic and evidence are showing us that things are. What this means is that to properly “grasp” everything in the sense Dawkins means, we would need to transcend our most basic concepts – concepts that we cannot get outside of because they are fundamental to making sense of anything at all. Such “transcendence” is entering into the realms of Zen Buddhist enlightenment or Acid-guru “doors-of-perception” style “pure consciousness” (and closer to Hegel’s original metaphysical conception of “The Absolute”) – a radical, far-out mystical idea that is utterly, utterly different to what most Absolutist Atheists have in mind. What they mean by “graspable” is that we will one day reduce everything to the understandable, everyday rational logic that we currently have. That is not the same idea at all, and is actually, as it turns out, logically and empirically impossible, since unsolvable paradoxes are thrown up. For everything to become graspable it is not reductionism but transcendence that would be required. And this is something I don’t think common-or-garden Absolutists, or even most common-or-garden Atheists in general, have yet quite... grasped.

I’m with Einstein.