
Monday, 16 March 2015

The Past is a Foreign Country

'The past is a foreign country; they do things differently there.'

The above is a quote I've been thinking a lot about today and yesterday, courtesy of Robert Higham's "Making Anglo-Saxon Devon". It's the opening sentence of L.P. Hartley's 1953 novel "The Go-Between", which I have not read, and the Wikipedia article doesn't leave me feeling knowledgeable enough to talk about it.


However, it has become an almost commonplace proverb. A historian need only refer to "that foreign country" or even, as Higham did, that "other country", and we all know what is meant. Perhaps the modern slippage from "foreign" to "other" or "another" is an internal piece of code-switching: equating "foreign" with "different" does not sit quite right in the modern mind. Or at least, it doesn't sit quite right in my mind. The quote, then, is shorthand for a particular kind of historical maxim: that the past is, to some extent, unknowable; that our assumptions are faulty; that we are distant and distinct from our past. At first glance, this seems sensible and congruous, but I'm going to discuss the flaws in how we apply it to our thinking.

One of the central tenets of postmodernism, and of modern social movements' theories such as intersectionality, is that other people's experiences are unknown to us: that our background, beliefs and experiences inform our view of the world to the extent that they distort it. This is why, for instance, a white person should believe a black person about whether something is racist, rather than trusting their own perception of it. Equally, one should believe a woman over a man as to whether something is sexist, and a disabled person over an able-bodied one as to whether something is ableist. If you've not encountered this line of thinking before, I advise you to sit with it a while before rejecting it out of hand. In the Western world (outside of religious cults or abusive relationships) we're taught to trust the evidence of our own experience as the ultimate arbiter of truth. People often, wrongly, claim that this alone is the whole of the scientific method.

This problem of personal experience is only compounded when we move to the past. Of course, we accept that we can't recover the experiences of those who lived under Rome, but that doesn't seem to have stopped us from trying, or from invoking those faulty assumptions. Or we sweep them under the rug and say: well, surely if the Gauls or Britons had hated the Romans that much, we'd have heard about it from the written sources! Casually forgetting, of course, that any Gauls or Britons Tacitus is likely to have met would have been wealthy men who had adopted certain trappings of Roman culture (whether through Romanization or code-switching), and who benefited from the Empire. Yes, I'm certain they would have been an excellent source of information about Gallic or British mistreatment under Rome.

My reference here to code-switching or Romanization is emblematic of the entire post-modern problem. Romanization, first really expounded by Haverfield in 1912, is considered something of a theoretical monolith. It describes a phenomenon whereby the Britons (or anyone else) simply looked at pure "Roman" culture and said, "Aha! Those square buildings are clearly better than our round ones; wine is infinitely superior to beer; and who needs p-Celtic when we could be speaking Latin?", and thus instantly switched over to a "Roman" way of being, doing and thinking. The fact that this is not actually what Haverfield said has not really discouraged the development of the top-down, monolithic model. Considering it now, we can all see it's pretty ridiculous. However, it fitted well with the early 20th-century narrative of empire. Sure, we'd violently subjugated large swathes of the planet, but this was our due! Besides, we had spread the light of English morals and habits and ideas (themselves strongly informed by the Classical) throughout the world, to all the savage peoples who had hitherto lacked them. It fed into the idea that there was an objectively better culture (i.e. Western or Roman) which all human beings could perceive and would desire to emulate.

The superiority of British (Western, Classical, whatever) culture has come under serious criticism in recent scholarship, say the last twenty or thirty years. Whereas in 1990 Millett wrote "The Romanization of Britain" without ever really deconstructing the term, by the late 90s it was unfashionable to use it, and people cast around for a better descriptor. "Creolisation" was considered (see especially Webster's work), as were "Globalisation" (Hingley) and "Discrepant Identities" (Mattingly). Finally, along came code-switching (see especially Wallace-Hadrill). Code-switching is a term originally from linguistics; Wikipedia describes it as "the practice of alternating between two or more languages or varieties of language in conversation". Probably the best example in pop culture is in Dexter: some of the Hispanic characters speak to each other in Spanish (with subtitles) and speak to the non-Spanish speakers in English. Because, of course, they are bilingual, and have no need to speak English if there are no English monoglots present. Conspicuously, this does not describe someone changing over all aspects of their culture. Instead, they learn the cultural (or linguistic) signifiers that allow them to communicate, but not necessarily to assimilate. Under this model, the Britons did not stop speaking p-Celtic; they simply learned to speak Latin as well. This gives the ancient people back some of their agency. They no longer lose their culture; they simply learn to navigate a new one.

Of course, this development did not happen in a vacuum. It happened in the midst of post-modernism (which was, among other things, about rejecting monolithic, simplifying models), and in the midst of post-colonialism, which sought to fight the damaging narrative that western culture was inherently superior to everyone else's.


So... what's the point? If Romanization is a reflection of empire and code-switching a version of post-colonialism, is one necessarily better than the other? How important is the distinction?

On the one hand, we might say it isn't important at all. The fundamentals of the phenomena we observe (increasing use of inscriptions, almost universally in Latin; increasing prevalence of square buildings over round, and so on) are basically the same. All that's changed is the gloss we put on them. On the other hand, we might say that a fundamental change has been made; that how we frame the problem is the very problem itself; that to make a distinction between, say, "pure" Roman culture and "pure" native culture is to see it in the data, whether a "real" distinction is there or not. How are we to reconcile this problem? How foreign, or other, is the past, really, and how much is it that we ourselves have done the othering? Further: how much have we convinced ourselves that the past is sterile and remote, only to construct it in our own image? This is the cognitive dissonance at the heart of "the post-modern problem", and it's not one I have the answer to.

Monday, 18 February 2013

History isn't a Science

So, I went to a Dawkins thing in Oxford last Friday, and one thing he did really stuck with me. Not in a good way, either, so I thought I'd share it with you guys, and see if anyone on the internet had any similar views.

He was doing a "conversation" event (an informal discussion rather than a debate) with Stephen Law, a philosopher whose blog is well worth a read. At one point, Dawkins said [paraphrased], "I'm not really sure what philosophers do. I can't conceive of a problem that cannot be solved with science, but can be solved with just the mind." Law replied with quite a standard philosophical example: "If my mother's cousin's son is also my auntie's nephew's brother, can he also be my grandma's niece's son?" Or something like that. The point is, you solve this question using reasoning alone (and maybe a pen and paper), from your armchair. No measurements, observations, or "science" required. That, in a nutshell, is philosophy.

"Aha," said Dawkins, "but that's science!"

And Law said, "Well, no, I didn't do an experiment. That's philosophy."

And Dawkins said, "Well, no, that's the scientific method: looking at the facts, reasoning things out in a systematic way."

This innocuous little exchange led me to put my finger on something that'd been bothering me for some time. Scientists are right: the scientific method is incredibly important, and based on logic and reason. But it's more than just logic and reason; it's a whole system of hypotheses and null hypotheses, predictions and tests. Those are what make science work, right? A theory has to be testable. Not necessarily in a laboratory: to take a common example, evolution cannot really be "shown" in a lab (well, it can, but whatever), but the theory makes predictions, such as the ages of certain fossils. When these are found to be right, we accept the theory (for now, while making more predictions and testing those). If those predictions were wrong, then the theory would be falsified (disproved) and we'd have to build another one.

That is the scientific method.

However, not all logical pursuits require this kind of stringent testing. Dawkins himself in that talk admitted that the interpretation of, say, Romeo and Juliet did not require "the scientific method", but I'm interested in the subjects that fall down the gap between obviously scientific and obviously artistic: specifically history and archaeology.

In history, for example, we base our work on evidence. We look at the available material, usually written down by someone a long time ago, and we ask it questions: "Is that likely?"; "Could the author have really known that?"; "Did someone else say anything different?". We compare sources, and (using a theoretical framework, made explicit in lots of history texts) we assemble a picture of what happened, when and why.

In archaeology, we often go further and use explicitly scientific techniques. Just look at last week's post explaining stable isotope analysis. No one would doubt the techniques and methods employed by archaeologists owe a great deal to science.

However (and this is the really important however), these subjects are not sciences. They are not using the scientific method. They are applying reason and logic in a structured way, but they do not have the same aims as science. Science is answering fundamentally scientific questions: "What is this made of?"; "How does that process work?"; "What are the laws governing this system?"

Archaeology and history are answering fundamentally humanistic (using the traditional philosophical definition, meaning interested in humans and the human experience) questions: "Who was this person?"; "What did this community eat?"; "What was the effect of that belief on this government?" 

These two often overlap, but it's important to recognise that they are not the same.

So, when a scientist like Dawkins (or Peter Atkins, or whoever) says that, just by using reason and logic, we are using the scientific method, they claim all such subjects as sciences. And that damages those subjects. In the last half-century, history and archaeology underwent major theoretical changes, which focused a great deal on the scientific method and whether or not it was what the discipline needed. For archaeology, it prompted processualism (basically the belief that archaeology was a science), and the belief that there really were definite answers that we only lacked the methodology to find. Archaeologists therefore tried long and hard to find and articulate the underlying natural laws that govern archaeology. You know what they ended up with? Things like "If someone has been knapping flint, we can expect to find small shards of flint in the soil, unless the area was cleaned really well". These have been derogatorily called "Mickey Mouse" laws, and it's not hard to see why.

Unsurprisingly, processualism was a bit of a failure (though it did help modernise the discipline, and certainly led to some great research). We didn't find the universal truths we were looking for, and people began to be disillusioned by the whole scientific approach.

This disillusionment happened around the same time (prompted, I assume, by current events: thanks, Cold War!) across the humanities and social sciences. You know what it led to? Postmodernism, which in history and archaeology manifests as a denial of any kind of historical truth. Nothing is true; everything is an opinion. And if everything is subjective, why try to write a true account of anything?

Again, postmodernism forced us to look at our prejudices and to try to make our theoretical frameworks explicit, and those are good things, but it's also been reasonably damaging to our fields. Some parts of archaeology and history have lost whole decades to postmodernism (phenomenology, anyone?), decades we could have been using to do some actual research, instead of worrying that science didn't work and that there was no such thing as truth.

So back off, Dawkins! And back off, scientists. The scientific method is more than just reason and logic, and we can use reason and logic without being a science. Stop claiming otherwise, or you condemn the entire humanities to interpretive nonsense.