The language of social sciences has a problem with pitching drama at the right level. We reserve the driest of terminology – ‘positivism’, ‘constructivism’, ‘relativism’ – for describing some pretty fundamental human processes of making sense of the world. We then describe differences between these labels as ‘struggles’, ‘competitions’, or even ‘conflicts’, all of which suggests two large, well-opposed sides confident in their preparations and even more confident in their beliefs. It does not suggest an image of me surrounded by dirtied coffee cups and PhD data printouts upon which I have scrawled ‘I want to believe something but I don’t know how’.
The point I want to make today is this: that which the social sciences call ‘methodology’ is, at heart, tied to some deeply personal factors. If you’ve never heard of methodology, it’s the intellectual process of justifying why you did certain things during research. If you have heard of methodology, you’ve probably been taught about it as a communal process, with a researcher becoming encultured into one (or, more rarely, more than one) of multiple schools of thought. All schools agree that trying to understand humans and society is quite a complicated task – I’m sure there are people who think that humans and society can be explained using straightforward rules, but I will suggest that these people probably haven’t done much social science (or, indeed, interacted with many humans). But one of the key issues underlying differences between schools is the following question: how do researchers stand in relation to ‘the real world’? To illustrate that, have a picture I did at a Drink & Draw meet:
(Meanwhile the person to my left was doing an ink portrait of – I eventually realised – me, and the person to my right was producing a vivid sketch of the Yorkshire Moor. I’m not a regular.)
As an example of the above, take the question ‘is country A happier than country B?’. It’s not a straightforward one, but one way to approach it might be to compare suicide rates. That way, you can (partly) represent the complicated and personal phenomenon of unhappiness as a series of numbers which everyone can agree on. Not so, say others. Even the seemingly objective ‘rate’ is actually the product of various social processes. Different countries/coroners may record suicide rates differently. Ok, say the positivists, but in that case how do you give any answer to the original question beyond ‘unhappiness is complicated and personal’? More importantly, how do you give any answer which has any form of reliability (assurance that different parties will agree on the same answer) or validity (assurance that the answer does in some way match up to the world)? Ultimately, how do you ensure different researchers can’t just give entirely different answers to the same question, thus reducing so-called ‘answers’ to personal opinion?
That’s a *heavily* summarised social science 101. My argument here is that these questions go beyond the intellectual and into the rather personal. To again raise the point of language, we tend to save the real drama for the insults each school aims at the others – ‘naïve positivism’, ‘anecdotalism’, and suchlike. Based on my position on that spectrum, the slur that wounds me is anecdotalism – the claim that if you’re going to allow researchers to just give idiosyncratic, rather than standardised, views then all findings are basically just personal stories. (This criticism often appears as the phrase ‘the plural of anecdote is not data’.) I feel this is an argument which is strong on dismissive language and weak on clarity. If I were to tell you the anecdote ‘hey so I was in my lab the other day, and I was doing this stuff, and this thing happened which I thought was interesting so I did the same stuff a few more times and, y’know what, the same darn thing kept happening’ then I’m basically reporting an experiment in a rather informal way. Less facetiously, all research comes ultimately from people encountering stuff and relating it to other people, and the label of ‘anecdote’ does no work in distinguishing how different fields actually do research in practice. Instead, it’s used to dismiss a stereotyped or idealised view of some research that’s different to yours.
On the flipside of this quite negative view, ‘ideals’ also play a positive role in giving things to aim at; for example, many positivists might accept that 100% validity and reliability are unachievable but nonetheless worth trying for. But a problem I have as a constructivist is that it’s not easy to reject ideas like ‘validity’ or ‘reliability’ when it’s hard to find words which bolster an opposite position – apart from ‘invalid’ or ‘unreliable’, and those labels don’t do wonders for one’s confidence. And confidence is important in all this. That earlier-diagrammed question can be more brutally phrased as ‘what’s the point of the researcher?’. What’s the point of all our training, lengthy and sometimes painful and often publicly funded? For those towards the positivist end, this teaches a researcher how to recognise and extract relevant information from the world and to submit these in the ‘correct’ way for peer scrutiny – in other words, they learn how to show validity and reliability. But if you don’t believe in validity or reliability or some ultimate ‘correct’ answer, then what are you learning to do? Are you learning pragmatic skills of how to find and report interesting information, thus making academia a quite extreme and laborious version of long-form journalism? Or are you learning how to present your individual take on the world in a way which makes colleagues go ‘that’s interesting, ergo you are interesting’ – making research a form of what is often called ‘modern art’, where a small group stand around exclaiming ‘what a clever work, let me now explain why it’s clever in a way which shows you that I too am clever’, while a larger group claims ‘my five-year-old could have done that’, and a much larger group simply neither knows nor cares about it at all?
I’m still not confident about the answer to that. But look closely at the above paragraph – I’m talking about researchers, long-form journalists, and modern artists in exactly the same idealising and dismissive way as I’ve argued positivists talk about constructivists. I’m actually not intending to dismiss long-form journalists; my point is that they do their work without the hefty training seemingly required by academics, so why can’t academics? I am being dismissive of the modern art community, while being conscious that my description is of a stereotype rather than first-hand experience (and also a stereotype that I am actually worryingly adept at slipping into). The point is this. My idealised image of the long-form journalist is of someone who presents their reader with a clear, convincing, and detailed picture of a world, while at no point suggesting this is the only or best picture of that world. This is a description I would be happy to align myself with. By contrast, my idealised image of the positivist researcher is of someone who wants everyone to see the world (or a portion of it) in the same way as them. I feel this to be a bit simplistic and (potentially) morally problematic. And my idealised image of the modern art practitioner is of someone more interested in promoting their own personal brand than in actually welcoming input from the wider world, or at any point asking ‘is this possibly just a bit stupid?’. And all that goes a long way to explaining where I place myself on the above spectrum. That’s not to say I wish positivists and modern artists didn’t exist, or that I automatically dislike their contributions, or that I don’t want to talk to them. I just don’t want to be one of them.
None of that will appear in my PhD methodology chapter. None of it would be accepted intellectually. I think that’s a shame. Because, setting aside differences between methodological schools, my ideal of academia is of a community that understands the implications of language, avoids uninformed dismissal, and welcomes full honesty and candour. And whether or not you agree with that, any academic’s methodology has to proceed from the belief (even hope) that some world, or some portion of it, can be understood somehow. Otherwise we’re a bit pointless. And that is deeply personal.
(The title, fyi, is a reference to an old metaphor about methodologies being ‘lenses’ one can take on and off to see differently. I suggest it’s not as easy as that.)
This example is based on my (secondhand) knowledge of debates around Émile Durkheim’s book *Suicide*. There’s also an example about vote counting involving some social decisions about what counts as a spoiled ballot, but I’m struggling to find the ref for that. Send help.
 Somewhat ironically I feel, or at least rather annoyingly – see my previous thoughts on clear language
Or alternatively, in a 17th-century way, when the experimental method was becoming a Big Thing. A lab scientist might respond with the point that experimental results are corroborated by replication from other parties, so they’re not so much personal anecdotes as shared stories. Fine in theory, but in practice that doesn’t really happen.
 I’m sure the same is probably true of ‘naïve positivism’, for the record.
 Apart from, y’know, providing universities with a pool of cheap teaching labour and seminar organisers. (This is, somewhat ironically given the main body of this post, a very dismissive way of describing quite a multifaceted problem).
I.e. in the stereotypical sense of art that comes across as weird and pretentious and of questionable aesthetic value. Apologies to any real, rather than stereotyped, modern artists. One could also think of niche music groups, fashionistas, or any of the people brilliantly, if often disturbingly, satirised in Charlie Brooker and Chris Morris’ *Nathan Barley*.
 Again, I refer you to Brooker and Morris (ibid.)