Distinction matters: the status of Research Software Engineers
8 Sep 2021 - 16 min
From the keyboard of CEO JORIS VAN EIJNATTEN, Bits & Bytes is a thought leadership series which explores relevant or intriguing topics in the world of digital research. From software and digital humanities to current trends in academia and more, join us as Joris explores — and explains. Feedback or something you’d like to see addressed? Start a conversation by emailing us.
There is a growing awareness that research involves teamwork. Scientists and scholars work closely together with specialists in all things digital. Where does that leave the people who build software? It’s a complicated issue, involving research support staff dedicated to software as well as researchers who are themselves software experts. Read on.
I write opinionated blogs. In one of them, published not so long ago under the title “the people who build software in the land of dataspeak”, I dealt with the academic status of research software engineers or RSEs. I identified them as full-fledged academic researchers, implying that they were, and should be, distinguishable from people in research support. Thankfully, some people took the trouble to comment on my blogs. Here’s one very thoughtful response in the Open Working blog. I’m most obliged: an opinionated blog that elicits no objections is dead on delivery.
Among the concerns expressed by my critics, one surfaced in particular. I was told in tweets and posts and emails that there was no need for me to set RSEs apart from data stewards. Roles in research are fluid, since data and software are all mixed up. It was also pointed out that we are moving into a new era, in which people will be working in research teams. RSEs will be members of such a team, and so will data stewards. There’s overlap, flexibility, cross-dissemination, complementarity and yes, inclusion all round.
These observations make sense, and I am grateful to my critics for having underlined the concrete dynamics of research environments. At the same time, it seems that they failed to grasp the gist of my message. The fault must be all mine. In what follows I will repair my shortcomings. I hope to explain better the relations between research teams, researchers and research software engineers.
Introducing the software steward
Granted, data and software are often intertwined. Of course, this doesn’t mean that data can be treated in the same way as software; the two may be related, but in theory and practice they are still different things. One central point in my earlier blog was that current policy in the Netherlands tends to ritualize dataspeak to rather irritating extremes while remaining sadly lacking in an understanding of software. But granted, in some cases a clear-cut distinction between data and software may not make much sense.
In addition, it’s possible (theoretically speaking) that some people are knowledgeable in both areas. More importantly, people who understand data and people who understand software need to work together closely in a research environment where distinctions tend to be fluid. Data requires software and software requires data. But if fluidity is endemic, why use the label data steward? It seems like a misnomer. Isn’t ‘digital steward’ more to the point?
Imagine proclaiming at your next friends-and-family, virus-free, non-distanced get-together that a digital steward joined your team the other day. Most people will probably think you have bought a mobile humanoid to cater to your digital needs. “Makes life fun, makes you happy!”, as the Japanese robot said. I’m sure the future will bring us machines of this kind, but for the time being the label digital steward may not be the best choice. So, allow me to introduce the title ‘software steward’, by analogy with data steward.
Software stewards would be aware of the fundamental differences between data and software. They would appreciate that simply applying Findable, Accessible, Interoperable, Reusable (FAIR) data principles to software doesn’t work. They would know what goes into a software management plan. They would understand software documentation and the need for being open and explicit about such things as dependencies, library use and hardware requirements. They would recognize the importance of good licensing, archiving and citation practices. They would be aware of proper workflows and version control. They would know how to code.
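To make that list of competencies a little more concrete: being “open and explicit about such things as dependencies” often starts with recording the exact environment a piece of research software ran in. The sketch below is purely illustrative — the script, the function name `environment_report`, and the package names are my own assumptions, not anything proposed in this post — and uses only the Python standard library:

```python
import platform
from importlib import metadata


def environment_report(packages):
    """Return a small provenance record: interpreter version, operating
    system, and the installed versions of the listed dependencies."""
    report = {
        "python": platform.python_version(),
        "platform": platform.platform(),
        "dependencies": {},
    }
    for name in packages:
        try:
            # Look up the installed distribution's version, if any.
            report["dependencies"][name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            report["dependencies"][name] = "not installed"
    return report


if __name__ == "__main__":
    # Example: record the versions of two (hypothetical) dependencies.
    print(environment_report(["pip", "numpy"]))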
These are skills that need to be taught and learnt. They are also too specialized to be combined in a single person, unless we’re investing in R2D2s or better still, an actual Marvin the Paranoid Android. We need people who are knowledgeable about data, and people who are conversant with software. We need different kinds of specialists working in an environment that is fluid, digitally speaking. So far, hopefully, so good.
The informed reader will recognize that ‘software steward’ potentially overlaps with something often called a ‘research software engineer’. This is where the debate gets knotty. In my previous blog I argued that the label RSE isn’t going away and that people living in small countries outside the Anglophone world will need to relate to it (larger language groups enjoy the luxury of having neither the need nor the inclination to relate linguistically to English terminology). The point of my earlier blog was that although the label research software engineer is problematic, there are ways of making it work.
Why is it problematic? Because the label RSE veils distinctions. The distinction I am specifically referring to here is that between people who are fully qualified researchers (‘scientists’, if you like) and people who are not. That does not make the latter group less significant or less valuable; I am simply pointing out that they are not researchers. What is it that makes one a ‘fully qualified researcher’? Just take a look at the formal research task description for someone in the lowest ranks of the academic pecking order. An ‘assistant professor’ at a university or a researcher at a research institute must ‘perform academic research’. What are the requirements for such tenured positions? Having a PhD and a scientific CV, or at the very least something reflecting advanced analytical skills, acceptance by peers and substantial research experience.
These are not requirements data stewards or software stewards need to meet. Research support is simply not the same as research. The current enthusiasm about working together in research teams isn’t going to change that. Society needs independent and creative minds who innovate on a high theoretical level. They are called researchers. Governments spend billions and billions of taxpayer euros to keep them occupied.
Saying that distinction matters is not meant as a sign of disrespect. Nor does it imply that equality, diversity, inclusion and all the other current buzzwords don’t matter. But ensuring fair treatment and equal opportunities for all does not and cannot amount to abolishing distinctions. There’s honour and value in all kinds of work. There’s honour and value in bricklaying. There’s honour and value in designing houses. There’s honour and value in supporting research and there’s honour and value in doing research.
Yes, research environments are fluid, but they are so only up to a point. At some point, research software engineering becomes a high-level academic specialism reserved for those independent and creative minds required by society (and on whom governments ought to spend more taxpayer euros). That is why I argued that ‘RSEs are university-level specialists who do research into digital technologies and methodologies. They are the immediate equivalents of postdocs, assistant and associate professors, and sometimes top-level technicians who populate most universities.’
And, as one tweet happily suggested, the equivalents also of full professors. Let that addition be noted.
On instrument makers and methodologists
So subsuming different kinds of specialists under the same label – that of ‘RSE’ – doesn’t solve much. Focusing only on the research support side makes matters worse. This is what bugs me about a recent report on data stewards, which also discusses RSEs. It was ‘formally accepted’ by a committee, but I’m not quite certain what that means. In any case, what the report has to say about RSEs is at best a precipitous concatenation of half-truths. Those involved in policy already have a hard time understanding why software is such a big deal. Lumping everyone together as equals in a fluid research environment will only make it more difficult to convince the powers that be that research software engineering in many cases is, and should be, a top-level academic profession.
Why would that be the case, you may ask? Well, compare RSEs to astronomers. Nobody at your upcoming uninhibited corona-free Dutch bitterballen party will be surprised when you tell them that astronomers need instruments. Of course they do! Astronomers require telescopes and stuff. The better informed among your home audience will probably start frowning when you declare that scholars of seventeenth-century literature need instruments too. Of course they do! Scholars of seventeenth-century literature require software and stuff.
Surely, you may continue, there’s a difference between, on the one hand, a fancy array of very large dish antennae and, on the other, a state-of-the-art tool that seeks to comprehend the most complicated artefacts the human mind has ever produced? Nope. There’s no difference. They’re both instruments.
Admittedly, there may be a slight difference in price. Dishes tend to be costlier than code. There is certainly a difference in complexity. Believe me, even the most intricate behaviour displayed by plasma surrounding a binary black hole is disarmingly simple in comparison to the convoluted metaphors and ironies of an early-modern poem. But for the rest, telescopes and software are the same. They are feats of engineering based on a highly abstract understanding of mechanism and methodology rolled into one.
To devise a new approach in telescopy requires an advanced conceptual understanding of the matter at hand. And that, in turn, involves training at the highest level. Leiden University advertises with a ‘High profile astronomical instrumentation degree’, TUDelft with a ‘Physics for Instrumentation’ track. This is serious shit; we’re not talking about a refresher course on Monday afternoon. The same applies to research software, which, incidentally, is today’s primary form of instrumentation. Any fundamental new approach in digital tooling demands an advanced conceptual understanding of the matter at hand. Academic training in applied computer science may be less well established than observational astrophysics, but that doesn’t mean it’s less significant.
My favourite examples of institutionalized instrument making are the methodology and statistics departments in social sciences faculties.[i] Here one finds methodological experts who both dedicate their careers to conceptualizing instrumentation and help their colleagues make use of tooling in their research. Like telescopists, social science methodologists are a kind of research engineer, combining theory and practice. Social sciences faculties actually pay full professors good tax money to do this kind of work.
Recognition and rewards
It makes no sense to deny distinctions that lie at the very heart of academic research. But things are always more complicated than they seem. Instrumentation is an area that involves both research support and academic research. A telescope, once designed, needs to be built and then maintained. Software, once conceptualized, needs to be developed and then maintained. Instrumentation requires different people to fulfil its purpose, both researchers and research supporters.
The problem, again, is that the title RSE covers both groups, which just isn’t very helpful. Why not simply distinguish between research software engineers and software (and data) stewards? I readily concede that this is an argument seventeenth-century writers, who read Latin, would have called an oratio pro domo. The people who build software at the Netherlands eScience Center work on an advanced academic level. They’re scientists. It is crucial, not just for them but for their many colleagues in academia, that what they are and what they do is recognized and rewarded.
Currently, a new vision of recognizing and rewarding academics is taking hold, with ‘room for everyone’s talent’ and a specific focus on research teams. That’s a good thing, but not, it seems, without its drawbacks and downsides. It certainly shouldn’t imply that now, all of a sudden, everyone connected to a research team is a scientist or a scholar. That would signal the end of serious research, including discipline-embedded research into instrumentation. Distinction matters.
[i] For those not familiar with these exotic places: take a look at Maastricht University or Utrecht University. Related and less traditional are instances at Erasmus University Rotterdam, Leiden University Medical Center and Wageningen University & Research. And there are many more.