Maël Brunet: Hello Tineke. First of all, can you briefly introduce yourself and say a few words about your work and your current research?
Tineke Egyedi: Yes, I’m a researcher on standardisation and do consulting work. I worked at the Delft University of Technology until March and now have my own business. At this moment I am actually doing standardisation research in the field of geriatric care. I’m looking at how standards affect care for the elderly, whether and in what way standards and regulation support or hinder the work of nurses and other caretakers. I have been doing this for two months now. Meanwhile I’ve been writing articles with colleagues. One of the main subjects we write about is the influence of standardisation on innovation.
Perhaps it’s relevant to say that I’m currently almost entirely focusing on open standards. In the past I have been looking at what people within the standards arena can learn from the open source movement, asking things like: what motivates people to initiate and sustain collaboration, and how do they coordinate their work and manage to contain the number of software versions. These writings focused on open source and coordination mechanisms. I’ve tried to transfer these insights to the open standards arena.
One of the things that might make me a little odd for OFA and people within the IT standardisation community – the researchers – is that I am interested in almost all areas of standardisation. I like to take a general perspective on standards and standardisation. Take my research into the problem of competing standards. I studied the cases of ODF and OOXML. The core issues are much the same in most areas of standardisation. Each area is interesting in itself. This more or less explains why over the years I have, for example, looked into container standardisation and standards for natural gas, why I look at IT standardisation, and why I am now looking into standards in health care.
MB: It’s interesting to look at the parallels between different areas of standardisation. Obviously we at OFE really focus on ICT standardisation, but it’s easy to forget that there’s a whole other world of standards out there.
TE: Yes. An example is the work I’ve done on standards as a way of increasing flexibility and supporting infrastructure evolution. One of the cases was the switch from standard SGML to XML.
MB: Could you remind me what SGML was for?
TE: SGML was something from the pre-internet era. It was widely used in the 1980s by publishing houses for structured information exchange. It was a means to encode information.
MB: So it was essentially XML, before XML existed?
TE: Yes. A group of internet users recognized the power of SGML. As the Internet started to take off, this group wanted to have the same functionality, but online. They wanted to make an SGML “light” version; that was their starting point. XML is an offspring of SGML (as is HTML, by the way). The role of XML, a standardised container for the information infrastructure, is not unlike that of the ISO freight container in transport. These standards increase infrastructure flexibility.
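To make the “standardised container” idea concrete, here is a minimal sketch using Python’s standard-library XML parser. The element names and values are invented for illustration; the point is that any conforming parser can unpack the same document, much as any port can handle an ISO container.

```python
import xml.etree.ElementTree as ET

# A small XML document: the markup is a standardised "container"
# that any conforming parser can open, regardless of vendor.
# (Element names and values here are invented for illustration.)
doc = """
<shipment>
  <container id="MSCU1234567">
    <contents>electronics</contents>
    <weight unit="kg">2400</weight>
  </container>
</shipment>
"""

root = ET.fromstring(doc)
for c in root.findall("container"):
    print(c.get("id"), c.find("contents").text, c.find("weight").text)
```

Because the structure is standardised rather than tied to one vendor’s software, the same document can be read decades later by any conforming tool.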
MB: Right, and you would say it had the same impact on the market in terms of flexibility and enlarging the potential market as the physical containers did?
TE: Yes. XML also had an enormous impact on everything, on the whole ICT infrastructure.
MB: It’s interesting you bring up containers because I was going to ask you a question about the standardisation of container formats. As you may be aware, a couple of months ago the Linux Foundation released a new project called the ‘Open Container Initiative’, which is looking to build on the success of Docker to standardise container formats. I was curious to hear your thoughts – on this particular example or in general – on standardising emerging / evolving technologies and what challenges you might face there.
TE: Sorry, Mael, I have not followed this development. So, I cannot comment on the project. But the issue of standardising emerging technologies is very interesting. These last years I’ve been studying the influence of standards on innovation. I’ve been working with two people in particular. One is a technical expert in the area of telecommunications, Mostafa Sherif. He writes about the timing of standardisation, which is important to think about when you’re talking about innovation. He distinguishes three phases: anticipatory, enabling, and retrospective standardisation. Retrospective standardisation is when you already have products on the market. This is very difficult to do because of vested interests in existing standards. Usually you would want to standardise as soon as possible, that is, standardise emerging technologies (anticipatory standardisation). But, again, here the problem is that they may not yet be stable and are still likely to undergo changes. Their future is uncertain.
“Anticipatory standardisation is usually preferable to retrospective standardisation”
I’m currently working with Roland Ortt, an innovation expert at the Delft University of Technology. We are analyzing the influence of the timing of standardisation on a large number of diverse cases of high-tech innovations. We have added a phase to Mostafa’s three, namely one before anticipatory standardisation: pre-existing standardisation. This addresses the phase before a technology is even invented. It takes into account that we live in a world where standards already exist. Imagine you have a good idea, an invention. This invention will then be influenced by these pre-existing standards.
But back to your question. I think you are talking about anticipatory standardisation, a phase in technology development when you don’t really know what is going to happen. You may not even know if there is a market for it, or if adaptations need to be made. The technology may not yet be fit to solve a certain problem, or you have competing solutions to a certain problem. This is a difficult moment to standardise, but usually preferable to retrospective standardisation.
MB: You said we should, in general, standardise as early as possible. But don’t you think there is also a risk when the market doesn’t even exist? You have to second-guess whether there will even be any market applications for that standard, and that’s not really the job of a standard-setting organization, is it? We’ve seen examples in the past where standardisation bodies go too far in developing standards that aren’t picked up by anyone. So, do you think there is a challenge there?
“Early Internet standardisation was ideal. We needed those standards at that time to act as a platform for further development.”
TE: Yes, absolutely. The question you raise isn’t easy. Standardisation is a way of coordinating among companies where you need a standard for a market to take off. Sometimes you need a common platform. Preferably based on an open standard, but it can be a de facto standard as well. Let’s take the Internet example. Most people talk about Internet protocols as something that was a committee standard from the start. In my view this is debatable. It started out as the outcome of a DARPA project, a product. Only later was it specified as a standard: TCP/IP. It consisted of two protocols, if I remember correctly, because it was adapted to fit the OSI framework, the Open Systems Interconnection framework. This resulted in IP, which was the network protocol, and TCP, which was the transport protocol. But the most important thing is that, talking about platforms, early Internet standardisation was ideal. We needed those standards at that time to act as a platform for further development.
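The TCP-over-IP split mentioned above is still visible in everyday programming. As a minimal sketch in Python (the loopback address and message are arbitrary choices for illustration): the standard socket API names both layers, with AF_INET selecting IP at the network layer and SOCK_STREAM selecting TCP at the transport layer.

```python
import socket
import threading

# AF_INET  -> IP, the network protocol
# SOCK_STREAM -> TCP, the transport protocol
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def serve():
    # Accept one connection, send a message, and close.
    conn, _ = srv.accept()
    conn.sendall(b"hello over TCP/IP")
    conn.close()

t = threading.Thread(target=serve)
t.start()

# Client side: connect over the same standardised protocol stack.
cli = socket.create_connection(("127.0.0.1", port))
msg = cli.recv(1024).decode()
cli.close()
t.join()
srv.close()
print(msg)
```

Any two machines implementing the same open protocols can interoperate this way, which is exactly the platform effect described above.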
MB: Right, and development couldn’t have taken place without standardisation as the first step in the process?
“If the ODF standard had been embraced more actively by public IT procurers and governments 10 years ago it would have served as a catalyst for E-government developments.”
TE: Yes, either that or it would have taken much longer. That’s the problem with competing standards. I think that if the ODF standard had been embraced more actively by public IT procurers and governments 10 years ago it would have served as a catalyst for E-government developments. It is still inexplicable that a competing standard, Microsoft’s OOXML, was formally ratified by ISO. That created a burden, which had to be dragged along by government procurers all this time. It was such a waste of money, energy and intellectual effort. And ISO’s OOXML has never even been implemented, not even by Microsoft, as I understand it. I really think that’s going to be a classic example of what can go wrong in standardisation.
MB: So let’s zoom in on this idea of competing standards. It’s a complex issue because for large organisations – whether governments or companies – they also have a large user base to accommodate somehow. So it’s very difficult for them even if they decide to switch to one specific standard. We saw this recently, the UK government recently switched to ODF and they’re struggling with the implementation of this policy. So, do you have any advice or insight for large organizations on how to best implement the choice of a single standard?
TE: One of the things that has become clear about how many governments look at the benefits and costs of implementing ODF or open standards in general, is that they fail to calculate the exit costs. There’s something short-sighted about that, but at the same time very understandable.
“There’s something short-sighted about failing to calculate exit costs.”
I myself don’t like changes in IT, and I think I belong to the 99% of the people who don’t. I’m only a user. Most of us are. I dislike changes, even at the level of an electronic form or website interface. I also dislike having to buy new versions of software. It seems like my problem is exactly the same as that of the UK government, but on a micro scale. What I would need to switch systems as a micro user is that the change is unavoidable and massive. I won’t switch because my provider says “here’s a new very useful interface with all these new features”, which I don’t want. I switch only if I have to. At least a forced switch to an open standard like ISO’s ODF would allow me and my government to access and read documents in 50 years’ time. I would rather do it clean and painful, but for the right reason. At least there would also be a clear public benefit in my switching.
MB: That’s interesting, because I’ve also heard the opposite said, that the way to implement change is to make it as seamless as possible. To make it so the user is not even aware of the change. You’re saying the opposite, to make it a clean break?
TE: I’m not sure the seamless approach is possible. Think of the difficulty of switching from IPv4 to IPv6. It has been a very lengthy process. In the case of the UK, a critical mass is needed. The switch has to be unavoidable. I shouldn’t be able to save my documents in two file formats.
MB: And I guess you need to train the users, you need to impose the change, but also explain it to them.
TE: Yes, it could help if people understood the urgency of being able to access documents in five years’ time and in the far future. You have all these legal documents such as house ownership or older rights of passage across more recently privatized land. So much has been documented already. Another example, mentioned by Bjorn Lundell, another OFA fellow, is where and how we stash nuclear waste. We put it away for hundreds of years in mines. The documentation containing the details of this needs to be kept accessible throughout human existence. Perhaps this is an extreme example, but it makes the urgency of data longevity and accessibility very clear. There’s also something else which is important. I think we could save a lot of money if we made the switch to open standards. The Dutch government is wasting much money on developing closed software systems, and on not sharing software that has been developed in different parts of government. There is much frustration about the high costs and wasted expenses of government IT projects. And much goes unchecked. In this respect the IT sector is still in the stage of puberty; it has not matured yet. Open standards are an important step towards maturity.
MB: Ok, well before we close I just wanted to make sure to ask you some questions about some of the European policy around standardisation and hear your thoughts. So, in the Digital Single Market Strategy that the Commission published in May they outlined a few ideas, one of them is to update the European Interoperability Framework to a newer version, and potentially make this mandatory. How they would make this mandatory is up in the air right now, because at the moment the EIF is still a fairly high level document. Also the Commission is planning on using the Multi-Stakeholder Platform for ICT standardisation to help develop a European catalogue of standards.
“European policy should also carefully consider the role of bottom-up approaches to coordination.”
TE: Such a common catalogue would be very helpful, and will especially help European countries that do not yet have such a policy in place. And harmonization would certainly help increase information exchange between member states, as well as the growth of interconnected public IT services. For this, a common policy would seem essential. But let me also add that such a policy should carefully consider the role of bottom-up approaches to coordination. Also at the European level. Let me explain. Together with some colleagues, I’ve been writing about bottom-up developed, user-driven, self-organizing infrastructures. We have coined the term ‘inverse infrastructures’ for these. This is also relevant for an e-gov standardisation infrastructure. Certain citizen- and local government-driven bottom-up initiatives coincide with e-gov goals. So, more thought should be spent by European policy makers on possible complementary roles of bottom-up and top-down approaches.
MB: Well I think we can end it here, you’ve given us a lot of great information. Thank you so much Tineke for taking the time to do this interview.
TE: You’re very welcome.