Companies and information: The leaky corporation | The Economist


The leaky corporation

IN EARLY February Hewlett-Packard showed off its new tablet computer, which it hopes will be a rival to Apple’s iPad. The event was less exciting than it might have been, thanks to the leaking of the design in mid-January. Other technology companies have suffered similar embarrassments lately. Dell’s timetable for bringing tablets to market appeared on a tech-news website. A schedule for new products from NVIDIA, which makes graphics chips, also seeped out.

Geeks aren’t the only ones who can’t keep a secret. In January it emerged that Renault had suspended three senior executives, allegedly for passing on blueprints for electric cars (which the executives deny). An American radio show has claimed to have found the recipe for Coca-Cola’s secret ingredient in an old newspaper photograph. Facebook’s corporate privacy settings went awry when some of the social network’s finances were published. A strategy document from AOL came to light, revealing that the internet and media firm’s journalists were expected to write five to ten articles a day.

Meanwhile, Julian Assange has been doing his best to make bankers sweat. In November the founder of WikiLeaks promised a “megaleak” early in 2011. He was said to be in possession of a hard drive from the laptop of a former executive of an unnamed American bank, containing documents even more toxic than the copiously leaked diplomatic cables from the State Department. They would reveal an “ecosystem of corruption” and “take down a bank or two”.

“I think it’s great,” Mr Assange said in a television interview in January. “We have all these banks squirming, thinking maybe it’s them.” At Bank of America (BofA), widely thought to be the bank in question, an internal investigation began. Had any laptop gone missing? What could be on its hard drive? And how should BofA react if, say, compromising e-mails were leaked?

The bank’s bosses and investigators can relax a bit. Recent reports say that Mr Assange has acknowledged in private that the material may be less revealing than he had suggested. Financial experts would be needed to determine whether any of it was at all newsworthy.

Even so, the WikiLeaks threat and the persistent leaking of other supposedly confidential corporate information have brought an important issue to the fore. Companies are creating an ever-growing pile of digital information, from product designs to employees’ e-mails. Keeping tabs on it all is increasingly hard, not only because there is so much of it but also because of the ease of storing and sending it. Much of this information would do little damage if it seeped into the outside world; some of it, indeed, might well do some good. But some could also be valuable to competitors—or simply embarrassing—and needs to be protected. Companies therefore have to decide what they should try to keep to themselves and how best to secure it.

Trying to prevent leaks by employees or to fight off hackers only helps so much. Powerful forces are pushing companies to become more transparent. Technology is turning the firm, long a safe box for information, into something more like a sieve, unable to contain all its data. Furthermore, transparency can bring huge benefits. “The end result will be more openness,” predicts Bruce Schneier, a data-security guru.

From safe to sieve

When corporate information lived only on paper, which was complemented by microfilm about 50 years ago, it was much easier to manage and protect than it is today. Accountants and archivists classified it; the most secret documents were put in a safe. Copying was difficult: it would have taken Bradley Manning, the soldier who is alleged to have sent the diplomatic cables to WikiLeaks, years to photograph or smuggle out all the 250,000 documents he is said to have downloaded—assuming that he was not detected.

Things did not change much when computers first made an appearance in firms. They were used mostly for accounting or other transactions, known as “structured information”. And they were self-contained systems to which few people had access. Even the introduction in the 1980s of more decentralised information-technology (IT) systems and personal computers (PCs) did not make much of a difference. PCs served at first as glorified typewriters.

It was only with the advent of the internet and its corporate counterpart, the intranet, that information began to flow more quickly. Employees had access to lots more data and could exchange electronic messages with the outer world. PCs became a receptacle for huge amounts of “unstructured information”, such as text files and presentations. The banker’s hard drive in Mr Assange’s possession is rumoured to contain several years’ worth of e-mails and attachments.

Now an even more important change is taking place. So far firms have spent their IT budgets mostly on what Geoffrey Moore of TCG Advisors, a firm of consultants, calls “systems of record”, which track the flow of money, products and people within a company and, more recently, its network of suppliers. Now, he says, firms are increasingly investing in “systems of engagement”. By this he means all kinds of technologies that digitise, speed up and automate a firm’s interaction with the outer world.

Mobile devices, video conferencing and online chat are the most obvious examples of these technologies: they allow instant communication. But they are only part of the picture, says Mr Moore. Equally important are a growing number of tools that enable new forms of collaboration: employees collectively edit online documents, called wikis; web-conferencing services help firms and their customers to design products together; and smartphone applications let companies collect information about people’s likes and dislikes and hence about market trends.

It is easy to see how such services will produce ever more data. They are one reason why IDC, a market-research firm, predicts that the “digital universe”, the amount of digital information created and replicated in a year, will increase to 35 zettabytes by 2020, from less than 1 zettabyte in 2009 (see chart); 1 zettabyte is 1 trillion gigabytes, or the equivalent of 250 billion DVDs. But these tools will also make a firm’s borders ever more porous. “WikiLeaks is just a reflection of the problem that more and more data are produced and can leak out,” says John Mancini, president of AIIM, an organisation dedicated to improving information management.

Two other developments are also poking holes in companies’ digital firewalls. One is outsourcing: contractors often need to be connected to their clients’ computer systems. The other is employees’ own gadgets. Younger staff, especially, who are attuned to easy-to-use consumer technology, want to bring their own gear to work. “They don’t like to use a boring corporate BlackBerry,” explains Mr Mancini.

The data drain

As a result, more and more data are seeping out of companies, even of the sort that should be well protected. When Eric Johnson of the Tuck School of Business at Dartmouth College and his fellow researchers went through popular file-sharing services last year, they found files that contained health-related information as well as names, addresses and dates of birth. In many cases, explains Mr Johnson, the reason for such leaks is not malice or even recklessness, but that corporate applications are often difficult to use, in particular in health care. To be able to work better with data, employees often transfer them into spreadsheets and other types of files that are easier to manipulate—but also easier to lose control of.

Although most leaks are not deliberate, many are. Renault, for example, claims to be a victim of industrial espionage. In a prominent insider-trading case in the United States, some hedge-fund managers are accused of having benefited from data leaked from Taiwanese semiconductor foundries, including spreadsheets showing the orders and thus the sales expectations of their customers.

Not surprisingly, therefore, companies feel a growing urge to prevent leaks. The pressure is regulatory as well as commercial. Stricter data-protection and other rules are also pushing firms to keep a closer watch on information. In America, for instance, the Health Insurance Portability and Accountability Act (HIPAA) introduced security standards for personal health data. In lawsuits companies must be able to produce all relevant digital information in court. No wonder that some executives have taken to using e-mail sparingly or not at all. Whole companies, however, cannot dodge the digital flow.

To help them plug the holes, companies are being offered special types of software. One is called “content management”. Programs sold by Alfresco, EMC Documentum and others let firms keep tabs on their digital content, classify it and define who has access to it. A junior salesman, for instance, will not be able to see the latest financial results before publication—and thus cannot send them to a friend.

Another type, in which Symantec and Websense are the market leaders, is “data loss prevention” (DLP). This is software that sits at the edge of a firm’s network and inspects the outgoing data traffic. If it detects sensitive information, it sounds the alarm and can block the incriminating bits. The software is often used to prevent social-security and credit-card numbers from leaving a company—and thus make it comply with HIPAA and similar regulations.
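The pattern-matching at the heart of such a filter can be sketched in a few lines. In the toy example below, the regular expressions and the Luhn checksum used to validate card numbers are standard, but everything else is a simplified assumption; real DLP products such as Symantec's or Websense's work very differently under the hood.

```python
import re

# Toy sketch of a data-loss-prevention scan: inspect outgoing text for
# strings shaped like US social-security or credit-card numbers.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(number: str) -> bool:
    """Luhn checksum, used to weed out random digit strings."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def scan_outgoing(text: str) -> list[str]:
    """Return a list of alerts for sensitive-looking data in `text`."""
    alerts = [f"possible SSN: {m.group()}" for m in SSN.finditer(text)]
    for m in CARD.finditer(text):
        if luhn_ok(m.group()):
            alerts.append(f"possible card number: {m.group().strip()}")
    return alerts
```

A gateway running such a scan would sound the alarm, or block the message, whenever the returned list is non-empty.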

A third field, newer than the first two, is “network forensics”. The idea is to keep an eye on everything that is happening in a corporate network, and thus to detect a leaker. NetWitness, a start-up company, says that its software records all the digital goings-on and then looks for suspicious patterns, creating “real-time situation awareness”, in the words of Edward Schwartz, its chief security officer.

There are also any number of more exotic approaches. Autonomy, a British software firm, offers “bells in the dark”. False records—made-up pieces of e-mail, say—are spread around the network. Because they are false, no one should gain access to them. If somebody does, an alarm is triggered, just as a burglar might trip an alarm while breaking into a house at night.
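The mechanism is easy to sketch. The following is a hypothetical illustration only, not a reflection of Autonomy's actual product: decoy records are planted in the data store, and because no legitimate task should ever touch them, any read of one raises an alarm.

```python
# Hypothetical sketch of "bells in the dark": decoy records planted in a
# data store; any access to one triggers an alarm, since no legitimate
# task should ever touch them. Record IDs and store are invented.
DECOY_IDS = {"mail-0417", "contract-9921"}   # planted false records

alarms = []

def fetch_record(store: dict, record_id: str, user: str):
    """Return a record, ringing the bell if it is a decoy."""
    if record_id in DECOY_IDS:
        alarms.append(f"ALERT: {user} accessed decoy {record_id}")
    return store.get(record_id)

store = {
    "mail-0001": "quarterly forecast",
    "mail-0417": "FAKE: acquisition memo",   # the bell in the dark
}

fetch_record(store, "mail-0001", "alice")    # normal access, no alarm
fetch_record(store, "mail-0417", "mallory")  # trips the alarm
```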

These programs deter some leakers and keep employees from doing stupid things. But reality rarely matches the marketing. Content-management programs are hard to use and rarely fully implemented. Role-based access control sounds fine in theory but is difficult in practice. Firms often do not know exactly what access should be assigned to whom. Even if they do, jobs tend to change quickly. A field study of an investment bank by Mr Johnson and his colleagues found that one department of 3,000 employees saw 1,000 organisational changes within only a few months.

This leads to what Mr Johnson calls “over-entitlement”. So that workers can get their jobs done, they are given access to more information than they really need. At the investment bank, more than 50% were over-entitled. Because access is rarely revoked, over time employees gain the right to see more and more. In some companies, Mr Johnson was able to predict a worker’s length of employment from how much access he had. But he adds that if role-based access control is enforced too strictly, employees have too little data to do their jobs.

Similarly, DLP is no guarantee against leaks: because it cannot tell what is in encrypted files, data can be wrapped up and smuggled out. Network forensics can certainly show what is happening in a small group of people working on a top-secret product. But it is hard to see how it can keep track of the ever-growing traffic that passes through or leaves big corporate IT systems, for instance through a simple memory stick (which plugs into a PC and can hold the equivalent of dozens of feature-length films). “Technology can’t solve the problem, just lower the probability of accidents,” explains John Stewart, the chief security officer of Cisco, a maker of networking equipment.

Other experts point out that companies face a fundamental difficulty. There is a tension in handling large amounts of data that can be seen by many people, argues Ross Anderson, of Cambridge University. If a system lets a few people do only very simple things—such as checking whether a product is available—the risks can be managed; but if it lets a lot of people do general inquiries it becomes insecure. SIPRNet, where the American diplomatic cables given to WikiLeaks had been stored, is a case in point: it provided generous access to several hundred thousand people.

In the corporate world, to limit the channels through which data can escape, some companies do not allow employees to bring their own gear to work or to use memory sticks or certain online services. Although firms have probably become more permissive since, a survey by Robert Half Technology, a recruitment agency, found in 2009 that more than half of chief information officers in America blocked the use of sites such as Facebook at work.

Yet this approach comes at a price, and not only because it makes a firm less attractive to Facebook-using, iPhone-toting youngsters. “More openness also creates trust,” argues Jeff Jarvis, a new-media sage who is writing a book about the virtues of transparency, entitled “Public Parts”. Dell, he says, gained a lot of goodwill when it started talking openly about its products’ technical problems, such as exploding laptop batteries. “If you open the kimono, a lot of good things happen,” says Don Tapscott, a management consultant and author: it keeps the company honest, creates more loyalty among employees and lowers transaction costs with suppliers.

More important still, if the McKinsey Global Institute, the research arm of a consulting firm, has its numbers right, limiting the adoption of systems of engagement can hurt profits. In a recent survey it found that firms that made extensive use of social networks, wikis and so forth reaped important benefits, including faster decision-making and increased innovation.

How then to strike the right balance between secrecy and transparency? It may be useful to think of a computer network as being like a system of roads. Just like accidents, leaks are bound to happen and attempts to stop the traffic will fail, says Mr Schneier, the security expert. The best way to start reducing accidents may not be employing more technology but making sure that staff understand the rules of the road—and its dangers. Transferring files onto a home PC, for instance, can be a recipe for disaster. It may explain how health data have found their way onto file-sharing networks. If a member of the employee’s family has joined such a network, the data can be replicated on many other computers.

Don’t do that again

Companies also have to set the right incentives. To avoid the problems of role-based access control, Mr Johnson proposes a system akin to a speed trap: it allows users to gain access to more data easily, but records what they do and hands out penalties if they abuse the privilege. He reports that Intel, the world’s largest chipmaker, issues “speeding tickets” to employees who break its rules.
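The speed-trap idea can be sketched in a few lines. The threshold and record names below are invented for illustration and do not reflect Intel's actual rules: every request is granted, but each access is logged, and a user who exceeds the limit earns a "speeding ticket".

```python
from collections import Counter

# Illustrative "speed trap" access control: access is granted liberally,
# every request is logged, and users who exceed a threshold are flagged.
# The daily limit is an invented number.
DAILY_LIMIT = 3          # assumed per-user limit on sensitive reads

access_log = Counter()   # user -> number of sensitive records read today

def request(user: str, record: str) -> str:
    """Grant access, record it, and ticket users who overdo it."""
    access_log[user] += 1
    if access_log[user] > DAILY_LIMIT:
        return f"granted, but {user} gets a speeding ticket"
    return "granted"

for _ in range(3):
    request("bob", "design-doc")         # within the limit
ticket = request("bob", "design-doc")    # fourth read exceeds the limit
```

The design choice is the one Mr Johnson describes: rather than denying access up front (and blocking work), abuse is detected and penalized after the fact.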

Mr Johnson is the first to admit that this approach is too risky for data that are very valuable or the release of which could cause a lot of damage. But most companies do not even realise what kind of information they have and how valuable or sensitive it is. “They are often trying to protect everything instead of concentrating on the important stuff,” reports John Newton, the chief technology officer of Alfresco.

The “WikiLeaks incident is an opportunity to improve information governance,” wrote Debra Logan, an analyst at Gartner, a research firm, and her colleagues in a recent note. A first step is to decide which data should be kept and for how long; many firms store too much, making leaks more likely. In a second round, says Ms Logan, companies must classify information according to how sensitive it is. “Only then can you have an intelligent discussion about what to protect and what to do when something gets leaked.”

Such an exercise could also be an occasion to develop what Mr Tapscott calls a “transparency strategy”: how closed or open an organisation wants to be. The answer depends on the business it is in. For companies such as Accenture, an IT consultancy and outsourcing firm, security is a priority from the top down because it is dealing with a lot of customer data, says Alastair MacWillson, who runs its security business. Employees must undergo security training regularly. As far as possible, software should control what leaves the company’s network. “If you try to do something with your BlackBerry or your laptop that you should not do,” explains Mr MacWillson, “the system will ask you: ‘Should you really be doing this?’”

At the other end of the scale is the Mozilla Foundation, which leads the development of Firefox, an open-source browser. Transparency is not just a natural inclination but a necessity, says Mitchell Baker, who chairs the foundation. If Mozilla kept its cards close to the chest, its global community of developers would not and could not help write the program. So it keeps secrets to a minimum: employees’ personal information, data that business partners do not want made public and security issues in its software. Everything else can be found somewhere on Mozilla’s many websites. And anyone can take part in its weekly conference calls.

Few companies will go that far. But many will move in this direction. The transparency strategy of Best Buy, an electronics retailer, is that its customers should know as much as its employees. Twitter tells its employees that they can tweet about anything, but that they should not do “stupid things”. In the digital era of exploding quantities of data that are increasingly hard to contain within companies’ systems, more companies are likely to become more transparent. Mr Tapscott and Richard Hunter, another technology savant, may not have been exaggerating much a decade ago, when they wrote books foreseeing “The Naked Corporation” and a “World Without Secrets”.

On organizations - Only human

The subject of this blogpost is organizations, the study of which is nothing new: thinking about organizations is probably as old as humanity itself. It is also at the core of this venture.

In the contemporary world, the French philosopher Michel Foucault regards organizations as places of controlled violence. Another tradition, embodied by the American philosopher John Dewey, views organizations as edifying forums. Both thinkers are concerned with the possibility of altering organizations, using theory as the tool of social transformation.

Whatever we think of organizations, whether as incubators of new cultures or as defined by the cultural milieu of our societies at large, most thinkers, including the two philosophers cited, regard them as artificial and therefore open to being remade. Both, however, harbor tempered expectations of the transfigurative power of theory. Their disagreement is not about theory but about hope: whereas Dewey allows room for hope in the process of change, Foucault takes a more fatalistic stance.

Today we are present at the creation of a new world. With the emergence of a global information infrastructure, the long heralded digital era finally comes of age. The dispute between these two schools of thought surfaces afresh. The unleashed creative forces of the digital era will change the face of our world dramatically.

Previously resolved questions rise to prominence again and will receive different answers this time around. For example, some pundits were proclaiming until recently that the 21st century would be the Asian century. I daresay that in the century to come no specific geographical area will prevail. Rather, people with an intimate understanding of (information) technology, an insight into the transfigurative power of (social) networks, and a profound knowledge of their combined power will shape our common future.

As a consequence, old divisions will evaporate and new fault lines will appear. Particularly worrying is the social division between the technologically adept and an ever-larger population excluded from the promised land of technology. Money is a powerful remedy for poverty, but a feeble one for a dearth of knowledge. When conceiving our (digital) future these rifts will be with us: the world we live in is a fragile place, and the just endeavor for a fair balance is difficult. It is the ability of living things to evolve, to create, to act, and to process information that will yield new metaphors for life in the digital age.



Value Webs - and it works elsewhere too

My last post was about spider and starfish organizations. My view is clear: in a rapidly changing environment (e.g. local search), traditional hierarchically organized firms are too slow, too inflexible and, in the sense of Schumpeter's creative destruction, not fit to survive. Other forms of organization are needed, for example value webs. At the time I saw it like this:

The model of a value web

„They leverage the strengths of each link in the value chain, improve efficiencies, reduce expenses, and focus on the interoperability of process and supporting systems.“ [1]

The various stages in the process of value creation have often been described as a value chain [2]. This view of the value chain treats information as a supporting element: management collects and processes information about inventory, production, orders and logistics, and makes its decisions on that basis. Surprisingly many of these processes can be mapped onto digital media. Rayport and Sviokla [3] speak of the marketspace and of what are really two value chains, the physical (traditional) value chain and the virtual value chain. While the traditional value chain is linear, the virtual value chain is better compared to a web that is accessible and configurable at every point.

The value web broker - a value intermediary

The central element of the value web model is the value web broker, or value intermediary. As the central intermediary, it takes on the combination of components and assembles them into the final product. An example is E*Trade, the online financial provider, which offers stock-exchange access, current accounts, mortgages, financial information and further financial services. These services are not produced in-house; they are bundled third-party products sold under the E*Trade umbrella. E*Trade is regarded as a trustworthy intermediary without specific product interests of its own, and customers therefore value it as a partner.

The value web model [4]; S = suppliers, EM platform = electronic market

A loyal and close relationship with customers

An important link in the value web is the very close relationship with the customer. Granted, there are whole libraries of literature and good advice on customer retention and how to achieve it. On the internet the discussion takes a new turn, because an often purely digital relationship places different demands on how it is designed. Dell Computers, for example, lets customers configure the product according to their own wishes and needs. The customer thus becomes part of Dell's value system.

Existing value chains are being split into manifold businesses, each of which will have its own sources of competitive advantage. If individual functions exhibit different economies of scale or scope, the resulting value chain is always a compromise, an average across all these effects. Breaking the individual components into their parts and reconfiguring them in a value web makes it possible to reduce some of these effects, since the value broker flexibly assembles the product components that fit in each case.


[1] Bleeker S., “The Virtual Organization”, The Futurist, March-April 1994, pp. 9-14.

[2] Porter M., Competitive Advantage, New York: The Free Press, 1985.

[3] Rayport J. and Sviokla J., “Managing in the Marketspace”, Harvard Business Review, November - December 1994, pp. 141-150.

[4] Selz D., Value Webs – Emerging forms of fluid and flexible organizations, St.Gallen: Dissertation University of St.Gallen, 1999.

Of course, one is not alone with such thoughts about "fluid" organizations. And of course much has happened since. The article mentioned at the beginning of this post points to a recent book. Another interesting example is Ricardo Semler's Semco, a Brazilian company that has been showing for years that genuine participation in what goes on in the business is the best guarantee of both short- and long-term success.




The difference between physical and digital goods

Much has been written about the difference between physical and digital goods. In this blog I have often dealt with the difference between music on physical media and music in digital form (e.g. here, on business models and their trials and tribulations).

What holds for the music industry holds quite generally for every industry: the cost differences between the two forms of distribution (physical on CD, digital as MP3) are enormous. Briefly explained:

A physical good costs money. The first unit costs an amount x. After that, each further unit costs x - y, where y represents the so-called economies of scale. That is: all the costs of production, e.g. the rent for the factory hall, fall directly on the first unit produced, so the first unit is very expensive. From the second unit onwards, however, these costs can be spread across all the units produced, and the average cost per unit falls. Total cost is the sum of the costs of all the units made.

A digital good costs money too. Creating the first unit often takes considerable effort: office space for the start-up, programmers' salaries, hardware, software (the open-source movement allows markedly lower set-up costs here). But once the first digital good has been produced, every further copy is virtually free: making a digital copy costs next to nothing. The average cost per unit therefore falls much faster than for a physical good, and total cost is virtually identical whether one unit or a million units are produced.

If we now compare the two worlds, the massive difference in how average cost per unit develops leaps out, as do the marked differences in total cost.
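The arithmetic can be made concrete with a small sketch. The fixed and marginal costs below are invented numbers; the point is the shape of the curves, not the values.

```python
# Illustrative average-cost comparison between a physical and a digital
# good. All figures are invented for illustration.
FIXED_PHYSICAL = 100_000   # e.g. factory rent, tooling
MARGINAL_PHYSICAL = 5.0    # material and labour per extra unit

FIXED_DIGITAL = 100_000    # e.g. salaries, hardware for the first copy
MARGINAL_DIGITAL = 0.001   # an extra digital copy costs almost nothing

def average_cost(fixed: float, marginal: float, units: int) -> float:
    """Total cost spread over all units produced."""
    return (fixed + marginal * units) / units

for units in (1, 1_000, 1_000_000):
    phys = average_cost(FIXED_PHYSICAL, MARGINAL_PHYSICAL, units)
    digi = average_cost(FIXED_DIGITAL, MARGINAL_DIGITAL, units)
    print(f"{units:>9} units: physical {phys:12.2f}  digital {digi:12.2f}")

# The physical good's average cost never falls below its marginal cost;
# the digital good's average cost approaches zero.
```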

The consequences are manifold. Besides the completely different cost structure (and, accordingly, the financing needs of a digital-goods firm compared with a physical supplier) and the opportunities to act quite differently in the market (one reason why many offerings are free), the other effects are somewhat more hidden.

At its core, a digital good demands a completely different approach to production and marketing than a physical good does. Perhaps this is one of the decisive reasons why the (physical) music industry struggles so hard with the new digital realities. And what is happening to the music industry is happening to many other industries as well, most of which are still poorly equipped for this kind of change. One reason is probably that organizations and their decision-makers analyse and evaluate the new digital world with the cost and market structures familiar to them from the physical world. That goes badly wrong. The same holds the other way round: remember WebVan, for example. Other examples lie closer to home, but more on those another time.




Lecture: Virtual Teams at Uefa

Today I am in Nyon at Uefa, talking about virtual teams. Slides are available here (pdf, 7MB).


Reading List

The Economist, Special Report: Mobile telecoms, April 12th-18th 2008 (Available online)

Stefan Klein & Angeliki Poulymenakou (Editors), Managing Dynamic Networks: Organizational Perspectives of Technology Enabled Inter-firm Collaboration, Springer, 2006

Ori Brafman & Rod A. Beckstrom, The Starfish and the Spider - The Unstoppable Power of Leaderless Organizations, Penguin, 2006

Jeffrey Liker, The Toyota Way: 14 Management Principles from the World's Greatest Manufacturer, McGraw-Hill, 2004

Edward Tufte:

Columbia Disaster in a Powerpoint

About Self Organizing Organizations

DOMINO - IST Research project on network management

Topic area: Dynamic Organizational Management for Inter-firm Network Orchestrations
Project partners: DOMINO Consortium partners (from Greece, Denmark, Germany and Switzerland)
Additional partners: The project is funded by the European Commission within the IST initiative of the 5th Framework Programme
Timeframe: November 2001 to April 2003
Project team: PD Dr. Kai Riemer (contact person), Marcel Gogolin, Prof. Dr. Stefan Klein, Dr. Claas Müller-Lankenau, Dr. Carsten Totz

Short project outline 

DOMINO addresses the management of dynamic network organizations in terms of relationships, resources, structures, processes, people, etc. The primary objective of the project is to consolidate and advance current understanding in this field and thus to establish a common frame of reference. To this end, the project will develop an organizational and business taxonomy presenting the types and characteristics of network organizations. Furthermore, the analysis and explanation of management practices within several types of dynamic network organizations will lead to a framework of management guidelines for setting up, coordinating and managing these kinds of organizations.

The project's success depends greatly on its innovative research character and on raising the awareness needed to maximize its business and academic impact. The project will therefore establish a Knowledge Portal and devote considerable attention to dissemination and exploitation.

Project Partners 

The DOMINO Consortium consists of several partners (research partners: universities; facilitators; and user organizations) from four European countries: Denmark, Germany, Greece and Switzerland.

Project partner | Partner status | Country
Research Center of Athens University of Economics and Business (project coordinator) | Research partner | Greece
Copenhagen Business School (CBS) | Research partner | Denmark
Dept. of Information Systems, IOS research group, University of Muenster | Research partner | Germany
Institute for Technology Management (ITEM), University of St.Gallen | Research partner | Switzerland
DIOS A/S - Danish Institute for Organization | Facilitator | Denmark
Exodus S.A. | Facilitator | Greece
Namics AG | Facilitator & User | Switzerland & Germany
Project Online S.A. | User | Greece
ECR - Efficient Consumer Response Association Schweiz | User | Switzerland
New World Multimedia APS | User | Denmark

Project Introduction 

Current organization and management thinking about how businesses should evolve to withstand competition in the digital economy and the new market dynamics is heavily criticized for its inability to provide suitable management practices. The problem seems to be that the corporate world is changing so fast that management practices and paradigms cannot keep up with business evolution. So far, both the research and the corporate community have been dominated by management paradigms of the past, which must be revised and reinvented for use in the information age.

The global knowledge economy raises many questions about how firms should react to the new wave of socio-technical developments, and many of these remain unanswered. It is widely accepted that organizations are striving for new managerial tools that would allow them to interpret the new corporate reality and prepare them to confront the complexities of dynamic and networked value creation.

It is increasingly urgent that theoretical progress in the field of business management be complemented by more actionable research, setting the ground for new management models along with guidelines for their application. So far, several isolated efforts have been made by distinguished thinkers to explain emerging forms of organizing, such as:
  • business webs/networks,
  • dynamic value constellations and
  • smart organizations.
Nevertheless, integrated initiatives are missing; this is what the corporate world requires and what DOMINO addresses.

DOMINO line of argument 

  1. There is a trend towards networking: Empirical evidence suggests an increasing importance of networking and partnering activities (next to an ongoing M&A trend).
  2. External contingencies as drivers: Major market contingencies for networking are: technology development, globalization, changing demand patterns and the trend towards an information or knowledge economy.
  3. Networks are promising: Networks promise considerable economic advantages for companies facing the above-mentioned market drivers. Motives for networking include enhancing business scope, entering new markets, accessing resources, sharing risk, managing innovation, specialization and division of labour, and coordination and efficiency benefits (e.g. in supply chains).
  4. But: Networks are also precarious: Networks are complex, fragile and vulnerable arrangements with a considerable risk of failure and potentially higher coordination effort arising from inter-firm coordination at a distance.
  5. Ergo: What matters is network management: Mere collaboration is not enough to achieve the promised benefits. Given the risks and costs of networking, the network (and its relationships) must be carefully managed. DOMINO therefore mainly addresses management issues regarding structure, coordination, relationships, resources and knowledge management.
  6. And: Management depends on the specific type of network: The applicable management issues, organizational patterns and best practices depend highly on the nature of the individual network. An explicit business taxonomy built on classification criteria therefore has to be developed so that specific networks can be classified.
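The idea in point 6, that classification criteria map a concrete network onto a type with its own management implications, can be sketched in code. The criteria, type names and mapping rules below are purely illustrative assumptions for the sake of the example; they are not DOMINO's actual taxonomy.

```python
from dataclasses import dataclass

@dataclass
class NetworkProfile:
    """Hypothetical classification criteria for one network arrangement."""
    coordination: str   # "hierarchical" (focal firm) or "heterarchical" (peers)
    time_horizon: str   # "project-based" or "long-term"
    partner_count: int  # number of participating firms

def classify(profile: NetworkProfile) -> str:
    """Map a criteria profile onto a (hypothetical) network type."""
    if profile.coordination == "hierarchical" and profile.time_horizon == "long-term":
        return "strategic network"    # stable partners steered by a focal firm
    if profile.time_horizon == "project-based":
        return "virtual organization" # temporary, goal-driven alliance
    return "regional network"         # loosely coupled, long-term peers

print(classify(NetworkProfile("heterarchical", "project-based", 5)))
```

However crude, such an explicit mapping makes the point of the argument concrete: management guidelines can only be attached to a network once its type has been determined from observable criteria.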

Research methodology 

The work methodology adopted by DOMINO comprises a series of "explore & explain" actions. Both the exploration and the explanation phases combine theoretical and empirical research, allowing the project to identify concepts in the literature, seek them in businesses, converge theoretical and empirical findings, apply and validate them in firms, and finally promulgate actionable guidelines for business exploitation and further research.




Managing Dynamic Networks
Klein, Stefan; Poulymenakou, Angeliki (eds.) (2006): Managing Dynamic Networks: Organizational Perspectives of Technology Enabled Inter-firm Collaboration, Springer-Verlag, Berlin, Heidelberg.

Institutional Design of Mixed-mode Electronic Marketplaces
Gogolin, Marcel; Klein, Stefan (2006): Institutional Design of Mixed-mode Electronic Marketplaces, in: Klein, Stefan; Poulymenakou, Angeliki (eds.): Managing Dynamic Networks, Springer-Verlag, Berlin, Heidelberg, pp. 93-111.

Networks as Orchestrations
Poulymenakou, Angeliki; Klein, Stefan (2006): Networks as Orchestrations: Management in IT-enabled Inter-firm Collaborations, in: Klein, Stefan; Poulymenakou, Angeliki (eds.): Managing Dynamic Networks, Springer-Verlag, Berlin, Heidelberg, pp. 3-15.

Network Management Framework
Riemer, Kai; Klein, Stefan (2006): Network Management Framework, in: Klein, Stefan; Poulymenakou, Angeliki (eds.): Managing Dynamic Networks, Springer-Verlag, Berlin, Heidelberg, pp. 17-66.

Social Capital in Supplier Relationships
Riemer, Kai (2006): The Role of Social Capital in Managing Relationships with IT-Suppliers - A Case Study in Electronic Commerce, in: Klein, Stefan; Poulymenakou, Angeliki (eds.): Managing Dynamic Networks, Springer-Verlag, Berlin, Heidelberg, pp. 125-166.




For an MBA course that I will teach next week as part of an in-house programme at Nestlé, I have prepared the topic of network organizations and virtual organizations. In doing so, I also dug out the research results of the DOMINO project, dusted them off and found that they are still highly relevant. I have compiled the essence of the final book here (PDF, 500 KB).

Today in: Geneva


Today I am doing something special: I have been invited to hold a seminar on virtual organizations at my alma mater, the Université de Genève, for their HEC MBA course.

The course slides can be found here (PDF, 10 MB). The framework discussed can be found here (PDF, 480 KB). The ground rules discussed can be found here (PDF, 140 KB).