A Short History of BCP

Management often expects a short briefing on the background of a subject that is being presented to them for the first time. The purpose of this chapter is to provide you with some interesting and amusing historic material that you can use if asked to give an historical context to the practice of business continuity planning.

Business continuity concepts, such as preparing for a crisis and protecting vital records, are as old as life itself. The idea of protecting ourselves when confronted by a crisis is a deeply ingrained biological imperative. Reflexes help us cope with unexpected threats while our brain lays down new neural pathways based on past experience that help us anticipate and avoid future dangers. This is called learning[1].

Our immune system marshals the body’s resources and uses complex chemistry to defend against threatening infections. In a confrontational situation, our hearts pump faster, delivering greater amounts of nutrient-rich, oxygenated blood to our muscles. We begin to draw on our stored energy pools[2] and all of our senses enter a heightened state of awareness. When threatened, we automatically release adrenaline and other stimulants that help us deal with the emergency. Face it, the “fight or flight” response of all higher animals is an instinctive “emergency response system.”

With the development of language came some of the first information storage techniques and data protection strategies. In the pre-writing era, a key way of accurately passing information from one generation to the next was through the repetitive telling of epic tales and poems. Songs and ballads were also widely used as memory devices[3]. These songs were regularly and repetitively taught to children as a way of passing on important information such as tribal history, religious traditions, and other moral or philosophical beliefs. While there was always the possibility that a single re-teller would change or modify one or more parts of the message, the fact that others had a “copy” of the original message stored in memory (literally!) provided a useful error correction mechanism. Using a bit of imagination we can even view initiation and puberty rites as the forerunners of testing and auditing practices.

Writing developed in Mesopotamia and Egypt over 6,000 years ago. What started out as pictograms representing specific objects evolved into hieroglyphics, which associated images with sounds[4]. This in turn led to the development of alphabets, which are the basis of all written European languages. A modern form of these word pictures survives in the puzzle style known as a rebus[5]. Modern Asian languages such as Japanese and Chinese remain highly pictorial and use different symbols to represent sounds[6].

[Figure: Example of early writing]

There is much speculation as to why written language developed. One theory is that the need for record keeping at trading centers drove the creation of writing. This is supported, at least anecdotally, by the large number of accounting records that have survived. The number of bills of lading and shipment inventories in our possession from these early times far exceeds the quantity of religious, historic or other types of texts that have been preserved. Initially, these documents were recorded on very durable materials such as stone or, in the case of cuneiform, clay tablets. The use of these materials speaks to the importance placed on these earliest of written vital records. Some of these symbols are still in use today, as with the symbol ℞, which the medical community recognizes as standing for the concept of a recipe.[7]

Around 5,000 years ago, the Egyptians invented papyrus – a paper-like substance that served as the primary recording medium for almost 3,000 years. Then, in 104 AD, Ts’ai Lun of China crushed the bark of the mulberry tree, mixed it with water, and perfected a drying technique that resulted in the creation of the first paper.

A Notable Disaster

Jumping forward in time, the burning of the great library of Alexandria is an early example of how a fire led to the catastrophic loss of data because the information was stored on papyrus rather than on something more durable.

Most scholars attribute the founding of this library to Ptolemy I Soter of Egypt at the beginning of the 3rd century BC[8]. At its peak, the library was reputed to have contained between 400,000 and 700,000 scrolls, each of which held the equivalent of approximately twenty-five of today’s typed pages[9].

The library was truly a world repository of knowledge. One theory holds that the library was founded when a student of Aristotle by the name of Demetrius Phalereus asked Ptolemy II for a safe place to store his teacher’s private writings. From this beginning, the library continued to grow until the reign of Ptolemy III when, by law, anyone entering the city of Alexandria was required to turn over any books or scrolls to the library scribes, who quickly copied them; the duplicates were returned to the owners while the originals were retained by the library staff.

Tragedy struck sometime in October of 48 BC when Julius Caesar ordered the burning of the Egyptian fleet and, in the ensuing conflagration, the library is reported to have caught fire and been destroyed[10].

Interestingly, recent archeological discoveries indicate that a second “branch” of the library operated beneath the Temple of Serapis in the same city[11]. Perhaps this represents one of the earliest examples of what computer technologies of today would call a “mirrored site.”

Manuscript Copying, Data Integrity and Technology

While the copying of material (an early form of data replication) can be traced to the time of the Alexandrian library, in Western Europe this practice came into widespread use during the “Middle Ages.” Monastic clerks undertook the task of copying a wide variety of materials including histories, philosophical writings and scientific studies. Unfortunately, since manuscript copying was performed by individuals with little or no supervision focused on the faithfulness of the copying process, this growth in data replication led to a corresponding increase in information corruption. History shows that many monks decided to insert their own views into some of the material they were copying, which led to the first cases of widespread data integrity problems.

Eventually, technology came to the rescue. Block printing techniques in both the Far East and Europe eliminated the constant morphing of information, and with the introduction of Gutenberg’s movable-type printing press in 1436, record and document protection took a giant leap forward.

The Beginning of the Computer Age

Computing devices have been around longer than most people think, especially when you realize that early structures such as Stonehenge in the UK were astronomical calculators. One of the earliest hand-held devices is the “Antikythera Mechanism,” found in the wreck of a ship off the Greek islands around 1900. This device dates back to around 87 BC and was also a calculator, used to predict the movements of the stars and the Zodiac. Equally old is the abacus, and there is evidence that “tally sticks” were used by the Mesopotamians thousands of years ago.

A rash of calculating devices emerged in Europe in the 17th and 18th centuries, but they were not programmable and are thus considered calculating devices, not computers. In 1801, Joseph Marie Jacquard used punched paper cards to “program” his textile loom to produce a repetitive pattern, but no calculations were involved.

A giant leap forward in technology occurred when, in 1833, Charles Babbage began work on what is now recognized as the world’s first computer. Known as the “Analytical Engine,” the unit was designed to be programmed and to answer mathematical calculations submitted to it on punched cards. Lacking funds, his invention was never commercialized. It wasn’t until 1890, when the U.S. Census Bureau contracted with Herman Hollerith – whose company later became part of the Computing Tabulating Recording Corporation (CTR Corp.) – that computers were seen as a commercial success[12].

Babbage’s use of punched cards revealed a weakness in the design of the process: punched cards were rather fragile and easily damaged. Babbage and his colleague Ada Byron, Countess of Lovelace, began to research strategies for easily, accurately, and quickly duplicating their card decks. Many historians mark this effort at data protection through replication as the beginning of modern-day disaster recovery techniques and the origin of our current business continuity industry.

Electronic calculating devices were in use during the 1930s and 1940s, but the first electronic computers made their appearance in the early 1940s in Europe and the U.S.[13]. Then, in the 1950s, International Business Machines began using magnetic tape as a storage medium[14]. Descendants of this technology are still in wide use today and represent the dominant medium used in backup applications.

By the 1970s it was common to keep information in electronic form on magnetic tapes, which were far more compact and easier to handle than paper-based storage media, thus simplifying the task of data protection. Magnetic tapes were classified by the density of information stored per inch and also by how the information was distributed across the storage medium. For example, recording techniques were developed that tightly packed information on magnetic tapes by creating side-by-side tracks.

Some of the tape technologies that found widespread commercial use in the computer industry were the 7-track and 9-track tape formats. The 9-track format in particular gained acceptance as engineers found ways to increase the amount of information stored in each of these tracks per inch of material. 9-track tapes evolved from packing densities of 800 bits per inch (bpi) to 1600 bpi and eventually to 6250 bpi formats. Over time other tape formats came into common use, including 4 mm and 8 mm tape and DLT (digital linear tape), all three of which are still in use today.
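
To put those densities in perspective, here is a back-of-the-envelope capacity calculation in Python. The 2,400-foot reel length is an assumption about a typical full-size reel, and the figures ignore inter-record gaps, which reduced real-world capacity considerably:

    # Approximate capacity of a 2,400-foot 9-track reel at several densities.
    # On 9-track tape a full byte (8 data bits plus parity) is written across
    # the width of the tape, so "bits per inch" is effectively bytes per inch.
    REEL_FEET = 2_400                 # assumed standard reel length
    reel_inches = REEL_FEET * 12

    for bpi in (800, 1_600, 6_250):
        capacity_mb = reel_inches * bpi / 1_000_000
        print(f"{bpi:>5} bpi -> roughly {capacity_mb:,.0f} MB per reel")

    # Prints roughly 23 MB, 46 MB and 180 MB respectively -- a whole reel
    # at the highest density held less than a single modern photograph.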

Crossover Technology

Advances in tape formatting technology provide an excellent example of the technical crossover that often takes place between the computer industry and other marketplaces. Many of the same advancements in tape technology that benefited the computer business found their way into the music industry. Commercial recording companies were anxious to find a lower cost and more easily handled medium for music and voice capture than vinyl records and seized the opportunity that tape advances offered. One example of this crossover of technology is the popular 8-track music tapes of the late 1970s. With the introduction of compact tape cassettes, sound recording moved into a new era. Music pirating came into vogue as people made copies (backups in computer parlance) of their originally purchased cassettes (the source code in this case) and exchanged them with friends. Some rock bands such as the Grateful Dead even encouraged fans to duplicate recordings and share them freely. This practice of copying recorded music was the precursor to today’s online music craze, which has led to the widely successful Apple iPod and helped promote the creation of the MP3 recording standard.

The introduction of inexpensive but reliable recording devices allowed some businesses, such as brokerage firms, to record conversations between their employees and clients as a way of mitigating the risk of lawsuits over inappropriate or incorrect stock trades. At about the same time, home voicemail entered the consumer market. Originally based on tape technology (in a variety of formats), the wide range of tape cassettes used in home recording devices was eventually replaced by alternative storage media such as “solid state disks” (which aren’t really circular in construction!), while the core concepts stayed the same. Today, with the increased capacity and access speed offered by various storage products, several industries are moving to require the long-term archiving of phone conversations and other customer interactions as part of a company’s business record[15].

The storing of information on magnetic media such as tape dramatically simplified the disaster recovery process. Tape was a more compact storage medium and its retrieval time was vastly shorter, since information could be “streamed”[16] back into computer memory at a much higher speed than with paper tape or punched cards. Magnetic tape also took up less space and proved to be a more durable and easily handled medium than its paper predecessors. Today, magnetic tape is viewed as a low cost but “delicate” medium (relative to other, more rugged technologies such as solid state “jump drives”) that requires care and attention. The widespread use of tape and other removable storage technologies[17] has given rise to an entire sub-specialty of the backup and recovery industry known as “media management,” which is concerned with tracking both the physical location and the technical characteristics of these various storage media.

With the introduction in 1956 of the first magnetic disk drive (the IBM RAMAC[18]), the storage segment of the computer business took off and remains one of the most robust areas of technology. Computer technology continued to improve as did the associated storage technologies. Operating systems began to reside in the computer’s magnetic “core” memory. This computer memory technology was followed in rapid succession by a number of silicon-based memory technologies. In parallel, the first random-access[19] rotating memory storage devices (magnetic disks and drums) were brought to market. All the while the volume of data being digitally stored on devices was growing at a phenomenal pace as was the problem of how to protect against loss, damage or destruction.

Just as processing technology advanced at an accelerating pace, so did the associated technologies of information protection. Ever more sophisticated techniques were developed that allowed the manipulation of data in ways that also provided increased levels of data protection. Various algorithms were introduced that could not only detect when data copied from one memory location to another was incorrect, but also automatically correct many of the detected errors[20]. Soon these advances found their way into mass storage devices[21], which brought on an important advance in the integrity of stored digital information.
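
As a minimal illustration of the forward error correction idea mentioned in note 20, here is a Hamming(7,4) code sketched in Python. Real memory and storage systems implement such codes in hardware over much larger blocks; this toy version simply shows how three parity bits can locate, and therefore repair, any single flipped bit among seven:

    # Hamming(7,4): encode 4 data bits into 7 bits so that any single
    # bit error can be located and corrected.
    def encode(d1, d2, d3, d4):
        p1 = d1 ^ d2 ^ d4                    # parity over positions 1,3,5,7
        p2 = d1 ^ d3 ^ d4                    # parity over positions 2,3,6,7
        p3 = d2 ^ d3 ^ d4                    # parity over positions 4,5,6,7
        return [p1, p2, d1, p3, d2, d3, d4]  # codeword positions 1..7

    def decode(c):
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        syndrome = s1 + 2 * s2 + 4 * s3      # 1-based position of the error
        if syndrome:
            c[syndrome - 1] ^= 1             # flip the damaged bit back
        return c[2], c[4], c[5], c[6]        # recovered data bits

    word = encode(1, 0, 1, 1)
    word[4] ^= 1                             # simulate a single-bit error
    assert decode(word) == (1, 0, 1, 1)      # the error is found and fixed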

In parallel, speed and quantitative advances in computer processing systems, sub-systems and their components were matched by a qualitative improvement in the reliability of these manufactured devices. Together, these advances made possible entirely new approaches to the challenge of data integrity and reliable, long-term storage.

With the steep decline in the cost of rotating mass storage devices, RAID[22] technology – a storage-integrity technique that could be embedded in the disk drive’s controller[23] – became available. This data integrity technology allowed magnetic disk drives[24] to offer the same class of error detection and correction capabilities that the earlier block parity techniques brought to computer system random access memory. Software versions of this same RAID technology came into general use, as did low cost dedicated RAID storage units. Today, RAID is a common and very affordable technology that provides a dramatic increase in storage reliability for even home personal computers.
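
The parity idea at the heart of single-parity RAID levels can be sketched in a few lines of Python. This is an illustrative model only, not a controller implementation (real RAID firmware works on striped disk sectors with failure detection): a parity block is the XOR of the data blocks, so any one lost block can be rebuilt from the survivors.

    from functools import reduce

    def xor_blocks(a, b):
        # XOR two equal-sized blocks byte by byte.
        return bytes(x ^ y for x, y in zip(a, b))

    # Three equal-sized "disk" blocks plus one computed parity block.
    blocks = [b"DISK-0..", b"DISK-1..", b"DISK-2.."]
    parity = reduce(xor_blocks, blocks)

    # Simulate losing disk 1, then rebuild it from parity + survivors.
    survivors = [blk for i, blk in enumerate(blocks) if i != 1]
    rebuilt = reduce(xor_blocks, survivors, parity)
    assert rebuilt == blocks[1]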

And, in another example of how technology can be re-purposed, Apple’s iPod family has spurred the wide-scale use of very small form-factor disks and a related technology – memory sticks. Such devices are known collectively as MP3 players in recognition of the dominant music recording format used to store songs and other audio-oriented products such as podcasts[25].

Early computing facilities were highly specialized affairs which grew out of the use of sensitive mechanical tabulation equipment in the 1920s, 1930s and 1940s. Government and businesses alike quickly realized that special environmental conditions and highly trained technicians were requirements for successful computer operations. Since efficiency was the goal of computing[26], it was significantly more cost effective to locate the computer and all of its associated subsystems centrally in one place, leading to the birth of the computer center. Early cost justification arguments for investment in computing equipment emphasized the labor-saving benefits of automation over manual processes[27] and assumed the presence of computing experts who would manage the operation. This effort gave birth to the Information Systems (IS) group. Early political infighting at many companies eventually led to the creation of the Information Technology department model, which combines several technologies including telecommunications and computing.

Driven to show a better return on the investment in equipment, computer designers made increased processing speed their primary goal. The strategy, still followed in some ways today, was to build larger and larger machines[28] which could “crunch” numbers much faster. This initiative gave rise to the mainframe and, eventually, the supercomputer[29]. Unfortunately, usability never became a mainstream goal of any major computing powerhouse[30].

Other Advances

Another commonly used technology in the 1960s, 1970s and even through the 1980s, was a photographic medium known as microfiche.

The copying of information to microfiche was known as COM (computer output to microfiche) and it was often used in conjunction with so-called mainframe[31] computers. Today’s equivalent is called COLD (computer output to laser disk and DVDs). Just as with magnetic media, the focus of this technology was data protection.

In parallel, other aspects of a computer’s overall architecture and supporting sub-systems were being improved. Some of the more significant advances included:

  • Uninterruptible Power Supplies (UPSs). Specially configured battery systems that, in the event of a loss of primary electrical power, deliver a continuous supply of energy for a specified period of time (generally measured in minutes). This technology allows many processes to either shut down with a minimum of data corruption or continue operating while a secondary source of electrical power is engaged (often a backup electrical generator). A rough runtime calculation is sketched just after this list.
  • Computer Clusters[32]. An array of multiple computer systems under the direction of a common set of software applications which share the computational workload and can even automatically reconfigure themselves to continue operating in the event of a loss of one or more members of the cluster.
  • Non-Stop and High Availability Computers. These are sophisticated computers constructed with redundant subsystems (both N+1 and 2N architectures) so that in the event of a component failure, the computer can continue to operate in a nearly seamless manner. These same techniques can even be applied to sophisticated software programs such as databases and e-mail systems.
  • Network Attached Storage (NAS) units[33]. Dedicated storage systems, complete with their own operating systems, software applications and computer processors for managing the storage assets, network connections and computational activities. These devices were a direct outgrowth of the file server concept first introduced by companies such as Sun Microsystems and Apollo Computer.
  • Storage Area Networks (SANs). The creation of these massive shared storage resource devices was made possible by parallel advances in networking and backplane technology as well as distributed file system software.
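
As a rough illustration of the UPS runtime arithmetic mentioned in the first item above (a back-of-the-envelope sketch; the battery capacity, derating factor and load figures are invented for the example):

    # Rule-of-thumb UPS runtime estimate: usable battery energy
    # divided by the electrical load it must carry.
    battery_wh = 1_000      # assumed battery capacity in watt-hours
    efficiency = 0.85       # assumed inverter/derating factor
    load_watts = 2_500      # assumed protected equipment load

    runtime_minutes = battery_wh * efficiency / load_watts * 60
    print(f"Estimated runtime: {runtime_minutes:.0f} minutes")
    # -> about 20 minutes: enough for a clean shutdown, or for a
    #    backup generator to come online.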

Software advances in the fields of synchronous and asynchronous data replication were made and brought to market. Today it is possible and affordable for even a small firm with little or no Information Technology (IT) staff to safely store multiple copies of all of its current data and have the information ready for quick retrieval in the event of a failure of one of its primary computing resources.
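
The essential trade-off between the two replication styles can be sketched in a few lines of Python. This is a toy model under obvious assumptions (in-memory dictionaries stand in for storage systems, and a queue stands in for the replication link): synchronous replication acknowledges a write only after both copies are updated, while asynchronous replication acknowledges immediately and ships the update later, trading a window of possible data loss for speed.

    import queue
    import threading

    primary, replica = {}, {}
    replication_link = queue.Queue()

    def write_synchronous(key, value):
        # Acknowledged only after BOTH copies are updated: no
        # acknowledged write can be lost, but every write pays the
        # full round-trip cost to the replica.
        primary[key] = value
        replica[key] = value
        return "ack"

    def write_asynchronous(key, value):
        # Acknowledged as soon as the primary is updated; the replica
        # catches up later. Faster, but a crash can lose any updates
        # still sitting in the replication link.
        primary[key] = value
        replication_link.put((key, value))
        return "ack"

    def replicator():
        # Background worker that drains the link to the replica.
        while True:
            key, value = replication_link.get()
            replica[key] = value

    threading.Thread(target=replicator, daemon=True).start()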

Other developments in the field of electronic archiving, e-vaulting and document management followed a similar evolution. These technologies concern themselves with the long-term storage of specific files, records and documents. This is in contrast to backup and data replication technologies, both of which are concerned with creating an exact copy of all the information present in a computer system at a specific point in time, regardless of the importance or transitory nature of the information.

Document management and related “information life cycle management” (ILCM) technologies continue to attract investment attention as various industry and government regulations begin to call for the long-term retention of records, files and documents. A closely related field is that of “compliance management[34],” which deals with fulfilling the requirements of various statutes and regulations. Importantly, compliance management deals not just with securely storing these documents but also requires that protections and tracking systems be established to ensure that:

  • the records are preserved in their original form (untampered with),
  • access and authorship are traceable,
  • and the documents can be keyword searched and retrieved in short periods of time[35].

Compliance management is a very storage intensive area and one that is projected to undergo explosive growth through the end of the first decade of the twenty-first century.

Technical Advances and Language

So profound is the impact of computer technology on our daily lives that it has affected our ways of thinking and speaking. New words have entered our language and older ones have taken on new meanings. The information stored in this new digital medium became known by the vernacular “data”[36]. The coding of information into a digital format was called “inputting,” while the reporting of results was called “outputting”[37]. Computer instructions initially had to be converted into binary code, which gave rise to the term “coding,” and the sequential ordering of these coded instructions became associated with the word “programming.” The original collection of programmed instructions used to achieve a process was called “source code,” and a failure of any type meant that the system was “down”[38]. An early error with one of the first computer systems was eventually traced to the fact that insects had gotten into the circuitry, an event that gave rise to the wide-scale application of the word “bugs” as a reference for a range of problems[39]. Over time computer circuitry became much more reliable and people came to realize that errors were less likely to be due to a processing error than to the input of incorrect or damaged information. This realization gave birth to the concept of “GIGO”[40].

At about this same time the practice of making one or more duplicate copies of data was seen as an important operational strategy, and the process of “backing up” data became commonplace. Eventually, someone got the idea that it would be better if this copied information was taken to a second location away from the central computer site (hence the term “offsite”) in an effort to reduce the risk of damage or loss (an early form of risk assessment and mitigation)[41]. Many of these techniques and terms are still in use today, but some terms – like “offsite” – have taken on a more general meaning than just digital data. For example, people who work from home or while traveling are said to be working “offsite.”

Some records and classes of information were considered to have an ongoing value, and the age-old concept of archiving was applied to computer information. As more of our economy became dependent on intangible assets, archived digital information was used as a form of collateral in various business transactions. This practice led to the concept of “escrowing” source code – a term the computer industry adopted that had previously been reserved for financial assets or real property.[42]

Other Concerns

While the computer industry pushed the frontiers of technology ahead, businesses began to understand that data backup was only one facet of disaster planning. Other issues such as alternative sites and replacement staffing began to attract attention. Many of the larger companies also understood that having backup tapes was little consolation if a disaster prevented access to their data center.

In an effort to address these issues, the Sun Oil Company in 1978 initiated an arrangement with twenty other Philadelphia-based organizations to act as backup sites for each other and to share resources among the group. The group signed a lease for property at 401 Broad Street in Philadelphia and, with this step, another aspect of the modern disaster recovery business was born. Some five years later, Sun Oil spun out the division of its data processing group that was managing this resource, and the SunGard company was born[43].

An Ever Changing World

The September 11, 2001 destruction of the World Trade Center in New York City and of part of the Pentagon in Washington, DC put a number of business continuity plans to the test, but it also seized the attention of management teams around the world, who instantly understood that disasters can strike anyone, anywhere, without warning.

On a local level, hundreds of businesses located in downtown Manhattan simultaneously faced a catastrophic loss of people, facilities, infrastructure and data. Many business continuity plans, which had been developed in the few years since the first attack on the NYC World Trade Center, were activated. The fact that so many businesses were able to survive and eventually return to normal operations is a testament to the excellent planning and professional execution of these plans by the continuity professionals at these firms.

Today, business continuity planning is no longer viewed as an arcane discipline. The Internet has become a major marketplace for millions of businesses, institutions and government agencies that find they must now operate twenty-four hours a day, seven days a week as the world enters the 21st century and the next phase of business evolution – a true information-age economy.

Longer term, commerce and communications will become even more entangled and traditional businesses will continue to change. In a world economy where it is becoming ever more difficult to differentiate between an organization and its supply chain, service level agreements (SLAs) will increasingly be used to manage the interaction between vendors and clients. Being able to confidently operate a business under the terms of an SLA will become a major secondary mission of business continuity planners, and the need for consultants to facilitate this transformation is expected to grow substantially over the next few years.

And to think it all began with teaching children songs around campfires thousands of years ago!


[1] The process of natural selection should also be viewed as a species-wide coping strategy that applies to all forms of life from viruses on up.

[2] These are also known as fat deposits.

[3] Think of them as “verbal storage units.” Some of these tales were also associated with various ceremonies and dances – both of which are memory-enhancing techniques and can also be used to convey information, as in the Hula dance of the native Hawaiians and the rain ceremonies of the indigenous peoples of the Americas.

[4] Katie Harrow: “A Brief Guide to the History of the Written Word”, www.newarchaeology.com/articles

[5] An image that represents a word or phrase phonetically. Example:  translates as “I love you.”

[6] For example: Hiragana and Katakana versus Kanji.

[7] The exact origin of this symbol is a matter of debate, with some claiming that it is an abbreviation for the Latin “recipere” or “recipe,” meaning “take, thou.” Others claim it had its origin in an invocation to the Roman god Jupiter, while another theory claims it is derived from the ancient Egyptian hieroglyph for the god Horus.

[8] From Wikipedia: The Free Encyclopedia. at http://en.wikipedia.org/wiki/Alexandria.

[9] Which would put the combined page total at between 10,000,000 and 17,500,000 pages of information.

[10] In fairness, it must be noted that other sources attribute the destruction of the library to at least two other culprits: the Orthodox Patriarch Theophilus of Alexandria and Caliph Omar of Damascus.

[11] From the Egyptian Weekly, Al Ahram, at http://weekly.ahram.org.eg/2003/668/he1.htm

[12] CTR Corp. was later renamed International Business Machines (IBM).

[13] For example, the Zuse Z3 was publicly shown in May of 1941 in Germany.

[14] IBM commercialized this technology which was first used on the UNIVAC 1 in 1951.

[15] Guidelines already exist that encourage U.S. brokerage firms to archive “instant message” discussions.

[16] That is, sequentially transmitted to the computer system. The term evokes the continuous flow of a stream of water and is related to design terms such as “streamlining.”

[17] For example, optical disk-based products like CDs and DVDs.

[18] 5 MB for $50,000. Adjusted for inflation, we have seen an improvement in commodity disk drive price/performance of around 70 million times!

[19] As opposed to tape technology which, because of its physical layout, was more of a sequential-access medium.

[20] For example: Hamming codes, which offer block parity and forward error correction.

[21] For example, the use of cyclic redundancy checksums (CRCs) and longitudinal redundancy checks (LRCs).

[22] Originally an acronym for “redundant arrays of inexpensive disks,” this term has been redefined because of the general price decrease of disk drive technology to mean redundant arrays of independent disks.

[23] More precisely, in the commands that were programmed into the drive controller known as “firmware.”

[24] This same technology could be applied equally well to magnetic tapes. During his time at Data General Corporation, the author was involved in just such an engineering initiative that applied RAID techniques to magnetic tapes. For economic and performance reasons, this product line was not commercially released.

[25] Memory sticks are used in some very small MP3 players as well as in so-called “jump drives” or “thumb drives,” and can be found dangling on key chains around the world.

[26] Early attempts to cost justify investment in computing equipment emphasized the efficiency of automation over manual processes. Computer architecture was initially focused on increasing speed by creating larger and larger machines. The economic goal behind this strategy was the achievement of “economies of scale.”

[27] This is a cost displacement argument that states that the same or less money can be spent to gain better performance.

[28] This approach to computer design is best described as “brute force” computing.

[29] And today, massive parallel computers and the field of grid-based computing.

[30] Except perhaps Apple Computer, which has a market share measured in the low single digits.

[31] This term arises from the fact that in the construction of early computers, the electronics used for the central processing unit made up the primary or “main” part of the device. These electronic boards were physically put into a metallic frame – thus the term “mainframe.”

[32] The author was involved in the introduction and marketing of the first computer clusters by Digital Equipment Corporation in the early 1980s.

[33] Here again, the author was involved in the creation and launch of what is widely regarded as the first NAS system – the Epoch-1 from Epoch Systems, Inc.

[34] This topic is dealt with in detail in another section of this book.

[35] This retrieval time is usually measured in days.

[36] As in data center or data processing.

[37] The “put” in these words is widely acknowledged to refer to the “put” in computing.

[38] There is some evidence that the term “down” came from the observation that, during the repair process, technicians took the damaged circuit boards down from the mounting cage holding the rest of the computer system while they repaired them.

[39] People often talk about “buggy” software or applications, and the term’s use is not limited to the computer business; it is not unusual to hear statements like: “I’m still working the bugs out of this car’s engine.”

[40] An acronym for the phrase: garbage in, garbage out.

[41] This spawned a number of other companies that catered to this need, including Iron Mountain.

[42] Escrowing was generally reserved for the source code of various application programs.

[43] From Sungard Magazine at http://www.sungard.com/magazine/sungard3235annhistory.pdf
