History of Data Destruction

October 20, 2020 at 9:00 am by Amanda Canale

For thousands of years, humans have recorded and documented history, stories, and their life experiences. These written records have transformed from cave wall drawings and papyrus scrolls to printed novels and Kindle books. With the transformation of the written word, the methods of destruction have also evolved. Let’s dive into some of the history of data destruction methods and some of the key players involved.

4000 B.C. Egypt: The Invention of Papyrus
Papyrus, the world’s first form of paper, was invented in ancient Egypt around 4000 B.C. People began using it to document history, life events, news, and stories. With the inception of recorded information came the need to destroy that information, whether to keep confidential records from falling into the wrong hands or to eliminate material deemed inappropriate or blasphemous. Without modern shredding technology, people were forced to destroy papyrus scrolls by hand. Fire was also a viable option, as seen in the 48 B.C. burning of the Royal Library of Alexandria and the loss of an estimated 500,000 scrolls’ worth of recorded history.

1909 New York City: Abbot Augustus Low’s Paper Shredder Patent
New York City-based inventor Abbot Augustus Low is known for his invention of the first ever paper shredder in 1909. Unfortunately, Low passed away shortly after filing the shredder’s patent and was unable to manufacture it beyond just an initial prototype. His invention was primarily intended to be used in banks and counting houses.

1935-1959 Germany: From Pasta to Particles
It wasn’t until 1935, more than twenty-five years later, that the paper shredder was first manufactured. Adolf Ehinger created the first working paper shredder as a matter of life or death: at the time, he was living in Nazi Germany and was being questioned about the anti-Nazi literature in his garbage. Ehinger built a paper shredder modeled on a hand-cranked pasta maker to destroy the literature and successfully avoided persecution.

After this incident, Ehinger added an electric motor to his paper shredder, which he marketed and sold throughout the Cold War in the 1950s. As his machine quickly gained popularity, his company, EBA Maschinenfabrik, crafted the first cross-cut paper shredder. This newer model not only shredded documents into strips but also sliced them into smaller, confetti-like pieces for extra security.


1940s: The World’s First Degausser
After the introduction of iron ships in the late 1800s, scientists and crew members soon discovered that iron had an interesting effect on compasses and magnetic fields. It wouldn’t be until decades later that this knowledge would be used to create the first magnetic degausser.

During the early days of World War II, Canadian chemist Charles F. Goodeve was working for the British Royal Navy researching methods to disarm naval mines. In 1939, a German magnetic mine landed near a British shore and, luckily, was disarmed before causing any harm. After studying the recovered mine, Goodeve and his team discovered that such mines were equipped with triggers that detonated based on the surrounding gauss level. The gauss, named after scientist and mathematician Carl Friedrich Gauss, is a unit of magnetic flux density. This discovery was momentous: the British Navy began installing electrical cables around the circumference of its ships that carried a current to neutralize each ship’s magnetic field. This first act of degaussing allowed British naval ships to pass undetected by German magnetic mines, and the same underlying technology led to modern-day degaussing of tapes and other magnetic devices.
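For readers unfamiliar with the unit, the standard SI conversion (general physics, not specific to Goodeve’s work) is:

```latex
1\,\mathrm{G} = 10^{-4}\,\mathrm{T}
\qquad \text{(for scale, Earth's magnetic field is roughly } 0.25\text{--}0.65\,\mathrm{G}\text{)}
```

A mine triggered by a change on the order of a fraction of a gauss could thus detect the distortion a large steel hull produces in Earth’s ambient field.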

1968: The Inception of Security Engineered Machinery
Korean War veteran and SEM founder Leonard Rosen created the first ever paper disintegrator in 1968 after the infamous Pueblo Incident. The Pueblo Incident occurred on January 23, 1968, when the USS Pueblo, a U.S. Navy intelligence vessel, was intercepted by North Korean patrol boats. In an act of desperation to protect national secrets, the Pueblo crew members began furiously trying to destroy the onboard classified information. Unfortunately, the crew was unsuccessful and was forced to surrender, leaving their attackers with free rein over the remaining documents.

In comes Leonard Rosen. The incident didn’t sit well with Rosen, who began drafting a better paper destruction method specifically for confidential and classified information. Within a few weeks, he had created the world’s first paper disintegrator. What makes the disintegrator more secure than a paper shredder is its repeating knife-chopping process and the sizing screen that particles must pass through. Disintegrator particles pass through the screen in irregular shapes, sizes, and orientations and fill the waste chambers at different times, all of which makes it far more difficult to piece the destroyed records back together.

SEM Founder Leonard Rosen with his invention, the disintegrator.

Since 1968, data destruction methods have only become more advanced and secure. Paper shredders have spread from government buildings to virtually every place of business and many personal homes. Shredders steadily gained popularity following infamous incidents like the Watergate scandal of the early 1970s and the 1979 seizure of the U.S. Embassy in Tehran, and modern machines are now equipped to shred magnetic drives and optical media as well.

For over 50 years, SEM has been the driving force behind innovative data destruction methods and has laid the groundwork for end-of-life best practices. Today, we are the industry leader for electronic media crushers and shredders, and have data destruction equipment in every U.S. embassy, military base, naval ship, and government building across the globe. We know that the best way to protect federal and personal information is to conduct all end-of-life data destruction in-house with SEM’s state-of-the-art destruction equipment.

A History of Data Reconstruction and Why Proper Data Destruction Matters

January 20, 2020 at 8:00 am by Paul Falcone

Throughout history, data has been recorded and documented in many different ways. From painting on cave walls to penning scrolls to printing books to digitizing information, the world will continue to find increasingly unique and complicated ways to store and share all of this information. But with each of these developments comes a new challenge, because no matter how many new ways are created to store and share data, there will be an equal number of ways to destroy and lose that data.

And once that data is destroyed, is it really gone forever?

Today, when media is not destroyed with high security end-of-life equipment, there is almost always a chance that some, if not most, data can be recovered. In the past, it was much harder to recover records that were seemingly destroyed beyond repair. That didn’t stop talented groups of people, enemies in war, and researchers from trying, and this post will examine a few notable times in the last century when data recovery succeeded after everything was thought to be destroyed or lost.

The US Embassy in Tehran, Iran

In 1979, after years of tension across various issues, the Iranian Revolution erupted as a pushback against the Shah’s leadership, aiming to replace the government with an Islamic republic. The US had backed the Shah, and in November of that year the embassy in the Iranian capital, Tehran, was overrun by students taking part in the revolution. This takeover began what is now known as the 444 Day Crisis, a hostage situation that would define Jimmy Carter’s presidency and take over a year to reach its conclusion.

Acting as fast as they could, CIA personnel within the embassy tried to destroy and shred all of the classified materials in the complex until the last moment of capture, but unfortunately they couldn’t destroy everything. Even the classified materials that were shredded were not safe from the Iranian forces that moved in and took 52 hostages. Over the next 444 days, and in the years that followed, the Iranian government dedicated a team to manually reconstructing the shredded documents, eventually publishing the classified materials for the world to see.

The documents contained a variety of classified materials and top secret information. Some of the information contained details on US plans to recruit high ranking Iranian officials, journalists, and more. They also included information on how to open safes within the embassy, photos of Russian air bases, and detailed biographies of persons of interest in Iran and the surrounding nations. This loss of classified data was considered to be the single largest loss of materials at the time, and the effects of the hostage situation and revolution are still felt around the world today.

The National Personnel Records Center Fire

On July 12, 1973, fire alarms sounded as a fire broke out on the sixth and top floor of the Military Personnel Records Center. This branch of the National Personnel Records Center was home to over twenty million records of past United States service members from the 20th century, all with no duplicates, backups, or photocopies. The fire burned out of control in a building of over 1.2 million square feet, nearly three football fields long by one football field wide, and it ultimately took two days to extinguish the fire completely.

At the time of the fire, over 52 million records were housed in the Military Personnel Records Center. By the time the fire was put out on July 14, the entire sixth floor had been destroyed and an estimated 16-18 million records were damaged or lost, including roughly 80 percent of records for Army personnel discharged between 1912 and 1960 and 75 percent of records for Air Force personnel discharged between 1947 and 1964. Since no backups existed, damaged and partial records were saved and documented in the hope that some form of recovery might be possible.

Almost immediately, a team was assembled to work on a data reconstruction initiative. Records that were only partially damaged were manually reconstructed, while the majority were stored to be accessed at a later date. These records were vacuum dried and then frozen so that the paper wouldn’t degrade beyond the deterioration that had already occurred. A team of 30 full-time employees works specifically on responding to families requesting information related to files lost or damaged in the fire. An additional 25 employees work on preservation, attempting to store and reconstruct the damaged files. In the beginning, all reconstruction was done manually by these 25 employees, using nothing but their eyesight to reassemble the burnt pieces.

Advancements in technology in recent years have allowed for faster and easier reconstruction. While the work is still difficult, infrared sensors and cameras can now pick up data that the naked eye cannot see. These exposed patterns allow data reconstruction specialists to photograph the hidden information, which is then manipulated in software like Photoshop, ultimately letting specialists identify and fit pieces together to complete puzzles that would have been impossible years prior.

The team continues to receive over 5,000 requests a day and is constantly evaluating new technologies that can aid in reconstructing the lost information.

The Columbia Space Shuttle

The drive that fell from the Columbia space shuttle.

On February 1, 2003, the space shuttle Columbia was re-entering the earth’s atmosphere after 17 days in space. Unknown to the crew, a piece of the shuttle’s insulation foam had broken off during launch and damaged the orbiter’s wing, causing the shuttle to burn and break apart during re-entry. The disaster resulted in the loss of everyone on board and the complete disintegration of the shuttle as it fell to earth. Six months later, a rotational hard drive believed to be from the shuttle was found in a muddy riverbed, and Kroll Ontrack was hired to try to recover the data from it.

The drive was aboard the Columbia during its breakup upon re-entry into the Earth’s atmosphere. After the breakup, the drive fell over 40 miles at terminal velocity, on fire, into a riverbed, where it stayed for six months before being found. Ultimately, once the team finished their work, over 99% of the data that resided on the drive had been recovered.

To begin the data reconstruction process, the exterior of the drive was carefully cleaned and deconstructed, allowing the team to extract the rotating metal platters. After carefully restoring the platters to working condition, they were placed in new hardware that allowed them to spin again and reveal the information gathered in orbit. Ontrack continues to use this expertise to extract data from media deemed impossible to recover.

Why Proper Data Destruction Matters

Why do all of these data reconstruction stories matter? Apart from being incredible feats of (both good and bad) data reconstruction, they drive home the important message that disposing of data properly is imperative. A drive falling from space on fire was not enough to destroy its data. Shredding documents through embassy shredders was not enough. A fire that burned for two days and destroyed up to 18 million documents was not enough to destroy everything completely.

So, if data is present that is classified, top secret, or even contains personally identifiable information (PII), precautions need to be taken to ensure that data is disposed of securely. Having the correct equipment, and finding the right data decommissioning plan, is the first important step. That way data that is supposed to be gone forever, stays gone forever.

Also, if you think you lost data, chances are there’s a way to get it back. Even if you fell from space.

Data Storage Technology: Then and Now

December 5, 2019 at 2:29 pm by Paul Falcone

Data is stored in a wide variety of ways to perform a seemingly limitless number of applications. In essence, whether you’re filing paper in a cabinet, burning files to a disk, or writing information on a hard drive, you are manipulating data. And in today’s digital age, we are witnessing continually expanding capabilities for the creation, dissemination, and destruction of data.

As these capabilities grow, so too does the need to store more data in more electronic formats. Consider, for example, that in 2018, it was estimated that over the previous two years alone, 90% of all the world’s data was generated. Of necessity, manufacturers have responded by producing new technology that stores unprecedented amounts of data.

With data storage technology rapidly evolving and being adopted by businesses across all industries, organizations are being forced to likewise adopt and implement data management and data end-of-life destruction plans that are aligned with these new data storage processes. As such, it’s important to have an understanding of today’s state-of-the-art storage media technology.

Hard Disk Drives (HDDs)

Hard disk drives are typically found in most laptop and desktop computers. They can be internally mounted within the computer chassis or externally connected through ports such as USB. Within the HDD casing are spinning metal disks (platters) with a mirror finish optimized for storing magnetic charges. These platters are divided into sectors that contain subdivisions measured in bits or bytes. Above the platters, the read/write head waits for instructions from the CPU and motherboard. After you click Save, the read/write head is directed to the appropriate sector on the platter to apply an electrical charge. Each bit within the sector then carries a magnetic charge that translates to a binary 1 or 0, strung together to form code capable of instructing your computer to complete a specific task, e.g., opening a saved document or using saved software code to complete an update.

The limitations of HDDs relate to their instability around magnetic fields, as well as the possibility for data to become scrambled if materials within the platter fail and become malleable when not intended to do so.
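The bit-level encoding described above can be sketched in a few lines of Python. This is an illustrative model only, not real drive firmware: it simply shows how a saved character becomes the string of 1s and 0s that the head writes as magnetic charges within a sector.

```python
# Illustrative sketch (not actual HDD firmware): encoding one saved
# character as the 8 binary bits a platter sector would store.

def char_to_bits(ch: str) -> list[int]:
    """Return the 8 bits (most significant first) for one character."""
    code = ord(ch)  # e.g., 'A' -> 65
    return [(code >> i) & 1 for i in range(7, -1, -1)]

# A toy "sector" holding the bit pattern for the letter 'A'.
sector = char_to_bits("A")
print(sector)  # [0, 1, 0, 0, 0, 0, 0, 1] -- binary for 65
```

Reading the file back reverses the process: the head senses each bit’s magnetic orientation and the controller reassembles the bits into bytes.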

Western Digital and Seagate are championing new technologies, microwave-assisted magnetic recording (MAMR) and heat-assisted magnetic recording (HAMR), to further expand hard disk memory capacity. These technologies use more stable materials when constructing the platters, resulting in a smaller sector size that enables more data to be written on the platters. The materials are made malleable for data processing by new HAMR and MAMR read/write heads. These innovations will bring consumer-level HDDs to market that are as durable as current enterprise-level drives.

Solid State Drives (SSDs)

Unlike HDDs, SSDs use semiconductor chips built of transistors and cells (similar to the RAM chips attached to your motherboard) that utilize flash memory instead of magnetism for storage. Whereas RAM is referred to as a form of volatile memory (i.e., nothing is retained once the machine loses power), SSDs (like HDDs) are nonvolatile and retain data after a machine is powered down.

While HDDs utilize a spinning platter and mechanical parts that activate with the machine’s power, SSDs contain no mechanical parts. Instead, SSDs operate using NAND flash memory, the same technology utilized in thumb drives/small USB storage devices. There are two types of flash memory: NOR and NAND. NOR flash reads faster but is more expensive and takes longer to erase and write new data. NOR flash is ideal for high-speed, read-only usage such as code storage for devices like mobile phones and medical equipment. In contrast, NAND has a higher storage capacity than NOR.

NAND flash is ideal for typical SSD storage drives because its construction enables it to erase and write new data much faster and to house more data. NOR cells are wired in parallel, while NAND cells are wired in series. With fewer wires and cheaper construction costs, NAND cells are better suited for consumer SSD storage.

NAND cells form transistors arranged in a grid that receive precise charges to create 1s or 0s; if the current is blocked to a specific transistor, it has a value of 0, and if the transistor conducts the current, it has a value of 1. At the intersection of each column and row on the grid are two transistors called the control gate and the floating gate. The control gate accepts the charge and the electrons move to the floating gate and apply charges to the transistors, resulting in a unique pattern of 1s and 0s.

Given the way data is created, stored, and accessed, SSDs are able to access all pieces of data at an equal speed and read and write significantly faster than HDDs, which rely on a spinning disk and mechanical parts to locate the right data within the right region. A computer user employing powerful applications (e.g., video and image editors, animation software, large video games) would notice their computers operating significantly faster with an SSD than an HDD.

HDDs are still relevant, however, because of their potential longevity. SSDs can write data quickly to an empty space, but overwriting stresses the circuits and creates more transistor resistance. As information gets manipulated and rewritten on an SSD, the old data must be completely erased before the new data is saved. This can eventually render an SSD a read-only device, unable to accept or modify any new data.
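The wear-out behavior described above can be modeled in a toy sketch. The cycle limit here is an assumed, illustrative figure (real endurance varies by cell type and manufacturer), and the class below is a simplification, not a real flash controller: each overwrite requires an erase, and once a cell exhausts its program/erase (P/E) budget it can no longer accept writes.

```python
# Toy model of flash wear (illustrative numbers, not a real controller):
# every overwrite costs one program/erase cycle, and a worn-out cell
# becomes effectively read-only.

class FlashCell:
    def __init__(self, max_pe_cycles: int = 3000):  # assumed consumer-order figure
        self.max_pe_cycles = max_pe_cycles
        self.erase_count = 0
        self.value = None

    def write(self, value: int) -> bool:
        """Erase then program the cell; return False once it is worn out."""
        if self.erase_count >= self.max_pe_cycles:
            return False              # budget exhausted: cell is read-only
        self.erase_count += 1         # old data erased before new data saved
        self.value = value
        return True

cell = FlashCell(max_pe_cycles=2)
print(cell.write(1), cell.write(0), cell.write(1))  # True True False
```

Real SSD controllers mitigate exactly this problem with wear leveling, spreading writes across cells so no single cell exhausts its budget early.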

Optical Storage Devices

Since the introduction of compact discs (CDs) in 1982, optical media has become ubiquitous. Even with the recent trend toward cloud-based, digital storage options, optical media is commonplace. Because of its potential for speed, stability, and ease of reproduction, optical storage is here to stay for the foreseeable future.

Optical devices use optical technology (i.e., the use of light to transfer data from one point to another) to write information to a surface that can then be interpreted by a laser. Optical media has three necessary layers: plastic, reflective aluminum, and polycarbonate. The laser forges nano bumps on the plastic layer of the disc in a spiral-shaped pattern that correspond to the 1s and 0s of binary code. When a computer uses a laser to read the data, the reflective aluminum layer bounces the laser back to a detector on the device that transcribes the 1s and 0s to conduct a specific action without having to access every file within the disc. The outer polycarbonate layer serves as a protective coat to preserve the integrity of the data on the disc.

As optical technology became more advanced, utilizing improved laser ability to create smaller bumps and compile more data within the plastic layer, digital versatile/video discs (DVDs) emerged in the late 1990s with the ability to store a significantly larger amount of data than CDs. Blu-ray technology advanced this innovation even further by utilizing a shorter-wavelength blue laser to create smaller bits of data on up to two plastic storage layers capable of storing 25GB of data each.
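The generational capacity jump is easy to quantify. The 25 GB-per-layer Blu-ray figure comes from the text above; the CD (~700 MB) and single-layer DVD (~4.7 GB) figures are typical nominal capacities added here for comparison.

```python
# Rough capacity comparison across optical generations.
# Blu-ray layer size is from the text; CD and DVD figures are
# typical nominal values assumed for illustration.

cd_gb = 0.7            # ~700 MB compact disc
dvd_gb = 4.7           # single-layer DVD
bluray_dual_gb = 25 * 2  # two 25 GB plastic storage layers

print(f"DVD holds ~{dvd_gb / cd_gb:.1f}x a CD")
print(f"Dual-layer Blu-ray holds ~{bluray_dual_gb / dvd_gb:.1f}x a DVD")
```

The same shorter-wavelength laser that packs in this extra data is why destroying optical media requires ever-finer particle sizes.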

Implications for Data End-of-Life Destruction Solutions

As innovation continues to fuel the technological space and allows data to be stored in ever-smaller formats, the destruction of data at end-of-life becomes more challenging. Drives and disks must be broken down into even smaller pieces to ensure those tiny bits and bumps of data cannot be recovered by the increasingly sophisticated tools and expertise that characterize data criminals.

This is particularly important for companies and organizations that work with classified information, personally identifiable information (PII), or any other form of confidential or sensitive information. Creating an in-house plan utilizing sophisticated data end-of-life technology from companies like SEM—which currently boasts the only devices rated for the successful destruction of enterprise drives—is the best way to ensure total data annihilation.

Data Destruction from 1980-2014: a Retrospective

June 1, 2014 at 5:01 pm by SEM

After 34 years in the information destruction industry, I am finally riding off into retirement, or, to use a famous golf analogy, “I am on the back nine”... actually, I am on the 18th hole walking toward the clubhouse. I would like to take a moment to reflect on the various changes in the shredding industry (if any) and mention a few of the accepted methods used today to destroy the most common forms of media.

What’s changed: Back in the early 80s, or as far back as I can remember, the most common acceptable methods of destroying paper were incineration and disintegration. Both methods are still in use today, but incineration is less common due to environmental restrictions and inconvenience. Back in the day, high security cross-cut shredders were not yet approved for top secret paper. Disintegrators, which pulverize paper into tiny bits, were the most common destruction method among federal government organizations and private companies tied directly to the defense industry. Disintegrators were first introduced by SEM back in the mid-1960s and, due to their ruggedness and versatility, are still used today to destroy a variety of media.

When classified top secret paper shredders finally arrived on the scene in the early 80s, the approved shred size was 0.8 x 11.1 mm (1/32″ × 7/16″). Events in history, including the US Embassy hostage crisis in Iran and Colonel Oliver North’s shredding of the Iran-Contra documents in the late 80s, created significant public awareness of document shredding.

In the mid 80s, it was uncommon to see the general public or non-government companies shredding sensitive documents. In October 2008, the government established a new set of guidelines requiring an even smaller (1 mm x 5 mm) shred size for top secret paperwork. In government circles, this is referred to as Level 6/P-7 or NSA approved.
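A quick bit of arithmetic shows how much smaller the 2008 particle is than the early-1980s strip, using only the two shred sizes quoted above:

```python
# Comparing the shred sizes mentioned in the text:
# early-1980s approved strip vs. the 2008 Level 6/P-7 particle.

old_area_mm2 = 0.8 * 11.1   # early-1980s top secret shred size
new_area_mm2 = 1.0 * 5.0    # October 2008 NSA-approved particle

print(f"old: {old_area_mm2:.2f} mm^2, new: {new_area_mm2:.2f} mm^2")
print(f"each new particle is ~{old_area_mm2 / new_area_mm2:.2f}x smaller")
```

A smaller particle means far more pieces per page, which is exactly what makes manual reconstruction of the kind seen in Tehran impractical.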

What happened to the paperless society? When microfiche arrived (are you old enough to remember microfiche?), it was common to hear people say that we had finally entered a “paperless society.” I remember how scary that statement was for a young sales guy, especially one trying to make his living selling paper shredders. From a security standpoint, microforms, which included mostly microfiche and 35mm film, created something of a destruction issue: as reduced-image material, they required extreme destruction standards. Imagine that, a whole book on one sheet of film. At the time, the government destruction standard for microforms was a dust-like particle. Back then, there were very few approved microfiche shredders on the market, and they could only shred very small amounts at a time. It was a tediously slow process.

What hasn’t changed: So here we are in 2014. We have high speed supercomputers storing information on a variety of media, including CDs/DVDs, data tapes, hard disk drives, solid state drives, and all kinds of other media too numerous to mention, all of it holding sensitive information. Even though these forms of media were supposed to reduce the amount of paper, the paperless society never materialized and paper is still here in force. The way information is stored may have changed, but what has not changed are the methods to eliminate it. In the end, whether it’s paper, optical disks, data tapes, or hard drives, shredding is the most accepted method to destroy the data.

I will now move on to another great challenge in life for me, breaking 100 (of course, I am referring to golf). See you at the 19th hole!